ChatGPT Message Cap (GPT-4 Usage Limit)

GPT-4 has been available to ChatGPT Plus subscribers since March 14, 2023 – and it’s a significant improvement over GPT-3.5 in output quality (noticeably better reasoning ability & conciseness).

The two most noteworthy drawbacks of GPT-4 at the moment are: (1) speed (it’s sluggish compared to GPT-3.5) & (2) usage limits (message caps over a several-hour window & token limits).

I expect both these drawbacks to be addressed over the next few months (similar to the rapid progress observed between GPT-3 & GPT-3.5).

Eventually, I think that the message cap will loosen significantly (such that users are allowed to send more messages in a given time-window) OR end completely for Plus subscribers.

ChatGPT Message Cap (Usage Limit)

After logging into my ChatGPT Plus account and selecting “GPT-4” as the model for my queries, directly above the prompt box is the following notice:

“GPT-4 currently has a cap of 25 messages every 3 hours. Expect significantly lower caps, as we adjust for demand.”

Before the recent cap of 25 messages every 3 hours, GPT-4 had caps of:

  • “50 messages every 4 hours”
  • “100 messages every 4 hours”

Obviously many people are frustrated with the cap fluctuations – especially after the drop to 25 messages every 3 hours (from a starting point of 100 messages per 4 hours).

Some speculate that the cap might become more restrictive in the near future (e.g. 20 or 10 messages every 3-4 hours)… hopefully 25 messages per 3 hours is the worst-case scenario.

If you’ve observed lower caps than what I’ve reported, feel free to share… the lowest I’ve seen as of March-April 2023 is 25 messages per 3 hours.

How the message cap works (GPT-4)

When the message-cap notice appeared, many users were confused about how it works – probably because OpenAI didn’t explain it well.


How the cap works…

  1. When you send your first message, a 3-hour timer begins (specific time window implemented by OpenAI).
  2. You can send up to 25 messages (or whatever message limit is imposed) before the timer runs out.
  3. After the 3-hour timer ends, your message limit is reset – and you can send another 25 messages.

How it does NOT work:

  • (A) A separate 3-hour timer after each individual message (where a message is returned to you once its own timer ends).
  • (B) Fixed 3-hour intervals throughout the day, each with its own allotment (e.g. 25 messages from 3-6 AM, then 25 more from 6-9 AM, etc.).
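The reset behavior described above can be sketched in Python. (The class name and the exact reset semantics are my assumption – OpenAI hasn’t published the implementation – but this matches the observed behavior: the window starts at your first message, not per-message.)

```python
import time

class MessageCapTracker:
    """Models the apparent GPT-4 cap: a fixed window (e.g. 3 hours)
    starts at your FIRST message; the allotment resets when it ends."""

    def __init__(self, cap=25, window_seconds=3 * 60 * 60):
        self.cap = cap
        self.window = window_seconds
        self.window_start = None
        self.sent = 0

    def try_send(self, now=None):
        now = time.time() if now is None else now
        # Start (or restart) the window on the first message after a reset.
        if self.window_start is None or now - self.window_start >= self.window:
            self.window_start = now
            self.sent = 0
        if self.sent < self.cap:
            self.sent += 1
            return True
        return False  # capped until the current window expires
```

In other words: 25 quick messages in the first ten minutes still means waiting out the remainder of the 3 hours before the counter resets.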

OpenAI mentions initial usage cap for GPT-4

Some were surprised by a “usage cap” (restriction on number of interactions with GPT-4) for Plus subscribers, but this was documented by OpenAI in their initial GPT-4 press release.

OpenAI GPT-4 Research Paper: (R)

ChatGPT Plus subscribers will get GPT-4 access on chat.openai.com with a usage cap.

We will adjust the exact usage cap depending on demand and system performance in practice, but we expect to be severely capacity constrained (though we will scale up and optimize over upcoming months).

Depending on the traffic patterns we see, we may introduce a new subscription level for higher-volume GPT-4 usage; we also hope at some point to offer some amount of free GPT-4 queries so those without a subscription can try it too.

The research paper directly stated that GPT-4 was introduced to Plus subscribers with a usage cap.

OpenAI will adjust the usage cap based on a combination of demand/usage AND system performance – caps will initially be low.

After several months, OpenAI will “scale up” and “optimize” GPT-4 for ChatGPT such that the cap will significantly increase OR end entirely – at least for Plus members.

There may also be a new subscription level for subscribers who wish to use GPT-4 more frequently (which will likely be a bit costlier than the basic plan).

Plus subscribers wish there were no cap…

It’s understandable that there is a message cap for a new product that just rolled out, is heavily used, hasn’t been scaled up, and lacks optimization.

However, many Plus subscribers wish there were a workaround or a different subscription tier that could be purchased to bypass usage caps.

Some think it’s unfair that OpenAI grants select companies access to the GPT-4 API which allows unlimited usage (with zero message caps) as long as costs are covered… especially because many Plus subscribers are willing to pay based on usage.

Considering statements from OpenAI and quick progress observed from GPT-3 to GPT-3.5, many believe the usage caps will disappear within a few months for Plus users.

How much does ChatGPT-4 cost to operate currently? (March 31, 2023)

Many ChatGPT Plus subscribers likely aren’t aware of operational costs associated with GPT-4 on the ChatGPT platform (it’s generally more expensive than using the GPT-4 API independently).

Although short messages and answers in GPT-4 don’t break the bank, heavy usage (especially for longer-form prompts & responses) is likely more expensive than most think.

According to OpenAI, for GPT-4 models with 8K context lengths (gpt-4 & gpt-4-0314), pricing as of March 31, 2023 is: (R)

  • $0.03/1k prompt tokens
  • $0.06/1k sampled tokens

Let’s assume a 100-word prompt message (your query) and a 500-word response. OpenAI’s rule of thumb is that 1 token ≈ 0.75 words (~1.33 tokens per word), so:

  • 100-word prompt: ~133 tokens = ~$0.004
  • 500-word response: ~667 tokens = ~$0.04

If we assume you’re sending 25 messages of this exact length (100-word prompt & 500-word reply) every 3 hours with GPT-4, the cost is roughly $1.10 per window.

Assuming you’ve optimized this so that you’re able to do this 8 times per day (8 sets of 25 messages every 3 hours) – this ends up costing roughly $8.80 per day.

If the messages are lengthier and/or GPT-4 allows more messages per day – then the costs are obviously higher.

If we extrapolate ~$8.80 per day to an entire 30-day month, then the usage costs reach roughly $264 – already well above the $20 per month for a Plus subscription.
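As a sanity check, here’s a tiny cost calculator using the posted 8K pricing. (The tokens-per-word figure is OpenAI’s published approximation for English text – actual token counts vary, so treat results as estimates only.)

```python
# GPT-4 (8K) cost estimator at March 31, 2023 pricing.
PROMPT_PRICE = 0.03 / 1000      # $ per prompt token
COMPLETION_PRICE = 0.06 / 1000  # $ per sampled (completion) token
TOKENS_PER_WORD = 1.33          # OpenAI's rule of thumb: 1 token ~ 0.75 words

def message_cost(prompt_words, response_words, tokens_per_word=TOKENS_PER_WORD):
    """Estimate the API cost of one prompt/response exchange."""
    prompt_tokens = prompt_words * tokens_per_word
    completion_tokens = response_words * tokens_per_word
    return prompt_tokens * PROMPT_PRICE + completion_tokens * COMPLETION_PRICE

# 100-word prompt + 500-word response, 8 sets of 25 messages per day:
per_message = message_cost(100, 500)
per_day = per_message * 25 * 8
per_month = per_day * 30

# Fully maxed-out 8K exchange: 8,192 tokens split evenly in/out.
max_message = 4096 * PROMPT_PRICE + 4096 * COMPLETION_PRICE
```

Swap in your own prompt/response lengths to see where your usage falls.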

Although GPT-4 can process up to ~25,000 words of text – this long-form input/output is not available to Plus subscribers – as it requires GPT-4 (32K), which costs more.

The GPT-4 (8K) version allows a maximum of 8,192 combined tokens (prompt + response) – roughly 6,000 words at ~0.75 words per token. Assuming an even split between input and output (4,096 tokens each), a fully maxed-out message would cost about $0.37 (as of March 31, 2023).

Extrapolate this to 25 messages per 3 hours optimized for the entire day (8 sets of 25 messages every 3 hours) – this costs roughly $74 per day.

Extrapolated to an entire 30-day month, that’s roughly $2,200… so no, you’re not getting shortchanged on your Plus subscription.

Note: In many cases, usage costs associated with OpenAI’s GPT-4 are significantly higher than using the GPT-4 API.

Reactions to the GPT-4 message cap…

Below are some angry reactions to the message cap imposed by OpenAI on GPT-4.

  • Reaction #1: I understand that demand is very high for GPT, but the usage cap is really low. This cap is extremely low for the cost of a subscription. (Actually it isn’t).
  • Reaction #2: I’m using the new GPT-4 heavily but the message cap is driving me crazy. It just dropped again and I keep hitting my limit.
  • Reaction #3: I love paying money for more restrictions. The number of messages isn’t enough… what am I even paying for?
  • Reaction #4: It’s ridiculous. We started at 100 and now we’re at 25 and expect lower? What is happening? Get a grip OpenAI. We are paying for this service.
  • Reaction #5: Why am I paying for a monthly subscription? I’m a heavy user and don’t want to be limited. Perhaps those who are not subscribed should face incredibly strict limitations to allow more bandwidth for paying users.

Some people probably purchased ChatGPT Plus specifically for the ability to use GPT-4 – and when restricted with a message cap, they become frustrated.

That said, there really isn’t an alternative if you don’t have API access… so GPT-4 with a usage cap is still better than nothing.

Most of these complainers may not understand current operational costs attached to heavy usage of GPT-4 on ChatGPT… OpenAI is probably losing money from heavy-use Plus subscribers.

If you’re a heavy user with lengthy prompts & answers, you’re racking up hundreds to thousands per month in operational costs for OpenAI while only paying $20 per month.

Though many Plus subscribers are willing to pay on a per-use basis, some would likely claim that paying-per-use is too expensive or unaffordable.

My reaction to the GPT-4 message cap (25 messages per 3 hours)

Immediately after the release of GPT-4, the message cap was set at 100 messages per 4 hours… and I thought I’d definitely hit the cap given my heavy usage.

However, I never hit the initial message cap… I probably could’ve, but I was somewhat afraid of hitting it so I’d switch to GPT-3.5 whenever I had an extremely basic query and/or wanted a faster response.

GPT-4 is better than GPT-3.5 at complex analysis, reasoning, and clarity – and I prefer it most of the time (obviously would use it exclusively if no cap limits).

With 25 messages per 3 hours imposed as the current cap, I reach my limit almost daily (especially if there’s a lot of back-and-forth with questions about website code or if I want feedback on my writing).

Hitting the cap limit is a bit frustrating… but when considering the newness of GPT-4, operational costs associated with GPT-4 on ChatGPT, and the value of paying just $20/mo. – it’s really not that big of a deal.

That said, I wish that I could buy a membership or plan that charged based on usage OR a higher-level subscription which raises the cap and/or eliminates it altogether.

What should OpenAI do about the GPT message cap? (My thoughts)

OpenAI knows that a subset of Plus subscribers are frustrated with message caps – particularly those who’ve applied for the GPT-4 API yet lack access (users are regularly complaining in OpenAI forum threads).

Add a timer: OpenAI could add a timer to the ChatGPT interface so that users know exactly how much time they have before the message cap resets. (Just add a timer above the prompt box – would be helpful to users).

Add a message counter: OpenAI could also add a message counter to the ChatGPT interface so that users know exactly how many messages they have remaining during a specific time-window.

Add subscriptions: OpenAI proposed adding higher-level subscriptions for power users of ChatGPT. Something easy OpenAI could do: Plus (standard) for $20; Plus (double) for $40; Plus (triple) for $60; Plus (quad) for $80 – and have the cap reflect amount paid. (100 messages per 3 hours for quad vs. 25 per 3 hours for standard).

Custom subscriptions: OpenAI could consider charging based on token usage. If power users want access to ChatGPT 32K, then they can pay the exact operational costs (or even more so that OpenAI makes a profit).

Overage fees: OpenAI could consider charging a standard $20 per month for the Plus membership, then allow users to pay “overage fees” based on additional token usage (beyond cap limits).

Grant API access (?): OpenAI could grant more users access to the GPT-4 API so that they can use GPT-4 exactly how they wish and pay based on usage. Someone could create a “chat” interface similar to ChatGPT, use the GPT-4 API, then refine the model so that it suits their needs.

Note: OpenAI might NOT want to add a timer or counter because they’re already losing money – and these adjustments may increase usage per Plus user (because users would know exactly how much time/how many messages remain and could plan around it).

What can you do about the message cap? (Any workarounds?)

Are there any workarounds to the GPT-4 message cap? Not really.

There are some things you can do to optimize GPT-4 usage on ChatGPT though.

API access: If you’re comfortable with a bit of coding, you could apply for API access to GPT-4. If you get API access, you can build a platform that allows you to use GPT-4 however you want while paying per-use with zero restrictions. (This may be difficult for most people who aren’t developers and/or CEOs of companies).
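If you do get API access, the “build your own interface” route boils down to calling OpenAI’s chat-completions endpoint. Here’s a minimal stdlib-only sketch (the endpoint and request shape are from OpenAI’s API reference; the function names are mine, and you’d need a key with GPT-4 access):

```python
import json
import os
import urllib.request

API_URL = "https://api.openai.com/v1/chat/completions"

def build_payload(user_message, model="gpt-4"):
    # Chat-completions request body: a model name plus a list of messages.
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

def ask_gpt4(user_message):
    # Pay-per-token, no message cap -- but requires API access approval.
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(user_message)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

Wrap `ask_gpt4` in a loop (or a simple web UI) and you effectively have an uncapped, usage-billed ChatGPT.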

Track messages & time: Although OpenAI does not have a timer or message limit tracker within the ChatGPT interface, you can track these on your own. This may help you optimize ChatGPT usage so that you get more total messages per day.

Maximize each interaction: One tactic you can implement is to maximize the number of tokens per ChatGPT (GPT-4) prompt and response. Keep in mind that there’s a token limit per prompt/response combo (~4,096 tokens). You can use OpenAI’s tokenizer tool to track & optimize token usage if you want.
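If you want a rough estimate without opening the tokenizer tool, a heuristic like this gets you in the ballpark. (The 4-characters-per-token and 0.75-words-per-token figures are OpenAI’s published approximations for English; the budget numbers below are illustrative, and results are estimates only.)

```python
def estimate_tokens(text):
    """Rough token estimate for English text -- not an exact count.
    OpenAI suggests ~4 characters or ~0.75 words per token on average."""
    by_chars = len(text) / 4
    by_words = len(text.split()) / 0.75
    return round((by_chars + by_words) / 2)

def fits_in_budget(prompt, budget=4096, reserve_for_reply=1500):
    """Check a prompt against a token budget, leaving room for the reply."""
    return estimate_tokens(prompt) <= budget - reserve_for_reply
```

For anything where the count actually matters (e.g. packing a prompt right up to the limit), use OpenAI’s tokenizer tool instead of a heuristic.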

Multiple phone #s: I got 2 phones, one for the plug and one for the load… consider using both to register separate OpenAI accounts. (Each account needs its own email & phone number for this to work).

Buy accounts for fam/friends: Buy Plus memberships for family members and/or friends. Use their login credentials to get additional work done after you’ve hit your message cap.

Switch to GPT-3.5: GPT-3.5 is still extremely useful and spits out responses rapidly. Though GPT-4 is superior to GPT-3.5, this doesn’t mean GPT-3.5 is completely useless… use GPT-3.5 if you’ve maxed out GPT-4 usage.

What do you think about the GPT-4 message cap?

As already mentioned, I think the message cap is probably necessary at the moment for various reasons – otherwise there wouldn’t be one.

Even if people were willing to pay for higher-usage plans (e.g. $60 for triple-Plus) – money likely isn’t the only thing OpenAI is considering (scaling & optimization matter).

Do you have any thoughts about the message cap in ChatGPT-4?

In which specific way(s) are you using GPT-4 that causes you to reach a message cap quickly? (e.g. programming, brainstorming, writing, etc.)

How much would you be willing to pay per month for zero message cap? (Would you be willing to pay based on token usage?)

2 thoughts on “ChatGPT Message Cap (GPT-4 Usage Limit)”

  1. I’m developing some code. It’s frustrating when I hit the cap. Especially if it’s because there was an error (network or it went off on a tangent) and I had to regen a response. But most times, although it slows progress, it’s an excuse for a screen break. Having said that, I have GitHub copilot as well. I don’t revert a chat to gpt3 though, I start a new gpt3 chat, and wait to get access back on the gpt4 thread. But using 3.5 and 4 back to back like this really puts in to contrast their abilities. 3.5 is like the human meta crisis Doctor who, good but just not the Full Doctor.

