[Support]: Facing context_length_exceeded Error with Claude-4.5-Haiku Due to Message Token Limit #10308
Unanswered
mansiibm asked this question in Troubleshooting
Replies: 1 comment 1 reply
You need to adjust "max output tokens": by default it is 64,000. Lowering it should help.
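To illustrate the reply above, here is a minimal sketch of capping the output-token budget in an Anthropic Messages API request. The `build_request` helper is hypothetical (not part of any SDK); the 64,000 default and the model name come from this thread, and the 8,192 cap is an arbitrary example value.

```python
# Hypothetical sketch: capping "max output tokens" for an Anthropic
# Messages API call. build_request() is an illustrative helper, not
# part of the anthropic SDK.

def build_request(messages, model="claude-4.5-haiku", max_tokens=8192):
    """Build the kwargs you would pass to messages.create().

    Lowering max_tokens from a large default (64,000 per the reply
    above) leaves more of the context window for the input messages.
    """
    return {
        "model": model,
        "max_tokens": max_tokens,  # output budget for this request
        "messages": messages,
    }

kwargs = build_request([{"role": "user", "content": "Hello"}])
print(kwargs["max_tokens"])
```

You would then unpack these kwargs into `anthropic.Anthropic().messages.create(**kwargs)`; the sketch stops short of the network call so it stays self-contained.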
Hi, I was using the Anthropic model claude-4.5-haiku, which has a context limit of about 200k tokens, yet I got the following warning:
```json
{"attemptNumber":1,"code":"context_length_exceeded","error":{"code":"context_length_exceeded","message":"This model's maximum context length is 128000 tokens. However, your messages resulted in 167630 tokens. Please reduce the length of the messages.","param":"messages","type":"invalid_request_error"},"headers":{}}
```
I have not set any token limits for my Anthropic model, yet I still hit this issue. I am using LC version v0.8.0. How can I avoid it? It occurs for other models as well.
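Since the error says the messages themselves exceeded the context window, one generic mitigation is to trim older conversation history before sending the request. Below is a hedged sketch under stated assumptions: the 4-characters-per-token ratio is a rough heuristic, not a real tokenizer, and `trim_messages` is an illustrative helper, not an API of any library mentioned here. A provider's token-counting endpoint would give accurate numbers.

```python
# Hedged sketch: trim the oldest messages so the estimated prompt
# size fits a context budget. rough_token_count() uses a crude
# ~4 chars/token heuristic, not a real tokenizer.

def rough_token_count(text):
    # Crude estimate: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)

def trim_messages(messages, max_input_tokens):
    """Drop the oldest messages until the estimated total fits.

    Always keeps at least the most recent message so the request
    remains valid.
    """
    kept = list(messages)
    while len(kept) > 1 and sum(
        rough_token_count(m["content"]) for m in kept
    ) > max_input_tokens:
        kept.pop(0)  # drop the oldest message first
    return kept

# Ten messages of ~100 estimated tokens each (400 chars / 4).
history = [{"role": "user", "content": "x" * 400} for _ in range(10)]
trimmed = trim_messages(history, max_input_tokens=500)
print(len(trimmed))  # 5 messages of ~100 tokens each fit in 500
```

A smarter variant would keep the system prompt pinned and summarize the dropped turns instead of discarding them outright.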