
[Bug]: Llama3.3 models default system max token #5004

Closed
1 task done
qymab opened this issue Dec 16, 2024 · 0 comments · Fixed by #5024
Labels
bug Something isn't working

Comments


qymab commented Dec 16, 2024

What happened?

The prompt token count exceeds the default maximum token count. For Llama 3.3 models the app falls back to a 4,095-token default, while the actual maximum context window available through the Groq API is 128,000 tokens.
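
A plausible reading (an assumption, not stated in this issue): the client resolves each model's context window from a lookup table and falls back to a small default when the model name is not recognized, so a new name like llama-3.3-70b-versatile lands on the 4,095-token fallback instead of Groq's 128,000-token limit. A minimal TypeScript sketch of that failure mode; the map, values, and helper names below are illustrative, not the project's actual code:

```typescript
// Hypothetical per-model context-window map; a "llama-3.3" prefix is
// missing, so Llama 3.3 model names fall through to the default below.
const MAX_CONTEXT_TOKENS: Record<string, number> = {
  "llama-3.1": 128000,
  "llama-3.2": 128000,
  // "llama-3.3": 128000,  // absent: the suspected bug
};

const DEFAULT_MAX_CONTEXT = 4095; // matches the 4095 seen in the log output

// Resolve a model name like "llama-3.3-70b-versatile" by its longest
// matching prefix; unrecognized names get the small fallback.
function getMaxContextTokens(model: string): number {
  const key = Object.keys(MAX_CONTEXT_TOKENS)
    .filter((prefix) => model.startsWith(prefix))
    .sort((a, b) => b.length - a.length)[0];
  return key ? MAX_CONTEXT_TOKENS[key] : DEFAULT_MAX_CONTEXT;
}

console.log(getMaxContextTokens("llama-3.3-70b-versatile")); // 4095 (bug)
console.log(getMaxContextTokens("llama-3.1-8b-instant"));    // 128000
```

If that is the mechanism, the fix in #5024 would presumably add the missing llama-3.3 entry, though the linked PR is the authority on the actual change.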

Steps to Reproduce

  1. Use a Llama 3.3 model through either the Groq endpoint or a custom litellm endpoint; both produce the same error once the prompt exceeds the default maximum token count.

What browsers are you seeing the problem on?

Firefox

Relevant log output

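warn: Prompt token count exceeds max token count (5910 / 4095).

The numbers in the warning are consistent with a simple guard that compares the counted prompt tokens against the resolved maximum. A hedged sketch of such a check (the function and log formatting here are illustrative, not the project's actual code):

```typescript
// Hypothetical guard that would emit a warning like the one logged above:
// "Prompt token count exceeds max token count (5910 / 4095)."
function checkPromptFits(promptTokens: number, maxContextTokens: number): void {
  if (promptTokens > maxContextTokens) {
    console.warn(
      `warn: Prompt token count exceeds max token count ` +
      `(${promptTokens} / ${maxContextTokens}).`
    );
  }
}

checkPromptFits(5910, 4095); // reproduces the logged warning
```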

Screenshots

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct
qymab added the bug label on Dec 16, 2024