
Fix max token limit parameter for larger models #4597

Closed

Conversation

Tiiiiiida

The original 512000 limit is not right for larger models such as Gemini 1.5 Pro, which needs a limit of up to 1049576.
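For reference, a minimal sketch of what raising the ceiling might look like, assuming the clamp lives in a config validator similar to NextChat's `app/store/config.ts` (the file path, the `limitNumber` helper, and the default value are assumptions here, not the exact diff):

```ts
// Clamp x into [min, max]; fall back to defaultValue for non-numeric input.
function limitNumber(
  x: number,
  min: number,
  max: number,
  defaultValue: number,
): number {
  if (typeof x !== "number" || Number.isNaN(x)) {
    return defaultValue;
  }
  return Math.min(max, Math.max(min, x));
}

export const ModalConfigValidator = {
  max_tokens(x: number) {
    // Old ceiling: 512000. Raised so models like Gemini 1.5 Pro
    // (which need up to 1049576 tokens) are not silently capped.
    return limitNumber(x, 0, 1049576, 1024);
  },
};
```

The point of the change is that the settings UI clamps whatever max-token value the user enters; with the old 512000 ceiling, values suitable for million-token-context models were silently truncated.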


vercel bot commented Apr 30, 2024

@Tiiiiiida is attempting to deploy a commit to the NextChat Team on Vercel.

A member of the Team first needs to authorize it.

Contributor

Your build has completed!

Preview deployment

@Tiiiiiida Tiiiiiida closed this May 3, 2024
@Tiiiiiida Tiiiiiida deleted the fix-maxtoken-parameter branch May 5, 2024 10:05