Runtime error about max_new_tokens: how can I fix it? #50
Comments
Hi! This error occurs because the model's maximum input length is 1024, while the inference input is 1303 tokens, exceeding that limit. Given the input-length limits of the model and the browser retrieval component, we recommend shortening the prompt to within the supported range.
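One way to follow this advice is to truncate the tokenized prompt before generation. Below is a minimal, library-free sketch of that idea (the function name and the `keep="end"` convention are illustrative, not from this project); with a HuggingFace tokenizer, passing `truncation=True, max_length=1024` when encoding achieves the same effect.

```python
def truncate_input_ids(input_ids, max_length=1024, keep="end"):
    """Truncate a token-id sequence to the model's maximum input length.

    keep="end" retains the most recent tokens, which is the usual choice
    for chat history; keep="start" retains the beginning instead.
    (Hypothetical helper for illustration, not part of this repository.)
    """
    if len(input_ids) <= max_length:
        return input_ids
    return input_ids[-max_length:] if keep == "end" else input_ids[:max_length]

# Mimic the 1303-token input from the error message above.
ids = list(range(1303))
print(len(truncate_input_ids(ids)))  # 1024
```

Note that the error text's suggestion to increase `max_new_tokens` is misleading here: `max_new_tokens` controls how many tokens are generated, while the failure is on the input side, so shortening or truncating the prompt is the appropriate fix.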
Runtime error about max_new_tokens: how can I fix it?
The error message is:
Input length of input_ids is 1303, but `max_length` is set to 1024. This can lead to unexpected behavior. You should consider increasing `max_new_tokens`.