When using Letta with the Qwen-plus model, I consistently encounter the following warning:
d:\AI-practicing\Lettagent\env-letta-agent\Lib\site-packages\letta\local_llm\utils.py:215: UserWarning: num_tokens_from_messages() is not implemented for model qwen-plus. See https://github.com/openai/openai-python/blob/main/chatml.md for information on how messages are converted to tokens.
warnings.warn(
Warning: model not found. Using cl100k_base encoding.
Additional observations:
- The warning is raised at line 215 of local_llm/utils.py
- It indicates that the token-counting method num_tokens_from_messages() is not implemented for the qwen-plus model
- The system falls back to the cl100k_base encoding
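For context, the fallback behavior likely follows the common pattern of looking up a model-specific tokenizer and defaulting to cl100k_base when the model is unknown. Below is a minimal, hypothetical sketch of that pattern; the mapping table and function name are illustrative assumptions, not Letta's actual implementation:

```python
# Hypothetical sketch of the model-to-encoding fallback pattern
# (assumption: this mirrors, but is not, letta/local_llm/utils.py).
import warnings

# Illustrative mapping of known models to tokenizer encodings.
KNOWN_MODEL_ENCODINGS = {
    "gpt-4": "cl100k_base",
    "gpt-3.5-turbo": "cl100k_base",
}


def encoding_for_model(model: str) -> str:
    """Return the tokenizer encoding name for a model.

    Unknown models (such as qwen-plus) trigger a warning and
    fall back to the cl100k_base encoding.
    """
    try:
        return KNOWN_MODEL_ENCODINGS[model]
    except KeyError:
        warnings.warn("model not found. Using cl100k_base encoding.")
        return "cl100k_base"
```

Under this pattern, calling `encoding_for_model("qwen-plus")` emits the warning and returns `"cl100k_base"`, which matches the behavior observed above: token counts are still produced, but with a tokenizer that was not trained for Qwen models, so they may be inaccurate.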