[BUG] After importing the fine-tuned model, it fails to produce normal output / Concise description of the issue #4034
Comments
To address the issue of receiving garbled text as output after importing a fine-tuned model, start by verifying the adapter configuration and confirming that it points at the same base model you load at runtime. For further guidance, consult the documentation or support resources for the models and libraries you're using.
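One quick sanity check along those lines is to read `adapter_config.json` and confirm that its `base_model_name_or_path` matches the base model actually being served; a mismatch is a common cause of garbled output. The sketch below is a minimal, stdlib-only illustration — the helper name `check_adapter_config` and the sample paths are hypothetical, not part of any project API:

```python
import json
import os
import tempfile

def check_adapter_config(adapter_dir, expected_base="chatglm3-6b"):
    """Load adapter_config.json and verify which base model it points to.

    Hypothetical helper: a mismatch between base_model_name_or_path and
    the model loaded at runtime is a common cause of garbled output.
    """
    config_path = os.path.join(adapter_dir, "adapter_config.json")
    with open(config_path, encoding="utf-8") as f:
        config = json.load(f)
    base = config.get("base_model_name_or_path", "")
    # Compare the trailing path component so absolute paths also pass.
    return os.path.basename(base.rstrip("/")) == expected_base

# Demo with a temporary adapter directory and made-up values.
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "adapter_config.json"), "w", encoding="utf-8") as f:
        json.dump({"base_model_name_or_path": "/models/chatglm3-6b",
                   "peft_type": "LORA"}, f)
    print(check_adapter_config(d))  # → True
```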
I modified adapter_config.json and changed the base model to chatglm3-6b.
I also set PEFT_SHARE_BASE_WEIGHTS=true in the environment configuration.
I added the PEFT file path to args.model_names in startup.py.
But the output is garbled.
MODEL_PATH configuration
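Since one of the steps above relies on the PEFT_SHARE_BASE_WEIGHTS environment variable, it is worth checking how that flag is actually read: if a project compares it against the literal string "true", spellings like "True" or "1" can silently disable it. The helper below is a generic sketch of tolerant flag parsing — `env_flag` is a hypothetical name, not an API of any specific project:

```python
import os

def env_flag(name, default=False):
    """Interpret an environment variable as a boolean flag.

    Hypothetical helper: accepts common truthy spellings; anything
    else (or an unset variable) falls back to the default.
    """
    val = os.environ.get(name)
    if val is None:
        return default
    return val.strip().lower() in ("1", "true", "yes", "on")

os.environ["PEFT_SHARE_BASE_WEIGHTS"] = "true"
print(env_flag("PEFT_SHARE_BASE_WEIGHTS"))  # → True
```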