Talking bot backend for Windows PC is not working; notebook needs to be updated #1518
Comments
@raj-ritu17 Hi, please share the model link and the full script you used.
As for this error, we can actually get the model type from the model config directly. Please share more details so that I can reproduce your error.
@raj-ritu17 you only need to pass
I still get some issues; I will update the notebook and get back to you later.
I followed the guidelines mentioned here:
https://github.com/intel/intel-extension-for-transformers/blob/main/intel_extension_for_transformers/neural_chat/examples/deployment/talkingbot/server/backend/README.md
First error: the positional argument 'model_type' is missing; it is not given in the example.
So I added the argument:
model.init_from_bin(model_name="llama", model_path="ne_llama_q.bin", max_new_tokens=43, do_sample=False, model_type="llama")
according to this file:
https://github.com/intel/neural-speed/blob/main/neural_speed/__init__.py
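The reported TypeError is what Python raises whenever a required parameter is left out of a call. The sketch below reproduces the failure mode with a hypothetical stand-in function whose signature merely mirrors the call shown above (it is not the actual neural_speed implementation; check neural_speed/__init__.py for the real signature):

```python
# Hypothetical stand-in mirroring the init_from_bin call from the report,
# with model_type as a required parameter (an assumption for illustration).
def init_from_bin(model_type, model_path, **generate_kwargs):
    return {"model_type": model_type, "model_path": model_path, **generate_kwargs}

# Omitting model_type raises TypeError, matching the first reported error:
# "missing 1 required positional argument: 'model_type'"
try:
    init_from_bin(model_path="ne_llama_q.bin")
except TypeError as e:
    print(e)

# Passing model_type explicitly resolves that particular error.
cfg = init_from_bin(model_type="llama", model_path="ne_llama_q.bin",
                    max_new_tokens=43, do_sample=False)
print(cfg["model_type"])
```

This only explains the first error; the follow-up errors after adding the argument would need the full traceback to diagnose.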
After adding the positional argument, further errors occur:
Could you please update the notebook example?