getting `called Result::unwrap() on an Err value: DecodeFailed(1)` while building chat example #76
Comments
Sorry for the delay in replying. Whatever is failing is on the C++ side of things; could you please enable tracing and post the output?
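For others hitting this: assuming the crate forwards its llama.cpp logs through the `tracing` ecosystem (which is what the comment above implies), a minimal sketch for capturing that output with the `tracing-subscriber` crate might look like the following. The crate names and setup here are my assumptions, not something stated in this thread:

```rust
// Sketch: turn on verbose tracing output before loading the model,
// so whatever fails on the C++ side gets logged. Assumes `tracing`
// and `tracing-subscriber` are listed as dependencies in Cargo.toml.
fn main() {
    tracing_subscriber::fmt()
        .with_max_level(tracing::Level::TRACE)
        .init();

    // ... load the model and run the chat loop as usual ...
}
```

With this in place, rerunning the failing example should print the crate's internal log lines up to the point of the `DecodeFailed(1)` error, which is the output the maintainer is asking for.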
Hi! I am experiencing the same issue when building a chat example where one session is used for the entirety of the chat. I am currently using version 0.3.2 of the llama_cpp crate with the phi-2.Q5_K_M model that I downloaded here: https://huggingface.co/TheBloke/phi-2-GGUF I have included the full backtrace below. Please let me know if there is anything else I can share to help with debugging this. Thanks.

Full backtrace:
Hi there,
I am trying to build a simple chat example with the llama_cpp-rs crate; my code follows below.
After some time, while entering the next prompt, I run into the error below.
It would be helpful if someone could explain why I am hitting this error.
For the model, I am using Meta-Llama-3-8B-Instruct-GGUF (Meta-Llama-3-8B-Instruct.Q5_K_S.gguf) from the following link:
https://huggingface.co/QuantFactory/Meta-Llama-3-8B-Instruct-GGUF/tree/main
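For reference, here is a rough sketch of the kind of single-session chat loop being described. The type and method names (`LlamaModel::load_from_file`, `create_session`, `advance_context`, `start_completing_with`) are written from memory of the crate's 0.3-era API and may differ in your version, so treat this as an outline rather than a verified program. Note that propagating errors with `?` instead of calling `.unwrap()` turns the panic into a recoverable `Err`; also, in upstream llama.cpp a return value of 1 from `llama_decode` generally means no free slot could be found in the KV cache (i.e. the context window has filled up), which would be consistent with the failure only appearing after several prompts:

```rust
use llama_cpp::{LlamaModel, LlamaParams, SessionParams};
use llama_cpp::standard_sampler::StandardSampler;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Path is a placeholder; point it at your local GGUF file.
    let model = LlamaModel::load_from_file(
        "Meta-Llama-3-8B-Instruct.Q5_K_S.gguf",
        LlamaParams::default(),
    )?;

    // One session reused for the entire chat, as in the reports above.
    // If decoding fails once the context fills, raising the context
    // size in SessionParams (or trimming old turns) is worth trying.
    let mut session = model.create_session(SessionParams::default())?;

    loop {
        let mut line = String::new();
        std::io::stdin().read_line(&mut line)?;

        // Propagate the error instead of .unwrap(), so a DecodeFailed
        // surfaces as an Err value rather than a panic.
        session.advance_context(&line)?;

        let handle = session.start_completing_with(
            StandardSampler::default(),
            256, // max tokens to generate for this turn (arbitrary)
        )?;
        for token in handle.into_strings() {
            print!("{token}");
        }
        println!();
    }
}
```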