embedjs llama-cpp integration fails to run #180
Comments
For a clearer code sample:
Update: actually, I think OllamaEmbeddings doesn't set the modelName correctly, as it still defaults to …
Yes please. There was another issue that I found in your original post: the way the initialization was happening allowed for a race condition. This has been addressed in the version just published (…).
I'm still seeing this issue unfortunately. npm list outputs:
So I am up to date; however, it seems to be getting slightly further, as the error log is now:
@adhityan if you could reopen this issue that would be fantastic, as I'm still having issues 👍
🐛 Describe the bug
To reproduce:
This code works if run with Ollama for both setModel and setEmbeddingModel.
It appears to fail because the file at
models/embedjs-llama-cpp/src/llama-cpp-embeddings.ts
does not import the requisite function getEmbeddingFor,
which is called on line 24 of that file, resulting in an error. Am I missing any imports or obvious errors?