Link remotely hosted LLAMA #5
Original question:

If my llama3:8b is hosted on a remote server somewhere in the AWS cloud, how do I pass it in to this?

Maintainer's reply:

Currently Cria isn't really meant for online models, so this is not supported. If people are interested, though, it could be implemented.

The closest thing available now would be using an ollama instance on a server somewhere, and that is supported. To use a host other than the default, set an OLLAMA_HOST environment variable before starting Cria. I should note that I haven't tested this, but it should work.

I will work on adding native support for this later, so that it will be possible to pass the URL in directly. Until then, I would need more information about your setup in order to help with your use case.

TLDR: Set an OLLAMA_HOST environment variable.