Link remotely hosted LLAMA #5 (Open)

shantanuo opened this issue May 4, 2024 · 1 comment
shantanuo commented May 4, 2024

If my llama3:8b model is hosted on a remote server somewhere in the AWS cloud, how do I pass it to this?

cria.Cria(
    model: Optional[str] = 'llama3:8b',
    standalone: Optional[bool] = False,
    run_subprocess: Optional[bool] = False,
    capture_output: Optional[bool] = False,
    allow_interruption: Optional[bool] = True,
    silence_output: Optional[bool] = False,
    close_on_exit: Optional[bool] = True,
) -> None
leftmove (Owner) commented May 4, 2024

Currently, Cria isn't really meant for remotely hosted models, so this is not supported.

If people are interested though, it could be implemented.

The closest thing available now would be running an ollama instance on a server somewhere, and that is supported. To use a host other than localhost, create an environment variable called OLLAMA_HOST and set it to your server's URL.
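For example, a minimal sketch of that workaround (the server URL is a placeholder, and the chat usage assumes the standard Cria streaming interface):

import os

# Point the underlying ollama client at the remote server instead of localhost.
# Placeholder address -- replace with your own server's URL and port.
os.environ["OLLAMA_HOST"] = "http://your-aws-server:11434"

import cria

# Use Cria as usual; requests should now be sent to the remote ollama instance.
ai = cria.Cria(model="llama3:8b")
for chunk in ai.chat("Hello, who are you?"):
    print(chunk, end="")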

I should note that I haven't tested this, but it should work. I will work on adding native support for this later, so that you can pass the URL to a Cria instance instead of using environment variables.

Until then, I would need more information about your setup to help with your use case.

TLDR: Set an OLLAMA_HOST environment variable pointing at your remote server.
