
Support provider Ollama for automatic configuration. #389

Merged (2 commits, Jan 11, 2025)

Conversation

@namin (Contributor) commented Nov 29, 2024

Ollama is already supported manually, as I describe in this discussion item: #379

Here, I add support for Ollama in automatic configuration, so it can be used as conveniently as:

export CONTROLFLOW_LLM_MODEL=ollama/qwen2.5

I updated the docs at
https://controlflow.ai/guides/configure-llms#automatic-configuration
to add a line for Ollama to the list of providers supported by automatic configuration.

I also added a test in tests/llm/test_models.py, and accordingly added langchain-ollama as an optional tests dependency.

I made all these changes systematically by following the occurrences of the already-supported provider groq throughout the repository.
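For context, automatic configuration of this kind typically splits the model string into a provider prefix and a model name, then looks up the matching LangChain chat class for that provider. The sketch below illustrates that dispatch pattern under the assumptions of this PR; the function and registry names are hypothetical, not ControlFlow's actual internals.

```python
# Hypothetical sketch of provider-prefix dispatch; not ControlFlow's real code.

def parse_model_string(model: str) -> tuple[str, str]:
    """Split a string like 'ollama/qwen2.5' into ('ollama', 'qwen2.5')."""
    provider, sep, name = model.partition("/")
    if not sep:
        raise ValueError(f"expected '<provider>/<model>', got {model!r}")
    return provider, name

# Map provider prefixes to the LangChain package and chat class serving them.
PROVIDER_REGISTRY = {
    "openai": ("langchain_openai", "ChatOpenAI"),
    "groq": ("langchain_groq", "ChatGroq"),
    "ollama": ("langchain_ollama", "ChatOllama"),  # the provider this PR adds
}

def resolve_provider(model: str) -> tuple[str, str, str]:
    """Return (package, chat class, model name) for a provider-prefixed string."""
    provider, name = parse_model_string(model)
    try:
        package, cls = PROVIDER_REGISTRY[provider]
    except KeyError:
        raise ValueError(f"unsupported provider: {provider!r}")
    return package, cls, name

print(resolve_provider("ollama/qwen2.5"))
# → ('langchain_ollama', 'ChatOllama', 'qwen2.5')
```

With a registry like this, setting `CONTROLFLOW_LLM_MODEL=ollama/qwen2.5` is enough for the library to pick `ChatOllama` from the optional langchain-ollama dependency, mirroring how the groq prefix was already handled.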

Thanks!

@github-actions bot added the documentation and tests labels on Nov 29, 2024
@jlowin merged commit daf0883 into PrefectHQ:main on Jan 11, 2025
1 check passed