
feat: Add support for ollama structured outputs #7344

Open · wants to merge 2 commits into main
Conversation

vitreuz

@vitreuz vitreuz commented Dec 21, 2024

Title

Add support for ollama structured outputs

Relevant issues

Fixes #7131

Type

🆕 New Feature

Changes

  • Added process_response_format function to convert OpenAI-style response format to Ollama API format.
  • Updated OllamaConfig and OllamaChatConfig to use the new process_response_format function.
  • Added unit test test_ollama_structured_format to validate the structured JSON schema format.
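The conversion described in the first bullet can be sketched roughly as follows. This is a hedged illustration, not litellm's actual implementation: the function name `process_response_format` matches the PR, but the body and return shape here are assumptions based on how OpenAI nests schemas and how Ollama accepts either a schema dict or the literal string "json".

```python
from typing import Optional, Union


def process_response_format(
    response_format: dict,
) -> Optional[Union[dict, str]]:
    """Convert an OpenAI-style response_format into Ollama's format field (sketch)."""
    format_type = response_format.get("type")
    if format_type == "json_schema":
        # OpenAI nests the actual schema under json_schema.schema
        return response_format.get("json_schema", {}).get("schema")
    if format_type == "json_object":
        # Plain JSON mode maps to Ollama's literal "json" format
        return "json"
    # Unsupported type: return None so the caller can omit the field
    return None
```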

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If there are UI changes, attach a screenshot/GIF of the working UI.

[Screenshot: new tests passing locally, 2024-12-20 10:44 PM]
from pydantic import BaseModel

import litellm


# Pydantic model describing the desired structured output
class Country(BaseModel):
    name: str
    capital: str
    languages: list[str]


messages = [{"role": "user", "content": "Tell me about Canada."}]
print("Sending the following messages to the model:", messages)

# Pass the Pydantic model directly as response_format
response = litellm.completion(
    "ollama_chat/mistral-nemo", messages, response_format=Country
)

print(response.choices[0].message)

output:

❯ python test.py
Sending the following messages to the model: [{'role': 'user', 'content': 'Tell me about Canada.'}]
Message(content='{ "name": "Canada", "capital": "Ottawa", "languages": ["English", "French", "Aboriginal languages"] }', role='assistant', tool_calls=None, function_call=None)
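Since the message content comes back as a JSON string, it can be validated back into the Pydantic model. A usage sketch, with the JSON string copied from the output above:

```python
from pydantic import BaseModel


class Country(BaseModel):
    name: str
    capital: str
    languages: list[str]


content = '{ "name": "Canada", "capital": "Ottawa", "languages": ["English", "French", "Aboriginal languages"] }'
country = Country.model_validate_json(content)
print(country.capital)  # Ottawa
```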


vercel bot commented Dec 21, 2024

litellm preview deployment: ✅ Ready (updated Dec 21, 2024 7:07am UTC)

- Updated `process_response_format` to return `Optional[str]`
- Removed fallback to 'json' for unsupported response formats
- Adjusted `OllamaConfig` and `OllamaChatConfig` to handle `None` format
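The None-handling described above might look like the following in a caller. This is a self-contained sketch under stated assumptions: the helper `_to_ollama_format` and the `build_request_params` function are hypothetical names standing in for the PR's `process_response_format` and the config classes' parameter mapping.

```python
from typing import Optional, Union


def _to_ollama_format(response_format: dict) -> Optional[Union[dict, str]]:
    # Hypothetical helper mirroring the PR's process_response_format
    if response_format.get("type") == "json_schema":
        return response_format.get("json_schema", {}).get("schema")
    if response_format.get("type") == "json_object":
        return "json"
    return None


def build_request_params(response_format: dict) -> dict:
    # Only set Ollama's "format" key when the conversion succeeded;
    # a None result leaves the request payload unchanged (old behavior).
    params: dict = {}
    fmt = _to_ollama_format(response_format)
    if fmt is not None:
        params["format"] = fmt
    return params
```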

vitreuz commented Dec 21, 2024

Pushed an update to revert to the old behavior when the type is neither json_schema nor json.

@vitreuz vitreuz changed the title feat: Add process_response_format function feat: Add support for ollama structured outputs Dec 21, 2024
Successfully merging this pull request may close these issues.

[Feature]: Ollama schema structured output