
Adding Ollama to Magentic One #4280

Open · wants to merge 0 commits into main
Conversation

MervinPraison

Why are these changes needed?

  • Added support for a new chat completion client using the Ollama API (OllamaChatCompletionClient).
  • Enhanced image handling with base64 encoding for the Ollama API.
  • Updated environment-based client creation to accommodate the new Ollama client.
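For context on the image-handling bullet, Ollama's generate endpoint accepts base64-encoded images in an `images` array. A minimal sketch of the encoding step (the model tag and payload shape are illustrative assumptions, not code from this PR):

```python
import base64

def encode_image_for_ollama(image_bytes: bytes) -> str:
    """Encode raw image bytes as base64 text, as the Ollama API expects."""
    # Ollama takes plain base64 strings (no "data:image/..." URI prefix).
    return base64.b64encode(image_bytes).decode("ascii")

# Hypothetical request payload for a multimodal model served by Ollama.
payload = {
    "model": "llava",  # assumption: any multimodal model tag would work here
    "prompt": "Describe this image.",
    "images": [encode_image_for_ollama(b"\x89PNG\r\n\x1a\n")],
}
```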

Related Issue Number

  • No specific issue linked.

Checks

  • Documentation updated to reflect new changes.
  • Tested locally with Ollama

@MervinPraison (Author)

@microsoft-github-policy-service agree

@jackgerrits (Member)

This is great, thank you! So it can be used in more places, would you mind moving it to autogen-ext next to the other model clients?

@husseinmozannar (Contributor)

husseinmozannar commented Nov 20, 2024

This is awesome! As Jack mentioned, we'd love for this to be used broadly in autogen beyond magentic-one, and in the AgentChat version of magentic-one.

Quoting @jackgerrits : "Creating a custom model client is a matter of implementing this interface."

Happy to help on this
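To make the quoted point concrete: a custom model client boils down to subclassing the abstract interface and implementing its methods. A minimal sketch, assuming a simplified interface (the class and method shapes below are stand-ins, not the exact autogen-core signatures):

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class CreateResult:
    # Simplified stand-in for autogen's result type.
    content: str


class ChatCompletionClient(ABC):
    # Simplified stand-in for the interface Jack linked.
    @abstractmethod
    def create(self, messages: List[Dict[str, str]]) -> CreateResult: ...


class OllamaChatCompletionClient(ChatCompletionClient):
    def __init__(self, model: str = "llama3", host: str = "http://localhost:11434") -> None:
        self.model = model
        self.host = host

    def create(self, messages: List[Dict[str, str]]) -> CreateResult:
        # A real implementation would POST the messages to f"{self.host}/api/chat";
        # stubbed here so the sketch stays self-contained.
        return CreateResult(content=f"[{self.model}] {messages[-1]['content']}")
```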

@ekzhu (Collaborator)

ekzhu commented Nov 20, 2024

Would it be possible to set up Ollama in a CI action to test this?
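A hypothetical GitHub Actions sketch of what that could look like (the install one-liner is Ollama's documented script; the model tag and test selector are assumptions):

```yaml
# Hypothetical workflow fragment, not part of this PR.
jobs:
  test-ollama:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install Ollama
        run: curl -fsSL https://ollama.com/install.sh | sh
      - name: Start Ollama and pull a small model
        run: |
          ollama serve &
          sleep 5
          ollama pull llama3   # assumption: any small model would do for tests
      - name: Run Ollama client tests
        run: pytest -k ollama
```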

@MervinPraison (Author)

@jackgerrits @husseinmozannar @ekzhu Moved the Ollama client to autogen-ext as per the suggestion.

@afourney self-requested a review on November 21, 2024 at 13:51
@afourney (Member) left a comment

Looks great! Thanks for moving it to ext for broader applicability.

We should add unit tests as Eric mentioned, but I trust that this can be done in a follow up.

class OllamaChatCompletionClient(ChatCompletionClient):
    def __init__(
        self,
        config: OllamaConfig = OllamaConfig(),
Collaborator

Shall we keep the dev experience similar to the other model clients, i.e., construct the client from keyword arguments rather than a config object?
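Concretely, that would mean lifting the config fields into the constructor signature. A minimal sketch (the base class here is a stand-in, and the parameter set is illustrative):

```python
class ChatCompletionClient:
    """Minimal stand-in for the real autogen base class."""


class OllamaChatCompletionClient(ChatCompletionClient):
    # Keyword arguments mirror the config fields directly,
    # matching the style of the other model clients.
    def __init__(
        self,
        *,
        model: str = "llama3",
        temperature: float = 0.7,
        top_p: float = 0.95,
    ) -> None:
        self.model = model
        self.temperature = temperature
        self.top_p = top_p
```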

    temperature: float = 0.7
    top_p: float = 0.95


class OllamaChatCompletionClient(ChatCompletionClient):
Collaborator

Add a docstring for the client; see the other clients' docstrings for reference.

5 participants