
[FR] Allow custom OpenAi api base url #943

Open
TigerBeanst opened this issue Oct 28, 2024 · 4 comments
@TigerBeanst

1~3 main use cases of the proposed feature
For now, it looks like the ai container only sends requests to OpenAI at api.openai.com. It would be great if the base URL could be customized.

@khorshuheng
Collaborator

khorshuheng commented Nov 8, 2024

We are using the langchain Python package to build the clients for OpenAI, so it might be possible to use OPENAI_BASE_URL to switch the URL. We will need time to experiment and confirm this works properly. The biggest question is whether the OpenAI compatibility modes currently available (for example, the one provided by Ollama) can accept the request payload from the AppFlowy AI service without any changes at all. If the alternate endpoint is only a proxy server that relays requests to OpenAI, then this will probably work.

Reference for the latest API doc:
https://python.langchain.com/api_reference/_modules/langchain_openai/chat_models/base.html#BaseChatOpenAI

Specifically, this line:

self.openai_api_base = self.openai_api_base or os.getenv("OPENAI_API_BASE")
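The fallback in that line can be sketched as a standalone snippet. Note `resolve_api_base` is a hypothetical helper written here for illustration, not langchain's actual API, and the localhost Ollama URL is just an example:

```python
import os

def resolve_api_base(explicit_base=None):
    # Mirrors the langchain fallback: an explicitly supplied base URL wins,
    # otherwise the OPENAI_API_BASE environment variable is used,
    # and finally the public OpenAI endpoint.
    return (
        explicit_base
        or os.getenv("OPENAI_API_BASE")
        or "https://api.openai.com/v1"
    )

# e.g. point at a local Ollama server exposing an OpenAI-compatible endpoint
os.environ["OPENAI_API_BASE"] = "http://localhost:11434/v1"
print(resolve_api_base())  # -> http://localhost:11434/v1
```

This only changes where requests are sent; as the comments below note, it does not by itself bypass the API key validation the service performs.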

@khorshuheng
Collaborator

I spent some time looking into this: as it stands, even using OPENAI_API_BASE doesn't help, because we still validate the API key even when OPENAI_API_BASE is set. I will need to see whether it is possible to fix this.

@khorshuheng
Collaborator

khorshuheng commented Nov 8, 2024

This might be difficult to solve: we are using OpenAIEmbeddings, and for some reason pydantic validation fails when the OpenAI API key is not supplied, similar to the issue reported here: langchain-ai/langchain#7251

And the API key validation request is not sent to the proxy or alternate base URL, but to the actual OpenAI endpoint: platform.openai.com.

I assume (apologies in advance if I am wrong) that the primary motivation for a custom OpenAI API base URL is to use Ollama hosted on a server with OpenAI compatibility. For that use case, an alternate base URL alone does not appear to resolve the issue. Instead, the AppFlowy AI service needs actual support for Ollama-based embeddings: https://python.langchain.com/docs/integrations/text_embedding/ollama/

@ArakiSatoshi

I support this. Some users may prefer alternative providers such as Anthropic, Google, LLM aggregators like OpenRouter, or even locally deployed models that expose OpenAI-compatible endpoints, to avoid depending on a single third-party service.
