Can't use crew with "memory=true" with AzureOpenAI #614
Comments
I am having this same issue; I have not yet figured out a resolution.
I believe I have it working now - I included a parameter called api_key in the config from my Azure account.
I have the same issue. It works when I disable the "Memory" feature. I am using the AzureChatOpenAI class to create the LLM model for the agents.
@fkucuk, @ziki99 here is my solution. After some investigation, I realized that CrewAI uses the embedchain library for embedding, and that setting the correct environment variables for embedchain in my .env file resolved the issue. Specifically, I set the following environment variables in my .env file (values not shown in the original post). Now both the Azure-based agent and the Azure-based memory are working well.

app.py (file):

```python
import os
from crewai import Agent, Task, Crew
# (remaining imports elided in the original post)

load_dotenv()

_llm = AzureChatOpenAI(
    # (arguments elided in the original post)
)

crew = Crew(
    # (arguments elided in the original post)
)
```
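The actual .env contents were elided in the post above. As an illustration only, the snippet below shows the kind of Azure OpenAI variables that embedchain-style setups commonly read from the environment; the variable names and values here are assumptions about a typical deployment, not the poster's actual configuration.

```python
import os

# Hypothetical .env values. The variable names are assumptions based on
# common Azure OpenAI setups, not the (elided) values from the original post.
azure_env = {
    "AZURE_OPENAI_API_KEY": "<your-azure-key>",               # key from the Azure portal
    "AZURE_OPENAI_ENDPOINT": "https://<resource>.openai.azure.com/",
    "OPENAI_API_VERSION": "2023-05-15",                        # must match your deployment
}

# Apply them to the process environment so libraries that read os.environ
# (such as embedchain or langchain) can pick them up.
for name, value in azure_env.items():
    os.environ.setdefault(name, value)
```

In a real project these lines would live in the .env file itself and be loaded via load_dotenv(), rather than being set in code.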
@fantinis you are right. It is great flexibility to use a different LLM for every agent, but it has to be the same LLM if you are using the "Memory" feature. That makes sense, but the error message is misleading :)
Thanks for the solution @fantinis - I'm starting out and also faced this issue. |
To make Chroma work with AzureOpenAI, try passing these parameters to your embedder. For regular local usage outside Azure, it is sufficient to pass just api_key and model_name, but for Azure, api_type='azure', api_base, and api_version must also be set.
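As a sketch of what such an embedder configuration might look like: the keys below mirror the parameters named in the comment above, while the helper function, endpoint, and model values are placeholders for illustration, not values from this thread.

```python
# Sketch of an embedder config dict for an OpenAI-compatible embedder
# targeting Azure. Keys mirror the parameters named above; all values
# are placeholders.
def make_azure_embedder_config(api_key: str, api_base: str) -> dict:
    return {
        "api_key": api_key,           # Azure OpenAI key
        "api_type": "azure",          # routes auth/requests through Azure
        "api_base": api_base,         # e.g. https://<resource>.openai.azure.com/
        "api_version": "2023-05-15",  # must match your Azure deployment
        "model_name": "text-embedding-ada-002",
    }

config = make_azure_embedder_config(
    "<key>", "https://<resource>.openai.azure.com/"
)
```

For local (non-Azure) usage, per the comment above, only api_key and model_name would be needed.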
I wanted to enable crew memory with AzureOpenAI by adding the embedder configuration based on https://docs.crewai.com/core-concepts/Memory/ as follows. I have also defined the .env with my AzureOpenAI endpoint and key, and used them successfully for the research LLM in the same main.py code.
Do you have any idea why the following gives the "Unauthorized. Access token" error below?
The error is:

```
File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/openai/_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'statusCode': 401, 'message': 'Unauthorized. Access token is missing, invalid, audience is incorrect (https://cognitiveservices.azure.com), or have expired.'}
```
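A 401 like this usually means the request went out with a missing or mismatched key/endpoint pair, often because the embedder reads different environment variables than the agent LLM does. As a minimal sanity-check sketch (the variable names are assumptions about a typical Azure setup, not taken from this thread):

```python
import os

def check_azure_credentials() -> list:
    """Return a list of problems with the Azure OpenAI environment, if any.

    The environment variable names checked here are assumptions based on
    common Azure OpenAI setups; adjust them to match your own .env file.
    """
    problems = []
    key = os.environ.get("AZURE_OPENAI_API_KEY", "")
    endpoint = os.environ.get("AZURE_OPENAI_ENDPOINT", "")
    if not key:
        problems.append("AZURE_OPENAI_API_KEY is not set")
    if not endpoint.startswith("https://"):
        problems.append("AZURE_OPENAI_ENDPOINT is missing or not an https URL")
    return problems
```

Running such a check before constructing the crew makes it easier to tell whether the 401 comes from the agent LLM or from the memory embedder picking up empty credentials.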