Can't use crew with "memory=true" with AzureOpenAI #614

Open
ziki99 opened this issue May 14, 2024 · 7 comments


ziki99 commented May 14, 2024

I wanted to enable crew memory with AzureOpenAI by adding the embedder based on https://docs.crewai.com/core-concepts/Memory/ as follows. I have also defined my AzureOpenAI endpoint and key in the .env file and used them successfully for the researcher LLM in the same main.py.

Do you have any idea why the configuration below produces the "Unauthorized. Access token" error shown at the end?

tech_crew = Crew(
    agents=[researcher, writer],
    tasks=[research_task, write_task],
    process=Process.sequential,  # Optional: sequential task execution is the default
    memory=True,
    embedder={
        "provider": "azure_openai",
        "config": {
            "model": "text-embedding-ada-002",
            "deployment_name": "text-embedding-ada-002"
        }
    },
    cache=True,
    max_rpm=100,
    share_crew=True
)

The error is:
File "/home/zeus/miniconda3/envs/cloudspace/lib/python3.10/site-packages/openai/_base_client.py", line 1020, in _request
raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'statusCode': 401, 'message': 'Unauthorized. Access token is missing, invalid, audience is incorrect (https://cognitiveservices.azure.com), or have expired.'}

@ziki99 ziki99 changed the title can't use crew memory=true with AzureOpenAI Can't use crew with "memory=true" with AzureOpenAI May 14, 2024
@DavidGayda

I am having the same issue; I have not yet figured out a resolution.

@DavidGayda

I believe I have it working now - I included a parameter called api_key in the config from my Azure account.

"config":{ "model": 'text-embedding-ada-002', "deployment_name": "text-embedding-ada-002" "api_key": os.environ.get("AZURE_OPENAI_KEY") }


fkucuk commented May 20, 2024

I have the same issue. It works when I disable the "Memory" feature. I am using the AzureChatOpenAI class to create the LLM model for the agents.


fantinis commented May 20, 2024

@fkucuk, @ziki99 here is my solution.

After some investigation, I realized that CrewAI uses the embedchain library for embedding and that by using embedchain with the correct environment variables set in my .env file, the issue was resolved. Specifically, I set the following environment variables:

.env (file)
OPENAI_API_TYPE="azure"
OPENAI_API_VERSION="xxx"
AZURE_OPENAI_ENDPOINT="xxx"
OPENAI_API_KEY="xxx"

Now both the Azure-based agent and the Azure-based memory are working well.

app.py (file)

import os
from dotenv import load_dotenv

from crewai import Agent, Task, Crew
from langchain_openai import AzureChatOpenAI

load_dotenv()

_llm = AzureChatOpenAI(
    api_version=os.environ.get("OPENAI_API_VERSION"),
    azure_endpoint=os.environ.get("AZURE_OPENAI_ENDPOINT"),
    api_key=os.environ.get("OPENAI_API_KEY"),
    azure_deployment="xxx"
)
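
# Note: a1 and t1 are not defined in the original comment. A minimal,
# hypothetical sketch (placeholder role/goal/description) wiring an Agent
# and a Task to the Azure LLM above might look like this:
a1 = Agent(
    role="Researcher",                      # placeholder role
    goal="Research the given topic",        # placeholder goal
    backstory="An example research agent",  # placeholder backstory
    llm=_llm,
)

t1 = Task(
    description="Summarize the topic in a short paragraph",  # placeholder
    expected_output="A short summary",
    agent=a1,
)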

crew = Crew(
    agents=[a1],
    tasks=[t1],
    verbose=2,
    memory=True,
    embedder={
        "provider": "azure_openai",
        "config": {
            "model": "text-embedding-ada-002",
            "deployment_name": "text-embedding-ada-002"
        }
    }
)


fkucuk commented May 21, 2024

@fantinis you are right. Being able to use a different LLM for every agent gives great flexibility, but it has to be the same LLM if you are using the "Memory" feature. That makes sense.

But the error message is misleading :)

@MrSimonC

Thanks for the solution @fantinis - I'm starting out and also faced this issue.

@goran-ristic-dev

In order to make Chroma work with AzureOpenAI, I configured:

from chromadb.utils.embedding_functions import OpenAIEmbeddingFunction

embedding_function = OpenAIEmbeddingFunction(
    api_key='<your api_key>',
    model_name='<azure embedding model name>',
    api_type='azure',  # this has to be set to azure
    api_base='<your azure api url>',
    api_version='<your api version>'
)

Try passing those parameters to your embedder.

For regular local usage outside of Azure it is sufficient to pass just api_key and model_name, but for Azure, api_type='azure', api_base, and api_version must be set.
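
A hedged sketch of what that might look like in crewai's embedder config, reusing the same dict shape shown earlier in this thread; whether your crewai/embedchain version actually forwards api_type, api_base, and api_version to the underlying embedding function is an assumption to verify:

import os

# same embedder shape as earlier, extended with the Azure parameters above
azure_embedder = {
    "provider": "azure_openai",
    "config": {
        "model": "text-embedding-ada-002",
        "deployment_name": "text-embedding-ada-002",
        "api_key": os.environ.get("AZURE_OPENAI_KEY"),
        # Azure-specific parameters suggested in the Chroma snippet above;
        # verify that your version passes these through to the embedder
        "api_type": "azure",
        "api_base": os.environ.get("AZURE_OPENAI_ENDPOINT"),
        "api_version": os.environ.get("OPENAI_API_VERSION"),
    },
}
# then: Crew(..., memory=True, embedder=azure_embedder)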
