
Issue on docs #256

Open
kimyounkyung01 opened this issue Sep 10, 2024 · 1 comment

Comments

@kimyounkyung01

Path: /redis/troubleshooting/command_count_increases_unexpectedly

# Only these imports are actually used below; the duplicate Redis imports
# and the unused cache/embedding imports have been removed.
from langchain_aws import ChatBedrock
from langchain.globals import set_llm_cache
from upstash_semantic_cache import SemanticCache

UPSTASH_VECTOR_REST_URL = "<rest_url>"
UPSTASH_VECTOR_REST_TOKEN = "<rest_token>"

cache = SemanticCache(
    url=UPSTASH_VECTOR_REST_URL, token=UPSTASH_VECTOR_REST_TOKEN, min_proximity=0.7
)

set_llm_cache(cache)

llm = ChatBedrock(
    model_id="anthropic.claude-3-sonnet-20240229-v1:0",
    model_kwargs=dict(temperature=0),
    # other params...
)

ai_msg = llm.invoke("Which city has the highest population in the USA?")
print(ai_msg)

Error:
Traceback (most recent call last):
  File "/home/ec2-user/chatbedrock_redis.py", line 38, in <module>
    ai_msg = llm.invoke("Which city has the highest population in the USA?")
  File "/home/ec2-user/.local/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py", line 277, in invoke
    self.generate_prompt(
  File "/home/ec2-user/.local/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py", line 777, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py", line 634, in generate
    raise e
  File "/home/ec2-user/.local/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py", line 624, in generate
    self._generate_with_cache(
  File "/home/ec2-user/.local/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py", line 810, in _generate_with_cache
    cache_val = llm_cache.lookup(prompt, llm_string)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/upstash_semantic_cache/semantic_cache.py", line 65, in lookup
    result = self.get(prompt)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/upstash_semantic_cache/semantic_cache.py", line 45, in get
    response = self._query_key(key)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/upstash_semantic_cache/semantic_cache.py", line 146, in _query_key
    response = self.index.query(
  File "/home/ec2-user/.local/lib/python3.9/site-packages/upstash_vector/core/index_operations.py", line 200, in query
    for obj in self._execute_request(
  File "/home/ec2-user/.local/lib/python3.9/site-packages/upstash_vector/client.py", line 42, in _execute_request
    return execute_with_parameters(
  File "/home/ec2-user/.local/lib/python3.9/site-packages/upstash_vector/http.py", line 57, in execute_with_parameters
    raise UpstashError(response["error"])
upstash_vector.errors.UpstashError: ERR Command is not available: 'QUERY-DATA'. See https://upstash.com/docs/redis/overall/rediscompatibility for details

Issue:
I am using the Upstash Redis cache with LangChain's caching feature on an AWS server. I am trying to use the semantic cache and wrote the code following the sample, but this error occurs.
What does this error mean? Thank you.

@CahidArda
Contributor

CahidArda commented Sep 12, 2024

Hi @kimyounkyung01,

It looks like the database credentials were entered incorrectly: the vector SDK ended up making its request against an Upstash Redis server, which does not support the Vector API's QUERY-DATA command.

Can you make sure that the UPSTASH_VECTOR_REST_URL and UPSTASH_VECTOR_REST_TOKEN credentials come from an Upstash Vector index rather than a Redis database?
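As a quick sanity check (this is a heuristic based on how Upstash hostnames are commonly formed, not a documented guarantee): Upstash Vector REST URLs typically contain a `vector` segment in the hostname, while Redis REST URLs do not. A minimal sketch, with hypothetical example URLs:

```python
def looks_like_vector_url(url: str) -> bool:
    """Heuristic check: Upstash Vector REST hostnames typically contain
    the word 'vector', while Upstash Redis REST hostnames do not.
    This is an assumption about Upstash's URL naming conventions."""
    return "vector" in url.lower()

# Hypothetical example URLs, for illustration only:
print(looks_like_vector_url("https://my-index-us1-vector.upstash.io"))  # True
print(looks_like_vector_url("https://us1-my-db.upstash.io"))            # False
```

If your UPSTASH_VECTOR_REST_URL fails a check like this, you most likely copied the REST URL from the Redis console page instead of the Vector console page.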
