
Adapted output keys set(output.keys())={'深度', '相关性', '清晰度', '结构'} do not match with the original output keys: output_keys[i]={'structure', 'clarity', 'depth', 'relevance'} #964

Open
qism opened this issue May 17, 2024 · 2 comments
Labels
bug Something isn't working

Comments


qism commented May 17, 2024

[ ] I have checked the documentation and related resources and couldn't resolve my bug.

Describe the bug

>>> generator.adapt(language, evolutions=[simple])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/anaconda3/envs/rags_new/lib/python3.12/site-packages/ragas/testset/generator.py", line 305, in adapt
    evolution.adapt(language, cache_dir=cache_dir)
  File "/opt/anaconda3/envs/rags_new/lib/python3.12/site-packages/ragas/testset/evolutions.py", line 326, in adapt
    super().adapt(language, cache_dir)
  File "/opt/anaconda3/envs/rags_new/lib/python3.12/site-packages/ragas/testset/evolutions.py", line 262, in adapt
    self.node_filter.adapt(language, cache_dir)
  File "/opt/anaconda3/envs/rags_new/lib/python3.12/site-packages/ragas/testset/filters.py", line 69, in adapt
    self.context_scoring_prompt = self.context_scoring_prompt.adapt(
                                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/anaconda3/envs/rags_new/lib/python3.12/site-packages/ragas/llms/prompt.py", line 241, in adapt
    set(output.keys()) == output_keys[i]
AssertionError: Adapted output keys set(output.keys())={'深度', '相关性', '清晰度', '结构'} do not match with the original output keys: output_keys[i]={'structure', 'clarity', 'depth', 'relevance'}

Ragas version: 0.1.8.dev18+g2d79365
Python version: 3.10

Code to Reproduce

from ragas.testset.generator import TestsetGenerator
from ragas.testset.evolutions import simple, reasoning, multi_context, conditional
from langchain_openai import ChatOpenAI
from langchain_community.embeddings import HuggingFaceBgeEmbeddings

inference_server_url = "http://xxxxxx:port/v1"
openai_api_key = "sk-xxx"

generator_llm = ChatOpenAI(
    model="gpt-3.5-turbo-1106",
    openai_api_key=openai_api_key,
    openai_api_base=inference_server_url,
)
critic_llm = ChatOpenAI(
    model="gpt-4-1106-preview",
    openai_api_key=openai_api_key,
    openai_api_base=inference_server_url,
)

embeddings = HuggingFaceBgeEmbeddings(
    model_name="BAAI/bge-large-en-v1.5",
    model_kwargs={"device": "cpu"},
    encode_kwargs={"normalize_embeddings": True},
    query_instruction="embedding this sentence",
)

generator = TestsetGenerator.from_langchain(
    generator_llm,
    critic_llm,
    embeddings,
)

language = "Chinese"
generator.adapt(language, evolutions=[simple, reasoning, conditional, multi_context])
generator.save(evolutions=[simple, reasoning, multi_context, conditional])

Error trace

qism added the bug label May 17, 2024
jjmachan (Member) commented Jun 1, 2024

Hey @qism, were you able to fix it?
This was a bug because the adaptation was incorrect; we will fix that shortly on our end. In the meantime, what you can do on your end is simply try running the adaptation again. If that doesn't work, I would be more than happy to jump on a call and fix this for you.
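A minimal sketch of the suggested workaround (re-running the adaptation), assuming generator.adapt accepts an optional cache_dir argument as the traceback suggests; the cache path, retry count, and cache-clearing step are illustrative, not part of the ragas API:

```python
# Follows on from the reproduce script above (generator, language, and the
# evolutions are already defined). Retries the adaptation a few times and
# clears the (assumed) local adaptation cache between attempts so a fresh
# LLM translation is produced instead of reusing a bad cached one.
import shutil
from pathlib import Path

cache_dir = Path(".ragas_adapt_cache")  # hypothetical cache location

for attempt in range(3):
    try:
        generator.adapt(
            language,
            evolutions=[simple, reasoning, conditional, multi_context],
            cache_dir=str(cache_dir),
        )
        generator.save(evolutions=[simple, reasoning, conditional, multi_context])
        break
    except AssertionError as err:
        # The adapted prompt returned translated output keys; drop the cached
        # adaptation and let the next attempt re-run the translation.
        print(f"adapt attempt {attempt + 1} failed: {err}")
        shutil.rmtree(cache_dir, ignore_errors=True)
```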

jimmytanj commented

Still exists in ragas 0.1.9.
