
Adapted output keys set(output.keys())={'深度', '相关性', '清晰度', '结构'} do not match with the original output keys: output_keys[i]={'structure', 'clarity', 'depth', 'relevance'}

qism opened this issue on May 17 '24 · 3 comments

[ ] I have checked the documentation and related resources and couldn't resolve my bug.

Describe the bug

>>> generator.adapt(language, evolutions=[simple])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/anaconda3/envs/rags_new/lib/python3.12/site-packages/ragas/testset/generator.py", line 305, in adapt
    evolution.adapt(language, cache_dir=cache_dir)
  File "/opt/anaconda3/envs/rags_new/lib/python3.12/site-packages/ragas/testset/evolutions.py", line 326, in adapt
    super().adapt(language, cache_dir)
  File "/opt/anaconda3/envs/rags_new/lib/python3.12/site-packages/ragas/testset/evolutions.py", line 262, in adapt
    self.node_filter.adapt(language, cache_dir)
  File "/opt/anaconda3/envs/rags_new/lib/python3.12/site-packages/ragas/testset/filters.py", line 69, in adapt
    self.context_scoring_prompt = self.context_scoring_prompt.adapt(
                                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/anaconda3/envs/rags_new/lib/python3.12/site-packages/ragas/llms/prompt.py", line 241, in adapt
    set(output.keys()) == output_keys[i]
AssertionError: Adapted output keys set(output.keys())={'深度', '相关性', '清晰度', '结构'} do not match with the original output keys: output_keys[i]={'structure', 'clarity', 'depth', 'relevance'}

Ragas version: 0.1.8.dev18+g2d79365
Python version: 3.10

Code to Reproduce

from ragas.testset.generator import TestsetGenerator
from ragas.testset.evolutions import simple, reasoning, multi_context, conditional
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
# needed for HuggingFaceBgeEmbeddings below (this import was missing from the original snippet)
from langchain_community.embeddings import HuggingFaceBgeEmbeddings

inference_server_url = "http://xxxxxx:port/v1"
openai_api_key = "sk-xxx"

generator_llm = ChatOpenAI(
    model="gpt-3.5-turbo-1106",
    openai_api_key=openai_api_key,
    openai_api_base=inference_server_url,
)
critic_llm = ChatOpenAI(
    model="gpt-4-1106-preview",
    openai_api_key=openai_api_key,
    openai_api_base=inference_server_url,
)

embeddings = HuggingFaceBgeEmbeddings(
    model_name="BAAI/bge-large-en-v1.5",
    model_kwargs={"device": "cpu"},
    encode_kwargs={"normalize_embeddings": True},
    query_instruction="embedding this sentence",
)

generator = TestsetGenerator.from_langchain(
    generator_llm,
    critic_llm,
    embeddings,
)

language = "Chinese"
generator.adapt(language, evolutions=[simple, reasoning, conditional, multi_context])
generator.save(evolutions=[simple, reasoning, multi_context, conditional])

Error trace

(See the traceback under "Describe the bug" above.)

qism · May 17 '24 03:05

Hey @qism, were you able to fix it? This was a bug because the adaptation was incorrect; we will fix that shortly on our end. In the meantime, what you could do is simply re-run the adaptation. If that doesn't work, I would be more than happy to jump on a call and fix this for you.
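For example, a minimal retry sketch, assuming the same generator and evolutions as in the reproduce script above; the adaptation step is LLM-driven and non-deterministic, so re-running adapt() can succeed even when an earlier run raised the key-mismatch AssertionError:

from ragas.testset.evolutions import simple, reasoning, multi_context, conditional

# Re-run the adaptation a few times; the AssertionError comes from the LLM translating
# the prompt's output keys, so a fresh run may produce keys that match the originals.
for attempt in range(3):
    try:
        generator.adapt("Chinese", evolutions=[simple, reasoning, conditional, multi_context])
        generator.save(evolutions=[simple, reasoning, conditional, multi_context])
        break
    except AssertionError as exc:
        print(f"adapt() attempt {attempt + 1} failed: {exc}")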

jjmachan · Jun 01 '24 10:06

This still exists in ragas 0.1.9.

jimmytanj · Jun 11 '24 13:06

Tagging #890, which fixes this; do keep track of that.

jjmachan · Aug 02 '24 07:08

This has been fixed with v0.2. I know, finally 😅 🎉

Do check out the docs here: https://docs.ragas.io/en/stable/howtos/customizations/metrics/_metrics_language_adaptation/ and the reference here: https://docs.ragas.io/en/stable/references/prompt/#ragas.prompt.PromptMixin
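For reference, a minimal sketch of the v0.2 adaptation flow those docs describe, assuming the PromptMixin API (adapt_prompts / set_prompts) on a metric; the Faithfulness metric, the gpt-4o-mini model name, and the LangchainLLMWrapper setup are illustrative choices, not the only way to do it:

import asyncio

from langchain_openai import ChatOpenAI
from ragas.llms import LangchainLLMWrapper
from ragas.metrics import Faithfulness

# Illustrative evaluator LLM; any LangChain chat model wrapped for ragas should work.
evaluator_llm = LangchainLLMWrapper(ChatOpenAI(model="gpt-4o-mini"))
metric = Faithfulness(llm=evaluator_llm)

async def adapt_metric_to_chinese():
    # adapt_prompts() asks the LLM to translate the metric's prompts (instructions and
    # few-shot examples) into the target language; set_prompts() attaches them back.
    adapted_prompts = await metric.adapt_prompts(language="chinese", llm=evaluator_llm)
    metric.set_prompts(**adapted_prompts)

asyncio.run(adapt_metric_to_chinese())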

And if you're migrating from v0.1, check out the migration docs here: https://docs.ragas.io/en/stable/howtos/migrations/migrate_from_v01_to_v02

Could you check it out and verify? If not, feel free to comment here and I'll help you out. Really sorry again that it took this long.

jjmachan · Oct 18 '24 06:10