[R-224] Ragas integration with Langfuse to trace both llm outputs and scores in the same place
[x] I checked the documentation and related resources and couldn't find an answer to my question.
I checked and tested the ragas integration with Langfuse as demonstrated here. Although much of the code on that page did not work, I was able to use the evaluate function to get the scores and log them all in Langfuse.
My question: how can I customize the LLM traces produced by ragas?
Is there any way to log the LLM generations made during evaluation under some given trace_ids?
The generations are already logged in Langfuse, but I want to attach them to other, corresponding trace_ids that I generated beforehand.
Is this possible?
Code Examples
score = evaluate(
    dataset,
    metrics=[
        context_precision,
        context_recall,
        context_relevancy,
    ],
    llm=llm_ragas,
    embeddings=embedding_model_ragas,
)
# At this point, the traces of all the LLM generations done by ragas are already logged in Langfuse.
# Is there a way to log them under some given trace_ids instead?
....
for k, v in score_row.items():
    try:
        if not math.isnan(v):
            langfuse.score(trace_id=trace_id, name=k, value=v)
        else:
            logger.error(f"Score {k} is NaN")
    except Exception as e:
        logger.error(f"Error logging score: {e}")
....
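For reference, the NaN-filtering step in the loop above can be sketched as a small self-contained snippet. Here `fake_score_row` is a hypothetical stand-in for the score row returned by ragas, and `record_score` is a stub replacing the real `langfuse.score(trace_id=..., name=..., value=...)` call, so the snippet runs without a Langfuse connection:

```python
import math

# Hypothetical stand-in for a ragas score row; context_recall is NaN here
# to show the filtering behaviour.
fake_score_row = {"context_precision": 0.9, "context_recall": float("nan")}

logged = {}

def record_score(trace_id, name, value):
    # Stub for langfuse.score(trace_id=..., name=..., value=...).
    logged[name] = value

for k, v in fake_score_row.items():
    if not math.isnan(v):
        # Only valid scores are sent to Langfuse under the given trace_id.
        record_score("trace-123", k, v)

print(logged)
```

Only the non-NaN score reaches `record_score`; the NaN entry is skipped, which mirrors what the real loop does before calling `langfuse.score`.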
I believe these issues are related: https://github.com/explodinggradients/ragas/issues/896, https://github.com/explodinggradients/ragas/issues/893
Additional context: here are the requirements used.
langfuse==2.26.3
litellm==1.35.12
ragas==0.1.7
Hey @databill86, thanks for raising this - I will be fixing these issues shortly and doing a release with the fixes for you 🙂
cheers