[BUG][evals][inferences] Invalid format for column name: `:feature.[[str]]:llm_messages`
**Describe the bug**
`from_open_inference(as_dataframe(cbh.flush_query_data_buffer()))` fails with:

```
File "../rag_with_phoenix.py", line 74, in <module>
    dataset = px.Dataset.from_open_inference(query_dataframe)
File ".venv/lib/python3.9/site-packages/phoenix/datasets/dataset.py", line 130, in from_open_inference
    sorted(
File ".venv/lib/python3.9/site-packages/phoenix/datasets/dataset.py", line 722, in _parse_open_inference_column_name
    raise ValueError(f"Invalid format for column name: {column_name}")
ValueError: Invalid format for column name: :feature.[[str]]:llm_messages
```
**To Reproduce**
```python
import phoenix as px
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager
from llama_index.callbacks.openinference import OpenInferenceCallbackHandler
from llama_index.callbacks.openinference.base import as_dataframe  # import path may vary by llama_index version

callback_handler = OpenInferenceCallbackHandler()
callback_manager = CallbackManager([callback_handler])
Settings.callback_manager = callback_manager

# llama_index RAG goes here

query_data_buffer = callback_handler.flush_query_data_buffer()
query_dataframe = as_dataframe(query_data_buffer)
dataset = px.Dataset.from_open_inference(query_dataframe)  # boom!
```
**Expected behavior** The dataset is created without error.
**Screenshots** Not available (NDA).
**Environment**
- OS: macOS (no GPU)
- ~~Notebook~~ Runtime: Python 3.9
- Version: `main`
**Notes**
It seems `as_dataframe()` yields a string-list column `:feature.[[str]]:llm_messages` (which happens to be empty in my sandbox). The regexp in `_parse_open_inference_column_name` then can't handle the doubled square brackets, which looks like a small fix; a minimal illustration is sketched below.
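For illustration only, here is a hypothetical pattern in the spirit of that parser (not Phoenix's actual regexp), showing how a single-bracket type annotation rejects `[[str]]` and how allowing one nesting level fixes the match:

```python
import re

col = ":feature.[[str]]:llm_messages"

# Hypothetical single-bracket pattern: expects exactly one bracket level,
# so the list-of-strings annotation [[str]] fails to match.
single = re.compile(r"^:(\w+)\.(\[\w+\]):(\w+)$")
print(single.match(col))  # None -> "Invalid format for column name"

# Permitting one or two bracket levels makes the column parse.
nested = re.compile(r"^:(\w+)\.(\[{1,2}\w+\]{1,2}):(\w+)$")
print(nested.match(col).groups())  # ('feature', '[[str]]', 'llm_messages')
```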
- Should this column be there at all?
- Should it have this type?
- Shouldn't the receiver handle it gracefully somehow?
I'd really like to take this on and fix it.
**Workaround**

```python
query_dataframe = query_dataframe.drop(':feature.[[str]]:llm_messages', axis=1)
```
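A slightly more defensive variant of the same workaround (my sketch, not a Phoenix API) drops every column whose type annotation uses doubled brackets, in case other list-typed features show up:

```python
import re

# Drop all columns with nested-bracket annotations (e.g. [[str]])
# that the parser currently rejects.
bad_cols = [c for c in query_dataframe.columns if re.search(r"\[\[", c)]
query_dataframe = query_dataframe.drop(columns=bad_cols)
```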
Hi @mkhludnev! Thanks for noticing this. Do you mind if we ask what you're planning to use this code path for? We should note that this code path is deprecated, since we've migrated our OpenInference spec to be OTel compatible.

OpenInference-as-dataframes for inferences is largely not going to be supported going forward. Apologies for any inconvenience. Please try out our tracing solutions instead.
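For readers hitting the same error: a minimal sketch of the tracing-based setup the reply points to, assuming llama_index 0.10+ with the `arize_phoenix` global handler described in the Phoenix docs of this era (exact imports and packages vary by version):

```python
# Sketch: trace llama_index with Phoenix instead of the dataframe path.
# Assumes the llama-index Arize Phoenix callback integration is installed;
# import paths differ across llama_index versions.
import phoenix as px
from llama_index.core import set_global_handler

px.launch_app()                      # start the local Phoenix app/UI
set_global_handler("arize_phoenix")  # route llama_index spans to Phoenix

# ... run RAG queries as usual; traces appear in the Phoenix app ...
```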