
[Bug]: Costs not shown when using the o1-preview OpenAI model

Open edumesones opened this issue 7 months ago • 4 comments

What component(s) are affected?

  • [ ] Python SDK
  • [x] Opik UI
  • [ ] Opik Server
  • [ ] Documentation

Opik version

  • Opik version: x.x.x

Describe the problem

I'm using the track_crewai function and costs are not displayed; the OpenAI model used is o1-preview. I would also be delighted to learn more about the tools I can use with CrewAI, such as adding metrics and everything else.
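For context, a minimal CrewAI + Opik setup along these lines would be expected to produce LLM spans (and costs, when the model is supported). This is a hedged sketch based on Opik's documented CrewAI integration; the project name, agent fields, and task text are illustrative, not taken from the reporter's code:

```python
# Hedged sketch of a CrewAI run traced with Opik's track_crewai integration.
# All names below (project_name, role, goal, task text) are illustrative.
from crewai import Agent, Crew, Task
from opik.integrations.crewai import track_crewai

track_crewai(project_name="crewai-cost-demo")  # enable Opik tracing for CrewAI

researcher = Agent(
    role="Researcher",
    goal="Summarize a topic",
    backstory="An analyst agent.",
    llm="o1-preview",  # the model the reporter used
)
task = Task(
    description="Summarize recent LLM news.",
    expected_output="A short summary.",
    agent=researcher,
)
crew = Crew(agents=[researcher], tasks=[task])
result = crew.kickoff()  # spans (and, if tracked, costs) should appear in the Opik UI
```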

Reproduction steps and code snippets

No response

Error logs or stack trace

No response

Healthcheck results

No response

edumesones avatar May 06 '25 11:05 edumesones

Hi @edumesones! Could you please share the LLM span data for that model? e.g. screenshots from the UI + metadata tab.

alexkuzmik avatar May 06 '25 15:05 alexkuzmik

[Image: screenshot of the trace]

ID: 01969b31-c31c-78a6-ac13-9aabeece53e9

edumesones avatar May 07 '25 05:05 edumesones

Hi @edumesones! Looking at the trace you shared, it seems that we are not tracking the LLM calls, which is why we are not seeing the cost. Can you share a small reproducible script so we can take a look?

jverre avatar May 07 '25 15:05 jverre

@edumesones Hey there! Comet SDK engineer here. To better understand the issue, it would be really helpful to have a code example. That said, I’ve tested the O1 model and agents in CrewAI. Here’s what I found:

  • The O1 model works fine
  • Cost tracking also works properly

However, models like O1 Preview, O1 Mini, and possibly others can run into compatibility issues in CrewAI, especially when used for agent creation.

In some cases, the code may crash with an error like:

raise BadRequestError(
    message=f"{exception_provider} - {message}",
    llm_provider=custom_llm_provider,
    model=model,
    response=getattr(original_exception, "response", None),
    litellm_debug_info=extra_information,
    body=getattr(original_exception, "body", None),
)
litellm.exceptions.BadRequestError: OpenAIException - Unsupported parameter: 'stop' is not supported with this model.
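As a hedged workaround sketch (not something the thread confirms CrewAI exposes directly): litellm has a drop_params setting that tells it to silently drop request parameters, such as 'stop', that the target model does not support. Whether this resolves the o1-preview crash inside CrewAI's agent path is an assumption to be verified:

```python
# Hedged sketch: mitigating "Unsupported parameter: 'stop'" via litellm's
# drop_params option. Applicability to CrewAI's internal calls is assumed.
import litellm

litellm.drop_params = True  # globally drop params unsupported by the model

# Or per call (illustrative arguments):
# litellm.completion(
#     model="o1-preview",
#     messages=[{"role": "user", "content": "hello"}],
#     stop=["\n"],          # would normally trigger the BadRequestError
#     drop_params=True,     # dropped instead of raising
# )
```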

If you could please share a snippet that reproduces the issue, I’d be happy to take a closer look!

japdubengsub avatar May 13 '25 09:05 japdubengsub

Hi @edumesones, it seems that the latest version of CrewAI supports the o1 model family, and we also support computing its cost. Could you try updating both CrewAI and Opik and let us know if you are still facing issues?

Lothiraldan avatar Aug 26 '25 14:08 Lothiraldan