Lucas Gomide
@ChuckJonas which crewai version are you using? Can you try the latest one?
Could you please share the chromadb dependency version? `uv pip show chromadb`
@patricktu2 have you tried what @romellfudi said?
double checking.. what happens when you run this:

```python
llm = LLM(
    model="openai/sabia-3",
    temperature=0.7,
    base_url="https://chat.maritaca.ai/api",
    api_key="SABIA_API_KEY",
)
llm.call(messages=[{"role": "user", "content": "Hello, how are you?"}])
```

Does it work?
@tortolero-ruben do you mind sharing a real use case that it addresses?
hm gotcha.. Your `input` is well structured. It's exactly this [feature request](https://github.com/crewAIInc/crewAI/issues/2650), right?
hey @tspecht I hit the same issue last week with other models as well. I hope to push a solution by the end of the week
just submitted this PR to address that: https://github.com/crewAIInc/crewAI/pull/2742/files Now you can simply set `stop=None` directly in your LLM to remove this flag from LiteLLM calls
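For clarity, here's a toy sketch of the idea behind the PR (this is NOT crewAI's actual implementation, and `build_completion_kwargs` is a hypothetical helper name): when `stop` is `None`, the flag is simply omitted from the kwargs forwarded to the LiteLLM call, instead of being passed along as an unsupported parameter.

```python
# Toy illustration (not crewAI's real code) of dropping the `stop`
# flag from downstream LiteLLM call kwargs when the user sets stop=None.
def build_completion_kwargs(model, stop=None, **extra):
    kwargs = {"model": model, **extra}
    # Only forward `stop` when the user actually set it;
    # stop=None means "omit the flag entirely".
    if stop is not None:
        kwargs["stop"] = stop
    return kwargs

print(build_completion_kwargs("openai/sabia-3"))
# no 'stop' key in the output dict
print(build_completion_kwargs("openai/sabia-3", stop=["\n"]))
# 'stop' is forwarded only because it was explicitly set
```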
> [@lucasgomide](https://github.com/lucasgomide), hey, I know I'm pretty late with this suggestion, especially since the PR is already quite far along, and I've only tested this using o4 Mini on OpenRouter,...
> @lucasgomide with additional_drop_params=["stop"], it says "Tool is currently inaccessible." I tested it yesterday.. would you mind sharing how you are calling it?