
[Bug]: xai/grok-beta model getting litellm.BadRequestError

Open star8618 opened this issue 1 year ago • 5 comments

Is there an existing issue for the same bug?

  • [X] I have checked the existing issues.

Describe the bug and reproduction steps

litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=xai/grok-beta Pass model as E.g. For 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..) Learn more: https://docs.litellm.ai/docs/providers

OpenHands Installation

Docker command in README

OpenHands Version

No response

Operating System

None

Logs, Errors, Screenshots, and Additional Context

No response

star8618 avatar Nov 11 '24 02:11 star8618

Are you planning to enable using Ollama too?

zdehasek avatar Dec 23 '24 21:12 zdehasek

> Are you planning to enable using Ollama too?

No immediate plans.

Shpigford avatar Dec 23 '24 22:12 Shpigford

Hi @Shpigford, curious how the AI feature will work. Is there any estimate on timing for the first release?

asdasdad23332s avatar Jan 14 '25 22:01 asdasdad23332s

Will this work only with OpenAI, or with other LLMs too?

asdasdad23332s avatar Jan 14 '25 22:01 asdasdad23332s

I just found this interesting ruby gem

https://github.com/icebaker/ruby-nano-bots

It's a library that provides a single interface to multiple chatbots, among them ChatGPT and Ollama. It also seems to make configuring assistants very easy. I think it's a good idea not to concentrate only on ChatGPT:

  • No hard dependency on one provider (useful for hosted setups if OpenAI raises prices)
  • Handing over your data is one thing, but not everyone wants to give it to OpenAI on top of that
  • With some extra effort, hardcore self-hosters can run it fully independently

M123-dev avatar Feb 07 '25 11:02 M123-dev