
Groq is not listed as a provider in the default litellm list that fetches the models.

rkeshwani opened this issue 1 year ago · 4 comments

Describe the bug

I set the default model to the Groq model in the config.toml file, per the litellm docs: https://docs.litellm.ai/docs/providers/groq. The model does not appear in the list, and using the mixtral model doesn't work.

Steps to Reproduce

  1. Set the config values in the config file to LLM_MODEL="groq/mixtral-8x7b-32768" and LLM_BASE_URL="https://api.groq.com/openai/v1/" (see the config sketch after this list)
  2. Run OpenDevin
  3. Review the list of models
  4. The groq models are not in the list.
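
For reference, a minimal sketch of the config entries from step 1 (the key names are taken from the values quoted there; the exact layout may vary between OpenDevin versions, and the API key value is a hypothetical placeholder):

```toml
# config.toml — minimal sketch based on the values in step 1.
LLM_MODEL = "groq/mixtral-8x7b-32768"
LLM_BASE_URL = "https://api.groq.com/openai/v1/"
LLM_API_KEY = "gsk_..."  # hypothetical placeholder for your Groq API key
```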

Expected behavior

Groq should appear in the list, or should be pre-selected based on the default config values.

Actual behavior

GPT-4 gets pre-selected.

Additional context

I'm not sure whether the issue lies with litellm or OpenDevin, but it is either a usability/user-experience problem or a technical issue with litellm. When you set a default value in the config file, the expected behavior is that the model gets pre-selected to that value.

rkeshwani avatar Mar 31 '24 15:03 rkeshwani

I just had the same issue while trying to use Groq with OpenDevin.

Have you found a way to use the model?

I found a very hacky way of doing it: inspecting the select input, changing the value of an option tag using the dev tools, and then selecting it. It does send the correct request and loads the selected model.
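
For anyone who wants to try it, a sketch of that dev-tools workaround, run from the browser console (the selector and option index are assumptions about the page markup, not OpenDevin's actual structure):

```js
// Dev-tools console sketch of the hack described above.
// The selector and option index are assumptions — inspect the real
// model dropdown to find the element your version renders.
const select = document.querySelector("select");
const option = select.options[0];
option.value = "groq/mixtral-8x7b-32768";   // overwrite an existing option
option.textContent = "groq/mixtral-8x7b-32768";
select.value = option.value;                 // select the rewritten option
select.dispatchEvent(new Event("change", { bubbles: true })); // notify the app
```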

But I agree, a model specified in the config should be selected automatically! Even better, it would be great to be able to provide a list of LLMs to add to the select input, maybe something like LLM_MODELS="model1,model2", so that the LLM_MODEL variable would select the default one when initializing the assistant.

thewebkit avatar Mar 31 '24 18:03 thewebkit

This no longer works with the latest version, which has an autocomplete. I have not found a way to use the model. I suspect the issue is with LiteLLM, which is not returning its full list.

rkeshwani avatar Mar 31 '24 20:03 rkeshwani

I solved it. The model is now stored in localStorage instead of a select dropdown, so you can add it there. But it currently struggles to write hello world: the thought process is there, but I don't see a written file in the workspace (screenshot attached). Leaving this open for now because the issue isn't solved.
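
A sketch of that localStorage workaround from the console (the key name is an assumption — check the Application tab in dev tools for the key your version actually uses):

```js
// Run in the dev-tools console on the OpenDevin page.
// "LLM_MODEL" is an assumed key name — verify it under
// Application > Local Storage before relying on it.
localStorage.setItem("LLM_MODEL", "groq/mixtral-8x7b-32768");
location.reload(); // reload so the UI picks up the stored value
```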

rkeshwani avatar Mar 31 '24 20:03 rkeshwani

@rkeshwani did you set LLM_API_KEY with your Groq key? Also, I don't think you should set the base URL; that is only for local setups. LiteLLM should take care of which endpoints to call.
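
In other words, something like this (a sketch following the advice above; the key value is a placeholder):

```toml
# config.toml — per the comment above: set the key, drop LLM_BASE_URL,
# and let LiteLLM route the request to Groq itself.
LLM_MODEL = "groq/mixtral-8x7b-32768"
LLM_API_KEY = "gsk_..."  # placeholder for your Groq API key
```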

goudbor avatar Apr 01 '24 09:04 goudbor

Looks like Groq is now in the list. Thanks, y'all!

rbren avatar Apr 05 '24 18:04 rbren