Groq is not listed as a provider in the default litellm list that populates the models.
Describe the bug
Set the default model to the Groq model in the config.toml file per the litellm docs: https://docs.litellm.ai/docs/providers/groq. The model does not appear in the list, and using the Mixtral model doesn't work.
Steps to Reproduce
- Set config values in the config file to (shown as a config.toml sketch below the list):
  LLM_MODEL="groq/mixtral-8x7b-32768"
  LLM_BASE_URL="https://api.groq.com/openai/v1/"
- Run OpenDevin
- Review the list of models
- The groq models are not in the list.
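For reference, here is roughly how those values would look in config.toml (only the two keys named above; anything else, such as where the API key goes, depends on your OpenDevin setup):

```toml
# Route model calls through litellm's Groq provider
LLM_MODEL = "groq/mixtral-8x7b-32768"
# Groq's OpenAI-compatible endpoint
LLM_BASE_URL = "https://api.groq.com/openai/v1/"
```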
Expected behavior
Groq should render in the list, or the model should be pre-selected based on the default config values.

Actual behavior
GPT-4 gets pre-selected.

Additional context
Not sure if the issue is due to litellm or OpenDevin, but this is either a usability/user-experience problem or a technical issue with litellm. When you set a default value in the config file, the expected behavior is that the model from the config file should be pre-selected.
I just had the same issue while trying to use groq with opendevin.
Have you found a way to use the model?
I found a very hacky way of doing it: by inspecting the select input, changing the value of an option tag using the dev tools, and then selecting it. It does send the correct request and load the selected model.
But I agree, a model specified in the config should be selected automatically! Even better, it would be great to be able to provide a list of LLMs to add to the select input, maybe something like LLM_MODELS="model1,model2", so that the LLM_MODEL variable would select the default one when initializing the assistant.
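A rough sketch of how that could look in config.toml (LLM_MODELS is a hypothetical option for illustration only, not something OpenDevin currently supports):

```toml
# Hypothetical: additional models to populate the model selector with
LLM_MODELS = "groq/mixtral-8x7b-32768,gpt-4"
# Existing: the model pre-selected when the assistant is initialized
LLM_MODEL = "groq/mixtral-8x7b-32768"
```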
This no longer works with the latest version, which has an autocomplete. I have not found a way to use the model. I suspect the issue is with LiteLLM, which is not returning its full list.
I solved it. The model is now stored in localStorage instead of a select dropdown, so you can add the model there.
But it currently struggles to write hello world. The thought process is there, but I don't see a written file in the workspace.
Leaving this open for now because the issue isn't solved.
@rkeshwani did you set LLM_API_KEY with your Groq key? Also, I don't think you should set the base URL; that is only for local setups. LiteLLM should take care of which endpoints to call.
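Something like this in config.toml, assuming litellm resolves the Groq endpoint from the groq/ model prefix (the key value is a placeholder, not a real key):

```toml
# litellm routes "groq/..." models to Groq's API on its own
LLM_MODEL = "groq/mixtral-8x7b-32768"
# Your Groq API key (placeholder)
LLM_API_KEY = "gsk_your_groq_api_key_here"
# LLM_BASE_URL intentionally omitted; it should only be needed for local/self-hosted endpoints
```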
Looks like groq is now in the list. Thanks y'all