This seems to be an issue with litellm. I raised an issue and a PR in their repo: https://github.com/BerriAI/litellm/issues/2803 https://github.com/BerriAI/litellm/pull/2806
The upstream PR is now merged. If you use `litellm==1.34.22`, you will see Gemini in the dropdown. https://github.com/BerriAI/litellm/releases/tag/v1.34.22
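For anyone who wants to sanity-check this outside of OpenDevin, here is a minimal sketch against litellm's `completion` API; the `GEMINI_API_KEY` env var and the prompt are my assumptions, not something from this thread:

```python
# Sanity check that litellm can route to Gemini after the upstream fix.
# Assumes GEMINI_API_KEY is exported; the prompt is just illustrative.
import litellm

response = litellm.completion(
    model="gemini/gemini-pro",  # provider-prefixed model name
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```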
> https://github.com/OpenDevin/OpenDevin/pull/641 accidentally fixed the problem LOL.

No it didn't. This did: https://github.com/BerriAI/litellm/pull/2806
> but it gives an error during installation.

What's the error? Please fill in the [bug template](https://github.com/OpenDevin/OpenDevin/blob/main/.github/ISSUE_TEMPLATE/bug_report.md) when creating issues.
I can repro this issue. Steps:

Set env vars:

```
LLM_MODEL="gemini/gemini-pro"
LLM_API_KEY="A....."
```

Run:

```
PYTHONPATH="./" python opendevin/main.py -d ./workspace/ -i 100 -t "Write a bash script that prints hello...
```
My best guess is that `LLM_MODEL="gemini/gemini-pro"`, which the command-line invocation uses, is not the same as the `gemini-pro` value set from the UI.
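If it helps, here is a rough way to confirm that litellm routes the two strings differently; I'm assuming litellm's `get_llm_provider` helper in `litellm.utils`, and its return shape may vary by version:

```python
# Rough sketch: how litellm resolves a model string to a provider.
# Assumes litellm.utils.get_llm_provider; return shape may differ by version.
from litellm.utils import get_llm_provider

for name in ("gemini/gemini-pro", "gemini-pro"):
    try:
        resolved = get_llm_provider(model=name)
        # The second element of the returned tuple is the provider litellm picked.
        print(f"{name!r} -> provider {resolved[1]!r}")
    except Exception as exc:
        print(f"{name!r} -> {type(exc).__name__}: {exc}")
```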
https://github.com/OpenDevin/OpenDevin/pull/654 fixes this issue. Well, merging #654 _and_ selecting `gemini/gemini-pro` from the model dropdown (as opposed to `gemini-pro`), since both entries are available.
@rbren would you mind merging #654 when you have a chance, please?
#654 is now merged. @jpshack-at-palomar `gemini/gemini-pro` should be available from the UI dropdown now.
Would you know how Stack Graphs compare to [Lossless Semantic Trees](https://docs.openrewrite.org/concepts-explanations/lossless-semantic-trees)?