
make setup-config ignores "Enter your LLM Base URL" response

tpsjr7 opened this issue 1 year ago · 1 comment

Describe the bug

When you run make setup-config, the value entered at the "Enter your LLM Base URL" prompt is never written to config.toml, so it is silently ignored.

Setup and configuration

Current version:

492feecb67e825e9cf27d363c1de27edd65accc7

My operating system: WSL2 using bash

Steps to Reproduce: Run "make setup-config" and enter the values shown below.

Logs, error messages, and screenshots:

make[1]: Entering directory '/home/user/code/OpenDevin'
Enter your LLM Model name (see https://docs.litellm.ai/docs/providers for full list) [default: gpt-3.5-turbo-1106]:
Enter your LLM API key: na
Enter your LLM Base URL [mostly used for local LLMs, leave blank if not needed - example: http://localhost:5001/v1/]: http://localhost:5000/v1
/bin/sh: 2: [[: not found
Enter your LLM Embedding Model
Choices are openai, azureopenai, llama2 or leave blank to default to 'BAAI/bge-small-en-v1.5' via huggingface
>
Enter your workspace directory [default: ./workspace]:
make[1]: Leaving directory '/home/user/code/OpenDevin'

The resulting config.toml has no entry for the LLM base URL at all:

LLM_MODEL="gpt-3.5-turbo-1106"
LLM_API_KEY="na"
LLM_EMBEDDING_MODEL=""
WORKSPACE_BASE="./workspace"
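
For comparison, I would expect the generated config.toml to also carry the base URL, roughly like the snippet below. LLM_BASE_URL is my guess at the key name, following the pattern of the other LLM_* keys; the exact key OpenDevin reads may differ. Adding that line by hand is a possible workaround while the Makefile prompt is broken.

LLM_MODEL="gpt-3.5-turbo-1106"
LLM_API_KEY="na"
LLM_BASE_URL="http://localhost:5000/v1"
LLM_EMBEDDING_MODEL=""
WORKSPACE_BASE="./workspace"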

Additional Context

I'm trying to host my own LLM with the oobabooga web-ui, following this guide: https://github.com/OpenDevin/OpenDevin/commit/08a2dfb01af1aec6743f5e4c23507d63980726c0#commitcomment-140559598, so I need OpenDevin to point to localhost.

tpsjr7 · Apr 19 '24 13:04

/bin/sh: 2: [[: not found

That's suspicious. It's probably a syntax error in the Makefile.

rbren · Apr 19 '24 13:04
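
For context, here is a minimal sketch of the failure mode rbren is pointing at, assuming the recipe uses a bash-only [[ ... ]] test. This is an illustration, not the actual OpenDevin Makefile. GNU make runs recipe lines with /bin/sh by default (dash on many systems), which has no [[ builtin, so the test fails with "[[: not found" and the line that would append the base URL never runs.

# Hypothetical recipe for illustration only -- not the real OpenDevin Makefile.
setup-config:
	@printf "Enter your LLM Base URL: "; \
	read llm_base_url; \
	if [[ -n "$$llm_base_url" ]]; then \
		echo "LLM_BASE_URL=\"$$llm_base_url\"" >> config.toml; \
	fi
# Under /bin/sh the [[ command is not found (exit status 127), the if-body is
# skipped, and the recipe still exits 0, so make carries on and the base URL
# is never written to config.toml -- matching the log above.
#
# Two possible fixes:
#   SHELL := /bin/bash               # run recipes with bash so [[ ... ]] works
#   if [ -n "$$llm_base_url" ]; ...  # or switch to the POSIX test builtin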