Alok Saboo
I also tried the URL https://smith.langchain.com/studio/?baseUrl=http://192.168.2.162:2024, but that gave...
I logged in to LangChain and I am still seeing the same error, even at https://smith.langchain.com/studio/?baseUrl=http://0.0.0.0:2024. The container logs don't have the 404 anymore.

```bash
Building ollama-deep-researcher @...
```
I updated `OLLAMA_BASE_URL=http://192.168.2.162:11434/` in the docker-compose file. Here are some of the fetch requests. Just to give you some idea about the setup, I have Ollama running on my...
Notice that even http://192.168.2.162:2024/info is blocked even though it works when I navigate to the URL.
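One possible explanation (an assumption, not confirmed in this thread): smith.langchain.com is served over HTTPS while the `baseUrl` points at plain HTTP, and browsers block such mixed-content fetches even though navigating to the URL directly works. A minimal sketch of the scheme mismatch:

```shell
# Hypothetical check: an HTTPS page cannot fetch from an HTTP baseUrl,
# so the browser blocks the request even though curl/direct navigation works.
page="https://smith.langchain.com/studio/"
base="http://192.168.2.162:2024"

case "$page" in
  https://*) page_scheme=https ;;
  *)         page_scheme=http  ;;
esac
case "$base" in
  https://*) base_scheme=https ;;
  *)         base_scheme=http  ;;
esac

if [ "$page_scheme" = https ] && [ "$base_scheme" = http ]; then
  echo "mixed content: the browser will block fetches to $base"
fi
```

If that is the cause, serving the graph over HTTPS (or using a browser/flag that permits mixed content for local development) would be the usual workaround.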
@cgmartin could be...here's what I see:  Not sure what to do about this.
I tried on the Mac with 127.0.0.1 and localhost, but same error.
```bash
$ interpreter -y --context_window 1000 --api_base "http://192.168.2.162:11434" --model ollama/codestral:22b-v0.1-f16 --api_key "fake_key"
[Errno 2] No such file or directory: 'ollama'
▌ Ollama not found
Please download Ollama from ollama.com to...
```
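The `[Errno 2] No such file or directory: 'ollama'` message suggests (my reading, not confirmed here) that with an `ollama/`-prefixed model, interpreter tries to invoke a local `ollama` binary, regardless of the remote `--api_base`. A quick sketch of the check:

```shell
# Sketch: see whether an `ollama` binary is on PATH on this machine,
# since the Errno 2 points at a missing local executable.
if command -v ollama >/dev/null 2>&1; then
  ollama_status="found at $(command -v ollama)"
else
  ollama_status="not on PATH"
fi
echo "ollama: $ollama_status"
```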
If I remove the `ollama/` prefix from the model name, I get a very different error.

```bash
$ interpreter -y --context_window 1000 --api_base http://192.168.2.162:11434/v1 --model llama3 --api_key "fake_key"
> list the files in...
```
Just to confirm that Ollama is working and that the model is available, the following works:

```bash
curl http://192.168.2.162:11434/api/chat -d '{
  "model": "codestral:22b-v0.1-f16",
  "messages": [
    { "role": "user", "content": "why is...
```
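A related check worth running: Ollama's `/api/tags` endpoint lists the installed models, and the `--model` value has to match a listed name exactly, tag included. A sketch, with sample JSON standing in for the live response of `curl -s http://192.168.2.162:11434/api/tags`:

```shell
# Sketch: verify the model name (including its tag) appears in the
# /api/tags listing. The JSON here is a stand-in for the real response.
tags_json='{"models":[{"name":"codestral:22b-v0.1-f16"},{"name":"llama3:latest"}]}'
model="codestral:22b-v0.1-f16"

if printf '%s' "$tags_json" | grep -q "\"name\":\"$model\""; then
  model_status="available"
else
  model_status="missing"
fi
echo "$model: $model_status"
```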
@quantumalchemy, why not create a PR against the main repo?