Timmo
can you try the fix branch?
Hi, the basic compile and upload procedure isn't much different from any other ESP8266 project:

1. Download Visual Studio Code
2. Install the PlatformIO extension (in VSCode itself => View => Extensions)
3. ...
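Once PlatformIO is installed, a minimal `platformio.ini` for an ESP8266 build might look like the sketch below (environment and board name are assumptions; use the values from the project's own `platformio.ini`):

```ini
[env:nodemcuv2]
platform = espressif8266   ; pulls in the ESP8266 Arduino core
framework = arduino
board = nodemcuv2          ; assumed board; pick yours from the PlatformIO board list
```

With that in place, PlatformIO's build and upload buttons in VSCode do the rest.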
have you tried the "fix" branch?
Mmh, I just updated everything (ESP8266 core 2.6.3) and it compiles fine. Sorry, I can't reproduce it. Even with your lib_deps it builds fine.
Is this really fixed? I can't confirm it. I'm using ollama 0.1.34 and OpenWebUI v0.1.124. Here is my conversation with two pictures: It seems like it mixed the context...
Same issue here. After creating a new agent, some stuff is downloaded, including a PyTorch model. So it seems the backend wants to run a model itself (spaCy?)....
Same issue here. I'm running frontend and backend on a different machine in my network, "agent-llm.fritz.box". My compose file looks like this:

```yaml
version: "3.8"
services:
  frontend:
    build: ./frontend
    init: ...
```
Sorry, my fault. I didn't realize that NEXT_PUBLIC_API_URI is basically hardcoded into the frontend at build time. So before building the frontend container, NEXT_PUBLIC_API_URI has to be set properly.
Try creating a ".env.local" file in the frontend directory with "NEXT_PUBLIC_API_URI=http://yourhostname.local:7437" and then rebuild everything.
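For reference, `frontend/.env.local` would then contain just that one line (hostname and port taken from my comment above; adjust them to your own network):

```
NEXT_PUBLIC_API_URI=http://yourhostname.local:7437
```

Since `NEXT_PUBLIC_*` values are baked in when the frontend is built, the frontend container has to be rebuilt after changing this file.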
> Do you have the link to the plugin?

It is built-in already: https://github.com/oobabooga/text-generation-webui/tree/main/extensions/openai When I use Langchain in Python, I just have to set the OPENAI_API_KEY and OPENAI_API_BASE environment variables...
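As a minimal sketch of that setup (the address and port are assumptions; check the console output of the openai extension for the actual URL it serves):

```python
import os

# Must be set BEFORE constructing the client.
# The key is ignored by text-generation-webui, but most clients insist on one.
os.environ["OPENAI_API_KEY"] = "sk-dummy"
# Assumed address of the openai extension; adjust host/port to your setup.
os.environ["OPENAI_API_BASE"] = "http://127.0.0.1:5001/v1"

# Any OpenAI-compatible client now talks to the local server instead of
# api.openai.com, e.g. with Langchain:
#   from langchain.llms import OpenAI
#   llm = OpenAI()
#   print(llm("Hello"))
print(os.environ["OPENAI_API_BASE"])
```

The point is that no Langchain-specific configuration is needed; the OpenAI client libraries pick these variables up on their own.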