papiche

33 comments by papiche

These are my settings. ![Settings](https://github.com/user-attachments/assets/0646224f-7eb1-4b81-9ae0-9c34b7aaccf2) What surprises me is that "open_ai" is the only option in the Chat model Provider dropdown.

I am using Perplexica on a GPU-equipped computer on my LAN. I used

```
perplexica-frontend:
  build:
    context: .
    dockerfile: app.dockerfile
    args:
      - NEXT_PUBLIC_API_URL=http://127.0.0.1:3001/api
      - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3001
```

as docker-compose parameters, then I...

![Settings2](https://github.com/user-attachments/assets/fb4184a4-e166-4549-8e28-955a17203915) I succeeded in connecting to Ollama with these parameters (OpenAI custom parameters), but I get no answers in the frontend. Still the same backend error:

```
error: Error loading Ollama models:...
```

Using the LAN address (which is 192.168.1.27, ports 5000/3001) in docker-compose.yaml, accessing without an SSH tunnel: ![ollama](https://github.com/user-attachments/assets/95d59f2c-ad41-4db7-b05a-f39f9cb9196e) IT WORKS
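For reference, a sketch of what the working build args might look like with the LAN address substituted in docker-compose.yaml — the IP and port are taken from this comment and are placeholders for your own network:

```yaml
perplexica-frontend:
  build:
    context: .
    dockerfile: app.dockerfile
    args:
      # LAN address of the GPU machine (placeholder from this setup);
      # replace 127.0.0.1 with the address other LAN hosts can reach.
      - NEXT_PUBLIC_API_URL=http://192.168.1.27:3001/api
      - NEXT_PUBLIC_WS_URL=ws://192.168.1.27:3001
```

The point being that addresses baked into the frontend at build time must be reachable from the browser, not just from inside the container.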

Is there any way to access Perplexica from the WAN?

OK. At least, it needs a WebSocket relay for port 3001. Do you plan to add user access control?
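One way to relay the WebSocket port for WAN access is a reverse proxy that forwards the upgrade handshake. A minimal sketch, assuming nginx — the domain, upstream address, and TLS setup are placeholders, not part of Perplexica:

```nginx
# Hypothetical nginx relay for Perplexica's WebSocket backend (port 3001).
# perplexica.example.com and 192.168.1.27 are placeholders.
server {
    listen 443 ssl;
    server_name perplexica.example.com;

    location / {
        proxy_pass http://192.168.1.27:3001;
        # Headers required for the WebSocket upgrade handshake
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```

Note this only relays traffic; it provides no user access control on its own.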

I'll try this other branch. Thanks! And so many thanks to you and the wonderful FOSS you made! You help AI become a "common good".

It also happens to me regularly: just wait at the prompt and after a while "Failed to connect to server" appears. ![Screenshot from 2024-09-25 11-57-39](https://github.com/user-attachments/assets/a8f6814e-e2e1-4224-8e97-6b402871fe71) In the console,...

You are right, it could be network quality, and it can happen on a LAN too (Wi-Fi shared with roommates), so maybe some retries instead of a direct timeout could help....
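The retry idea could be sketched as a capped exponential backoff before giving up on the WebSocket. A minimal TypeScript sketch — the function names and defaults are hypothetical, not Perplexica's actual API:

```typescript
// Hypothetical retry policy: instead of surfacing "Failed to connect to
// server" on the first timeout, wait with capped exponential backoff and
// reconnect. Names and default values here are illustrative.

/** Delay in ms before retry attempt `attempt` (0-based), capped at maxMs. */
function backoffDelay(attempt: number, baseMs = 500, maxMs = 10_000): number {
  return Math.min(baseMs * 2 ** attempt, maxMs);
}

/** Full schedule of delays for one round of reconnect attempts. */
function backoffSchedule(maxAttempts: number): number[] {
  return Array.from({ length: maxAttempts }, (_, i) => backoffDelay(i));
}
```

In the frontend, the `onclose` handler could wait `backoffDelay(n)` before reopening the socket, resetting `n` after a successful open, and only show the error once the schedule is exhausted.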

> fwiw, I ran into a similar error and what fixed it for me was changing the base image of `node` that runs from within `backend.dockerfile`. Essentially changing it to...