Arun
@ItzCrazyKns It seems the WebSocket server is not defined in any of the provided Dockerfiles or in docker-compose.yaml. So how do I start and handle the WebSocket...
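For reference, the backend log further down ("WebSocket server started on port 3001") suggests the WebSocket server is created inside `dist/app.js` itself rather than as a separate compose service. A minimal sketch of that pattern, assuming the `ws` package and a hypothetical echo handler (not Perplexica's actual code):

```ts
// Sketch only: an HTTP API and a WebSocket server sharing one port,
// so a single container and a single published port cover both.
import { createServer } from 'http';
import { WebSocketServer } from 'ws';

const server = createServer((req, res) => {
  // Placeholder API response; the real app routes requests here.
  res.writeHead(200, { 'Content-Type': 'application/json' });
  res.end(JSON.stringify({ status: 'ok' }));
});

// Attaching the WebSocket server to the existing HTTP server means
// upgrade requests arrive on the same port (3001).
const wss = new WebSocketServer({ server });
wss.on('connection', (socket) => {
  socket.on('message', (data) => socket.send(data)); // echo, for illustration
});

server.listen(3001, () => console.log('WebSocket server started on port 3001'));
```

If the app follows this pattern, no extra compose service is needed; publishing port 3001 on the backend container covers both the HTTP API and WebSocket traffic.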
Also, how do I make sure the API server is accessible from the browser at http://127.0.0.1:3001/?

> Hi, I understand the problem and I would recommend you follow the...
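On the second question: if the backend serves both the API and the WebSocket on 3001, the compose-level requirement is just a host port mapping. A sketch with a hypothetical service layout (not the repo's actual file):

```yaml
# Sketch only — service name and build context are assumptions.
services:
  perplexica-backend:
    build: .
    ports:
      # Publish the container's port 3001 on the host loopback so the
      # browser can reach both the HTTP API and the WebSocket there.
      - "127.0.0.1:3001:3001"
```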
@ItzCrazyKns From VM Ubuntu bash (`~/Perplexica$ docker compose logs`), the perplexica-backend output:

```
...
perplexica-perplexica-backend-1  | yarn run v1.22.19
perplexica-perplexica-backend-1  | $ node dist/app.js
perplexica-perplexica-backend-1  | WebSocket server started on port 3001
perplexica-perplexica-frontend-1...
```
@ItzCrazyKns With config.toml in place, there are still errors from the Perplexica backend Docker container (the WebSocket/API server): `Error loading Ollama models: TypeError: fetch failed`, and the connection is closed, despite the...
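In case it helps others hitting the same `fetch failed`: inside the backend container, 127.0.0.1 refers to the container itself, not the host, so a config.toml pointing Ollama at http://127.0.0.1:11434 will fail exactly like this. A sketch of the usual workaround; the section and key names here are assumptions, so check sample.config.toml in the repo:

```toml
# Assumption: config.toml exposes the Ollama endpoint under a section like this.
[API_ENDPOINTS]
# host.docker.internal resolves to the Docker host from inside the container.
# On Linux you may also need to add
#   extra_hosts: ["host.docker.internal:host-gateway"]
# to the backend service in docker-compose.yaml.
OLLAMA = "http://host.docker.internal:11434"
```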
@ItzCrazyKns Could you please share how you resolved this issue? Was anything else done to fix this WebSocket problem in particular, as I still...
@ItzCrazyKns Sorry for the confusion. As you already mentioned, it's under development, so I don't think it can be used in a production environment. Also, what I meant...
> Hello, could you please respond to this issue within the next 24 hours? If I don't hear back from you by then, I'll assume that everything is resolved and...
> > This is not working on GitHub Codespaces. Why? Anyway, to..
>
> GitHub Codespaces isn't meant to run things in production, so we'll not talk about it here...
> It might be because the WebSocket connection couldn't be established for configuration reasons. Please update those settings in the Docker compose file before rebuilding the images. And yes, Perplexica...
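If the misconfiguration is the frontend's WebSocket/API URLs, the compose-level fix presumably looks something like the sketch below; the build-arg names are my assumption of what the frontend reads, so verify them against the actual docker-compose.yaml before rebuilding:

```yaml
# Sketch: build args pointing the frontend at the backend's published port.
services:
  perplexica-frontend:
    build:
      context: .
      args:
        - NEXT_PUBLIC_API_URL=http://127.0.0.1:3001/api
        - NEXT_PUBLIC_WS_URL=ws://127.0.0.1:3001
```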
@ItzCrazyKns These APIs for Ollama LLMs, Azure OpenAI LLMs, and embedding models still don't work at all despite these changes.