Timothy Carambat
@atljoseph Understandable, and I was on the same page as you: this was your specific endeavor and use case, and it may not be portable to others. The PR still...
The only public port that should be exposed by the Docker container running AnythingLLM in any template is 3001. Definitely don't expose 8888, and looking at the current K8s community...
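A minimal sketch of what this looks like in a Compose file, assuming a typical single-container setup (the image tag and volume name here are illustrative, not prescriptive):

```yaml
# Hypothetical compose sketch: only 3001 is published to the host.
# Any internal port (e.g. 8888) is intentionally left unpublished.
services:
  anythingllm:
    image: mintplexlabs/anythingllm
    ports:
      - "3001:3001"   # the only port that should be public
    volumes:
      - anythingllm_storage:/app/server/storage

volumes:
  anythingllm_storage:
```

The same principle applies to a Kubernetes Service: expose only a single `port`/`targetPort` pair mapped to 3001.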
I see you are on `Local Development`. Did you go through onboarding, and/or do you have an `LLM_PROVIDER` value set in the env? For local dev this is `server/.env.development` when running...
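For illustration, a `server/.env.development` with a provider set might look something like the sketch below. The specific keys depend on which provider you picked during onboarding; the values shown here are placeholders, not real credentials:

```
# Hypothetical server/.env.development sketch (placeholder values).
# Which provider-specific keys you need depends on LLM_PROVIDER.
LLM_PROVIDER='openai'
OPEN_AI_KEY='sk-...'
```

If `LLM_PROVIDER` is missing or empty here, the server has no provider to route chats to, which matches the behavior described above.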
If you are running development (`yarn dev:server`) and make a code change in the backend, it will hot-reload the server, and whatever `ENV` you may have saved will revert...
User-agent for which interaction? Streaming chats or something else?
@jazelly the `pageContent` of the associated document is empty?
> Issue faced with local deployment as well. LLM responses are poor.

This has nothing to do with the deployment method or RAG structure; the RAG results are bad because the...
Can `Oobabooga` serve more than a single model at once? The reason the field does not exist is that, when integrating, we did not see an option to hot-load multiple...
@RahSwe what exactly are these files you are trying to upload? Unable to reproduce otherwise.
We do not currently have any plans to build a mobile app. We feel this is handled by the mobile browser UI of the Docker deployment (which does need improvement). The current...