ShadowArcanist
### Description Hi! On the README file https://github.com/Mintplex-Labs/anything-llm/blob/master/README.md the supported embedding models list mentions "LM Studio", but on AnythingLLM v1.4.4 (macOS) I can't see LM Studio as an Embedding Provider....
In update v1.5.3 we added 2 new LLMs and 1 new embedding model.

LLMs:
- Cohere
- KoboldCPP

Embedding Model:
- Cohere

I am going to add these...
### Pull Request Type
- [x] 📝 docs

### What is in this change?
- [x] Added KoboldCPP to supported LLMs list
- [x] Added Cohere to supported LLMs list...
Let's say I am using Ollama as the LLM Provider and have the following models installed:
- Yarn Mistral >> 128K context window
- Mistral 7B >> 32K context window
- Llama...
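The model list above can be sketched as a simple lookup table. This is an illustrative sketch only: the model keys and exact token counts are assumptions based on the figures quoted, not AnythingLLM's actual configuration.

```python
# Hypothetical mapping of installed Ollama models to their context windows.
# Token counts are illustrative, taken from the figures quoted above.
CONTEXT_WINDOWS = {
    "yarn-mistral": 128_000,  # Yarn Mistral >> 128K context window
    "mistral:7b": 32_000,     # Mistral 7B >> 32K context window
}

def fits_in_context(model: str, prompt_tokens: int) -> bool:
    """Return True if a prompt of the given token count fits the model's window."""
    return prompt_tokens <= CONTEXT_WINDOWS.get(model, 0)

print(fits_in_context("mistral:7b", 40_000))    # False: exceeds the 32K window
print(fits_in_context("yarn-mistral", 40_000))  # True: fits in the 128K window
```

A per-model table like this is why a single global token limit is awkward when models with very different windows are installed side by side.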
### Pull Request Type
- [x] ✨ feat
- [x] 🐛 fix
- [x] ♻️ refactor

### Relevant Issues
resolves #22 #28

### What is in this change?
#### ♻️...
On our [docs](https://docs.useanything.com/) we only have an uninstall guide for [macOS](https://docs.useanything.com/getting-started/installation/desktop/macos#uninstalling-the-application), so I am writing an uninstall guide (a new page that covers how to uninstall AnythingLLM on all platforms)...
On our Discord people often ask, "How can I access AnythingLLM from my mobile phone or a different machine?" Users can access AnythingLLM by exposing their localhost to the internet by...
On our Discord people often say, "I visited http://localhost:11434/ and it shows Ollama is running, but AnythingLLM is not showing Ollama models." The issue here is they entered http://localhost:11434/...
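A defensive client-side fix for this class of problem is to normalize the base URL before appending API paths, so a trailing slash (as in the URL users paste from their browser) can't produce a malformed endpoint like `http://localhost:11434//api/tags`. A minimal sketch, assuming a hypothetical helper rather than AnythingLLM's actual code (Ollama does list installed models at `/api/tags`):

```python
def normalize_base_url(url: str) -> str:
    # Strip trailing slashes so joining paths never yields "//"
    return url.rstrip("/")

def model_list_endpoint(base_url: str) -> str:
    # Ollama lists installed models at /api/tags
    return normalize_base_url(base_url) + "/api/tags"

print(model_list_endpoint("http://localhost:11434/"))
# → http://localhost:11434/api/tags
```

The same normalized URL is produced whether or not the user included the trailing slash, which removes one common source of "Ollama is running but no models show up" reports.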
On our Discord we get a lot of questions like:
- "Agents were not working for me"
- Agents were saying "I can't use the internet"

In most cases the user is:...
Currently, the only way to zoom in is by using `cmd or ctrl +`, which zooms in on all blocks in the UI. Sometimes I want to zoom in on...