LLocalSearch
LLocalSearch is a completely locally running search aggregator that uses LLM agents. The user asks a question and the system uses a chain of LLMs to find the answer. The user can see the progress o...
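As a rough illustration of that flow, here is a minimal Go sketch of a question being answered by chaining a metasearch lookup with a local model. Everything below (function names, the prompt format, the stub implementations) is assumed for illustration and is not LLocalSearch's actual code.

```go
// Minimal sketch of the described flow: the question goes to a metasearch
// backend, and a local LLM turns the results into an answer.
// All names here are illustrative, not LLocalSearch's real API.
package main

import (
	"context"
	"fmt"
	"strings"
)

type searcher func(ctx context.Context, query string) ([]string, error)
type generator func(ctx context.Context, prompt string) (string, error)

// answer chains the two steps: search first, then summarize the results.
func answer(ctx context.Context, search searcher, generate generator, question string) (string, error) {
	results, err := search(ctx, question)
	if err != nil {
		return "", fmt.Errorf("search: %w", err)
	}
	prompt := fmt.Sprintf("Answer using these sources:\n%s\nQuestion: %s",
		strings.Join(results, "\n"), question)
	return generate(ctx, prompt)
}

func main() {
	// Stub implementations so the sketch runs on its own.
	search := func(ctx context.Context, q string) ([]string, error) {
		return []string{"example result for: " + q}, nil
	}
	generate := func(ctx context.Context, p string) (string, error) {
		return "stub answer based on:\n" + p, nil
	}
	out, _ := answer(context.Background(), search, generate, "what is SearXNG?")
	fmt.Println(out)
}
```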

**Describe the bug** When I run a search, it errors when trying to connect to the searxng container. **To Reproduce** Steps to reproduce the behavior: 1. git clone https://github.com/nilsherzig/LLocalSearch 2....
Hmmm it seems to just ignore the prompt entirely.
Not sure if this has already been suggested, but I searched "history" on the issues page and nothing really showed up.
Pretty much what the title says, haha. It's glitchy as the model types, and the whole page bounces up and down!
Usefull -> Useful
Use [conc](https://github.com/sourcegraph/conc) to simplify the concurrency code a bit, plus some cleanup.
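For context, a hedged sketch of the kind of simplification conc enables: a goroutine fan-out without the manual `sync.WaitGroup` Add/Done bookkeeping, with panics in child goroutines recovered for you. The function and variable names are made up for illustration and are not taken from the LLocalSearch codebase.

```go
// Fan-out over several sources using conc.WaitGroup instead of sync.WaitGroup.
package main

import (
	"fmt"

	"github.com/sourcegraph/conc"
)

// fetch is a stand-in for whatever per-source work the real code does.
func fetch(source string) string {
	return "results from " + source
}

func main() {
	sources := []string{"searxng", "vector-db", "cache"}
	results := make([]string, len(sources))

	// wg.Go wraps each goroutine; wg.Wait blocks until every callback returns.
	var wg conc.WaitGroup
	for i, src := range sources {
		i, src := i, src // capture loop variables (pre-Go 1.22 style)
		wg.Go(func() {
			results[i] = fetch(src)
		})
	}
	wg.Wait()

	fmt.Println(results)
}
```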
This is lovely! It makes me very happy to see a new AI-related project using Vite.
**Describe the bug** When using `mistral:latest` as the model, I get repetitive output from the agents. Examples: I didn't notice this behavior...
**Describe the bug** Running the latest version, I open the web interface at http://localhost:3000, but the search does not show the progress or the final answer....