LLocalSearch
LLocalSearch is a completely locally running search aggregator using LLM Agents. The user can ask a question and the system will use a chain of LLMs to find the answer. The user can see the progress of the agents and the final answer.
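A minimal sketch of the "chain of LLMs" idea, purely for illustration and not the actual LLocalSearch code: a fixed pipeline of steps streams progress events to the UI while it works towards an answer. The step names, the `Progress` type, and the `callLLM` helper are all hypothetical placeholders.

```go
// Hypothetical sketch of a chained-agent search run that reports progress.
package main

import "fmt"

// progress event pushed to the UI while the chain runs
type Progress struct {
	Step   string
	Detail string
}

// callLLM stands in for a call to a local model (e.g. via Ollama); here it just echoes.
func callLLM(prompt string) string {
	return "LLM output for: " + prompt
}

// answer runs a fixed chain of steps and reports progress on the channel.
func answer(question string, progress chan<- Progress) string {
	defer close(progress)

	progress <- Progress{"rephrase", "turning the question into search queries"}
	queries := callLLM("rephrase as search queries: " + question)

	progress <- Progress{"search", "querying the search backend"}
	results := "search results for: " + queries // real code would search / scrape here

	progress <- Progress{"summarize", "condensing the results into an answer"}
	return callLLM("answer the question using: " + results)
}

func main() {
	progress := make(chan Progress)
	done := make(chan string)
	go func() { done <- answer("what is a monad?", progress) }()

	for p := range progress {
		fmt.Printf("[%s] %s\n", p.Step, p.Detail)
	}
	fmt.Println(<-done)
}
```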
Expose filters like "programming" in the user interface. This would allow us to only scrape "trusted" or topic-relevant sites, such as Stack Overflow, for example (see the sketch below).
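One way such a filter could work is a per-topic domain allowlist applied to search results before scraping. This is only a sketch under that assumption; the filter name and domain lists are made-up examples, not a shipped feature.

```go
// Hypothetical topic filter: only scrape results whose host is on a trusted list.
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// trusted domains per user-selected filter
var trustedByFilter = map[string][]string{
	"programming": {"stackoverflow.com", "developer.mozilla.org"},
}

// allowed reports whether a result URL belongs to a trusted domain for the filter.
func allowed(filter, rawURL string) bool {
	u, err := url.Parse(rawURL)
	if err != nil {
		return false
	}
	for _, domain := range trustedByFilter[filter] {
		if u.Hostname() == domain || strings.HasSuffix(u.Hostname(), "."+domain) {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(allowed("programming", "https://stackoverflow.com/questions/1")) // true
	fmt.Println(allowed("programming", "https://example.com/some-blog"))         // false
}
```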
Accepts a list of LLMs / settings and does table testing with them. Uses user input to vote on the results, in order to find good configurations by brute force.
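A rough sketch of how such brute-force "table testing" could look: run every model/setting combination on the same question and let the user vote on each answer. Everything here (the `Config` fields, the `runQuery` stub, voting over stdin) is an assumption about how the tool might be structured.

```go
// Hypothetical brute-force sweep over model/setting combinations with user voting.
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

type Config struct {
	Model       string
	Temperature float64
}

// runQuery stands in for running the full agent chain with the given config.
func runQuery(c Config, question string) string {
	return fmt.Sprintf("answer from %s (temp %.1f)", c.Model, c.Temperature)
}

func main() {
	models := []string{"llama3", "mistral"}
	temps := []float64{0.2, 0.7}
	question := "what is a monad?"
	votes := map[Config]int{}

	in := bufio.NewReader(os.Stdin)
	for _, m := range models {
		for _, t := range temps {
			c := Config{Model: m, Temperature: t}
			fmt.Printf("%+v\n%s\nGood answer? [y/n]: ", c, runQuery(c, question))
			line, _ := in.ReadString('\n')
			if strings.HasPrefix(strings.TrimSpace(line), "y") {
				votes[c]++
			}
		}
	}
	fmt.Println("votes per configuration:", votes)
}
```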
Useful for bug reports. Should contain (a sketch follows below):
- model used
- git commit hash (version)
- recursion depth
- all used env vars
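A small sketch of a debug bundle a user could attach to a bug report. The field names, the build-time commit injection, and which env vars are collected are assumptions; a real version should only include relevant variables, never secrets.

```go
// Hypothetical debug report: model, commit, recursion depth, and selected env vars as JSON.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// set at build time, e.g.: go build -ldflags "-X main.gitCommit=$(git rev-parse HEAD)"
var gitCommit = "unknown"

type DebugReport struct {
	Model          string            `json:"model"`
	GitCommit      string            `json:"git_commit"`
	RecursionDepth int               `json:"recursion_depth"`
	EnvVars        map[string]string `json:"env_vars"`
}

func main() {
	report := DebugReport{
		Model:          "llama3",
		GitCommit:      gitCommit,
		RecursionDepth: 3,
		EnvVars:        map[string]string{"OLLAMA_HOST": os.Getenv("OLLAMA_HOST")},
	}
	out, _ := json.MarshalIndent(report, "", "  ")
	fmt.Println(string(out))
}
```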
Settings like #2 should be hidden behind a settings menu to avoid cluttering the main UI. Something like the settings popup from Claude?
This would make it possible to use apps other than Ollama, as there are plenty of backend apps and servers that are OpenAI API compatible.
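In practice this mostly means making the base URL configurable and speaking the OpenAI chat completions format. The sketch below assumes an `OPENAI_BASE_URL` env var (not an existing LLocalSearch setting) and uses Ollama's OpenAI-compatible endpoint as the default.

```go
// Hypothetical client for any OpenAI-compatible backend via a configurable base URL.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"os"
)

func main() {
	base := os.Getenv("OPENAI_BASE_URL") // e.g. http://localhost:8080/v1 for llama.cpp's server
	if base == "" {
		base = "http://localhost:11434/v1" // Ollama also exposes an OpenAI-compatible API here
	}

	body, _ := json.Marshal(map[string]any{
		"model":    "llama3",
		"messages": []map[string]string{{"role": "user", "content": "hello"}},
	})

	req, _ := http.NewRequest("POST", base+"/chat/completions", bytes.NewReader(body))
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer "+os.Getenv("OPENAI_API_KEY")) // many local servers ignore this

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()
	out, _ := io.ReadAll(resp.Body)
	fmt.Println(string(out))
}
```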