[Feature] Web search for Ollama
Chatbox AI and Google's Flash models already support web browsing, and it provides a great experience. Could web browsing support also be added for Ollama models (especially those that support tool use)? I believe allowing large language models to search for information online is a crucial step towards making AI more practical.
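For the Ollama models that support tool calling, I imagine the wiring on Chatbox's side could look roughly like the sketch below. This is just an illustration using the official `ollama` Python client's tool-calling support; the `web_search` tool, its search backend, and the `llama3.1` model name are placeholders, not an existing Chatbox or Ollama feature:

```python
# Hypothetical sketch: expose a web_search tool to a tool-capable Ollama model,
# run the search when the model asks for it, and feed the results back.
import json
import ollama

def web_search(query: str) -> str:
    """Placeholder: call whatever search provider the app uses
    (SearXNG, Bing, etc.) and return the top results as JSON."""
    return json.dumps([{"title": "example result",
                        "url": "https://example.com",
                        "snippet": "..."}])

tools = [{
    "type": "function",
    "function": {
        "name": "web_search",
        "description": "Search the web and return the top results as JSON.",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}]

messages = [{"role": "user", "content": "What changed in Ollama 0.5?"}]
resp = ollama.chat(model="llama3.1", messages=messages, tools=tools)

# If the model requested the tool, execute it and append the result
# so the follow-up turn can answer with up-to-date information.
for call in (resp.message.tool_calls or []):
    if call.function.name == "web_search":
        messages.append(resp.message)
        messages.append({"role": "tool",
                         "name": "web_search",
                         "content": web_search(**call.function.arguments)})

final = ollama.chat(model="llama3.1", messages=messages)
print(final.message.content)
```

Models without tool support (or reasoning models that ignore the tool schema) would obviously need a different approach, but for tool-capable models this kind of loop seems like all that's missing.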
This is especially useful since the DeepSeek model can quite happily run locally on Ollama.
Yes please, why is this not a thing? Open WebUI has it (it's slow however).
It's on Desktop but can we have it on mobile?
It's not working on desktop either; it doesn't actually search there. I just got this from my DeepSeek-R1 model running locally with Ollama:
Looking up some sources—wait, I can't actually look things up right now, so I'll have to rely on what I know. From past studies, many alkaloids decompose in the presence of heat and moisture. Slightly acidic conditions might protonate them or cause other reactions that make them more or less stable.
I had confirmed that Web Search was enabled, I'm on desktop, and it obviously isn't using web search properly.
I've also mentioned that it doesn't work for DeepSeek. I think it's a common issue with deep reasoning models, because of the way they do
https://github.com/Bin-Huang/chatbox/issues/2123
They've recently updated Chatbox for mobile. It includes the fix that changes the UI of the web search button, but the search function itself is still not implemented.