binarynoise
Also, please make sure the full `field`/`field__container` is clickable, not just the ``.
I understand that he would rather connect to OpenAI-API-compatible servers that do not use the Ollama API.
> What are "OpenAI-API-compatible servers"?

Bodhi App seems to be one:

> It also exposes these LLM inference capabilities as OpenAI API compatible REST APIs.

@anagri would have to clarify...
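For context, "OpenAI-API-compatible" generally means the server accepts requests shaped like OpenAI's `/v1/chat/completions` endpoint. A minimal sketch of such a request body follows; the base URL and model name are placeholders, not taken from this thread:

```python
import json

# Build a chat-completions request body in the shape OpenAI-compatible
# servers expect. A client would POST this JSON to
# f"{BASE_URL}/v1/chat/completions".
BASE_URL = "http://localhost:8080"  # hypothetical server address
payload = {
    "model": "some-model",          # hypothetical model name
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
    "stream": False,
}
body = json.dumps(payload)
print(body)
```

Any server that answers this shape of request (and returns the matching response schema) would count as OpenAI-API-compatible, regardless of what runs behind it.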
Same for freezing with suspend. A workaround would be to set the freezing method to "disable" in Settings - Rules.
Also stuff like "use mmap" or the thread count.
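For reference, Ollama's API exposes these knobs via an `options` object on `/api/chat` (or `/api/generate`); `use_mmap` and `num_thread` are the documented names for the two settings mentioned above. A minimal sketch, with a placeholder model name:

```python
import json

# Sketch of an Ollama /api/chat request carrying per-request options.
payload = {
    "model": "some-model",  # placeholder
    "messages": [{"role": "user", "content": "Hi"}],
    "options": {
        "use_mmap": True,   # memory-map the model file instead of loading it
        "num_thread": 8,    # number of CPU threads to use for inference
    },
}
body = json.dumps(payload)
print(body)
```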
My setup would be:
- compiled from source
- latest Firefox
- using different devices to access hollama, sometimes a private window for experiments
For me as a plugin dev it wouldn't make much of a difference. I only asked here because I found this repo and it contained the other logos.

Is there a repo for the main website somewhere? I'd create a PR there.
I would prefer option 1, as this would also allow using the default settings from a private tab, from my laptop/phone, or when a friend uses my hollama from their PC.
The example prompt file has been moved and refactored; the example is now [here](https://github.com/open-webui/open-webui/blob/c4ea31357f49d08a14c86b2bd85fdcd489512e91/backend/open_webui/main.py#L1784).

Same for the other stuff: [example](https://github.com/open-webui/open-webui/blob/03d5a670f610435cc667e8d6b638bed75af2acc0/backend/open_webui/utils/task.py#L40-L79), [implementation](https://github.com/open-webui/open-webui/blob/03d5a670f610435cc667e8d6b638bed75af2acc0/backend/open_webui/main.py#L1352-L1402) (permalinks).