Peter Sugihara
Yeah, that would be cool. Curious, would it be better to just have a signed release on freechat.run you could download?
All makes sense, thanks for all the research here. I agree with the conclusion that it's not worth it to render the whole scrollview with webviews even though that would...
Another quick thought. I wonder if there's a hybrid solution where each message renders its own wkwebview for the content (but not the buttons etc). You wouldn't be able to...
Could be useful: https://github.com/yl4579/StyleTTS2
Good suggestion @cleesmith, thanks. Let's definitely implement that! A checkbox in Settings would be a good way to handle it.
> just have plain text

Hey @cleesmith, small clarification: is your request to see the output as plaintext (e.g. unrendered markdown)? That's what I originally thought you were asking for,...
Good to know it worked. Happy for any contribution back if the component is portable.
This is cool, great work! Would it make sense to go more general and migrate OllamaBackend -> OpenAIBackend? Now that there's template support in llama.cpp server, we could migrate the...
Would their openai /v1/chat/completions endpoint give what we need? https://github.com/ollama/ollama/blob/main/docs/openai.md#endpoints
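For reference, here's a minimal sketch of the request shape that OpenAI-compatible endpoint expects. This assumes Ollama running locally at its default port (11434); the model name `"llama3"` and the `build_chat_request` helper are placeholders for illustration, not anything in the codebase.

```python
import json

# Sketch of the request a backend would POST to Ollama's OpenAI-compatible
# endpoint: http://localhost:11434/v1/chat/completions
# The payload shape matches OpenAI's chat completions schema, which is what
# would let one OpenAIBackend serve both local and remote servers.
def build_chat_request(model: str, messages: list) -> dict:
    return {
        "model": model,          # placeholder; any pulled Ollama model works
        "messages": messages,    # same role/content shape as OpenAI's API
        "stream": True,          # stream tokens as server-sent events
    }

payload = build_chat_request("llama3", [
    {"role": "user", "content": "Hello!"},
])
body = json.dumps(payload)
```

If that schema covers what we need (it should, since it mirrors OpenAI's), the backend wouldn't need any Ollama-specific request code.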
Just played with this, very cool. I like the general approach of allowing you to switch backends (and having the 0-config localhost backend by default). Try merging `main` for a...