iPad Pro support
It is currently impossible to install the plugin on an iPad Pro. Please, please, please... Is there any chance of it being supported in the future? Thanks for the amazing work!
@Edodamm Thanks for being an early adopter!
The reason I set this plugin to be desktop-only, for now, is to explore options for running LLMs locally so the plugin can work offline, without any internet connection. I'm a big believer in local AI and privacy. That said, I definitely see demand for mobile support. While exploring local model support, I'm deliberately avoiding desktop-dependent components to pave the way for mobile. So please stay tuned!
Great philosophy! I hope you will take advantage of Apple's M1 chips, as their GPUs are well suited to this job!
Why not just disable the desktop-only features on mobile instead of disabling the whole plugin? People like me self-host models on home servers, so this would be useful even for private AI.
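To illustrate: a self-hosted model only needs a plain HTTP call from the client, so no desktop-dependent components are strictly required. A minimal sketch, assuming a hypothetical Ollama server at a made-up LAN address (the URL and model name are placeholders, not anything from this plugin):

```python
import json
import urllib.request

# Hypothetical self-hosted Ollama server on a home LAN.
# Any device that can reach it over HTTP can use the model.
OLLAMA_URL = "http://192.168.1.10:11434/api/generate"  # assumed address

def build_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build a non-streaming request for Ollama's /api/generate endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("Summarize my note")
print(req.get_method(), req.full_url)
# → POST http://192.168.1.10:11434/api/generate
```

Since this is just an HTTP request, a mobile build of the plugin could make the same call; only the truly desktop-bound features (e.g. spawning a local model process) would need to be gated off.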
Please refer to #532 to follow along. Closing this as a duplicate.