Native windows version?
Hello, I personally find LocalAI very nice, but I'd like to compare its efficiency to llama (and derivatives). Is a Windows version (with a CPU-only option) going to be available soon?
I mean without a docker container.
boo
@mudler thanks.
Sadly I don't use Windows, so I will probably struggle with this to various degrees (testing, time dedicated to it, etc.), but any help from the community in that direction is very much appreciated. This point keeps popping up in the community, even though WSL and Docker work. Maybe @sozercan could help here?
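In the meantime, the Docker route mentioned above is the usual workaround on Windows. A minimal sketch, assuming Docker Desktop is installed and using the published `quay.io/go-skynet/local-ai` image (the tag, port, and `--models-path` flag here reflect the project's defaults at the time of writing; check the docs for your version):

```shell
REM Run LocalAI (CPU only) from a Windows command prompt,
REM mounting a local "models" folder into the container.
docker run -p 8080:8080 ^
  -v %cd%\models:/models ^
  quay.io/go-skynet/local-ai:latest ^
  --models-path /models
```

This won't match native performance exactly, but it gives a working CPU-only baseline to compare against llama and its derivatives until a native Windows build lands.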
Any updates about this?