Lucas Igel
Bumping @eshaanagarwal! Facing the same issue.
Seconding @nietras! Would love to hear if anyone's already exported ONNX models. Happy to provide Google Drive storage space so everyone can get access.
Thanks for testing it! I've only tested on macOS, yeah. The linked video is notable because it's...
Thanks for confirming @wvq! Very curious whether Tauri's file server is patchable here, or whether fixing this is much larger in scope.
Like @bvaughn said, I will also name my first child "React" if this gets implemented!
Barely missing the threshold on `openai_clip-vit-base-patch16.ggmlv0.f16.bin`! Can the minimum memory pool size be reduced significantly, or is this just a bug that's massively inflating the minimum? I'd like to run clip.cpp...
That worked! Thank you so much for the quick reply. I want to run on Intel-era MacBook Pros and Airs, like a 13" MacBook Air (2019). Not very low-end in...
As an aside, have you tried converting these models to Core ML [like they do in whisper.cpp](https://github.com/ggerganov/whisper.cpp#core-ml-support)?
Really appreciate the quick replies here. Have you also considered building a version of this for BLIP or other more recent CLIP variants? Currently exploring the steps involved. Large-scale...