AINXTGENStudio

2 comments by AINXTGENStudio

@cmp-nct @tc-mb My apologies if this has already been discovered, but from my quick research and experimentation yesterday, I was able to successfully use the openbmb/MiniCPM-Llama3-V-2_5-gguf VLM directly in LM...

Thanks for the consideration, as most of these UIs actually use llama.cpp under the hood, just like ollama. LM Studio's UI, though, is very user-friendly and has access to huggingface URL...
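Since these front ends sit on top of llama.cpp, the same GGUF weights can also be driven directly from its Python bindings. Below is a minimal sketch using llama-cpp-python with a LLaVA-1.5-style chat handler; the file names, the mmproj pairing for MiniCPM-Llama3-V-2.5, and the image path are illustrative assumptions, not details taken from the thread.

```python
# Minimal sketch: driving a GGUF vision model through llama-cpp-python,
# the Python bindings over the same llama.cpp that LM Studio and ollama wrap.
# Assumptions: model/mmproj file names and the LLaVA-1.5-style handler pairing
# for MiniCPM-Llama3-V-2.5 are hypothetical, not confirmed by the thread.
from llama_cpp import Llama
from llama_cpp.llama_chat_format import Llava15ChatHandler

# The vision projector (mmproj) GGUF is loaded by the chat handler;
# the language-model GGUF is loaded by Llama itself.
chat_handler = Llava15ChatHandler(clip_model_path="mmproj-model-f16.gguf")
llm = Llama(
    model_path="ggml-model-Q4_K_M.gguf",  # hypothetical quantized weights
    chat_handler=chat_handler,
    n_ctx=4096,  # leave room for the image embedding plus the prompt
)

# Images are passed as image_url parts in an OpenAI-style chat message.
response = llm.create_chat_completion(
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "image_url", "image_url": {"url": "file:///tmp/example.png"}},
                {"type": "text", "text": "Describe this image."},
            ],
        }
    ]
)
print(response["choices"][0]["message"]["content"])
```

Whether a given UI exposes the mmproj file separately or bundles it with the model download varies by front end, so the pairing above is only a sketch of how the underlying llama.cpp stack is typically fed.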