MegaParse

Can I use other privately hosted or open-source LLM models?

Open JAVA-LW opened this issue 1 year ago • 6 comments

JAVA-LW avatar Oct 31 '24 10:10 JAVA-LW

Yes, of course, but it needs a few modifications. What do you need?

StanGirard avatar Oct 31 '24 13:10 StanGirard

I was going to open a separate issue, but I'll hop onto this one.

Would it be possible to use LLMs hosted with either vLLM or llama.cpp, with models like Llama 3.1 70B (or LLaVA for multi-modal)?
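Both vLLM and llama.cpp's `llama-server` expose an OpenAI-compatible HTTP API, so one plausible route (not something MegaParse supports out of the box, as far as this thread shows) is to point LangChain's `ChatOpenAI` at the local endpoint. A minimal sketch, assuming a vLLM server on its default port 8000 serving `meta-llama/Llama-3.1-70B-Instruct`; the base URL, key, and model name are deployment-specific assumptions:

```python
from langchain_openai import ChatOpenAI

# vLLM (and llama.cpp's `llama-server`) speak the OpenAI chat protocol,
# so only the base_url needs to change. All values below are assumptions
# about a local deployment, not MegaParse defaults.
local_model = ChatOpenAI(
    base_url="http://localhost:8000/v1",        # default vLLM serve address
    api_key="not-needed",                       # local servers typically ignore the key
    model="meta-llama/Llama-3.1-70B-Instruct",  # must match the served model name
    temperature=0,
)

print(local_model.invoke("Reply with OK if you can read this.").content)
```

The same client should work against llama.cpp by swapping in that server's address and model name.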

tblattner avatar Dec 05 '24 14:12 tblattner

+1, it would be nice to have the option to use a local Ollama server with models like llama-vision and llava.

heltonteixeira avatar Dec 05 '24 18:12 heltonteixeira

+1 for Ollama, and maybe LM Studio.
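For what it's worth, LM Studio's local server also speaks the OpenAI-compatible protocol (by default at http://localhost:1234/v1), so the `ChatOpenAI` + `base_url` sketch above should carry over with only the URL and model name changed.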

mzeidhassan avatar Dec 05 '24 18:12 mzeidhassan

Ollama now supports the llama3.2-vision model (https://ollama.com/library/llama3.2-vision). It would be very nice to support it.
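Since llama3.2-vision is multimodal, LangChain's `langchain-ollama` integration is one candidate path. A minimal sketch, assuming a local Ollama server that has already pulled the model (`ollama pull llama3.2-vision`); whether this exact content-block format works end-to-end with MegaParse is an open assumption:

```python
import base64

from langchain_core.messages import HumanMessage
from langchain_ollama import ChatOllama

# Assumes `ollama serve` is running at its default http://localhost:11434
# and the model has been pulled with `ollama pull llama3.2-vision`.
model = ChatOllama(model="llama3.2-vision", temperature=0)

# Vision-capable chat models take images as base64 data URIs in a content-block message.
with open("page.png", "rb") as f:  # hypothetical page image for illustration
    image_b64 = base64.b64encode(f.read()).decode()

message = HumanMessage(
    content=[
        {"type": "text", "text": "Transcribe this document page to markdown."},
        {"type": "image_url", "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
    ]
)
print(model.invoke([message]).content)
```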

netandreus avatar Dec 05 '24 19:12 netandreus

+1, I strongly support this request. Standardized support for different LLM hosting options would be incredibly valuable. Some key use cases this would enable:

  1. Enterprise/private deployments where data needs to stay on-premise
  2. Cost optimization by using open source models like llama3-vision or llava
  3. Flexibility to leverage existing infrastructure (Azure, custom hosts, etc.)
  4. Easier integration with LangChain-compatible vision models for specialized use cases

This would make the project much more versatile for production deployments. Happy to help test if you implement support for any of these hosting options; a rough sketch of what the wiring might look like is below.
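Because MegaParse's vision parser is built around a LangChain chat model, swapping in a self-hosted backend may mostly be a matter of constructing a different model object. A hedged sketch of that wiring, assuming the `MegaParseVision(model=...)` pattern from the project README accepts any LangChain-compatible, vision-capable chat model (import paths differ between MegaParse versions, and whether non-OpenAI models work end-to-end is exactly what this issue asks):

```python
from langchain_ollama import ChatOllama  # or ChatOpenAI pointed at vLLM/LM Studio/Azure
from megaparse import MegaParse          # import paths vary across MegaParse versions
from megaparse.parser.megaparse_vision import MegaParseVision

# Assumption: MegaParseVision accepts any LangChain chat model, not just ChatOpenAI.
model = ChatOllama(model="llama3.2-vision")
parser = MegaParseVision(model=model)
megaparse = MegaParse(parser)

markdown = megaparse.load("./example.pdf")  # parse a local PDF (hypothetical file)
print(markdown)
```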

rdewolff avatar Jan 02 '25 10:01 rdewolff