MegaParse
Can I use other privately hosted or open-source LLM models?
Yes, of course, but it needs a few modifications. What do you need?
Was going to make an issue, but will hop onto this one.
I'd like to use LLMs hosted with either vLLM or llama.cpp, with models like Llama 3.1 70B, or LLaVA for multi-modal parsing.
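For what it's worth, both vLLM and llama.cpp's server mode expose an OpenAI-compatible HTTP API, so one low-effort path might be to point LangChain's `ChatOpenAI` at the local endpoint instead of api.openai.com. A minimal sketch, assuming a vLLM server running locally on port 8000 (host, port, and model name are illustrative, not tested against MegaParse):

```python
# Sketch: pointing LangChain's ChatOpenAI at a local OpenAI-compatible server.
# Assumes vLLM was started with something like:
#   vllm serve meta-llama/Llama-3.1-70B-Instruct --port 8000
# (llama.cpp's `llama-server` exposes a similar /v1 endpoint.)
from langchain_openai import ChatOpenAI

local_llm = ChatOpenAI(
    model="meta-llama/Llama-3.1-70B-Instruct",  # must match the model the server loaded
    base_url="http://localhost:8000/v1",        # local OpenAI-compatible endpoint
    api_key="not-needed",                       # placeholder; local servers typically ignore it
    temperature=0.0,
)

# Quick smoke test that the endpoint answers.
print(local_llm.invoke("Reply with the single word: ok").content)
```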
+1, it would be nice to have an option to use a local Ollama server with models like Llama 3.2 Vision and LLaVA.
+1 for Ollama and maybe LM Studio.
Ollama now supports the llama3.2-vision model (https://ollama.com/library/llama3.2-vision). It would be very nice to support it.
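If MegaParse accepts a LangChain chat model, LangChain's Ollama integration might already get close. A rough sketch, assuming `langchain-ollama` is installed, the Ollama server is running on its default port, and `ollama pull llama3.2-vision` has been done; the image message uses LangChain's standard multi-modal content blocks and is untested with MegaParse's prompts:

```python
# Sketch: calling llama3.2-vision through LangChain's Ollama integration.
# Assumes a local Ollama server (default http://localhost:11434).
import base64
from langchain_core.messages import HumanMessage
from langchain_ollama import ChatOllama

vision_llm = ChatOllama(model="llama3.2-vision", temperature=0)

# Multi-modal call: pass a page image as a base64 data URL alongside the prompt.
with open("page.png", "rb") as f:  # illustrative input file
    page_b64 = base64.b64encode(f.read()).decode()

message = HumanMessage(
    content=[
        {"type": "text", "text": "Transcribe this document page to Markdown."},
        {"type": "image_url", "image_url": {"url": f"data:image/png;base64,{page_b64}"}},
    ]
)
print(vision_llm.invoke([message]).content)
```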
+1 Strongly support this request. Having standardized support for different LLM hosting options would be incredibly valuable. Some key use cases this would enable:
- Enterprise/private deployments where data needs to stay on-premise
- Cost optimization by using open source models like llama3-vision or llava
- Flexibility to leverage existing infrastructure (Azure, custom hosts, etc.)
- Easier integration with LangChain-compatible vision models for specialized use cases (see the sketch after this list)
This would make the project much more versatile for production deployments. Happy to help test if you implement support for any of these hosting options.
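For reference, here is roughly what the wiring might look like, assuming MegaParseVision accepts any LangChain chat model via its `model` argument, mirroring the ChatOpenAI/GPT-4o example in the README; import paths and constructor signatures may differ between MegaParse versions:

```python
# Sketch: swapping MegaParse's default OpenAI model for a locally hosted one.
# The MegaParse-side wiring mirrors the README example and is an assumption,
# not a verified recipe.
from langchain_ollama import ChatOllama
from megaparse import MegaParse
from megaparse.parser.megaparse_vision import MegaParseVision

# Any LangChain chat model should work here, e.g. a ChatOpenAI client
# pointed at a vLLM or llama.cpp endpoint, or ChatOllama as below.
local_model = ChatOllama(model="llama3.2-vision", temperature=0)

parser = MegaParseVision(model=local_model)
megaparse = MegaParse(parser)

document_markdown = megaparse.load("./example.pdf")  # path is illustrative
print(document_markdown)
```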