OpenLLM
Deploying an LLM on an On-Premises Server So Users Can Access It from Their Work Laptop's Web Browser
Feature request
I have searched many websites and watched YouTube videos on how to deploy open-source LLM models locally on a Windows server and then expose them to users, who could interact with the LLM and ask questions from their own laptop's web browser. I believe this could be achieved using OpenLLM; however, I am not sure whether this capability is already included in the library.
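For illustration, here is a minimal sketch of the client-side call this setup would enable, assuming the model is served on the Windows machine with something like `openllm start`/`openllm serve` (depending on the version) and that OpenLLM's OpenAI-compatible endpoint is reachable on the LAN; the address, port, and model name below are placeholders, not anything confirmed in this thread:

```python
# Hypothetical call from a user's work laptop. Assumptions: the model is
# already running on the on-premises server, the server exposes OpenLLM's
# OpenAI-compatible endpoint on port 3000, and 192.168.1.50 stands in for
# the server's real LAN address.
from openai import OpenAI

client = OpenAI(base_url="http://192.168.1.50:3000/v1", api_key="na")

response = client.chat.completions.create(
    model="meta-llama/Llama-3.2-1B-Instruct",  # whichever model the server was started with
    messages=[{"role": "user", "content": "Summarise the benefits of on-prem LLM hosting."}],
)
print(response.choices[0].message.content)
```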
Motivation
No response
Other
No response
Have you found a way, @sanket038? I am also trying to figure out how to host OpenLLM on my work server and then make API calls to it. Any idea on hosting OpenLLM from the server? If so, please help me out.
Take a look at something like Ollama. (And let us know if that's what you're after.)
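For what it's worth, a minimal sketch of hitting an Ollama server from another machine on the network, assuming it was started with `OLLAMA_HOST=0.0.0.0` so it listens beyond localhost and a model has already been pulled; the address and model name are placeholders:

```python
# A minimal sketch of calling Ollama over the LAN. Assumptions: Ollama runs on
# the server and listens on the network, a model such as llama3 has been pulled,
# and 192.168.1.50 is a placeholder for the server's real address.
# 11434 is Ollama's default port.
import requests

resp = requests.post(
    "http://192.168.1.50:11434/api/chat",
    json={
        "model": "llama3",
        "messages": [{"role": "user", "content": "Hello from my work laptop"}],
        "stream": False,  # ask for a single JSON response instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```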
Do you know the steps to link my custom downloaded model with Ollama and then serve it as an API to everyone? I have deployed a chatbot UI, and I need backend code exposing an API that all team members can access, i.e. the UI on multiple devices pinging the server. @euroblaze, if you have Discord please let me know so we can connect; send the invite link to this mail: [email protected].
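Not a definitive recipe, but one common pattern: import the downloaded weights into Ollama (roughly, a Modelfile containing `FROM ./your-model.gguf` followed by `ollama create my-model -f Modelfile`; both names are placeholders), then put a small backend service in front of it that the chatbot UI on every device calls. A sketch using FastAPI and httpx, which are assumptions here rather than anything prescribed in this thread:

```python
# Sketch of a shared backend in front of Ollama. Assumptions: the downloaded
# weights were already imported into Ollama under the name "my-model" (e.g. via
# a Modelfile and `ollama create`), and Ollama runs on the same server.
import httpx
from fastapi import FastAPI
from pydantic import BaseModel

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint
MODEL_NAME = "my-model"                         # placeholder for your `ollama create` name

app = FastAPI()


class ChatRequest(BaseModel):
    message: str


@app.post("/chat")
async def chat(req: ChatRequest):
    # Forward the user's message to the local Ollama instance and return its reply.
    async with httpx.AsyncClient(timeout=120) as client:
        resp = await client.post(
            OLLAMA_URL,
            json={
                "model": MODEL_NAME,
                "messages": [{"role": "user", "content": req.message}],
                "stream": False,
            },
        )
    resp.raise_for_status()
    return {"reply": resp.json()["message"]["content"]}

# Run on the server with: uvicorn main:app --host 0.0.0.0 --port 8000
# so the chatbot UI on any laptop can POST to http://<server>:8000/chat
```

A thin backend like this also gives you a single place to add authentication, rate limiting, or logging before the model is exposed to the whole team.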