LLMstudio
FEAT: Access UI remotely
Feature Request
I don't know if this is possible; I could not find it in the docs. The server UI starts on http://localhost.... Is it possible to start the server so that it listens on all interfaces or on a specific IP address?
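To illustrate what I mean: other Python web tools let you bind the server to 0.0.0.0 instead of localhost so it is reachable from other machines on the LAN. This is only a rough sketch of the idea, assuming the UI is served by a standard Python web server such as uvicorn; it is not LLMstudio's actual API:

```python
# Illustrative only: binding a server to 0.0.0.0 so it is reachable
# from other machines on the LAN, not just via http://localhost.
# (Assumes a FastAPI/uvicorn setup; LLMstudio's internals may differ.)
import uvicorn
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
def index():
    return {"status": "ok"}

if __name__ == "__main__":
    # host="0.0.0.0" listens on all interfaces; a specific LAN IP
    # could be passed instead to restrict access to one interface.
    uvicorn.run(app, host="0.0.0.0", port=8000)
```

Something equivalent for the LLMstudio server (a flag, config option, or environment variable) would cover this request.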
Motivation
I would like to run all AI services on strong hardware that is available on the local LAN, and manage LLMstudio remotely from other machines.
Your contribution
I'm not a programmer, but I could test a possible solution.