LLMstudio
Framework to bring LLM applications to production
## Summary The objective is to create an option for users to run LLMstudio in a container by pulling it from Docker Hub or building it locally. ## TODO: -...
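Once container support lands, usage could look like the sketch below. The image tag `llmstudio:local`, the Dockerfile location, and the port mapping are all assumptions for illustration, not published project coordinates.

```
# Build the image locally from a repo checkout (Dockerfile path is hypothetical).
docker build -t llmstudio:local .

# Run the server and map the UI port to the host (port number is an assumption).
docker run --rm -p 8000:8000 llmstudio:local llmstudio server --ui
```

Pulling a prebuilt image from Docker Hub would replace the `build` step with a `docker pull` of whatever image name the project eventually publishes.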
### Feature Request I don't know if this is possible; I could not find it in the docs. The server UI starts at http://localhost...., is it possible to start...
### System Info MacBook Pro ### Who can help? _No response_ ### Related Components - [ ] API - [ ] SDK - [X] UI ### Reproduction llmstudio server --ui...
### Feature Request Tests run in a batch, but they are not evaluated against the ground-truth (GT) answer. The evaluation could be done using similarity metrics, LLM or even...
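A minimal sketch of the similarity-metric option, using only the standard library; the function names and the 0.8 pass threshold are illustrative assumptions, not part of LLMstudio:

```python
from difflib import SequenceMatcher


def similarity(candidate: str, reference: str) -> float:
    """Return a ratio in [0, 1] of how closely the model answer matches the GT answer."""
    return SequenceMatcher(None, candidate.lower(), reference.lower()).ratio()


def evaluate_batch(results: list[tuple[str, str]], threshold: float = 0.8) -> list[bool]:
    """Mark each (answer, ground_truth) pair in a batch as pass/fail."""
    return [similarity(answer, gt) >= threshold for answer, gt in results]
```

An LLM-as-judge evaluator could slot in behind the same pass/fail interface, which keeps the batch runner agnostic to the metric.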
### Feature Request
- No option to delete
- Sort from last to first
- Show the timestamp of the job
- Allow filtering

### Motivation The LLMstudio UI...
### Feature Request All requests made to the LLMstudio server should be authenticated and authorized through a centralized RBAC system ### Motivation The current design assumes a single user; all the permissions...
### Feature Request Request for the addition of Google Colab support for the LLMstudio Web UI. I know that it's technically possible, as other projects use gradio for spawning web...
### Feature Since we have the backend to compare the outputs of several prompts for the same model, the same should be supported in the UI ### Motivation Make prompt...
### Feature Request Providers like OpenAI have rate limits (such as a cap on requests per minute). This feature would allow LLMstudio to wait it out (or...
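The "wait it out" behavior is typically implemented as exponential backoff with jitter. A hedged sketch, assuming the caller supplies a predicate that recognizes the provider's rate-limit errors (the names here are illustrative, not LLMstudio's API):

```python
import random
import time


def call_with_backoff(fn, max_retries=5, base_delay=1.0, is_rate_limited=lambda exc: True):
    """Call fn(), retrying with exponential backoff + jitter on rate-limit errors.

    Non-rate-limit errors, and the final failed attempt, are re-raised.
    """
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception as exc:
            if not is_rate_limited(exc) or attempt == max_retries - 1:
                raise
            # Double the delay each attempt; jitter avoids synchronized retries.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))
```

Providers often return a `Retry-After` header on 429 responses; honoring it when present would be a natural refinement over pure exponential backoff.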
### System Info Unable to launch LLMStudio from a PyCharm environment on Windows Server. $ LLMStudio server Traceback (most recent call last): File "C:\Program Files\Python38\lib\runpy.py", line 193, in _run_module_as_main return _run_code(code,...