podman-desktop-extension-ai-lab
Work with LLMs on a local environment using containers
### What does this PR do?
Adding the `@podman-desktop/ui-svelte` library using `next`.

### Screenshot / video of UI

### What issues does this PR fix or reference?

### How to...
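For context, consuming the library at the `next` dist-tag usually amounts to a dependency entry along these lines in the relevant `package.json` (a sketch; which package in the repo declares it, and whether it belongs under `dependencies` or `devDependencies`, are assumptions):

```json
{
  "dependencies": {
    "@podman-desktop/ui-svelte": "next"
  }
}
```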
If commits are added to the branch of a recipe's repository, the user's local clone is not updated to the latest version and does not include the added commits. It is not reproducible...
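A minimal sketch of the kind of refresh that would address this, assuming the clone lives in a known directory and tracks a named branch; the `refreshRecipeClone` helper is hypothetical and not part of AI Lab's code.

```typescript
import { execFile } from 'node:child_process';
import { promisify } from 'node:util';

const execFileAsync = promisify(execFile);

// Hypothetical helper: bring a recipe's local clone up to date with the
// latest commit of the branch it was originally cloned from.
async function refreshRecipeClone(cloneDir: string, branch: string): Promise<void> {
  // Fetch the newest refs for the tracked branch from origin.
  await execFileAsync('git', ['-C', cloneDir, 'fetch', 'origin', branch]);
  // Fast-forward only, so local user modifications are never silently overwritten.
  await execFileAsync('git', ['-C', cloneDir, 'merge', '--ff-only', `origin/${branch}`]);
}
```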
From a recipe's page, the user cannot tell which ai-lab.yaml is used for the recipe, and therefore which sources are...
- AI Apps page: when there is no app running, the message "There is no AI App running..." is centered.
- Model Services:
- Playground Environments: when there is no model service /...
Currently we are able to delete a model even if it is used by an inference server. We should prevent it and inform the user that the model is in...
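A possible guard, sketched with hypothetical types and method names rather than AI Lab's actual managers: check which inference servers reference the model before deleting, and fail with an explicit message if any do.

```typescript
// Hypothetical shapes; the real model and inference-server managers differ.
interface InferenceServer {
  id: string;
  modelIds: string[];
}

interface ModelRegistry {
  listInferenceServers(): InferenceServer[];
  removeModelFiles(modelId: string): Promise<void>;
}

async function deleteModel(registry: ModelRegistry, modelId: string): Promise<void> {
  // Refuse deletion while any inference server still uses the model.
  const usedBy = registry
    .listInferenceServers()
    .filter(server => server.modelIds.includes(modelId));
  if (usedBy.length > 0) {
    throw new Error(
      `Model ${modelId} is in use by inference server(s) ${usedBy.map(s => s.id).join(', ')}; stop them before deleting the model.`,
    );
  }
  await registry.removeModelFiles(modelId);
}
```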
When the download task raises an error while starting a recipe, the process continues to the upload task when it should stop instead.
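A sketch of the intended control flow, with hypothetical task functions: the start sequence should abort as soon as the download step fails instead of falling through to the upload step.

```typescript
// Hypothetical task signatures; the real recipe tasks are managed elsewhere.
type Task = () => Promise<void>;

async function startRecipe(downloadModel: Task, uploadModel: Task, startContainers: Task): Promise<void> {
  try {
    await downloadModel();
  } catch (err: unknown) {
    // Abort the whole start sequence: do not fall through to the upload task.
    throw new Error(`Recipe start aborted: model download failed (${String(err)})`);
  }
  await uploadModel();
  await startContainers();
}
```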
I'd like to be able to define the model parameters, which really configure how the model serving happens (https://github.com/abetlen/llama-cpp-python/blob/08e910f7a7e3657cf210ab8633a6994e1bde7164/llama_cpp/llama.py#L57-L109). As a user, I would also like...
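For illustration, a hypothetical shape for such user-facing serving parameters; the field names mirror a few of the llama-cpp-python `Llama()` options linked above, but the interface itself and how AI Lab would pass it through are assumptions.

```typescript
// Hypothetical shape for user-configurable serving parameters; the names
// correspond to a subset of llama-cpp-python Llama() constructor options.
interface ModelServingParameters {
  n_ctx?: number;        // context window size in tokens
  n_gpu_layers?: number; // number of layers to offload to the GPU
  n_threads?: number;    // CPU threads used for inference
  seed?: number;         // RNG seed, for reproducible sampling
}

// Example: parameters a user might set when creating a model service.
const params: ModelServingParameters = {
  n_ctx: 4096,
  n_gpu_layers: -1, // offload everything when possible
  n_threads: 8,
  seed: 42,
};
```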
Generally (in other apps), required fields are marked with a (*) after their label to indicate they are required. In AI Studio, required fields are not visually indicated. We could...