                        Developer-facing UI for chat completions
🚀 Describe the new functionality needed
Implement the ability to chat with a running Llama Stack distribution using the basic inference (chat completion) API, exposed through a developer-facing UI.
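For context, here is a minimal sketch of the basic inference call such a UI would wrap. This assumes the llama-stack-client Python SDK; the base URL and model ID are placeholders, not part of this request.

```python
# Minimal sketch (not the actual implementation): one chat completion turn
# against a locally running Llama Stack; base URL and model ID are placeholders.
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")

response = client.inference.chat_completion(
    model_id="meta-llama/Llama-3.2-3B-Instruct",  # placeholder model
    messages=[{"role": "user", "content": "Hello, what can you do?"}],
)
print(response.completion_message.content)
```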
💡 Why is this needed? What if we don't build it?
This is a basic requirement: the UI should replicate the chat behavior of the existing Streamlit app (a rough sketch of that loop follows below).
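As a rough sketch of the behavior being replicated (assumptions: Streamlit's chat widgets, the llama-stack-client SDK, and placeholder base URL / model ID), the loop would keep the conversation in session state and forward it to chat completions on each turn:

```python
# Sketch only: a Streamlit-style chat loop driving Llama Stack inference.
# The base URL and model ID are placeholders, not part of this request.
import streamlit as st
from llama_stack_client import LlamaStackClient

client = LlamaStackClient(base_url="http://localhost:8321")  # assumed local stack
MODEL_ID = "meta-llama/Llama-3.2-3B-Instruct"  # placeholder model

# Keep the running conversation in session state so it survives reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay prior turns.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

# Read the next user turn and send the full history to chat_completion.
if prompt := st.chat_input("Ask the model something"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)

    response = client.inference.chat_completion(
        model_id=MODEL_ID,
        messages=st.session_state.messages,
    )
    reply = response.completion_message.content
    st.session_state.messages.append({"role": "assistant", "content": reply})
    with st.chat_message("assistant"):
        st.write(reply)
```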
Other thoughts
No response