Add Model Information to ChatInterface label in private_gpt/ui/ui.py #1647
Related to Issue: Add Model Information to ChatInterface label in private_gpt/ui/ui.py #1647
Introduces a new function `get_model_label` that determines the model label dynamically from the `PGPT_PROFILES` environment variable: it returns the model label when the variable is set to either "ollama" or "vllm", and None otherwise.
`get_model_label` is then used to build the label text for the chatbot interface, which shows the LLM mode and, if available, the model label. This lets the UI display the correct model label based on the user's configuration.
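For reviewers, here is a minimal sketch of the idea. The `_MODEL_BY_PROFILE` mapping and its model names are placeholders for illustration only; the actual implementation reads the model name from private_gpt's settings, and the real label string may differ:

```python
import os
from typing import Optional

# Hypothetical lookup table; the real code resolves the model
# name from private_gpt's settings objects, not a static dict.
_MODEL_BY_PROFILE = {
    "ollama": "llama2",      # placeholder model name
    "vllm": "mistral-7b",    # placeholder model name
}

def get_model_label() -> Optional[str]:
    """Return the model label based on PGPT_PROFILES, or None."""
    profile = os.environ.get("PGPT_PROFILES", "")
    if profile in ("ollama", "vllm"):
        return _MODEL_BY_PROFILE.get(profile)
    return None

def build_chatbot_label(llm_mode: str) -> str:
    """Compose the ChatInterface label from mode and model label."""
    model_label = get_model_label()
    if model_label is not None:
        return f"LLM: {llm_mode} | Model: {model_label}"
    return f"LLM: {llm_mode}"
```

The label string returned by `build_chatbot_label` would then be passed to the chatbot component's `label` parameter when the UI is constructed.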
I ran `make check` ("Success: no issues found in 59 source files") and `make test` ("30 passed, 11 warnings").
@ingridstevens there are conflicts with the main branch that need to be resolved. Let me know if you can do it; otherwise I can give it a try.
@imartinez I believe I've resolved the conflict, but let me know if there's something else I should do to get this PR ready to merge. Thanks!