
Add Model Information to ChatInterface label in private_gpt/ui/ui.py #1647

Open · ingridstevens opened this pull request 1 year ago · 2 comments

Related to Issue: Add Model Information to ChatInterface label in private_gpt/ui/ui.py #1647

Introduces a new function get_model_label that dynamically determines the model label based on the PGPT_PROFILES environment variable. The function returns the model label when PGPT_PROFILES is set to either "ollama" or "vllm", and None otherwise.

The get_model_label function is then used to set the label text for the chatbot interface, which includes the LLM mode and the model label (if available). This change allows the UI to display the correct model label based on the user's configuration.
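For reference, a minimal sketch of the approach described above. The names OLLAMA_MODEL, VLLM_MODEL, and the "LLM Chat" placeholder are assumptions made to keep the snippet self-contained; the actual PR reads the configured model name from private-gpt's settings rather than from extra environment variables.

```python
import os
from typing import Optional


def get_model_label() -> Optional[str]:
    """Return a model label derived from the PGPT_PROFILES environment variable.

    Sketch only: the real change looks the model name up in the application
    settings; hypothetical env vars are used here so the example runs on its own.
    """
    profile = os.environ.get("PGPT_PROFILES", "")
    if profile == "ollama":
        return os.environ.get("OLLAMA_MODEL")  # hypothetical lookup
    if profile == "vllm":
        return os.environ.get("VLLM_MODEL")    # hypothetical lookup
    return None


# Usage sketch: combine the LLM mode with the model label (when available)
# to form the label text shown on the chatbot interface.
llm_mode = "LLM Chat"  # placeholder for the mode reported by the UI
model_label = get_model_label()
chatbot_label = f"{llm_mode} ({model_label})" if model_label else llm_mode
print(chatbot_label)
```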

I ran make check: "Success: no issues found in 59 source files" and make test: "30 passed, 11 warnings"

ingridstevens · Feb 24 '24 14:02

@ingridstevens Conflicts with the main branch need to be resolved. Let me know if you can do it; otherwise I can give it a try.

imartinez · Mar 15 '24 15:03

@imartinez I believe I've resolved the conflict, but let me know if there's something else I should do to get this PR ready to merge. Thanks!

ingridstevens · Mar 16 '24 13:03