LibreChat
Enhancement: Use the Current Model for the Message Label
What features would you like to see added?
One of the benefits of Bedrock is that I can source a large number of very diverse models from a single vendor I already work with and trust. Unfortunately, there is no reproducibility: I cannot see which exact model was used. With OpenAI it is a little better, but it only shows the model family in the prompt, not the exact model. Our scientists will reject this lack of reproducibility. Wouldn't it be nice to have something like in the screenshot? That option would also make it easier to copy the model id to the clipboard, easier than fiddling with a pull-down menu.
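To make the idea concrete, here is a rough sketch of what such a label could look like on the frontend, assuming each message object already carries the exact model id in a `model` field; the component and prop names are only illustrative, not LibreChat's actual code:

    import React from 'react';

    // Hypothetical label that shows the exact model id used for a reply and
    // lets the user copy it with one click instead of digging through a menu.
    export function MessageModelLabel({ message }) {
      // `message.model` is assumed to hold the exact model id, e.g. a full
      // Bedrock id such as "anthropic.claude-3-5-sonnet-20240620-v1:0".
      if (!message?.model) {
        return null;
      }

      const copyModelId = () => {
        // Clipboard API is available in secure contexts (HTTPS or localhost).
        navigator.clipboard?.writeText(message.model);
      };

      return (
        <span className="message-model-label" title="Model used for this reply">
          {message.model}
          <button type="button" onClick={copyModelId} aria-label="Copy model id">
            copy
          </button>
        </span>
      );
    }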
More details
na
Which components are impacted by your request?
No response
Pictures
Note that the latest and biggest models think Matt is the right answer. Does this not look better?
And when I switch to an older model after using the latest Claude, the answer is different, but when I come back tomorrow I won't be able to tell why.
Code of Conduct
- [X] I agree to follow this project's Code of Conduct
Thanks! I've been wanting to implement this for a while. Most of the data is already there; it should be as simple as a toggle.
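For illustration, a minimal sketch of what that toggle could look like, assuming a simple persisted boolean; the key and helper names are hypothetical, not the actual settings implementation:

    // Hypothetical persisted user setting; LibreChat's real settings store
    // (and its key names) may differ.
    const SHOW_MODEL_LABEL_KEY = 'showModelLabel';

    export function getShowModelLabel() {
      return localStorage.getItem(SHOW_MODEL_LABEL_KEY) === 'true';
    }

    export function setShowModelLabel(enabled) {
      localStorage.setItem(SHOW_MODEL_LABEL_KEY, String(enabled));
    }

The message component would then render the exact model id only when getShowModelLabel() returns true.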
Awesome, thank you, Danny.
This would be great to have; it's one of the few features still missing from LibreChat, IMO! :)
@danny-avila, I wonder if this feature has a chance to make it into v0.7.5 final?
Also curious if it's in a branch anywhere. =)
I've also found myself needing this feature, as I am using the openrouter/auto model, which dynamically selects the actual model to use.
I have made a hacky temporary solution in my fork (https://github.com/Kostusas/LibreChat/tree/openrouter_model_show) by appending the model information to the returned message here:
    let content = message.content;
    if (this.modelOptions.model === 'openrouter/auto') {
      // Get the actual model name from the API response
      const modelName = chatCompletion.model;
      if (modelName) {
        content += "\n\nThe question was answered by the model " + modelName;
      }
    }
    return content;
which gives
I understand this is a sub-optimal solution, though, as this model-information sentence now goes into the prompt memory.
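A less intrusive variant, sketched below, would be to record the resolved model on the message object itself rather than appending it to the text, so it never enters the prompt memory; the resolvedModel field is only illustrative and not part of LibreChat's actual message schema:

    // Sketch: attach the resolved model as metadata instead of editing the text,
    // so it never becomes part of the context sent back to the model later.
    const content = message.content;

    if (this.modelOptions.model === 'openrouter/auto' && chatCompletion.model) {
      // Hypothetical field name; the real message schema may differ.
      message.resolvedModel = chatCompletion.model;
    }

    return content;

The UI could then read message.resolvedModel for the label while the conversation text stays unchanged.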