
Copilot Chat Sample Application: Lower prompts.json verbosity

Open · PederHP opened this issue 2 years ago · 1 comment

The prompt components in prompts.json (for the Copilot Chat Sample Application) are very verbose. I think it would be valuable to reduce their token length in order to free up more tokens for prompt content and/or faster processing.
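For reference, here is a rough way to see where the tokens are going (not part of the sample; a minimal sketch that assumes prompts.json is a JSON object whose nested string values are the prompt text, and uses the tiktoken tokenizer as an approximation of the model's tokenizer):

```python
# Hypothetical helper, not part of the sample: prints the approximate token
# count of every string value found in prompts.json, largest first.
import json
import tiktoken

# cl100k_base is the encoding used by the GPT-3.5/GPT-4 family of models.
encoding = tiktoken.get_encoding("cl100k_base")

def token_counts(node, prefix=""):
    """Recursively yield (key path, token count) for every string in the JSON tree."""
    if isinstance(node, dict):
        for key, value in node.items():
            yield from token_counts(value, f"{prefix}{key}.")
    elif isinstance(node, list):
        for i, item in enumerate(node):
            yield from token_counts(item, f"{prefix}{i}.")
    elif isinstance(node, str):
        yield prefix.rstrip("."), len(encoding.encode(node))

with open("prompts.json", encoding="utf-8") as f:
    prompts = json.load(f)

for path, count in sorted(token_counts(prompts), key=lambda item: -item[1]):
    print(f"{count:5d}  {path}")
```

Even a quick pass like this makes it obvious which components dominate the token budget and are the best candidates for trimming.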

Alternatively (or additionally), the sample readme could highlight that these prompts are only examples, and that writing good prompt components is an important part of building on this sample, since it can significantly impact the AI's performance (perhaps with a reference to the documentation on LLM AI Prompts).

The sample is a really great entry point for quickly getting some hands-on experience with semantic-kernel, but I think it's too easy to miss the importance of these prompt templates hidden away in a JSON file.

I'm also slightly skeptical that the level of verbosity in these templates is really necessary to achieve decent results. I know this is a work in progress, and the prompts are likely biased to favor safety over other concerns, but perhaps this could then be highlighted in the sample readme.

PederHP · May 09 '23 18:05

@PederHP, thanks for the feedback here. We are looking into it.

evchaki · May 09 '23 20:05

Thanks for the feedback on Chat Copilot. As a reference app, its goal was to show what was possible, so it purposefully used as many tokens as possible. Unfortunately, we do not currently have plans to change the verbosity of the current prompts.

madsbolaris · Nov 28 '23 01:11