jupyter-ai
Update prompts for locally installed models
Problem
Current prompts in Jupyter AI work well with remote providers, but they are not optimized for locally installed models provided by GPT4All. See the discussion on #190 for examples where the responses do not honor the guardrails in the prompt.
Proposed Solution
Update the prompts so they behave consistently across all models. A second option is to provide custom prompts for local providers.
Would this also cover cases where a model is intended for code auto-completion? For example, the current magics prompts that ask for code-formatted output don't work with all models, since they add natural-language instructions after the code asking for the output to be code in Markdown, etc.
@vidartf We are looking into applying different prompt templates for each provider, so they are specific to the provider. It's not fully captured here, but this issue, along with #225, should make improvements that tackle the peculiarities of different providers.
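To illustrate the idea, here is a minimal sketch of selecting a prompt template per provider. The names (`PROMPT_TEMPLATES`, `build_prompt`, the provider IDs, and the template strings) are hypothetical and do not reflect Jupyter AI's actual implementation; the point is only that local models often need terser prompts with fewer meta-instructions than remote ones.

```python
# Hypothetical sketch: per-provider prompt templates with a shared fallback.
# None of these names or templates come from Jupyter AI itself.

DEFAULT_TEMPLATE = (
    "You are a helpful assistant.\n"
    "{history}\nHuman: {input}\nAI:"
)

# Local models tend to follow simpler, more literal prompt formats.
PROMPT_TEMPLATES = {
    "gpt4all": "{history}\n### Human: {input}\n### Assistant:",
    "openai": DEFAULT_TEMPLATE,
}

def build_prompt(provider_id: str, history: str, user_input: str) -> str:
    """Return the prompt for a provider, falling back to the default template."""
    template = PROMPT_TEMPLATES.get(provider_id, DEFAULT_TEMPLATE)
    return template.format(history=history, input=user_input)
```

A registry like this would let each provider override only the parts of the prompt that its models mishandle, while unknown providers keep the default behavior.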
Fixed by #309.
@JasonWeill This issue is only partially solved by #309. There is pending work to add these templates and use them with the chat UI.
OK, reopened.
Code Llama instruction models require a different prompting format. This requires the ability to customize the prompt for chat generation as well. Can this be supported?
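For context, Code Llama Instruct models expect messages wrapped in `[INST] ... [/INST]` markers, with an optional `<<SYS>> ... <</SYS>>` block for the system prompt. A minimal sketch of such a formatter (the function name is illustrative, not part of Jupyter AI):

```python
def format_codellama_prompt(system: str, user: str) -> str:
    """Wrap a system prompt and user message in the Code Llama Instruct
    format. A sketch only; making the chat prompt customizable per model
    is exactly the feature requested above."""
    return f"<s>[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"
```

A default chat template that prepends plain natural-language instructions would not produce this structure, which is why per-model prompt customization matters here.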