logseq-plugin-gpt3-openai

Openai settings in prompt templates

Open · briansunter opened this issue · 2 comments

Support customizing settings like model, temperature, token count, frequency penalty, and stop sequences in the prompt templates.

[summarize-text]
name = "Summarize Text"
description = "Summarize a text document"
temperature = 0.7
model = "text-davinci-002"
maximumLength = 500
prompt = '''
Summarize text:
'''

briansunter avatar Jan 13 '23 23:01 briansunter
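One way the template example above could work is to merge the per-template values over the plugin-wide defaults, so any key a template omits falls back to the global setting. This is only a sketch: the interface fields and default values below are assumptions for illustration, not the plugin's actual config schema.

```typescript
// Hypothetical shape of per-template OpenAI overrides (field names assumed,
// mirroring the keys in the example template above).
interface TemplateSettings {
  model?: string;
  temperature?: number;
  maximumLength?: number;
  frequencyPenalty?: number;
  stopSequences?: string[];
}

// Plugin-wide defaults, as they might come from the settings page.
const defaults: Required<TemplateSettings> = {
  model: "text-davinci-002",
  temperature: 1.0,
  maximumLength: 1000,
  frequencyPenalty: 0,
  stopSequences: [],
};

// Merge template-level overrides over the global defaults; keys the
// template does not set keep their plugin-wide values.
function resolveSettings(
  overrides: TemplateSettings
): Required<TemplateSettings> {
  return { ...defaults, ...overrides };
}
```

For the "Summarize Text" template above, `resolveSettings({ temperature: 0.7, maximumLength: 500 })` would keep the default model while applying the template's temperature and length.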

Customizing the system prompt per template would be really useful. Perhaps it can be incorporated into the existing prompt template, e.g.: prompt-template:: Custom Prompt

System: Custom system prompt text
Custom prompt text

chilang avatar Sep 06 '23 14:09 chilang
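The convention suggested above could be parsed by treating a leading "System:" line in the template body as the system prompt and the rest as the user prompt. The `System:` marker is the commenter's proposed syntax, not something the plugin currently supports; the function below is a minimal sketch of that idea.

```typescript
// Split a template body into an optional system prompt and a user prompt,
// using a leading "System:" line as the (proposed, not yet implemented)
// marker.
function splitSystemPrompt(
  template: string
): { system: string | null; user: string } {
  const lines = template.split("\n");
  if (lines[0]?.startsWith("System:")) {
    return {
      system: lines[0].slice("System:".length).trim(),
      user: lines.slice(1).join("\n").trim(),
    };
  }
  // No marker: the whole body is the user prompt.
  return { system: null, user: template.trim() };
}
```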

This feature would be really helpful.

Further, I would love to see support for a stop_sequences parameter as well. Currently, the models generate text up to the token limit set in the plugin settings.

aswny avatar Feb 05 '24 09:02 aswny
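Until stop sequences can be passed through to the API per template (the OpenAI completions API does accept a `stop` parameter), one stopgap would be to trim the returned completion client-side at the first stop sequence that appears. This is a hypothetical workaround sketch, not part of the plugin:

```typescript
// Truncate generated text at the earliest occurrence of any stop sequence.
// Returns the text unchanged if no stop sequence is found.
function truncateAtStop(text: string, stopSequences: string[]): string {
  let cut = text.length;
  for (const stop of stopSequences) {
    const idx = text.indexOf(stop);
    if (idx !== -1 && idx < cut) cut = idx;
  }
  return text.slice(0, cut);
}
```

The trade-off versus a real `stop` parameter is that tokens after the stop sequence are still generated and billed; they are only hidden from the user.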