logseq-plugin-gpt3-openai
OpenAI settings in prompt templates
Support customizing settings such as the model, temperature, token count, frequency penalty, and stop sequences in the prompt templates, e.g.:
[summarize-text]
name = "Summarize Text"
description = "Summarize a text document"
temperature = 0.7
model = "text-davinci-002"
maximumLength = 500
prompt = '''
Summarize text:
'''
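A template could also carry the other sampling settings mentioned above. As a rough sketch, extra keys might sit alongside the existing ones; frequencyPenalty and presencePenalty here are hypothetical key names, not options the plugin currently reads:

frequencyPenalty = 0.5  # hypothetical key: discourages verbatim repetition
presencePenalty = 0.0   # hypothetical key: discourages reusing tokens already present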
Customizing the system prompt per template would be really useful. Perhaps it could be incorporated into the existing prompt template, e.g.:
prompt-template:: Custom Prompt
System: Custom system prompt text
Custom prompt text
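In the TOML template format above, this might look like a dedicated key. A minimal sketch, assuming a hypothetical systemPrompt key that the plugin does not currently support:

[custom-prompt]
name = "Custom Prompt"
systemPrompt = '''
Custom system prompt text
'''
prompt = '''
Custom prompt text
'''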
This feature would be really helpful.
Further, I would love to see support for a stop_sequences parameter as well. Currently, the model generates text up to the token limit set in the global settings.
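In the same template format, that could be expressed as a list of strings; stopSequences is a hypothetical key name, and note that the OpenAI completions API accepts at most four stop sequences:

stopSequences = ["\n\n", "END"]  # hypothetical key: generation halts at the first match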