big-AGI
[Roadmap] Option to customize local LLM inference settings
Why: This would give users more control over inference settings such as "top_k", "min_p", "top_p", "mirostat", etc.
Concise description: An advanced settings menu where the requests sent to the local LLM can be customized, either as a slider menu where inference settings such as "top_k", "min_p", "top_p", "mirostat", etc. can be adjusted, or as a textbox where an inference configuration can be entered.
Requirements: Additional settings to customize local LLM inference parameters in the 'Configure AI Models' menu, plus an option to import the inference configuration from a JSON file.
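To illustrate the JSON-import requirement, here is a minimal TypeScript sketch of what validating an imported configuration could look like. The `InferenceSettings` shape and `parseInferenceConfig` helper are hypothetical, not an existing big-AGI API; the field names mirror common local-LLM sampling options (as exposed by backends like llama.cpp or Ollama):

```typescript
// Hypothetical shape for the proposed inference settings; not an existing
// big-AGI type. Field names follow common local-LLM sampling parameters.
interface InferenceSettings {
  top_k?: number;
  top_p?: number;
  min_p?: number;
  mirostat?: 0 | 1 | 2;
}

// Parse an imported JSON string, keeping only known, well-typed fields so a
// malformed or over-broad config file cannot inject arbitrary request options.
function parseInferenceConfig(json: string): InferenceSettings {
  const raw = JSON.parse(json) as Record<string, unknown>;
  const settings: InferenceSettings = {};
  if (typeof raw.top_k === 'number') settings.top_k = raw.top_k;
  if (typeof raw.top_p === 'number') settings.top_p = raw.top_p;
  if (typeof raw.min_p === 'number') settings.min_p = raw.min_p;
  if (raw.mirostat === 0 || raw.mirostat === 1 || raw.mirostat === 2)
    settings.mirostat = raw.mirostat;
  return settings;
}

// Example: importing the contents of a JSON config file
const cfg = parseInferenceConfig(
  '{"top_k": 40, "top_p": 0.9, "min_p": 0.05, "mirostat": 2}'
);
```

Whitelisting known fields (rather than forwarding the parsed object verbatim) keeps the request payload predictable even if the imported file contains extra keys.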