
[Roadmap] Option to customize local LLM inference settings

Open unseensholar opened this issue 1 year ago • 2 comments

Why This would provide users with more control over inference settings such as "top_k", "min_p", "top_p", "mirostat", etc.

Concise description An advanced settings menu where the requests sent to the local LLM can be customized: either a slider menu for selecting inference settings such as "top_k", "min_p", "top_p", "mirostat", etc., or a textbox where an inference configuration can be entered directly.

Requirements Additional settings for customizing local LLM inference parameters in the 'Configure AI Models' menu. Option to import an inference configuration from a JSON file.
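A minimal sketch of what the JSON-import path might look like. The parameter names below follow common llama.cpp-style conventions; the exact keys accepted by any given local LLM server, and the defaults shown, are assumptions for illustration, not big-AGI's actual implementation.

```python
import json

# Hypothetical defaults a slider menu might expose (values are illustrative).
DEFAULT_SETTINGS = {
    "temperature": 0.8,
    "top_k": 40,
    "top_p": 0.95,
    "min_p": 0.05,
    "mirostat": 0,  # 0 = disabled; 1/2 = mirostat v1/v2 in llama.cpp-style servers
}

def load_inference_settings(json_text: str) -> dict:
    """Merge user-supplied JSON overrides onto the defaults,
    silently dropping keys that are not recognized settings."""
    overrides = json.loads(json_text)
    merged = dict(DEFAULT_SETTINGS)
    merged.update({k: v for k, v in overrides.items() if k in DEFAULT_SETTINGS})
    return merged

# Example: a user-provided JSON file overrides two settings;
# the unrecognized "foo" key is ignored.
settings = load_inference_settings('{"top_k": 20, "mirostat": 2, "foo": 1}')
```

Validating against a known-keys whitelist keeps a malformed or overly broad JSON file from injecting arbitrary fields into the request sent to the LLM backend.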

unseensholar — Jan 15 '24 20:01