NeMo-Guardrails
doc: explain expected values of the `parameters` field in ModelConfig
Please also confirm the following
- [x] I have searched the main issue tracker of NeMo Guardrails repository and believe that this is not a duplicate
Issue Kind
Improving documentation
Existing Link
https://docs.nvidia.com/nemo/guardrails/latest/user-guides/configuration-guide.html
Description
The `parameters` are actually meant for the LangChain classes that we use to interact with the various LLM providers. This way we are able to set any of the attributes that the different LLM providers support; `model` and `model_name` are just two of them, and we cannot control the names of these attributes:
https://github.com/NVIDIA/NeMo-Guardrails/blob/27169c73388b473cd02c30d9dc03dd5a695e24ca/nemoguardrails/rails/llm/llmrails.py#L357
..., but maybe we should highlight in the docs, in another MR, that the `parameters` key in our config.yml file actually holds the attributes that can be passed to the different LangChain LLM providers (or anything that extends `BaseLanguageModel`). Right now the config documentation only says that `parameters` are "any additional parameters, e.g., temperature, top_k, etc.", but there is no mention of how to find out which additional parameters are actually supported (or maybe this information is somewhere else and I just cannot find it) - @mikemckiernan any thoughts?
https://docs.nvidia.com/nemo/guardrails/latest/user-guides/configuration-guide.html
Originally posted by @trebedea in https://github.com/NVIDIA/NeMo-Guardrails/issues/1084#issuecomment-2810941395
I sincerely doubt there is any additional information elsewhere in the docs about the parameters field.
IIUC, it's a field that accepts arbitrary key-value pairs (or also nested objects and arrays?). Was there a particular LangChain package and integration that motivated this change? Would Anthropic's thinking be an example of something that can only be expressed in `parameters`?
```yaml
models:
  - type: main
    engine: anthropic
    model: claude-3-7-sonnet-latest
    parameters:
      thinking:
        type: enabled
        budget_tokens: 2000
```
Is that the idea?
@mikemckiernan, that is right, and Anthropic's thinking looks like a great example.
The `parameters` field was created because many models (in LangChain, but also outside the LangChain ecosystem) have different sets of parameters, and we needed a standard way to pass them when instantiating an LLM / model object. Since the parameters of a constructor are just a dict of keys and values (`param_name: param_value`), it made sense to use a dict and simply name it `parameters`.
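In other words, the dict under `parameters` is forwarded as keyword arguments when the provider class is instantiated. A minimal sketch of the idea (not the actual initialization code in llmrails.py, and the engine-to-class mapping here is an assumption):

```python
# Minimal sketch of the idea, not the actual NeMo Guardrails initialization code.
# Assumption: the `anthropic` engine maps to langchain_anthropic.ChatAnthropic.
from langchain_anthropic import ChatAnthropic

# The `parameters` dict from config.yml.
parameters = {"temperature": 0.1, "top_k": 40}

# Each key must be an attribute that the LangChain class accepts; the names are
# defined by the provider class, not by NeMo Guardrails.
llm = ChatAnthropic(model="claude-3-7-sonnet-latest", **parameters)
```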
Tested this and it works fine; the only thing is that we also need to set `max_tokens`, and it must be higher than `budget_tokens`.
```yaml
models:
  - type: main
    engine: anthropic
    model: claude-3-7-sonnet-latest
    parameters:
      max_tokens: 2500
      thinking:
        type: enabled
        budget_tokens: 2000
```
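For reference, the config above corresponds roughly to the following direct LangChain call (a sketch, assuming the `anthropic` engine resolves to `langchain_anthropic.ChatAnthropic` as discussed above); it also makes the `max_tokens` > `budget_tokens` requirement explicit:

```python
# Rough equivalent of the config above; the engine-to-class mapping is an
# assumption, but the requirement that max_tokens exceed thinking.budget_tokens
# comes from Anthropic's extended-thinking API.
from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(
    model="claude-3-7-sonnet-latest",
    max_tokens=2500,  # must be higher than budget_tokens
    thinking={"type": "enabled", "budget_tokens": 2000},
)
```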
It would be great to update the docs with a clear explanation of the `parameters` attribute and to provide this example.