haystack-core-integrations
MistralChatAdapter: Typo in the `max_tokens` parameter
MistralChatAdapter's `ALLOWED_PARAMS` lists `max_tokens`, but the generation_kwargs handling reads `max_gen_len`. Because of this discrepancy, any `max_tokens` value the user passes is ignored and the limit falls back to the default of 512 whenever the model runs. A minimal sketch of the mismatch is shown below.
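This is an illustrative sketch only, assuming the parameter handling works roughly as described above; the class layout, `prepare_body` method, and default value are not copied from the integration source.

```python
# Illustrative sketch only; names and structure are assumed from the issue
# description, not copied from the integration source.
DEFAULT_MAX_TOKENS = 512


class MistralChatAdapter:
    # The whitelist advertises "max_tokens" as the supported parameter...
    ALLOWED_PARAMS = ["max_tokens", "temperature", "top_p"]

    def prepare_body(self, generation_kwargs: dict) -> dict:
        # ...but the request body is built from "max_gen_len", so a
        # user-supplied "max_tokens" is silently dropped and the default
        # of 512 always applies.
        return {
            "max_tokens": generation_kwargs.get("max_gen_len", DEFAULT_MAX_TOKENS),
        }


# A user setting max_tokens=2048 still ends up with the 512 default:
body = MistralChatAdapter().prepare_body({"max_tokens": 2048})
assert body == {"max_tokens": 512}
```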
- Haystack version: 2.1.2
- Integration version: 0.7.1
Related PR - https://github.com/deepset-ai/haystack-core-integrations/pull/740