spacy-llm
`transformers` > 4.38 causes bug in inference for HF models
Inference fails with:

```
TypeError: transformers.generation.utils.GenerationMixin.generate() got multiple values for keyword argument 'pad_token_id'
```
The cause of this is unclear so far.
The workaround for the time being is to pin `transformers` to `<= 4.38`.
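
For reference, the pin can be applied with pip (assuming pip is the package manager in use; adjust the specifier for your dependency file if you use another tool):

```shell
# Reinstall transformers, capped at 4.38, so pip does not
# resolve to a newer release that triggers the generate() error
pip install "transformers<=4.38"
```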