Rafail Giavrimis
@ChainYo As I mentioned in the PR description, a function inside torch (version 1.11) had an issue parsing the signature correctly and provided kwargs as an extra positional argument...
Is there anything I can do to help with this?
Having a similar issue with a ViT transformer. In my case at least, it seems to stem from using the Hugging Face Auto classes inside the context.
@gonzalezsieira @ColinFerguson The issue is in langchain-community; downgrading it to 0.0.19 fixes it for now (regardless of the langchain version).
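For anyone who wants the exact workaround command, a minimal sketch assuming a pip-based environment:

```shell
# Temporary workaround: pin langchain-community to 0.0.19.
# The langchain package itself can stay at its current version.
pip install "langchain-community==0.0.19"
```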
I'm using langchain 0.1.6 currently and it seems to be working.
Having the same issue with Firefox for Android
I was thinking about this in the same way that `transformResponse` is used to make other LLMs compatible when not streaming (e.g. https://github.com/janhq/jan/blob/0cae7c97612ba4f6a3387f7f72c56e065711cacf/extensions/inference-anthropic-extension/src/index.ts#L87). Effectively a way to adjust the format...
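To make the idea concrete, here is a minimal sketch of what such a transform could look like: it normalizes an Anthropic-style completion payload into an OpenAI-style chat-completion shape. The interfaces and field names below are simplified assumptions for illustration, not Jan's actual extension API.

```typescript
// Hypothetical shapes, simplified for illustration.
interface AnthropicLikeResponse {
  content: { type: string; text: string }[];
}

interface OpenAILikeResponse {
  choices: { message: { role: string; content: string } }[];
}

// Sketch of a transformResponse-style adapter: collapse all text blocks
// from the provider-specific payload into one OpenAI-style message.
function transformResponse(res: AnthropicLikeResponse): OpenAILikeResponse {
  const text = res.content
    .filter((block) => block.type === "text")
    .map((block) => block.text)
    .join("");
  return { choices: [{ message: { role: "assistant", content: text } }] };
}

// Example: two text blocks collapse into a single assistant message.
const normalized = transformResponse({
  content: [
    { type: "text", text: "Hello, " },
    { type: "text", text: "world" },
  ],
});
console.log(normalized.choices[0].message.content);
```

The same pattern would extend to streaming by applying an equivalent transform per chunk rather than once on the full response.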