Sylvain Gugger

Results 633 comments of Sylvain Gugger

This doesn't seem like a use case for the pipeline, though. Since you want access to the processed inputs, you should just use the tokenizer and the model directly.
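A minimal sketch of what "use the tokenizer and the model directly" looks like — the checkpoint and input text here are illustrative examples, not taken from the original discussion:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Example checkpoint only; substitute the model from your own use case.
checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

# Unlike a pipeline, the processed inputs are fully visible here
# (input_ids, attention_mask) before they are fed to the model.
inputs = tokenizer("I love this movie!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
pred = outputs.logits.argmax(dim=-1).item()
```

Calling the two pieces separately means you can inspect or modify `inputs` between tokenization and the forward pass, which is exactly what the pipeline abstraction hides.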

@ArthurZucker I believe it is the opposite, the mismatch happens when ftfy is not installed. (@connor-henderson correct me if I misunderstood your posts).

The design is not easy enough to use. If a user saves a quantized model and pushes it to the Hub, it should work directly with `from_pretrained`. This is why I...

It may be installing old versions of the library, so you have to pick the corresponding version of the example (cc @philschmid for the exact versions).

cc @Wauplin, this looks like something that should be in huggingface_hub (if it's not already).

Yes, your code is exactly what I'm suggesting. I think it would be a better API since the user wouldn't have to look for warnings (no need for a warning...

That's related to the data you are preprocessing, not the Transformers library or its examples. There is simply no `"seq2seq2"` in the features you prepare with your function. I suggest...