Arthur
cc @stas00 sorry for the long wait
What solved it for me was quoting the URL: `wget "path"` instead of `wget path`.
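To illustrate with a hypothetical URL (any URL containing `?` or `&` triggers this), `printf` stands in for `wget` here so the effect is visible without a network call:

```shell
# Unquoted, the shell treats '&' as "run in background", so the command
# (printf standing in for wget) only receives the URL up to the '&';
# an unquoted '?' can also be expanded as a glob if it matches a file.
unquoted=$(printf '%s' https://example.com/model.bin?download=true&token=abc)
quoted=$(printf '%s' 'https://example.com/model.bin?download=true&token=abc')

echo "$unquoted"   # truncated at the '&'
echo "$quoted"     # full URL reaches the command
```

Quoting the URL (single or double quotes both work here) keeps the shell from interpreting those metacharacters.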
It's [in the `original` folder.](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct/blob/main/original/tokenizer.model) Because the `transformers` compatible version only needs `tokenizer.json` 🤗
Hey! https://github.com/huggingface/transformers/pull/30334 was opened 3 days ago for this.
Hey! As you can see from the red tests, this cannot be merged as it is breaking a lot of the API.
Hey! Great work 🔥 Would you be open to putting this model on the Hub following [this tutorial](https://huggingface.co/docs/transformers/custom_models)? This model seems very similar to a BERT model, so it makes more...
Main with compile is broken; https://github.com/huggingface/transformers/pull/28937 should fix it!
This is only valid if we indeed have the argument `return_dict_in_generate`. Otherwise the pipeline will also fail because `output_ids` will not be a dictionary. Pipelines in general currently don't support...
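A minimal stand-in for the branching a pipeline would need (the `GenerateOutput` namedtuple and `extract_sequences` helper here are illustrative assumptions, not the real `transformers` classes):

```python
from collections import namedtuple

# Illustrative stand-in for the dict-like object generate() returns when
# return_dict_in_generate=True; in transformers it is a ModelOutput subclass,
# this namedtuple is just an assumption for the sketch.
GenerateOutput = namedtuple("GenerateOutput", ["sequences", "scores"])

def extract_sequences(output):
    """Handle both shapes: a dict-like output carrying .sequences, or raw ids."""
    # Without return_dict_in_generate, `output` is already the token ids.
    return output.sequences if hasattr(output, "sequences") else output

ids = [[1, 2, 3]]
print(extract_sequences(ids))                                         # → [[1, 2, 3]]
print(extract_sequences(GenerateOutput(sequences=ids, scores=None)))  # → [[1, 2, 3]]
```

Without a guard like this, downstream code that indexes into `output_ids` as a dictionary fails whenever `return_dict_in_generate` was not passed.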
Can you provide a reproduction script to make sure we are running with the same parameters? Also this might ring some bells for @Narsil. I know we interacted before,...