Joao Gante

270 comments by Joao Gante

@ChrisSpraaklab inside generate, in encoder-decoder models like T5, `input_ids` is related to the decoder input ids. They are not the same as the `input_ids` you feed to `.generate()`, which will...

Hey @davidavdav -- yeah, you can try using Beam Search (i.e. `num_beams>1`) and pass a NEGATIVE [`length_penalty`](https://huggingface.co/docs/transformers/main/en/main_classes/text_generation#transformers.GenerationConfig.length_penalty). This will nudge generation towards shorter outputs!
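For illustration (not from the thread itself), a minimal sketch of setting this up with `transformers.GenerationConfig`; the specific beam count and token cap are assumptions, adjust to taste:

```python
from transformers import GenerationConfig

# Beam search divides each beam's cumulative log-probability score by
# length**length_penalty. With a negative exponent, longer beams are
# penalized, so shorter candidates win out.
gen_config = GenerationConfig(
    num_beams=4,          # enables beam search (assumed beam count)
    length_penalty=-1.0,  # negative value nudges outputs shorter
    max_new_tokens=64,    # assumed cap on generated length
)

# Usage, assuming a loaded model and tokenized `inputs`:
# outputs = model.generate(**inputs, generation_config=gen_config)
```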

BTW, if you come across better variable names, by all means, please suggest them :) We have so many features on our to-do list (including better docs) that every little...

@hollance Re `.generate()` not supporting TTS -- `transformers` doesn't have any TTS model, and in fact `.generate()` only supports text (or other sequences of integers) as output. I'm not sure whether...

Hi @gtebbutt 👋 It is not a bug, but a duplication intended for backward compatibility. If you look at the code, the line `t = t / self.scaling_factor` is used exclusively to...

Hey there 👋 Assessing the model confidence for text generation is a very interesting topic, and its application would certainly be of value for `Open-Assistant`. As an external observer to...
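One illustrative approach (a hypothetical sketch, not something proposed in the thread): treat the length-normalized per-token log-probability of a generated sequence as a confidence score, assuming per-token log-probabilities are already available (e.g. from generation scores):

```python
import math

def sequence_confidence(token_logprobs):
    """Length-normalized confidence: exp(mean per-token log-probability).

    token_logprobs: one log-probability per generated token.
    Returns a value in (0, 1]; higher means the model was more certain.
    """
    if not token_logprobs:
        raise ValueError("need at least one token log-probability")
    mean_logprob = sum(token_logprobs) / len(token_logprobs)
    return math.exp(mean_logprob)

# A fairly confident 3-token sequence vs. an uncertain one
confident = sequence_confidence([-0.1, -0.2, -0.05])
uncertain = sequence_confidence([-2.3, -1.9, -2.7])
assert confident > uncertain
```

Normalizing by length keeps short and long generations comparable; raw cumulative log-probability would systematically score longer outputs as less confident.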

@fzyzcjy We use black 22.3 (see [here](https://github.com/huggingface/transformers/blob/e1cd78634ae4ba2b0a3d548bd6663c08765a8b4d/setup.py#L101))

@fzyzcjy Suggestion: revert to the first commit (which only touches 2 lines), run `make fixup` (which only touches modified files), then force commit the result :)

Hey @erlichsefisalesforce 👋 looking at the stack trace, we see that the first dimension of `inputs`, the batch size, is unknown (shape = `[None, 1]`). It is possible that our generate function...

Hey @erlichsefisalesforce -- in that case, I will need a reproducible example to debug. The example you shared above contains references to local files :)