Omar Sanseviero
Actually I misread the code. Now I realize this is an ASR model, for which we don't have support in `fairseq` in the API. I'll add `asr`, but the model...
We do allow passing `top_k` and `top_p` for generation tasks at the moment (see https://github.com/huggingface/huggingface_hub/blob/main/api-inference-community/api_inference_community/validation.py#L81-L82, although this seems unused). But this is passed in the input, not as parameters :thinking:....
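For context, a minimal sketch of what validation of `top_k`/`top_p` in the payload might look like. This is not the actual `validation.py` code, just an illustration of the kind of range checks such a module typically performs (the field names and bounds here are assumptions):

```python
def validate_generation_params(payload: dict) -> dict:
    """Toy validator: check top_k / top_p if present in the payload.

    Hypothetical sketch, not the real api-inference-community logic.
    """
    top_k = payload.get("top_k")
    if top_k is not None:
        # top_k is assumed to be a positive integer
        if not isinstance(top_k, int) or top_k < 1:
            raise ValueError(f"top_k must be a positive int, got {top_k!r}")

    top_p = payload.get("top_p")
    if top_p is not None:
        # top_p is assumed to be a probability mass in (0, 1]
        if not isinstance(top_p, (int, float)) or not (0.0 < top_p <= 1.0):
            raise ValueError(f"top_p must be in (0, 1], got {top_p!r}")

    return payload


# Usage: valid payloads pass through, invalid ones raise.
validate_generation_params({"inputs": "hello", "top_k": 5, "top_p": 0.9})
```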
cc @sanchit-gandhi @Vaibhavs10
cc @Vaibhavs10 @sanchit-gandhi
Hi there! 🤗 Not at the moment. I would suggest asking in https://discuss.huggingface.co/ and linking to the corresponding blog post.
cc @pcuenca and @philschmid as well here.

> If we need to control the length of input sequences, should we initialize the tokenizer with `model_max_length=X, truncation=True`?

Yes.

> Shouldn't we then...
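To make the truncation behavior concrete, here is a toy sketch of what setting `model_max_length` and calling with `truncation=True` does. The `ToyTokenizer` class is made up for illustration; with `transformers` you would instead do `AutoTokenizer.from_pretrained(name, model_max_length=X)` and call `tokenizer(text, truncation=True)`:

```python
class ToyTokenizer:
    """Hypothetical whitespace tokenizer mimicking the truncation behavior
    of a transformers tokenizer configured with model_max_length."""

    def __init__(self, model_max_length: int):
        self.model_max_length = model_max_length

    def __call__(self, text: str, truncation: bool = False) -> dict:
        # Fake token ids: one id per whitespace-separated word.
        ids = [hash(word) % 30000 for word in text.split()]
        if truncation:
            # With truncation=True, sequences are cut to model_max_length.
            ids = ids[: self.model_max_length]
        return {"input_ids": ids}


tokenizer = ToyTokenizer(model_max_length=4)
out = tokenizer("one two three four five six", truncation=True)
# The encoded sequence never exceeds model_max_length tokens.
```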
cc @sanchit-gandhi @Vaibhavs10
I think we can aim to release on Monday. WDYT?
cc @simoninithomas
cc @clefourrier