
attention_mask

netagl opened this issue 8 months ago • 3 comments

Hi, I have an attention_mask mismatch problem in the cross attention.

Can you please explain this line: `requires_attention_mask = "encoder_outputs" not in model_kwargs`?

Why does it come after this?

```python
if "encoder_outputs" not in model_kwargs:
    # encoder_outputs are created and added to model_kwargs
    model_kwargs = self._prepare_text_encoder_kwargs_for_generation(
        inputs_tensor, model_kwargs, model_input_name, generation_config,
    )
```
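For reference, here is my understanding of how these pieces fit together, written as a standalone sketch rather than the actual library code (the helper below is a hypothetical simplification of `_prepare_attention_mask_for_generation`, and the encoder call is omitted):

```python
import torch


def build_default_attention_mask(input_ids: torch.Tensor, pad_token_id: int) -> torch.Tensor:
    """Hypothetical stand-in for _prepare_attention_mask_for_generation:
    1 for real tokens, 0 for padding."""
    return (input_ids != pad_token_id).long()


def prepare_kwargs_sketch(input_ids: torch.Tensor, model_kwargs: dict, pad_token_id: int) -> dict:
    # If the caller already passed precomputed encoder_outputs, generate()
    # assumes it should not fabricate a default attention mask for them.
    requires_attention_mask = "encoder_outputs" not in model_kwargs

    # A default mask is built only when none was supplied and one is required.
    if model_kwargs.get("attention_mask") is None and requires_attention_mask:
        model_kwargs["attention_mask"] = build_default_attention_mask(input_ids, pad_token_id)

    # Only afterwards, if encoder_outputs are still missing, does the text
    # encoder actually run (omitted here); the mask prepared above is what
    # later masks the encoder states in cross attention.
    return model_kwargs


# Tiny demo: a padded batch with no mask passed in.
ids = torch.tensor([[5, 6, 7, 0], [8, 9, 0, 0]])  # 0 = pad
print(prepare_kwargs_sketch(ids, {}, pad_token_id=0)["attention_mask"])
```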

Is the attention mask needed for the cross-attention layer during generation? This mismatch problem occurs only in generation; train & eval are OK.
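For context, this is roughly how I call generation; a minimal sketch assuming the README-style API, with both attention masks passed explicitly from the tokenizer (`parler-tts/parler_tts_mini_v0.1` is just an example checkpoint):

```python
import torch
from transformers import AutoTokenizer
from parler_tts import ParlerTTSForConditionalGeneration

device = "cuda" if torch.cuda.is_available() else "cpu"

model = ParlerTTSForConditionalGeneration.from_pretrained(
    "parler-tts/parler_tts_mini_v0.1"
).to(device)
tokenizer = AutoTokenizer.from_pretrained("parler-tts/parler_tts_mini_v0.1")

description = "A female speaker with a calm, clear voice."
prompt = "Hello, this is a test."

# Tokenize the description (text-encoder / cross-attention input) and the
# transcript prompt separately, keeping the attention mask for each.
desc = tokenizer(description, return_tensors="pt").to(device)
prom = tokenizer(prompt, return_tensors="pt").to(device)

generation = model.generate(
    input_ids=desc.input_ids,
    attention_mask=desc.attention_mask,  # masks the cross-attention keys/values
    prompt_input_ids=prom.input_ids,
    prompt_attention_mask=prom.attention_mask,
)
audio = generation.cpu().numpy().squeeze()  # sampled at model.config.sampling_rate
```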

Thanks!

netagl • May 30 '24 11:05