parler-tts
attention_mask
Hi, I have an attention_mask mismatch problem in the cross attention.
Can you please explain this line: requires_attention_mask = "encoder_outputs" not in model_kwargs ?
And why does it come after this:
if "encoder_outputs" not in model_kwargs:
# encoder_outputs are created and added to model_kwargs
model_kwargs = self._prepare_text_encoder_kwargs_for_generation(
inputs_tensor,
model_kwargs,
model_input_name,
generation_config,
)
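My tentative reading of the ordering: the flag is evaluated before the encoder runs, so it records whether the caller passed precomputed encoder_outputs into generate. A toy sketch of the semantics as I understand them (hypothetical helper name, not the actual library code) — is this the intent?

def needs_default_attention_mask(model_kwargs: dict) -> bool:
    # If the caller already supplied precomputed encoder_outputs,
    # generate() appears to assume they also supply the matching encoder
    # attention mask, so it skips building a default one from the inputs.
    return "encoder_outputs" not in model_kwargs

print(needs_default_attention_mask({}))                             # True: default mask is built
print(needs_default_attention_mask({"encoder_outputs": object()}))  # False: caller's responsibility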
Is the attention mask needed for the cross-attention layers in the generation part? This mismatch problem occurs only in generation; train & eval are OK.
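For context, here is roughly how I am calling generate, passing both masks explicitly (a minimal sketch; the checkpoint name and texts are placeholders):

import torch
from transformers import AutoTokenizer
from parler_tts import ParlerTTSForConditionalGeneration

device = "cuda:0" if torch.cuda.is_available() else "cpu"
repo = "parler-tts/parler-tts-mini-v1"  # placeholder checkpoint

model = ParlerTTSForConditionalGeneration.from_pretrained(repo).to(device)
tokenizer = AutoTokenizer.from_pretrained(repo)

# The description feeds the text encoder (cross attention); the prompt is the transcript.
description = tokenizer("A calm female voice.", return_tensors="pt").to(device)
prompt = tokenizer("Hello there!", return_tensors="pt").to(device)

audio = model.generate(
    input_ids=description.input_ids,
    attention_mask=description.attention_mask,    # encoder / cross-attention mask
    prompt_input_ids=prompt.input_ids,
    prompt_attention_mask=prompt.attention_mask,  # transcript prompt mask
)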
Thanks!