[Tests] Fix CI for deprecated attention block when used with `device_map`
What does this PR do?
Ran a round of fast GPU tests (from push_tests.yml). They all pass except for the deprecated attention block.
I think the change is okay, since it doesn't introduce any performance regressions in CI either.
The failure: https://github.com/huggingface/diffusers/actions/runs/10734214122/job/29768965396#step:6:4275
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
cc @DN6 is this ok to merge?
@DN6 okay to merge?
Sorry, I missed this. Yes, this can be merged. The failing tests are unrelated.
Oh wait, was this already fixed by this? https://github.com/huggingface/diffusers/blob/a4c1aac3ae10172f4acb8eaf83aac7f1f6e02ab0/tests/models/test_attention_processor.py#L88
PR: https://github.com/huggingface/diffusers/pull/10359
You're right.