Dhruv Nair

205 comments by Dhruv Nair

@josemerinom Could you share the exact traceback here? Not a screenshot.

@josemerinom Should be fixed in main now.

@sayakpaul We can merge once the conflicts are resolved here

Hi @ernestchu sorry for the delay here. Yeah this is tricky since the `conv_in` channels for models like PIA are loaded directly into the UNet. I assume you're working with...

Hi @JemiloII the issue here isn't the `pt` format; rather, it's that the checkpoint contains serialised objects that are not model weights. See the attached screenshot below. We switched to not allowing...
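
A minimal sketch of what such a restriction looks like at the PyTorch level (the file path is a placeholder, and this is an illustration of the idea, not the exact diffusers code path): `weights_only=True` refuses to unpickle arbitrary Python objects, and re-saving just the tensors as safetensors avoids pickled objects entirely.

```python
import torch
from safetensors.torch import save_file

# Placeholder path. weights_only=True refuses to unpickle arbitrary Python
# objects, so a checkpoint that bundles non-weight objects fails loudly here.
state_dict = torch.load("checkpoint.pt", map_location="cpu", weights_only=True)

# Keep only the tensors and re-save as safetensors, which stores raw tensor
# data and sidesteps pickle altogether.
tensors = {k: v.contiguous() for k, v in state_dict.items() if isinstance(v, torch.Tensor)}
save_file(tensors, "checkpoint.safetensors")
```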

Which version of diffusers are you using? The snippet I shared is meant to be run with a version newer than 0.27.2. Based on the traceback it seems like you're using version...
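
For reference, a quick way to confirm the installed version (a minimal check, not part of the original snippet):

```python
import diffusers

# The snippet above assumes a release newer than 0.27.2; if this prints an
# older version, upgrade with `pip install -U diffusers`.
print(diffusers.__version__)
```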

> I tried with 0.27.2 and the latest release. The goal is to update to the latest, but diffusers keeps making breaking changes. This PT one is crazy as...

Hi @openSourcerer9000 unfortunately MPS sometimes lacks efficient kernels for certain operations, which could be the reason for the memory spike. Does the stack trace mention where in the inference process...
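
One commonly suggested mitigation for memory spikes on MPS, independent of this specific report, is attention slicing; a minimal sketch, assuming a Stable Diffusion pipeline (the model ID is a placeholder):

```python
import torch
from diffusers import StableDiffusionPipeline

# Placeholder model ID; swap in whichever checkpoint is actually being used.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
pipe = pipe.to("mps")

# Slicing the attention computation lowers peak memory at a small speed cost,
# which often tames memory spikes on MPS.
pipe.enable_attention_slicing()

image = pipe("an astronaut riding a horse").images[0]
```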

I'm with @yiyixuxu on the dedicated method for the initial version. If we go with `ModelMixin.compile` with the regional args turned off by default, then we're effectively the same as...
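
Purely to illustrate the trade-off being discussed, here is a toy sketch of the two API shapes; the class, method names, and signatures are all hypothetical and not the real `ModelMixin` API:

```python
import torch
from torch import nn


class ToyModel(nn.Module):
    """Toy stand-in for a diffusers model with repeated transformer-style blocks."""

    def __init__(self):
        super().__init__()
        self.blocks = nn.ModuleList(nn.Linear(8, 8) for _ in range(4))

    def forward(self, x):
        for block in self.blocks:
            x = block(x)
        return x

    # Option A (hypothetical): a dedicated method whose name carries the
    # regional-compilation intent, compiling each repeated block separately.
    def compile_repeated_blocks(self, **kwargs):
        for i, block in enumerate(self.blocks):
            self.blocks[i] = torch.compile(block, **kwargs)

    # Option B (hypothetical): fold it into a generic compile() behind a
    # default-off flag. With the flag off this is effectively just plain
    # torch.compile on the whole model, which is the concern raised above.
    def compile(self, regional=False, **kwargs):
        if regional:
            self.compile_repeated_blocks(**kwargs)
            return self
        return torch.compile(self, **kwargs)


model = ToyModel()
model.compile_repeated_blocks()  # Option A: the intent is explicit in the name
out = model(torch.randn(2, 8))
```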

LGTM. There's a failing test that looks related to saving/loading the transformer.