Evin Pınar Örnek

Results: 11 comments by Evin Pınar Örnek

Yes, I am adding them as soon as possible.

Indeed. For example, when we reload a notebook and execute many cells that are ordered logically but have unrelated cells in between, either a bookmark or a "jump from a...

I have the same problem; I receive a more or less constant sound. Did you already solve it? Is it conditioned on anything, and how long and with which data did you train...

Solved: my problem was that I passed the quantized inputs as scalars instead of creating one-hot vectors.
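The fix described above can be sketched roughly as follows (a minimal NumPy illustration; the function name and class count are hypothetical, not from the original code):

```python
import numpy as np

def to_one_hot(indices, num_classes):
    """Convert quantized scalar indices into one-hot vectors."""
    one_hot = np.zeros((len(indices), num_classes), dtype=np.float32)
    one_hot[np.arange(len(indices)), indices] = 1.0
    return one_hot

# Quantized inputs fed as raw scalars (what caused the problem)...
quantized = [3, 0, 2]
# ...versus the same inputs expanded into one-hot vectors (the fix)
vectors = to_one_hot(quantized, num_classes=4)
```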

Nothing very specific; I run the scripts on input that I generate from real point-cloud data.

Hi @andreaferretti, > [Here](https://github.com/huggingface/diffusers/issues/2121#issuecomment-1415798328) there is an example of implementing Attend and Excite as an attention processor, but some points are still a little obscure to me, for instance: >...

Hi [andreaferretti](https://github.com/andreaferretti), the way I understood and use the processors aligns with your explanation. But I'm sure there can be many other exciting ways to use them.

This looks plausible, thanks! Furthermore, with the xformers implementation, how can we retrieve the softmaxed q·k attention map (before it is applied to the values)? See here: https://github.com/facebookresearch/xformers/blob/5df1f0b682a5b246577f0cf40dd3b15c1a04ce50/xformers/ops/fmha/__init__.py#L149

> ```python
> class XFormersCrossAttnKVProcessor:
>...
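As far as I understand, xformers' fused memory-efficient attention never materializes the softmaxed map, so to inspect it you have to recompute it from the same q and k outside the kernel. A minimal sketch (NumPy for brevity; a real processor would do this in torch on the projected q/k):

```python
import numpy as np

def attention_probs(q, k, scale=None):
    """Recompute softmax(q @ k^T * scale), the map the fused kernel hides.

    q: (batch, q_len, dim), k: (batch, k_len, dim).
    Returns (batch, q_len, k_len) attention probabilities.
    """
    if scale is None:
        scale = 1.0 / np.sqrt(q.shape[-1])
    scores = q @ k.transpose(0, 2, 1) * scale
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    probs = np.exp(scores)
    return probs / probs.sum(axis=-1, keepdims=True)
```

Note this doubles the memory cost for those layers, which is exactly what the fused kernel avoids.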

If useful for anyone, I've implemented Attend-and-Excite with the AttentionProcessors; an example is here: https://github.com/evinpinar/Attend-and-Excite-diffusers/blob/72fa567a1e3bb3cc1b63cb53a1d9db5fc10b241f/utils/ptp_utils.py#L57

```python
class AttendExciteCrossAttnProcessor:
    def __init__(self, attnstore, place_in_unet):
        super().__init__()
        self.attnstore = attnstore
        self.place_in_unet =...
```
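For context, the `attnstore` such a processor calls into can be as simple as a container that accumulates cross-attention maps per UNet location. A minimal sketch, assuming the store is keyed by `'down'`/`'mid'`/`'up'` blocks as in prompt-to-prompt-style code (the exact interface in the linked repo may differ):

```python
class AttentionStore:
    """Accumulate attention maps, keyed by where in the UNet they came from."""

    def __init__(self):
        self.step_store = {"down": [], "mid": [], "up": []}

    def __call__(self, attention_probs, is_cross, place_in_unet):
        # Attend-and-Excite only needs cross-attention (text-to-image) maps,
        # so self-attention maps are ignored here.
        if is_cross:
            self.step_store[place_in_unet].append(attention_probs)

    def reset(self):
        self.step_store = {"down": [], "mid": [], "up": []}
```

Each processor instance is then constructed with this shared store plus its own `place_in_unet` tag, so after a denoising step the maps can be aggregated for the attend-and-excite loss.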