Maatra
https://github.com/facebookresearch/dinov2/issues/19#issuecomment-1514310057 shows how to replace memory_efficient_attention with normal attention, which could fix your issue.
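In case it helps, here is a rough sketch of what a plain (non-xFormers) attention forward looks like; the `PlainAttention` name and layer names are just illustrative, the linked comment shows the exact change to make inside dinov2's attention module:

```python
import torch
import torch.nn as nn


class PlainAttention(nn.Module):
    """Standard multi-head self-attention, no xformers.ops.memory_efficient_attention."""

    def __init__(self, dim: int, num_heads: int = 8, qkv_bias: bool = True):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.scale = self.head_dim ** -0.5
        self.qkv = nn.Linear(dim, dim * 3, bias=qkv_bias)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, N, C = x.shape
        # (B, N, 3*C) -> (3, B, heads, N, head_dim)
        qkv = self.qkv(x).reshape(B, N, 3, self.num_heads, self.head_dim).permute(2, 0, 3, 1, 4)
        q, k, v = qkv.unbind(0)
        # standard scaled dot-product attention
        attn = (q @ k.transpose(-2, -1)) * self.scale
        attn = attn.softmax(dim=-1)
        x = (attn @ v).transpose(1, 2).reshape(B, N, C)
        return self.proj(x)


# quick smoke test: 257 tokens (CLS + 16x16 patches), embedding dim 384
out = PlainAttention(dim=384, num_heads=6)(torch.randn(1, 257, 384))
```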
```
x_norm = self.norm(x)
return {
    "x_norm_clstoken": x_norm[:, 0],
    "x_norm_patchtokens": x_norm[:, 1:],
    "x_prenorm": x,
    "masks": masks,
}
```
There is a layer norm applied right before returning the output, also...
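For what it's worth, here's roughly how I pull those entries out of the returned dict (a sketch assuming the torch.hub entry point; the model variant and the random input are just placeholders):

```python
import torch

# placeholder model/input for illustration
model = torch.hub.load("facebookresearch/dinov2", "dinov2_vits14")
model.eval()
images = torch.randn(1, 3, 224, 224)

with torch.no_grad():
    out = model.forward_features(images)

cls_token = out["x_norm_clstoken"]        # (B, C), after the final LayerNorm
patch_tokens = out["x_norm_patchtokens"]  # (B, N, C), after the final LayerNorm
prenorm = out["x_prenorm"]                # all tokens (CLS included) before the LayerNorm
```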
Can you share the link?
It's the L2 norm, as per the graph in the paper. It makes most sense for it to be taken over the channels of the patches (no CLS), and they seem to...
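Concretely, something like this is what I mean (the tensor here is a placeholder standing in for the `x_norm_patchtokens` entry; the grid size assumes a square input with 14-pixel patches):

```python
import torch

# placeholder: in practice this is out["x_norm_patchtokens"] from forward_features,
# shape (B, N, C) with N the number of patches
patch_tokens = torch.randn(1, 256, 384)

# L2 norm over the C feature channels of each patch (CLS token excluded)
patch_norms = patch_tokens.norm(p=2, dim=-1)  # (B, N)

# reshape into the patch grid for plotting, e.g. a 224x224 input with
# 14x14-pixel patches gives a 16x16 grid
B, N, _ = patch_tokens.shape
side = int(N ** 0.5)
norm_map = patch_norms.reshape(B, side, side)
```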
@SmartDever02 Did you find a way to pass authentication data to an embedded gradio iframe?
I heavily recommend the "Pro Git" book by Scott Chacon; it's available online for free and demystifies a lot surrounding git branching, merging, etc.
There are two repositories you are trying to change. You want to change [sd-scripts @ bfb352b](https://github.com/kohya-ss/sd-scripts/tree/bfb352bc433326a77aca3124248331eb60c49e8c) (the one currently pointed at in kohya_ss) to merge the hyperparameter pull request. You want...
The key here is understanding that most PRs are based on forks, so the hyperparameter PR is not located within the official sd-scripts tree; it is located in a branch I created...
Oops, I don't think you want to open a PR with your branch on the sd-scripts repo.
A pull request is meant for when you want the changes you made in your own copy of the repository (the fork) to also be made in the main repository (official...