JulioHC00

18 comments by JulioHC00

Also interested in how one can use this for our own annotations!

Hi, is there any solution available now for limiting the cache size?

> Thanks for the report and for your interest in contributing to shap. The warning is thrown [here](https://github.com/shap/shap/blob/master/shap/explainers/_deep/deep_pytorch.py#L242). In order to support the layers you point out, one needs to...

I see that #2549 tried to implement LayerNorm; is there any reason why this isn't in the current release?

@CloseChoice Any ideas as to why it's wrong? I'm going to go through the paper you mention, and I'd love to work on it. Hopefully it's got something to do...

> Pretty sure it'll help with #3881. Here is how I would tackle this:
>
> * create a very basic network with only a couple inputs (2-3), a normalization...
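For reference, a minimal sketch of the kind of toy setup the quoted plan describes. The exact architecture here (3 inputs, a LayerNorm, a single linear head) is my own assumption for illustration, not taken from the thread:

```python
import torch
import torch.nn as nn

# A tiny network with a normalization layer, along the lines of the quoted plan.
class TinyNormNet(nn.Module):
    def __init__(self, n_inputs: int = 3):
        super().__init__()
        self.norm = nn.LayerNorm(n_inputs)  # swap for nn.BatchNorm1d(n_inputs) to compare
        self.head = nn.Linear(n_inputs, 1)

    def forward(self, x):
        return self.head(self.norm(x))

model = TinyNormNet().eval()
x = torch.tensor([[-1., 4., 1.]])
background = torch.tensor([[1., 2., 3.]])

# With shap installed, attributions could then be inspected via:
#   import shap
#   explainer = shap.DeepExplainer(model, background)
#   shap_values = explainer.shap_values(x)
```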

Ok, so here are some thoughts so far.

1. BatchNorm can be treated as a linear operation: when the model is in eval mode, the mean and variance for the normalization...
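To make point 1 concrete, here is a quick check (my own sketch, not from the thread) that BatchNorm in eval mode reduces to an elementwise affine map, since it uses fixed running statistics rather than batch statistics:

```python
import torch
import torch.nn as nn

# In eval mode, BatchNorm applies y = (x - running_mean) / sqrt(running_var + eps)
# * weight + bias, which is just y = a * x + b with constant a and b.
bn = nn.BatchNorm1d(3)
bn.train()
_ = bn(torch.randn(64, 3))  # populate running_mean / running_var
bn.eval()

x = torch.randn(5, 3)
a = bn.weight / torch.sqrt(bn.running_var + bn.eps)
b = bn.bias - bn.running_mean * a

assert torch.allclose(bn(x), x * a + b, atol=1e-6)
```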

> > Ok, so here are some thoughts so far.
> >
> > 1. BatchNorm can be treated as a linear operation: when the model is in eval mode, the...

This is what it results in:

```
Joint x shape: [torch.Size([2, 3])]
Joint x: [tensor([[-1.,  4.,  1.],
        [ 1.,  2.,  3.]])]
```

It works fine because there's always the batch...
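For context, the printout above can be reproduced with the following sketch, assuming the joint input is just the sample and the background concatenated along the batch dimension (my reading of shap's deep_pytorch code, so treat it as an assumption):

```python
import torch

x = torch.tensor([[-1., 4., 1.]])
background = torch.tensor([[1., 2., 3.]])

# Joint input: sample stacked on top of the background along the batch dim,
# kept in a list because DeepExplainer handles multi-input models as lists.
joint_x = [torch.cat([x, background], dim=0)]
print("Joint x shape:", [t.shape for t in joint_x])
print("Joint x:", joint_x)
```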

@CloseChoice Some updates. The nonlinear_1d **works** if you ever so slightly shift the input that's equal to the background in the LayerNorm. I don't know if this helps understand how...
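For concreteness, here's a sketch of the kind of shift I mean, presumably avoiding a zero input-background difference in the nonlinear_1d handler; the `eps` value and the helper itself are hypothetical, purely for illustration:

```python
import torch

# Where an input feature exactly equals its background value, nudge it
# by a tiny eps before explaining, so the difference is never exactly zero.
def shift_inputs_equal_to_background(x, background, eps=1e-6):
    equal = torch.isclose(x, background)
    return torch.where(equal, x + eps, x)

x = torch.tensor([[1., 4., 3.]])
background = torch.tensor([[1., 2., 3.]])
x_shifted = shift_inputs_equal_to_background(x, background)
# Only the features matching the background move: [[1 + eps, 4., 3 + eps]]
```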