How to make DeepLIFT support a customized Keras layer?
Hi,
I was trying to use DeepLIFT to interpret my CNN model, but I don't know how to convert the model because I built a customized layer. My model is built with Keras, and that customized layer is something like global max pooling. My results show it works well for my data, so I should not remove it. However, I really want to understand my model, and DeepLIFT seems like a great choice, so I was wondering how to make DeepLIFT support my customized layer?
Thanks!
Hi @KeLanhhh,
Thanks for reaching out. Is it not possible to achieve global maxpooling by specifying a pooling width that covers your entire input?
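For example, something along these lines might work (a rough Keras 1.x sketch; the input length, filter sizes, and layer choices below are just placeholders, not your actual model):

from keras.models import Sequential
from keras.layers import Convolution1D, MaxPooling1D, Flatten, Dense

SEQ_LEN = 200  # placeholder input length
model = Sequential()
model.add(Convolution1D(nb_filter=32, filter_length=8, activation='relu',
                        input_shape=(SEQ_LEN, 4)))
# "global" max pooling: make the pooling width span the whole conv output,
# so a single maximum is taken per filter
model.add(MaxPooling1D(pool_length=SEQ_LEN - 8 + 1))
model.add(Flatten())
model.add(Dense(1, activation='sigmoid'))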
To make DeepLIFT support your customized layer, you would need to define a DeepLIFT layer object that corresponds to your layer, and then define a conversion function. I can explain how to do these things, but it may be better to wait until after a DeepLIFT update I am planning to put out in the next week or so (the update will be accompanied by a new arXiv preprint). Would it be possible to wait that long? If not, let me know and I can guide you on how to achieve your desired layer in the current implementation (assuming that a pooling layer with a sufficiently large pooling width does not satisfy your use-case).
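Just to give a rough idea of the shape of what's involved (this is purely an illustrative sketch; the class names, method names, and registration step below are placeholders, not the actual DeepLIFT API):

class MyCustomDeepliftLayer(object):  # placeholder stand-in for a DeepLIFT layer base class
    # 1) reproduce the forward computation of the Keras layer
    # 2) define how DeepLIFT multipliers/contributions are propagated back through it
    def forward(self, inputs):
        raise NotImplementedError
    def backprop_multipliers(self, multipliers):
        raise NotImplementedError

def convert_my_custom_layer(keras_layer, **kwargs):
    # placeholder conversion function: read the Keras layer's config/weights
    # and return the corresponding DeepLIFT layer object
    return MyCustomDeepliftLayer(**kwargs)

# the conversion function would then be registered so the model converter
# knows what to do when it encounters the custom layer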
Thank you for the quick reply, @AvantiShri! A week or longer would be okay for me, just do your work first. By the way, my input length is not too long, so selecting the top k max features might perform better than the usual local max pooling. P.S.: looking forward to your big update!
Does DeepLIFT support deep residual networks? I have a model with deep residual connections and want to use DeepLIFT to interpret it too. Thanks!
Hi @ttgump - not yet, but it would not be hard to add. Can you tell me more about the layer types you use to implement the resnet (are you using Keras?)
@AvantiShri Yes, I am using Keras.
@ttgump Great, if I understand correctly you are probably using a merge layer with "sum" as the merge mode. Currently I have support for the "concat" merge mode but I can add "sum". Just to be on the safe side, can you let me know any other layer types you might be using?
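For reference, the kind of residual block I have in mind looks roughly like this in Keras 1.x (the layer sizes below are just placeholders):

from keras.models import Model
from keras.layers import Input, Convolution1D, Activation, merge

inp = Input(shape=(200, 32))  # placeholder input shape
conv = Convolution1D(32, 3, border_mode='same', activation='relu')(inp)
conv = Convolution1D(32, 3, border_mode='same')(conv)
# the residual connection: elementwise sum of the shortcut and the conv branch
res = merge([inp, conv], mode='sum')
out = Activation('relu')(res)
model = Model(input=inp, output=out)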
Hi, @AvantiShri
I read your new preprint of DeepLIFT, nice job! I think the separated positive and negative contributions might be very useful for my work.
As we discussed before, I want to apply DeepLIFT to my model, which contains a customized k-max pooling layer. At your convenience, could you show me how to achieve my desired layer in the current DeepLIFT implementation? If needed, I can also post the code of my layer here.
Thank you :)
@KeLanhhh Glad you liked the paper! Yes, it would be helpful if you tell me about the implementation of your layer here, and then I can advise you on how to adapt it for DeepLIFT.
@AvantiShri Here is the code of the k-max pooling layer used in my model.
import theano
from keras import backend as K
from keras.engine.topology import Layer

class KMaxPooling(Layer):
    """Keeps the K largest values along the time axis for each feature map."""

    def __init__(self, K, **kwargs):
        super(KMaxPooling, self).__init__(**kwargs)
        self.K = K

    def get_output_shape_for(self, input_shape):
        # batch and feature dimensions are unchanged; the time dimension shrinks to K
        shape = list(input_shape)
        shape[1] = self.K
        return tuple(shape)

    def call(self, x, mask=None):
        k = theano.tensor.cast(self.K, dtype="int32")
        # sort along the time axis and keep the K largest values
        # (they are returned in sorted order, not in their original positions)
        sorted_x = theano.tensor.sort(x, axis=1)
        return sorted_x[:, -k:, :]

    def get_config(self):
        config = {"K": self.K}  # key matches the __init__ argument so the layer can be reloaded
        base_config = super(KMaxPooling, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))
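For reference, the layer sits in the model roughly like this (the surrounding architecture and sizes below are only an illustrative sketch, not my exact model):

from keras.models import Sequential
from keras.layers import Convolution1D, Flatten, Dense

model = Sequential()
model.add(Convolution1D(32, 8, activation='relu', input_shape=(200, 4)))
model.add(KMaxPooling(K=5))  # keep the 5 largest activations per filter
model.add(Flatten())
model.add(Dense(1, activation='sigmoid'))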
Is it hard to convert? Please let me know if you need any further information! Thanks!
Hi there,
I also have a customized layer implemented in Keras (code is here). It may be slightly more complicated than @KeLanhhh's case, though, as it is a trainable layer. Could you help with supporting this in DeepLIFT?
Thanks a lot in advance,
Best
Nicolas