
A memory efficient implementation of densenet.

efficient_densenet_tensorflow issues (3 results)

```python
class Back_Recompute(Layer):
    def __init__(self, nb_filter, bottleneck=False, dropout_rate=None,
                 weight_decay=1e-4, **kwargs):
        self.nb_filter = nb_filter
        self.weight_decay = weight_decay
        self.bottleneck = bottleneck
        self.dropout_rate = dropout_rate
        super(Back_Recompute, self).__init__(**kwargs)

    def call(self, ip):
        global brcount
        concat_axis...
```

I have learned so much from your project; it's good work. Thank you! I am now rewriting the project in TF 2.0, but I haven't been able to get the implementation working. So sad........

The method you propose for using `recompute_grad` does not work, except in the simplest case where every layer in the model is recomputed apart from the input and output layers. All...
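For context, the idea behind `recompute_grad` is gradient checkpointing: the forward pass discards intermediate activations, and the backward pass recomputes them on demand, trading extra compute for memory. The sketch below is a framework-free illustration of that trade-off for a toy chain of scalar layers — it is not the repo's implementation and the function names are hypothetical.

```python
# Toy illustration of gradient checkpointing: a chain of scalar "layers"
# y = w_n * ... * w_1 * x. The backward pass recomputes each layer's input
# from the saved chain input instead of storing all activations.

def forward(x, weights):
    # Forward pass keeps only the chain input, not per-layer activations.
    for w in weights:
        x = w * x
    return x

def grad_checkpointed(x0, weights):
    # Gradient of the output y with respect to each weight w_i.
    grads = []
    for i in range(len(weights)):
        # Recompute the input to layer i from scratch (compute-for-memory trade).
        a = x0
        for wj in weights[:i]:
            a = wj * a
        # dy/dw_i = (product of downstream weights) * (recomputed input to layer i)
        downstream = 1.0
        for wj in weights[i + 1:]:
            downstream *= wj
        grads.append(downstream * a)
    return grads
```

For example, with `x0 = 1.0` and `weights = [2.0, 3.0, 4.0]`, `forward` returns `24.0` and `grad_checkpointed` returns `[12.0, 8.0, 6.0]` — the same gradients a standard backward pass would produce, but computed without retaining intermediate activations.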