
ZeroDivisionError: centered_arange step size is 0

[Open] datwelk opened this issue 8 years ago · 5 comments

In upsample.py line 352, the step size (third argument) passed to np.arange is 0, which raises a ZeroDivisionError. This happens because t has value 1 and reduction has value 2: 1 // 2 = 0. The fieldmap is ((0, 0), (4, 4), (1, 1)).

The conv layer being analyzed has a 4x4 kernel, 4 output channels, and a stride of 1. Its input size is 3x120x120 and its output size is 4x117x117.
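For reference, the failing arithmetic can be reproduced in isolation (a minimal sketch, not code from upsample.py; `t` and `reduction` take the values reported above, and np.arange itself is what raises the ZeroDivisionError when handed a step of 0):

```python
import numpy as np

# Values from the report above: t = 1, reduction = 2.
t, reduction = 1, 2
step = t // reduction  # integer division: 1 // 2 == 0

try:
    np.arange(0, 117, step)  # a step of 0 is rejected by np.arange
except ZeroDivisionError as e:
    print('reproduced:', e)
```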

I am going to debug this further, but I wanted to share it already.

datwelk avatar Jun 09 '17 17:06 datwelk

I ran the code just as the README shows, but I got this error:

    Traceback (most recent call last):
      File "src/labelprobe.py", line 306, in <module>
        parallel=args.parallel)
      File "src/labelprobe.py", line 101, in label_probe
        thresh, labelcat, batch_size, ahead, verbose, parallel)
      File "src/labelprobe.py", line 134, in fast_process
        result.get(31536000)
      File "/usr/lib/python2.7/multiprocessing/pool.py", line 558, in get
        raise self._value
    TypeError: __call__() got an unexpected keyword argument 'grid'

Is there anything wrong with my Python 2.7, or should I use Python 3.5? And why?

lingeo avatar Jun 23 '17 04:06 lingeo

@lingeo: please open a new issue for a different problem; this is unrelated. I've moved your question to #5.

davidbau avatar Jun 28 '17 12:06 davidbau

@datwelk: are you still encountering this ZeroDivisionError? Some of the logic in upsample.py depends on the specific way your proto config is written. It will be easier to debug if you can share a specific proto that triggers the error.

davidbau avatar Jun 28 '17 12:06 davidbau

@davidbau it's reproducible with any proto where the spatial dimension of the conv layer output is odd rather than even.

datwelk avatar Jun 28 '17 12:06 datwelk

I'm running into the same problem, but not only for odd-shaped output layers.

Using PyTorch, blob="features.3" in vgg19 (the ReLU before the first pool layer) has an activation shape of (224, 224); the reduction is 2, but the fieldmap is ((0,0), (1,1), (1,1)), since the input size is (224, 224). This is the same problem @datwelk mentioned (t // s = 1 // 2 = 0).
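In other words (a hypothetical restatement of the condition, not code from upsample.py): the step handed to np.arange collapses to zero whenever t is smaller than the reduction factor, regardless of whether the output shape is odd or even:

```python
def centered_step(t, reduction):
    # The integer step that, per the reports above, ends up as np.arange's
    # third argument; it collapses to 0 whenever t < reduction.
    return t // reduction

print(centered_step(2, 2))  # 1: fine
print(centered_step(1, 2))  # 0: triggers the ZeroDivisionError downstream
```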

To get interior PyTorch modules, I use the following function, e.g. get_pytorch_module(net, blob).register_forward_hook(hook_function):

    def get_pytorch_module(net, blob):
        # Walk a dotted blob name (e.g. 'features.3') down the module tree.
        modules = blob.split('.')
        if len(modules) == 1:
            return net._modules.get(blob)
        else:
            curr_m = net
            for m in modules:
                curr_m = curr_m._modules.get(m)
            return curr_m

ruthcfong avatar Oct 05 '17 11:10 ruthcfong