
flip_filters and pad parameters not used by the NeuralNet class

Open algila opened this issue 9 years ago • 5 comments

I would suggest adding to the NeuralNet class the Lasagne parameters 'pad' and 'flip_filters', which are used in ImageNet networks like VGG, GoogLeNet, or ResNet and are available today in several libraries but not in nolearn. The nolearn documentation does not mention either of them, but 'pad' is probably supported, because no error is generated when I add it. In contrast, adding the 'flip_filters' parameter to a layer raises an error. Please forgive me if I'm wrong.

algila avatar Nov 03 '16 07:11 algila

nolearn uses whatever Lasagne layers you throw at it. I have trained with layers using the pad parameter, so I believe that it should work. If you encounter an error when using these parameters, post the exact code here and we may help. Remember that those are Lasagne parameters, not nolearn parameters.
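For illustration, here is a minimal sketch of that naming convention (the layer names, sizes, and training parameters are just placeholders for this example): every keyword prefixed with a layer's name is forwarded to that Lasagne layer, so `conv1_pad=1` simply becomes `pad=1` for the layer named `'conv1'`.

```python
from lasagne import layers, nonlinearities
from nolearn.lasagne import NeuralNet

net = NeuralNet(
    layers=[
        ('input', layers.InputLayer),
        ('conv1', layers.Conv2DLayer),
        ('output', layers.DenseLayer),
    ],
    input_shape=(None, 1, 28, 28),
    # 'conv1_*' keywords are forwarded to the Conv2DLayer named 'conv1';
    # 'pad' is a Lasagne parameter, not a nolearn one.
    conv1_num_filters=32,
    conv1_filter_size=(3, 3),
    conv1_pad=1,
    output_num_units=10,
    output_nonlinearity=nonlinearities.softmax,
    update_learning_rate=0.01,
    update_momentum=0.9,
    max_epochs=1,
)
```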

BenjaminBossan avatar Nov 05 '16 14:11 BenjaminBossan

In effect, 'pad' is working, it is just not documented in nolearn, which is not so important. It is 'flip_filters' that is not working. Here is the portion of the code where I define the layers.

```python
cnn = NeuralNet(
    layers=[
        ('input', layers.InputLayer),
        ('conv1', layers.Conv2DLayer),     # Convolutional layer. Params defined below
        ('pool1', layers.MaxPool2DLayer),  # Like downsampling, for execution speed
        ('conv2', layers.Conv2DLayer),
        ('pool2', layers.MaxPool2DLayer),
        ('conv3', layers.Conv2DLayer),
        ('dropout1', layers.DropoutLayer),
        ('dense', layers.DenseLayer),
        ('dropout2', layers.DropoutLayer),
        # ('dense2', layers.DenseLayer),
        # ('dropout3', layers.DropoutLayer),
        ('output', layers.DenseLayer),
    ],

    input_shape=(None, 1, 28, 28),

    conv1_num_filters=96,
    conv1_pad=1,
    conv1_flip_filters=False,
    conv1_filter_size=(3, 3),
    conv1_nonlinearity=lasagne.nonlinearities.rectify,
    conv1_W=lasagne.init.GlorotUniform(),
    # ... (rest of the definition omitted)
```

The received error message is:

```
TypeError: Failed to instantiate <class 'lasagne.layers.conv.Conv2DLayer'> with args {'W': <lasagne.init.GlorotUniform object at 0x7f57d78cd890>, 'flip_filters': False, 'pad': 1, 'name': 'conv1', 'nonlinearity': <function rectify at 0x7f57dc0d29b0>, 'filter_size': (3, 3), 'incoming': <lasagne.layers.input.InputLayer object at 0x7f57d78cda50>, 'num_filters': 96}. Maybe parameter names have changed?
```

Everything works if I just comment out the line `conv1_flip_filters=False,`.

algila avatar Nov 05 '16 14:11 algila

This looks more like the Lasagne version installed with nolearn not supporting that parameter. Is it possible to instantiate a conv layer with that parameter at all?

layer = layers.Conv2DLayer(..., flip_filters=False)

If not, maybe installing the latest Lasagne version will help.
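For example, a quick check along these lines (a sketch, instantiating Lasagne's Conv2DLayer directly) should show whether the installed version knows about flip_filters:

```python
import lasagne
from lasagne.layers import InputLayer, Conv2DLayer

print(lasagne.__version__)  # the 0.1 release predates flip_filters

l_in = InputLayer((None, 1, 28, 28))
# A TypeError about an unexpected keyword argument here means the
# installed Lasagne is too old for flip_filters.
l_conv = Conv2DLayer(l_in, num_filters=96, filter_size=(3, 3),
                     pad=1, flip_filters=False)
```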

In general, the nolearn documentation will not repeat all the parameters from Lasagne. That does not mean that nolearn does not support them.

BenjaminBossan avatar Nov 05 '16 15:11 BenjaminBossan

I have nolearn 0.6.1, Theano 0.8.2, and Lasagne 0.1, but in my installation the file lasagne/layers/conv.py in the package folder does not have 'flip_filters' in the BaseConvLayer class. It is present in the Lasagne source available on the GitHub website.

I imagine that I do not have the latest version, but if I try the command 'sudo apt-get update lasagne', the response is that I already have the latest release.

I imagine that my installation is corrupted, so I tried to remove it with 'sudo apt-get --purge remove Lasagne', but the response is that lasagne is not found.

I have no other ideas to fix this issue.

algila avatar Nov 05 '16 23:11 algila

I solved it; the reason was exactly the old Lasagne version. Using:

```
pip install --upgrade https://github.com/Theano/Theano/archive/master.zip
pip install --upgrade https://github.com/Lasagne/Lasagne/archive/master.zip
```

I installed Lasagne 0.2dev2 together with Theano 0.9dev4, and now it is working. A billion thanks!
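In case it helps anyone else, a quick way to confirm which versions ended up installed (the exact version strings may differ):

```python
import theano
import lasagne

print(theano.__version__)   # should report a 0.9 dev release
print(lasagne.__version__)  # should report a 0.2 dev release
```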

BR Alberto

algila avatar Nov 05 '16 23:11 algila