spiking_relu_conversion

I want the same result as the paper.

Open · rhkr9609 opened this issue 3 years ago · 0 comments

Hi, I read your paper with interest, but I couldn't reproduce the paper's experimental results with this code. Below are the parameter values I used.

```matlab
cnn.layers = {
    struct('type', 'i')                                     % input layer
    struct('type', 'c', 'outputmaps', 12, 'kernelsize', 5)  % convolution layer
    struct('type', 's', 'scale', 2)                         % sub-sampling layer
    struct('type', 'c', 'outputmaps', 64, 'kernelsize', 5)  % convolution layer
    struct('type', 's', 'scale', 2)                         % sub-sampling layer
};
cnn = cnnsetup(cnn, train_x, train_y);

% Set the activation function to be a ReLU
cnn.act_fun = @(inp) max(0, inp);
% Set the derivative to be the binary derivative of a ReLU
cnn.d_act_fun = @(forward_act) double(forward_act > 0);

%% ReLU Train
% Set up learning constants
opts.alpha = 1;
opts.batchsize = 50;
opts.numepochs = 50;
opts.learn_bias = 0;
opts.dropout = 0.5;
cnn.first_layer_dropout = 0;
```
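For completeness, this is roughly how I call training and evaluation around those settings (a minimal sketch assuming the DeepLearnToolbox-style `cnntrain`/`cnntest` entry points this repository builds on, with MNIST in the usual 28x28 layout):

```matlab
% Assumed DeepLearnToolbox-style API: cnntrain/cnntest.
% train_x: 28x28xN images scaled to [0,1]; train_y: 10xN one-hot labels.
cnn = cnntrain(cnn, train_x, train_y, opts);  % backprop training with the opts above
[er, bad] = cnntest(cnn, test_x, test_y);     % er = classification error rate
fprintf('Test error: %.2f%%\n', er * 100);
```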

How should I fix it?

rhkr9609 · Aug 11 '21 06:08