
dbn+nn for regression

Open · Duanexiao opened this issue 9 years ago · 5 comments

Lately I have been trying dbn+nn for regression on the BlogFeedback dataset. I find that the predicted values on the test set are always the same, every time:

    dbn = dbnsetup(dbn, train_x, opts);
    dbn = dbntrain(dbn, train_x, opts);

    %unfold dbn to nn
    nn = dbnunfoldtonn(dbn, 1);
    nn.activation_function = 'sigm';
    nn.output = 'linear';

    %train nn
    opts.numepochs = 1;
    opts.batchsize = 100;
    opts.learningRate = 0.001;
    nn = nntrain(nn, train_x, train_y, opts);
    nn = nnff(nn, test_x, test_y);

nn.a{end} is the predicted value. Is there anything wrong? Or is there any dataset suited for regression?

Duanexiao · Aug 20 '15

Hello @Duanexiao, recently I also tried dbn+nn for a regression problem, but the fitted outputs are constant. Have you found out the reason for this issue? Is there anything wrong, or is dbn+nn in this toolbox only designed for certain datasets?

JannieLee · Dec 10 '15

The problem is that the output activation function is not 'linear' by default, but 'sigm'. That always returns an output between 0 and 1, which is not what you want for regression problems, so set it to 'linear'. Having looked at the code, though, there are many other implementation issues (the dropout implementation looks flat-out wrong to me; there is no calculation of the mean net anywhere), so I would not bother with this toolbox.
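For what it's worth, here is a minimal regression sketch using the same toolbox API quoted above (nnsetup, nntrain, nnff); the architecture, the toy data, and the nn.learningRate setting are my own assumptions, not something confirmed in this thread:

    % Minimal sketch: plain NN regression with a linear output layer.
    % Assumes DeepLearnToolbox's nnsetup/nntrain/nnff; toy data is made up.
    x = rand(1000, 10);                 % toy inputs in [0, 1]
    y = x * rand(10, 1);                % toy linear targets

    nn = nnsetup([10 50 1]);            % 10 inputs, 50 hidden units, 1 output
    nn.activation_function = 'sigm';    % hidden-layer activation
    nn.output = 'linear';               % linear output needed for regression
    nn.learningRate = 0.01;             % assumed field; the default may be too large here

    opts.numepochs = 50;
    opts.batchsize = 100;               % must divide the number of samples
    nn = nntrain(nn, x, y, opts);

    nn = nnff(nn, x, y);
    pred = nn.a{end};                   % predictions should now vary per sample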

grunterGah · Jan 16 '16

The mean calculation for dropout is in nnff.m:

    %dropout
    if(nn.dropoutFraction > 0)
        if(nn.testing)
            nn.a{i} = nn.a{i}.*(1 - nn.dropoutFraction);
        else
            nn.dropOutMask{i} = (rand(size(nn.a{i}))>nn.dropoutFraction);
            nn.a{i} = nn.a{i}.*nn.dropOutMask{i};
        end
    end

Basically, in the testing phase you compensate for not dropping nodes by multiplying all activations by (1 - dropoutFraction), which gives the next layer roughly the same expected amount of input as it saw with dropout during training.
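To make the compensation concrete, here is a small standalone check (my own sketch, not toolbox code) that the test-time scaling matches the average input produced by random dropout masks:

    % Compare the average masked activation over many random dropout masks
    % with the test-time scaling a .* (1 - dropoutFraction).
    dropoutFraction = 0.5;
    a = rand(1, 1000);                             % hypothetical layer activations

    nTrials = 10000;
    acc = zeros(size(a));
    for t = 1:nTrials
        mask = rand(size(a)) > dropoutFraction;    % same masking rule as nnff.m
        acc = acc + a .* mask;
    end
    trainMean = acc / nTrials;

    testScaled = a .* (1 - dropoutFraction);       % test-time compensation

    % E[mask] = 1 - dropoutFraction, so the two should agree closely
    disp(max(abs(trainMean - testScaled)));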

tambetm · Jan 16 '16

@grunterGah the code pasted in the second part of the question already sets:

nn.output = 'linear';

so I feel the output activation function is not the key issue.

stutys · Aug 15 '16

I also encounter this problem: the outputs are constant. How can I solve it? @tambetm, can you help me?

senyeer · Nov 21 '16