
Error when training a mapping from a 100-neuron input to a 4-neuron output

RyanCV opened this issue 8 years ago · 4 comments

The network is defined as follows: input size = 100, output size = 4

layers.af{1} = [];
layers.sz{1} = [input_size 1 1];
layers.typ{1} = defs.TYPES.INPUT;

layers.af{end+1} = ReLU(defs, []);
layers.sz{end+1} = [input_size 1 1];
layers.typ{end+1} = defs.TYPES.FULLY_CONNECTED;

layers.af{end+1} = ReLU(defs, []);
layers.sz{end+1} = [output_size 1 1];
layers.typ{end+1} = defs.TYPES.FULLY_CONNECTED;

if defs.plotOn
    nnShow(23, layers, defs);
end

Error using  - 
Matrix dimensions must agree.

Error in squaredErrorCostFun (line 2)
    J = (Y.v(:,:,t)-A.v(:,:,t)).^2;

Error in ReLU/cost (line 65)
            J = squaredErrorCostFun(Y, A, m, t);

Error in nnCostFunctionCNN (line 29)
J = nn.l.af{nn.N_l}.cost(Y, nn.A{nn.N_l}, m, 1) + J_s;

Error in Train_proposal>@(nn,r,newRandGen)nnCostFunctionCNN(nn,r,newRandGen)

Error in gradientDescentAdaDelta (line 69)
    [J, dJdW, dJdB] = feval(f, nn, r, true);

Error in Train_proposal (line 158)
nn = gradientDescentAdaDelta(costFunc, nn, defs, [], [], [], [], 'Training Entire Network');

RyanCV avatar Feb 09 '17 00:02 RyanCV

It's hard to say exactly without seeing your exact code and data formats. However, it looks like the dimensionality of your label matrix (Y) could be incorrect. Check the shape of Y and try permuting it so that it matches that of the final output layer.
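For example, here is a minimal sketch of that check (hypothetical; it assumes `Y` is your raw label matrix, `output_size = 4`, and the final layer expects one row per output neuron and one column per example):

% Sketch: verify Y is output_size-by-m before wrapping it in varObj.
disp(size(Y));                   % if this prints [m 4], the orientation is flipped
if size(Y, 1) ~= output_size
    Y = permute(Y, [2 1 3]);     % swap rows and columns, preserving any 3rd dim
end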

joncox123 avatar Feb 09 '17 04:02 joncox123

Thanks for your help, that solved the problem.

RyanCV avatar Feb 09 '17 14:02 RyanCV

@joncox123 by the way, where can I find '~/mnist_full.mat' in order to run your example?

RyanCV avatar Feb 09 '17 17:02 RyanCV

@joncox123, if I want to design a network with only one hidden layer (one fully connected layer + ReLU), and an output layer that is only a fully connected layer, is the following design right?

layers.af{1} = [];
layers.sz{1} = [input_size 1 1];
layers.typ{1} = defs.TYPES.INPUT;  % input layer

layers.af{end+1} = ReLU(defs, defs.COSTS.SQUARED_ERROR);
layers.sz{end+1} = [input_size 1 1];
layers.typ{end+1} = defs.TYPES.FULLY_CONNECTED;  % hidden layer 1

layers.af{end+1} = ReLU(defs, defs.COSTS.SQUARED_ERROR);  % for this line, will there be an error if I use [] instead?
layers.sz{end+1} = [output_size 1 1];
layers.typ{end+1} = defs.TYPES.FULLY_CONNECTED;  % output layer: how do I make it a fully connected layer only?

X = varObj(X,defs, defs.TYPES.INPUT);
Y = varObj(Y,defs, defs.TYPES.OUTPUT);

Thanks.

RyanCV avatar Feb 09 '17 21:02 RyanCV