
Found that output from forward pass zeros out during test


Hello, it seems that the output from the forward pass zeros out in the small example below:

import arraymancer

# Autograd context and a small two-layer network
let ctx = newContext Tensor[float32]
network ctx, Test:
    layers:
        hidden1: Linear(2,3)
        output:  Linear(3,2)
    forward x:
        x.hidden1.relu.output

let model = ctx.init(Test)
var optim = model.optimizerAdam(learningrate=0.001'f32)

# Two samples with two features each, plus their class labels
var X = ctx.variable(toTensor([[2.1, 2.90], [1.001, 0.908]]).astype(float32))
var y = toTensor([0,1]).astype(float32)

for epoch in 1 .. 10:
    let output = model.forward(X)
    echo output.value
    let loss = sparse_softmax_cross_entropy(output, y)
    echo "Loss is:" & $loss.value
    backprop(loss)
    optim.update()

If you simply add another sample to X and a corresponding label to y, the forward pass behaves normally. Not quite sure why this is.
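For example, a change along these lines (the third sample and its label are arbitrary values I picked just to illustrate what I mean):

# Hypothetical third sample and label, values chosen arbitrarily for illustration
var X = ctx.variable(toTensor([[2.1, 2.90], [1.001, 0.908], [0.5, 1.5]]).astype(float32))
var y = toTensor([0, 1, 0]).astype(float32)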

Niminem · Sep 15 '21