TensorFlow.NET
.NET 6 accuracy doesn't change during training
Training is basically broken on .NET 6, while on .NET 5 everything works fine.
TensorFlowNET.Examples (MNIST CNN (Eager)):
.NET 5:
step: 10, loss: 25,200823, accuracy: 0,75
step: 20, loss: 17,497072, accuracy: 0,8125
step: 30, loss: 4,7848167, accuracy: 0,96875
step: 40, loss: 11,228649, accuracy: 0,90625
step: 50, loss: 8,46461, accuracy: 0,90625
step: 60, loss: 12,897742, accuracy: 0,875
step: 70, loss: 5,8186455, accuracy: 0,90625
step: 80, loss: 1,6474819, accuracy: 1
step: 90, loss: 5,5969973, accuracy: 0,90625
Test Accuracy: 0,9
18.11.2021 00:18:13 Completed MNIST CNN (Eager)
Example: MNIST CNN (Eager) in 4,7636146s is OK!
.NET 6:
step: 10, loss: 73,61097, accuracy: 0,125
step: 20, loss: 74,02429, accuracy: 0,03125
step: 30, loss: 73,439026, accuracy: 0,03125
step: 40, loss: 73,71028, accuracy: 0,0625
step: 50, loss: 73,987946, accuracy: 0,125
step: 60, loss: 73,33862, accuracy: 0,15625
step: 70, loss: 72,83645, accuracy: 0,21875
step: 80, loss: 73,17415, accuracy: 0,21875
step: 90, loss: 73,84353, accuracy: 0,09375
Test Accuracy: 0,11
18.11.2021 00:19:28 Completed MNIST CNN (Eager)
Example: MNIST CNN (Eager) in 4,4328711s is Failed!
Custom CNN code:
using Tensorflow;
using static Tensorflow.KerasApi;

// Load MNIST, cast to float, scale pixels to [0, 1] and add a channel dimension.
var ((x_train, y_train), (x_test, y_test)) = keras.datasets.mnist.load_data();
var x_train_norm = (x_train.astype(TF_DataType.DtFloatRef) / 255f).reshape((-1, 28, 28, 1));
var x_test_norm = (x_test.astype(TF_DataType.DtFloatRef) / 255f).reshape((-1, 28, 28, 1));

// Small CNN: two conv/pool stages, then a dense softmax classifier over the 10 digits.
var model = keras.Sequential(new()
{
    keras.layers.InputLayer((28, 28, 1)),
    keras.layers.Conv2D(32, 3, activation: keras.activations.Relu),
    keras.layers.MaxPooling2D(),
    keras.layers.Conv2D(64, 2, activation: keras.activations.Relu),
    keras.layers.MaxPooling2D(),
    keras.layers.Flatten(),
    keras.layers.Dense(10, keras.activations.Softmax)
});

model.compile(keras.optimizers.Adam(),
              keras.losses.SparseCategoricalCrossentropy(from_logits: true),
              new[] { "accuracy" });
model.summary();
model.fit(x_train_norm, y_train, 128, 2); // batch size 128, 2 epochs
.NET 5:
Epoch: 001/002, Step: 0001/0469, loss: 2,302554, accuracy: 0,062500
...
Epoch: 001/002, Step: 0469/0469, loss: 1,604421, accuracy: 0,870233
Epoch: 002/002, Step: 0001/0469, loss: 1,497208, accuracy: 0,968750
...
Epoch: 002/002, Step: 0469/0469, loss: 1,498555, accuracy: 0,965783
.NET 6:
Epoch: 001/002, Step: 0001/0469, loss: 2,302585, accuracy: 0,796875
...
Epoch: 001/002, Step: 0469/0469, loss: 1,736288, accuracy: 0,768267
Epoch: 002/002, Step: 0001/0469, loss: 1,726776, accuracy: 0,734375
...
Epoch: 002/002, Step: 0469/0469, loss: 1,692857, accuracy: 0,768283
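For completeness, the gap can also be checked on held-out data after training. A minimal sketch, assuming the installed TensorFlow.NET build exposes model.evaluate and reusing x_test_norm / y_test from the snippet above:

// Evaluate the trained model on the test split under both target frameworks
// to see whether the .NET 5 / .NET 6 gap also shows up outside the training loop.
model.evaluate(x_test_norm, y_test);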
System info
OS: Manjaro Linux (Arch-based), Windows 10
Hi @Tsohndeq, I became aware of this issue yesterday as well; I'll take a deeper look once I get a chance.
Found a clue: the root cause is in the np.Load_Npz
function, which doesn't return correct numbers on .NET 6.0.
I'll continue to research this issue.
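If it helps, the loading path can be checked independently of training by dumping a few raw values from the arrays returned by keras.datasets.mnist.load_data() and diffing the output of a net5.0 and a net6.0 build. A minimal sketch; the pixel index is just illustrative, and the expected first label of 5 is a property of the standard MNIST training set:

using System;
using static Tensorflow.KerasApi;

// Print a few values from the arrays produced by mnist.load_data()
// (the code path that goes through np.Load_Npz), so the output can be
// compared between a net5.0 and a net6.0 build of the same program.
var ((x_train, y_train), (x_test, y_test)) = keras.datasets.mnist.load_data();
Console.WriteLine($"x_train: {x_train.shape} {x_train.dtype}");
Console.WriteLine($"y_train: {y_train.shape} {y_train.dtype}");
Console.WriteLine($"first label: {y_train[0]}");             // 5 for standard MNIST
Console.WriteLine($"pixel [14,14] of first image: {x_train[0][14][14]}");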
