alerem18

Results: 18 comments by alerem18

Yes, I know. I've used `logitcrossentropy` without `softmax`, and also `softmax` with `crossentropy`, but I still get the same results; I just typed the Julia code here wrongly.

> I haven't run the code. But is there a possibility that model input is mistaken? 50% accuracy really reminds me of my one data-processing experience.

How should I prepare...

> Your code works, but I really don't know why my code isn't working if the data preprocessing is the same. I tried a different implementation similar to yours for...

> Not on a computer right now, but I think you should remove the `reset!` from the loss function. And therefore, stick to a custom training loop instead of `train!`...
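A minimal sketch of what that suggestion amounts to (a dummy recurrent model and made-up data, just to show `reset!` moved out of the loss and a hand-written loop instead of `train!`):

```julia
using Flux
using Flux: gradient, logitcrossentropy, reset!, params, Momentum
using OneHotArrays: onehotbatch

# dummy recurrent classifier and fake data, only to make the sketch runnable
m = Chain(LSTM(28 => 64), Dense(64 => 10))
data = [(rand(Float32, 28, 16), onehotbatch(rand(0:9, 16), 0:9)) for _ in 1:5]

opt = Momentum(0.01)
ps = params(m)

# the loss only does the forward pass and the cross-entropy
loss(x, y) = logitcrossentropy(m(x), y)

for (x, y) in data
    reset!(m)                               # state reset outside the gradient call
    gs = gradient(() -> loss(x, y), ps)     # implicit-parameter gradients
    Flux.Optimise.update!(opt, ps, gs)
end
```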

`loss_fn(X, Y)`, `accuracy(X, Y)` ==> bad results
`loss_fn(m, X, Y)`, `accuracy(m, X, Y)` ==> good results
Passing the model through the loss and accuracy functions works as expected; if you don't...
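A stripped-down sketch of the working call style (a stand-in `Dense` model and random data, not the full script):

```julia
using Flux
using Flux: gradient, logitcrossentropy
using OneHotArrays: onecold, onehotbatch
using Statistics: mean

m = Dense(784 => 10)                        # stand-in for the real model
X = rand(Float32, 784, 32)
Y = onehotbatch(rand(0:9, 32), 0:9)

# the model is an explicit argument, so the gradient is taken w.r.t. `m` itself
loss_fn(m, X, Y) = logitcrossentropy(m(X), Y)
accuracy(m, X, Y) = mean(onecold(m(X), 0:9) .== onecold(Y))

grads = gradient(m -> loss_fn(m, X, Y), m)[1]   # explicit gradients, ready for update!
```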

```julia
using Flux
using Flux: gradient, logitcrossentropy, params, Momentum
using OneHotArrays: onecold, onehotbatch
using MLDatasets: MNIST
using Random: shuffle
using Statistics: mean
using Base.Iterators: partition

# ------------------- data --------------------------
train_x, ...
```
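The preview cuts the snippet off there; the data section was roughly along these lines (a hypothetical reconstruction based on the imports above, not the original code):

```julia
# hypothetical continuation, reusing the imports from the snippet above
train_x, train_y = MNIST(split=:train)[:]   # 28×28×60000 Float32 images, labels 0:9
test_x,  test_y  = MNIST(split=:test)[:]

X = Flux.flatten(train_x)                   # 784×60000
Y = onehotbatch(train_y, 0:9)               # 10×60000 one-hot labels

# shuffle and group into minibatches of 128 columns
idx = shuffle(1:size(X, 2))
train_data = [(X[:, i], Y[:, i]) for i in partition(idx, 128)]
```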

In the modified code, the `loss_fn` and `accuracy` functions do not take the params of the model as input; they call the model directly within the function to compute...
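Concretely, that closure style looks roughly like this (again with a stand-in model, using the implicit `params` API):

```julia
using Flux
using Flux: gradient, logitcrossentropy, params, Momentum
using OneHotArrays: onehotbatch

m = Dense(784 => 10)                        # stand-in for the real model
X = rand(Float32, 784, 32)
Y = onehotbatch(rand(0:9, 32), 0:9)

# the model is not an argument; it is closed over inside the function
loss_fn(X, Y) = logitcrossentropy(m(X), Y)

opt = Momentum(0.01)
ps = params(m)
gs = gradient(() -> loss_fn(X, Y), ps)      # implicit gradients w.r.t. params(m)
Flux.Optimise.update!(opt, ps, gs)
```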

> @alerem18 for now you'll have to use something complicated and solve this problem with Genie instead of declaring frontend routing with something like StippleUI tabs. I am thinking of...