Leon Lysak

Results: 19 comments of Leon Lysak

```nim
import std/strformat
import arraymancer except softmax  # by default, this softmax signature is proc (input: Tensor[softmax.T]): Tensor[softmax.T]
import arraymancer/nn/activation/softmax  # this is the softmax we need: softmax*[TT](a: Variable[TT]): Variable[TT]...
```

@Vindaar the above is the "simple 2 layer" example, modified simply to add softmax in the forward pass. It produces the same error as above.
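For anyone reproducing this from the truncated snippet above, here is a minimal sketch of that kind of forward pass, written against raw autograd Variables instead of the network DSL. The shapes and the `w1`/`w2`/`forward` names are illustrative, not the original example's:

```nim
# Sketch only: a hand-rolled two-layer forward ending in the
# Variable-level softmax. Names and shapes are illustrative.
import arraymancer except softmax
import arraymancer/nn/activation/softmax

let ctx = newContext Tensor[float32]

let
  x  = ctx.variable(randomTensor[float32](32, 8, 1'f32))
  w1 = ctx.variable(randomTensor[float32](16, 8, 1'f32), requires_grad = true)
  w2 = ctx.variable(randomTensor[float32](2, 16, 1'f32), requires_grad = true)

proc forward(x: Variable[Tensor[float32]]): Variable[Tensor[float32]] =
  # linear -> relu -> linear -> softmax, all on Variables so gradients flow
  x.linear(w1).relu.linear(w2).softmax

let pred = forward(x)
```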

Thanks, Vindaar, for your fast response and the comment edit lol. I'm still getting used to GitHub markdown

I've modified the `softmax_backward_ag[TT]` procedure to pass in `self` rather than `Gate` (see below). Reference: https://github.com/mratsim/Arraymancer/blob/master/src/arraymancer/nn/activation/softmax.nim

```nim
proc softmax_backward_ag[TT](self: Gate[TT], payload: Payload[TT]): SmallDiffs[TT] =
  let self = SoftmaxActivation[TT](self) # (Gate)
  let gradient...
```

@Vindaar please review when you can

@Vindaar my good sir, can we please get this implemented lol

Shit, thanks for looking into it Vindaar. I will take a look when I finally get the time and mental space as well lol

I'm so at a loss over saving and loading models... Respectfully, how are we supposed to use Arraymancer for deep learning without being able to do this?

Things I learned from trying to solve this problem all day; I hope it helps someone. In order to save/load the weights and biases of your model, you'll first need to define...
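A minimal sketch of the round trip, assuming Arraymancer's `.npy` I/O (`write_npy`/`read_npy`) and that each layer's parameters are autograd Variables whose raw tensor sits in `.value`. The variable and file names are illustrative:

```nim
import arraymancer

let ctx = newContext Tensor[float32]

# A single weight Variable standing in for a full model's parameters.
var weight = ctx.variable(randomTensor[float32](16, 8, 1'f32),
                          requires_grad = true)

# Save: dump the raw tensor behind the Variable to a .npy file.
weight.value.write_npy("fc1_weight.npy")

# Load: read the tensor back and re-wrap it in a fresh Variable
# attached to the context.
var restored = ctx.variable(read_npy[float32]("fc1_weight.npy"),
                            requires_grad = true)
```

Repeat per parameter tensor (each layer's weight and bias); the file naming scheme is up to you.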

@LemongrabThree From what I've been reading today, using async procedures within an `except` block is a no-go. I've copied the example and tried using procedures from other modules to...
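For context, a sketch of the usual workaround with std `asyncdispatch`: don't `await` inside the `except` branch itself; record the failure and `await` after the `try`/`except`. The proc names here are made up for illustration:

```nim
import std/asyncdispatch

proc flakyFetch() {.async.} =
  # Hypothetical operation that can fail.
  raise newException(IOError, "connection dropped")

proc run() {.async.} =
  var failed = false
  try:
    await flakyFetch()
  except IOError:
    # No `await` here: the async transform has historically
    # rejected awaiting inside an `except` clause.
    failed = true
  if failed:
    # Awaiting after the try/except block is fine.
    await sleepAsync(100)
    echo "retried after backoff"

waitFor run()
```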