Results 75 comments of bamsumit

Hi @pbonazzi, I am guessing you mean SUM mode and OR mode. Yes, the current implementation seems to be a bug. Would you like to make a...

Thanks, @stevenabreu7, for getting this started. I'll take a close look at the PR soon. One thing that pops up is the license of https://github.com/neuromorphs/NIR: I don't see a license...

BSD-3 would be great :)

> Thanks for all your help so far @bamsumit. I've been struggling to get the weights and parameters to match between an original lava-dl-generated .net file and the .net file...

@naveedunjum can you check it again with the latest codebase? There was a change pushed recently.

Hi @ronichester, yes, it looks like a bug. Your snippet is correct. Would you be willing to issue a fix?

Hi @zhangsen-hit, this implementation uses some of the accelerated components from the original C++ implementation. This version is more feature-rich. You might want to look at [Lava-DL](https://github.com/lava-nc/lava-dl) SLAYER for even...

@zhangsen-hit slayerPytorch computes the output of each layer for all timesteps at once. Lava-dl SLAYER allows both options. By default, it calculates the outputs for all the...
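To illustrate the two evaluation styles, here is a hedged sketch (not the lava-dl code) using a simple leaky synaptic current, `i[t] = (1 - d) * i[t-1] + x[t]`: the same dynamics can be run one step at a time, or unrolled over all T timesteps at once as a weighted sum over the input history (a temporal convolution with kernel `(1 - d)**k`).

```python
# Step-by-step evaluation: maintain state and advance one timestep at a time.
def current_stepwise(x, d):
    i, out = 0.0, []
    for xt in x:
        i = (1 - d) * i + xt
        out.append(i)
    return out

# All-timesteps-at-once evaluation: each output is a weighted sum of all
# inputs up to that time, so the whole sequence is computed in one pass.
def current_all_timesteps(x, d):
    return [sum((1 - d) ** (t - k) * x[k] for k in range(t + 1))
            for t in range(len(x))]

x = [1.0, 0.0, 0.0, 2.0]
a = current_stepwise(x, 0.5)
b = current_all_timesteps(x, 0.5)
assert all(abs(p - q) < 1e-9 for p, q in zip(a, b))  # identical results
```

Both give the same trace; the vectorized form trades memory for parallelism across time, which is what makes GPU training of the whole sequence efficient.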

@surabhi-Siemens you will need to preprocess the aedat files in the link above. Here is an example script to do that: https://github.com/bamsumit/slayerPytorch/issues/37#issuecomment-680762632 Alternatively, you can use this version of dataset...

@naveedunjum Yes and yes. 1. To make the decay parameters learnable, you can set `requires_grad = True` in the neuron parameters. 2. You can use different neuron parameters for different...
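The two points above can be sketched with a tutorial-style lava-dl SLAYER neuron parameter dict. The key names (`threshold`, `current_decay`, `voltage_decay`, `requires_grad`) follow the lava-dl examples; treat them as assumptions if your version differs.

```python
# Point 1: 'requires_grad': True makes the decay constants trainable
# alongside the synaptic weights (assumed tutorial-style parameter dict).
neuron_params_layer1 = {
    'threshold': 1.25,
    'current_decay': 0.25,
    'voltage_decay': 0.03,
    'requires_grad': True,   # decays become learnable parameters
}

# Point 2: each layer/block can take its own dict, so different layers
# can use different (and independently learned) neuron dynamics.
neuron_params_layer2 = dict(neuron_params_layer1, voltage_decay=0.10)
```

Each dict would then be passed to the corresponding block when building the network, so per-layer dynamics are configured at construction time.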