lava-dl
Neuron parameters remain unchanged after setting them, and also after training them.
When building the network, the neuron parameters don't seem to change even when set to different values. For example, for the following network:
```python
self.blocks = torch.nn.ModuleList([
    slayer.block.cuba.Dense(neuron_params, 18, 20),
    slayer.block.cuba.Dense(neuron_params, 20, 18),
])
```
with
```python
neuron_params = {
    'threshold': 1,
    'current_decay': 1,
    'voltage_decay': 1,
    'requires_grad': True,
}
```
when checked from inside the network:

```python
for block in net.blocks:
    print(block)
    print("Voltage", block.neuron.voltage_decay)
    print("Current", block.neuron.current_decay)
    print("Threshold", block.neuron.threshold)
```
the output is:

```
Dense(
  (neuron): Neuron()
  (synapse): Dense(18, 20, kernel_size=(1, 1, 1), stride=(1, 1, 1), bias=False)
)
Voltage Parameter containing:
tensor([4096.], requires_grad=True)
Current Parameter containing:
tensor([4096.], requires_grad=True)
Threshold 1
```
From the source code we can see that the decay is scaled by `1 << 12`, so a decay of 1 is stored as 4096.
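Since a decay of 10 also prints as 4096 (shown below), one plausible explanation is that the decay is clamped to the valid fractional range [0, 1] before being scaled. The following is a minimal sketch of that hypothesis, not the actual lava-dl code:

```python
# Hypothetical model of the decay scaling observed above; the clamp
# is an assumption that would explain why 1 and 10 both give 4096.
P_SCALE = 1 << 12  # 4096, the scale factor seen in the printed parameters

def scaled_decay(decay):
    # Decays are fractions in [0, 1]; out-of-range values get clamped.
    clamped = min(max(decay, 0.0), 1.0)
    return clamped * P_SCALE

print(scaled_decay(1.0))   # 4096.0
print(scaled_decay(10.0))  # 4096.0 -- same as 1 after clamping
```

If this is what lava-dl does internally, any decay value above 1 would silently collapse to the same stored parameter.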
But when changing the neuron parameters to
```python
neuron_params = {
    'threshold': 10,
    'current_decay': 10,
    'voltage_decay': 10,
    'requires_grad': True,
}
```
only the threshold changes inside the network:

```
Dense(
  (neuron): Neuron()
  (synapse): Dense(18, 20, kernel_size=(1, 1, 1), stride=(1, 1, 1), bias=False)
)
Voltage Parameter containing:
tensor([4096.], requires_grad=True)
Current Parameter containing:
tensor([4096.], requires_grad=True)
Threshold 10.0
```
The voltage and current decay remain the same.
After training the network with the SpikeTime loss (Oxford tutorial) with `requires_grad=True`, we again see that the threshold does not change, and the decays change only by a very small amount:
```
Dense(
  (neuron): Neuron()
  (synapse): Dense(18, 20, kernel_size=(1, 1, 1), stride=(1, 1, 1), bias=False)
)
Voltage Parameter containing:
tensor([4095.0273], requires_grad=True)
Current Parameter containing:
tensor([4095.0273], requires_grad=True)
Threshold 1.0
********************
Dense(
  (neuron): Neuron()
  (synapse): Dense(20, 18, kernel_size=(1, 1, 1), stride=(1, 1, 1), bias=False)
)
Voltage Parameter containing:
tensor([4096.0005], requires_grad=True)
Current Parameter containing:
tensor([4096.0005], requires_grad=True)
Threshold 1.0
```
Steps to reproduce the behavior:

- In the Oxford tutorial, set `neuron_params` and print the neuron parameters using:

```python
for block in net.blocks:
    print(block)
    print("Voltage", block.neuron.voltage_decay)
    print("Current", block.neuron.current_decay)
    print("Threshold", block.neuron.threshold)
    print("********************")
```
- Try changing the neuron parameters: there is no effect on the decay parameters.
- Train the model: the decay parameters change only by a small margin, while the threshold remains the same.
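One thing that might explain the threshold never updating: in plain PyTorch, only tensors registered as `nn.Parameter` are returned by `named_parameters()` and updated by the optimizer; anything stored as a buffer or plain attribute is never trained. Whether lava-dl registers the threshold this way would need checking in its source, but here is a toy PyTorch module (not lava-dl's actual neuron) showing the mechanism:

```python
import torch

# Toy module (illustrative only, not lava-dl): 'decay' is a trainable
# nn.Parameter, while 'threshold' is a registered buffer, so an
# optimizer built from named_parameters() never touches it.
class ToyNeuron(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.decay = torch.nn.Parameter(torch.tensor([4096.0]))
        self.register_buffer('threshold', torch.tensor(1.0))

toy = ToyNeuron()
trainable = [name for name, _ in toy.named_parameters()]
print(trainable)  # 'threshold' does not appear, so it cannot be trained
```

If lava-dl stores the threshold as a buffer like this, `requires_grad=True` in `neuron_params` would only affect the decays.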
I don't know if the issue is in the lava-dl implementation or in my code. Can someone cross-check this?