Feature requests for Lowpass: non-trainable and/or homogeneous

Open arvoelke opened this issue 4 years ago • 4 comments

There are two features I've found myself needing lately w.r.t. the lowpass layer:

  1. (Non-Trainability) The ability to make the time constant non-trainable. apply_during_training is close, but setting it to False skips the lowpass entirely during training. I still want the lowpass in the training forward pass; I just don't want its time constants to be modified from the initial value I've provided.

  2. (Homogeneity) The ability to learn only a single time constant. Currently the initial tau is broadcast like so: https://github.com/nengo/keras-spiking/blob/116fc6c29049e6a3e9c960e7ec075c9f09c801bb/keras_spiking/layers.py#L582-L583 such that a different tau is learned for each dimension. Sometimes prior knowledge tells us that the time constant should be the same across dimensions. This would also make trained lowpass filters compatible with NengoDL's converter (see https://github.com/nengo/nengo-dl/issues/60#issuecomment-763974388). But even independently of that, I've encountered a situation where I'd like to learn just a single time constant and then change the shape of the data going into the layer at inference time (i.e., have a single lowpass that is broadcast across all of the dimensions, with the same initial value).

arvoelke avatar Feb 05 '21 03:02 arvoelke
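As a rough sketch of the two requests against the existing API (the tau_initializer parameter name is assumed from the current keras-spiking documentation and may be spelled differently in older releases):

```python
import tensorflow as tf
import keras_spiking

# Toy sequence input: (batch, timesteps, features)
inp = tf.keras.Input((None, 8))

# Request 1: apply_during_training=False removes the filter from the
# training forward pass entirely, whereas the request is to keep the
# filter in the forward pass while freezing tau at its initial value.
skipped = keras_spiking.Lowpass(
    tau_initializer=0.01, apply_during_training=False
)(inp)

# Request 2: with the defaults, a separate tau is created (and trained)
# for each of the 8 feature dimensions, whereas the request is to learn
# a single tau shared across dimensions.
per_dim = keras_spiking.Lowpass(tau_initializer=0.01)(inp)

model = tf.keras.Model(inp, [skipped, per_dim])
model.summary()
```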

You can make the layer non-trainable by passing trainable=False to the constructor (that's a general Keras feature, not something specific to keras-spiking). But allowing a single tau value would be good. It might make sense to implement this by adding a constraint parameter (similar to standard Keras layers), and then we could have a built-in constraint that enforces homogeneity.

drasmuss avatar Feb 05 '21 15:02 drasmuss
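A minimal sketch of that suggestion (again with the tau_initializer name assumed); note that this freezes every weight in the layer, not just the time constant:

```python
import tensorflow as tf
import keras_spiking

inp = tf.keras.Input((None, 8))

# trainable=False is the standard Keras layer argument: the lowpass stays
# in the forward pass, but none of its weights receive gradient updates.
out = keras_spiking.Lowpass(tau_initializer=0.01, trainable=False)(inp)

model = tf.keras.Model(inp, out)
print(model.trainable_weights)  # expected to be empty
```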

You can make the layer non-trainable by passing trainable=False to the constructor

If I wanted to do this just for the time constant and not for the initial value, then I would need to do layer.tau_var.trainable = False? (Or how would I do that, given that tau_var doesn't exist until the layer is built?)

arvoelke avatar Feb 05 '21 15:02 arvoelke

Hmm, I'm not positive whether modifying tau_var.trainable after the initial build would have an effect or not; you'd have to experiment. But yeah, trainable=False wouldn't work for that case.

drasmuss avatar Feb 05 '21 15:02 drasmuss
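One way to run that experiment, as a sketch only (the tau_var attribute name is taken from the question above and may not match the implementation, and tf.Variable.trainable is a read-only property in recent TensorFlow releases, so the assignment may simply raise):

```python
import numpy as np
import keras_spiking

# Build the layer by calling it once, so that its weight variables exist.
layer = keras_spiking.Lowpass(tau_initializer=0.01)
layer(np.zeros((1, 10, 8), dtype=np.float32))

print("before:", [w.name for w in layer.trainable_weights])

# The experiment: try flipping the flag on the tau variable after build;
# depending on the TensorFlow version this may raise rather than take effect.
try:
    layer.tau_var.trainable = False
except AttributeError as err:
    print("could not modify the flag:", err)

print("after:", [w.name for w in layer.trainable_weights])
```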

  2. (Homogeneous time constant) can now be done by setting tau_constraint=keras_spiking.constraints.Mean()

drasmuss avatar Jul 26 '21 14:07 drasmuss
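For completeness, a minimal usage sketch of that constraint (parameter names other than tau_constraint are assumed from the current documentation); as I understand it, Mean() averages the per-dimension values after each update, so the layer effectively learns one shared time constant:

```python
import tensorflow as tf
import keras_spiking

inp = tf.keras.Input((None, 8))

# Mean() keeps the 8 per-dimension tau values identical, giving a single
# homogeneous time constant that is broadcast across all dimensions.
out = keras_spiking.Lowpass(
    tau_initializer=0.01,
    tau_constraint=keras_spiking.constraints.Mean(),
)(inp)

model = tf.keras.Model(inp, out)
```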