
LMU layer, Legendre Memory Units

Open flacle opened this issue 3 years ago • 4 comments

Describe the problem

LMUs (Legendre Memory Units, originally introduced in 2019) are a distinct class of RNNs. They have been demonstrated to outperform LSTMs on some tasks that require longer time windows, while requiring fewer parameters (see the NeurIPS 2019 paper). Including LMUs directly in Keras would extend the usefulness and impact of Keras for users for whom parameter count is an important consideration.
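For context, the core of an LMU is a linear memory that orthogonalizes its input history over a sliding window of length theta, using fixed state-space matrices derived from Legendre polynomials. A minimal NumPy sketch of that memory update, following the construction in the NeurIPS 2019 paper (function names here are mine, and the simple Euler discretization is an illustrative assumption; the paper uses zero-order hold):

```python
import numpy as np

def lmu_matrices(order, theta):
    # Continuous-time state-space matrices (A, B) from the LMU paper,
    # scaled by 1/theta. These are fixed, not learned.
    A = np.zeros((order, order))
    for i in range(order):
        for j in range(order):
            A[i, j] = (2 * i + 1) * (-1.0 if i < j else (-1.0) ** (i - j + 1))
    q = np.arange(order)
    B = ((2 * q + 1) * (-1.0) ** q).reshape(-1, 1)
    return A / theta, B / theta

def lmu_memory_scan(u, order=8, theta=10.0, dt=1.0):
    # Euler-discretized memory update: m_t = m_{t-1} + dt * (A m_{t-1} + B u_t).
    # Returns the memory state at every time step.
    A, B = lmu_matrices(order, theta)
    m = np.zeros((order, 1))
    states = []
    for u_t in u:
        m = m + dt * (A @ m + B * u_t)
        states.append(m.copy())
    return np.stack(states)
```

In a full LMU cell, a learned hidden state reads from this memory (and from the input) through trainable weights; the memory itself stays fixed, which is where the parameter savings over an LSTM come from.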

flacle avatar Nov 10 '22 18:11 flacle

@flacle, could you please elaborate on your feature request? Also, please specify the use cases for this feature. Thank you!

tilakrayal avatar Nov 11 '22 12:11 tilakrayal

Talked this over; I am not sure we would want to go straight to a new layer at this time. But if you are interested, you could contribute a new Keras example (instructions here) showing how to implement this layer and how it can improve performance over LSTMs on a timeseries dataset!

mattdangerw avatar Nov 17 '22 19:11 mattdangerw

This issue is stale because it has been open for 180 days with no activity. It will be closed if no further activity occurs. Thank you.

github-actions[bot] avatar Jun 18 '23 02:06 github-actions[bot]

This issue is stale because it has been open for 180 days with no activity. It will be closed if no further activity occurs. Thank you.

github-actions[bot] avatar Mar 21 '24 01:03 github-actions[bot]

This issue was closed because it has been inactive for more than 1 year.

github-actions[bot] avatar Mar 22 '25 02:03 github-actions[bot]