
No 'vanilla' RNN layer support?

nvssynthesis opened this issue 1 year ago · 2 comments

Is it correct that there is no support for 'vanilla' RNN layers, e.g. that of torch.nn.RNN? Is the reason for this something like 'GRU or LSTM is better anyway, just use that'?

nvssynthesis (Oct 22, 2024)

That's correct, at the moment RTNeural does not have support for that layer type. The reasoning is more just that I haven't yet had a need for it, and haven't received any requests to implement it (up to now). We probably should implement that layer, especially since it's simpler than the GRU or LSTM layers.
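For anyone following along, the update needed for that layer is just the single, gate-free Elman step (roughly what torch.nn.RNN computes with its default tanh nonlinearity). A minimal sketch, with hypothetical names, just to illustrate why it's simpler than the gated layers:

```python
import numpy as np

def elman_step(x_t, h_prev, W_ih, W_hh, b_ih, b_hh):
    # One vanilla (Elman) RNN step: a single affine transform of the
    # input and previous hidden state followed by tanh. No gates,
    # unlike the GRU or LSTM updates.
    return np.tanh(W_ih @ x_t + b_ih + W_hh @ h_prev + b_hh)
```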

I'll probably end up naming the layer something like ElmanRNN, since I think that's maybe a more "specific" name than just RNN. Would you happen to know if TensorFlow has an equivalent layer?

I've added this to my to-do list, but it might be a minute before I get around to implementing it.

jatinchowdhury18 (Oct 22, 2024)

Excellent, thanks for clarifying. I just know there is tf.keras.layers.RNN but I'm not dead sure if it's equivalent.
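If it helps, my understanding (not verified) is that tf.keras.layers.RNN is a generic wrapper around a cell, so the closer match to torch.nn.RNN is probably wrapping a SimpleRNNCell, or just using tf.keras.layers.SimpleRNN directly. A rough sketch of what I mean:

```python
import tensorflow as tf

# Two ways to build an Elman-style layer in Keras (untested sketch):
# the generic RNN wrapper around a SimpleRNNCell, and the SimpleRNN
# convenience layer, which I believe are equivalent.
wrapped = tf.keras.layers.RNN(tf.keras.layers.SimpleRNNCell(8))
simple = tf.keras.layers.SimpleRNN(8)
```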

nvssynthesis (Oct 26, 2024)