
Multi-layered RNN implementation

HeshamMeneisi opened this issue on Feb 21 '18 • 2 comments

This is an RNN implementation that supports multiple layers within each time step.

I opened issue #126 a few days ago describing the problem. This patch is a solution: it adds a new class for a multi-layered RNN that supports both regular and recurrent connections, with minimal modifications to the rest of the library.

Change Log:

  • Introduced the following configuration attributes in the [DefaultGenome] section:
    ◦ nw_type in ['default', 'rnn', 'mlrnn', 'ctrnn', 'iznn']
    ◦ recurrent_con_prob (0.5 by default)

  • Implemented the MLRecurrentNetwork class

  • Modified required_for_output() and feed_forward_layers() to check for recurrent connections on demand.

  • Handled the recurrent connection cases in mutation functions.
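For illustration, the new attributes described above could appear in a neat-python config file roughly like this (a sketch only; the surrounding option names and values are placeholders, not part of this patch):

```ini
[DefaultGenome]
# Network type: one of default, rnn, mlrnn, ctrnn, iznn
nw_type            = mlrnn
# Probability that a connection is made recurrent
recurrent_con_prob = 0.5
```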

The implementation passes the fixed-memory and variable-memory tests, as well as some Gym environments I used for further testing. Overall performance, measured by average generation count and network size, is improved.
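To illustrate the idea behind the MLRecurrentNetwork class, here is a minimal sketch of a per-time-step, multi-layer activation in which feed-forward links read the current step's values and recurrent links read the previous step's. This is not the actual patch: the class name is borrowed from the change log, but the constructor arguments, data layout, and activation function are assumptions.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class MLRecurrentNetwork:
    """Sketch: nodes are grouped into layers evaluated in order within each
    time step; recurrent links read the previous step's activations."""

    def __init__(self, inputs, outputs, layers, node_evals):
        self.input_nodes = inputs
        self.output_nodes = outputs
        self.layers = layers          # e.g. [[hidden nodes...], [output nodes...]]
        self.node_evals = node_evals  # node -> (bias, [(src, weight, is_recurrent)])
        keys = set(inputs) | set(outputs)
        for layer in layers:
            keys.update(layer)
        for _, links in node_evals.values():
            keys.update(src for src, _, _ in links)
        self.prev = dict.fromkeys(keys, 0.0)  # last step's activations
        self.curr = dict(self.prev)           # this step's activations

    def activate(self, input_values):
        for k, v in zip(self.input_nodes, input_values):
            self.curr[k] = v
        for layer in self.layers:
            for node in layer:
                bias, links = self.node_evals[node]
                total = bias
                for src, weight, is_recurrent in links:
                    # Recurrent connections look one time step back.
                    source = self.prev if is_recurrent else self.curr
                    total += weight * source[src]
                self.curr[node] = sigmoid(total)
        self.prev = dict(self.curr)
        return [self.curr[k] for k in self.output_nodes]

# Tiny network: input 0 -> hidden 1 (with a recurrent self-link) -> output 2.
net = MLRecurrentNetwork(
    inputs=[0], outputs=[2], layers=[[1], [2]],
    node_evals={
        1: (0.0, [(0, 1.0, False), (1, 0.5, True)]),
        2: (0.0, [(1, 1.0, False)]),
    })
out1 = net.activate([1.0])
out2 = net.activate([1.0])  # differs: hidden state carries over between steps
```

Running the same input twice gives different outputs, which shows the recurrent self-link on the hidden node carrying state across time steps while the two layers are still evaluated in a fixed order within each step.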

HeshamMeneisi avatar Feb 21 '18 19:02 HeshamMeneisi

Coverage Status

Coverage decreased (-12.7%) to 83.965% when pulling 87543c6bef5a93329630fefdc127bd0ea82295e7 on HeshamMeneisi:master into 15e910ce12f34497b32946e468205e08b019034d on CodeReclaimers:master.

coveralls avatar Feb 21 '18 19:02 coveralls

Apologies for the long silence--I've been overwhelmed with other stuff and just kind of let this fall idle. I appreciate the patch and will take a look as soon as I get some free time. Thanks!

CodeReclaimers avatar Jun 23 '18 15:06 CodeReclaimers