neat-python
Multi-layered RNN implementation
This is an RNN implementation that supports multiple layers within each time step.
I opened issue #126 a few days ago describing the problem. This solution adds another class for a multi-layered RNN that supports both normal and recurrent connections, with minimal modifications to the existing code.
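To illustrate the idea, here is a minimal sketch (all names are hypothetical, not the actual `MLRecurrentNetwork` code): within one time step, layers are evaluated in topological order over the feed-forward connections, while recurrent connections read the node values saved from the previous time step.

```python
# Minimal sketch of a multi-layered RNN step (hypothetical names).
class MLRNNSketch:
    def __init__(self, layers, ff_conns, rec_conns, node_funcs):
        self.layers = layers          # e.g. [[node ids of layer 0], [layer 1], ...]
        self.ff_conns = ff_conns      # {target: [(source, weight), ...]}, same step
        self.rec_conns = rec_conns    # {target: [(source, weight), ...]}, previous step
        self.node_funcs = node_funcs  # {node id: activation callable}
        self.values = {}

    def activate(self, inputs):
        prev = dict(self.values)      # snapshot of last step's activations
        self.values.update(inputs)    # inputs: {input node id: value}
        for layer in self.layers:
            for node in layer:
                s = sum(w * self.values.get(src, 0.0)
                        for src, w in self.ff_conns.get(node, []))
                s += sum(w * prev.get(src, 0.0)
                         for src, w in self.rec_conns.get(node, []))
                self.values[node] = self.node_funcs[node](s)
        return self.values
```

The snapshot of the previous step's values is what lets recurrent connections coexist with a layered, single-pass evaluation inside each time step.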
Change Log:

- Introduced the following configuration attributes in the `[DefaultGenome]` section (see the config sketch after this list):
  - `nw_type`: one of `['default', 'rnn', 'mlrnn', 'ctrnn', 'iznn']`
  - `recurrent_con_prob`: 0.5 by default
- Implemented the `MLRecurrentNetwork` class.
- Modified `required_for_output()` and `feed_forward_layers()` to check for recurrent connections on demand (sketched below).
- Handled the recurrent connection cases in the mutation functions.
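For illustration, the new attributes could be set in a config file like this. Only the two keys named above come from this patch; a real config file needs the usual `[DefaultGenome]` keys as well, which are omitted here.

```ini
[DefaultGenome]
# New attributes from this patch; other required DefaultGenome keys omitted.
nw_type            = mlrnn
recurrent_con_prob = 0.5
```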
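And to show what "checking for recurrent connections on demand" can look like, here is a hypothetical sketch (not the patched `feed_forward_layers()` itself): recurrent edges are excluded from the dependency check so the layering still terminates on cyclic genomes, and they are only consulted at activation time.

```python
# Hypothetical sketch: recurrent edges don't constrain evaluation order,
# so a feed_forward_layers()-style pass terminates on cyclic genomes.
# `connections` is a list of (source, target, is_recurrent) triples.
def layers_with_recurrence(inputs, connections):
    ff = [(u, v) for u, v, rec in connections if not rec]
    placed = set(inputs)
    candidates = {v for _, v in ff}
    layers = []
    while True:
        # Nodes whose feed-forward inputs are all already placed.
        layer = {v for v in candidates - placed
                 if all(u in placed for u, t in ff if t == v)}
        if not layer:
            break
        layers.append(layer)
        placed |= layer
    return layers

# Edges 0->2, 1->2, 2->3, 3->4 plus a recurrent edge 3->2: the cycle
# never blocks layering, which comes out as [{2}, {3}, {4}].
print(layers_with_recurrence(
    [0, 1],
    [(0, 2, False), (1, 2, False), (2, 3, False),
     (3, 2, True), (3, 4, False)]))
```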
The implementation passes the fixed-memory and variable-memory tests, as well as some Gym environments I used for further testing. Overall performance, in terms of average generation count and network size, is better than with the existing RNN implementation.
Coverage decreased (-12.7%) to 83.965% when pulling 87543c6bef5a93329630fefdc127bd0ea82295e7 on HeshamMeneisi:master into 15e910ce12f34497b32946e468205e08b019034d on CodeReclaimers:master.
Apologies for the long silence; I've been overwhelmed with other stuff and just kind of let this fall idle. I appreciate the patch and will take a look as soon as I get some free time. Thanks!