Tune MPNN architecture
Blocked by https://github.com/epam/Indigo/issues/659.
After we create a couple of useful featuriser presets, it could be worth trying to improve the MPNN architecture to achieve better prediction results.
ToDo: tune various MPNN architecture params to achieve better prediction results on ADRA1A using our best featuriser preset:
- Number of layers
- Number of message passing steps
- Activation function (tanh?)
Use the best params as the defaults.
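The tuning loop over the parameters listed above could be sketched as a simple grid search. Everything here is illustrative: the search space values, the `grid_search` helper, and the `toy_score` stand-in (which in practice would train an MPNN on ADRA1A with the chosen featuriser preset and return a validation metric) are all assumptions, not existing Indigo code.

```python
from itertools import product

# Hypothetical search space covering the params from the ToDo list.
PARAM_GRID = {
    "num_layers": [2, 3, 4],
    "num_message_passing_steps": [1, 2, 4],
    "activation": ["relu", "tanh"],
}


def toy_score(params):
    """Stand-in for training + validation of an MPNN on ADRA1A.

    Returns a fake score so the sketch runs end to end; replace with the
    real train/evaluate pipeline.
    """
    return (
        params["num_layers"]
        + params["num_message_passing_steps"]
        + (1 if params["activation"] == "tanh" else 0)
    )


def grid_search(grid, score_fn):
    """Evaluate every combination in `grid` and return the best one."""
    best_params, best_score = None, float("-inf")
    for values in product(*grid.values()):
        params = dict(zip(grid.keys(), values))
        score = score_fn(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score


best_params, best_score = grid_search(PARAM_GRID, toy_score)
```

The winning `best_params` would then be promoted to the MPNN's defaults, per the note above.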