neural-fortran
A parallel framework for deep learning
The current forward pass stores data needed for the backward pass during training. This is not necessary for inference alone. Each layer should provide a read-only forward pass called `output` or...
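A minimal sketch of the idea: a `pure` forward pass that computes and returns the result without mutating layer state. The type and component names (`dense_layer`, `weights`, `biases`) are illustrative assumptions, not necessarily the actual neural-fortran internals:

```fortran
! Sketch only: a read-only forward pass for a hypothetical dense layer.
! Being pure, it cannot store activations on the layer, so it is
! suitable for inference but not for training.
pure function output(self, input) result(res)
  class(dense_layer), intent(in) :: self
  real, intent(in) :: input(:)
  real :: res(size(self % biases))
  ! Compute activations without caching them for a backward pass
  res = self % activation(matmul(self % weights, input) + self % biases)
end function output
```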
that will allow chaining layers of different shapes, e.g. a 1-d input to a 2-d convolutional layer. Reference: [Keras Reshape layer](https://keras.io/api/layers/reshaping_layers/reshape/)
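A hypothetical usage sketch of what such a layer could look like in the network constructor; the `reshape` and `conv2d` constructor names and their arguments here are assumptions for illustration, not the existing API:

```fortran
program reshape_example
  use nf, only: network, input, reshape, conv2d, dense
  implicit none
  type(network) :: net

  ! Hypothetical: a reshape layer lets a flat 1-d input feed a 2-d
  ! convolutional layer, analogous to Keras's Reshape layer.
  net = network([ &
    input(784), &
    reshape([1, 28, 28]), &              ! 1 channel, 28 x 28
    conv2d(filters=8, kernel_size=3), &
    dense(10) &
  ])
end program reshape_example
```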
@milancurcic would you be open to a pull request with some refactoring that is minor but global? If so, I would submit one or more pull requests with the changes...
Would it be possible to use submodules to provide different backends to the neural-fortran interface? What I have in mind are things like: * [Intel® oneAPI Deep Neural Network Library](https://software.intel.com/content/www/us/en/develop/tools/oneapi/components/onednn.html)...
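A minimal sketch of the submodule pattern being proposed: the module declares only the interface, and each backend lives in its own submodule, selected at link time. The names (`nf_matmul`, `backend_matmul`) are illustrative, not part of neural-fortran:

```fortran
! The parent module exposes only the interface of the backend procedure.
module nf_matmul
  implicit none
  interface
    module function backend_matmul(a, b) result(c)
      real, intent(in) :: a(:,:), b(:,:)
      real :: c(size(a, 1), size(b, 2))
    end function backend_matmul
  end interface
end module nf_matmul

! A reference implementation; an oneDNN- or cuDNN-backed submodule
! could be linked in its place without touching the interface.
submodule (nf_matmul) nf_matmul_reference
contains
  module function backend_matmul(a, b) result(c)
    real, intent(in) :: a(:,:), b(:,:)
    real :: c(size(a, 1), size(b, 2))
    c = matmul(a, b)
  end function backend_matmul
end submodule nf_matmul_reference
```

Since only one submodule may implement the procedure in a given program, switching backends would be a build-system choice rather than a runtime one.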
Currently, ifort uses a repeatable random seed by default: https://fortran-lang.discourse.group/t/mnist-problem-finding-file/3464/17 Use `random_init` (where supported) for more consistent behavior. Also proposed in #57.
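For reference, `random_init` is a Fortran 2018 intrinsic, so this fix depends on compiler support; a sketch of the call:

```fortran
program seed_example
  implicit none
  ! repeatable=.false. requests a different seed on each run;
  ! image_distinct=.true. gives each coarray image its own seed.
  call random_init(repeatable=.false., image_distinct=.true.)
end program seed_example
```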
This issue tracks the progress of support for convolutional layers. Ideally this would use GH Projects, but that feature is at the org level; I don't want to pollute...
This PR adds UML documentation in the doc/ subdirectory, including 1. A README.md describing and depicting the diagrams. 2. A UML class diagram detailing the derived type relationships. 3. A...