neural-fortran
Provide different back-ends?
Would it be possible to use submodules to provide different back-ends behind the neural-fortran interface?
What I have in mind are things like:
- Intel® oneAPI Deep Neural Network Library (oneDNN)
- TensorFlow
- FANN (a Fortran interface for this one already exists here)
- Several other NN libraries
While I admire the effort to build a pure Fortran NN library, the amount of effort (and money) being put into these other libraries is simply enormous. This way, disciplines traditionally reliant on Fortran (meteorology, quantum chemistry, ...) could also benefit from the many existing machine learning frameworks and their advanced graph and runtime optimizations.
The way I see this working is that we first define (and possibly expand) the high-level interface for creating and training NNs. The non-Fortran implementations (effectively just adaptors to other frameworks) could then be placed in submodules, switched on by a CMake flag.
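To make the submodule idea concrete, here is a rough sketch; the names (nn_backend, backend_train) are hypothetical and not part of the current neural-fortran API. The parent module declares the interface, each back-end implements it in a submodule, and CMake would compile only the submodule selected by the flag:

```fortran
! Hypothetical sketch: nn_backend and backend_train are illustrative names,
! not the existing neural-fortran API.
module nn_backend
  implicit none
  private
  public :: backend_train

  interface
    module subroutine backend_train(weights, inputs, targets, lr)
      real, intent(in out) :: weights(:,:)
      real, intent(in) :: inputs(:,:), targets(:,:)
      real, intent(in) :: lr
    end subroutine backend_train
  end interface
end module nn_backend

! Pure-Fortran implementation; an alternative submodule calling e.g. oneDNN or
! the TensorFlow C API would provide the same procedure and be selected by a
! CMake flag at build time.
submodule (nn_backend) nn_backend_native
contains
  module subroutine backend_train(weights, inputs, targets, lr)
    real, intent(in out) :: weights(:,:)
    real, intent(in) :: inputs(:,:), targets(:,:)
    real, intent(in) :: lr
    ! A single gradient-descent step for a linear model, just to give the
    ! sketch a working body.
    weights = weights - lr * matmul(transpose(inputs), matmul(inputs, weights) - targets)
  end subroutine backend_train
end submodule nn_backend_native
```

The nice part is that callers only ever use the parent module, so switching back-ends would not require any change to user code.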
Digging deeper into TensorFlow as an example, they have some documents on how to build bindings for other languages through the C API:
The C API, however, is still in development and doesn't support all the features of the Python API. I've found a few blog posts on how the C API can be used to call an existing graph:
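For what it's worth, here is a minimal, untested sketch of reaching the C API from Fortran through iso_c_binding. It only wraps TF_Version() (which is declared in tensorflow/c/c_api.h); a real back-end would also need TF_LoadSessionFromSavedModel and TF_SessionRun, whose signatures are much more involved.

```fortran
! Untested sketch: wraps only TF_Version() from tensorflow/c/c_api.h to show
! the iso_c_binding mechanics; link against libtensorflow.
program tf_version_check
  use, intrinsic :: iso_c_binding, only: c_ptr, c_char, c_null_char, c_f_pointer
  implicit none

  interface
    ! const char* TF_Version(void);
    function tf_version() bind(c, name='TF_Version') result(ver)
      import :: c_ptr
      type(c_ptr) :: ver
    end function tf_version
  end interface

  character(kind=c_char), pointer :: ver_chars(:)
  integer :: n

  ! Map the C string onto a Fortran character array (64 is an assumed upper
  ! bound on the version string length) and print up to the null terminator.
  call c_f_pointer(tf_version(), ver_chars, [64])
  n = 1
  do while (ver_chars(n) /= c_null_char .and. n < 64)
    n = n + 1
  end do
  print '(2a)', 'TensorFlow version: ', transfer(ver_chars(1:n-1), repeat('a', n-1))
end program tf_version_check
```

Running an actual graph means carrying TF_Status, TF_Graph, and TF_Session handles through the same kind of bindings, which is where it gets complicated.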
It looks more complicated than I expected.
Thank you for the proposal! It sounds like a useful though daunting effort. I won't be able to lead the development but I'd be happy to help. I'm also open to changes to the API.
All this assuming that any added back-end would be optional at build time.
I just became aware of pytorch-fortran.
Very cool that NVIDIA is involved. I see one just needs to define a model and the input/output tensors:
```fortran
type(torch_module) :: torch_mod
type(torch_tensor) :: in_tensor, out_tensor
```
The tensors can be initialized from Fortran arrays and vice versa.
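Going by the project README, the full round trip would look roughly like this; the module name torch_ftn and the method names load, from_array, forward, and to_array are taken from there but should be double-checked upstream:

```fortran
! Rough sketch based on the pytorch-fortran README; module and method names
! (torch_ftn, load, from_array, forward, to_array) should be verified upstream.
program run_torch_model
  use iso_fortran_env, only: real32
  use torch_ftn
  implicit none

  type(torch_module) :: torch_mod
  type(torch_tensor) :: in_tensor, out_tensor

  real(real32) :: input(32, 4)          ! e.g. a batch of 32 samples, 4 features
  real(real32), pointer :: output(:,:)  ! associated by the wrapper on output

  input = 1.0_real32

  call torch_mod%load('traced_model.pt')        ! hypothetical TorchScript file
  call in_tensor%from_array(input)              ! wrap the Fortran array
  call torch_mod%forward(in_tensor, out_tensor)
  call out_tensor%to_array(output)              ! view the result as an array

  print *, 'output shape:', shape(output)
end program run_torch_model
```

The model file itself would be a TorchScript export produced on the Python side.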
It reminds me a bit of the tensor classes by Patrick Seewald: fortran-einsum-example
With TensorFlow I didn't get any further than what I posted at Discourse.
There's also Fortran Torch Adapter, which takes an approach similar to pytorch-fortran.