
Refactor `forward` and `backward` methods to allow passing a batch of data instead of one sample at a time

Open · milancurcic opened this issue on Aug 8, 2023 · 0 comments

In support of #155.

This will impact the `forward` and `backward` methods in:

  • `network` type
  • `layer` type
  • `dense_layer` type
  • `conv2d_layer` type

Effectively, rather than looping over each sample in a batch inside `network % train`, we will pass batches of data all the way down to the lowest level, that is, to the `forward` and `backward` methods of the `dense_layer` and `conv2d_layer` types. Pushing the loop over samples in a batch down the call stack will also allow the implementation of a `batchnorm_layer`, which by definition needs to see the whole batch at once to compute its statistics.
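
For illustration, here is a minimal sketch of what a batched `forward` could look like for the dense layer. The type and component names (`dense_layer_t`, `weights`, `biases`, `output`), the ReLU placeholder activation, and the (features, batch) array layout are assumptions for this sketch, not the library's actual internals:

```fortran
module dense_batched_m
  implicit none

  ! Pared-down stand-in for neural-fortran's dense_layer type;
  ! component names here are illustrative, not the library's actual ones.
  type :: dense_layer_t
    real, allocatable :: weights(:,:)  ! (input_size, output_size)
    real, allocatable :: biases(:)     ! (output_size)
    real, allocatable :: output(:,:)   ! (output_size, batch_size)
  contains
    procedure :: forward => dense_forward_batch
  end type dense_layer_t

contains

  ! Batched forward pass: one matmul over the whole batch replaces
  ! the per-sample loop that currently lives in network % train.
  subroutine dense_forward_batch(self, input)
    class(dense_layer_t), intent(in out) :: self
    real, intent(in) :: input(:,:)  ! (input_size, batch_size)
    integer :: i
    self % output = matmul(transpose(self % weights), input)
    do concurrent (i = 1:size(input, dim=2))
      ! Broadcast the bias across the batch; ReLU stands in for
      ! whatever activation the layer is configured with.
      self % output(:,i) = max(0.0, self % output(:,i) + self % biases)
    end do
  end subroutine dense_forward_batch

end module dense_batched_m
```

The key point is that the whole batch reaches the layer as a rank-2 array, so a single `matmul` does the work of `batch_size` separate matrix-vector products.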

It would also potentially allow more efficient matmuls in the dense and conv layers if we replace the stock `matmul` intrinsic with a more specialized and efficient `sgemm` (or similar) from some flavor of BLAS or MKL.
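
Under the same assumed (features, batch) layout, that replacement could look like the following sketch, which computes the identical product with the standard BLAS `sgemm` routine (the subroutine and module names are hypothetical; link against any BLAS, e.g. `gfortran example.f90 -lblas`):

```fortran
module dense_sgemm_m
  implicit none
contains

  ! Same batched product as before, output = transpose(weights) * input,
  ! but delegated to BLAS sgemm instead of the matmul intrinsic.
  subroutine dense_forward_sgemm(weights, biases, input, output)
    real, intent(in)  :: weights(:,:)  ! (input_size, output_size)
    real, intent(in)  :: biases(:)     ! (output_size)
    real, intent(in)  :: input(:,:)    ! (input_size, batch_size)
    real, intent(out) :: output(:,:)   ! (output_size, batch_size)
    integer :: m, n, k, i
    external :: sgemm                  ! reference BLAS routine

    m = size(weights, 2)  ! output_size
    n = size(input, 2)    ! batch_size
    k = size(weights, 1)  ! input_size

    ! C := alpha * op(A) * op(B) + beta * C, with op(A) = A**T here
    call sgemm('T', 'N', m, n, k, 1.0, weights, k, input, k, 0.0, output, m)

    do concurrent (i = 1:n)
      output(:,i) = output(:,i) + biases
    end do
  end subroutine dense_forward_sgemm

end module dense_sgemm_m
```

For large batch sizes, the cache-blocked kernels in an optimized BLAS typically outperform the compiler's `matmul` intrinsic, which is the motivation for exposing the batch dimension at this level in the first place.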
