
Can it be used with accelerate?

Open kamilc opened this issue 7 years ago • 8 comments

This looks fantastic!

One (possibly) trivial question I have is: can this be used with accelerate? My Haskell got a little bit rusty - I can't tell for sure just from looking at the types...

Thanks!

kamilc avatar May 07 '18 11:05 kamilc

Thanks!

As of the recent version 0.2, it should be possible to use it with accelerate, provided the types all have a Backprop instance. I'm currently working on an adapter that gives all the right instances to accelerate types, but there's nothing stopping anyone from experimenting with the instances on their own!

The bigger remaining work is probably porting all of the linear algebra operations along with their gradients, which is a bit of a boilerplatey undertaking (see hmatrix-backprop). Accelerate types might work, but you would also need to lift linear algebra functions (like dot products, matrix multiplications, etc.) by providing their individual gradients for the user. This is something I'm also working on getting out soon!
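To give an idea of the boilerplate involved, lifting a single operation looks something like the sketch below. It uses hmatrix for concreteness and assumes the `Backprop` instance for `Vector Double` that hmatrix-backprop provides; `liftOp2` and `op2` are from the backprop library itself.

```haskell
import Numeric.Backprop
import qualified Numeric.LinearAlgebra as H

-- A lifted dot product: the forward pass is the ordinary <.>, and
-- the backward pass returns the gradient with respect to each
-- argument, scaled by the incoming gradient g.
dot :: Reifies s W
    => BVar s (H.Vector Double)
    -> BVar s (H.Vector Double)
    -> BVar s Double
dot = liftOp2 . op2 $ \x y ->
  ( x H.<.> y
  , \g -> (H.scale g y, H.scale g x)
  )
```

Every operation you want to use under `BVar` needs a lifting like this, which is where most of the boilerplate in hmatrix-backprop comes from.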

mstksg avatar May 07 '18 18:05 mstksg

@mstksg Awesome! Thanks for the pointers to the hmatrix integration - I'll have a look. I can't promise you much help as of now unfortunately but this might change in the near future.

kamilc avatar May 08 '18 08:05 kamilc

Have you seen https://github.com/tmcdonell/accelerate-blas?

It could be a good place to start augmenting with backprop variables.

cpennington avatar May 16 '18 21:05 cpennington

Hi all,

Did anyone try to use backprop with accelerate?

vincent-hui avatar Oct 21 '18 08:10 vincent-hui

I am trying to figure it out now. My problem is:

if a neural network with fully connected layers is represented as

network :: Reifies s W
        => BVar s (Acc (Matrix Float))  -- ^ Inputs
        -> BVar s [Acc (Matrix Float)]  -- ^ Weights
        -> BVar s (Acc (Matrix Float))  -- ^ Outputs

then how do I represent [Acc (Matrix Float)] as Arrays a => Acc a?
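One thing I noticed while digging: backprop already ships a `Backprop a => Backprop [a]` instance, so the list may never need to become a single `Acc` array at all. `sequenceVar` splits a `BVar` of a list into a list of `BVar`s, and gradients flow back into the original list automatically. A sketch with scalar "weights" just to show the mechanism:

```haskell
import Numeric.Backprop
import Data.List (foldl')

-- sequenceVar :: (Backprop a, Reifies s W) => BVar s [a] -> [BVar s a]
-- Scalars stand in for the weight matrices here; the same shape of
-- code should work with any type that has a Backprop instance.
chain :: Reifies s W => BVar s Double -> BVar s [Double] -> BVar s Double
chain x ws = foldl' (*) x (sequenceVar ws)

-- evalBP2 chain 2 [3,4] == 24
-- gradBP2 chain 2 [3,4] gives the gradients with respect to the
-- input and every weight in the list.
```

So the remaining work would be the per-layer function on `BVar s (Acc (Matrix Float))`, not the list representation.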

An additional problem is the discrepancy between the pure equations assumed by the backprop library and the impure, GPU-specific operations of the Accelerate DSL, which might ultimately result in performance poor enough to cancel out the advantages of using a GPU.

masterdezign avatar May 06 '19 16:05 masterdezign

I am using the matrix-matrix multiplication (<>) from accelerate-blas package as a basis of a linear layer:

linear :: (Reifies s W, Numeric e)
=> BVar s (Acc (Matrix e))
-> BVar s (Acc (Matrix e)) 
-> BVar s (Acc (Matrix e))
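Lifting (<>) itself might look something like the following. This is untested; it assumes a `Backprop` instance for `Acc (Matrix e)` exists and uses the standard matrix-product gradients, dL/dA = G Bᵀ and dL/dB = Aᵀ G:

```haskell
{-# LANGUAGE FlexibleContexts #-}

import Prelude hiding ((<>))
import Numeric.Backprop
import qualified Data.Array.Accelerate as A
import Data.Array.Accelerate (Acc, Matrix)
import Data.Array.Accelerate.Numeric.LinearAlgebra (Numeric, (<>))

-- Lifted matrix-matrix product: forward pass is (<>) from
-- accelerate-blas, backward pass produces the gradient for each
-- operand from the incoming gradient g.
matMul :: (Reifies s W, Numeric e, Backprop (Acc (Matrix e)))
       => BVar s (Acc (Matrix e))
       -> BVar s (Acc (Matrix e))
       -> BVar s (Acc (Matrix e))
matMul = liftOp2 . op2 $ \a b ->
  ( a <> b
  , \g -> (g <> A.transpose b, A.transpose a <> g)
  )
```

The linear layer would then be `matMul` plus a lifted bias addition and activation.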

masterdezign avatar May 06 '19 17:05 masterdezign

One approach might be to give Acc a backprop instance, like

instance Backprop (Acc a) where
    zero :: Acc a -> Acc a
    one :: Acc a -> Acc a
    plus :: Acc a -> Acc a -> Acc a

As long as you can assemble the DSL purely, it should work. One issue that might come up, however, is the partsVar mechanic, which uses lens-based accessors.
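Filled in for a concrete array type, the sketch could look something like this (untested; it assumes `Num (A.Exp e)` so the constant fills make sense, and everything stays inside the DSL, so nothing runs until the whole computation is compiled):

```haskell
{-# LANGUAGE FlexibleContexts  #-}
{-# LANGUAGE FlexibleInstances #-}

import Numeric.Backprop.Class (Backprop (..))
import qualified Data.Array.Accelerate as A

-- zero and one overwrite every element with a constant;
-- plus adds the two arrays elementwise.
instance (A.Shape sh, A.Elt e, Num (A.Exp e))
    => Backprop (A.Acc (A.Array sh e)) where
  zero = A.map (const 0)
  one  = A.map (const 1)
  plus = A.zipWith (+)
```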

mstksg avatar Aug 27 '19 20:08 mstksg

I would really love to try, but I am really, really afraid of getting a result similar to this one. Are there any limitations that you are aware of?

masterdezign avatar Sep 14 '19 12:09 masterdezign