
Question about backprop

Open yousifKashef opened this issue 2 years ago • 5 comments

Hi Matt,

Is it possible to get the gradient from backpropagation using Core ML or Metal? I am trying to implement an adversarial attack on a Core ML ResNet-50 and don't know how to go about it.
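For context, a gradient-based attack like FGSM perturbs the input in the direction of the loss gradient, so the one thing the framework must supply is d(loss)/d(input). A minimal numpy sketch of the idea, using a toy logistic-regression model as a stand-in for ResNet-50 (all names and shapes here are illustrative; no Core ML involved):

```python
import numpy as np

# Toy logistic-regression "model" standing in for ResNet-50; a
# gradient-based attack (e.g. FGSM) needs d(loss)/d(input), which
# is the quantity Core ML's API does not expose.
rng = np.random.default_rng(0)
W = rng.standard_normal((10, 4))    # 10 classes, 4 input features (toy sizes)
x = rng.standard_normal(4)          # the "image" to attack
y = 3                               # its true label

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Forward pass: logits -> softmax -> cross-entropy loss.
p = softmax(W @ x)
loss = -np.log(p[y])

# Backward pass by hand: d(loss)/d(logits) = p - onehot(y),
# then the chain rule through the linear layer gives d(loss)/dx.
dlogits = p.copy()
dlogits[y] -= 1.0
dx = W.T @ dlogits

# FGSM step: nudge the input in the direction that increases the loss.
eps = 0.1
x_adv = x + eps * np.sign(dx)
loss_adv = -np.log(softmax(W @ x_adv)[y])
```

Everything here is plain math; for a real network the hard part is getting `dx` out of the inference framework, which is exactly what the question asks.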

By the way, the tutorials you post are excellent.

yousifKashef avatar Aug 15 '23 03:08 yousifKashef

Core ML: no, gradients are not exposed in the API.

Metal: yes, but you'll have to re-implement the model using the lowest-level MPS primitives, such as MPSCNNConvolutionGradient.
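For reference, the quantity a convolution-gradient kernel produces can be written out by hand. A numpy sketch of the 1D, single-channel case (MPSCNNConvolutionGradient is the batched, N-dimensional GPU version of the same math; the function names here are made up for illustration):

```python
import numpy as np

# What a convolution-gradient kernel computes, written out for a 1D,
# single-channel case so the math is visible.
def conv1d(x, w):
    # "valid" cross-correlation, the usual deep-learning convolution
    n = len(x) - len(w) + 1
    return np.array([np.dot(x[i:i + len(w)], w) for i in range(n)])

def conv1d_grad(x, w, dy):
    # dy is the upstream gradient dL/dy flowing in from the next layer.
    # Gradient wrt the weights: correlate the input with dy.
    dw = np.array([np.dot(x[k:k + len(dy)], dy) for k in range(len(w))])
    # Gradient wrt the input: "full" convolution of dy with w
    # (equivalently, correlation with the flipped kernel).
    dx = np.convolve(dy, w, mode="full")
    return dx, dw

# Sanity check against numerical differentiation of the toy loss L = sum(y).
rng = np.random.default_rng(1)
x, w = rng.standard_normal(8), rng.standard_normal(3)
dy = np.ones(len(x) - len(w) + 1)   # dL/dy = 1 for L = sum(y)
dx, dw = conv1d_grad(x, w, dy)

eps = 1e-6
num_dx0 = (conv1d(x + eps * np.eye(8)[0], w).sum() - conv1d(x, w).sum()) / eps
```

The MPS class spares you writing this per layer type, but you still have to wire the kernels together yourself.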

hollance avatar Aug 15 '23 13:08 hollance

I see. So it would be:

1. Implement the model using MPS primitives.
2. Load the weights and biases into the MPS model.

What happens next? Are there Metal functions for the forward and backward passes, or would those also need to be implemented?

yousifKashef avatar Aug 15 '23 14:08 yousifKashef

It's possible a nicer API is available these days (I haven't used MPS in a while), but in the past you had to implement both the forward and backward pass yourself. So a very simple model would be MPSLinearLayer -> MPSLossFunction -> MPSLinearLayerGradient, where MPSLinearLayerGradient is the backward pass.
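That chain can be sketched in plain numpy to show the data flow you'd be encoding: forward stages in order, then the gradient stage, passing each upstream gradient along. Variable names and shapes below are illustrative stand-ins, not MPS API:

```python
import numpy as np

rng = np.random.default_rng(2)
W = rng.standard_normal((3, 5))     # linear-layer weights (3 classes, 5 inputs)
b = np.zeros(3)                     # linear-layer bias
x = rng.standard_normal(5)          # input
y = 1                               # true label

# --- forward pass (the "MPSLinearLayer -> MPSLossFunction" stages) ---
z = W @ x + b                       # linear layer
p = np.exp(z - z.max()); p /= p.sum()
loss = -np.log(p[y])                # softmax cross-entropy loss

# --- backward pass (the "MPSLinearLayerGradient" stage) ---
dz = p.copy(); dz[y] -= 1.0         # dL/d(logits) for softmax cross-entropy
dW = np.outer(dz, x)                # dL/dW, used to update the weights
db = dz.copy()                      # dL/db
dx = W.T @ dz                       # dL/dx, handed back to the previous layer
```

On the GPU each of these steps would roughly become one kernel encoded on a command buffer, but the data flow is the same; for the adversarial-attack use case, `dx` at the first layer is the output you're after.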

hollance avatar Aug 15 '23 19:08 hollance

This sounds like the current API. Does your book (or any of your articles) have an example implementation of this?

yousifKashef avatar Aug 16 '23 04:08 yousifKashef

I don't have any examples for this, unfortunately.

hollance avatar Aug 16 '23 19:08 hollance