
[REQUEST] How to access the gradients (to manipulate them) while the model is training?

BilgehanSel opened this issue 2 years ago · 3 comments

Does DeepSpeed offer an API to access the gradients during training for any of the stages (1, 2, 3)? When I try to access the gradients, I only get None. I'm mostly interested in stage 3. If there is no such API, is there any way to delve into the DeepSpeed code to access them somehow? I would also be glad if someone just pointed me in the right direction in the source code. I'm not concerned about slowdowns as long as I have access to the gradients. I also want to be able to manipulate/change the gradients before the optimizer step.

BilgehanSel avatar Apr 19 '23 12:04 BilgehanSel

@BilgehanSel, please see https://deepspeed.readthedocs.io/en/latest/zero3.html#debugging

tjruwase avatar Apr 19 '23 13:04 tjruwase

Thank you for the response. I can confirm that I am able to access the gradients with the deepspeed.utils.safe_get_full_grad() function. But, as also asked in my original question, how can I assign new values to the gradients?

BilgehanSel avatar Apr 19 '23 22:04 BilgehanSel
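For readers landing on this thread, the getter from the ZeRO-3 debugging docs linked above can be wrapped in a small helper. This is a minimal sketch, not DeepSpeed's recommended pattern: it assumes `engine` is a model engine returned by `deepspeed.initialize()` and that `backward()` has just run.

```python
def log_full_grad_norms(engine):
    """Collect the full (un-partitioned) gradient norm of every parameter.

    Sketch only: assumes `engine` is a DeepSpeed model engine and that
    engine.backward(loss) has just been called, so gradients exist.
    """
    from deepspeed.utils import safe_get_full_grad  # works under ZeRO stages 1-3

    norms = {}
    for name, param in engine.module.named_parameters():
        # Gathers the full gradient even when it is partitioned across ranks;
        # returns None if no gradient is available yet.
        grad = safe_get_full_grad(param)
        if grad is not None:
            norms[name] = grad.norm().item()
    return norms
```

Note that `safe_get_full_grad` returns a gathered copy for inspection; as the rest of this thread discusses, writing new values back requires a separate setter API.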

I was also wondering how to set the grad. Thank you in advance if someone could provide a solution.

This is particularly helpful when partially updating token embeddings: only a few indices are updated, while the grads of all other rows are set to zero before optimizer.step().

Luodian avatar Apr 20 '23 18:04 Luodian
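Outside of ZeRO's partitioning, the embedding trick described above can be demonstrated in plain PyTorch: zero the gradient rows of every token except the ones you want to train, between `backward()` and `optimizer.step()`. The model sizes and index choices below are purely illustrative.

```python
import torch

# Tiny embedding table: 10 tokens, 4 dims; we only want to update tokens 2 and 5.
emb = torch.nn.Embedding(10, 4)
opt = torch.optim.SGD(emb.parameters(), lr=0.1)
trainable_ids = torch.tensor([2, 5])

tokens = torch.tensor([1, 2, 5, 7])  # batch touches rows 1, 2, 5, 7
loss = emb(tokens).sum()
loss.backward()

# Zero the grad of every row except the trainable ones before the step.
mask = torch.zeros(10, dtype=torch.bool)
mask[trainable_ids] = True
emb.weight.grad[~mask] = 0.0

before = emb.weight.detach().clone()
opt.step()
changed = (emb.weight.detach() != before).any(dim=1)
# Only rows 2 and 5 have moved; rows 1 and 7 were touched by the batch
# but their gradients were zeroed, so they stay put.
```

Under ZeRO-3 the same idea cannot be applied directly to `param.grad` (it is partitioned and shows up as None), which is exactly why this thread asks for a setter API.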

Any solutions?

pangjh3 avatar Sep 06 '23 13:09 pangjh3

@Luodian, @pangjh3, apologies for the delay. We will add this feature soon to complement others.

tjruwase avatar Sep 06 '23 14:09 tjruwase
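A hedged sketch of how a setter could be used once the promised feature lands. Newer DeepSpeed releases ship setter utilities alongside the `safe_get_*` getters; the name `safe_set_full_grad` used here is probed at runtime rather than assumed, since older versions do not export it.

```python
def overwrite_full_grad(param, new_grad):
    """Attempt to replace a parameter's full gradient under ZeRO.

    Sketch only: probes deepspeed.utils for a `safe_set_full_grad` setter,
    which only newer DeepSpeed releases provide; older versions raise here.
    """
    import deepspeed.utils as dsu

    setter = getattr(dsu, "safe_set_full_grad", None)
    if setter is None:
        raise RuntimeError(
            "This DeepSpeed version has no safe_set_full_grad; "
            "upgrade DeepSpeed or track this issue for the feature."
        )
    # Scatters `new_grad` back into the (possibly partitioned) gradient.
    setter(param, new_grad)
```

Call this between `engine.backward(loss)` and `engine.step()`, mirroring where `param.grad` would normally be edited in plain PyTorch.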

> @Luodian, @pangjh3, apologies for the delay. We will add this feature soon to complement others.

Hi there, has this feature been implemented yet? I also need it for my use case.

song-wx avatar Oct 26 '23 08:10 song-wx