[REQUEST] How to access the gradients (to manipulate them) while the model is training?
Does DeepSpeed offer an API to access the gradients during training for any of the stages (1, 2, or 3)? When I try to access the gradients, I only get None types. I'm mostly interested in stage 3. If there is no such API, is there any way to delve into the DeepSpeed code to access them somehow? I would also be glad if someone just pointed me in the right direction in the source code. I'm not concerned about any slowdowns as long as I have access to the gradients. I also want to be able to manipulate/change the gradients before the optimizer step.
@BilgehanSel, please see https://deepspeed.readthedocs.io/en/latest/zero3.html#debugging
Thank you for the response. I can confirm that I am able to access the gradients with the deepspeed.utils.safe_get_full_grad() function. But as also stated in my original question, how can I assign new values to the gradients?
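For anyone landing here, a minimal sketch of the read path that works today, using deepspeed.utils.safe_get_full_grad as confirmed above. The names model, ds_config, and data_loader are illustrative placeholders, and the forward pass is assumed to return the loss:

```python
import deepspeed
from deepspeed.utils import safe_get_full_grad

# model and ds_config are assumed to be defined elsewhere
engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)

for batch in data_loader:
    loss = engine(batch)   # assumes the model returns the loss
    engine.backward(loss)

    # Between backward() and step(), gradients are partitioned across ranks;
    # safe_get_full_grad gathers the full fp32 gradient for inspection.
    # Note this triggers communication per parameter, so it is slow.
    for name, param in engine.module.named_parameters():
        grad = safe_get_full_grad(param)
        if grad is not None:
            print(name, grad.norm().item())

    engine.step()
```

Note that this is read-only: the gathered tensor is a copy, so mutating it does not change the partitioned gradient that the optimizer actually sees, which is why a setter API is being requested here.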
I was also wondering how to set the grad. Thanks in advance to anyone who can provide a solution.
This would be very helpful for partially updating token embeddings: only a few indices are updated, while the gradients of all other rows are set to zero before optimizer.step(). A sketch of that pattern follows below.
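For concreteness, here is what that masking looks like in plain PyTorch, where param.grad is directly writable. Under ZeRO the gradient is partitioned and param.grad is None, so this direct mutation does not apply there; a "safe set" counterpart to safe_get_full_grad (the feature requested in this thread) would be needed instead. All names below are illustrative:

```python
import torch

def mask_embedding_grad(embedding: torch.nn.Embedding, trainable_ids: torch.Tensor):
    """Zero the gradient of every embedding row except those in trainable_ids.

    Plain-PyTorch sketch of the use case described above. Under ZeRO stages
    1-3 embedding.weight.grad is None because gradients are partitioned, so
    this only illustrates the update DeepSpeed would need to support natively.
    """
    grad = embedding.weight.grad
    if grad is None:
        return
    # Build a per-row mask: 1 for trainable rows, 0 for frozen rows.
    mask = torch.zeros(grad.shape[0], 1, dtype=grad.dtype, device=grad.device)
    mask[trainable_ids] = 1.0
    grad.mul_(mask)  # keep gradients for trainable_ids, zero out the rest

# Usage: call between loss.backward() and optimizer.step(), e.g.
# mask_embedding_grad(model.embed_tokens, torch.tensor([101, 102, 103]))
```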
Any solutions?
@Luodian, @pangjh3, apologies for the delay. We will add this feature soon to complement the others.
Hi there, have you implemented this feature yet? I also need it for my use case.