
Feedback about Optimizing Model Parameters Page

Opened by madhaven, 5 months ago · 0 comments

There is the following issue on this page: https://docs.pytorch.org/tutorials/beginner/basics/optimization_tutorial.html

Within the Full Implementation section, the training loop does not call optimizer.zero_grad() before the backpropagation block, as recommended in the paragraph preceding that section.

Actual code:

# Backpropagation
loss.backward()
optimizer.step()
optimizer.zero_grad()

Recommended code:

optimizer.zero_grad()
loss.backward()
optimizer.step()
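For context on why the ordering matters: loss.backward() accumulates gradients into .grad rather than overwriting them, so skipping zero_grad() before the next backward pass makes the optimizer step on stale, summed gradients. Below is a minimal plain-Python sketch (not PyTorch; Param, backward, step, and zero_grad are hypothetical stand-ins mimicking autograd/optimizer behavior) that illustrates the effect:

```python
class Param:
    """A fake parameter whose .grad accumulates, like autograd's."""
    def __init__(self, value):
        self.value = value
        self.grad = 0.0

def backward(p, new_grad):
    # Like loss.backward(): gradients are *added*, not overwritten.
    p.grad += new_grad

def step(p, lr=0.1):
    # Like optimizer.step(): apply the currently stored gradient.
    p.value -= lr * p.grad

def zero_grad(p):
    # Like optimizer.zero_grad(): reset the accumulated gradient.
    p.grad = 0.0

# Without zeroing: the second step sees grad = 2.0 + 2.0 = 4.0.
p = Param(1.0)
backward(p, 2.0); step(p)   # value: 1.0 - 0.1 * 2.0 = 0.8
backward(p, 2.0); step(p)   # value: 0.8 - 0.1 * 4.0 = 0.4 (stale grad)

# With zeroing before each backward: each step sees only fresh grads.
q = Param(1.0)
zero_grad(q); backward(q, 2.0); step(q)   # 0.8
zero_grad(q); backward(q, 2.0); step(q)   # 0.6
```

Whether zero_grad() comes first in the loop or last after step() is functionally equivalent across iterations; the point of the issue is simply that the tutorial's code should match the ordering its own prose recommends.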

If you could tell me how to make this change to the documentation, I would be glad to do it.

madhaven · Aug 04 '25 14:08