Commit 9a43c95

Register all the parameters in the optimizer (pytorch#1455)
The code at the end registers only the parameters from `model.fc` in the optimizer, although the text underneath says: "Notice although we register all the parameters in the optimizer, the only parameters that are computing gradients (and hence updated in gradient descent) are the weights and bias of the classifier." To be consistent with this explanation, we should be adding all the parameters from the model.
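The distinction the commit message draws can be demonstrated directly. The sketch below (a hypothetical minimal stand-in, not the tutorial's actual ResNet example) freezes a small backbone, attaches a fresh classifier head, registers *all* parameters in the optimizer as the corrected line does, and then checks that only the head's parameters receive gradients:

```python
import torch
from torch import nn, optim

torch.manual_seed(0)

# Stand-in "backbone" (the tutorial uses a pretrained ResNet; this is
# just a small module for illustration).
backbone = nn.Sequential(nn.Linear(4, 8), nn.ReLU())
for param in backbone.parameters():
    param.requires_grad = False  # freeze the backbone

# New, trainable classifier head (plays the role of ``model.fc``).
fc = nn.Linear(8, 2)
model = nn.Sequential(backbone, fc)

# Register ALL parameters, as the corrected tutorial line does.
optimizer = optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)

loss = model(torch.randn(3, 4)).sum()
loss.backward()

# Only the head's parameters accumulated gradients; the frozen
# backbone's parameters have ``grad is None``, so optimizer.step()
# leaves them untouched even though they are registered.
print(all(p.grad is not None for p in fc.parameters()))        # head trained
print(all(p.grad is None for p in backbone.parameters()))      # backbone frozen
```

Registering every parameter is harmless here because `optimizer.step()` skips parameters whose `.grad` is `None`, which is exactly what the sentence quoted above relies on.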
1 parent 6fb4757 commit 9a43c95

File tree

1 file changed: +1 addition, -1 deletion


beginner_source/blitz/autograd_tutorial.py

Lines changed: 1 addition & 1 deletion
@@ -299,7 +299,7 @@
 # The only parameters that compute gradients are the weights and bias of ``model.fc``.

 # Optimize only the classifier
-optimizer = optim.SGD(model.fc.parameters(), lr=1e-2, momentum=0.9)
+optimizer = optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)

 ##########################################################################
 # Notice although we register all the parameters in the optimizer,

0 commit comments
