Aggregating gradients before optimizer step #18383
Unanswered
spivakoa asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Hello,
I would like to aggregate model gradients over several backward passes before the optimizer step.
In PyTorch I would do something like this:
for _ in range(4):
    pred = model(x)                  # forward pass on a batch
    loss = Loss_function(pred, gt)
    loss.backward()                  # gradients accumulate in the .grad buffers
optimizer.step()                     # single update using the summed gradients
optimizer.zero_grad()                # clear gradients before the next round
How can I achieve this in Lightning?
Thank you,
Alex.
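For readers landing on this unanswered thread: a minimal sketch of one way to do this, using the Trainer's accumulate_grad_batches argument, which calls backward() on several batches and only then runs optimizer.step(). MyModel and train_loader below are hypothetical placeholders, not names from this thread.

import pytorch_lightning as pl

# Sketch: accumulate gradients over 4 batches per optimizer step.
# `MyModel` is a stand-in for your own LightningModule, and
# `train_loader` for your DataLoader.
model = MyModel()
trainer = pl.Trainer(accumulate_grad_batches=4, max_epochs=10)
trainer.fit(model, train_loader)

Alternatively, setting self.automatic_optimization = False in the LightningModule and calling self.manual_backward(loss) inside training_step gives full manual control over when optimizer.step() and optimizer.zero_grad() run.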