How to customize amp in pytorch-lightning #14356
Replies: 3 comments 5 replies
-
The implementation sits in
-
Maybe I didn't make it very clear in the original question. For example, I override
-
@MooreManor did you find a solution for that? I am curious how you finally solved this, as I am facing the exact same obstacle now.
-
To use AMP (autocast) in PyTorch, the pipeline is listed below. I want to reimplement the same procedure in pytorch-lightning, but I don't know where to move the calls to scaler.scale(loss).backward(), scaler.step(optimizer), and scaler.update(), because they are encapsulated in PL. I want to customize these parts, but I don't know what the corresponding APIs in PL are.
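The code snippet the question refers to did not survive in this copy of the thread. For context, the standard PyTorch AMP training step looks roughly like this; the model, optimizer, and loss below are placeholders, not the asker's actual code.

```python
import torch
from torch.cuda.amp import autocast, GradScaler

# Placeholder model and optimizer standing in for the lost snippet.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# GradScaler is a no-op when disabled, so this also runs on CPU-only machines.
scaler = GradScaler(enabled=torch.cuda.is_available())


def training_step(batch, target):
    optimizer.zero_grad()
    # Run the forward pass under autocast so eligible ops use float16.
    with autocast(enabled=torch.cuda.is_available()):
        loss = torch.nn.functional.mse_loss(model(batch), target)
    # Scale the loss before backward, step through the scaler so gradients
    # are unscaled first, then update the scale factor for the next step.
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
    return loss.item()
```

These three scaler calls are exactly the ones the question asks about: in Lightning's automatic optimization they are issued internally by the precision plugin rather than by user code.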