Optimize dynamically generated modules #8996
Unanswered
ajoino asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
2 comments · 2 replies
- Out of curiosity, are you using PyTorch lazy modules? Is it possible to initialize your module in the `setup` hook? Or, if you have an example batch of data before training, is it possible to initialize your module outside of the LightningModule and then pass it in?
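A minimal sketch of both suggestions, assuming toy layer sizes and an input shape that are not from the original thread:

```python
import torch
from torch import nn

class LazyNet(nn.Module):
    """Backbone whose parameter shapes are inferred from the first batch."""
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(
            nn.LazyLinear(128),  # in_features inferred on first forward
            nn.ReLU(),
            nn.LazyLinear(10),
        )

    def forward(self, x):
        return self.layers(x)

# Materialize the lazy parameters with an example batch *before* the
# Trainer calls configure_optimizers, so the optimizer sees all of them.
net = LazyNet()
example_batch = torch.randn(4, 37)  # hypothetical input shape
net(example_batch)                  # dry run fixes all parameter shapes

# The fully initialized module can now be passed into a LightningModule:
# model = MyLightningModule(net); trainer.fit(model, datamodule)
```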
- Dear @ajoino, we currently have a similar mechanism within the `BaseFinetuning` callback. Best,
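For reference, the relevant pattern in `BaseFinetuning` is that `unfreeze_and_add_param_group` registers parameters with an optimizer that already exists. A rough sketch, assuming a hypothetical `backbone` attribute and unfreeze schedule; the `finetune_function` signature shown follows Lightning 2.x (older releases also pass an optimizer index):

```python
from pytorch_lightning.callbacks import BaseFinetuning

class UnfreezeBackbone(BaseFinetuning):
    def __init__(self, unfreeze_at_epoch=5):
        super().__init__()
        self.unfreeze_at_epoch = unfreeze_at_epoch  # hypothetical schedule

    def freeze_before_training(self, pl_module):
        # Keep these parameters out of training at first
        self.freeze(pl_module.backbone)  # assumes a `backbone` attribute

    def finetune_function(self, pl_module, current_epoch, optimizer):
        if current_epoch == self.unfreeze_at_epoch:
            # The key mechanism: this calls optimizer.add_param_group()
            # under the hood, so parameters can join an optimizer that
            # was created before they became trainable.
            self.unfreeze_and_add_param_group(
                modules=pl_module.backbone,
                optimizer=optimizer,
                lr=1e-4,  # hypothetical learning rate for the new group
            )
```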
- I have a network that adds modules dynamically depending on the data, but since those modules are added after configure_optimizers has run, they will not be optimized during training. I have read that you can customize the training loop with manual_backward to do something like this, but I am not sure exactly how. In more detail, my network basically looks like this: (code example omitted). How would I add a manual_backward here, or could this be done with some trick to make the optimizer re-evaluate self.parameters() on each run?
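Not from the original thread, but a minimal sketch of the manual-optimization route the question asks about: disable automatic optimization, call `self.manual_backward(loss)` yourself, and register newly created parameters with the running optimizer via `add_param_group`. The growth rule, layer sizes, and learning rate here are all made up:

```python
import torch
from torch import nn
import pytorch_lightning as pl

class DynamicModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.automatic_optimization = False  # enables manual_backward
        self.blocks = nn.ModuleList([nn.Linear(16, 16)])

    def configure_optimizers(self):
        # Only the parameters that exist right now are registered here
        return torch.optim.Adam(self.parameters(), lr=1e-3)

    def should_grow(self, x):
        return False  # placeholder for a real data-dependent condition

    def maybe_grow(self, x):
        if self.should_grow(x):
            new_block = nn.Linear(16, 16).to(self.device)
            self.blocks.append(new_block)
            # Register the new parameters with the existing optimizer.
            # self.optimizers() returns a thin wrapper; .optimizer is the
            # underlying torch.optim.Adam.
            self.optimizers().optimizer.add_param_group(
                {"params": new_block.parameters()}
            )

    def training_step(self, batch, batch_idx):
        x, y = batch
        self.maybe_grow(x)
        for block in self.blocks:
            x = torch.relu(block(x))
        loss = nn.functional.mse_loss(x, y)

        opt = self.optimizers()
        opt.zero_grad()
        self.manual_backward(loss)  # replaces loss.backward()
        opt.step()
        return loss
```

The same `add_param_group` call should also work with automatic optimization left on, e.g. from a hook such as `on_train_batch_start`, which avoids hand-writing the zero_grad/backward/step sequence.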