How to modify model states before saving checkpoints? #18887
Unanswered
thangld201
asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment 2 replies
-
I'm using Optimum and BetterTransformer in a LightningModule. When I apply BetterTransformer to the original BERT model, its state_dict changes, so if the state_dict is saved as-is there will be compatibility problems when loading it later (with from_pretrained, etc.). So I want to revert the BERT model to its normal form before saving the checkpoint (assuming the BERT model is wrapped in a LightningModule).
-
You can do this in the LightningModule's on_save_checkpoint hook:
https://lightning.ai/docs/pytorch/stable/common/lightning_module.html#on-save-checkpoint
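A minimal sketch of how that hook could be used here, assuming the converted model is stored as self.model, that your Optimum version provides BetterTransformer.reverse, and that Lightning prefixes this submodule's parameter names with "model." (the class and attribute names are illustrative, not the asker's actual code):

```python
import copy

import pytorch_lightning as pl
from optimum.bettertransformer import BetterTransformer


class LitBert(pl.LightningModule):
    def __init__(self, bt_model):
        super().__init__()
        # `bt_model` is assumed to be a BERT model already converted
        # with BetterTransformer.transform(...).
        self.model = bt_model

    def on_save_checkpoint(self, checkpoint):
        # Revert a *copy*, so the model still used for training keeps
        # its BetterTransformer layout.
        vanilla = BetterTransformer.reverse(copy.deepcopy(self.model))
        # Overwrite the weights Lightning is about to serialize with the
        # original Hugging Face layout; the "model." prefix assumes that
        # is how Lightning namespaces this submodule's parameters.
        checkpoint["state_dict"] = {
            f"model.{name}": tensor
            for name, tensor in vanilla.state_dict().items()
        }
```

The deepcopy avoids touching the live model at the cost of a temporary copy in memory; if that is too expensive, reversing in place, snapshotting the state_dict, and re-applying BetterTransformer.transform afterwards is an alternative.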