How to ignore certain "parameters" with model checkpointing? #9627
-
I have some data that I store on my LightningModule during validation, and I want to prevent it from being saved by the model checkpoint. These values are not actually parameters and do not affect the state at all. I still want to keep the rest of the state, so I don't want to use weights-only saving. Is it possible to do this?
-
Out of curiosity, what are the states? You could try removing these states within the LightningModule's `on_save_checkpoint` hook.
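If the values happen to be tensors registered as buffers (an assumption about the asker's setup, not something stated in the thread), another option is to register them as non-persistent so they never enter the `state_dict` in the first place. A minimal sketch; the buffer name `val_cache` is hypothetical:

```python
import torch
import pytorch_lightning as pl


class MyModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        # persistent=False keeps the tensor on the module (and moves it with
        # the module across devices) but excludes it from state_dict(), so it
        # never appears in a checkpoint. "val_cache" is a hypothetical name.
        self.register_buffer("val_cache", torch.zeros(10), persistent=False)
```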
-
Hey @jmerkow,

The checkpoint is generated from the `dump_checkpoint` function of the `CheckpointConnector`. One of the last hooks called is `lightning_module.on_save_checkpoint(checkpoint)`, here: https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/trainer/connectors/checkpoint_connector.py#L386. This is done in-place. Therefore, you could do the following.
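The code in the reply is truncated; a minimal completion, assuming the unwanted values end up under keys of `checkpoint["state_dict"]` (the key name `val_outputs` is hypothetical):

```python
import pytorch_lightning as pl


class MyModel(pl.LightningModule):
    def on_save_checkpoint(self, checkpoint):
        # `checkpoint` is the full dict that gets written to disk; it holds
        # "state_dict" along with entries such as "epoch" and
        # "optimizer_states". Because the hook receives the same object the
        # trainer serializes, mutating it here is enough.
        # Pop the keys you are not interested in; "val_outputs" is a
        # hypothetical key used for illustration.
        checkpoint["state_dict"].pop("val_outputs", None)
```

Note that if keys are removed from `state_dict`, reloading the checkpoint later will report missing keys unless you pass `strict=False`, e.g. `MyModel.load_from_checkpoint(path, strict=False)`.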