
How to ignore certain "parameters" with model checkpointing? #9627


Hey @jmerkow,

The checkpoint is generated by the dump_checkpoint function of the CheckpointConnector. One of the last hooks invoked is lightning_module.on_save_checkpoint(checkpoint), here: https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/trainer/connectors/checkpoint_connector.py#L386

This hook modifies the checkpoint dictionary in place.

Therefore, you could do the following:

class MyModel(LightningModule):

    def on_save_checkpoint(self, checkpoint):
        # pop the keys you are not interested in; the dict is mutated in place
        # ("some_frozen_layer.weight" is a placeholder for your own key name)
        checkpoint["state_dict"].pop("some_frozen_layer.weight", None)
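Because the checkpoint Lightning hands to on_save_checkpoint is a plain dict with a "state_dict" entry, you can test the pruning logic without the framework. A minimal sketch (the "frozen_backbone." prefix and the key names are hypothetical, chosen only for illustration):

```python
def prune_checkpoint(checkpoint):
    """Drop state_dict entries we do not want persisted (in place)."""
    for key in list(checkpoint["state_dict"]):
        # "frozen_backbone." is a hypothetical prefix for parameters to skip
        if key.startswith("frozen_backbone."):
            checkpoint["state_dict"].pop(key)

checkpoint = {
    "state_dict": {
        "frozen_backbone.conv.weight": [0.1],
        "head.linear.weight": [0.2],
    },
    "epoch": 3,
}
prune_checkpoint(checkpoint)
print(sorted(checkpoint["state_dict"]))  # → ['head.linear.weight']
```

Note that iterating over list(checkpoint["state_dict"]) takes a snapshot of the keys, so popping entries during the loop is safe. Keys you remove here will be reported as missing when the checkpoint is loaded strictly, so you may need load_state_dict(..., strict=False) (or LightningModule's strict loading options) on the restore side.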

Answer selected by jmerkow