load checkpoint resume from CLI, different learning rate #18339
Answered by mauvilsa
mshooter asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
-
Hi, how do I load the weights of the LightningModule but change the learning rate using the CLI?
Answered by mauvilsa, Aug 18, 2023
-
This depends on the modules and CLI you implemented. If you implemented the minimal LightningCLI, the optimizer is already configurable from the command line, so you can resume from a checkpoint while overriding the learning rate directly:

```shell
# Initial run
python cli.py fit [some settings] --optimizer SGD --optimizer.lr 0.01
# Second run: resume from the checkpoint, but with a different learning rate
python cli.py fit [other settings] --optimizer SGD --optimizer.lr 0.05 --ckpt_path lightning_logs/.../checkpoints/[some name].ckpt
```
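To illustrate the idea behind those two runs without requiring Lightning itself, here is a framework-free sketch: the dotted flags (`--optimizer.lr`, `--ckpt_path`) mirror the LightningCLI options above, but the parsing is plain `argparse`, and the checkpoint path is a made-up placeholder. The point is that the weights come from the checkpoint while the optimizer settings come entirely from the new command line.

```python
import argparse

def parse_fit_args(argv):
    # Mimics the shape of the LightningCLI flags used above; this is
    # NOT the real LightningCLI parser, just an illustration.
    parser = argparse.ArgumentParser(prog="cli.py")
    parser.add_argument("--optimizer", default="SGD")
    parser.add_argument("--optimizer.lr", dest="optimizer_lr",
                        type=float, default=0.01)
    parser.add_argument("--ckpt_path", default=None)
    return parser.parse_args(argv)

# Second run: same optimizer class, higher learning rate, weights
# restored from a (hypothetical) checkpoint path.
args = parse_fit_args([
    "--optimizer", "SGD",
    "--optimizer.lr", "0.05",
    "--ckpt_path", "lightning_logs/version_0/checkpoints/last.ckpt",
])
print(args.optimizer, args.optimizer_lr)  # SGD 0.05
```

In the real CLI, `--ckpt_path` is forwarded to `Trainer.fit`, which restores the model weights, while the new `--optimizer.lr` value is used when the optimizer is instantiated for this run.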
2 replies
It depends on what you implement. What LightningCLI does is expose the parameters in `__init__` as configurable from the command line and config files. By `self.learning_rate` do you mean that you added an init parameter for the learning rate? Note that the CLIs have a help that explains how to use them, i.e. `python cli.py --help`. Also, look at the CLI documentation. The use of the automatic `configure_optimizers`, both for optimizers and schedulers, is explained here. Though, if you have defined a learning rate parameter, then it seems you are not doing that. A more advanced way of making optimizers and schedulers configurable (via dependency injection) is explained in multiple-optimizers-and-schedulers.
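As a sketch of the "init parameter" pattern the reply describes, here is a minimal, framework-free class: the names (`MyModel`, `learning_rate`, `configure_optimizers`) mirror the LightningModule API, but there is no Lightning or torch import here, so the optimizer is stubbed as a plain dict. In a real module, LightningCLI would expose `learning_rate` as `--model.learning_rate` on the command line.

```python
# Sketch only: a real implementation would subclass
# lightning.pytorch.LightningModule and return a torch optimizer.
class MyModel:
    def __init__(self, learning_rate: float = 0.01):
        # Because learning_rate is an __init__ parameter, LightningCLI
        # would pick it up and make it configurable from the CLI and
        # from config files.
        self.learning_rate = learning_rate

    def configure_optimizers(self):
        # Real code would be something like:
        #   return torch.optim.SGD(self.parameters(), lr=self.learning_rate)
        return {"optimizer": "SGD", "lr": self.learning_rate}

model = MyModel(learning_rate=0.05)
print(model.configure_optimizers())  # {'optimizer': 'SGD', 'lr': 0.05}
```

With this pattern you override the learning rate per run via the model argument (e.g. `--model.learning_rate 0.05`), instead of via `--optimizer.lr`, which only applies when you use the automatic `configure_optimizers` support.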