
load checkpoint resume from CLI, different learning rate #18339

It depends on what you implement. What LightningCLI does is expose the parameters of __init__ as configurable from the command line and from config files. By self.learning_rate, do you mean that you added an __init__ parameter for the learning rate? A minimal sketch of that pattern is shown below.
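
For illustration, here is a minimal sketch of a module where the learning rate is an __init__ parameter, so LightningCLI exposes it as --model.learning_rate. The file name cli.py and the class name LitModel are illustrative, not from the original discussion.

```python
import torch
from lightning.pytorch import LightningModule
from lightning.pytorch.cli import LightningCLI


class LitModel(LightningModule):
    def __init__(self, learning_rate: float = 1e-3):
        super().__init__()
        self.save_hyperparameters()  # records learning_rate in the checkpoint
        self.learning_rate = learning_rate
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        # The configurable __init__ value is used to build the optimizer.
        return torch.optim.Adam(self.parameters(), lr=self.learning_rate)


if __name__ == "__main__":
    LightningCLI(LitModel)
```

With this, `python cli.py fit --model.learning_rate 0.01` (or the equivalent entry in a YAML config) sets the value, and `python cli.py fit --help` lists it under the model options.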

Note that the CLIs have a built-in help that explains how to use them, i.e. python cli.py --help. Also look at the CLI documentation. The use of the automatic configure_optimizers, for both optimizers and schedulers, is explained here. Though if you have defined a learning rate parameter, it seems you are not doing that. A more advanced way of making optimizers and schedulers configurable (via dependency injection) is explained in multiple-optimizers-and-schedulers.
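
As a rough sketch of that dependency-injection style (assuming a recent Lightning release that provides lightning.pytorch.cli.OptimizerCallable; class and file names are again illustrative), the optimizer itself becomes a configurable __init__ parameter instead of a hard-coded learning rate:

```python
import torch
from lightning.pytorch import LightningModule
from lightning.pytorch.cli import LightningCLI, OptimizerCallable


class LitModel(LightningModule):
    def __init__(self, optimizer: OptimizerCallable = torch.optim.Adam):
        super().__init__()
        self.optimizer = optimizer
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        # The CLI/config chooses the optimizer class and its arguments
        # (e.g. class_path torch.optim.SGD with init_args.lr); the module
        # only supplies the parameters to optimize.
        return self.optimizer(self.parameters())


if __name__ == "__main__":
    LightningCLI(LitModel)
```

This keeps the module agnostic of which optimizer or scheduler is used, which is the point of the dependency-injection approach described in that documentation page.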
