Context
We landed support for a better, cleaner optimizer-in-backward implementation in #2719! This was implemented only for full_finetune_single_device.py.
To-dos
Feel free to grab one or more of these; just comment on the issue if you'd like to work on one. Thank you so much for helping contribute to torchtune :)
Integrate OptimizerInBackward for PPO and QAT single device recipes.
Integrate OptimizerInBackward into full_finetune_distributed.py. (This should be taken on by someone familiar with the library, not a beginner.)
Integrate OptimizerInBackward into all distributed recipes that currently use optimizer-in-backward; this needs the above to land first.
Deprecate all uses of the hooks previously used to set up an optimizer in backward (e.g. create_optim_in_bwd_wrapper).
@omkar-334 It might be tricky at some points (that's why there's no orange label). But if you're sure it won't cause any trouble, I would love to review your PR.