Integrate new OptimizerInBackward in rest of torchtune codebase #2750

Open
4 tasks

joecummings opened this issue May 19, 2025 · 3 comments

@joecummings
Contributor

joecummings commented May 19, 2025

Context

We landed support for a better, cleaner optimizer-in-backward implementation in #2719! So far it has only been implemented for full_finetune_single_device.py.

To-dos

Feel free to grab one or more of these; just comment on the issue if you'd like to work on one. Thank you so much for helping contribute to torchtune :)

  • Integrate OptimizerInBackward for the PPO and QAT single-device recipes.
  • Integrate OptimizerInBackward into full_finetune_distributed.py (this should be taken on by someone familiar with the library, not a beginner).
  • Integrate OptimizerInBackward into all distributed recipes that currently use it (needs the above to land).
  • Deprecate all hooks previously used to set up an optimizer in backward (e.g. create_optim_in_bwd_wrapper).
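For the last item, a common deprecation pattern is to keep the legacy helper working while emitting a warning that points at the replacement. A minimal sketch, assuming a stub `create_optim_in_bwd_wrapper` (the real torchtune helper has its own signature):

```python
import functools
import warnings

def deprecated(replacement):
    """Decorator that flags a legacy helper as deprecated."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            warnings.warn(
                f"{fn.__name__} is deprecated and will be removed; "
                f"use {replacement} instead.",
                FutureWarning,
                stacklevel=2,
            )
            return fn(*args, **kwargs)
        return wrapper
    return decorator

@deprecated("OptimizerInBackward")
def create_optim_in_bwd_wrapper(*args, **kwargs):
    # Legacy implementation would go here; stubbed for illustration.
    ...
```

Keeping the old entry point alive for a release cycle gives downstream recipes time to migrate before the hooks are removed.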
@omkar-334
Contributor

I'd like to work on this @joecummings

@krammnic
Contributor

@omkar-334 It might be tricky at some points (that's why there is no orange label). But if you are sure it won't cause any problems, I would love to review your PR.

@krammnic
Contributor

krammnic commented Jun 8, 2025

Will take this, but ETA is ~1 week

3 participants