Closed
Description
Hi,
the dev/generate_v2 recipe is not working due to a missing symbol.
It can presumably be fixed by using the method from the distributed generation recipes.
Traceback (most recent call last):
  File "/root/venv/bin/tune", line 10, in <module>
    sys.exit(main())
             ^^^^^^
  File "/root/venv/lib/python3.12/site-packages/torchtune/_cli/tune.py", line 52, in main
    parser.run(args)
  File "/root/venv/lib/python3.12/site-packages/torchtune/_cli/tune.py", line 46, in run
    args.func(args)
  File "/root/venv/lib/python3.12/site-packages/torchtune/_cli/run.py", line 214, in _run_cmd
    self._run_single_device(args, is_builtin=is_builtin)
  File "/root/venv/lib/python3.12/site-packages/torchtune/_cli/run.py", line 108, in _run_single_device
    runpy.run_path(str(args.recipe), run_name="__main__")
  File "<frozen runpy>", line 286, in run_path
  File "<frozen runpy>", line 98, in _run_module_code
  File "<frozen runpy>", line 88, in _run_code
  File "/root/venv/lib/python3.12/site-packages/recipes/dev/generate_v2.py", line 243, in <module>
    sys.exit(main())
             ^^^^^^
  File "/root/venv/lib/python3.12/site-packages/torchtune/config/_parse.py", line 99, in wrapper
    sys.exit(recipe_main(conf))
             ^^^^^^^^^^^^^^^^^
  File "/root/venv/lib/python3.12/site-packages/recipes/dev/generate_v2.py", line 239, in main
    recipe.generate(cfg=cfg)
  File "/root/venv/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/root/venv/lib/python3.12/site-packages/recipes/dev/generate_v2.py", line 170, in generate
    if isinstance(self.model, DeepFusionModel) and is_multimodal_input:
                                                   ^^^^^^^^^^^^^^^^^^^
NameError: name 'is_multimodal_input' is not defined
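A minimal sketch of the kind of fix meant above: compute the missing flag from the parsed messages before the DeepFusionModel check, as the distributed generation recipe does. The helper name and the assumption that the parsed `messages` are in scope at line 170 are hypothetical; `contains_media` is the flag torchtune's `Message` objects use for image content.

```python
# Hypothetical sketch, not the actual patch. Assumes the recipe has the
# parsed prompt `messages` in scope; objects without a `contains_media`
# attribute are treated as text-only.

def has_multimodal_input(messages) -> bool:
    """Return True if any message in the prompt carries media (e.g. images)."""
    return any(getattr(m, "contains_media", False) for m in messages)

# In generate(), before the failing check:
#     is_multimodal_input = has_multimodal_input(messages)
#     if isinstance(self.model, DeepFusionModel) and is_multimodal_input:
#         ...
```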
Repro
tune download meta-llama/Llama-3.2-11B-Vision-Instruct --output-dir /tmp/Llama-3.2-11B-Vision-Instruct --ignore-patterns "original/consolidated*"
tune run dev/generate_v2 --config llama3_2_vision/11B_generation_v2