to_onnx return ONNXProgram #20811


Open
wants to merge 8 commits into master
Conversation

GdoongMathew
Contributor

@GdoongMathew GdoongMathew commented May 11, 2025

What does this PR do?

Fixes #20810

Before submitting
  • Was this discussed/agreed via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or minor internal changes/refactors)

PR review

Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet-list:

Reviewer checklist
  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

📚 Documentation preview 📚: https://pytorch-lightning--20811.org.readthedocs.build/en/20811/

@github-actions github-actions bot added pl Generic label for PyTorch Lightning package dependencies Pull requests that update a dependency file labels May 11, 2025
@GdoongMathew GdoongMathew changed the title Feat/dynamo export onnx WIP: to_onnx return ONNXProgram May 11, 2025
@GdoongMathew
Contributor Author

Hi @Borda, a friendly ping for code review~

BTW, I'm wondering if there's any way to update the typing-extensions requirement during testing so that it's compatible with onnxscript? Thanks.

@deependujha
Contributor

This seems to be the cause of the issue:

typing-extensions >=4.4.0, <4.14.0

I wonder what could be the reason behind this version pinning.

error:

The conflict is caused by:
    pytorch-lightning 2.5.1rc2 depends on typing-extensions<4.14.0 and ==4.4.0
    pytorch-lightning[extra,strategies,test] 2.5.1rc2 depends on typing-extensions<4.14.0 and ==4.4.0
    lightning-utilities 0.10.0 depends on typing-extensions
    torch 2.1.0 depends on typing-extensions
    onnxscript 0.2.5 depends on typing_extensions>=4.10
    onnxscript 0.2.4 depends on typing_extensions>=4.10
    onnxscript 0.2.3 depends on typing_extensions>=4.10
    onnxscript 0.2.2 depends on typing_extensions>=4.10
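The conflict above can be seen with a plain version comparison: the oldest-requirements job effectively pins typing-extensions to 4.4.0, while every onnxscript release listed requires at least 4.10, so no single version satisfies both. A minimal, dependency-free sketch (integer-tuple comparison is a simplification of full PEP 440 ordering):

```python
def parse(version):
    # Compare versions as integer tuples; a simplification of PEP 440 ordering.
    return tuple(int(part) for part in version.split("."))

pinned = parse("4.4.0")           # forced by the oldest-requirements pin (==4.4.0)
onnxscript_floor = parse("4.10")  # onnxscript requires typing_extensions>=4.10

# The pinned version sits below onnxscript's floor, so pip's resolver
# cannot find any version satisfying both constraints at once.
conflict = pinned < onnxscript_floor
print(conflict)  # True
```

This is why simply relaxing the upper bound `<4.14.0` is not enough for the oldest-deps job; the `==4.4.0` lower pin itself is what collides with onnxscript.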

model.eval()

ret = model.to_onnx(dynamo=True)
assert isinstance(ret, torch.onnx.ONNXProgram)
Contributor

Do you think it would be worth adding a check that verifies the output of the exported ONNX model matches the PyTorch model’s output (within tolerance)?

Contributor Author

I think we could use the existing test test_if_inference_output_is_valid for this purpose? I added a parameterized dynamo argument to that unit test in the last commit as well.

Thank you~

file_path: Union[str, Path, BytesIO, None] = None,
input_sample: Optional[Any] = None,
**kwargs: Any,
) -> Union["ONNXProgram", None]:
Contributor

It's technically correct, but since we've only one type, what do you think about Optional["ONNXProgram"] for readability?

Contributor Author

Yes. I've changed it in the last commit. Thanks.
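For context, the suggested change is purely cosmetic: `Optional[X]` is defined as an alias for `Union[X, None]`, so the two annotations are interchangeable at runtime. A quick illustration (using `int` as a stand-in for the `ONNXProgram` forward reference):

```python
from typing import Optional, Union

# Optional[X] is literally Union[X, None]; swapping one spelling for the
# other changes readability only, not behavior or type-checking results.
print(Optional[int] == Union[int, None])  # True
```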

@GdoongMathew
Contributor Author

(quoting @deependujha's comment above on the typing-extensions >=4.4.0, <4.14.0 pin and the resulting resolver conflict)

Yeah, I think that's where the problem is, and there are two solutions:

  1. Update the requirement.
  2. Figure out a way to override the version to fit onnxscript only during testing.

Personally, I'd prefer the second one, but it would probably need some changes to the CI config, which I'm not familiar with.

@GdoongMathew GdoongMathew changed the title WIP: to_onnx return ONNXProgram to_onnx return ONNXProgram May 21, 2025
@GdoongMathew
Contributor Author

Hi @deependujha, all the changes are complete; only the requirement part is left.

And to answer your question: I think the pin is there to test compatibility of all requirements at their lowest supported versions.

Labels
dependencies: Pull requests that update a dependency file
pl: Generic label for PyTorch Lightning package
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Allow to return ONNXProgram when calling to_onnx(dynamo=True)
2 participants