Arm backend: Add dump_delegate_data function #12334
base: main
Conversation
Change-Id: I0b7adcc7a754bb5dd825435104f9ba54f0367222
Signed-off-by: Elena Zhelezina <[email protected]>
Change-Id: Idd3821e5a987c8ef08ffae9e506afce23b3aa3b8
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/12334
Note: Links to docs will display an error until the docs builds have been completed.
❌ 2 New Failures, 2 Cancelled Jobs, 5 Unrelated Failures as of commit dce54c7 with merge base aaf0a4c.
NEW FAILURES - The following jobs have failed:
CANCELLED JOBS - The following jobs were cancelled. Please retry:
BROKEN TRUNK - The following jobs failed but were present on the merge base:
👉 Rebase onto the `viable/strict` branch to avoid these failures
This comment was automatically generated by Dr. CI and updates every 15 minutes.
@@ -1726,6 +1726,75 @@ def dump_executorch_program(
        else:
            print_program(self._emitter_output.program, out=out)

    def dump_delegate_data(  # noqa: C901
        self,
        path: str,
Can you take a TextIO instead, like dump_executorch_program above?
edit: Or just any stream really, since the data isn't text.
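A minimal sketch of the stream-based variant suggested here, assuming the blobs are reachable through `self._emitter_output.program.backend_delegate_data` (the name used in the docstring below); the exact ExecuTorch internals may differ:

```python
from typing import BinaryIO


def dump_delegate_data(self, out: BinaryIO, index: int = 0) -> None:
    """Write one delegate blob to an already-open binary stream.

    Sketch only: assumes each backend_delegate_data entry exposes its raw
    bytes as a `data` field, mirroring the names visible in this diff.
    """
    blob = self._emitter_output.program.backend_delegate_data[index].data
    out.write(blob)
```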
    ) -> None:
        """
        Dumps the delegate blob out of backend_delegate_data to <path><extension>.
        Must have been created with extract_delegate_segments=True.
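For context, a hedged usage sketch of how a caller would satisfy the extract_delegate_segments=True requirement before invoking the new method; the tiny module, example inputs, and output prefix are placeholders, and the Arm/TOSA `to_backend` lowering step is elided:

```python
import torch
from executorch.exir import to_edge, ExecutorchBackendConfig


class TinyModule(torch.nn.Module):  # placeholder model, for illustration only
    def forward(self, x):
        return torch.nn.functional.relu(x)


exported = torch.export.export(TinyModule().eval(), (torch.randn(1, 8),))
edge = to_edge(exported)
# ... lowering with the Arm/TOSA partitioner via to_backend(...) would go here ...
et_program = edge.to_executorch(
    ExecutorchBackendConfig(extract_delegate_segments=True)
)
et_program.dump_delegate_data("delegate_blob")  # writes delegate_blob<extension>
```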
Can you not find the blob if it's embedded in the flatbuffer section?
Thanks @wwwind for the PR.
Is the goal of extracting the blobs to run a delegate subgraph on a TOSA reference model? If yes, then I don't prefer this approach of extracting the blobs and running them.
The main reason is e2e validation. You can have N blobs, which will lead to N tosa_ref_sim calls, and then you might have to stitch the results together manually to see whether the entire network worked correctly or not.
Alternatively, I propose we write a TOSA runtime which works with the TOSA partitioner, so graph breaks can be handled by the portable libs seamlessly. This way you can re-use all the ET infra and have an e2e working TOSA PTE :)
Thank you for the review @digantdesai. Actually, this function is for cases where we don't need the e2e flow. We have a graphics use case where we get these blobs as .tosa files, and then a plugin implements different bits around them in shaders and passes some data to run the subgraphs.
Ok. Curious why we can't dump them after generation from
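For the .tosa workflow described above, a hedged sketch of dumping every delegate blob straight from the emitted program; the attribute path into `_emitter_output` is an assumption based on the names visible in this diff, not a confirmed ExecuTorch API:

```python
def dump_all_tosa_blobs(et_program, prefix: str) -> None:
    """Write every delegate blob to <prefix>_<i>.tosa (illustrative sketch)."""
    delegates = et_program._emitter_output.program.backend_delegate_data
    for i, delegate in enumerate(delegates):
        with open(f"{prefix}_{i}.tosa", "wb") as f:
            # Assumes each entry carries its raw bytes in a `data` field.
            f.write(delegate.data)
```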
Add dump_delegate_data function
Signed-off-by: Elena Zhelezina [email protected]
cc @digantdesai @freddan80 @per @zingo @oscarandersson8218