Export LLM documentation revamp #12381


Open

jackzhxng wants to merge 1 commit into export-llm-docs from jz/export-llm-docs
Conversation

@jackzhxng (Contributor) commented Jul 10, 2025

Summary

Structure:

  • New Getting Started page
  • AOT (export)
    • The old getting started page, which was the NanoGPT tutorial, moves to export-custom-llm.md; its runner sections are split out into run-with-c-plus-plus.md
    • New export-llm.md page for exporting LLMs with the export_llm API (see the sketch after this list)
  • Runtime
    • The iOS/Android app docs remain; they detail the steps to take, once the .pte is generated, to run it on-device
    • Added a C++ runner page for @larryliu0820 to fill out with the new runner APIs
    • Since the QNN Llama tutorial is highly custom, we will leave its export section in place rather than splitting it out as we did for the rest of the tutorials
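For a sense of what the new export-llm.md will cover, here is a minimal sketch of driving the export_llm entry point from Python; the module path, config file name, and flag are assumptions for illustration, not the final documented interface.

```python
# Minimal sketch: invoke the (assumed) export_llm CLI to produce a .pte file.
# The module path and config file below are illustrative assumptions; see the
# export-llm.md page added in this PR for the actual interface.
import subprocess

subprocess.run(
    [
        "python",
        "-m",
        "extension.llm.export.export_llm",  # assumed entry-point module
        "--config",
        "my_llm_config.yaml",  # hypothetical config with model/export options
    ],
    check=True,  # raise CalledProcessError if the export step fails
)
```

The resulting .pte file is what the runtime-side pages (the iOS/Android app docs and the C++ runner page) pick up from.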

@pytorch-bot (bot) commented Jul 10, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/12381

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 New Failure

As of commit 6af121b with merge base dd4488d:

NEW FAILURE - The following job has failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label (This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed.) on Jul 10, 2025

This PR needs a release notes: label

If your change should be included in the release notes (i.e. would users of this library care about this change?), please use a label starting with release notes:. This helps us keep track and include your important work in the next release notes.

To add a label, you can comment to pytorchbot, for example
@pytorchbot label "release notes: none"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.

@jackzhxng force-pushed the jz/export-llm-docs branch 20 times, most recently from 89e52e9 to 0ffe50a on July 11, 2025 at 22:13
@jackzhxng requested a review from larryliu0820 on July 11, 2025 at 22:15
@jackzhxng force-pushed the jz/export-llm-docs branch from 0ffe50a to dc267a8 on July 14, 2025 at 18:36
@jackzhxng changed the base branch from main to export-llm-docs on July 14, 2025 at 18:36
@jackzhxng marked this pull request as ready for review on July 14, 2025 at 18:36
```diff
@@ -1,871 +1,22 @@
-# Intro to LLMs in Executorch
+# Exporting LLMs to Executorch
```
A contributor commented:

Why is this page called "Exporting"? Shouldn't this be higher-level than just export? Presumably you'll talk about runtime, etc.

A contributor replied:

How about just "Deploying LLM using ExecuTorch"?

@jackzhxng force-pushed the jz/export-llm-docs branch 3 times, most recently from 863c3b1 to 8753f2b on July 14, 2025 at 21:02
@jackzhxng force-pushed the jz/export-llm-docs branch from 8753f2b to 6af121b on July 14, 2025 at 23:42
Labels

CLA Signed (This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed.) · release notes: none (Do not include this in the release notes)

3 participants