Pull c10 headers directly from PyTorch internally, not from the c10 mirror #11835

Merged
swolchok merged 1 commit into main from gh/swolchok/463/orig on Jun 23, 2025

Conversation

pytorchbot
Collaborator

This PR was created by the merge bot to help merge the original PR into the main branch.
ghstack PR number: #11800 by @swolchok
^ Please use this as the source of truth for the PR details, comments, and reviews
ghstack PR base: https://github.com/pytorch/executorch/tree/gh/swolchok/463/base
ghstack PR head: https://github.com/pytorch/executorch/tree/gh/swolchok/463/head
Merge bot PR base: https://github.com/pytorch/executorch/tree/main
Merge bot PR head: https://github.com/pytorch/executorch/tree/gh/swolchok/463/orig
@diff-train-skip-merge

Pull c10 headers directly from PyTorch internally, not from the c10 mirror

Previously, both internal and external builds got ExecuTorch's pinned version of PyTorch. The external build is unchanged, but internal builds now get the usual monorepo behavior and pull c10 headers directly from PyTorch rather than from the c10 mirror. PyTorch guarantees backward compatibility (i.e., it does not break the builds of existing PyTorch users), so ExecuTorch code that works with the pinned PyTorch should automatically work with internal PyTorch as well. (BC breaks will be difficult, but that is inherent to BC breaks.)
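
To make the compatibility claim concrete, here is a minimal, hypothetical sketch (not code from this PR) of the kind of ExecuTorch C++ code this affects: it includes a c10 header (`c10/util/irange.h`) and uses `c10::irange`. The include path is identical whether the header resolves to the pinned PyTorch release, the c10 mirror, or PyTorch's monorepo copy; only the build-system routing changes, so source like this should compile either way.

```cpp
// Minimal sketch (assumed example, not from this PR): ExecuTorch-style code
// that consumes a c10 header. Which copy of c10 satisfies the #include is
// decided by the build system -- previously a pinned/mirrored copy for
// internal builds, now PyTorch's own headers. Requires PyTorch's c10 headers
// on the include path.
#include <c10/util/irange.h>

#include <cstdio>

int main() {
  int sum = 0;
  // c10::irange(n) yields the integers [0, n), a common c10 utility.
  for (const auto i : c10::irange(10)) {
    sum += i;
  }
  std::printf("sum = %d\n", sum);  // prints: sum = 45
  return 0;
}
```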

Differential Revision: [D76832520](https://our.internmc.facebook.com/intern/diff/D76832520/)

ghstack-source-id: 291387183
Pull Request resolved: #11800

pytorch-bot bot commented Jun 20, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/11835

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 New Failure, 1 Unrelated Failure

As of commit 0a155c8 with merge base 5136175:

NEW FAILURE - The following job has failed:

BROKEN TRUNK - The following job failed but was already present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Jun 20, 2025
@swolchok swolchok added the release notes: none Do not include this in the release notes label Jun 23, 2025
Contributor

@swolchok swolchok left a comment

landing bot PR

@swolchok
Contributor

emulator job is known flaky, merging

@swolchok swolchok merged commit 925ef20 into main Jun 23, 2025
96 of 99 checks passed
@swolchok swolchok deleted the gh/swolchok/463/orig branch June 23, 2025 18:28
hinriksnaer pushed a commit to hinriksnaer/executorch that referenced this pull request Jun 26, 2025
Pull c10 headers directly from PyTorch internally, not from the c10 mirror (pytorch#11835)

Co-authored-by: Scott Wolchok <[email protected]>