Fix dynamic shape missing Symbol errors #16390
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/16390
Note: Links to docs will display an error until the docs builds have been completed.
❗ 1 Active SEV. If your PR is affected, please view it below.
❌ 1 New Failure, 8 Pending, 1 Unrelated Failure as of commit c6440a7 with merge base 31ec75e.
UNSTABLE: one job is marked as unstable, possibly due to flakiness on trunk.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Summary: More context in https://fb.workplace.com/groups/pytorch.edge.users/permalink/1941980700005547/
Differential Revision: D89754133
Force-pushed from 9a4cb8e to dbda7d1.
@pytorchbot label "topic: not user facing"
Summary:
Handle an edge case in the range constraints with mixed Symbol + int keys, where the replacements dict does not contain the composite key.
E.g. {s11: VR[2, 32], s11 + 2048: VR[2050, 2080]}
The replacements only have the key "s11" because "s11 + 2048" is merged into s11. This is needed for the StaticAttention class, where the attention mask appends the dynamic seq_len as the last dimension.
Dynamic shapes and mixed symbols are required for CoreML enumerated shapes, as a precursor to ET-enabled CoreML enumerated shapes.
Reviewed By: kimishpatel
Differential Revision: D89754133
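For illustration, a minimal sketch of the edge case (using plain sympy; the `remap_key` helper is hypothetical and not the actual ExecuTorch change): a composite key such as s11 + 2048 appears in the range constraints, while the replacements dict only carries the base symbol, so a lookup needs a substitution fallback instead of failing with a missing-Symbol error.

```python
# Sketch only: mirrors the example in the summary, not the real fix.
import sympy

s11 = sympy.Symbol("s11", integer=True, positive=True)

# Range constraints with a base symbol and a composite Symbol + int key
# (value ranges shown as plain tuples here instead of VR objects).
range_constraints = {s11: (2, 32), s11 + 2048: (2050, 2080)}

# The replacements only carry the base symbol; no entry for s11 + 2048.
replacements = {s11: s11}

def remap_key(expr):
    """Hypothetical helper: if the exact key is missing from replacements,
    substitute the replacements of its free symbols into the expression
    instead of raising a KeyError."""
    if expr in replacements:
        return replacements[expr]
    return expr.subs(replacements)

for key, vr in range_constraints.items():
    print(remap_key(key), vr)
# s11 (2, 32)
# s11 + 2048 (2050, 2080)
```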
Force-pushed from dbda7d1 to b847301.
Force-pushed from b847301 to c6440a7.