Back out "Make PT2 compile backprop through custom op without autograd key a hard error (#166367)" (#168142)

Summary:
Original commit changeset: 7148dc4803f5

Original Phabricator Diff: D86736500

Differential Revision: D87407335

Pull Request resolved: https://github.com/pytorch/pytorch/pull/168142
Approved by: https://github.com/wdvr
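
For context, the reverted change concerned custom operators that have no autograd (backward) registration when their outputs are backpropagated under torch.compile. Below is a minimal sketch of that scenario, assuming the torch.library.custom_op API; the op name "mylib::twice" and the function `f` are hypothetical and do not appear in this patch.

```python
import torch

# Hypothetical custom op with no autograd key / backward formula.
@torch.library.custom_op("mylib::twice", mutates_args=())
def twice(x: torch.Tensor) -> torch.Tensor:
    return x * 2.0

@twice.register_fake  # FakeTensor kernel so torch.compile can trace the op
def _(x: torch.Tensor) -> torch.Tensor:
    return torch.empty_like(x)

@torch.compile
def f(x: torch.Tensor) -> torch.Tensor:
    return twice(x).sum()

x = torch.randn(4, requires_grad=True)
out = f(x)

try:
    # Backprop through `twice`, which has no autograd registration.
    # Under #166367 this path was promoted to a hard error; this backout
    # restores the earlier, more permissive behavior.
    out.backward()
except RuntimeError as e:
    print(f"backward raised: {e}")
```
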
commit 5a607febc0 (parent a5436a5e8e)
Author:    Edward Yang
Date:      2025-11-27 20:20:06 +00:00
Committer: PyTorch MergeBot

11 changed files with 48 additions and 100 deletions

@@ -55,7 +55,6 @@ MUTABLE_OPS_THAT_CANNOT_GET_AN_OUT_VARIANT = [
 # All of these operators don't have any tensor like returns
 FUNCTIONAL_OPS_THAT_CANNOT_GET_AN_OUT_VARIANT = [
-    "_async_error",
     "_assert_async", # no return
     "_assert_async.msg", # no return
     "_assert_tensor_metadata", # no return