[inductor] enable fx graph cache on torchbench (#128239)

Summary: We've already enabled FX graph caching for the timm and huggingface suites, but we had failures saving cache entries for moco. It looks like https://github.com/pytorch/pytorch/pull/128052 fixed that issue, so we can now enable it for torchbench as well.
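For reference, the flag being defaulted here gates Inductor's on-disk FX graph cache. A minimal sketch of what it affects (the toy function f is illustrative, not part of the benchmark harness):

import torch
import torch._inductor.config

# Mirror the default this patch sets for torchbench runs.
torch._inductor.config.fx_graph_cache = True

@torch.compile
def f(x):
    return x.sin() + x.cos()

f(torch.randn(8))  # first run compiles and saves a cache entry on disk
# A later process (or a rerun after torch._dynamo.reset()) can load the
# cached compiled graph instead of recompiling from scratch.

Because the patch only sets the config when TORCHINDUCTOR_FX_GRAPH_CACHE is absent from os.environ, an explicit environment setting from the user still takes precedence over the benchmark default.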

Pull Request resolved: https://github.com/pytorch/pytorch/pull/128239
Approved by: https://github.com/oulgen
Sam Larsen
2024-06-12 08:03:35 -07:00
committed by PyTorch MergeBot
parent 6206da55ef
commit 55a6b38f52


@@ -25,6 +25,10 @@ from torch._dynamo.utils import clone_inputs
# We are primarily interested in tf32 datatype
torch.backends.cuda.matmul.allow_tf32 = True
# Enable FX graph caching
if "TORCHINDUCTOR_FX_GRAPH_CACHE" not in os.environ:
torch._inductor.config.fx_graph_cache = True
def _reassign_parameters(model):
# torch_geometric models register parameter as tensors due to