[inductor] enable fx graph cache on torchbench (#128239)
Summary: We've already enabled this for timm and huggingface, but we had failures saving cache entries for moco. It looks like https://github.com/pytorch/pytorch/pull/128052 has fixed that issue, so we can enable it for torchbench.

Pull Request resolved: https://github.com/pytorch/pytorch/pull/128239
Approved by: https://github.com/oulgen
Committed by: PyTorch MergeBot
Parent: 6206da55ef
Commit: 55a6b38f52
@@ -25,6 +25,10 @@ from torch._dynamo.utils import clone_inputs
 # We are primarily interested in tf32 datatype
 torch.backends.cuda.matmul.allow_tf32 = True
 
+# Enable FX graph caching
+if "TORCHINDUCTOR_FX_GRAPH_CACHE" not in os.environ:
+    torch._inductor.config.fx_graph_cache = True
+
 
 def _reassign_parameters(model):
     # torch_geometric models register parameter as tensors due to
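For illustration only (not part of the PR): a minimal sketch of the pattern the diff adds, namely force-enabling the FX graph cache in a script while letting an explicit TORCHINDUCTOR_FX_GRAPH_CACHE environment variable take precedence. The function f and the tensor shape below are made-up examples; the only APIs assumed are torch.compile and the torch._inductor.config.fx_graph_cache flag shown in the diff.

    # Minimal sketch, assuming the fx_graph_cache flag behaves as in the
    # diff above; f and the input shape are arbitrary illustrations.
    import os

    import torch
    import torch._inductor.config

    # Mirror the benchmark harness: enable the FX graph cache unless the
    # user has already chosen a setting via the environment variable.
    if "TORCHINDUCTOR_FX_GRAPH_CACHE" not in os.environ:
        torch._inductor.config.fx_graph_cache = True

    def f(x):
        return torch.sin(x) + torch.cos(x)

    compiled = torch.compile(f)
    print(compiled(torch.randn(8)))

Run a script like this a second time in a fresh process and the compiled graph can be served from the on-disk cache instead of being recompiled, which is what makes enabling the cache across the benchmark suites worthwhile.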