Historically, HIP and ROCm versions were interchangeable, but moving forward these versions are allowed to diverge. The ROCm version represents the full ROCm software stack, while HIP is one component of that stack. Issue #166068 was fixed by [switching from using HIP_VERSION to ROCM_VERSION_DEV](https://github.com/pytorch/pytorch/pull/166336). However, this broke the ROCm apex build because the HIP version reported by `hipcc --version` no longer matched `torch.version.hip`. This highlights the need for both versions to be exposed. Bitsandbytes has also been impacted by the change in behavior of `torch.version.hip`: https://github.com/bitsandbytes-foundation/bitsandbytes/issues/1799#issuecomment-3534269635

The solution is to fix `torch.version.hip` so that it uses the hipcc header values with the trailing hash code removed. In addition, a `torch.version.rocm` variable is created to store the ROCm version.

## Technical Details

### Fix torch.version.hip

The HIP_VERSION variable is computed in https://github.com/ROCm/hip/blob/develop/cmake/FindHIP.cmake, which runs `hipcc --version` and extracts the "HIP version" line from its output, e.g.:

```
hipcc --version
HIP version: 7.1.25421-32f9fa6ca5
```

The HIP_VERSION variable may therefore contain a hash code at the end. This trailing hash code is removed so that `torch.version.hip` can be parsed by the packaging library's version parser, e.g.:

```
import torch
from packaging import version
print(version.parse(torch.version.hip))
```

### Add torch.version.rocm

Code changes:
- Add a `rocm` variable to torch/version.py.tpl
- Add code to write the `rocm` variable in tools/generate_torch_version.py
- Write the ROCm version during the installation process (torch/CMakeLists.txt)

## Testing

Tested on a preview of ROCm 7.2. Successfully built PyTorch and apex, and verified the `torch.version.hip` parsing code shown above.

```
>>> import torch
>>> torch.version.hip
'7.1.25421'
>>> torch.version.rocm
'7.2.0'
```

Pull Request resolved: https://github.com/pytorch/pytorch/pull/168097
Approved by: https://github.com/jeffdaily

Co-authored-by: Jeff Daily <jeff.daily@amd.com>
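To make the clean-up in the "Fix torch.version.hip" section above concrete, here is a minimal Python sketch of stripping a trailing build hash from a HIP version string. `strip_hip_hash` is a hypothetical helper for illustration only; it is not the code added by this PR, where the clean-up happens at build time.

```
import re
from packaging import version

def strip_hip_hash(raw: str) -> str:
    """Hypothetical helper: drop a trailing '-<hash>' suffix from a string
    like '7.1.25421-32f9fa6ca5', keeping only the dotted numeric part."""
    return re.sub(r"-[0-9A-Fa-f]+$", "", raw.strip())

raw = "HIP version: 7.1.25421-32f9fa6ca5".split(": ")[1]  # e.g. taken from `hipcc --version` output
clean = strip_hip_hash(raw)                               # '7.1.25421'
print(version.parse(clean))                               # now parses without errors
```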
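The description above does not spell out how the build discovers the ROCm release for `torch.version.rocm`, so the following is only a rough sketch under one assumption: that the version can be read from the `/opt/rocm/.info/version` file found in typical ROCm installations. The helper below is illustrative and is not the code added to tools/generate_torch_version.py.

```
import os
import re
from typing import Optional

def get_rocm_version(rocm_home: str = "/opt/rocm") -> Optional[str]:
    """Illustrative only: read the ROCm release (e.g. '7.2.0') from the
    .info/version file that ROCm installations typically ship.
    Returns None when no ROCm installation is found."""
    info_file = os.path.join(rocm_home, ".info", "version")
    if not os.path.isfile(info_file):
        return None
    with open(info_file) as f:
        raw = f.read().strip()                 # e.g. '7.2.0-12345'
    match = re.match(r"\d+\.\d+\.\d+", raw)
    return match.group(0) if match else None

# The generated version.py would then carry a plain string, e.g.:
print(f"rocm = {get_rocm_version()!r}")        # rocm = '7.2.0' (or None)
```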
The updated torch/version.py.tpl template (used by the bazel build) now carries the new `rocm` field alongside `hip`:

```
__version__ = '{{VERSION}}'
debug = False
cuda = '{{CUDA_VERSION}}'
# TODO: use workspace status to stamp the correct version
git_version = ""
hip = None
rocm = None

# This is a gross monkey-patch hack that depends on the order of imports
# in torch/__init__.py
# TODO: find a more elegant solution to set `USE_GLOBAL_DEPS` for the bazel build
import torch
torch.USE_GLOBAL_DEPS = False
```
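With both fields exposed as plain version strings, downstream projects (such as apex or bitsandbytes mentioned above) can compare them with the packaging library. The check below is a hedged usage example, not taken from either project.

```
import torch
from packaging import version

# Both attributes are None on non-ROCm builds, so guard before parsing.
if torch.version.hip is not None and torch.version.rocm is not None:
    hip_ver = version.parse(torch.version.hip)    # e.g. Version('7.1.25421')
    rocm_ver = version.parse(torch.version.rocm)  # e.g. Version('7.2.0')
    if rocm_ver >= version.parse("7.0"):
        print(f"ROCm {rocm_ver} / HIP {hip_ver} build detected")
else:
    print("Not a ROCm build of PyTorch")
```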