PyTorch 2.6.0 Compatibility Fix for Kaolin
Issue Summary
Kaolin currently encounters compilation errors with PyTorch 2.6.0 due to API changes in PyTorch. However, with minimal modifications to just three files, Kaolin can be successfully built and used with PyTorch 2.6.0.
Error Details
When attempting to build Kaolin with PyTorch 2.6.0, several CUDA compilation errors occur with the message:
error: no suitable conversion function from "const at::DeprecatedTypeProperties" to "c10::ScalarType" exists
This happens because the deprecated .type() accessor returns at::DeprecatedTypeProperties, which PyTorch 2.6.0 no longer converts implicitly to the c10::ScalarType expected by the AT_DISPATCH_FLOATING_TYPES_AND_HALF macro.
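For context, the difference between the two accessors can be seen in a minimal standalone helper (a hypothetical illustration, not Kaolin code):
#include <ATen/ATen.h>

// Hypothetical helper: obtain the dispatch key for a tensor.
// .type() returns at::DeprecatedTypeProperties, which PyTorch 2.6.0 no longer
// converts implicitly to c10::ScalarType; .scalar_type() returns the
// c10::ScalarType that the AT_DISPATCH_* macros expect.
c10::ScalarType dispatch_key(const at::Tensor& t) {
  // const at::DeprecatedTypeProperties& props = t.type();  // deprecated; no longer accepted by the dispatch macros
  return t.scalar_type();
}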
Solution
The fix is straightforward and involves modifying just three files by replacing all occurrences of .type() with .scalar_type():
kaolin/csrc/ops/spc/query_cuda.cu
kaolin/csrc/ops/spc/point_utils_cuda.cu
kaolin/csrc/render/spc/raytrace_cuda.cu
For example, in query_cuda.cu, lines like:
AT_DISPATCH_FLOATING_TYPES_AND_HALF(query_coords.type(), "query_cuda", ([&] {
// Implementation
}));
Should be changed to:
AT_DISPATCH_FLOATING_TYPES_AND_HALF(query_coords.scalar_type(), "query_cuda", ([&] {
// Implementation
}));
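For a self-contained view of the pattern, here is a rough sketch of a toy CPU kernel written against the fixed dispatch call; the function name and behavior are made up for illustration and are not part of Kaolin:
#include <torch/extension.h>

// Toy example: scale a contiguous floating-point/half tensor, dispatching on
// .scalar_type() as PyTorch 2.6.0 requires.
torch::Tensor scale_tensor(torch::Tensor input, double factor) {
  auto in = input.contiguous();
  auto out = torch::empty_like(in);
  AT_DISPATCH_FLOATING_TYPES_AND_HALF(in.scalar_type(), "scale_tensor", ([&] {
    const scalar_t* src = in.data_ptr<scalar_t>();
    scalar_t* dst = out.data_ptr<scalar_t>();
    for (int64_t i = 0; i < in.numel(); ++i) {
      dst[i] = src[i] * static_cast<scalar_t>(factor);
    }
  }));
  return out;
}
In the actual Kaolin files the lambda body launches a CUDA kernel instead of a CPU loop; only the .scalar_type() argument is relevant to the 2.6.0 fix.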
Testing
After applying these changes, Kaolin successfully compiles and installs with PyTorch 2.6.0 on Windows, and all Kaolin nodes in ComfyUI load without errors.
Recommendation
It would be beneficial to update the Kaolin codebase to use .scalar_type() instead of .type(), as that is the API PyTorch recommends going forward.
Hi @ansorre, we will definitely support 2.6.0 in the next release!
@Caenorst hey, what about some love for the RTX 5000 series?
We need Torch 2.7 and CUDA 12.8: https://github.com/NVIDIAGameWorks/kaolin/issues/871
Yes, in general we always try to support the latest PyTorch at release time. Expect whatever the latest PyTorch is to be supported in the next release. We should have one soon. If you can help us fix any build issues with PyTorch 2.7 and CUDA 12.8, we will gladly welcome a PR! :)
Like I have any idea how to make it work :D Currently there is the TRELLIS 3D AI app from Microsoft, and because it depends on Kaolin, the RTX 5000 series can't use it.
https://github.com/microsoft/TRELLIS
Is pytorch 2.6.0 supported now?
Stale issue, please reopen if still relevant
Hey everyone, kaolin v0.18.0 supports pytorch 2.6.0 now!
You know torch 2.7 is mandatory for Blackwell GPUs, which have been around since January.
We support all the way to 2.7.1 😃 I'm just answering the initial issue.