
PyTorch 2.6.0 Compatibility Fix for Kaolin

Open ansorre opened this issue 9 months ago • 5 comments

Issue Summary

Kaolin currently encounters compilation errors with PyTorch 2.6.0 due to API changes in PyTorch. However, with minimal modifications to just three files, Kaolin can be successfully built and used with PyTorch 2.6.0.

Error Details

When attempting to build Kaolin with PyTorch 2.6.0, several CUDA compilation errors occur with the message:

error: no suitable conversion function from "const at::DeprecatedTypeProperties" to "c10::ScalarType" exists

This happens because PyTorch 2.6.0 no longer accepts the long-deprecated .type() accessor, which returns at::DeprecatedTypeProperties, in the AT_DISPATCH_FLOATING_TYPES_AND_HALF macro; the macro now requires a c10::ScalarType, which is what .scalar_type() returns.
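
For illustration, here is a minimal sketch of the two accessors and their return types, which is exactly the mismatch the compiler reports (the function name show_accessors is hypothetical, not part of Kaolin or PyTorch):

#include <ATen/ATen.h>

void show_accessors(const at::Tensor& t) {
    // Deprecated accessor: returns at::DeprecatedTypeProperties, which the
    // PyTorch 2.6.0 dispatch macros no longer convert to c10::ScalarType.
    const at::DeprecatedTypeProperties& props = t.type();
    // Preferred accessor: returns the c10::ScalarType the macros expect.
    c10::ScalarType dtype = t.scalar_type();
    (void)props;
    (void)dtype;
}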

Solution

The fix is straightforward: replace every occurrence of .type() with .scalar_type() in the following three files:

  1. kaolin/csrc/ops/spc/query_cuda.cu
  2. kaolin/csrc/ops/spc/point_utils_cuda.cu
  3. kaolin/csrc/render/spc/raytrace_cuda.cu

For example, in query_cuda.cu, lines like:

AT_DISPATCH_FLOATING_TYPES_AND_HALF(query_coords.type(), "query_cuda", ([&] {
    // Implementation
}));

Should be changed to:

AT_DISPATCH_FLOATING_TYPES_AND_HALF(query_coords.scalar_type(), "query_cuda", ([&] {
    // Implementation
}));
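
For context, the snippet below is a minimal, self-contained sketch of the dispatch pattern these files rely on, using .scalar_type() as PyTorch 2.6.0 requires. The names scale_cuda and scale_kernel are hypothetical and only illustrate the pattern; this is not Kaolin's actual kernel code.

#include <ATen/ATen.h>
#include <ATen/Dispatch.h>
#include <cuda_runtime.h>

namespace {

// Trivial element-wise kernel used only to demonstrate the dispatch pattern.
template <typename scalar_t>
__global__ void scale_kernel(const scalar_t* in, scalar_t* out, float factor, int64_t n) {
    int64_t idx = blockIdx.x * blockDim.x + threadIdx.x;
    if (idx < n) {
        out[idx] = in[idx] * static_cast<scalar_t>(factor);
    }
}

}  // namespace

at::Tensor scale_cuda(const at::Tensor& input, float factor) {
    at::Tensor output = at::empty_like(input);
    const int64_t n = input.numel();
    const int threads = 256;
    const int blocks = static_cast<int>((n + threads - 1) / threads);

    // .scalar_type() yields the c10::ScalarType the macro expects;
    // passing .type() here fails to compile on PyTorch 2.6.0.
    AT_DISPATCH_FLOATING_TYPES_AND_HALF(input.scalar_type(), "scale_cuda", ([&] {
        scale_kernel<scalar_t><<<blocks, threads>>>(
            input.data_ptr<scalar_t>(), output.data_ptr<scalar_t>(), factor, n);
    }));
    return output;
}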

Testing

After applying these changes, Kaolin successfully compiles and installs with PyTorch 2.6.0 on Windows, and all Kaolin nodes in ComfyUI load without errors.

Recommendation

It would be beneficial to update the Kaolin codebase to use .scalar_type() instead of .type() throughout, since .scalar_type() is the accessor PyTorch recommends going forward and this keeps the code compatible with future PyTorch releases.

ansorre avatar Mar 08 '25 16:03 ansorre

Hi @ansorre, we will definitely support 2.6.0 in the next release!

Caenorst avatar Mar 21 '25 16:03 Caenorst

@Caenorst hey, what about some love for the RTX 5000 series?

We need Torch 2.7 and CUDA 12.8: https://github.com/NVIDIAGameWorks/kaolin/issues/871

FurkanGozukara avatar Mar 21 '25 23:03 FurkanGozukara

Yes, in general we always try to support the latest pytorch at release time. Expect the latest pytorch to be supported in the next release, which should come soon. If you can help us fix any build issues with pytorch 2.7 and cuda 12.8, we will gladly welcome a PR! :)

Caenorst avatar Mar 22 '25 00:03 Caenorst

Yes, in general we always try to support the latest pytorch at release time. Expect the latest pytorch to be supported in the next release, which should come soon. If you can help us fix any build issues with pytorch 2.7 and cuda 12.8, we will gladly welcome a PR! :)

Like I have any idea how to make it work :D Currently there is the TRELLIS 3D AI app from Microsoft, and because it depends on kaolin, the RTX 5000 series can't use it:

https://github.com/microsoft/TRELLIS

FurkanGozukara avatar Mar 22 '25 00:03 FurkanGozukara

Is pytorch 2.6.0 supported now?

marcusaureliusfun avatar May 20 '25 19:05 marcusaureliusfun

Stale issue, please reopen if still relevant

github-actions[bot] avatar Aug 25 '25 21:08 github-actions[bot]

Hey everyone, kaolin v0.18.0 supports pytorch 2.6.0 now!

Caenorst avatar Aug 27 '25 01:08 Caenorst

Hey everyone, kaolin v0.18.0 supports pytorch 2.6.0 now!

You know torch 2.7 is mandatory for Blackwell GPUs, which have been around since January.

FurkanGozukara avatar Aug 27 '25 09:08 FurkanGozukara

We support all the way to 2.7.1 😃 I'm just answering the initial issue.

Caenorst avatar Aug 27 '25 14:08 Caenorst