pytorch_cluster

Still there: Incompatibility with bfloat16

Open borisfom opened this issue 1 year ago • 4 comments

An issue about the lack of bfloat16 support was filed previously, and it looks like the problem is still there. I am getting the same error, RuntimeError: "_" not implemented for 'BFloat16', when running the code below:

from torch_cluster import radius, radius_graph
import torch
from torch import tensor
x = tensor([[  4.0625, -27.3750,  -4.3438],
            [  3.0312, -27.6250,  -3.4844],
            [  5.0312, -28.5000,  -4.2812],
            [ -8.1875, -17.7500,  -2.9062],
            [ -8.1875, -19.0000,  -2.2812],
            [ -8.0625, -20.2500,  -2.9688]], device='cuda:0', dtype=torch.bfloat16)

radius_inp = (
    x,
    5.0,
    tensor([0, 0, 0, 0, 0, 0], device='cuda:0'),
    10
)

radius_edges = radius_graph(*radius_inp)

borisfom · Jan 25 '24

I see the same issue with the radius() call. This is with the package built from trunk, on an A6000 box.

borisfom · Jan 25 '24

Currently, bfloat16 support only exists on CPU :(

rusty1s · Jan 28 '24
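
For reference, the same neighbor search does go through in bfloat16 when the tensors live on the CPU, matching the comment above. A minimal sketch, reusing the coordinates from the reproduction, with keyword arguments spelled out for clarity:

import torch
from torch import tensor
from torch_cluster import radius_graph

# Same coordinates as the reproduction, but left on the CPU, where the
# bfloat16 kernel specialization exists.
x_cpu = tensor([[ 4.0625, -27.3750, -4.3438],
                [ 3.0312, -27.6250, -3.4844],
                [ 5.0312, -28.5000, -4.2812],
                [-8.1875, -17.7500, -2.9062],
                [-8.1875, -19.0000, -2.2812],
                [-8.0625, -20.2500, -2.9688]], dtype=torch.bfloat16)
batch_cpu = torch.zeros(6, dtype=torch.long)

# Completes without the RuntimeError raised by the CUDA path.
edge_index = radius_graph(x_cpu, r=5.0, batch=batch_cpu, max_num_neighbors=10)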

Any plans to include bfloat16 support on GPU soon?

borisfom · Jan 30 '24

Currently no, since this repo is no longer in active development.

rusty1s · Jan 31 '24
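
In the meantime, one workaround is to upcast the coordinates to float32 just for the neighbor search on the GPU; the returned edge_index is integer-typed, so nothing needs to be cast back and the rest of the model can stay in bfloat16. A minimal sketch under that assumption, with random coordinates standing in for real data:

import torch
from torch_cluster import radius_graph

# bfloat16 coordinates on the GPU, standing in for the model's activations
x = torch.randn(6, 3, device='cuda:0', dtype=torch.bfloat16)
batch = torch.zeros(6, dtype=torch.long, device='cuda:0')

# Upcast only for the kernel call; the output is an integer edge_index, so
# downstream bfloat16 computation is unaffected.
edge_index = radius_graph(x.float(), r=5.0, batch=batch, max_num_neighbors=10)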

This issue has had no activity for 6 months. It will be closed in 2 weeks unless there is new activity. Is this issue already resolved?

github-actions[bot] · Jul 30 '24