CornerNet-Lite

What does the UserWarning mean? How to solve it? Does it affect the training result?

Open pingqi opened this issue 5 years ago • 5 comments

When I run "python train.py CornerNet_Squeeze", a lot of UserWarnings appear. What do they mean? How can I solve them? Do they affect the training result?

/pytorch/aten/src/ATen/native/cuda/LegacyDefinitions.cpp:114: UserWarning: torch.gt received 'out' parameter with dtype torch.uint8, this behavior is now deprecated,please use 'out' parameter with dtype torch.bool instead.
/pytorch/aten/src/ATen/native/cuda/LegacyDefinitions.cpp:38: UserWarning: masked_scatter_ received a mask with dtype torch.uint8, this behavior is now deprecated,please use a mask with dtype torch.bool instead.
/pytorch/aten/src/ATen/native/cuda/LegacyDefinitions.cpp:14: UserWarning: masked_fill_ received a mask with dtype torch.uint8, this behavior is now deprecated,please use a mask with dtype torch.bool instead.
(the same three warnings repeat many times during training)

pingqi avatar Sep 10 '19 01:09 pingqi

Same issue. I think it's because this repo was written against torch 1.0.0, when torch.bool and torch::kBool weren't a thing yet.
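For illustration, here is a minimal sketch (not code from this repo) of what the warnings are about, assuming a libtorch/PyTorch version >= 1.2 where the bool dtype exists: ops such as masked_fill_ and masked_scatter_ now expect bool masks, and still accept uint8 (byte) masks only with a deprecation warning.

```cpp
// Sketch only: illustrates why the warnings appear, not code from CornerNet-Lite.
#include <torch/torch.h>

int main() {
    torch::Tensor scores = torch::rand({2, 3});

    // Old style (pre-1.2): masks were uint8/byte tensors.
    // Passing one to masked_fill_ still works, but emits the UserWarning above.
    torch::Tensor byte_mask = torch::zeros({2, 3}, torch::dtype(torch::kByte));
    scores.masked_fill_(byte_mask, 0.0);   // -> "masked_fill_ received a mask with dtype torch.uint8 ..."

    // New style (1.2+): masks are bool tensors; comparison ops such as gt() return kBool.
    torch::Tensor bool_mask = scores.gt(0.5);
    scores.masked_fill_(bool_mask, -1.0);  // no warning with a bool mask

    return 0;
}
```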

tienthegainz avatar Feb 07 '20 10:02 tienthegainz

Have you solved the problem? I met the same issue.

hero-y avatar Mar 11 '20 13:03 hero-y

First, I thought it was the tag_masks = np.zeros((batch_size, max_tag_len), dtype=np.uint8) in core/sample/cornernet.py, so I changed np.uint8 to np.bool, but it still prints the warning.

Second, I think it is because of the at::kByte in auto gt_mask = torch::zeros({batch, channel, width}, at::device(at::kCUDA).dtype(at::kByte)); in core/models/py_utils/_cpools/src/top_pool.cpp (also left_pool, etc.), BUT I don't know how to change at::kByte to bool; at::Bool doesn't exist!

Finally, I prepended 2>/dev/null to the shell command, and it stopped printing the UserWarning information, i.e. 2>/dev/null /home/xxx/anaconda3/envs/torch10_py37/bin/python /home/xxx/project/CornerNet-Lite-master/train.py CornerNet_Squeeze (this only redirects stderr, so the warnings are hidden rather than fixed).

In conclusion, if anyone can fix the at::kByte issue, maybe it will work; otherwise, just add 2>/dev/null to avoid printing the warnings.

Dawn-LX avatar Mar 24 '20 09:03 Dawn-LX

I met the same issue, but I solved it. The issue came from this line: auto gt_mask = torch::zeros({batch, channel, width}, at::device(at::kCUDA).dtype(at::kByte)); You only need to change at::kByte to at::kBool. PS: four files need to be modified. Don't forget to recompile.
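For clarity, a sketch of the edit described above, based on the line quoted in this thread. The wrapper function name is made up for illustration, and the exact set of other pooling sources (presumably bottom_pool.cpp, left_pool.cpp, right_pool.cpp under core/models/py_utils/_cpools/src/) should be checked in the repo:

```cpp
// Sketch of the fix, assuming PyTorch/libtorch >= 1.2 (where at::kBool exists).
// make_gt_mask() is a hypothetical wrapper around the line quoted from top_pool.cpp.
#include <torch/torch.h>

torch::Tensor make_gt_mask(int64_t batch, int64_t channel, int64_t width) {
    // Before (emits the uint8 deprecation warnings on newer PyTorch):
    // auto gt_mask = torch::zeros({batch, channel, width},
    //                             at::device(at::kCUDA).dtype(at::kByte));

    // After: allocate the mask as bool, which masked_fill_ / masked_scatter_ now expect.
    auto gt_mask = torch::zeros({batch, channel, width},
                                at::device(at::kCUDA).dtype(at::kBool));
    return gt_mask;
}
```

After making the same one-word change in each affected pooling file, rebuild the _cpools extension so the edit actually takes effect, as the comment above notes.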

polariseee avatar May 22 '20 17:05 polariseee