
hashluts.params: copying a param with shape torch.Size([73392]) from checkpoint, the shape in current model is torch.Size([73648]).

riversky2025 opened this issue · 1 comment

When I ran the test command:

```
python test.py --model HashLUT 6+13 SmallBackbone --epoch 367
```

an error occurred. The error message is as follows:

```
6 tables of range:9-64, T:13, b:1.480 compensate: True
SmallBackbone backbone
Traceback (most recent call last):
  File "test.py", line 60, in <module>
    model.load_state_dict(ckpt, strict=True)
  File "site-packages/torch/nn/modules/module.py", line 1224, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for HashLUT:
	size mismatch for hashluts.params: copying a param with shape torch.Size([73392]) from checkpoint, the shape in current model is torch.Size([73648]).
```
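One way to narrow down a mismatch like this before loading is to diff the checkpoint's tensor shapes against the freshly built model's. A minimal diagnostic sketch, assuming `model` and the checkpoint are constructed/loaded the same way as in test.py (the checkpoint path here is a hypothetical placeholder):

```python
import torch

# Hypothetical checkpoint path; substitute whatever test.py loads for --epoch 367.
ckpt = torch.load("ckpt/HashLUT_367.pth", map_location="cpu")
model_sd = model.state_dict()  # `model` built exactly as in test.py

# Report every tensor that is missing or whose shape disagrees,
# instead of aborting on the first strict-load failure.
for name, tensor in ckpt.items():
    if name not in model_sd:
        print(f"in checkpoint only: {name} {tuple(tensor.shape)}")
    elif model_sd[name].shape != tensor.shape:
        print(f"shape mismatch: {name} "
              f"checkpoint={tuple(tensor.shape)} model={tuple(model_sd[name].shape)}")
```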

riversky2025 avatar Apr 06 '24 02:04 riversky2025

Hi, thanks for your interest. Regarding your issue: the `hashluts` part of our `HashLUT` model in `models.py` is implemented as a `tcnn.NetworkWithInputEncoding`, and with the `6+13` setting and the other settings defined in `models.py` it indeed has 73,392 parameters. I would suggest inspecting those settings and your `tcnn` version. You may have changed some settings, or you may be using a newer version of `tcnn` whose architecture differs slightly, e.g. biases in the MLP (73648 − 73392 = 256, which could be a clue). Hope this helps.
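To make that comparison concrete, you can rebuild the hash-table stack standalone and print its parameter count under each `tcnn` version. A sketch under stated assumptions: the config values below are illustrative placeholders loosely inferred from the log line (`6 tables of range:9-64, T:13, b:1.480`), not necessarily the exact settings in `models.py`, and the input/output dimensions and MLP shape are guesses.

```python
import tinycudann as tcnn

# Placeholder configuration; adjust to match models.py before comparing.
net = tcnn.NetworkWithInputEncoding(
    n_input_dims=2,               # assumption, not from the repo
    n_output_dims=3,              # assumption, not from the repo
    encoding_config={
        "otype": "HashGrid",
        "n_levels": 6,            # the '6' in '6+13'
        "n_features_per_level": 2,
        "log2_hashmap_size": 13,  # the '13' in '6+13' (T:13)
        "base_resolution": 9,     # 'range:9-64'
        "per_level_scale": 1.480, # 'b:1.480'
    },
    network_config={
        "otype": "FullyFusedMLP",
        "activation": "ReLU",
        "output_activation": "None",
        "n_neurons": 64,
        "n_hidden_layers": 2,
    },
)

# Run this under both tcnn versions: if one prints 73392 and the other
# 73648, the architecture change (e.g. MLP biases) is the culprit.
print(net.params.numel())
```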

Xian-Bei avatar Apr 15 '24 11:04 Xian-Bei

> Hi, thanks for your interest. Regarding your issue: the `hashluts` part of our `HashLUT` model in `models.py` is implemented as a `tcnn.NetworkWithInputEncoding` [...]

No help at all

yzl1014 avatar Sep 03 '24 03:09 yzl1014