semantic-tsdf
`kitti.pkl` isn't uploaded properly
`kitti.pkl` is 0 bytes. It would be great if you could re-upload the weights. Thanks!
@ThisIsIsaac hello~, have you solved the problem? I have the same issue:
```
Traceback (most recent call last):
  File "/home/PycharmProjects/semantic-tsdf/data_parser.py", line 153, in <module>
    main(basedir, date, drive, image_h, image_w, image_ratio)
  File "/home/PycharmProjects/semantic-tsdf/data_parser.py", line 62, in main
    pre_weight = torch.load('network/checkpoints/kitti.pkl')
  File "/home/PycharmProjects/end_to_end_36/3d_venv/lib/python3.6/site-packages/torch/serialization.py", line 595, in load
    return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
  File "/home/PycharmProjects/end_to_end_36/3d_venv/lib/python3.6/site-packages/torch/serialization.py", line 764, in _legacy_load
    magic_number = pickle_module.load(f, **pickle_load_args)
EOFError: Ran out of input
```
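For what it's worth, this `EOFError: Ran out of input` is exactly what pickle raises when handed an empty file, so the 0-byte `kitti.pkl` fully explains the crash. Below is a minimal sketch that reproduces the error and adds a size check for a clearer message; it uses plain `pickle.load` instead of `torch.load` to stay stdlib-only, and the helper name `load_checkpoint` is just illustrative, not part of the repo:

```python
import os
import pickle
import tempfile

def load_checkpoint(path):
    """Hypothetical guard: fail with a clear message if the checkpoint
    file is empty, instead of a bare EOFError from deep inside pickle."""
    size = os.path.getsize(path)
    if size == 0:
        raise RuntimeError(
            f"checkpoint {path!r} is 0 bytes; the upload is broken, re-download it"
        )
    with open(path, "rb") as f:
        return pickle.load(f)

# Create a 0-byte stand-in for the broken kitti.pkl upload.
tmp = os.path.join(tempfile.mkdtemp(), "kitti.pkl")
open(tmp, "wb").close()  # empty file, like the bad download

# A bare load reproduces the traceback's failure mode.
try:
    with open(tmp, "rb") as f:
        pickle.load(f)
except EOFError as exc:
    print("bare load:", exc)  # prints "bare load: Ran out of input"

# The guarded load fails with an actionable message instead.
try:
    load_checkpoint(tmp)
except RuntimeError as exc:
    print("guarded load:", exc)
```

Until the weights are re-uploaded, a check like this at least makes it obvious the file itself is the problem rather than the code.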