mmdetection3d
[Bug] Unable to save prediction results when running test.py
Prerequisite
- [X] I have searched Issues and Discussions but cannot get the expected help.
- [X] I have read the FAQ documentation but cannot get the expected help.
- [X] The bug has not been fixed in the latest version (dev-1.x) or the latest version (dev-1.0).
Task
I have modified the scripts/configs, or I'm working on my own tasks/models/datasets.
Branch
main branch https://github.com/open-mmlab/mmdetection3d
Environment
sys.platform: linux
Python: 3.8.16 (default, Jan 17 2023, 23:13:24) [GCC 11.2.0]
CUDA available: True
numpy_random_seed: 2147483648
GPU 0,1: NVIDIA RTX A6000
CUDA_HOME: /home/apurvabadithela/miniconda3/envs/detection
NVCC: Cuda compilation tools, release 11.7, V11.7.99
GCC: gcc (Ubuntu 10.5.0-1ubuntu1~22.04) 10.5.0
PyTorch: 1.13.1
PyTorch compiling details: PyTorch built with:
- GCC 9.3
- C++ Version: 201402
- Intel(R) oneAPI Math Kernel Library Version 2021.4-Product Build 20210904 for Intel(R) 64 architecture applications
- Intel(R) MKL-DNN v2.6.0 (Git Hash 52b5f107dd9cf10910aaa19cb47f3abf9b349815)
- OpenMP 201511 (a.k.a. OpenMP 4.5)
- LAPACK is enabled (usually provided by MKL)
- NNPACK is enabled
- CPU capability usage: AVX2
- CUDA Runtime 11.7
- NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_61,code=sm_61;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86;-gencode;arch=compute_37,code=compute_37
- CuDNN 8.5
- Magma 2.6.1
- Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.7, CUDNN_VERSION=8.5.0, CXX_COMPILER=/opt/rh/devtoolset-9/root/usr/bin/c++, CXX_FLAGS= -fabi-version=11 -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -DEDGE_PROFILER_USE_KINETO -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Werror=non-virtual-dtor -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wunused-local-typedefs -Wno-unused-parameter -Wno-unused-function -Wno-unused-result -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Werror=cast-function-type -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.13.1, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF,
TorchVision: 0.14.1
OpenCV: 4.7.0
MMEngine: 0.9.1
MMDetection: 3.2.0
MMDetection3D: 1.4.0+fe25f7a
spconv2.0: True
Reproduces the problem - code sample
I want to save the prediction results produced by the BEVFusion project (mmdet3d/projects/BEVFusion). The documentation says to add a pklfile_prefix key to the test_evaluator, which I do in the config file by appending the following line:
test_evaluator.update({'pklfile_prefix':'/home/apurvabadithela/nuscenes_dataset/inference_results/bevfusion_model/results.pkl'})
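As I understand it (a simplified sketch, not mmengine's actual implementation), every key left in the evaluator config after popping 'type' is forwarded verbatim to the metric class's __init__ by the registry, so an unrecognized key like 'pklfile_prefix' fails before the test loop even starts. DummyMetric and the registry dict below are stand-ins invented for illustration:

```python
class DummyMetric:
    """Stand-in for a metric whose __init__ does not accept 'pklfile_prefix'."""

    def __init__(self, ann_file=None, metric='bbox'):
        self.ann_file = ann_file
        self.metric = metric


def build_from_cfg(cfg, registry):
    """Sketch of registry building: pop 'type', forward the rest as kwargs."""
    cfg = dict(cfg)
    obj_cls = registry[cfg.pop('type')]
    return obj_cls(**cfg)  # unknown keys surface here as TypeError


registry = {'DummyMetric': DummyMetric}

# Works: only keyword arguments the metric knows about.
m = build_from_cfg({'type': 'DummyMetric', 'ann_file': 'val.pkl'}, registry)

# Fails the same way as the traceback below: the extra key reaches __init__.
try:
    build_from_cfg(
        {'type': 'DummyMetric',
         'ann_file': 'val.pkl',
         'pklfile_prefix': '/tmp/results.pkl'},
        registry)
except TypeError as e:
    print(e)  # message names the offending keyword argument
```

This suggests the metric class built for this config simply does not declare pklfile_prefix as a constructor argument.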
Reproduces the problem - command or script
Then, I run the following from command line:
python tools/test.py projects/BEVFusion/configs/bevfusion_lidar-cam_voxel0075_second_secfpn_8xb4-cyclic-20e_nus-3d.py checkpoints/bevfusion_converted.pth --task 'multi-modality_det'
Reproduces the problem - error message
And I get the following error message.
04/22 10:32:27 - mmengine - INFO - ------------------------------
04/22 10:32:27 - mmengine - INFO - The length of test dataset: 6019
04/22 10:32:27 - mmengine - INFO - The number of instances per category in the dataset:
+----------------------+--------+
| category | number |
+----------------------+--------+
| car | 80004 |
| truck | 15704 |
| construction_vehicle | 2678 |
| bus | 3158 |
| trailer | 4159 |
| barrier | 26992 |
| motorcycle | 2508 |
| bicycle | 2381 |
| pedestrian | 34347 |
| traffic_cone | 15597 |
+----------------------+--------+
Traceback (most recent call last):
File "tools/test.py", line 149, in <module>
main()
File "tools/test.py", line 145, in main
runner.test()
File "/home/apurvabadithela/miniconda3/envs/detection/lib/python3.8/site-packages/mmengine/runner/runner.py", line 1816, in test
self._test_loop = self.build_test_loop(self._test_loop) # type: ignore
File "/home/apurvabadithela/miniconda3/envs/detection/lib/python3.8/site-packages/mmengine/runner/runner.py", line 1611, in build_test_loop
loop = TestLoop(
File "/home/apurvabadithela/miniconda3/envs/detection/lib/python3.8/site-packages/mmengine/runner/loops.py", line 413, in __init__
self.evaluator = runner.build_evaluator(evaluator) # type: ignore
File "/home/apurvabadithela/miniconda3/envs/detection/lib/python3.8/site-packages/mmengine/runner/runner.py", line 1318, in build_evaluator
return Evaluator(evaluator) # type: ignore
File "/home/apurvabadithela/miniconda3/envs/detection/lib/python3.8/site-packages/mmengine/evaluator/evaluator.py", line 25, in __init__
self.metrics.append(METRICS.build(metric))
File "/home/apurvabadithela/miniconda3/envs/detection/lib/python3.8/site-packages/mmengine/registry/registry.py", line 570, in build
return self.build_func(cfg, *args, **kwargs, registry=self)
File "/home/apurvabadithela/miniconda3/envs/detection/lib/python3.8/site-packages/mmengine/registry/build_functions.py", line 121, in build_from_cfg
obj = obj_cls(**args) # type: ignore
TypeError: __init__() got an unexpected keyword argument 'pklfile_prefix'
Additional information
- I would like to store the prediction results (of the validation set) somewhere other than the /tmp folder. What is the best way to do this?
- I have tried setting --cfg-options, but the syntax was not clear and it kept erroring out.
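For context on the --cfg-options attempts: my understanding is that mmengine parses each KEY=VALUE pair and writes it into the config at the dotted key path. The sketch below mimics that behavior with a hypothetical set_nested helper (not part of mmengine) to show how a dotted key would land in the nested config:

```python
def set_nested(cfg, dotted_key, value):
    """Hypothetical helper: write `value` into `cfg` at a dotted key path,
    creating intermediate dicts as needed (mimics --cfg-options KEY=VALUE)."""
    keys = dotted_key.split('.')
    node = cfg
    for k in keys[:-1]:
        node = node.setdefault(k, {})
    node[keys[-1]] = value


# Toy config standing in for the BEVFusion test config.
cfg = {'test_evaluator': {'type': 'NuScenesMetric'}}

set_nested(cfg, 'test_evaluator.pklfile_prefix',
           '/home/apurvabadithela/nuscenes_dataset/inference_results/'
           'bevfusion_model/results.pkl')

print(cfg['test_evaluator'])
```

Even with the syntax right, this would presumably hit the same TypeError as editing the config file, since the key still reaches the metric's __init__.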