mmtracking
An error when testing SORT
Describe the bug
I use the SORT algorithm and run the following script to test on the MOT17 dataset.
./tools/dist_test.sh ./configs/mot/deepsort/sort_faster-rcnn_fpn_4e_mot17-private.py 1 --eval track --out /remote-home/******/Code/mmtracking/outputs/mytestmot17-sort.pkl
The inference itself runs and produces outputs correctly.
However, when evaluating the CLEAR MOT results, calculating the metrics consumes more than 65 GB of memory. I use only one GPU and set workers_per_gpu=2.
It seems that the evaluation code uses too much memory. Can you tell me how much memory is needed to run the test?
Also, I think the evaluation code may need to be improved.
Error traceback:
Evaluate CLEAR MOT results.
Traceback (most recent call last):
File "./tools/test.py", line 225, in <module>
main()
File "./tools/test.py", line 215, in main
metric = dataset.evaluate(outputs, **eval_kwargs)
File "/remote-home/*****/Code/mmtracking/mmtrack/datasets/mot_challenge_dataset.py", line 393, in evaluate
generate_overall=True)
File "/root/anaconda3/envs/open-mmlab/lib/python3.7/site-packages/motmetrics/metrics.py", line 285, in compute_many
for acc, analysis, name in zip(dfs, anas, names)]
File "/root/anaconda3/envs/open-mmlab/lib/python3.7/site-packages/motmetrics/metrics.py", line 285, in <listcomp>
for acc, analysis, name in zip(dfs, anas, names)]
File "/root/anaconda3/envs/open-mmlab/lib/python3.7/site-packages/motmetrics/metrics.py", line 185, in compute
cache[mname] = self._compute(df_map, mname, cache, options, parent='summarize')
File "/root/anaconda3/envs/open-mmlab/lib/python3.7/site-packages/motmetrics/metrics.py", line 310, in _compute
v = cache[depname] = self._compute(df_map, depname, cache, options, parent=name)
File "/root/anaconda3/envs/open-mmlab/lib/python3.7/site-packages/motmetrics/metrics.py", line 310, in _compute
v = cache[depname] = self._compute(df_map, depname, cache, options, parent=name)
File "/root/anaconda3/envs/open-mmlab/lib/python3.7/site-packages/motmetrics/metrics.py", line 315, in _compute
return minfo['fnc'](df_map, *vals, **options)
File "/root/anaconda3/envs/open-mmlab/lib/python3.7/site-packages/motmetrics/metrics.py", line 610, in id_global_assignment
fnmatrix = np.full((no + nh, no + nh), 0.)
File "/root/anaconda3/envs/open-mmlab/lib/python3.7/site-packages/numpy/core/numeric.py", line 343, in full
a = empty(shape, dtype, order)
numpy.core._exceptions.MemoryError: Unable to allocate 1.70 GiB for an array with shape (15106, 15106) and data type float64
Traceback (most recent call last):
File "/root/anaconda3/envs/open-mmlab/lib/python3.7/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/root/anaconda3/envs/open-mmlab/lib/python3.7/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/root/anaconda3/envs/open-mmlab/lib/python3.7/site-packages/torch/distributed/launch.py", line 261, in <module>
main()
File "/root/anaconda3/envs/open-mmlab/lib/python3.7/site-packages/torch/distributed/launch.py", line 257, in main
cmd=cmd)
subprocess.CalledProcessError: Command '['/root/anaconda3/envs/open-mmlab/bin/python', '-u', './tools/test.py', '--local_rank=0', './configs/mot/deepsort/sort_faster-rcnn_fpn_4e_mot17-private.py', '--launcher', 'pytorch', '--eval', 'track', '--out', '/remote-home/chenshaodong/Code/mmtracking/outputs/mytestmot17-sort.pkl']' returned non-zero exit status 1.
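For reference, the 1.70 GiB in the error message is exactly the size of one float64 matrix of the reported shape, and id_global_assignment allocates several matrices of that size. With generate_overall=True, motmetrics appears to compute the global ID assignment over the unique ground-truth and hypothesis IDs of all sequences combined, so memory grows quadratically with the total ID count. A quick sanity check of the numbers from the log:

```python
# Sanity check of the failed allocation in id_global_assignment.
# The shape (15106, 15106) and dtype float64 are taken from the log above.
import numpy as np

n = 15106  # no + nh: unique ground-truth IDs + unique hypothesis IDs
one_matrix = n * n * np.dtype(np.float64).itemsize
print(f"{one_matrix / 2**30:.2f} GiB per matrix")  # -> 1.70 GiB
```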
The large data matrices involved in MOT evaluation inevitably consume a lot of RAM. We will study the details later.
Thanks for your reply! For the MOT task, I suggest using evaluation code like FairMOT's: storing all outputs in a single file is not suitable for MOT. Thanks a lot! https://github.com/ifzhang/FairMOT
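For illustration, a minimal sketch of per-sequence evaluation with motmetrics (the file paths, sequence names, and format strings here are assumptions, not mmtracking's or FairMOT's actual code); evaluating one sequence at a time keeps each ID-assignment matrix sequence-sized:

```python
# Sketch: evaluate each sequence separately instead of one merged computation.
# All paths and sequence names below are hypothetical placeholders.
import motmetrics as mm

sequences = ['MOT17-02', 'MOT17-04']  # hypothetical sequence names
mh = mm.metrics.create()
summaries = []
for seq in sequences:
    gt = mm.io.loadtxt(f'gt/{seq}/gt.txt', fmt='mot15-2D')     # ground truth
    res = mm.io.loadtxt(f'results/{seq}.txt', fmt='mot15-2D')  # tracker output
    acc = mm.utils.compare_to_groundtruth(gt, res, 'iou', distth=0.5)
    summaries.append(mh.compute(acc, metrics=['mota', 'idf1'], name=seq))
print(summaries)
```

Note that per-sequence scores averaged afterwards are not identical to the overall scores that generate_overall=True computes over the merged events.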
Thanks for your suggestions! We'll think about it carefully! Would you like to create a PR to improve it?
Hi S-D chen, we tried to reproduce this problem but did not succeed. Running the test consumes about 4 GB of memory on Ubuntu and 7 GB on Windows. If you still encounter this problem, please leave a comment here.