
[Question] High Memory Consumption When Loading Annotations in mmpose for RTMW Model Training

Open · lelexx opened this issue 1 year ago · 1 comment

Prerequisite

  • [X] I have searched Issues and Discussions but cannot get the expected help.
  • [X] The bug has not been fixed in the latest version (https://github.com/open-mmlab/mmpose).

Environment

none

Reproduces the problem - code sample

none

Reproduces the problem - command or script

none

Reproduces the problem - error message

I'm using mmpose to train the RTMW model on a server with 256 GB of RAM, with all of the datasets listed in the official configuration file. I've noticed that memory is almost fully consumed while the dataset annotations are being loaded. The annotations consist of coordinate points, and the annotation files themselves are relatively small. Why does loading these annotations into memory consume so much RAM?
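(For context: parsing JSON in Python inflates the data well beyond its on-disk size, since every coordinate becomes a heap-allocated float object inside per-instance lists and dicts. A rough sketch of the effect, using synthetic annotations with 133 keypoints per instance as in RTMW's whole-body layout — the numbers are illustrative, not mmpose's actual loader:)

```python
import json
import tracemalloc

# Synthetic COCO-style annotations: 10k instances, 133 keypoints each
# (x, y, visibility). Purely illustrative data, not a real dataset.
anns = [
    {
        "id": i,
        "image_id": i,
        "bbox": [0.0, 0.0, 100.0, 100.0],
        "keypoints": [0.0] * (133 * 3),
    }
    for i in range(10_000)
]
raw = json.dumps(anns)  # stands in for the annotation file on disk

tracemalloc.start()
parsed = json.loads(raw)  # parse back, as a dataset loader would
peak = tracemalloc.get_traced_memory()[1]
tracemalloc.stop()

print(f"on-disk bytes:   {len(raw):>12,}")
print(f"in-memory bytes: {peak:>12,}")  # typically several times larger
```

On CPython the in-memory representation is usually several times the file size, which compounds quickly across the many datasets in the RTMW config.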

Additional information

No response

lelexx avatar Oct 20 '23 03:10 lelexx

Maybe you can slim the annotations of the UBody dataset. These annotation files contain many unused fields. However, even if you slim the annotation files, the memory usage can still be high, because the UBody dataset has 1b images.
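(A minimal sketch of that slimming step, assuming a COCO-style JSON layout. The field sets kept below are an assumption — check which fields your dataset class actually reads before slimming real files:)

```python
import json

# Fields commonly needed by COCO-style keypoint loaders; everything else
# is dropped. ASSUMPTION: verify this set against your dataset config.
KEEP_IMAGE = {"id", "file_name", "width", "height"}
KEEP_ANN = {"id", "image_id", "category_id", "bbox",
            "keypoints", "num_keypoints", "area", "iscrowd"}


def slim_coco(src_path: str, dst_path: str) -> None:
    """Rewrite a COCO-style annotation file, keeping only needed fields."""
    with open(src_path) as f:
        data = json.load(f)
    data["images"] = [
        {k: v for k, v in img.items() if k in KEEP_IMAGE}
        for img in data.get("images", [])
    ]
    data["annotations"] = [
        {k: v for k, v in ann.items() if k in KEEP_ANN}
        for ann in data.get("annotations", [])
    ]
    with open(dst_path, "w") as f:
        # Compact separators drop the whitespace, shrinking the file further.
        json.dump(data, f, separators=(",", ":"))
```

This shrinks both the file and the parsed in-memory structures, but as noted above it only reduces the per-instance overhead, not the instance count.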

zero0kiriyu avatar Nov 27 '23 09:11 zero0kiriyu