mim.train: number of GPUs is not updated in logged config
When calling mim.train with a specific number of GPUs, training runs on that many GPUs, but the logged config is not updated accordingly.
```python
from mim import train

work_dir = "/xxx"
path_config = "/yyy/my_config.py"

# extra CLI-style arguments forwarded to the underlying training script
random_args = (
    "--work-dir",
    work_dir,
)

# launch a 2-GPU distributed training run via mim
train(
    package="mmtrack",
    config=path_config,
    gpus=2,
    launcher="pytorch",
    other_args=random_args,
)
```
Output:

```text
...
work_dir = '/xxx'
gpu_ids = [0]
```
However, judging by the batch size and GPU utilization, both GPUs are in fact used.
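For reference, this is a minimal way to confirm that during a run (a sketch, not part of mim; it assumes nvidia-smi is available on the machine):

```python
# Hypothetical check, independent of mim: query per-GPU utilization and
# memory while the training job is running to confirm both GPUs are busy.
import subprocess

result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=index,utilization.gpu,memory.used",
     "--format=csv"],
    capture_output=True,
    text=True,
)
print(result.stdout)
```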
Note: the README also seems to be wrong about how to use mim.train; I had to use the package argument and put the other_args in a list.