About Finetune on Custom Dataset
Hi,
I want to express my gratitude for your exceptional work. I have encountered a few issues. I have an RGB-D paired dataset at hand, but its depth unit is millimeters, or in some cases even micrometers. Is there a recommended way to fine-tune the model on such data? I appreciate your help in advance.
Best regards, Chenlin
Meanwhile, I am currently fine-tuning your model and have run into an mmengine-related issue while executing the official script at ~/training/mono/scripts/train_scripts/train_kitti.sh.
The error message I received is as follows: AttributeError: 'ConfigDict' object has no attribute 'dist_params'
I suspect this issue arises because dist_params is absent from your vit.raft5.large.kitti.py. However, I am unsure where to add this parameter, as I could not find where train_cfg is defined.
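For what it's worth, older mmcv-style training configs usually declare dist_params as a top-level entry of the config file. A minimal sketch of what adding it might look like (the backend value is an assumption; 'nccl' is the usual choice for multi-GPU CUDA training, 'gloo' for CPU):

```python
# Hypothetical addition to vit.raft5.large.kitti.py (mmcv-style Python config).
# The launcher reads cfg.dist_params when initializing distributed training.
dist_params = dict(backend='nccl')
```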
For your reference, the versions of the packages I am using are as follows: mmengine 0.10.4, mmcv 2.2.0, and mmseg 1.3.0. I am unsure whether these versions could be contributing to the issue.
I would greatly appreciate any assistance you could provide in resolving this issue. Thank you very much in advance for your help.
Sincerely, Chenlin
Hi Chenlin, thank you for your interest. If it is only a matter of length units, you can simply convert them to meters and pay a little attention to the variable metric_scale. However, if your images depict something like bacteria or cells at micrometer scale, I suspect it will be more challenging to define the unit of the model outputs.
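The unit conversion itself is straightforward; a minimal sketch, assuming the raw depth maps are stored as millimeter or micrometer arrays (the scale factors are plain SI conversions, and the function name is just for illustration):

```python
import numpy as np

# Convert a raw depth map to meters before feeding it to the training pipeline.
# unit='mm' -> divide by 1e3, unit='um' -> divide by 1e6 (plain SI conversions).
UNIT_TO_METERS = {"m": 1.0, "mm": 1e-3, "um": 1e-6}

def depth_to_meters(depth: np.ndarray, unit: str = "mm") -> np.ndarray:
    if unit not in UNIT_TO_METERS:
        raise ValueError(f"unknown depth unit: {unit}")
    return depth.astype(np.float64) * UNIT_TO_METERS[unit]

# Example: a 2x2 depth map in millimeters (0 marks an invalid pixel).
d_mm = np.array([[1500, 2000], [0, 3250]])
d_m = depth_to_meters(d_mm, "mm")  # → [[1.5, 2.0], [0.0, 3.25]]
```

Invalid-depth pixels (often stored as 0) scale to 0 and stay invalid, so the usual validity masks keep working after conversion.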
This is related to distributed training. It seems that you did not enable it.
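For a non-distributed run, one workaround is to make the launch code tolerate a config that omits dist_params by falling back to a default. A hedged sketch with a plain dict standing in for the loaded config (mmengine's Config exposes a similar .get; the 'nccl' default is an assumption):

```python
# Sketch: read dist_params with a fallback so non-distributed runs still work,
# instead of crashing with AttributeError when the config omits the entry.
cfg = {"model": {}, "optimizer": {}}  # stand-in for the loaded .py config

dist_params = cfg.get("dist_params", {"backend": "nccl"})
```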