
Regarding the trained models

Open LHM3762 opened this issue 3 years ago • 2 comments

Dear author,

We are really intrigued by your project and have trained some models. The training phase worked well, but the testing phase does not seem to succeed: we encounter a problem in which T_local.shape, pred_f2f_t_b.shape, and pred_f2f_w_b.shape differ, as shown in the attached screenshot. At the same time, we found that the testing phase does not use the trained models. May we ask how we can use the trained models directly? Or were the training and testing phases intentionally designed separately?

Thank you in advance for your attention.
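For context, the mismatch described above is the kind of thing a small consistency check can surface early. A minimal sketch (plain Python, with hypothetical shape tuples standing in for the real tensor shapes; `check_f2f_shapes` is not a DeepLIO function) of the invariant that appears to be violated:

```python
def check_f2f_shapes(t_local_shape, pred_t_shape, pred_w_shape):
    """Verify that the local ground-truth transform and the
    frame-to-frame predictions agree on the leading (batch) dim.
    Shapes are plain tuples standing in for tensor.shape."""
    if t_local_shape[0] != pred_t_shape[0] or t_local_shape[0] != pred_w_shape[0]:
        raise ValueError(
            f"batch mismatch: T_local {t_local_shape}, "
            f"pred_f2f_t_b {pred_t_shape}, pred_f2f_w_b {pred_w_shape}"
        )
    return True

# Hypothetical shapes: a consistent case passes, a mismatched one raises.
check_f2f_shapes((8, 3), (8, 3), (8, 3))
```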

LHM3762 avatar Jun 15 '21 11:06 LHM3762

You can use test.py to do inference; at the same time, in config.yaml you can point to the pre-trained model you obtained after training.

rginjapan avatar Jun 29 '21 23:06 rginjapan

@LHM3762 as @rginjapan already answered, you need to use test.py for testing. Assuming you have trained your model, let's say the whole DeepLIO network with all its components, then for testing you need to provide a config file with the path to the pre-trained model, as below:

```yaml
### DeepLIO Network ##############################
deeplio:
  dropout: 0.25
  pretrained: true
  model-path: "/path/to/my/model.ckpt"
  lidar-feat-net:
    name: "lidar-feat-pointseg"
    pretrained: false
    model-path: ""
    requires-grad: true
  imu-feat-net:
    name: "imu-feat-rnn"
    pretrained: false
    model-path: ""
    requires-grad: true
  odom-feat-net:
    name: "odom-feat-rnn"
    pretrained: false
    model-path: ""
    requires-grad: true
  fusion-net:
    name: "fusion-layer-soft"
    requires-grad: true   # only soft-fusion has trainable params
```
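As a rough sketch of how the `pretrained` and `model-path` fields above would typically be consumed (a hypothetical helper, not DeepLIO's actual loader; in the real test.py the resolved path would then be passed to something like `torch.load`):

```python
def resolve_checkpoint(cfg):
    """Given the 'deeplio' section of the config (as a dict),
    return the checkpoint path to load, or None when pre-trained
    weights are disabled. Hypothetical helper for illustration."""
    if not cfg.get("pretrained", False):
        return None
    path = cfg.get("model-path", "")
    if not path:
        raise ValueError("pretrained is true but model-path is empty")
    return path

cfg = {"pretrained": True, "model-path": "/path/to/my/model.ckpt"}
print(resolve_checkpoint(cfg))
```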

Good luck!

ArashJavan avatar Jul 21 '21 19:07 ArashJavan