
Problem with running large datasets

RuiYangQuan opened this issue 3 years ago · 3 comments

When verifying the algorithm on a dataset of more than 1500 frames, I found that it could not fully map all scenes in the dataset. Is there a buffer mechanism that makes the algorithm stop drawing after a certain limit is exceeded? Please give me some suggestions, thank you!

RuiYangQuan · Jan 05 '22

Hi @RuiYangQuan,

Hmm, that should not happen. I often tested with a 30 fps sensor and recorded several minutes (~2k to 5k frames) without problems. Does it just stop drawing, or does it lose tracking? Please keep in mind that the code is ~4 years old, and changes to Pangolin or other libraries might have introduced an issue.

fabianschenk · Jan 05 '22

Thanks for your reply. You are right, there is no problem with real-time operation from a sensor, but I often run into this problem when working with pre-recorded datasets.

RuiYangQuan · Jan 10 '22

Hi @RuiYangQuan ,

Could you check the dataset config you're using? There's a parameter that limits the number of images to read, e.g. in dataset_tum1.yaml: https://github.com/fabianschenk/REVO/blob/eb949c0fdbcdf0be09a38464eb0592a90803947f/config/dataset_tum1.yaml#L44
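
For illustration, here is a minimal sketch of what that part of a dataset config might look like. The key names below (datasetFolder, nMaxImages) are hypothetical placeholders, not necessarily the actual keys REVO uses; check the linked line in dataset_tum1.yaml for the real parameter name in your version:

```yaml
# Hypothetical excerpt of a REVO dataset config.
# Key names are illustrative; see the linked dataset_tum1.yaml
# (line 44) for the actual keys used by the code.

# Directory containing the recorded RGB-D sequence
datasetFolder: "/path/to/tum/rgbd_dataset"

# Maximum number of images to read from the dataset.
# If this is smaller than the sequence length (e.g. 1500 for a
# 2000-frame sequence), the reader stops early and the map
# looks incomplete. Raise it to cover the whole sequence.
nMaxImages: 5000
```

If that limit is set below your sequence length, processing stops after a fixed number of frames even though tracking itself is fine, which would match the behavior you describe.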

fabianschenk · Jan 12 '22