RobotCarDataset-Scraper
Problem downloading the RobotCar dataset
Hi,
Thanks for your work on automating the download of all the datasets. However, when I run the command below:
python scrape_mrgdatashare.py --downloads_dir /Downloads --datasets_file datasets.csv --username USERNAME --password PASSWORD
the script always gets stuck at the point shown below:
Could you help me to solve this issue? Thanks.
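One way to narrow down where it hangs is to test the login outside the scraper. The sketch below is a rough diagnostic only: the login URL, the CSRF handling, and the form-field names are assumptions, and the values actually used are the ones in scrape_mrgdatashare.py.

```python
# Rough login/connectivity check for the data-share site.
# NOTE: LOGIN_URL and the form-field names below are assumptions; check
# scrape_mrgdatashare.py for the real ones before relying on this.
import re
import sys

import requests

LOGIN_URL = "https://mrgdatashare.robots.ox.ac.uk/"  # assumed login page


def check_login(username, password):
    session = requests.Session()
    # Fetch the login page first; a hang here points at network/proxy
    # problems rather than at the scraper itself.
    page = session.get(LOGIN_URL, timeout=30)
    page.raise_for_status()

    # The site appears to be Django-based, so a CSRF token is usually
    # required (assumption).
    match = re.search(r'name="csrfmiddlewaretoken" value="([^"]+)"', page.text)
    token = match.group(1) if match else ""

    response = session.post(
        LOGIN_URL,
        data={
            "username": username,
            "password": password,
            "csrfmiddlewaretoken": token,
        },
        headers={"Referer": LOGIN_URL},
        timeout=30,
    )
    print("HTTP status:", response.status_code)
    # Very rough success heuristic (assumption): a logged-in page usually
    # exposes a logout link.
    print("Logged in:", "logout" in response.text.lower())


if __name__ == "__main__":
    check_login(sys.argv[1], sys.argv[2])
```

If this also hangs, the problem is more likely the network or a proxy than the scraper.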
Dude, you have your username and password in the script! You may want to remove them from the public post! Also, have you tried running python get_datasets.py first to obtain the datasets.csv file?
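As a quick sanity check before re-running the scraper, something like the sketch below (the filename matches the --datasets_file argument used above) confirms that get_datasets.py actually produced a non-empty CSV:

```python
# Verify that get_datasets.py produced a usable datasets.csv before
# launching scrape_mrgdatashare.py.
import csv
from pathlib import Path

datasets_file = Path("datasets.csv")
if not datasets_file.exists():
    raise SystemExit("datasets.csv not found -- run get_datasets.py first")

with open(datasets_file, newline="") as handle:
    rows = list(csv.reader(handle))

print(f"{len(rows)} entries listed in {datasets_file}")
```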
Hey @mrgransky good catch on that one. @LZL-CS I have had to deactivate your account to prevent those credentials from being abused. Please get in touch via email, we can restore access e.g. password change or something.
Hi @mrgransky, thanks for the reminder. I did run python get_datasets.py first to obtain the datasets.csv file.
Hi @mttgdd. In the end I manually downloaded the required datasets one by one. Although it was inefficient, I succeeded. If I need to download a large amount of data in the future, I'll send you an email. Thank you very much!
@LZL-CS: could you check whether the number of samples for each sensor matches what is listed here, since you downloaded them manually?
For instance, using this script I get a different number of samples for 2014/05/19, Time: 13:05:38 GMT with Grasshopper 2 Left. The dataset page indicates that we should have Frames: 13427, whereas I get fewer frames (9206) when downloading with this Python script. This happens for pretty much everything!
Maybe @mttgdd could tell us whether this is a weird edge case or quite common for everyone? Cheers,
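For anyone wanting to verify their own download, a minimal sketch along these lines counts the extracted images per sensor and compares them with the numbers on the dataset page. The directory layout and the expected count are examples taken from the figures quoted above, so adjust them to your own setup:

```python
# Compare the number of extracted images per sensor with the frame counts
# listed on the dataset page. Paths and expected values are examples only.
from pathlib import Path

# Expected frame count quoted from the dataset page for 2014-05-19-13-05-38,
# Grasshopper 2 Left (assumed to extract into a mono_left/ directory).
EXPECTED = {
    "mono_left": 13427,
}

extract_root = Path("/Downloads/2014-05-19-13-05-38")  # example extraction dir

for sensor, expected in EXPECTED.items():
    # Images in the public release are PNG files (assumption).
    found = len(list((extract_root / sensor).glob("*.png")))
    status = "OK" if found == expected else "MISSING FRAMES"
    print(f"{sensor}: {found}/{expected} images -> {status}")
```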
Hi @mrgransky, actually I only needed to download the centre image data, so I didn't check the consistency of all the datasets.
So the frame counts are exactly the same as on the dataset page when you extract them all into one directory?
Yes, I extracted them all into one directory.
It appears that the problem still persists in 2023. For example, there seem to be some files missing for the Grasshopper cameras in the traversal "2014-06-26-09-31-18": https://robotcar-dataset.robots.ox.ac.uk/datasets/2014-06-26-09-31-18/
This looks like a recurring issue with multiple Grasshopper camera files, affecting not only the left but also the right and rear cameras. For instance, both "2014-06-26-09-31-18_mono_left_03.tar" and "2014-06-26-09-31-18_mono_left_04.tar" have the same small size of 130.00 KB and identical MD5 checksums.
Would it be possible to look into this and find a solution? Thank you.
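Until the upstream files are fixed, a small script like the sketch below can at least flag the suspicious chunks by grouping downloaded tar files that share the same size and MD5 checksum (the download directory is an example path):

```python
# Flag downloaded tar chunks that share the same size and MD5 checksum,
# which usually indicates a truncated or duplicated download (like the
# 130 KB mono_left_03/_04 files mentioned above).
import hashlib
from collections import defaultdict
from pathlib import Path

downloads = Path("/Downloads")  # example download directory


def md5sum(path, chunk_size=1 << 20):
    # Stream the file in 1 MiB chunks so large tars don't fill memory.
    digest = hashlib.md5()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


groups = defaultdict(list)
for tar_path in sorted(downloads.glob("*.tar")):
    groups[(tar_path.stat().st_size, md5sum(tar_path))].append(tar_path.name)

for (size, digest), names in groups.items():
    if len(names) > 1:
        print(f"Identical size ({size} bytes) and MD5 {digest}: {', '.join(names)}")
```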