Hello, can you show the code for how we can get the h5py file from the dataset?
Thanks very much for your attention. Because authorization for these datasets must be requested from the original authors, I do not think it would be proper to share them online. The datasets can be accessed through the websites listed below. You could download the raw data and extract features with the corresponding backbones (the image backbones are the pre-trained models provided by torchvision, and the pre-trained doc2vec model is available from https://github.com/jhlau/doc2vec); a rough sketch of this pipeline is shown after the list. Hope this helps.
Note that if a link is invalid, you could try to contact the original authors.
INRIA-Websearch: https://weiyc.github.io/assets/projects/tcyb.html
NUS-WIDE: https://lms.comp.nus.edu.sg/wp-content/uploads/2019/research/nuswide/NUS-WIDE.html
XmediaNet: http://59.108.48.34/tiki/XMediaNet/
The list for nus_wide_deep_doc2vec_data_42941: https://drive.google.com/file/d/15xm6SwwdoLhMPupwheWopb3c5ryCp06j/view?usp=sharing
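
For reference, here is a minimal, illustrative sketch of how the raw data could be packed into an h5py file: extract image features with a pre-trained torchvision backbone (ResNet-50 is used here only as an example), infer text vectors with the pre-trained doc2vec model, and write everything to HDF5. The file paths, dataset keys, and placeholder lists below are assumptions, not the exact preprocessing used for the released files, and loading the jhlau doc2vec model may require a compatible gensim version.

```python
# Sketch: pack raw images/texts/labels into an HDF5 file of deep features.
# Paths, keys, and the tiny placeholder lists are illustrative only.
import h5py
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as transforms
from PIL import Image
from gensim.models.doc2vec import Doc2Vec

# Image backbone: a pre-trained torchvision model with the classifier head removed.
backbone = models.resnet50(pretrained=True)
backbone.fc = torch.nn.Identity()  # output: 2048-d pooled features
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Text model: the pre-trained doc2vec model from https://github.com/jhlau/doc2vec.
# The path is a placeholder; point it to wherever you unpacked the model.
doc2vec = Doc2Vec.load('enwiki_dbow/doc2vec.bin')

def image_feature(path):
    """Extract a deep feature vector for one image file."""
    img = preprocess(Image.open(path).convert('RGB')).unsqueeze(0)
    with torch.no_grad():
        return backbone(img).squeeze(0).numpy()

def text_feature(tokens):
    """Infer a doc2vec vector for one tokenized text (list of lowercase words)."""
    return doc2vec.infer_vector(tokens)

# Placeholders standing in for the raw data you downloaded.
image_paths = ['example1.jpg', 'example2.jpg']
texts = [['a', 'dog', 'on', 'grass'], ['city', 'skyline', 'at', 'night']]
labels = [0, 1]

img_feats = np.stack([image_feature(p) for p in image_paths])
txt_feats = np.stack([text_feature(t) for t in texts])

# The dataset keys are illustrative; they should match whatever the training
# script expects to read from the .h5 file.
with h5py.File('nus_wide_deep_doc2vec_data.h5', 'w') as f:
    f.create_dataset('img_feats', data=img_feats)
    f.create_dataset('txt_feats', data=txt_feats)
    f.create_dataset('labels', data=np.asarray(labels))
```

The exact backbone and feature dimensionality used for the released .h5 files may differ; the snippet is only meant to show the overall extract-and-pack pipeline.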
Best regards, Hu, Peng
Hi Peng Hu, Sorry, but I seem unable to access the INRIA-Websearch website you provided. Is it possible to provide an up-to-date link? Also, I've been looking for a single-labeled multi-modal dataset; any recommendations? Thanks a lot.
Bests, ZihuaZhao
Hi,
You could ask the original authors for an up-to-date link. Thanks a lot.
Best regards, Hu, Peng
@ZihuaZhao Hello, did you get the INRIA-Websearch dataset?
Thank you for your reply! I've switched to other datasets. Thanks a lot.
Hi Peng Hu,
Thanks for the list "nus_wide_deep_doc2vec_data_42941", but I don't understand how to build the h5py file from this list. Could you please provide the h5py file for it?
Thank you so much.