HyRSM
Code for our CVPR 2022 Paper "Hybrid Relation Guided Set Matching for Few-shot Action Recognition".
After I scaled down the UCF101 dataset, training produced the above problem. Is this caused by the dataset?
Hello authors, I downloaded SSv2 from the official website, but the split of the dataset looks different from yours. For example, the official data looks like "78687 54 19", that is...
Hi, thanks for your excellent work. I have a question about the data split. You process each video in the dataset into 8 frames, but I didn't find the script...
This is from [train_few_shot.txt](https://github.com/alibaba-mmai-research/HyRSM/blob/main/configs/projects/hyrsm/kinetics100/train_few_shot.txt): train0//train_256/air_drumming/-VtLx-mcPds_000012_000022.mp4 train0//train_256/air_drumming/-eGnPDG5Pxs_000053_000063.mp4 train0//train_256/air_drumming/-fTmHyOG-CY_000000_000010.mp4 train0//train_256/air_drumming/-ni1UWaBmL0_000056_000066.mp4 ... How do I get these files?
Hello authors, I would like to ask where the code for the set matching metric is located.
Hi authors. When I run the code, it reports the following error, and I can't solve it. Could you do me a favor? I tried updating the decord version to 0.6.0, but the...
Hello authors, it looks like you treat the SSv2 videos as .mp4 files ([here](https://github.com/alibaba-mmai-research/HyRSM/blob/main/datasets/base/ssv2_few_shot.py#L365)), but the SSv2 dataset is composed of .webm files. So have you done transformation before reading...
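If the SSv2 .webm clips were indeed re-encoded to .mp4 before being read, the usual route is ffmpeg. A minimal sketch of building such a conversion command in Python (the input path and codec choice here are illustrative assumptions, not the authors' confirmed pipeline):

```python
from pathlib import Path

def webm_to_mp4_cmd(src: Path) -> list[str]:
    """Build an ffmpeg command that re-encodes a .webm clip to .mp4.

    -c:v libx264 re-encodes the video stream to H.264; -an drops audio,
    which few-shot action recognition pipelines typically do not need.
    """
    dst = src.with_suffix(".mp4")
    return ["ffmpeg", "-i", str(src), "-c:v", "libx264", "-an", str(dst)]

# Dry run: print the command rather than executing it, so this sketch
# works without ffmpeg installed (run via subprocess.run(cmd) in practice).
cmd = webm_to_mp4_cmd(Path("ssv2_videos/12345.webm"))
print(" ".join(cmd))
```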
It looks like none of the links on the model/feature zoo pages point to anything. Do you have a plan to release them? UPDATE: are these them? https://github.com/alibaba/EssentialMC2/blob/main/MODEL_ZOO.md
Hi authors, I noticed that NUM_TRAIN_TASKS under the UCF setting is only 2500. Is that enough to train the model? I tried to reproduce your method, but the output...
You offer different training configs for different datasets. However, the same training config is used for all datasets in your code. Which training config is correct?