
Questions about step4 and step5

Open · TitaRussell16 opened this issue 3 years ago · 1 comment

Hello, I would like to ask you these questions:

  1. The step4_coding.m file appears to be the sample code provided with the VLFeat package. How did you set the parameters dimension, numFeatures, and numClusters in your paper? Could you give an example of the code actually used in the paper? My understanding is that numClusters is the number of gesture categories, and numDataToBeEncoded is the dictionary size, i.e., the x-axis of Figure 3 in your paper. Is that correct? (See the sketch after this question for the demo code I am referring to.)
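For context, below is a minimal sketch of the standard VLFeat Fisher encoding demo that step4_coding.m seems to be based on. The sizes are only the demo's placeholders, not the settings from your paper, which is exactly what I am asking about:

```matlab
% Minimal sketch of the VLFeat Fisher encoding demo
% (placeholder sizes only; not the values used in the paper).
% Assumes VLFeat is already on the MATLAB path (vl_setup has been run).

dimension          = 64;    % descriptor dimensionality
numFeatures        = 5000;  % training descriptors used to fit the GMM
numClusters        = 30;    % GMM components -- gesture categories or dictionary size? (my question)
numDataToBeEncoded = 1000;  % descriptors to be encoded afterwards

trainData = rand(dimension, numFeatures);           % stand-in for pooled training descriptors
[means, covariances, priors] = vl_gmm(trainData, numClusters);

dataToBeEncoded = rand(dimension, numDataToBeEncoded);  % stand-in for one video's descriptors
encoding = vl_fisher(dataToBeEncoded, means, covariances, priors);
```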

  2. In the step5_classification_Cambridge.m file, is the data saved under maindir = 'F:\Myprojects\matlabProjects\featureExtraction\surf_feature\Cambridge_color_9_9entropy_4096\' the already-encoded visual words? Suppose the experiment uses 30 gesture videos. As I understand the pipeline: first extract the key frames of the 30 videos, then extract SIFT features (for example) from the key frames (step2), then cluster the extracted features into word vectors (step3), then encode the word vectors with the chosen number of categories and dictionary size (step4), and finally perform classification (step5). Does the maindir folder used in step5 therefore contain the encoded word vectors extracted from these 30 videos, i.e., 30 .mat files, each holding the encoded feature vocabulary of the corresponding video? (A sketch of how I picture step5 loading them follows below.)
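To make sure I understand the expected data layout, here is a rough sketch of how I imagine step5 reading that folder, assuming one .mat file per video; the variable names and the contents of each file are my guesses, not taken from your code:

```matlab
% Hypothetical sketch of how step5 might read the encoded features,
% assuming one .mat file per gesture video under maindir.
maindir  = 'F:\Myprojects\matlabProjects\featureExtraction\surf_feature\Cambridge_color_9_9entropy_4096\';
matFiles = dir(fullfile(maindir, '*.mat'));      % expecting 30 files for 30 videos

features = cell(numel(matFiles), 1);
for k = 1:numel(matFiles)
    s = load(fullfile(maindir, matFiles(k).name));  % each file: encoded word vector of one video?
    features{k} = s;                                % the field names inside are unknown to me
end
```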

Thank you!

TitaRussell16 · Apr 13 '21 02:04

Can you please tell me how to work with this code?

bhanum7353 · May 26 '21 04:05