mmskeleton
Training the model with new dataset
@yysijie @yjxiong Hello, sorry to interrupt you. I have built my own dataset, which is much smaller than Kinetics. When I use the new dataset to train the model and adjust some parameters, the best accuracy is 47.77% for top-1 and 90.23% for top-5, but that is still not enough. Could you help me improve the accuracy? Looking forward to your reply!
Hi, if you use a smaller dataset, your model may suffer from over-fitting. Maybe you can use our model as a pre-trained model. Or you can use some proper augmentation strategies, such as rotation, translation, random cropping, and so on.
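For anyone looking for a concrete starting point, here is a minimal sketch of such augmentation. It assumes the (C, T, V, M) skeleton layout st-gcn uses, with channels (x, y, confidence); the function name and parameter ranges are illustrative, not taken from the repo.

```python
import numpy as np

def augment_skeleton(data, angle_range=10.0, scale_range=0.1):
    """Randomly rotate and scale a 2D skeleton sequence.

    data: array of shape (C, T, V, M) with channels (x, y, score).
    Returns an augmented copy; the confidence channel is untouched.
    """
    out = data.copy()
    theta = np.deg2rad(np.random.uniform(-angle_range, angle_range))
    scale = 1.0 + np.random.uniform(-scale_range, scale_range)
    rot = scale * np.array([[np.cos(theta), -np.sin(theta)],
                            [np.sin(theta),  np.cos(theta)]])
    # Rotate/scale only the x, y channels; leave confidence alone.
    out[:2] = np.einsum('ij,jtvm->itvm', rot, out[:2])
    return out
```

Applying a fresh random rotation/scale per training sample effectively multiplies the dataset size, which is the usual remedy for over-fitting on small skeleton datasets.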
@yysijie Hello, thank you very much for your kind reply! I have tried the methods you proposed, but the accuracy is still low. I wonder if it is the limit of 2D pose, so I want to test the NTU RGB+D dataset, but I cannot get it. I have sent the request for NTU RGB+D but have not received a reply. Could you share the NTU RGB+D dataset with me? Looking forward to your reply.
I don't think your problem is caused by the limit of 2D pose. What is the accuracy?
Sorry, I cannot share the NTU RGB+D dataset without their authorization. They replied to me on the third day.
@yysijie Thank you for your kind reply! I have used the PKU-MMD dataset to test your 3D NTU cross-view pretrained model, and the accuracy is only 3.52% for top-1. I wonder what causes such a big difference?
That's strange. Did you check that the value range of the data and the definition of the joints are identical?
@taxuezcy Did you train st-gcn on PKUMMD dataset?
@taxuezcy Hello, how do you generate the val_json file when building your dataset?
@taxuezcy Hello, how do you generate the train_json file when you use a new dataset? Do you download it from the internet or generate it with OpenPose?
@fmthoker Yes, I have trained st-gcn on the PKU-MMD dataset.
@ml930310 @wwwpbai I generate the train_json and val_json files with OpenPose.
@taxuezcy @yysijie Hello, how do I use the pre-trained model to train on my dataset?
I can't generate a single file, only separate files for every frame. How can I generate one whole file?
@1209805200 You can use https://github.com/yysijie/st-gcn/blob/master/tools/kinetics_gendata.py to generate a whole file from the JSON files of every frame.
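For reference, here is a rough sketch of that merging step: collecting OpenPose's per-frame `*_keypoints.json` files into one kinetics-skeleton style video file. The input field names follow OpenPose's default JSON output (`people` / `pose_keypoints_2d` as x, y, score triples), and the output fields follow the kinetics-skeleton examples shipped with st-gcn; treat both as assumptions to check against your own files.

```python
import glob
import json
import os

def merge_openpose_frames(frame_dir, label, label_index, out_path):
    """Merge per-frame OpenPose JSON files into one video-level JSON
    in the kinetics-skeleton style expected by kinetics_gendata.py."""
    frames = []
    pattern = os.path.join(frame_dir, '*_keypoints.json')
    for i, path in enumerate(sorted(glob.glob(pattern))):
        with open(path) as f:
            people = json.load(f).get('people', [])
        skeletons = []
        for person in people:
            kp = person['pose_keypoints_2d']  # flat [x, y, score, ...]
            skeletons.append({
                'pose':  [v for j, v in enumerate(kp) if j % 3 != 2],
                'score': kp[2::3],
            })
        frames.append({'frame_index': i, 'skeleton': skeletons})
    video = {'data': frames, 'label': label, 'label_index': label_index}
    with open(out_path, 'w') as f:
        json.dump(video, f)
    return video
```

Run once per video, then point kinetics_gendata.py at the directory of merged files.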
@taxuezcy Hi, sorry to interrupt you. I have built a UCF101 dataset. Could you tell me what your steps are? How do you convert the original video dataset into the skeleton-format file required by the network?
@wwwpbai Sorry to interrupt you. How do I get the file marked with the red line? I don't know how to run OpenPose to get the JSON files for my dataset.
@hacker-wei You can look at demo.py. That file tests some video files by calling OpenPose. You can learn from it, write your own code, run it to get kinetics-skeleton data, and save the results on your computer.
@wwwpbai Thank you, I'll try it. And how do I get this file? Do you have the script.py?
???
@hacker-wei , you can see this script
@wwwpbai Thank you. May I have your WeChat or QQ?
Why is the output channel of the conv operation multiplied by kernel_size?
Can you give me the URL of an OpenPose build file for Ubuntu? I can't compile OpenPose, and it has wasted a lot of my time. Thank you @yysijie
@taxuezcy Hi, I have run into a problem. Do you get the train_json file of your own dataset from OpenPose directly? (i.e. you didn't add a custom post-processing function with the OpenPose C++/Python API?) Thanks!
@i2isaalien It takes two steps to get the train_json.
Thanks for replying to my question. Could you be more specific?
Why does the conv operation multiply the output channels by kernel_size?
Do you know why?
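A minimal sketch of what that conv is doing, based on st-gcn's ConvTemporalGraphical (the class and variable names below are simplified, not copied from the repo): `kernel_size` here is the number of adjacency subsets K from the partition strategy, not a temporal width. One 1x1 convolution emits K groups of `out_channels`, and each group is aggregated with its own adjacency matrix, which is why the conv's output channels are `out_channels * kernel_size`.

```python
import torch
import torch.nn as nn

class SpatialGraphConv(nn.Module):
    """Spatial graph convolution with K adjacency subsets."""

    def __init__(self, in_channels, out_channels, kernel_size):
        super().__init__()
        self.kernel_size = kernel_size  # K = number of adjacency subsets
        # One conv produces K groups of out_channels at once.
        self.conv = nn.Conv2d(in_channels, out_channels * kernel_size,
                              kernel_size=1)

    def forward(self, x, A):
        # x: (N, C, T, V) skeleton features, A: (K, V, V) adjacency subsets
        n, _, t, v = x.size()
        x = self.conv(x)                              # (N, K*C_out, T, V)
        x = x.view(n, self.kernel_size, -1, t, v)     # (N, K, C_out, T, V)
        # Each of the K feature groups is mixed with its own adjacency
        # matrix, then summed over K.
        return torch.einsum('nkctv,kvw->nctw', x, A)
```

So the multiplication by kernel_size is just a way to compute all K partition-wise convolutions in a single Conv2d call before the per-subset graph aggregation.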
Hello, I have .json files (skeleton data) for each frame. I want to get a .npy file out of them that the training script accepts. What script should I use? I am thinking of using kinetics_gendata.py. Is that correct?
Thanks in advance.
Yes! You can use OpenPose to generate a .json file for each video. Then you can use kinetics_gendata.py to get the .npy file.
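Roughly, the conversion packs each video-level JSON into a fixed-size array; kinetics_gendata.py does this for a whole directory and stacks the results into one .npy file. The sketch below shows the per-video step under the kinetics-skeleton defaults (18 joints, up to 2 persons, 300 frames); the function name and defaults are illustrative.

```python
import numpy as np

def json_to_array(video_json, max_frames=300, num_joints=18, max_persons=2):
    """Pack one kinetics-skeleton style video JSON into the (C, T, V, M)
    array st-gcn trains on: C = (x, y, score), T frames, V joints,
    M persons. Missing frames/persons stay zero-padded."""
    data = np.zeros((3, max_frames, num_joints, max_persons),
                    dtype=np.float32)
    for frame in video_json['data']:
        t = frame['frame_index']
        if t >= max_frames:
            break
        for m, person in enumerate(frame['skeleton'][:max_persons]):
            data[0, t, :, m] = person['pose'][0::2]   # x coordinates
            data[1, t, :, m] = person['pose'][1::2]   # y coordinates
            data[2, t, :, m] = person['score']        # confidence
    return data
```

Stacking these arrays over all videos gives the (N, C, T, V, M) tensor the training script loads.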
I used a self-built dataset to train a new model, but I got top-1 = 40% and top-5 = 100%. Why? Also, I don't use OpenPose to extract the skeleton data; I use joint data extracted from another model. The joint positions are not much worse than OpenPose's, but there is a small difference in the confidence scores. Does that have an impact?
@yysijie @taxuezcy @fmthoker @ml930310 @wwwpbai
Hi! Maybe your dataset is too small. You can try using the pre-trained model.
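One common way to do that is to load the pre-trained checkpoint but skip the final classification layer, since a new dataset usually has a different number of classes. A minimal sketch, assuming a PyTorch state-dict checkpoint; the `fcn` prefix for the classifier is an assumption based on the st-gcn model definition, so adjust it to your model's layer names.

```python
import torch

def load_pretrained(model, checkpoint_path, skip_prefix='fcn'):
    """Load a pre-trained checkpoint into `model` for fine-tuning,
    dropping parameters whose names start with `skip_prefix` (e.g. the
    final classifier) so shapes can differ for a new class count."""
    state = torch.load(checkpoint_path, map_location='cpu')
    state = {k: v for k, v in state.items()
             if not k.startswith(skip_prefix)}
    # strict=False tolerates the keys we dropped.
    return model.load_state_dict(state, strict=False)
```

After loading, train with a lower learning rate than from-scratch training so the pre-trained features are not destroyed early on.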
Hello, could you please share the Baidu Netdisk link for the Kinetics-skeleton dataset again? The link has expired. Email: [email protected] @yjxiong @hellock