CoRe
Regression Tree
After reading your code, I think this is equivalent to a multi-classification and regression problem. Why didn't you predict the final classification result directly, instead of going progressively from 2 to 4 to 8 to 16? Does this progressive method help a lot?
Hi, thanks for your interest in our work. Our regression tree performs slightly better than a multi-classification fc layer, and the regression tree has much better interpretability, as shown in Figure 8.
@yuxumin Thank you for your reply. From Fig. 8, I understand that the tree consists of several binary classifications. However, the code seems to perform a 16-way classification directly. The binary classifications are computed, but the code does not appear to use them to calculate the loss function. So I am confused about whether your code and paper are consistent. Could you do me a favor?
@ToBeNormal We apply the NLL loss to the sum of the binary log-softmax results from the different layers, instead of the softmax of the final layer (see this line). This implementation is equivalent to supervising multiple binary classifications as described in Eq. 6.
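To make the idea above concrete, here is a minimal, hedged sketch (not the authors' actual code) of a depth-4 binary tree head: each leaf's log-probability is the sum of the binary log-softmax scores along its root-to-leaf path, so a single NLL loss on the summed log-probs supervises all the binary splits at once. The function name `tree_log_probs` and the node ordering are my own assumptions for illustration.

```python
import torch
import torch.nn.functional as F

def tree_log_probs(binary_logits):
    """Combine per-layer binary logits into leaf log-probabilities.

    binary_logits: list where layer d has shape (B, 2**d, 2),
    i.e. 1 -> 2 -> 4 -> 8 nodes for a 16-leaf tree.
    """
    log_leaf = None
    for logits in binary_logits:
        log_p = F.log_softmax(logits, dim=-1)       # (B, nodes, 2)
        log_p = log_p.reshape(log_p.shape[0], -1)   # (B, 2 * nodes)
        if log_leaf is None:
            log_leaf = log_p
        else:
            # each parent leaf splits into two children: repeat the
            # parent's log-prob and add the child's binary log-prob
            log_leaf = log_leaf.repeat_interleave(2, dim=1) + log_p
    return log_leaf                                 # (B, 2**depth)

torch.manual_seed(0)
B, depth = 3, 4
logits = [torch.randn(B, 2 ** d, 2) for d in range(depth)]
log_leaf = tree_log_probs(logits)                   # (3, 16)

# sanity check: the 16 leaf probabilities form a valid distribution
assert torch.allclose(log_leaf.exp().sum(dim=1), torch.ones(B), atol=1e-5)

# NLL loss on summed binary log-softmax scores, as in the discussion
target = torch.tensor([0, 7, 15])
loss = F.nll_loss(log_leaf, target)
```

Because the leaf log-probability is a sum of per-split log-probabilities, the gradient of the NLL loss flows into every binary decision on the target leaf's path, which is why this is equivalent to supervising the binary classifications individually.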
@yuxumin Thanks for your reply, I had missed the code you mentioned!
Could you please release your code for the skill dataset?
@ToBeNormal Hi, we need some time to prepare this part of the code, please be patient.
@yuxumin Thank you for your reply~ I want to know how you select your final model. Do you select the best model for each fold and then average? Or do you average first and then select the best model, like MUSDL?
Hi, you need to train a different model for each fold in 4-fold cross-validation. So we select the best model for each fold and then average.
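The protocol above ("best per fold, then average") can be sketched as follows. This is only an illustration of the selection logic; the per-epoch correlation values below are made up, not results from the paper.

```python
# Hypothetical per-epoch validation correlations for each of the 4 folds.
per_fold_corr = [
    [0.80, 0.85, 0.83],  # fold 1
    [0.78, 0.82, 0.84],  # fold 2
    [0.88, 0.86, 0.87],  # fold 3
    [0.79, 0.81, 0.80],  # fold 4
]

# Select the best checkpoint independently for each fold...
best_per_fold = [max(epochs) for epochs in per_fold_corr]

# ...then average those per-fold bests to get the reported score.
final_score = sum(best_per_fold) / len(best_per_fold)
print(best_per_fold, round(final_score, 4))  # [0.85, 0.84, 0.88, 0.81] 0.845
```

Note this differs from averaging across folds first and then picking the best epoch, which would force all folds to share a single checkpoint index.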
@yuxumin I got it, thanks again! Have a good day!
Hi @yuxumin, have you prepared your code for the skill dataset? When I use the USDL model, the performance is not stable, so I want to try your model. Thanks a lot!
Hi, could you please release your training logs?
Hi. Sorry, I can't find the logs now. The code for the skill dataset will be released after ECCV.