TFC-pretraining
backbone
Thank you for your time-series work. For the Transformer backbone, my SleepEEG2Epilepsy reproduction is close to the paper's result, but I can't get good results on the other transfers. I referred to the paper and the SimCLR baseline code and tried to build and train a ResNet backbone, but the result is bad. Could you please give the detailed structure of the ResNet backbone? Thank you!
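While waiting for the author, here is a minimal 1D-ResNet encoder sketch in the SimCLR style. All widths, kernel sizes, and depths are assumptions for illustration; the paper's exact backbone may differ.

```python
import torch
import torch.nn as nn

class ResBlock1d(nn.Module):
    """Basic 1D residual block: two conv-BN layers plus a (possibly projected) shortcut."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        self.conv1 = nn.Conv1d(in_ch, out_ch, 3, stride, 1, bias=False)
        self.bn1 = nn.BatchNorm1d(out_ch)
        self.conv2 = nn.Conv1d(out_ch, out_ch, 3, 1, 1, bias=False)
        self.bn2 = nn.BatchNorm1d(out_ch)
        # 1x1 projection when the shape changes, identity otherwise
        self.short = (nn.Sequential() if stride == 1 and in_ch == out_ch else
                      nn.Sequential(nn.Conv1d(in_ch, out_ch, 1, stride, bias=False),
                                    nn.BatchNorm1d(out_ch)))

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + self.short(x))

class ResNet1dEncoder(nn.Module):
    """Stacks residual blocks, then global-average-pools to a feature vector."""
    def __init__(self, in_ch=1, feat_dim=128):
        super().__init__()
        self.body = nn.Sequential(
            ResBlock1d(in_ch, 32, stride=2),
            ResBlock1d(32, 64, stride=2),
            ResBlock1d(64, 128, stride=2),
        )
        self.pool = nn.AdaptiveAvgPool1d(1)
        self.head = nn.Linear(128, feat_dim)

    def forward(self, x):                       # x: [B, C, T]
        h = self.pool(self.body(x)).squeeze(-1)  # [B, 128]
        return self.head(h)                      # [B, feat_dim]

# Hypothetical Epilepsy-like input: batch of 8, 1 channel, 178 time steps
x = torch.randn(8, 1, 178)
z = ResNet1dEncoder()(x)
print(z.shape)  # torch.Size([8, 128])
```

The contrastive projector (MLP head) from the SimCLR baseline would sit on top of `feat_dim`; this sketch only covers the encoder.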
I have the same problem
Can you share how you reproduce the SleepEEG2Epilepsy paper result?
I'm now guessing that the author only provided the correct configuration for the SleepEEG experiments?
In the SleepEEG config, I changed the batch size from 128 to 64; in the model, I set the Transformer encoder to 3 layers; in main, I set subset = True. These match the paper except for subset. I then got acc=94.1944, precision=93.7874, recall=87.4317, F1=90.1626. Since using a Transformer comes with some drop in performance, I think the reproduction is successful? But with subset=False the result is even worse than without pretraining.
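For reference, the changes described above amount to something like the following config sketch (the attribute names here are hypothetical; map them onto the repo's actual config object):

```python
class Configs:
    """Sketch of the SleepEEG pretraining settings reported above."""
    def __init__(self):
        self.batch_size = 64              # changed from the default 128
        self.num_transformer_layers = 3   # TransformerEncoder depth
        self.subset = True                # pretrain on a subset of SleepEEG

cfg = Configs()
print(cfg.batch_size, cfg.num_transformer_layers, cfg.subset)  # 64 3 True
```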
I've noticed a small issue with the use of TransformerEncoderLayer. When the input tensor has shape [B, C, T] and TransformerEncoderLayer is applied directly, attention seems to be applied over B instead of T. I'm not entirely sure I'm understanding this correctly, but I was wondering if the author could clarify this. Additionally, would it be possible for the author to commit the ResNet backbone? Thank you
I have the same problem. Did you manage to transfer SleepEEG pretraining to the other datasets?
no
I have the same issue. When pre-training on the SleepEEG dataset and fine-tuning on other datasets, all samples are classified into the same class, which triggers a "no positive samples" warning and gives low accuracy. I really hope the author can provide the data preprocessing steps, preferably as code.