1chenchen22

Results 10 comments of 1chenchen22

> Did you manage to run the code on AffectNet? In the crop-and-align step for the AffectNet dataset, where do the two files missed_img_txt and lb2_txt come from, or are they generated files? Hello, have you found these two files yet?

> Hello, when training the network on the RAF-DB dataset, are the parameter settings the same as for AffectNet? The code sets two learning rates, lr1 and lr2, but the paper mentions only one; which one does the paper refer to? The learning rates in the code differ from the settings described in the paper, and the ramp_up formula differs as well. I tried both the code's and the paper's settings on RAF-DB and could not reach the accuracy reported in the paper. Could you share the parameter settings used for training on RAF-DB and FERPlus? Thanks! Hello, may I ask how you set these parameters? Could you share your modified code? Thanks.
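For reference, many training repos use an exponential ramp-up of the Laine & Aila style for loss weights or learning rates. The exact formula used by this paper and code is not shown in the thread, so the following is only an illustrative sketch, not the repo's actual `ramp_up`:

```python
import math

def ramp_up(epoch, ramp_epochs=10):
    """Common exponential ramp-up: grows from exp(-5) toward 1.0
    over ramp_epochs, then stays at 1.0. Hypothetical; the formula
    in the discussed codebase may differ."""
    if epoch >= ramp_epochs:
        return 1.0
    t = epoch / ramp_epochs
    return math.exp(-5.0 * (1.0 - t) ** 2)

print(round(ramp_up(0), 4))  # -> 0.0067
print(ramp_up(10))           # -> 1.0
```

Comparing such a schedule against whatever the repo actually implements is one way to pin down where the code and the paper diverge.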

Hello, I have the same question; could you explain how you did it? Is it enough to add one layer to the model architecture, and how do you...

Hello, have you solved the problem above? Did you manage to reproduce the results?

The accuracy and loss obtained when fine-tuning the DAN model on the FER2013 dataset with the weights of rafdb_epoch21_acc0.897_bacc0.8275.pth loaded (Epoch 10/10, Loss: 0.6809, Validation Accuracy: 0.5688) are as...
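The DAN checkpoint's key layout is not shown in the comment, so the sketch below uses a generic, shape-filtered weight load (`load_compatible` is a hypothetical helper, and the `nn.Sequential` stand-ins are not the real DAN model) as one way to reuse RAF-DB weights when the fine-tuning target's head differs:

```python
import torch
import torch.nn as nn

def load_compatible(model, state_dict):
    """Copy only parameters whose name AND shape match the target
    model; return the names of the entries that were skipped."""
    own = model.state_dict()
    kept = {k: v for k, v in state_dict.items()
            if k in own and v.shape == own[k].shape}
    own.update(kept)
    model.load_state_dict(own)
    return sorted(set(state_dict) - set(kept))

# Stand-ins for a pretrained source and a fine-tuning target model.
src = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 7))
dst = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 7))
skipped = load_compatible(dst, src.state_dict())
assert skipped == []  # identical architectures: nothing skipped
```

If the source head had a different number of classes, its weights would simply appear in `skipped` and the new head would keep its fresh initialization.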

> OK, thank you for clearing that up. Author, sorry to bother you again: reproducing your code on a 3090, the best I get is 0.8954, never reaching your reported value. What might be the reason? Hello, is the 0.8954 you reproduced the recognition accuracy on RAF-DB? Could we discuss this program over WeChat: chenxue19991020

Thank you; I look forward to hearing from you.

> ```python
> class Mix_Depth_Wise(Module):
>     def __init__(self, in_c, out_c, residual=False, kernel=(3, 3), stride=(2, 2), padding=(1, 1), groups=1, kernel_size=[3, 5, 7], split_out_channels=[64, 32, 32]):
>         super(Mix_Depth_Wise, self).__init__()
>         self.conv...
> ```
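The quoted signature suggests a MixConv-style block: the channels are split into groups of `split_out_channels` sizes, and each group gets a depthwise convolution with its own kernel size from `kernel_size`. The full `Mix_Depth_Wise` body is truncated above, so this is only a minimal sketch of that splitting idea (class name and details are assumptions, not the repo's code):

```python
import torch
import torch.nn as nn

class MixDepthWiseSketch(nn.Module):
    """Hypothetical sketch of a mixed-kernel depthwise block:
    each channel split is convolved depthwise with its own kernel."""
    def __init__(self, in_c, kernel_size=(3, 5, 7),
                 split_out_channels=(64, 32, 32), stride=1):
        super().__init__()
        assert sum(split_out_channels) == in_c
        self.splits = list(split_out_channels)
        self.convs = nn.ModuleList(
            nn.Conv2d(c, c, k, stride=stride, padding=k // 2, groups=c)
            for k, c in zip(kernel_size, self.splits)
        )

    def forward(self, x):
        chunks = torch.split(x, self.splits, dim=1)
        return torch.cat(
            [conv(c) for conv, c in zip(self.convs, chunks)], dim=1)

m = MixDepthWiseSketch(128)
y = m(torch.randn(2, 128, 14, 14))  # shape preserved: (2, 128, 14, 14)
```

With `padding=k // 2` and `stride=1`, every split keeps its spatial size, so the concatenated output matches the input shape.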

OK, looking forward to your examples. I'm a novice and don't know much about this. As shown in the image below, I added featup to each Mix_Depth_Wise and Mix_Residual. The...

Yes, I tried it: I added it to my program, but it didn't improve. Anyway, thank you for your reply. ![image](https://github.com/DensoITLab/TeachAugment/assets/131838869/f94e92f4-b3de-4013-9ab2-bbc7ecb3ad11)