Jiaxiang Cheng
> In the code, the convolution operation is also applied to future time steps during training. Example: for the second iteration (t=1), the stack of three time steps...
> Yes, thank you Cheng for your reply. When you tried taking only 2 blocks at a time, i.e. time step 2 for t = 1, did you get better...
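If the concern is that the convolution window reaches into future cycles, one common remedy is a causal (left-padded) convolution, so each output position only depends on the current and earlier time steps. The sketch below is a generic PyTorch illustration with placeholder channel counts and kernel size, not the layer actually used in this repository.

```python
import torch
import torch.nn as nn

class CausalConv1d(nn.Module):
    """One way to avoid convolving over future time steps: left-pad the sequence by
    (kernel_size - 1) so each output only sees current and past inputs.
    Generic sketch only; channel counts and kernel size are placeholders."""
    def __init__(self, in_ch, out_ch, kernel_size=3):
        super().__init__()
        self.pad = kernel_size - 1
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size)

    def forward(self, x):                        # x: (batch, channels, time)
        x = nn.functional.pad(x, (self.pad, 0))  # pad only on the left (past side)
        return self.conv(x)

layer = CausalConv1d(14, 32, kernel_size=3)
print(layer(torch.randn(8, 14, 30)).shape)       # torch.Size([8, 32, 30])
```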
> Excuse me, is there any way to draw "example with unit 43 in FD001.png"?

Simply get the prediction corresponding to each cycle and then plot them.
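For reference, a minimal plotting sketch along those lines might look like the following; `plot_unit_rul`, the dummy arrays, and the 125-cycle cap are illustrative placeholders for whatever per-cycle predictions you collect from the model, not code from this repository.

```python
import numpy as np
import matplotlib.pyplot as plt

def plot_unit_rul(rul_pred, rul_true=None, title="Unit 43, FD001"):
    """Plot per-cycle RUL predictions (and ground truth if available) for one engine unit."""
    cycles = np.arange(1, len(rul_pred) + 1)
    if rul_true is not None:
        plt.plot(cycles, rul_true, label="True RUL")
    plt.plot(cycles, rul_pred, label="Predicted RUL")
    plt.xlabel("Cycle")
    plt.ylabel("RUL")
    plt.title(title)
    plt.legend()
    plt.show()

# Example with dummy data: a piecewise-linear true RUL capped at 125 and noisy predictions.
true_rul = np.minimum(125, np.arange(200, 0, -1))
plot_unit_rul(true_rul + np.random.randn(200) * 5, true_rul)
```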
> There is a problem with the data processing in this code. If the results in the paper are also processed in this way, it is impossible to get the...
> Hello, this is the first time I have asked a question on GitHub, so please forgive me if I offend you. Why is my RMSE more than 40? I hope you...
> Thank you for your answers. My hyperparameters are the ones you originally set. I can use other methods to get an RMSE of around 20. Because I am just getting started, it...
Hi Tommy! This Transformer-based model only uses the encoder part of the Transformer; see the reference paper for the detailed model structure. In NLP applications of the Transformer, such as translation, a decoder is usually needed to reconstruct the features extracted by the encoder into a structure similar to the input, but in this application we only need the encoder to turn the input into features and then a fully connected layer to output the RUL.
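As a rough illustration of that encoder-plus-fully-connected structure (assuming a PyTorch-style implementation; the class name, layer sizes, window length, and the omission of positional encoding are placeholders rather than the repository's actual settings):

```python
import torch
import torch.nn as nn

class EncoderOnlyRUL(nn.Module):
    """Illustrative encoder-only regressor: a Transformer encoder extracts features
    from a window of sensor readings, and a fully connected head maps them to a
    scalar RUL. Positional encoding is omitted for brevity; sizes are placeholders."""
    def __init__(self, n_features=14, d_model=64, n_heads=4, n_layers=2, window=30):
        super().__init__()
        self.input_proj = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=128,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(d_model * window, 1))

    def forward(self, x):                  # x: (batch, window, n_features)
        z = self.encoder(self.input_proj(x))
        return self.head(z).squeeze(-1)    # (batch,) predicted RUL

model = EncoderOnlyRUL()
dummy = torch.randn(8, 30, 14)             # 8 windows of 30 cycles x 14 sensors
print(model(dummy).shape)                  # torch.Size([8])
```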
> Hello, does running train.py return the RMSE on the test set? Why do I get 30+ after 20 epochs? Also, do we only need the training and test sets, without a validation set?

Hello, in train.py the model is evaluated on the test set after every epoch, so the reported value is the test-set RMSE. The test results in the README are only example output and do not represent what you will obtain; the better results shown there include my own innovation, which cannot be made public, while the public part is a reproduction of a published paper. A validation set is normally taken from the training set to check the model's performance during training; you can generate your own validation set and disable the test-set evaluation after each training epoch, but since the dataset is small this is not really necessary. Please experiment with it yourself if needed. This project reproduces a previous paper; if the results are not satisfactory, advice is welcome and I would be happy to collaborate on improvements.
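If you do want a validation set, one reasonable approach is to hold out whole engine units from the training set so that no unit appears in both splits. The sketch below is only an illustration; `split_by_unit`, `unit_ids`, and the 20% fraction are hypothetical names and values, not part of this project's code.

```python
import numpy as np

def split_by_unit(unit_ids, val_fraction=0.2, seed=0):
    """Return boolean masks selecting training vs. validation windows,
    splitting by engine unit so no unit appears in both sets.
    `unit_ids` is a hypothetical 1-D array giving the unit id of each training window."""
    rng = np.random.default_rng(seed)
    units = np.unique(unit_ids)
    val_units = rng.choice(units, size=int(len(units) * val_fraction), replace=False)
    val_mask = np.isin(unit_ids, val_units)
    return ~val_mask, val_mask

# Example: 5000 windows drawn from 100 engines.
unit_ids = np.random.randint(1, 101, size=5000)
train_mask, val_mask = split_by_unit(unit_ids)
print(train_mask.sum(), val_mask.sum())
```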
> May I also ask how the hyperparameters for the public part are set, and what the best result is?

Hello, the best result for the public part is an RMSE of 21.06. The hyperparameter settings are basically those in the current code, but they require repeated experimentation.
> It seems the training is not parallelized; each step trains on only one sample.

What do you mean by parallelization?
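If "parallelization" means mini-batching, the usual approach is to feed many windows per step through a DataLoader with batch_size > 1. The tensors and sizes below are random placeholders, not this project's data pipeline.

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Illustrative only: process many training windows per forward pass instead of
# one sample at a time.
X = torch.randn(5000, 30, 14)   # 5000 windows of 30 cycles x 14 sensors
y = torch.rand(5000) * 125      # capped RUL targets

loader = DataLoader(TensorDataset(X, y), batch_size=256, shuffle=True)
for xb, yb in loader:            # each iteration processes 256 windows at once
    print(xb.shape, yb.shape)    # torch.Size([256, 30, 14]) torch.Size([256])
    break
```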