Jian Xiao
Hello, I have re-read the paper and would like to correct my second question: the power activation function is placed after the addition layer. Is a conventional activation function (such as ReLU) still needed? If so, where should it go? Also, is the shortcut taken directly after the addition layer, or after the power activation function and batch norm?
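To make the question concrete, here is a minimal NumPy sketch of the ordering being asked about. The power activation is assumed to have the form sign(x)·|x|^α (the paper's exact definition may differ), and a dense weight matrix stands in for the real conv/batch-norm layers:

```python
import numpy as np

# Hypothetical power activation: sign(x) * |x|^alpha.
# The paper's exact definition may differ; alpha < 1 compresses
# large magnitudes.
def power_activation(x, alpha=0.5):
    return np.sign(x) * np.abs(x) ** alpha

# Toy residual block illustrating one candidate ordering:
# shortcut branches off at the block input, the addition layer
# merges it back, and the power activation follows the addition.
def residual_block(x, w, alpha=0.5):
    shortcut = x                       # shortcut taken at the block input
    y = x @ w                          # stand-in for conv + batch norm
    y = y + shortcut                   # addition layer
    return power_activation(y, alpha)  # power activation after the addition

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
w = rng.standard_normal((8, 8))
out = residual_block(x, w)
print(out.shape)  # (4, 8)
```

Whether a ReLU should also appear inside the block (e.g. between the stand-in layer and the addition) is exactly the open question here; the sketch deliberately leaves it out.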
Sorry sir, I made a mistake in my code; your code is fine. Thanks for your reply.
Thanks for your reply. I have tried to build a channel state information set, which is used to generate different signals under different SNRs. But it doesn't work, and I have no effective...
I have starred your repository. I have added some code blocks based on your code:

```python
# channel state information set
num_channel = 100
H_real1 = np.zeros((N, M, num_channel))
H_imag1 = np.zeros((N, M, num_channel))
H_real2 = np.zeros((N, M, num_channel))
H_imag2 = np.zeros((N, M, num_channel))
for l in range(num_channel):
    ...
```
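Since the loop body above is truncated, here is a self-contained guess at its intent: filling the pre-allocated arrays with i.i.d. Rayleigh-fading realizations. The dimensions N and M are illustrative; the actual values come from the original notebook.

```python
import numpy as np

# Hypothetical dimensions; N, M and num_channel are defined elsewhere
# in the original code, the values here are illustrative only.
N, M = 8, 4
num_channel = 100

# Pre-allocate real/imaginary parts for two channel sets.
H_real1 = np.zeros((N, M, num_channel))
H_imag1 = np.zeros((N, M, num_channel))
H_real2 = np.zeros((N, M, num_channel))
H_imag2 = np.zeros((N, M, num_channel))

# One plausible loop body: draw each realization as a complex
# Gaussian CN(0, 1) matrix (Rayleigh-fading entries). The original
# loop is truncated, so this is an assumption about its intent.
rng = np.random.default_rng(0)
for l in range(num_channel):
    H_real1[:, :, l] = rng.standard_normal((N, M)) / np.sqrt(2)
    H_imag1[:, :, l] = rng.standard_normal((N, M)) / np.sqrt(2)
    H_real2[:, :, l] = rng.standard_normal((N, M)) / np.sqrt(2)
    H_imag2[:, :, l] = rng.standard_normal((N, M)) / np.sqrt(2)

H1 = H_real1 + 1j * H_imag1
print(H1.shape)  # (8, 4, 100)
```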
The SER is always 1/2 with this code, so it doesn't work.
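A SER stuck at 1/2 on a binary constellation usually means the detector output is uncorrelated with the transmitted symbols, i.e. random guessing. A quick sanity check of that baseline (the variable names here are illustrative, not from the original code):

```python
import numpy as np

# If the receiver's decisions are independent of the transmitted
# symbols, a binary detector is wrong about half the time.
rng = np.random.default_rng(1)
tx = rng.integers(0, 2, 10_000)         # transmitted binary symbols
rx_random = rng.integers(0, 2, 10_000)  # decisions unrelated to tx
ser = np.mean(tx != rx_random)
print(ser)  # close to 0.5
```

If the measured SER matches this baseline, the likely culprits are a mismatch between the channel used for training and testing, or a label/ordering bug in how the set is fed to the model.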
Thanks for your reply; your opinion is similar to my idea.