
Why is the activation of the GRU different between rnn_train.py and rnn_data.c?

Open carcloudfly opened this issue 6 years ago • 6 comments

Why is the activation of the GRU different between rnn_train.py and rnn_data.c?

Which activation is better, according to your research? Please advise!

carcloudfly avatar Jan 11 '19 01:01 carcloudfly

Can you explain what you mean?

jmvalin avatar Jan 11 '19 01:01 jmvalin

Can you explain what you mean?

I found that rnn_data.c shows the activation of vad_gru as relu, but in the model defined in rnn_train.py there is a tanh activation, like this:

const GRULayer vad_gru = { vad_gru_bias, vad_gru_weights, vad_gru_recurrent_weights, 24, 24, ACTIVATION_RELU };

# rnn_train.py
vad_gru = GRU(24, activation='tanh', recurrent_activation='sigmoid',
              return_sequences=True, name='vad_gru',
              kernel_regularizer=regularizers.l2(reg),
              recurrent_regularizer=regularizers.l2(reg),
              kernel_constraint=constraint, recurrent_constraint=constraint,
              bias_constraint=constraint)(tmp)

So I wonder: did you change the activation when you did your training?
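The mismatch being asked about can be made concrete with a minimal GRU step in plain numpy. This is a sketch of the standard GRU equations (biases omitted, shapes chosen for illustration), not RNNoise's actual C implementation: the candidate activation is the slot that rnn_train.py sets to tanh while rnn_data.c declares ACTIVATION_RELU.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(h, x, Wz, Wr, Wh, activation):
    """One GRU step with a configurable candidate activation.

    Weight matrices act on the concatenated [h, x] vector; biases are
    omitted for brevity. This follows the textbook GRU equations, not
    RNNoise's exact code.
    """
    hx = np.concatenate([h, x])
    z = sigmoid(Wz @ hx)           # update gate (recurrent_activation)
    r = sigmoid(Wr @ hx)           # reset gate
    rhx = np.concatenate([r * h, x])
    # candidate state: tanh in rnn_train.py, relu in rnn_data.c
    h_cand = activation(Wh @ rhx)
    return (1 - z) * h + z * h_cand

rng = np.random.default_rng(0)
n, m = 4, 3                        # hidden size, input size (illustrative)
Wz, Wr, Wh = (rng.standard_normal((n, n + m)) * 0.1 for _ in range(3))
h0, x0 = np.zeros(n), rng.standard_normal(m)

h_tanh = gru_step(h0, x0, Wz, Wr, Wh, np.tanh)
h_relu = gru_step(h0, x0, Wz, Wr, Wh, lambda v: np.maximum(v, 0.0))
print(h_tanh, h_relu)              # the two activations give different states
```

Running this shows the two choices produce different hidden states from the same weights, which is why weights trained under one activation cannot simply be run under the other.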

carcloudfly avatar Jan 11 '19 01:01 carcloudfly

And when I try to change the activation in rnn_train.py from tanh to relu, the loss gets bigger during training. What have I done wrong?
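One plausible reason the loss grows after the switch: tanh clamps the recurrent state to (-1, 1), while relu is unbounded, so with the same initialization and learning rate the state can grow across timesteps. A toy recurrence (deliberately given a recurrent gain above 1; this is not RNNoise's GRU) shows the effect:

```python
import numpy as np

# Iterate a bare recurrence h <- act(W h + x) with a recurrent gain
# deliberately above 1. Toy illustration only: tanh keeps the state in
# (-1, 1) no matter what, while relu passes the growth through, so a
# setup tuned for tanh can see activations (and the loss) blow up
# after switching the activation to relu.
n = 8
W = np.eye(n) * 1.5            # recurrent gain > 1 on every unit
x = np.ones(n)                 # constant positive input drive

def peak_state(act, steps=50):
    h = np.zeros(n)
    for _ in range(steps):
        h = act(W @ h + x)
    return float(np.max(np.abs(h)))

peak_tanh = peak_state(np.tanh)
peak_relu = peak_state(lambda v: np.maximum(v, 0.0))
print(f"tanh peak: {peak_tanh:.3f}, relu peak: {peak_relu:.3e}")
```

With tanh the state settles below 1; with relu it grows geometrically, which is consistent with a rising loss (and eventually NaN) when nothing else in the training setup is adjusted.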

carcloudfly avatar Jan 11 '19 01:01 carcloudfly

denoise_gru is also different. Is relu correct, according to the paper and rnn_data.c? https://arxiv.org/pdf/1709.08243.pdf

contribu avatar Apr 10 '19 07:04 contribu

denoise_gru is also different. Is relu correct, according to the paper and rnn_data.c? https://arxiv.org/pdf/1709.08243.pdf

After training for a long time, I found that relu is better.

carcloudfly avatar Apr 10 '19 07:04 carcloudfly

@jmvalin: The activation type in rnn_train.py for the GRU layers (noise, vad, and denoise) is tanh, whereas the actual rnn_data.c file in the src directory says RELU. Is there a mismatch between what's in the repository and the script used to generate those weights with RELU? Also, when I change the activation to relu in rnn_train.py, I get a NaN loss for most of the epochs. I'd appreciate your help. Thanks.
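Regarding the NaN loss with relu: the GRU call quoted earlier passes kernel_constraint, recurrent_constraint, and bias_constraint, i.e. the training script clips weights. Keeping the recurrent weights small bounds the per-step gain of an unbounded relu recurrence, which is the usual way such explosions are avoided. A minimal numpy sketch of that kind of clip (the bound 0.499 here is an illustrative choice, not necessarily the repository's exact value):

```python
import numpy as np

# Sketch of the kind of element-wise weight clip the `constraint`
# argument in the training script applies. Bounding each weight bounds
# the recurrent gain, which helps keep an unbounded relu state from
# exploding into a NaN loss.
def weight_clip(w, bound=0.499):
    """Clip every weight into [-bound, bound] (illustrative bound)."""
    return np.clip(w, -bound, bound)

rng = np.random.default_rng(2)
w = rng.standard_normal((24, 24))   # e.g. a 24x24 recurrent weight matrix
w_clipped = weight_clip(w)
print(np.abs(w).max(), np.abs(w_clipped).max())
```

If the constraints are dropped or weakened when experimenting with the activation, that alone could explain the difference between a merely rising loss and an outright NaN.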

sporwar-lifesize avatar Apr 09 '20 22:04 sporwar-lifesize