Contrastive-Predictive-Coding-PyTorch

Contrastive Predictive Coding for Automatic Speaker Verification

Issues (11)

In the implementation, a single random position in the sequence is used to compute the NCE loss. The paper mentions that the GRU output at each step is used to predict...
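
As context for the question above, here is a minimal sketch (not the repository's exact code) of the "single random anchor position" strategy: one position t is sampled per forward pass and the next few latents are predicted from the single context vector c_t. All tensor names and sizes below are illustrative.

```python
import torch
import torch.nn as nn

# Illustrative sizes, not taken from the repository
batch, seq_len, dim, timestep = 8, 128, 256, 12

z = torch.randn(batch, seq_len, dim)   # encoder outputs z_1 .. z_T
c = torch.randn(batch, seq_len, dim)   # autoregressive (GRU) outputs c_1 .. c_T

# Sample ONE anchor position t per forward pass, then predict the next
# `timestep` latents from the single context vector c_t.
t = torch.randint(0, seq_len - timestep, (1,)).item()
c_t = c[:, t, :]                                               # (batch, dim)

# One linear predictor W_k per future offset k (hypothetical names)
predictors = nn.ModuleList([nn.Linear(dim, dim) for _ in range(timestep)])
preds = torch.stack([W(c_t) for W in predictors])              # (timestep, batch, dim)
targets = z[:, t + 1 : t + 1 + timestep, :].permute(1, 0, 2)   # (timestep, batch, dim)
```

Using every step instead, as the issue suggests the paper does, would mean repeating this prediction for each valid t rather than for a single sampled one.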

The paper does not mention the use of Batch Normalization for the audio task. For the vision task, it mentions that "We did...

In the calculation of the NCE loss, the softmax is not given an explicit dimension, and by default PyTorch uses dim=1 for 2D input. The loss in...
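
To make the dimension question concrete, here is a small, hedged illustration (not the repository's code): with a batch x batch score matrix whose diagonal entries are the positive pairs, the `dim` argument decides whether each row or each column is normalized, and PyTorch only picks dim=1 implicitly for 2D input (with a deprecation warning).

```python
import torch
import torch.nn.functional as F

batch, dim = 8, 256
pred = torch.randn(batch, dim)    # prediction W_k(c_t) for each item in the batch
z_k = torch.randn(batch, dim)     # true future latent for each item in the batch

# Entry (i, j) scores prediction i against candidate j, so the
# positive pairs sit on the diagonal.
scores = torch.mm(pred, z_k.t())              # (batch, batch)

# Being explicit about dim avoids relying on the implicit default.
log_probs = F.log_softmax(scores, dim=1)      # normalize over candidates
nce = -torch.diag(log_probs).mean()           # InfoNCE term for this offset k
```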

I see in your implementation that you feed the entire signal into the encoder, while the paper notes that each timestep should be fed separately. When you feed the entire...
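
One relevant observation here: the encoder described in the CPC paper is a stack of strided 1-D convolutions, so running it over the whole waveform at once produces the same frame sequence as encoding each window separately, up to edge padding. Below is a hedged sketch of such an encoder (filter sizes and strides follow the paper; everything else is illustrative, not the repository's code).

```python
import torch
import torch.nn as nn

# CPC-style strided convolutional encoder: downsampling factor 5*4*2*2*2 = 160
encoder = nn.Sequential(
    nn.Conv1d(1, 512, kernel_size=10, stride=5, padding=3), nn.ReLU(),
    nn.Conv1d(512, 512, kernel_size=8, stride=4, padding=2), nn.ReLU(),
    nn.Conv1d(512, 512, kernel_size=4, stride=2, padding=1), nn.ReLU(),
    nn.Conv1d(512, 512, kernel_size=4, stride=2, padding=1), nn.ReLU(),
    nn.Conv1d(512, 512, kernel_size=4, stride=2, padding=1), nn.ReLU(),
)

wav = torch.randn(8, 1, 20480)   # 1.28 s of 16 kHz audio per batch item
z = encoder(wav)                 # (8, 512, 128): one latent vector every 160 samples
```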

Dear Jeff, thank you so much for providing this great repository! I sincerely appreciate your implementation. However, after reading all the closed issues and trying to start the training,...

I saw that list files such as "LibriSpeech/list/train.txt" are required parameters for `main.py`. It seems such files are not provided officially by LibriSpeech. What is their format? Could you...

At [Line 310](https://github.com/jefflai108/Contrastive-Predictive-Coding-PyTorch/blob/a9dab4e759aaa68dce1b1ada46a8035076ba3296/src/model/model.py#L310), you have the following code:

```python
output, hidden = self.gru(forward_seq, hidden)  # output size e.g. 8*100*256
c_t = output[:, t_samples, :].view(batch, 256)  # c_t e.g. size 8*256
```

So...

https://github.com/jefflai108/Contrastive-Predictive-Coding-PyTorch/blob/a9dab4e759aaa68dce1b1ada46a8035076ba3296/src/model/model.py#L100

Meaning, should that `correct` variable be accumulated with `+=`?

Hello @jefflai108, I think the accuracy should also have `batch*self.timestep` as its denominator at https://github.com/jefflai108/Contrastive-Predictive-Coding-PyTorch/blob/a9dab4e759aaa68dce1b1ada46a8035076ba3296/src/model/model.py#L320 Possibly this applies to other models too, although I did not check them.
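
Following up on the two accuracy questions above, here is a hedged sketch of what accumulating correct predictions over all offsets could look like; `scores`, `correct`, and `timestep` are placeholder names rather than the repository's exact variables.

```python
import torch

batch, timestep = 8, 12

# Hypothetical per-offset score matrices, shape (timestep, batch, batch);
# row i of scores[k] scores prediction i against all candidates, with the
# correct candidate at index i (the diagonal).
scores = torch.randn(timestep, batch, batch)

correct = 0
for k in range(timestep):
    pred_idx = torch.argmax(torch.softmax(scores[k], dim=1), dim=1)  # (batch,)
    correct += (pred_idx == torch.arange(batch)).sum().item()        # accumulate with +=

# If `correct` is summed over every offset, the denominator must be
# batch * timestep rather than batch alone.
accuracy = correct / (batch * timestep)
```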

I don't understand how `self.softmax()` gets updated during training, since the accuracy is not backpropagated. How does it work?
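
For what it is worth, a minimal sketch of why this still trains (names are illustrative, not the repository's code): the parameters are updated through the NCE loss built from log-softmax scores, while softmax itself has no learnable parameters and the accuracy is only a monitoring metric computed from a non-differentiable argmax.

```python
import torch
import torch.nn.functional as F

batch, dim = 8, 256
W_k = torch.nn.Linear(dim, dim)          # predictor whose weights should be learned
c_t = torch.randn(batch, dim)
z_k = torch.randn(batch, dim)

scores = torch.mm(W_k(c_t), z_k.t())     # (batch, batch), positives on the diagonal

# The loss is what backpropagates into W_k (and, in the full model, into the
# encoder and GRU). Softmax is just a function; there is nothing in it to update.
loss = -torch.diag(F.log_softmax(scores, dim=1)).mean()
loss.backward()

# Accuracy is a side metric: argmax is not differentiable and no gradient is
# needed, so it is computed without any backward pass.
with torch.no_grad():
    acc = (scores.argmax(dim=1) == torch.arange(batch)).float().mean()
```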