
About Learning Rate and Training Data

Open ndisci opened this issue 9 months ago • 2 comments

Hello,

Thanks for this nice work. I have some questions. First, when I used TensorBoard to monitor the training curves, I noticed that the learning rate never changed. Why do you use a constant learning rate instead of learning rate decay? Is there an advantage to a constant learning rate? I couldn't find any explanation of this in your paper. I am training the SpecRNet model.
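For reference, a framework-agnostic sketch of what a decaying schedule would look like, modeled on the cosine annealing used by PyTorch's `CosineAnnealingLR` (the parameter values here are illustrative, not taken from the repo's config):

```python
import math

def cosine_decay(base_lr, epoch, total_epochs, min_lr=0.0):
    """Cosine-annealed learning rate, as computed by PyTorch's
    CosineAnnealingLR: starts at base_lr, falls to min_lr at total_epochs."""
    return min_lr + 0.5 * (base_lr - min_lr) * (
        1 + math.cos(math.pi * epoch / total_epochs)
    )

# A constant schedule (what the training curves above show) keeps
# lr == base_lr at every epoch; the decayed schedule falls smoothly.
base_lr, total_epochs = 1e-4, 10
schedule = [cosine_decay(base_lr, e, total_epochs) for e in range(total_epochs + 1)]
print(schedule[0], schedule[-1])  # base_lr at epoch 0, min_lr at the end
```

In PyTorch itself the equivalent would be wrapping the optimizer in `torch.optim.lr_scheduler.CosineAnnealingLR` and calling `scheduler.step()` once per epoch.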

My second question is about the spoof and bonafide data. How much data, or how many hours of spoof and bonafide audio, do you actually use?
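In case it helps anyone measuring this themselves, a minimal sketch for totaling per-class hours, assuming a hypothetical manifest CSV with `label` (spoof/bonafide) and `duration_sec` columns (neither the file nor the column names come from this repo):

```python
import csv
from collections import defaultdict

def hours_per_label(manifest_path):
    """Sum audio duration per class from a manifest CSV with columns
    'label' and 'duration_sec' (hypothetical format), returning hours."""
    totals = defaultdict(float)
    with open(manifest_path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["label"]] += float(row["duration_sec"]) / 3600.0
    return dict(totals)
```

Running this over the training manifest would give the spoof/bonafide hour counts the question asks about.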

Thanks for your time.

ndisci, May 24 '24 11:05