Kaan Bey

12 comments by Kaan Bey

The Quadstar should be printed in a flexible material such as NinjaFlex, so that you can pull and stretch it to fit the octaring; there is no need to print it bigger.

https://github.com/Kyubyong/tacotron/blob/master/networks.py#L38 has bn=False, so I believe no batch normalization is applied for that layer. Am I missing something?
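
For reference, a minimal sketch (TensorFlow 1.x style, not the repo's exact module) of the pattern I mean: a 1-D convolution wrapper where batch normalization is only applied when bn=True, so bn=False leaves the layer without it.

```python
# Hypothetical sketch, not Kyubyong/tacotron's actual conv1d module.
import tensorflow as tf

def conv1d_block(inputs, filters, kernel_size, bn=False, is_training=True, scope="conv1d_block"):
    """1-D convolution with optional batch normalization (assumed TF 1.x API)."""
    with tf.variable_scope(scope):
        out = tf.layers.conv1d(inputs, filters=filters, kernel_size=kernel_size,
                               padding="same", activation=None)
        if bn:  # bn=False skips this branch, i.e. no batch norm for the layer
            out = tf.layers.batch_normalization(out, training=is_training)
        return tf.nn.relu(out)
```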

I trained on a single wav file, used 2 identical entries in the list, changed the dropouts to 1.0 and the learning rate to 0.01. Trained for 1350k steps (I think it was...

The silence was at the beginning of the file, not at the end. I believe it is messy because you are using a dropout of 0.5 and a learning rate of 0.0001,...

@Durham: When training on a single file, I had to put the same line twice into text.txt and change batch_size to 1 (a sketch of the setup is below). Did you do the...
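
To make that single-file setup concrete, here is a hedged sketch of what I changed; the transcript format and the hyperparameter names are assumptions, so map them onto the repo's actual text.txt and hyperparams file.

```python
# Hedged sketch of the single-file overfitting setup; names are assumptions.

# 1) Duplicate the single transcript line so the loader has at least two entries.
with open("text.txt") as f:
    line = f.readline().strip()
with open("text.txt", "w") as f:
    f.write(line + "\n")
    f.write(line + "\n")

# 2) Hyperparameter values I used for the overfitting experiment
#    (set these in the repo's hyperparameter file, whatever it names them):
OVERRIDES = {
    "batch_size": 1,   # only the one (duplicated) sample
    "dropout": 1.0,    # i.e. effectively no dropout during this experiment
    "lr": 0.01,        # much higher than the default 0.0001
}
```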

Trained to 0.06 loss; you can download a sample wav from this link: https://we.tl/HE9sOliX4W

@basuam I trained on a single wav file, used 2 identical entries in the list, changed the dropouts to 1.0 and the learning rate to 0.01. Trained for 1350k steps (I think it...

TensorFlow's Saver class seems to have been updated since the release of this project. Change saver = tf.train.Saver() to saver = tf.train.Saver(write_version=tf.train.SaverDef.V1); it's in src/optimize.py. #78
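
A minimal sketch of the change, assuming TensorFlow 1.x (the dummy variable is only there so the Saver has something to save):

```python
import tensorflow as tf

w = tf.Variable(tf.zeros([1]), name="w")  # dummy variable so the Saver is valid

# Before (defaults to the newer V2 checkpoint format):
# saver = tf.train.Saver()

# After: force the legacy V1 checkpoint format so the existing loading code still works
saver = tf.train.Saver(write_version=tf.train.SaverDef.V1)
```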

I opened an issue regarding the same problem, and I was told that in evaluation, unlike training, the output of the decoder is fed back as the decoder input of the next...
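
A toy sketch of what that means in practice (not the repo's code): during training the decoder receives the ground-truth previous frame (teacher forcing), while at evaluation it has to feed its own previous prediction back in, so errors can accumulate even when the training loss looks good.

```python
# Toy illustration of teacher forcing vs. autoregressive evaluation.
import numpy as np

def decoder_step(prev_frame, state):
    """Stand-in for one decoder step; the real model is assumed."""
    out = np.tanh(prev_frame * 0.9 + state * 0.1)  # dummy computation
    return out, out  # (predicted frame, new state)

def run_decoder(targets, teacher_forcing):
    state = np.zeros_like(targets[0])
    prev = np.zeros_like(targets[0])  # <GO> frame
    outputs = []
    for target in targets:
        pred, state = decoder_step(prev, state)
        outputs.append(pred)
        # training: next input is the ground-truth frame
        # evaluation: next input is the frame the decoder just predicted
        prev = target if teacher_forcing else pred
    return outputs

frames = [np.ones(4) * 0.5 for _ in range(5)]
train_out = run_decoder(frames, teacher_forcing=True)   # training-style decoding
eval_out = run_decoder(frames, teacher_forcing=False)   # evaluation-style decoding
```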