RNN_Text_Generation_Tensorflow
About the sentence result
I notice that the longer I train (the more batches), the better the generated sentences become, but I find that some generated sentences are exactly the same as sentences in my training corpus. Is that possible? I'm wondering whether the model genuinely generates sentences like that or simply 'copies' them.
That sounds like overfitting. You may try adding a regularizer to the network's weights, decreasing the number of parameters, or increasing the size of the training set.
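For anyone wanting a concrete starting point, here is a minimal sketch of what that suggestion could look like in tf.keras, using dropout on the LSTM and an L2 penalty on its weights. The values of vocab_size, embedding_dim, dropout_rate, and l2_strength are illustrative assumptions, not values from this repository; the repo's own lower-level TensorFlow code would need the equivalent changes applied in place.

```python
import tensorflow as tf

# Hypothetical values -- tune these for your own corpus and hardware.
vocab_size = 65      # number of distinct characters/tokens in the corpus
embedding_dim = 64
lstm_size = 256
dropout_rate = 0.3   # fraction of units dropped during training
l2_strength = 1e-4   # weight penalty; larger values regularize more strongly

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embedding_dim),
    # Dropout on the inputs and recurrent connections, plus an L2 penalty on
    # the weights, discourages the network from memorizing training sentences.
    tf.keras.layers.LSTM(
        lstm_size,
        return_sequences=True,
        dropout=dropout_rate,
        recurrent_dropout=dropout_rate,
        kernel_regularizer=tf.keras.regularizers.l2(l2_strength),
    ),
    tf.keras.layers.Dense(vocab_size),
])

model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
```

Dropout and the L2 penalty both push the network away from reproducing training sentences verbatim; enlarging the training set, as suggested above, helps independently of either.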
@spiglerg I am facing an overfitting issue. As you said, "adding a regularizer to the network's weights or decrease the number of parameters (or increase the size of the training set)": are you referring to increasing or decreasing the values below? If so, please give the exact values.
```python
lstm_size = 256   # 128
num_layers = 2
batch_size = 128  # 128
time_steps = 100  # 50
```
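For illustration only (these are not prescribed values): of the four settings quoted above, only lstm_size and num_layers change the number of trainable parameters; batch_size and time_steps control how the data is batched and unrolled during training. A hypothetical reduction might look like this:

```python
# Illustrative values only, not a recommendation for this repository.
lstm_size = 128    # smaller hidden state -> fewer parameters per LSTM layer
num_layers = 1     # fewer stacked LSTM layers -> fewer parameters overall
batch_size = 128   # unchanged; affects batching, not the parameter count
time_steps = 50    # shorter unroll length; also does not change the parameter count
```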