RNN_Text_Generation_Tensorflow
Weird generation
Hi,
First, thanks for your work :) Second, I tried to use your code without any changes for test purposes, but the result isn't good. The output of python rnn_tf.py saved\model.ckpt "The " is weird, garbled text.
Is this a normal result for the default configuration, or not?
Btw, one time I cleared the saved folder, and after training the file model.ckpt wasn't created, only the checkpoint/index/meta/data files. Is that the right behavior, or did I miss something?
My env: Windows 10 x64, Python 3.5.2, TensorFlow GPU 1.1.0 (works with a GeForce GTX 950M)
The tf.train.Saver() method doesn't work correctly on Windows for restoring or saving models. If you train the network from scratch on your Windows machine you'll see good text, but you won't be able to save the model once training finishes.
Thanks for the response, @therealjtgill. So there is no way to make it work on Windows?
I tried to pass write_version=tf.train.SaverDef.V1 to tf.train.Saver. After that, model.ckpt is created, but it seems that the restore function doesn't actually restore anything...
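For reference, here is roughly what I tried, as a minimal sketch (the variable w is just a stand-in for the real network weights):

import tensorflow as tf

# Force the legacy V1 checkpoint format, which writes a single
# model.ckpt file instead of the V2 index/data shards.
w = tf.Variable(tf.zeros([10]), name="w")  # stand-in for the real weights
saver = tf.train.Saver(write_version=tf.train.SaverDef.V1)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.save(sess, "saved/model.ckpt")
    saver.restore(sess, "saved/model.ckpt")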
Also I tried something like this (V2):
import os
import tensorflow as tf

checkpoint_dir = 'saved/'
cwd = os.getcwd()
ckpt = tf.train.get_checkpoint_state(checkpoint_dir)  # checkpoint metadata
# Rebuild the graph from the .meta file, then restore the latest checkpoint
saver = tf.train.import_meta_graph(cwd + "/saved/model.ckpt.meta")
saver.restore(sess, tf.train.latest_checkpoint(cwd + "/saved/"))
All the calls complete without errors, but the result is still the same.
I'm sorry, I've been out of the country for a while. Did you manage to solve your problem?
@spiglerg Hi. I haven't found a working solution yet...
I had the same error. I fixed it by dumping the vocab to disk and reloading it. The error comes from vocab = list(set(data_)), e.g. on the first run vocab = [1,2,3,4], on the second run vocab = [2,1,4,3].
I spent hours trying to debug why loading a model doesn't work (I am on Windows), but the solution was simple.
When you create a set() from data_, you have to sort it, or it will produce a different order every time you run the script.
vocab = sorted(list(set(data_)))
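A minimal sketch of the failure mode (assuming data_ is the training text loaded as a string):

data_ = "hello world"  # stand-in for the training text

# set() order depends on hash randomization, so two runs of the script
# can map the same characters to different indices:
vocab = list(set(data_))
char_to_idx = {c: i for i, c in enumerate(vocab)}

# A model trained under one mapping gets decoded with a different one
# after a restart, so the restored checkpoint generates gibberish even
# though saving and restoring themselves worked fine.

vocab = sorted(set(data_))  # deterministic order on every run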
Hmm, sounds like an interesting observation! I've committed the change to the repository, thank you very much for pointing it out! :)