char-rnn-tensorflow
Multi-layer Recurrent Neural Networks (LSTM, RNN) for character-level language models in Python using TensorFlow
There should be an option to add a bidirectional recurrent neural network using the three core RNN cells.
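A bidirectional layer runs one cell over the sequence left-to-right and a second cell right-to-left, then concatenates the two output streams per time step. A framework-free numpy sketch of that wiring (the toy cell is an assumption; in TensorFlow 1.x this is roughly what `tf.nn.bidirectional_dynamic_rnn` does):

```python
import numpy as np

seq_length, rnn_size = 4, 3

def run_cell(inputs, state):
    """Toy recurrent cell: returns the per-step outputs for a sequence."""
    outputs = []
    for x in inputs:
        state = np.tanh(x + state)
        outputs.append(state)
    return outputs

inputs = [np.random.randn(rnn_size) for _ in range(seq_length)]
zero = np.zeros(rnn_size)

fw = run_cell(inputs, zero)              # forward pass
bw = run_cell(inputs[::-1], zero)[::-1]  # backward pass, re-aligned to forward time

# Each time step now sees context from both directions.
outputs = [np.concatenate([f, b]) for f, b in zip(fw, bw)]
print(outputs[0].shape)  # (6,) == 2 * rnn_size
```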
Hi, may I ask what lines 115-117 in `train.py` do?

```python
for i, (c, h) in enumerate(model.initial_state):
    feed[c] = state[i].c
    feed[h] = state[i].h
```

In the previous line, doesn't the...
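For context, that loop builds a feed dict mapping each layer's state placeholders (`c`, `h`) to the numeric state returned by the previous `sess.run`. A framework-free sketch of the same bookkeeping (the `LSTMStateTuple` stand-in and placeholder names are assumptions):

```python
from collections import namedtuple
import numpy as np

# Stand-in for tf.nn.rnn_cell.LSTMStateTuple
LSTMStateTuple = namedtuple("LSTMStateTuple", ["c", "h"])

num_layers, batch_size, rnn_size = 2, 4, 8

# model.initial_state: one (c, h) pair of placeholders per layer.
# Here strings stand in for the placeholder tensors.
initial_state = [LSTMStateTuple(c=f"c_{i}", h=f"h_{i}") for i in range(num_layers)]

# state: numeric values returned by the previous sess.run of model.final_state.
state = [LSTMStateTuple(c=np.zeros((batch_size, rnn_size)),
                        h=np.zeros((batch_size, rnn_size)))
         for _ in range(num_layers)]

feed = {}
for i, (c, h) in enumerate(initial_state):
    feed[c] = state[i].c  # feed layer i's cell state into its placeholder
    feed[h] = state[i].h  # feed layer i's hidden state into its placeholder

print(sorted(feed))  # ['c_0', 'c_1', 'h_0', 'h_1']
```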
In the training phase, `self.initial_state` is set to `cell.zero_state` and the `last_state` of the last layer is kept:

```python
self.initial_state = cell.zero_state(args.batch_size, tf.float32)
outputs, last_state = legacy_seq2seq.rnn_decoder(inputs, self.initial_state, cell, loop_function=loop if...
```
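The pattern being described: the zero state seeds the first batch, and the decoder's `last_state` is carried into the next batch's feed. A minimal numpy sketch of that state-carrying loop (the toy `step` function is an assumption, not the repo's decoder):

```python
import numpy as np

batch_size, rnn_size = 4, 8

def step(x, state):
    """Toy recurrent update standing in for rnn_decoder: one tanh RNN step."""
    return np.tanh(x + state)

# self.initial_state = cell.zero_state(...)
state = np.zeros((batch_size, rnn_size))

for batch in range(3):
    x = np.random.randn(batch_size, rnn_size)
    last_state = step(x, state)  # outputs/last_state from the decoder
    state = last_state           # carried into the next batch's feed dict

print(state.shape)  # (4, 8)
```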
This error is caused by the line `output = tf.reshape(tf.concat(1, outputs), [-1, rnn_size])`. I want to know why a list can be concatenated with an int, and why it is done this way.
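In TF < 1.0, `tf.concat` took the axis as its first argument: `tf.concat(1, outputs)` concatenates the list `outputs` along axis 1 (in TF >= 1.0 the order flipped to `tf.concat(outputs, 1)`, which is likely the source of the error). What the line computes, sketched in numpy:

```python
import numpy as np

batch_size, seq_length, rnn_size = 2, 3, 4

# outputs: one [batch_size, rnn_size] array per time step, as rnn_decoder returns.
outputs = [np.full((batch_size, rnn_size), t, dtype=np.float32)
           for t in range(seq_length)]

# tf.concat(1, outputs) in old TF == concatenate along axis 1.
concat = np.concatenate(outputs, axis=1)  # [batch_size, seq_length * rnn_size]
output = concat.reshape(-1, rnn_size)     # [batch_size * seq_length, rnn_size]

print(output.shape)  # (6, 4): one row per (batch, time step), ready for the softmax layer
```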
### Print sample from a model

Before this bugfix:

> \xd1\x84\xd0\xb8\xd0\xb3\

After this bugfix:

> Кто маленький Кролик
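The garbled output is a UTF-8 byte sequence printed as escaped bytes instead of being decoded to text. A minimal illustration (the byte string here is an example, not taken from the issue):

```python
# Escaped UTF-8 bytes for the Cyrillic word "Кто"
raw = b"\xd0\x9a\xd1\x82\xd0\xbe"

print(repr(raw))            # b'\xd0\x9a\xd1\x82\xd0\xbe' -- what the bug printed
print(raw.decode("utf-8"))  # Кто -- what the fix prints
```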
Hi, there are lines like these in the `train.py` file:

```python
train_loss, state, _ = sess.run([model.cost, model.final_state, model.train_op], feed)
summ, train_loss, state, _ = sess.run([summaries, model.cost, model.final_state, model.train_op], feed)
```

That means we...
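For context: a single `sess.run` with a combined fetch list evaluates the graph once and returns all fetches from that one pass, whereas two separate calls would execute the graph (and `train_op`) twice. A toy stand-in session (an assumption, not TF's API) illustrating the difference:

```python
class FakeSession:
    """Counts how many times the graph is executed."""
    def __init__(self):
        self.passes = 0

    def run(self, fetches, feed=None):
        self.passes += 1  # one forward/backward pass per call
        return [f"value of {f}" for f in fetches]

sess = FakeSession()

# One combined call: all fetches come from the same pass.
summ, train_loss, state, _ = sess.run(["summaries", "cost", "final_state", "train_op"])
print(sess.passes)  # 1

# Two separate calls would execute the graph a second time.
sess.run(["cost", "final_state", "train_op"])
sess.run(["summaries"])
print(sess.passes)  # 3
```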
Running `sample.py` with a unicode prime fails with something in the spirit of:

```
sample.py: error: argument --prime: invalid unicode value: [...]
```

Using the approach suggested in https://stackoverflow.com/a/22947334/596167 appears...
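The linked answer's fix is to give `--prime` a `type` callable that decodes the raw argument as UTF-8 rather than relying on the default. A Python 3 sketch of the same idea (the `utf8_arg` helper name is an assumption):

```python
import argparse

def utf8_arg(s):
    """Accept the argument as text, decoding raw bytes as UTF-8 if needed."""
    return s.decode("utf-8") if isinstance(s, bytes) else s

parser = argparse.ArgumentParser()
parser.add_argument("--prime", type=utf8_arg, default=" ")

args = parser.parse_args(["--prime", "Кто"])
print(args.prime)  # Кто
```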
I want to see what the activations are for individual neurons of a given layer for a given input character. Any suggestions?
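In TensorFlow terms this means adding the layer's output tensor to the `sess.run` fetch list and indexing into the returned array. A framework-free sketch of what "activations of individual neurons for one input character" looks like (the toy RNN cell is an assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, rnn_size = 5, 8

W_x = rng.standard_normal((vocab_size, rnn_size)) * 0.1
W_h = rng.standard_normal((rnn_size, rnn_size)) * 0.1

def rnn_step(char_id, h):
    """One tanh RNN step; the returned vector IS the layer's activations."""
    x = np.eye(vocab_size)[char_id]  # one-hot input character
    return np.tanh(x @ W_x + h @ W_h)

h = np.zeros(rnn_size)
h = rnn_step(2, h)  # feed one character

print(h.shape)  # (8,) -- one activation per neuron in the layer
print(h[3])     # activation of neuron 3 for this input character
```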
`create_batches` in `TextLoader` (`utils.py`) doesn't seem to transform the data into batches correctly.
The following lines transform `xdata` to tensors with the correct dimensions, but the output data are no longer in the correct order:

```python
self.x_batches = np.split(xdata.reshape(self.batch_size, -1), self.num_batches, 1)
self.y_batches...
```
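What the reshape/split actually produces, on a toy sequence (the sizes are assumptions). Each batch holds `batch_size` parallel streams, so within a batch the rows are not contiguous in the original text; this appears intentional for stateful training, since row `r` of batch `k+1` continues row `r` of batch `k`:

```python
import numpy as np

batch_size, num_batches, seq_length = 2, 3, 2
xdata = np.arange(batch_size * num_batches * seq_length)  # [0..11]

# reshape -> [[0 1 2 3 4 5], [6 7 8 9 10 11]], then split along axis 1.
x_batches = np.split(xdata.reshape(batch_size, -1), num_batches, 1)

for b in x_batches:
    print(b)
# [[0 1]    [[2 3]    [[4 5]
#  [6 7]]    [8 9]]    [10 11]]
```

Row 0 reads 0,1 then 2,3 then 4,5 across successive batches, so each stream stays in order even though a single batch mixes distant parts of the text.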
What if I'd like to use `attention_decoder` instead of `rnn_decoder`? I wonder how to modify

```python
outputs, last_state = seq2seq.rnn_decoder(inputs, self.initial_state, cell, loop_function=loop if infer else None, scope='rnnlm')
```

What should `attention_states`...
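For background, `attention_decoder` additionally expects `attention_states`, a `[batch_size, attn_length, attn_size]` tensor of states to attend over at each decoding step. A numpy sketch of dot-product attention over such states (a simplification of the actual mechanism; the shapes are assumptions):

```python
import numpy as np

batch_size, attn_length, attn_size = 2, 5, 4
rng = np.random.default_rng(1)

attention_states = rng.standard_normal((batch_size, attn_length, attn_size))
query = rng.standard_normal((batch_size, attn_size))  # current decoder state

# Score each attended state against the query, normalize, take weighted sum.
scores = np.einsum("bla,ba->bl", attention_states, query)
weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
context = np.einsum("bl,bla->ba", weights, attention_states)

print(weights.shape)  # (2, 5) -- one weight per attended position
print(context.shape)  # (2, 4) -- context vector mixed into the decoder step
```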