
Recurrent Neural Network Tutorial, Part 2 - Implementing a RNN in Python and Theano

19 rnn-tutorial-rnnlm issues

Bumps [tornado](https://github.com/tornadoweb/tornado) from 6.3.2 to 6.3.3. Changelog sourced from tornado's changelog; the release-notes index lists releases/v6.3.3, releases/v6.3.2, releases/v6.3.1, releases/v6.3.0, releases/v6.2.0, releases/v6.1.0, releases/v6.0.4, releases/v6.0.3, releases/v6.0.2, releases/v6.0.1, releases/v6.0.0, releases/v5.1.1, releases/v5.1.0...

dependencies

Bumps [certifi](https://github.com/certifi/python-certifi) from 2022.12.7 to 2023.7.22. Commits: 8fb96ed 2023.07.22; afe7722 Bump actions/setup-python from 4.6.1 to 4.7.0 (#230); 2038739 Bump dessant/lock-threads from 3.0.0 to 4.0.1 (#229); 44df761 Hash pin Actions and...

dependencies

I converted the code from Python 2 to Python 3 and modified the reading of the 'reddit-comment.csv' file to be compatible with Python 3. I also updated the 'requirements.txt' file.
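For context, a tiny sketch of the core difference this conversion deals with (the sample string below is made up): under Python 2, rows read from a file opened as 'rb' are byte strings and need an explicit decode before lowercasing, while under Python 3 the file can be opened in text mode with an explicit encoding so the rows are already `str`.

```python
# What a CSV field looks like under Python 2 when the file is opened as 'rb':
comment_bytes = b"This is a reddit comment."

# Python 2 style: decode bytes to unicode before further processing.
text_py2_style = comment_bytes.decode("utf-8").lower()

# Python 3 style: open the file with encoding="utf-8"; csv rows are already
# str, so the decode step simply disappears.
text_py3_style = "This is a reddit comment.".lower()

assert text_py2_style == text_py3_style
```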

When I use an LSTM in Keras or TensorFlow, they have different data structures. When I compared this tutorial with Keras, I found the 'timestep' is only used in backward_propagation,...
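For what it's worth, a minimal NumPy sketch of the difference in input layout as I understand it (the vocabulary size and example indices below are made up): the tutorial feeds one sentence as a flat array of word indices and loops over the timesteps itself, whereas Keras/TensorFlow LSTMs expect an explicit (batch, timesteps, features) tensor.

```python
import numpy as np

# Tutorial-style input: one sentence is a 1-D array of word indices,
# shape (T,). The time dimension is implicit; forward/backward propagation
# loop over t explicitly, which is where the "timestep" shows up.
x_tutorial = np.array([0, 51, 27, 16, 1])

# Keras/TensorFlow-style input: a 3-D tensor of shape
# (batch, timesteps, features), e.g. one-hot vectors per timestep.
vocab_size = 8000                       # assumed vocabulary size
T = len(x_tutorial)
x_keras = np.zeros((1, T, vocab_size))
x_keras[0, np.arange(T), x_tutorial] = 1.0   # one-hot encode each timestep
```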

Hey, I'm having trouble getting the data to train the RNN, specifically on this line: `sentences = itertools.chain(*[nltk.sent_tokenize(x[0].decode('utf-8').lower()) for x in reader])`. If I open the file as 'rb' I...
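One possible fix on Python 3 (a sketch, assuming the file path below and that NLTK's punkt tokenizer data is available): `csv.reader` refuses binary mode on Python 3, so open the file in text mode with an explicit encoding instead of 'rb' and drop the `.decode('utf-8')` call, since the rows are then already `str`.

```python
import csv
import itertools

import nltk

# nltk.download('punkt')  # may be needed once for sent_tokenize

with open("data/reddit-comments.csv", newline="", encoding="utf-8") as f:  # path assumed
    reader = csv.reader(f, skipinitialspace=True)
    next(reader)  # skip the header row
    # Rows are already str in text mode, so no .decode('utf-8') is needed.
    sentences = list(itertools.chain(
        *[nltk.sent_tokenize(x[0].lower()) for x in reader]))

print("Parsed %d sentences." % len(sentences))
```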

Great tutorial, I'm learning a lot reading through it! I collected the code from the tutorial as follows, but I don't understand how you made it work because I get some...

When running `pip install -r requirements.txt`, the following error occurred: Collecting appnope==0.1.0 (from -r requirements.txt (line 1)) Using cached appnope-0.1.0-py2.py3-none-any.whl Collecting backports.ssl-match-hostname==3.4.0.2 (from -r requirements.txt (line 2)) Using cached backports.ssl_match_hostname-3.4.0.2.tar.gz...

Hi there, thank you for taking the time to code and present it all in your tutorial. I've enjoyed trying it out so far. However, I ran into this rather...

In the tutorial notebook I found one curious point. In the forward propagation, `s[-1]` is used as an additional operation; however, in the declaration of the s array, it is defined...
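The way I read it (a minimal NumPy sketch, not necessarily the notebook's exact code; parameter shapes assumed as hidden_dim x word_dim for U, word_dim x hidden_dim for V, hidden_dim x hidden_dim for W): s gets T+1 rows so that its last row, `s[-1]`, serves as the all-zero initial hidden state, which is exactly what `s[t-1]` picks up when the loop reaches t = 0.

```python
import numpy as np

def forward_propagation(x, U, V, W):
    """x is a 1-D array of word indices; U, V, W are the RNN parameters."""
    T = len(x)
    hidden_dim = U.shape[0]
    word_dim = V.shape[0]
    # One extra row: s[-1] is the initial hidden state (all zeros), so the
    # recurrence s[t-1] is well defined at t = 0.
    s = np.zeros((T + 1, hidden_dim))
    o = np.zeros((T, word_dim))
    for t in range(T):
        s[t] = np.tanh(U[:, x[t]] + W.dot(s[t - 1]))
        z = V.dot(s[t])
        o[t] = np.exp(z - z.max()) / np.sum(np.exp(z - z.max()))  # softmax
    return o, s
```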

@dennybritz I have been reading your RNN code and I have a question about line 30 of rnn_theano.py: `def forward_prop_step(x_t, s_t_prev, U, V, W): s_t = T.tanh(U[:,x_t] + W.dot(s_t_prev)); o_t = T.nnet.softmax(V.dot(s_t))`...
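In case it helps, a plain NumPy rendering of what that step computes (a sketch under assumed shapes: U is hidden_dim x word_dim, W is hidden_dim x hidden_dim, V is word_dim x hidden_dim; the inline softmax stands in for `T.nnet.softmax`): since x_t is a word index, `U[:,x_t]` is just the x_t-th column of U, i.e. U multiplied by a one-hot encoding of x_t without ever building the one-hot vector.

```python
import numpy as np

def forward_prop_step(x_t, s_t_prev, U, V, W):
    # U[:, x_t] selects the column of U for word index x_t; it equals
    # U.dot(one_hot(x_t)) computed without the explicit one-hot vector.
    s_t = np.tanh(U[:, x_t] + W.dot(s_t_prev))
    z = V.dot(s_t)
    o_t = np.exp(z - z.max()) / np.sum(np.exp(z - z.max()))  # softmax over the vocabulary
    return o_t, s_t
```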