Rita

5 comments by Rita

Same here: it runs on the GPU, but the CPU is maxed out while GPU memory consumption stays minor (~97 MB).
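If it helps to check the same symptom, here is a minimal diagnostic sketch, assuming TensorFlow 1.x (the framework is my assumption, the comment does not name it): it confirms whether a GPU is visible and logs where ops are placed.

import tensorflow as tf

# Assumption: TensorFlow 1.x session API.
print(tf.test.is_gpu_available())  # False means training silently fell back to the CPU
with tf.Session(config=tf.ConfigProto(log_device_placement=True)) as sess:
    a = tf.constant([1.0, 2.0])
    sess.run(a)  # the device-placement log shows whether ops land on /GPU:0 or /CPU:0

If ops do land on /GPU:0 but the CPU is still saturated, the bottleneck is usually the input pipeline rather than the model itself.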

I had the same error and solved it; try the following (a small sketch follows):
1. Make sure the preprocessing and training steps use the same max_seq_length.
2. If the error persists, try reducing max_seq_length...
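A minimal sketch of point 1, keeping a single max_seq_length shared by both steps; the helper names (preprocess_examples, run_training) are hypothetical placeholders, not functions from the actual repo:

MAX_SEQ_LENGTH = 128  # choose one value and reuse it everywhere

def preprocess_examples(examples, max_seq_length):
    # Hypothetical stand-in for the real preprocessing step:
    # truncate (or pad) every example to at most max_seq_length tokens.
    return [ex[:max_seq_length] for ex in examples]

def run_training(features, max_seq_length):
    # Hypothetical stand-in for the real training entry point.
    assert all(len(f) <= max_seq_length for f in features), \
        "preprocess/train max_seq_length mismatch"

features = preprocess_examples([["tok"] * 300], max_seq_length=MAX_SEQ_LENGTH)
run_training(features, max_seq_length=MAX_SEQ_LENGTH)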

There are some errors under Python 3, such as unicode handling and other differences between Python 2 and Python 3; you have to find and fix them first.
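As an illustration of the kind of unicode fix I mean, here is a small compatibility helper; it is a generic sketch written for this comment, not code taken from the repo:

def to_unicode(text):
    # Decode raw bytes to a unicode string; pass real strings through unchanged.
    # Works on both Python 2 (where bytes is str) and Python 3.
    if isinstance(text, bytes):
        return text.decode("utf-8", "ignore")
    return text

print(to_unicode(b"caf\xc3\xa9"))  # prints: café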

# Total training steps = (number of examples / batch size) * number of epochs, truncated to an int.
num_train_steps = int(len(train_examples) / train_batch_size * num_train_epochs)
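For context, a quick worked example of that formula; the concrete numbers (10,000 examples, batch size 32, 3 epochs) are my own illustration:

train_examples = range(10000)  # hypothetical: 10,000 training examples
train_batch_size = 32
num_train_epochs = 3

num_train_steps = int(len(train_examples) / train_batch_size * num_train_epochs)
print(num_train_steps)  # 937 -- 10000 / 32 * 3 = 937.5, truncated by int()

Note that under Python 2's integer division the same line gives 936 instead of 937, which is one of the py2/py3 differences mentioned above.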