Chinese Character-Level Language Model
Recurrent neural networks (LSTM, GRU, RWA) for character-level language modeling in TensorFlow. The task is to predict the next character given the history of previous characters in the sentence. NCE loss is used to speed up the multi-class classification when the vocabulary is very large. The dataset was scraped from the Hong Kong Apple Daily.
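
The sketch below illustrates the general idea, not the repository's exact code: a character-level RNN language model in TensorFlow 1.x trained with tf.nn.nce_loss instead of a full softmax over the character vocabulary. All hyperparameters and variable names (vocab_size, hidden_dim, num_sampled, etc.) are illustrative assumptions.

```python
import tensorflow as tf

# Illustrative sizes (assumed, not taken from the repository).
vocab_size  = 8000   # number of distinct Chinese characters
embed_dim   = 128
hidden_dim  = 256
num_sampled = 64     # negative samples per step for NCE
batch_size  = 32
seq_len     = 50

# Input characters and their next-character targets.
inputs  = tf.placeholder(tf.int32, [batch_size, seq_len])
targets = tf.placeholder(tf.int32, [batch_size, seq_len])

# Character embeddings.
embedding = tf.get_variable("embedding", [vocab_size, embed_dim])
embedded  = tf.nn.embedding_lookup(embedding, inputs)

# Recurrent cell; a GRU or RWA cell could be swapped in here.
cell = tf.contrib.rnn.BasicLSTMCell(hidden_dim)
outputs, _ = tf.nn.dynamic_rnn(cell, embedded, dtype=tf.float32)
outputs = tf.reshape(outputs, [-1, hidden_dim])   # [batch*seq, hidden]
labels  = tf.reshape(targets, [-1, 1])            # nce_loss expects [N, 1]

# Output projection used as NCE weights/biases.
nce_w = tf.get_variable("nce_w", [vocab_size, hidden_dim])
nce_b = tf.get_variable("nce_b", [vocab_size])

# NCE loss: only num_sampled negative classes are scored per step,
# avoiding a full softmax over the whole character vocabulary.
loss = tf.reduce_mean(tf.nn.nce_loss(
    weights=nce_w, biases=nce_b,
    labels=labels, inputs=outputs,
    num_sampled=num_sampled, num_classes=vocab_size))

train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)
```

The point of NCE here is that each training step only contrasts the true next character against a small sample of negatives, so the cost of the output layer no longer grows with the full vocabulary; the complete softmax is only needed at generation time.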
Results
Similarity
Requirements
tensorflow 1.1.0