Zijian Cai(John)
@Perfeee The source code uses oracle data generated from the pretrained target LSTM model, with START_TOKEN provided as the first token of each sentence, so it doesn't really need a corpus.
@shaomai00 I think START_TOKEN here is the first input to the generator's LSTM, and h0 represents the initialization of the hidden state and cell state in the LSTM structure.
@playma I think target_lstm is just a model used to generate something like a "toy corpus" in this experiment. As the paper says, this model is used to generate some tokens...
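A minimal sketch of the idea, assuming a toy LSTM with fixed random weights standing in for target_lstm; all names (VOCAB_SIZE, START_TOKEN, sample_sequence, etc.) are illustrative, not the repo's actual API:

```python
import numpy as np

VOCAB_SIZE, EMB_DIM, HIDDEN_DIM = 50, 32, 32
SEQ_LEN, START_TOKEN = 20, 0
rng = np.random.default_rng(0)

# fixed random parameters play the role of the oracle's "true" distribution
emb = rng.normal(0, 0.1, (VOCAB_SIZE, EMB_DIM))
W = rng.normal(0, 0.1, (EMB_DIM + HIDDEN_DIM, 4 * HIDDEN_DIM))
b = np.zeros(4 * HIDDEN_DIM)
W_out = rng.normal(0, 0.1, (HIDDEN_DIM, VOCAB_SIZE))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_sequence():
    # h0 / c0: zero-initialized hidden and cell state
    h = np.zeros(HIDDEN_DIM)
    c = np.zeros(HIDDEN_DIM)
    token = START_TOKEN  # the first input fed to the LSTM
    seq = []
    for _ in range(SEQ_LEN):
        z = np.concatenate([emb[token], h]) @ W + b
        i, f, o, g = np.split(z, 4)
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
        logits = h @ W_out
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        token = rng.choice(VOCAB_SIZE, p=probs)  # sample the next token
        seq.append(token)
    return seq

# the sampled sequences are the "toy corpus" used as real data
oracle_data = [sample_sequence() for _ in range(1000)]
```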
@sweetkurapika You can dump your parameters with pickle
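For example (the parameter names and file name here are hypothetical, not the repo's):

```python
import pickle
import numpy as np

# hypothetical parameter dict; in practice these would be the values
# pulled out of the trained generator (e.g. via sess.run on its variables)
params = {
    "embedding": np.zeros((50, 32)),
    "lstm_kernel": np.zeros((64, 128)),
    "lstm_bias": np.zeros(128),
}

# dump the parameters to disk
with open("generator_params.pkl", "wb") as f:
    pickle.dump(params, f)

# later: restore the same dict
with open("generator_params.pkl", "rb") as f:
    restored = pickle.load(f)
```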
According to the official TensorFlow documentation, the linear-algebra functions in TF 1.12 have been moved under the tf.linalg namespace (tf.linalg.xxx).
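A few examples of the renamed calls (to the best of my knowledge; check the 1.12 release notes for the full mapping):

```python
import tensorflow as tf

x = tf.eye(3)

# old top-level names are deprecated; the same ops live under tf.linalg
inv = tf.linalg.inv(x)                  # was tf.matrix_inverse
upper = tf.linalg.band_part(x, 0, -1)   # was tf.matrix_band_part
chol = tf.linalg.cholesky(x)            # was tf.cholesky
```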
I found that your LeakGAN is hard to train if I change the generator model to BERT or Transformer_Autoregression.