neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
I'm interested in your paper 'Input Combination Strategies for Multi-Source Transformer Decoder'. Would you mind telling me how I can reproduce this work? I want to cite this paper. Thanks.
Allow referencing protected values like ``
During model configuration, when I run the command **bin/neuralmonkey-train exp-nm-mt/translation.ini**, it produces the following output: 2019-06-23 12:13:06: Loading INI file: 'exp-nm-mt/translation.ini' 2019-06-23 12:13:06: INI file is parsed. 2019-06-23...
I wonder how to modify the configuration file to train a multi-source based transformer model with different attention types.
Moved attention-related attributes/methods to a separate class, Attentive. Every decoder that needs to compute attention over the encoders should inherit from this class.
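A minimal sketch of what such a mixin could look like. The names below (`Attentive`, `get_attention_object`, `Decoder`) are illustrative assumptions, not Neural Monkey's actual API:

```python
class Attentive:
    """Hypothetical mixin holding attention-related state shared by decoders."""

    def __init__(self, attention_type):
        # attention_type: a callable that builds an attention object over an
        # encoder's states, or None to disable attention.
        self._attention_type = attention_type

    def get_attention_object(self, encoder):
        # Build an attention object for one encoder (None means no attention).
        if self._attention_type is None:
            return None
        return self._attention_type(encoder)


class Decoder(Attentive):
    """A decoder that inherits attention handling from Attentive."""

    def __init__(self, encoders, attention_type):
        super().__init__(attention_type)
        # One attention object per source encoder.
        self.attentions = [self.get_attention_object(e) for e in encoders]
```

With this layout, any decoder subclass gets attention construction for free by inheriting from `Attentive` instead of reimplementing it.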
Included the batching scheme methods from: https://github.com/tensorflow/tensor2tensor/blob/415585f40d9f21c56df7bda35033bc915d82321e/tensor2tensor/utils/data_reader.py
If Vocabulary returns a batch of empty sentences ([[]], [[], []], etc.), the list(zip(*vectors)) at https://github.com/ufal/neuralmonkey/blob/master/neuralmonkey/model/sequence.py#L213 reduces the dimensionality of the list, which causes a ValueError (feeding a list of...
The try-catch block at https://github.com/ufal/neuralmonkey/blob/master/neuralmonkey/learning_utils.py#L250 prints a warning when tf.Saver.restore fails, while ignoring the caught exception. This leads to confusing behavior where the user might think that the variables present in the...
When running neuralmonkey-train (and possibly *run) with main.initial_variables set to a nonexistent path, Neural Monkey does not throw an exception and continues execution (probably with randomly initialized variables instead). This...
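One way to fail loudly instead: check the checkpoint path up front, before any restore is attempted. This is a hypothetical guard (the function name and the `.index` probe are assumptions, not Neural Monkey code):

```python
import os


def check_initial_variables(path):
    """Raise if the initial_variables checkpoint path does not exist.

    TensorFlow checkpoints are path prefixes rather than single files, so
    probing for the bare path or its ".index" companion file is a cheap
    existence check before handing the path to the restore machinery.
    """
    if not (os.path.exists(path) or os.path.exists(path + ".index")):
        raise FileNotFoundError(
            "initial_variables checkpoint not found: {}".format(path))
```

Calling this during configuration loading would surface the typo immediately, rather than silently training from random initialization.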
This is motivated by the fact that, for training stability, you can choose bucketed batching with batch_size specified in the number of tokens per batch. This can create batches with small...
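Token-based bucketed batching can be sketched as follows. This is an illustrative helper under assumed semantics (padded batch size = longest sentence × number of sentences), not the tensor2tensor implementation referenced above:

```python
def token_batches(sentences, max_tokens):
    """Yield batches whose padded size stays at or under max_tokens.

    Sentences are sorted by length (a simple one-pass bucketing), so each
    batch groups similar lengths; long sentences naturally end up in
    batches with fewer sentences. Note the trailing batch can be small,
    which is the stability concern mentioned above.
    """
    batch, batch_max_len = [], 0
    for sent in sorted(sentences, key=len):
        new_max = max(batch_max_len, len(sent))
        # Padded size if this sentence joined the current batch.
        if batch and new_max * (len(batch) + 1) > max_tokens:
            yield batch
            batch, batch_max_len = [], 0
            new_max = len(sent)
        batch.append(sent)
        batch_max_len = new_max
    if batch:
        yield batch
```

For example, with `max_tokens=6`, sentences of lengths 1, 2, 3, and 5 would be grouped as `[1, 2]`, `[3]`, `[5]`, each batch's padded token count staying within the budget.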