Abstractive-Summarization-With-Transfer-Learning

Abstractive summarization using BERT as the encoder and a Transformer decoder

30 issues, sorted by recently updated:

Thank you. I used your code to migrate to a Chinese dataset, but there was a problem in the prediction phase: the generated summary is always the same one, without any...

ValueError: Unknown hyperparameter: position_embedder_type. Only hyperparameters named 'kwargs' hyperparameters can contain new entries undefined in default hyperparameters. I cloned the code and ran main.py without any code changes; file 'model.py', line...
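
This error usually means the installed Texar release does not define `position_embedder_type` among its default decoder hyperparameters. A minimal sketch of a workaround, assuming the config-dict layout used in the repo and a Texar-TF install imported as `texar.tf` (both assumptions), is to drop keys the installed version does not recognize:

```
# Sketch only: the config values below are illustrative, not the repo's.
import texar.tf as tx

dcoder_config = {
    'dim': 768,
    'num_blocks': 6,
    'position_embedder_type': 'sinusoids',  # rejected by some Texar releases
}

# Keep only the keys this Texar version defines (plus the special 'kwargs').
known_keys = set(tx.modules.TransformerDecoder.default_hparams()) | {'kwargs'}
dcoder_config = {k: v for k, v in dcoder_config.items() if k in known_keys}
```

Alternatively, installing the Texar release the repository was developed against avoids the mismatch entirely.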

ImportError: DLL load failed while importing _pywrap_tensorflow_internal: The specified module could not be found. This happens when I try running the preprocess.py file.

Hello, thanks for providing the Transformer-based seq2seq model for abstractive text summarization; it helps me a lot. I ran it on the CNN/Daily Mail dataset and obtained the results...

In model.py, line 114:

```
decoder = tx.tf.modules.TransformerDecoder(embedding=tgt_embedding, hparams=dcoder_config)
```

I am getting a TypeError that an unexpected keyword argument `embedding` was passed to TransformerDecoder. How did people...
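
For context, newer Texar-TF releases removed the `embedding` argument from `TransformerDecoder` and tie the output projection through `output_layer` instead. A hedged sketch of what the call might look like under that newer API (the dimensions and variable names below are illustrative, not taken from model.py):

```
# Assumes TensorFlow 1.x and a Texar-TF version whose TransformerDecoder takes
# vocab_size/output_layer instead of embedding. Values are placeholders.
import tensorflow as tf
import texar.tf as tx

vocab_size, hidden_dim = 30522, 768   # BERT-sized placeholders
tgt_embedding = tf.get_variable('tgt_embedding', shape=[vocab_size, hidden_dim])

decoder = tx.modules.TransformerDecoder(
    vocab_size=vocab_size,
    output_layer=tf.transpose(tgt_embedding),  # tie output weights to embeddings
    hparams={'dim': hidden_dim})
```

The other common resolution is to pin the older Texar release that still accepts `embedding=`.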

I was training the model and it was taking longer than expected, so I killed the process. However, when I run inference.py, I get an error: Traceback (most...

What specific value should be given to test_batch_size? Could anyone suggest one?
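
As a rule of thumb (not a value prescribed by the repository), test_batch_size only controls how many examples are decoded at once, so it trades inference speed against GPU memory and does not change the summaries themselves. A sketch with illustrative numbers:

```
# Illustrative only: pick the largest batch that fits in GPU memory.
# With 512-token BERT inputs, small batches are a safe starting point.
test_batch_size = 8    # conservative starting point (assumption)
# Double it until the GPU runs out of memory, then back off:
# test_batch_size = 16, 32, ...
```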

Can you add a requirements.txt file for this project? I am getting various issues related to incompatible versions of different modules.

Creating a `requirements.txt` file might help users resolve dependencies :)
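
A hedged sketch of what such a file could contain, inferred only from the libraries mentioned in the issues above; the package names are assumptions and the versions are left unpinned on purpose, since they should be copied from a working environment (for example with `pip freeze > requirements.txt`):

```
# requirements.txt (sketch; packages inferred from the issues above,
# versions intentionally unpinned -- pin them from a known-working setup)
tensorflow-gpu
texar        # or texar-tf, depending on the release used
```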