Generating_Text_Summary_With_GPT2

A simple approach to use GPT2-medium (345M) for generating high quality text summaries with minimal training.

Issues (6)

Hi, your [notebook](https://github.com/SKRohit/Generating_Text_Summary_With_GPT2/blob/master/train_gpt2_summarizer.ipynb) is an impressive tutorial on using GPT-2 as a seq2seq model, but it cannot run with `batch_size` greater than 1. I think this is because you added...

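The usual fix for this (a general sketch, not the notebook's actual code) is to pad every sequence in a batch to the batch's longest length in a custom `collate_fn` and pass an attention mask so the padding is ignored. The dict key and the `pad_id` choice below are assumptions to adapt to the repository's dataset class.

```python
import torch
from torch.nn.utils.rnn import pad_sequence

def collate_fn(batch, pad_id):
    # Each dataset item is assumed to be a dict holding a 1-D LongTensor of token ids.
    seqs = [item["input_ids"] for item in batch]
    lengths = [len(s) for s in seqs]
    # Pad every sequence to the length of the longest one in this batch.
    input_ids = pad_sequence(seqs, batch_first=True, padding_value=pad_id)
    # Mark real tokens with 1 and padding with 0 so attention ignores the padding.
    attention_mask = torch.zeros_like(input_ids)
    for i, n in enumerate(lengths):
        attention_mask[i, :n] = 1
    return {"input_ids": input_ids, "attention_mask": attention_mask}

# Hypothetical usage: GPT-2 has no pad token, so the EOS id is a common stand-in.
# loader = DataLoader(dataset, batch_size=4,
#                     collate_fn=lambda b: collate_fn(b, pad_id=tokenizer.eos_token_id))
```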

How do I fix the error below?

```
ERROR: Could not find a version that satisfies the requirement googlesearch-python==2020.0.2 (from versions: 1.0.0, 1.0.1, 1.1.0)
ERROR: No matching distribution found for googlesearch-python==2020.0.2...
```
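One workaround worth trying (an assumption, not a confirmed fix): pip's own message says only versions 1.0.0, 1.0.1 and 1.1.0 are published, so pinning one of those in `requirements.txt`, e.g. `googlesearch-python==1.1.0`, or installing it directly with `pip install googlesearch-python==1.1.0`, should at least resolve the installation error. Whether the repository's code is compatible with that version's API would still need checking.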

```
ml-lab@mllab-OptiPlex-7010:~/Downloads/Temp/Generating_Text_Summary_With_GPT2-master$ python3 train_gpt2_summarizer.py --batch_size 1 --root_dir ./CNN --lr 5e-5 --gradient_accumulation_steps 32 --num_train_epochs 1 --output_dir ./output --model_dir ./weights
Epoch: 0%| | 0/1 [00:00
```

How do I preprocess my dataset and convert it into the format the model needs?
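A minimal sketch of the common pattern for GPT-2 summarization data: tokenize the article and the reference summary, join them with a separator, and truncate to the model's 1024-token context. The separator text, key names, and truncation strategy below are assumptions; check the repository's dataset code for the exact format it expects.

```python
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2-medium")

def make_example(article: str, summary: str, max_len: int = 1024):
    sep_ids = tokenizer.encode(" TL;DR:")      # assumed separator between article and summary
    summary_ids = tokenizer.encode(" " + summary)
    article_ids = tokenizer.encode(article)
    # Keep room for the separator and the summary; truncate the article if needed.
    budget = max_len - len(sep_ids) - len(summary_ids)
    article_ids = article_ids[:budget]
    input_ids = article_ids + sep_ids + summary_ids
    # Training usually computes the LM loss only on the summary tokens,
    # so remember where the summary starts.
    return {"input_ids": input_ids, "summary_start": len(article_ids) + len(sep_ids)}
```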

Bumps [torch](https://github.com/pytorch/pytorch) from 1.7.1 to 1.13.1. Release notes, sourced from torch's releases: PyTorch 1.13.1 Release, a small bug-fix release. This release is meant to fix the following issues (regressions /...

Labels: dependencies