
Can GPT-2 be used to generate sentences from a set of keywords?

Open ranjeetds opened this issue 4 years ago • 15 comments

I have a use case where I have a set of keywords and target sentences. I want to build a model that takes keywords as input and generates sentences as output. Can this be achieved with GPT-2 in some way?

Example: given the set of keywords (Sam, Went, India, Wedding), the expected output is "Sam went to India for his friend's wedding".

ranjeetds avatar Mar 23 '20 09:03 ranjeetds

minimaxir has an example of this functionality. I think it can be achieved by tweaking the inputs a little to fool GPT-2: https://github.com/minimaxir/gpt-2-keyword-generation
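The general idea behind that repo is to prepend the keywords to each target sentence during fine-tuning, so the model learns to condition on them. A minimal sketch of that idea follows; the delimiter tokens used here are illustrative choices, not necessarily the exact format the linked repo uses.

```python
# Sketch of keyword-conditioned fine-tuning data for GPT-2.
# The delimiter tokens (<|keywords|>, <|text|>, <|endoftext|>) are
# illustrative, not necessarily what gpt-2-keyword-generation uses.

def build_training_example(keywords, sentence):
    """Prepend the keywords to the target sentence so GPT-2 learns,
    during fine-tuning, to condition its continuation on them."""
    keyword_str = " ".join(keywords)
    return f"<|keywords|> {keyword_str} <|text|> {sentence} <|endoftext|>"

def build_inference_prompt(keywords):
    """At generation time, stop the prompt after <|text|> so the model
    completes the sentence from the keywords."""
    return f"<|keywords|> {' '.join(keywords)} <|text|>"

example = build_training_example(
    ["Sam", "Went", "India", "Wedding"],
    "Sam went to India for his friend's wedding.",
)
print(example)
print(build_inference_prompt(["Sam", "Went", "India", "Wedding"]))
```

Fine-tuning GPT-2 on many such lines, then prompting with `build_inference_prompt(...)`, is the basic recipe; the model is still free to hallucinate, which is the limitation discussed below.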

gieoon avatar Mar 23 '20 09:03 gieoon

@gieoon tried this... but it adds a lot of imaginative text. Isn't there a way to limit the vocabulary so that the output is not imaginative and is constructed properly?

ranjeetds avatar Mar 23 '20 12:03 ranjeetds

@ranjeetds I'm working on something similar and I'm facing the same issue: a lot of extra text is being generated for a given keyword. Could you please share your approach if you have continued working on it?

akhilNair avatar Jun 20 '20 03:06 akhilNair

Interested too.

Ziltosh avatar Aug 24 '20 16:08 Ziltosh

Interested

mehdiomid avatar Aug 29 '20 03:08 mehdiomid

sounds really interesting!

yana-xuyan avatar Sep 13 '20 03:09 yana-xuyan

Got any luck on this problem?

GatoY avatar Oct 08 '20 13:10 GatoY

I'd like to do this as well.

BigSalmon2 avatar Oct 08 '20 17:10 BigSalmon2

@GatoY I found a trick. You could try something like the following:

Input: "https://github.com/openai/I-like-to-play-sports. I see myself as an individual"

BigSalmon2 avatar Oct 08 '20 21:10 BigSalmon2

@GatoY I found a trick. You could try something like the following:

Input: "https://github.com/openai/I-like-to-play-sports. I see myself as an individual"

@zanderbush I failed to open this page. Any thoughts?

GatoY avatar Oct 12 '20 02:10 GatoY

@GatoY Sorry, I didn't explain it well enough. That isn't a real page; as an input to GPT-2, add a fake link built from your keywords. Example: i-like-to-play-sports. It is likely that "sports" will then be included in the output.
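The trick above can be sketched as turning the keywords into a hyphenated URL-style slug and prepending it to the prompt. A minimal sketch, assuming the URL prefix and lead sentence are arbitrary choices:

```python
def slug_prompt(keywords, lead="I see myself as an individual"):
    """Build a fake hyphenated URL slug from the keywords and append a
    lead sentence; GPT-2 tends to echo slug words in its continuation."""
    slug = "-".join(k.lower() for k in keywords)
    return f"https://github.com/openai/{slug}. {lead}"

print(slug_prompt(["I", "like", "to", "play", "sports"]))
# → "https://github.com/openai/i-like-to-play-sports. I see myself as an individual"
```

This only nudges the model toward the keywords; it does not guarantee they all appear in the output.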

BigSalmon2 avatar Oct 12 '20 03:10 BigSalmon2

For anyone looking into this: I've done a fair bit of research, and it seems like in 2020 this is possible! Conditional generation has come a long way; this can be done with something like T5 or BART. More information is available here: https://huggingface.co/transformers/model_doc/t5.html#tft5forconditionalgeneration and https://towardsdatascience.com/data-to-text-generation-with-t5-building-a-simple-yet-advanced-nlg-model-b5cce5a6df45
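T5 frames every task as text-to-text, so keyword-to-sentence data can be cast as input/target string pairs for fine-tuning. A minimal sketch of that framing; the `generate sentence:` task prefix below is an illustrative label, not a built-in T5 task:

```python
def to_t5_pair(keywords, sentence):
    """Cast a keyword/sentence example into T5's text-to-text format.
    The task prefix is an arbitrary label chosen for fine-tuning."""
    return {
        "input_text": "generate sentence: " + " | ".join(keywords),
        "target_text": sentence,
    }

pair = to_t5_pair(["Sam", "Went", "India", "Wedding"],
                  "Sam went to India for his friend's wedding.")
print(pair["input_text"])   # → generate sentence: Sam | Went | India | Wedding
print(pair["target_text"])
```

Fine-tuning on such pairs would then use `T5ForConditionalGeneration` (or `TFT5ForConditionalGeneration`) as described in the linked Hugging Face docs; because the target is supervised directly, the model stays much closer to the keywords than free-running GPT-2 does.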

Best of luck everyone

FL33TW00D avatar Oct 16 '20 15:10 FL33TW00D

Interested too.

xieyxclack avatar Nov 02 '20 08:11 xieyxclack

Did anyone find a working solution?

farazkalwar avatar Feb 28 '21 06:02 farazkalwar

If you are fine-tuning on your own dataset, you can create textual prompts so that inference will be based on them. This is also how GPT-3 works: by recognizing prompts, it knows that 'Q:' precedes a question and 'A:' precedes an answer. If your prompt ends at 'A:', the system can infer that it needs to generate an answer.
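The prompt idea above can be sketched as follows; the `Keywords:` / `Sentence:` labels are illustrative, and any consistent pair of cues would work:

```python
def fine_tune_text(examples):
    """Join keyword/sentence pairs into prompt-formatted training text,
    mirroring the Q:/A: convention described above."""
    blocks = []
    for keywords, sentence in examples:
        blocks.append(f"Keywords: {', '.join(keywords)}\nSentence: {sentence}")
    return "\n\n".join(blocks)

def inference_prompt(keywords):
    """End the prompt at 'Sentence:' so the fine-tuned model infers it
    should generate the sentence for these keywords."""
    return f"Keywords: {', '.join(keywords)}\nSentence:"

data = [(["Sam", "Went", "India", "Wedding"],
         "Sam went to India for his friend's wedding.")]
print(fine_tune_text(data))
print(inference_prompt(["Sam", "Went", "India", "Wedding"]))
```

After fine-tuning on `fine_tune_text(...)`, feeding `inference_prompt(...)` to the model and reading the continuation up to the next blank line is the usual decoding convention.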


gieoon avatar Feb 28 '21 07:02 gieoon