gpt-2
Can GPT-2 be used to generate sentences from a set of keywords?
I have a use case with a set of keywords and target sentences. I want to build a model that takes keywords as input and generates sentences as output. Can this be achieved with GPT-2 in some way?
Example - set of keywords: (Sam, Went, India, Wedding)
Expected output: Sam went to India for his friend's wedding
Minimaxir has an example of this functionality. I think it can be achieved by tweaking the inputs a little to "fool" GPT-2: https://github.com/minimaxir/gpt-2-keyword-generation
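The general idea behind that repo is to fine-tune GPT-2 on text where each target sentence is prefixed with its keywords, so the model learns to continue a keyword prefix with a matching sentence. A minimal sketch (the `KEYWORDS:`/`TEXT:` delimiters below are illustrative placeholders, not necessarily the exact tokens the repo uses):

```python
# Sketch of keyword-conditioned fine-tuning data for GPT-2.
# The delimiter strings are illustrative assumptions, not the
# exact format from minimaxir/gpt-2-keyword-generation.

def make_training_line(keywords, sentence):
    """Prefix the target sentence with its keywords so that, after
    fine-tuning, GPT-2 completes a keyword prefix with a sentence."""
    return "<|startoftext|>KEYWORDS: {} TEXT: {}<|endoftext|>".format(
        " ".join(keywords), sentence
    )

line = make_training_line(
    ["Sam", "Went", "India", "Wedding"],
    "Sam went to India for his friend's wedding",
)
print(line)
```

At inference you would then prompt the fine-tuned model with just `KEYWORDS: ... TEXT:` and let it complete the rest.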
@gieoon tried this... but it adds a lot of imaginative text. Isn't there a way to limit the vocabulary so that the output isn't imaginative and is constructed properly?
@ranjeetds I'm working on something similar and I'm facing the same issue: a lot of extra text is generated for a given keyword. Could you please share your approach if you have continued working on it?
Interested too.
Interested
sounds really interesting!
Got any luck on this problem?
I'd like to do this as well.
@GatoY I found a trick. You could try something like the following:
Input: "https://github.com/openai/I-like-to-play-sports. I see myself as an individual"
@zanderbush I couldn't open that page. Any thoughts?
@GatoY Sorry, I didn't explain it well enough. As an input to GPT-2, add a link. Example: i-like-to-play-sports. It is likely that "sports" will then be included in the output.
For anyone looking into this, I've done a fair bit of research and it seems like in 2020 this is possible! Conditional generation has come a long way; this can be done with something like T5 or BART. More information available here: https://huggingface.co/transformers/model_doc/t5.html#tft5forconditionalgeneration https://towardsdatascience.com/data-to-text-generation-with-t5-building-a-simple-yet-advanced-nlg-model-b5cce5a6df45
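With T5-style models the keywords are serialized into a single source string with a task prefix, and the model is fine-tuned on (keywords, sentence) pairs. A sketch of the input side (the `generate sentence:` prefix and `|` separator are assumptions in the spirit of the linked article, and a fine-tuned checkpoint is needed before generation produces useful sentences):

```python
# Sketch of a data-to-text source format for T5. The prefix and
# separator are illustrative conventions, not a fixed T5 requirement.

def t5_input(keywords):
    """Serialize keywords into a single source string for T5."""
    return "generate sentence: " + " | ".join(keywords)

src = t5_input(["Sam", "went", "India", "wedding"])
print(src)  # generate sentence: Sam | went | India | wedding

# With HuggingFace transformers, generation would then look roughly like:
#   tokenizer = T5Tokenizer.from_pretrained("<your-finetuned-t5>")
#   model = TFT5ForConditionalGeneration.from_pretrained("<your-finetuned-t5>")
#   ids = tokenizer(src, return_tensors="tf").input_ids
#   out = model.generate(ids)
```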
Best of luck everyone
Interested too.
Did anyone find a working solution?
If you are fine-tuning to your own dataset you can create textual prompts so that inference will be based on it. This is also how GPT3 works, by recognizing prompts so that it knows that ‘Q:’ precedes a question and ‘A:’ precedes an answer. If your prompt ends at ‘A:’ the system can infer that it needs to generate an answer.
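Adapting that Q:/A: pattern to this use case, the fine-tuning pairs could mark the keywords and the sentence with fixed prompt labels, and inference prompts would simply stop at the sentence label. A sketch (the `Keywords:`/`Sentence:` labels are illustrative, any consistent markers work):

```python
# Sketch of prompt-style fine-tuning pairs, following the Q:/A:
# idea above: the model learns that "Sentence:" follows "Keywords:",
# so an inference prompt ending at "Sentence:" asks it to complete
# the sentence. The label strings are illustrative choices.

def format_example(keywords, sentence=None):
    prompt = "Keywords: {}\nSentence:".format(", ".join(keywords))
    if sentence is None:                      # inference prompt
        return prompt
    return "{} {}".format(prompt, sentence)   # training example

train = format_example(["Sam", "went", "India", "wedding"],
                       "Sam went to India for his friend's wedding")
infer = format_example(["Sam", "went", "India", "wedding"])
print(train)
print(infer)
```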