MTTOD
DB states as training labels
Hi, I think there may be a small error here. The model should not learn to predict db_state tokens, so I don't think it makes sense to use db_state tokens as labels. It might make more sense to put the db states into the inputs.
I agree with you that the db_state token can be regarded as a prompt.
But other models such as SimpleTOD and UBAR learn the full TOD workflow, including db_state token prediction.
I think it comes down to a difference in perspective: whether to treat the db_state token as a prompt or as part of the TOD workflow.
Thanks for your reply. It seems that predicting the db state does not influence performance. However, some TOD work, like PPTOD, uses the db state as part of the inputs during training.
Yes, PPTOD uses the db state as a kind of prompt.
But, as you said, I think predicting the db state does not have a significant impact on performance.
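The two treatments discussed above can be sketched as a label-masking choice. This is a minimal illustration, not MTTOD's actual code: it assumes a HuggingFace-style loss where label positions set to -100 are ignored by cross-entropy, and the token ids and span indices are hypothetical.

```python
IGNORE_INDEX = -100  # positions with this label are skipped by the loss


def make_labels(token_ids, db_span, predict_db):
    """Build training labels for a target sequence.

    token_ids:  full target sequence (belief state | db_state | response)
    db_span:    (start, end) indices of the db_state tokens
    predict_db: True  -> SimpleTOD/UBAR-style, db tokens are learned as labels
                False -> prompt-style, db tokens are masked out of the loss
    """
    labels = list(token_ids)
    if not predict_db:
        start, end = db_span
        for i in range(start, end):
            labels[i] = IGNORE_INDEX  # db_state acts as input-only context
    return labels


# Hypothetical sequence: belief-state ids, db_state ids, response ids
seq = [11, 12, 13, 21, 22, 31, 32, 33]
print(make_labels(seq, (3, 5), predict_db=False))
# -> [11, 12, 13, -100, -100, 31, 32, 33]
```

Either way the db_state tokens stay in the decoder input; the flag only controls whether the model is also trained to generate them, which matches the observation above that the choice has little effect on performance.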