Kurt Shuster

198 comments of Kurt Shuster

You can take a look at how the [Wizard of Wikipedia](https://parl.ai/projects/wizard_of_wikipedia/) dataset provides grounded dialogue examples:

```bash
parlai dd -t wizard_of_wikipedia:Generator --prepend-gold-knowledge True
. . .
- - - NEW...
```
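If you'd rather inspect the data from Python than from the CLI, the same script can be invoked programmatically via ParlAI's script API; a minimal sketch, where `prepend_gold_knowledge` mirrors the CLI flag above:

```python
from parlai.scripts.display_data import DisplayData

# Programmatic equivalent of the CLI call above: print a few
# Wizard of Wikipedia examples with the gold knowledge sentence
# prepended to the dialogue context.
DisplayData.main(
    task='wizard_of_wikipedia:Generator',
    prepend_gold_knowledge=True,
    num_examples=5,
)
```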

With regards to the data format: that's a design decision for you to make. Any generation model within ParlAI can ground on a knowledge sentence. [The SeeKeR model](https://parl.ai/projects/seeker/), for instance,...
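To make that concrete, here is a minimal sketch of what a grounded example might look like in ParlAI's standard teacher format, assuming you simply prepend the knowledge sentence to the dialogue context; the field names are ParlAI's usual ones, but the knowledge sentence and dialogue text are made up for illustration:

```python
from parlai.core.message import Message

# Hypothetical grounded example: the knowledge sentence is prepended to
# the dialogue context, so any generation model can condition on it.
example = Message({
    'text': (
        'Blue whales are the largest animals known to have existed.\n'  # knowledge sentence (made up)
        'Do you know anything interesting about whales?'                # dialogue turn
    ),
    'labels': ['Yes! Blue whales are actually the largest animals ever.'],
    'episode_done': True,
})
```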

The command in the original comment (with `BB3SubSearchAgent`) should only be used if you wish to interact directly with a certain _module_ of BB3. The only module of BB3 that...

Hi there, actually in the SeeKeR model we didn't use a `__knowledge__` token for generating the knowledge response (just the vanilla dialogue history). But, if we did, it would...
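For illustration only, a context with an explicit knowledge token might look something like the sketch below; the `__knowledge__` / `__endknowledge__` delimiters follow the convention used elsewhere in ParlAI (e.g., BlenderBot 2) and are an assumption here, since SeeKeR itself just uses the vanilla history:

```python
# Hypothetical format: wrap the selected knowledge sentence in special
# tokens and append it to the flattened dialogue history.
dialogue_history = 'I love hiking in the mountains.\nMe too, where do you usually go?'
knowledge = 'The Appalachian Trail is about 2,190 miles long.'  # made-up sentence

context = f'{dialogue_history}\n__knowledge__ {knowledge} __endknowledge__'
print(context)
```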

Yes, definitely. We also offer non-dialogue pre-trained models (e.g., the R2C2 model, which is essentially a 3B parameter BART-style pre-trained model).
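If the question is about fine-tuning from one of those checkpoints, a rough sketch with ParlAI's training script is below; note that the zoo path and settings are assumptions on my part (check the SeeKeR project page / model zoo for the exact identifier), not a verified recipe:

```python
from parlai.scripts.train_model import TrainModel

# Rough sketch of fine-tuning on your own task, initializing from the
# R2C2 3B pre-trained checkpoint. Paths and settings below are
# placeholders to adapt, not a tested configuration. In practice you
# also need the dict/architecture flags that match the checkpoint.
TrainModel.main(
    task='my_task',                              # hypothetical task name
    model='transformer/generator',
    init_model='zoo:seeker/r2c2_base_3B/model',  # verify against the model zoo
    model_file='/tmp/r2c2_finetuned/model',
    batchsize=4,
    fp16=True,
)
```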

were you able to build the website locally and have this display? see [here](https://github.com/facebookresearch/ParlAI/tree/main/docs) for how to do so

cc @dexterju27 since I believe @pearlli98 is not able to complete this PR?

could you also please merge `main` into this branch? to pass tests

i think there are two ways to view this issue - that is, whether we are rendering images _within_ **OR** _outside_ the context of chit-chat/conversation. As the most immediate use...