BlenderBot3: inference on a particular module
Hi, I was wondering how I can run inference on a particular module of the BB3 model. For example, the dialogue response generation module takes in "Full context + knowledge + memory sequences" and generates a response. How can I feed my own input into it?
Using a Hugging Face analogy, I'd tokenize an input string like f"{dialogue_context} {TOKEN_KNOWLEDGE} {knowledge} {TOKEN_END_KNOWLEDGE} {BEGIN_MEMORY} {memory_sequence} {END_MEMORY}" and pass it through the model. How would I accomplish this with ParlAI? Also, I've found the special tokens for BB3 here, but I'm not sure whether their usage is documented elsewhere (a guide for this would be great).
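For concreteness, here is a minimal sketch of the Hugging Face-style input assembly described above. The token strings used here are placeholders, not BB3's actual special tokens; look up the real values in the BB3 source before relying on this.

```python
# Placeholder special tokens -- these are NOT the real BB3 token strings,
# just stand-ins to illustrate the input layout from the question.
TOKEN_KNOWLEDGE = "__knowledge__"
TOKEN_END_KNOWLEDGE = "__endknowledge__"
BEGIN_MEMORY = "__memory__"
END_MEMORY = "__endmemory__"


def build_module_input(dialogue_context: str, knowledge: str, memory_sequence: str) -> str:
    """Concatenate full context + knowledge + memory into one input string,
    in the order described in the question."""
    return (
        f"{dialogue_context} "
        f"{TOKEN_KNOWLEDGE} {knowledge} {TOKEN_END_KNOWLEDGE} "
        f"{BEGIN_MEMORY} {memory_sequence} {END_MEMORY}"
    )


# Example: one flat string you could then tokenize and feed to the model.
prompt = build_module_input("Hello, how are you?", "BB3 is a chatbot.", "I like hiking.")
```

With real token values substituted in, the resulting string is what you would tokenize and pass to the generation module.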
Thanks!
Hi there, in #4746 I've written up a README describing the various ways in which to interact directly with the BB3 model (including how to show context to the model). Hopefully that answers your questions!
(Once that lands, you can find the information at this link.)
Running inference on a module of the 3B model as described in the doc:
parlai interactive --model projects.bb3.agents.r2c2_bb3_agent:BB3SubSearchAgent --module sdm
returns a parsing error: Parse Error: unrecognized arguments: --module sdm
My mistake, you don't need that option for the 3B model. You will want to include --model-file zoo:bb3/bb3_3B/model, however (I will update the README to reflect that).
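Putting the correction together, the working invocation for the 3B model would presumably look like the following (a sketch assembled from the command and fix above; check the updated README for the current flags):

```shell
# Drop --module and point at the 3B model file instead
parlai interactive \
  --model projects.bb3.agents.r2c2_bb3_agent:BB3SubSearchAgent \
  --model-file zoo:bb3/bb3_3B/model
```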
#4765 improves the documentation