Kurt Shuster

198 comments by Kurt Shuster

What doesn't make sense to me is why you're getting `'episode_done': True` in the first place. What would two agents messaging have to do with that? That persona generated is...
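For reference, a minimal sketch of how `episode_done` is normally used in a ParlAI message (this is illustrative only, not code from the issue): it should only be `True` on the final turn of an episode, so seeing it on every turn usually points at how the observation is built upstream.

```python
# Illustrative only: `episode_done` marks the end of an episode, not of a turn.
from parlai.core.message import Message

mid_turn = Message({
    'text': 'hello there',
    'episode_done': False,  # keep False for every mid-episode turn
})
final_turn = Message({
    'text': 'goodbye',
    'episode_done': True,   # only the last turn of the episode sets this
})
```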

I think your print statements are capturing calls to the search query generation and memory generation modules. Could you try including the logging and pasting the full stack trace with...
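As a rough sketch of what would help here (assuming you are driving the agent yourself; `run_with_trace` and the `agent` handle are placeholders, not ParlAI APIs): turn on debug-level logging and capture the full traceback around the failing call.

```python
# Hedged sketch: surface debug logging and the full stack trace when reproducing.
import logging
import traceback

logging.basicConfig(level=logging.DEBUG)  # show debug output from all modules

def run_with_trace(agent):
    """Call the agent once and print the full stack trace on failure."""
    try:
        return agent.act()
    except Exception:
        traceback.print_exc()  # paste this full trace into the issue
        raise
```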

Sorry, yes, it absolutely does; thank you for sharing the full stack trace. The gist of the issue is that we use batching in the memory decoder / search query...
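To illustrate the batching concern in the abstract (none of these names are real BB2 internals): when inputs from several conversations are collated into one batch, each per-conversation output has to be split back out by index, and any mix-up there shows up as one user's context bleeding into another's.

```python
# Illustrative-only sketch of batched generation across conversations.
from typing import List

def generate_memories_batched(histories: List[str]) -> List[str]:
    # one entry in -> one entry out, in the same order
    return [f"memory for: {h}" for h in histories]

conversations = {"user_a": "likes hiking", "user_b": "has two cats"}
batch = list(conversations.values())
memories = generate_memories_batched(batch)
# results must be mapped back to the conversation they came from
per_user = dict(zip(conversations.keys(), memories))
print(per_user)
```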

Something else appears to be going wrong with the agent cloning, then, if the clones are sharing history. Which lines are you printing this from? I think maybe we should just...
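For context, the intended pattern looks roughly like the sketch below (the zoo path is just an example of a BB2-style model file): keep one template agent and give every conversation its own `clone()`, which shares model weights but keeps its own dialogue history.

```python
# Rough sketch of per-conversation cloning; clones share weights, not history.
from parlai.core.agents import create_agent_from_model_file
from parlai.core.message import Message

base_agent = create_agent_from_model_file('zoo:blenderbot2/blenderbot2_400M/model')

def new_conversation_agent():
    # if clones created like this still see each other's context,
    # something else is copying state between them
    return base_agent.clone()

agent_a = new_conversation_agent()
agent_b = new_conversation_agent()
agent_a.observe(Message({'text': 'hi, I am user A', 'episode_done': False}))
print(agent_a.act())
```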

Perhaps something is going wrong with the batching; might it be possible to wrap your calls to the model in a semaphore such that only one user has access...
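Something along these lines is what I had in mind (a sketch only; `shared_agent` stands in for however your server holds the model): serialize access so only one conversation runs through the model at a time, which rules out cross-request batching effects.

```python
# Sketch: serialize model access with a semaphore so requests can't interleave.
import threading

model_lock = threading.Semaphore(1)  # effectively a mutex around the model

def respond(shared_agent, message):
    with model_lock:
        shared_agent.observe(message)
        return shared_agent.act()
```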

There are a lot of moving parts to BB2, which is part of the reason why BB3 is the way it is. I'll go ahead and close this issue then; please reopen...

Have you run into an issue where you actually OOM because of this? And is it a consistent additional 4-7 MB even after removing other agents?

I've confirmed on my end that just calling `agent.clone()` successively does not add any additional CUDA memory allocation. Are you saving things in the world that are taking up GPU...
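If it helps, this is roughly the kind of check I ran (a sketch, assuming a CUDA device and a TorchAgent-style `base_agent`): measure allocated CUDA memory after each successive clone and see whether it actually grows.

```python
# Sketch: verify whether successive clones grow CUDA allocations.
import torch

def report_clone_memory(base_agent, n_clones=5):
    clones = []
    for i in range(n_clones):
        clones.append(base_agent.clone())
        torch.cuda.synchronize()  # make sure pending allocations are counted
        print(f"after clone {i + 1}: "
              f"{torch.cuda.memory_allocated() / 1024 ** 2:.1f} MB allocated")
    return clones
```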

There are a few things you can try: 1. If you set up your own custom interactive world, you can simply output those statements directly as openers to the conversation;...
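A very rough sketch of option 1, assuming you control the interactive loop yourself (the `History.add_reply` hook here is what TorchAgent-style agents use for their own utterances; exactly where this goes depends on your custom world):

```python
# Sketch: seed a fixed opener as the bot's first turn in a custom world.
from parlai.core.message import Message

OPENER = "Hi! I love hiking. Have you been on any trails lately?"

def opening_turn(bot_agent):
    # record the opener in the bot's own history, then hand it to the
    # world to display as the first message of the conversation
    bot_agent.history.add_reply(OPENER)
    return Message({'text': OPENER, 'episode_done': False})
```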

You will need to look elsewhere for the data, as we do not provide it within ParlAI. The SeeKeR paper outlines the training hyperparameters.