
SUMBT: Slot-Utterance Matching for Universal and Scalable Belief Tracking (ACL 2019)

5 SUMBT issues

Bumps [tqdm](https://github.com/tqdm/tqdm) from 4.23.4 to 4.66.3. Release notes, sourced from tqdm's releases:

- tqdm v4.66.3 stable: cli: eval safety (fixes CVE-2024-34062, GHSA-g7vv-2v7x-gj9p)
- tqdm v4.66.2 stable: pandas: add `DataFrame.progress_map` (#1549); notebook: fix...

dependencies
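As a quick illustration of the `DataFrame.progress_map` feature the release notes mention, here is a minimal sketch, assuming a recent pandas (2.1+, where `DataFrame.map` exists) and tqdm >= 4.66.2; the data frame and lambda are made up for the example.

```python
import pandas as pd
from tqdm import tqdm

# Register the progress_* variants on pandas objects (tqdm's pandas integration).
tqdm.pandas()

df = pd.DataFrame({"a": range(10_000), "b": range(10_000)})

# Element-wise map with a progress bar; per the release notes above,
# DataFrame.progress_map was added in tqdm v4.66.2.
squared = df.progress_map(lambda x: x * x)
```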

- Where can I find the pre-processing scripts used to prepare the data?
- How can I train the model for MultiWOZ 2.3?

Why is `attention_mask` passed into `self.utterance_encoder` (https://github.com/SKTBrain/SUMBT/blob/011127be0587f942e95b6dd8e6e15195577a637c/code/BeliefTrackerSlotQueryMultiSlot.py#L212), and then applied again to the encoder's output afterwards (https://github.com/SKTBrain/SUMBT/blob/011127be0587f942e95b6dd8e6e15195577a637c/code/BeliefTrackerSlotQueryMultiSlot.py#L214)?
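For context, this double use of a mask is a common pattern around BERT-style encoders. The sketch below is not the repo's exact code; it assumes a HuggingFace-style `BertModel` whose forward output can be indexed for the last hidden states, and a hypothetical mean-pooling step standing in for whatever downstream layer consumes the mask.

```python
import torch

def encode_and_pool(encoder, input_ids, attention_mask):
    # First use: the mask keeps [PAD] tokens out of BERT's internal
    # self-attention while encoding the utterance.
    hidden = encoder(input_ids, attention_mask=attention_mask)[0]  # [B, T, H]

    # Second use: the returned hidden states still contain arbitrary
    # vectors at padded positions, so any later pooling or attention
    # over the sequence must mask them out again.
    mask = attention_mask.unsqueeze(-1).float()                    # [B, T, 1]
    pooled = (hidden * mask).sum(1) / mask.sum(1).clamp(min=1e-9)  # [B, H]
    return pooled
```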

When evaluating the model, I can't find the Exact Joint Accuracy of **0.48806** reported in the paper (in either the output text log or the TensorBoard log). Is this logged somewhere...
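For reference, "exact joint accuracy" counts a dialogue turn as correct only if every slot's predicted value matches the gold value. The helper below is hypothetical (not from the repo) and just illustrates the metric; `pad_id` marks padded positions to ignore.

```python
import torch

def joint_accuracy(pred: torch.Tensor, gold: torch.Tensor, pad_id: int = -1) -> float:
    """pred, gold: LongTensors of shape [num_turns, num_slots]."""
    valid = gold != pad_id                  # ignore padded slots/turns
    slot_correct = (pred == gold) | ~valid  # padded positions count as correct
    turn_correct = slot_correct.all(dim=1)  # every slot right in the turn
    return turn_correct.float().mean().item()
```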

Why do we need to expand and reshape the tensor to [(slot_dim*ds*ts), bert_seq, hid_size]? Does that mean every utterance is mapped to slot_dim separate predictions? https://github.com/SKTBrain/SUMBT/blob/011127be0587f942e95b6dd8e6e15195577a637c/code/BeliefTrackerSlotQueryMultiSlot.py#L214
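A minimal sketch of what such a reshape does, with assumed names and toy sizes (ds = dialogue batch size, ts = max turns, slot_dim = number of slots); this is an illustration of the tensor manipulation, not the repo's exact code:

```python
import torch

slot_dim, ds, ts, bert_seq, hid_size = 7, 2, 3, 64, 768

# Encoded utterances for every (dialogue, turn) pair.
hidden = torch.randn(ds * ts, bert_seq, hid_size)

# Tile the utterance states once per slot: every utterance is indeed
# consumed slot_dim times, once for each slot's value prediction, so each
# slot query can attend over the same utterance independently.
hidden = hidden.unsqueeze(0).expand(slot_dim, -1, -1, -1)        # [slot_dim, ds*ts, seq, hid]
hidden = hidden.reshape(slot_dim * ds * ts, bert_seq, hid_size)  # [(slot_dim*ds*ts), seq, hid]
```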