BERT_multimodal_transformer

5 BERT_multimodal_transformer issues

I simply return text_embedding in MAG's forward() function to reproduce the performance of the BERT model: `def forward(self, text_embedding, visual, acoustic): return text_embedding`, but I found that on MOSI, the BERT model 5... (see the sketch below)
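For context, a minimal sketch of the ablation this issue describes: a pass-through MAG that ignores the visual and acoustic inputs so the surrounding model reduces to plain BERT. The class name and feature dimensions below are hypothetical, not the repository's MAG implementation.

```python
import torch
import torch.nn as nn

class MAGPassThrough(nn.Module):
    """Ablation variant of MAG: returns the text embedding unchanged,
    so the surrounding model behaves like a text-only BERT baseline.
    (Hypothetical sketch; not the repository's MAG class.)"""

    def forward(self, text_embedding, visual, acoustic):
        # Bypass multimodal fusion entirely.
        return text_embedding

# Usage: swap this module in wherever MAG is applied and check that the
# pipeline then matches a text-only BERT run.
mag = MAGPassThrough()
text = torch.randn(8, 50, 768)     # (batch, seq_len, hidden)
visual = torch.randn(8, 50, 47)    # dummy visual features
acoustic = torch.randn(8, 50, 74)  # dummy acoustic features
out = mag(text, visual, acoustic)
assert out is text
```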

Is the Acc-2 result on the MOSEI dataset calculated without label=0 examples?
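For reference, a sketch of the convention the question refers to: binary accuracy computed only on examples whose gold label is non-zero, comparing the sign of the prediction with the sign of the label. This illustrates one common MOSEI evaluation convention and is not a statement about what the repository actually does.

```python
import numpy as np

def binary_accuracy_excluding_zero(preds, labels):
    """Acc-2 restricted to examples with a non-zero gold label:
    neutral (label == 0) examples are dropped, then positive vs.
    negative sentiment is compared by sign."""
    preds = np.asarray(preds).ravel()
    labels = np.asarray(labels).ravel()
    nonzero = labels != 0
    pred_pos = preds[nonzero] > 0
    label_pos = labels[nonzero] > 0
    return float(np.mean(pred_pos == label_pos))

print(binary_accuracy_excluding_zero([0.3, -1.2, 0.1], [1.0, -2.0, 0.0]))  # 1.0
```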

Can you tell me why Acc-7 is not reported in your paper?
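For reference, a sketch of how Acc-7 is commonly computed on MOSI/MOSEI, assuming the usual convention of clipping the continuous sentiment scores to [-3, 3] and rounding to the nearest integer; this is an assumption about the standard metric, not the paper's protocol.

```python
import numpy as np

def seven_class_accuracy(preds, labels):
    """Acc-7: clip continuous sentiment scores to [-3, 3] and round,
    giving seven classes {-3, ..., 3}, then compare exact class matches."""
    preds = np.clip(np.asarray(preds), -3.0, 3.0)
    labels = np.clip(np.asarray(labels), -3.0, 3.0)
    return float(np.mean(np.round(preds) == np.round(labels)))

print(seven_class_accuracy([2.4, -0.2, 1.1], [2.0, 0.0, 3.0]))  # ~0.667
```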

Dear Rahman, when I reproduce the [code](https://github.com/WasifurRahman/BERT_multimodal_transformer/blob/dc7876fc30f7ef362999200911e3d4d8a2bca107/bert.py#L180), it seems there is an error when using get_extended_attention_mask; can you give me some advice?
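For context, `get_extended_attention_mask` is a helper on HuggingFace's `ModuleUtilsMixin` that expands a 2-D padding mask into the 4-D additive mask the encoder expects; its signature has changed across `transformers` releases, which is a common source of errors when reproducing older code. A minimal, version-independent sketch of the same transformation (the exact scaling constant differs between releases):

```python
import torch

def make_extended_attention_mask(attention_mask, dtype=torch.float32):
    """Expand a (batch, seq_len) mask of 1s/0s into the (batch, 1, 1, seq_len)
    additive mask used inside BERT's self-attention: kept positions become 0.0
    and padded positions a large negative value.
    (Sketch of what get_extended_attention_mask computes.)"""
    extended = attention_mask[:, None, None, :].to(dtype)
    extended = (1.0 - extended) * torch.finfo(dtype).min
    return extended

mask = torch.tensor([[1, 1, 1, 0]])
print(make_extended_attention_mask(mask).shape)  # torch.Size([1, 1, 1, 4])
```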

How can I insert a MAG after an intermediate layer instead of after the embedding layer?
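One way to do this (a hypothetical sketch, not the repository's implementation) is to run the HuggingFace `BertModel`'s embeddings and encoder layers manually and apply the MAG module after a chosen layer index. The `mag` module, `injection_index`, and class name below are placeholders; `mag` is assumed to map `(text_hidden, visual, acoustic)` to a fused hidden state of the same shape.

```python
import torch
import torch.nn as nn
from transformers import BertModel

class MAGBertWithMidInjection(nn.Module):
    """Hypothetical wrapper that applies a MAG module after an intermediate
    BERT layer rather than right after the embedding layer."""

    def __init__(self, mag: nn.Module, injection_index: int = 6,
                 model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        self.mag = mag
        self.injection_index = injection_index

    def forward(self, input_ids, attention_mask, visual, acoustic):
        hidden = self.bert.embeddings(input_ids=input_ids)
        # Same additive mask BERT's encoder builds internally.
        ext_mask = (1.0 - attention_mask[:, None, None, :].to(hidden.dtype)) \
                   * torch.finfo(hidden.dtype).min
        for i, layer in enumerate(self.bert.encoder.layer):
            hidden = layer(hidden, attention_mask=ext_mask)[0]
            if i == self.injection_index:
                # Fuse visual/acoustic features after layer i
                # instead of after the embeddings.
                hidden = self.mag(hidden, visual, acoustic)
        return hidden
```

Running the encoder loop by hand keeps the pretrained weights untouched and only changes where the fusion step happens, which makes it easy to sweep the injection index as a hyperparameter.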