StreamPETR
The config projects/configs/StreamPETR/stream_petr_vov_flash_800_bs2_seq_24e.py uses MultiheadAttention, but projects/mmdet3d_plugin/models/utils/petr_transformer.py only defines PETRMultiheadAttention, not MultiheadAttention
Hello, I noticed that projects/configs/StreamPETR/stream_petr_vov_flash_800_bs2_seq_24e.py uses MultiheadAttention, as shown in the screenshot below:
However, projects/mmdet3d_plugin/models/utils/petr_transformer.py does not define MultiheadAttention, only PETRMultiheadAttention, as shown below:
Even so, the code still runs normally. While debugging, though, execution only ever enters PETRMultiheadFlashAttention; it never steps into MultiheadAttention (I debugged several times and never saw it), as shown below:
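For context on why this can happen, here is a minimal sketch of how the `type` string is resolved, assuming the mmcv 1.x-style registry that this codebase builds on: the name in the config is looked up in the ATTENTION registry, so 'MultiheadAttention' can resolve to mmcv's own mmcv.cnn.bricks.transformer.MultiheadAttention even though petr_transformer.py never defines a class with that name. The embed_dims/num_heads values below are illustrative, not copied from the config.

```python
# Sketch assuming an mmcv 1.x registry (not code from the StreamPETR repo).
from mmcv.cnn.bricks.registry import ATTENTION
from mmcv.utils import build_from_cfg

# The same kind of dict that appears in attn_cfgs of the config file;
# the values here are placeholders for illustration.
attn_cfg = dict(type='MultiheadAttention', embed_dims=256, num_heads=8)

# The registry maps the string 'MultiheadAttention' to whichever class was
# registered under that name (mmcv's built-in MultiheadAttention), so the
# lookup succeeds even though petr_transformer.py does not define it.
attn = build_from_cfg(attn_cfg, ATTENTION)
print(type(attn))
```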
So I tried instantiating PETRTemporalTransformer from projects/mmdet3d_plugin/models/utils/petr_transformer.py directly: I pasted the PETRTemporalTransformer config from projects/configs/StreamPETR/stream_petr_vov_flash_800_bs2_seq_24e.py to the end of the class PETRTemporalTransformer(BaseModule) block in projects/mmdet3d_plugin/models/utils/petr_transformer.py, as shown below:
It still prints MultiheadAttention, as shown below:
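As an alternative to pasting the config into the module, a hedged sketch of building the same transformer through the registries and listing which attention classes it actually contains is shown below. The config path is the one from this issue; the plugin import path and the model.pts_bbox_head.transformer key are assumptions about how the StreamPETR config is laid out.

```python
# Hedged sketch, not from the repo: build PETRTemporalTransformer via the
# registries and print the concrete attention classes it ends up using.
from mmcv import Config
from mmcv.utils import build_from_cfg
from mmdet.models.utils.builder import TRANSFORMER

# Assumption: importing the plugin package triggers the
# @TRANSFORMER.register_module() / @ATTENTION.register_module() decorators.
import projects.mmdet3d_plugin  # noqa: F401

cfg = Config.fromfile(
    'projects/configs/StreamPETR/stream_petr_vov_flash_800_bs2_seq_24e.py')

# Assumption: the transformer sub-config lives under model.pts_bbox_head.transformer.
transformer = build_from_cfg(cfg.model.pts_bbox_head.transformer, TRANSFORMER)

# Print every attention submodule together with its concrete class name,
# which shows whether a layer is MultiheadAttention, PETRMultiheadAttention,
# or PETRMultiheadFlashAttention.
for name, module in transformer.named_modules():
    if 'Attention' in type(module).__name__:
        print(name, '->', type(module).__name__)
```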
My questions:
1. Are MultiheadAttention and PETRMultiheadFlashAttention the same thing?
2. What roles do MultiheadAttention and PETRMultiheadAttention each play in the network?
Looking forward to your reply! Thank you!