
MoCha-Stereo

[CVPR2024] The official implementation of "MoCha-Stereo: Motif Channel Attention Network for Stereo Matching".

     

MoCha-Stereo: Motif Channel Attention Network for Stereo Matching
Ziyang Chen†, Wei Long†, He Yao†, Yongjun Zhang✱, Bingshu Wang, Yongbin Qin, Jia Wu
CVPR 2024
Correspondence: [email protected]; [email protected]
Grateful to Prof. Wenting Li, Prof. Huamin Qu, and anonymous reviewers for their comments on this work.

https://github.com/ZYangChen/MoCha-Stereo/assets/108012397/2ed414fe-d182-499b-895c-b5375ef51425

@inproceedings{chen2024mocha,
  title={MoCha-Stereo: Motif Channel Attention Network for Stereo Matching},
  author={Chen, Ziyang and Long, Wei and Yao, He and Zhang, Yongjun and Wang, Bingshu and Qin, Yongbin and Wu, Jia},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year={2024}
}

or

@article{chen2024mocha,
  title={MoCha-Stereo: Motif Channel Attention Network for Stereo Matching},
  author={Chen, Ziyang and Long, Wei and Yao, He and Zhang, Yongjun and Wang, Bingshu and Qin, Yongbin and Wu, Jia},
  journal={arXiv preprint arXiv:2404.06842},
  year={2024}
}

Todo List

  • [CVPR2024] V1 version
    • [X] Preprint paper
    • [ ] Code of MoCha-Stereo (1. MoCha-Stereo will be released in this repository in July 2024. 2. For researchers at Guizhou University, the code is already available in our internal repository — simply request access to that repository; there is no need to contact me for the code.)
    • [ ] Code of MoCha-MVS

The code and checkpoints are still being prepared and will be released once they are finalized!

Acknowledgements

This project builds on code from IGEV, DLNR, RAFT-Stereo, and GwcNet. We thank the original authors for their excellent work!