Inquiry about Model Size and Plans for Open Sourcing Larger Models
Hello, I noticed that the model on ModelScope is m2_encoder_0.2B.ckpt. Is this the 0.4B parameter model mentioned in the paper? Will there be larger models open sourced in the future?
- Yes, the file `m2_encoder_0.2B.ckpt` corresponds to the 0.4-billion-parameter model referenced in our paper. Apologies for the naming discrepancy.
- We are committed to making our research as accessible as possible. In line with this commitment, we plan to open source the 1-billion and 10-billion-parameter models in the coming months.
We fixed the naming issue in this commit: https://github.com/alipay/Ant-Multi-Modal-Framework/commit/b38e199ecca0989c84fcba49f44823a1024a14a7, and updated the naming on ModelScope: https://www.modelscope.cn/models/M2Cognition/M2-Encoder/files
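For anyone landing here, a minimal sketch of pulling the model files from ModelScope with the `modelscope` Python package follows; the model id `M2Cognition/M2-Encoder` is taken from the link above, and using `snapshot_download` as the download path is an assumption, not the team's documented workflow:

```python
# Minimal sketch: download the M2-Encoder repository from ModelScope.
# Assumption: the model id "M2Cognition/M2-Encoder" (from the link above)
# is the correct repository; adjust if the hub layout changes.
from modelscope import snapshot_download

# Fetches all files in the model repo to the local ModelScope cache
# and returns the local directory path.
local_dir = snapshot_download("M2Cognition/M2-Encoder")
print(f"Model files downloaded to: {local_dir}")
```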
Thanks for the detailed response and for sharing your future plans. I'm excited to hear about your team's intention to open source models at the 1-billion and 10-billion-parameter scale. I appreciate your contributions through open sourcing valuable AI resources.
We have released the 1B and 10B models in this PR: #14