This is definitely a milestone for the LLM community.
The performance of InternLM is amazing. If you release the weights of InternLM to the community, it will push the LM community forward.
Thanks for your attention to our model. We understand that the community is looking forward to us open-sourcing our models. We have a multi-stage plan to gradually open up our repository, and preparatory work is underway.
This is incredibly awesome, especially the quality! Will you open-source the weights?
As a suggestion for your research team: if you could train a family of smaller models (e.g., a similar one with 1/2-1/4 the parameters), they might fit on personal computers. At the very least, please open-source the model's architecture so that other people can reproduce your work.
I am an American, but if China is the first to open-source these models’ architectures, I think that makes China the leader in AI.
Thanks for the suggestion. We are working on models of various sizes to fit the needs of real-world applications.