InternLM-techreport
Releasing the model?
These are excellent results, especially compared to LLaMA.
Are you planning to release model/weights?
We plan to share more (definitely more than just a report) with the community as our work proceeds. Stay tuned.
@internlm-team Could you at least tell us the number of parameters? :)
Is there an official contact or more detail on where we can reach you?
We present InternLM, a multilingual foundational language model with 104B parameters. InternLM is pre-trained on a large corpus of 1.6T tokens using a multi-phase progressive process, and then fine-tuned to align with human preferences.
You can access the 7B model at https://github.com/InternLM/InternLM