
[Model Request] Mosaic Pretrained Transformer (MPT)

Open · ThangPM opened this issue on Aug 28, 2024 · 0 comments

⚙️ Request New Models

  • Link to an existing implementation (e.g. Hugging Face/GitHub): https://huggingface.co/mosaicml/mpt-7b-instruct
  • Is this model architecture supported by MLC-LLM? (see the list of supported models) No.

Additional context

Hello, it's time to embrace the MPT (Mosaic Pretrained Transformer) architecture in your open-source framework!

MPT offers:

  1. Efficient scaling for large language models, including ALiBi attention biases in place of positional embeddings, which allows the context length to be extended at inference time.
  2. Competitive performance with similarly sized open models across a range of NLP tasks.
  3. Enhanced stability during training (MosaicML reports training MPT-7B on 1T tokens without loss spikes).
  4. An open-source, commercially usable foundation for customization and innovation.

By incorporating MPT, you'll leverage a powerful, flexible architecture designed for the future of NLP.
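
For context, here is a minimal sketch of how the requested model is loaded today with Hugging Face `transformers` (this is not MLC-LLM's API; it just shows the architecture's entry point). Note that MPT's modeling code ships inside the model repo, so `trust_remote_code=True` is required:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mosaicml/mpt-7b-instruct"

# MPT reuses the EleutherAI GPT-NeoX tokenizer, bundled in the repo.
tokenizer = AutoTokenizer.from_pretrained(model_name)

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # halves memory vs. float32; use float32 on CPU
    trust_remote_code=True,      # required: MPT is a custom architecture in the repo
)

prompt = "### Instruction:\nExplain what MLC-LLM does.\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```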
