[Model Request] Mosaic Pretrained Transformer (MPT)
⚙️ Request New Models
- Link to an existing implementation (e.g. Hugging Face/Github): https://huggingface.co/mosaicml/mpt-7b-instruct
- Is this model architecture supported by MLC-LLM? (the list of supported models) No.
Additional context
Hello, it's time to embrace the MPT (Mosaic Pretrained Transformer) architecture in your open-source framework!
MPT offers:
- Efficient scaling for large language models.
- Improved performance across various NLP tasks.
- Enhanced stability during training.
- Open-source foundation for customization and innovation.
By incorporating MPT, you'll leverage a powerful, flexible architecture designed for the future of NLP.
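
For reference, here is a minimal sketch of how the linked checkpoint can be exercised with Hugging Face transformers today. The model id comes from the link above; the dtype, prompt, and generation settings are illustrative assumptions, and this is not MLC-LLM code, just a pointer to the existing implementation:

```python
# Minimal sketch (not MLC-LLM code): loading the referenced MPT checkpoint
# with Hugging Face transformers. Model id is from the link above; dtype,
# prompt, and generation settings are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mosaicml/mpt-7b-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_name)
# MPT repos ship custom modeling code, so trust_remote_code=True is required.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)

prompt = "Explain the MPT architecture in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```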