text-generation-inference
MPT support
Model description
Can you add support for the new MPT models? They look very promising, especially the ability to extend the context length up to 85K tokens.
Open source status
- [X] The model implementation is available
- [X] The model weights are available
Provide useful links for the implementation
https://github.com/mosaicml/llm-foundry
Probably a related issue: https://github.com/huggingface/transformers/issues/23174
Any plans on supporting mosaicml/mpt-30b-instruct?
I am also interested in deploying the mosaicml/mpt-30b-chat model. Would be really useful for the community! 🙏
Yes I am also interested in getting support for MPT models. I would love to assist in any way I can.
+1
please🙏
Hey, want to work on the implementation? Then we can open a PR. Add me on Discord: mantrakp (proud Logitech controller owner).
https://github.com/huggingface/text-generation-inference/pull/514
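For anyone waiting on this: once MPT support lands, querying the model should work through TGI's standard `/generate` REST endpoint. Below is a minimal client-side sketch; the server address, port, and sampling parameters are assumptions, and the server itself would be launched separately (e.g. with `text-generation-launcher --model-id mosaicml/mpt-30b-chat`).

```python
import json

def build_generate_payload(prompt: str, max_new_tokens: int = 64) -> str:
    """Serialize a request body for TGI's /generate endpoint.

    The {"inputs": ..., "parameters": {...}} shape follows TGI's REST API;
    temperature here is just an illustrative sampling choice.
    """
    body = {
        "inputs": prompt,
        "parameters": {
            "max_new_tokens": max_new_tokens,
            "temperature": 0.7,
        },
    }
    return json.dumps(body)

payload = build_generate_payload("What is MPT?")
# Sending it requires a running TGI server (address is hypothetical):
# requests.post("http://127.0.0.1:8080/generate",
#               data=payload,
#               headers={"Content-Type": "application/json"})
print(payload)
```

The response from a live server would be a JSON object containing a `generated_text` field.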