Mixtral 8x22B
Model description
Mistral AI's most capable model yet: https://huggingface.co/mistralai/Mixtral-8x22B-Instruct-v0.1
I see that it's not explicitly mentioned here: https://github.com/huggingface/text-generation-inference/blob/2d0a7173d4891e7cd5f9b77f8e0987b82a339e51/docs/source/supported_models.md?plain=1#L23 I wonder whether it already works.
Open source status
- [X] The model implementation is available
- [X] The model weights are available
Provide useful links for the implementation
No response
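One way to find out is simply to try serving it with TGI's Docker image. A minimal sketch is below; the image tag, shard count, and volume path are assumptions, and Mixtral 8x22B needs several large GPUs to load at all:

```shell
# Sketch: attempt to serve Mixtral-8x22B-Instruct with TGI.
# Image tag, --num-shard, and volume path are assumptions; adjust for your setup.
model=mistralai/Mixtral-8x22B-Instruct-v0.1
volume=$PWD/data  # cache downloaded weights between runs

docker run --gpus all --shm-size 1g -p 8080:80 -v $volume:/data \
    ghcr.io/huggingface/text-generation-inference:latest \
    --model-id $model --num-shard 8
```

If the architecture is unsupported, the launcher should fail during model loading with an explicit error, which would answer the question either way.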
Also have the same question.
Judging by their release notes, I assume TGI doesn't support Mixtral 8x22B yet.
Any updates on this? Thanks
I am currently unable to run the model as well, though I could be hitting other, unrelated issues.
This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.