mixtral-8x7b-instruct topic

Repositories tagged with the mixtral-8x7b-instruct topic:

Aurora

256 Stars · 21 Forks

🐳 Aurora is a Chinese-language MoE model. It is a further work built on Mixtral-8x7B that activates the model's chat capability in the Chinese open domain.
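Since Aurora is a fine-tune of Mixtral-8x7B, a checkpoint like it can typically be loaded through Hugging Face transformers the same way as the base model. A minimal sketch follows, using the official `mistralai/Mixtral-8x7B-Instruct-v0.1` id as a stand-in; Aurora's own checkpoint id and exact chat template are assumptions to verify against the repository.

```python
# Minimal sketch: loading a Mixtral-8x7B-Instruct-style chat model with
# Hugging Face transformers. Substitute the Aurora checkpoint id published
# by the project if you want the Chinese fine-tune rather than the base model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # official base model id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Build a chat prompt and generate a reply (Chinese prompt, as in Aurora's use case).
messages = [{"role": "user", "content": "你好，请介绍一下你自己。"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```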

perplexity-ai-toolkit

28 Stars · 1 Fork

A versatile CLI and Python wrapper for Perplexity's suite of large language models, including their flagship Chat and Online 'Sonar Llama-3' models along with 'Llama-3' and 'Mixtral'. Streamline the cre...
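A wrapper like this essentially automates calls to Perplexity's OpenAI-compatible chat API. The sketch below shows the raw call such a toolkit wraps; the `sonar` model id is an assumption based on Perplexity's public docs (older offerings included a Mixtral-8x7B-Instruct model), so check their documentation for current identifiers.

```python
# Minimal sketch: calling Perplexity's OpenAI-compatible chat endpoint directly.
# The API key placeholder and the "sonar" model id are assumptions; consult
# Perplexity's API docs for the model names currently available.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_PERPLEXITY_API_KEY",     # replace with your own key
    base_url="https://api.perplexity.ai",  # Perplexity's OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="sonar",  # assumed model id
    messages=[{"role": "user", "content": "Summarize mixture-of-experts in two sentences."}],
)
print(response.choices[0].message.content)
```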