mixtral-8x7b-instruct topic
mixtral-8x7b-instruct repositories
Aurora
257 Stars · 21 Forks
🐳 Aurora is a Chinese-language MoE model: further work built on Mixtral-8x7B that activates the model's chat capability in the Chinese open domain.
perplexity-ai-toolkit
45 Stars · 6 Forks
A versatile CLI and Python wrapper for Perplexity's suite of large language models including their flagship 'Sonar' models (built on top of Meta's latest and most advanced open-source model 'Llama-3.1...
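The toolkit wraps Perplexity's hosted API. As a rough illustration of what such a wrapper sits on top of, here is a minimal TypeScript sketch that calls Perplexity's OpenAI-compatible chat completions endpoint directly; the endpoint URL, the "sonar" model name, and the PPLX_API_KEY variable are assumptions based on Perplexity's public API documentation, not code taken from this repository.

```typescript
// Minimal sketch (Node 18+): a direct call to Perplexity's OpenAI-compatible
// chat completions endpoint. Endpoint, model name, and env var are assumptions.
const PPLX_API_KEY = process.env.PPLX_API_KEY; // hypothetical env var name

async function askSonar(question: string): Promise<string> {
  const response = await fetch("https://api.perplexity.ai/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${PPLX_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "sonar", // assumed model name
      messages: [{ role: "user", content: question }],
    }),
  });
  if (!response.ok) {
    throw new Error(`Perplexity API error: ${response.status}`);
  }
  const data = await response.json();
  return data.choices[0].message.content; // OpenAI-style response shape
}

askSonar("What is Mixtral-8x7B?").then(console.log).catch(console.error);
```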
amazon-bedrock-node-js-samples
17 Stars · 3 Forks
This repository contains Node.js examples to get started with the Amazon Bedrock service.
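As a rough idea of what such samples look like, here is a minimal TypeScript sketch that invokes Mixtral-8x7B Instruct through the Bedrock Runtime client from the AWS SDK for JavaScript v3; the model ID and the Mistral request/response shapes are assumptions drawn from Bedrock's Mistral AI model documentation, not code copied from this repository.

```typescript
// Minimal sketch: invoking Mixtral-8x7B Instruct via Amazon Bedrock with the
// AWS SDK for JavaScript v3. Model ID and body format are assumptions based on
// Bedrock's published Mistral AI parameters.
import {
  BedrockRuntimeClient,
  InvokeModelCommand,
} from "@aws-sdk/client-bedrock-runtime";

const client = new BedrockRuntimeClient({ region: "us-east-1" });

async function invokeMixtral(prompt: string): Promise<string> {
  const command = new InvokeModelCommand({
    modelId: "mistral.mixtral-8x7b-instruct-v0:1", // assumed Bedrock model ID
    contentType: "application/json",
    accept: "application/json",
    body: JSON.stringify({
      prompt: `<s>[INST] ${prompt} [/INST]`, // Mistral instruct prompt format
      max_tokens: 512,
      temperature: 0.5,
    }),
  });
  const response = await client.send(command);
  const payload = JSON.parse(new TextDecoder().decode(response.body));
  return payload.outputs[0].text; // Mistral responses carry an "outputs" array
}

invokeMixtral("Summarize what a mixture-of-experts model is.")
  .then(console.log)
  .catch(console.error);
```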