mixtral-8x7b-instruct topic

Repositories tagged with the mixtral-8x7b-instruct topic

Aurora

257 stars · 21 forks

🐳 Aurora is a Chinese-language MoE model. It is a follow-up work built on Mixtral-8x7B that activates the model's open-domain chat capability in Chinese.

perplexity-ai-toolkit

65 stars · 10 forks · 65 watchers

A lightweight Python API wrapper and CLI for Perplexity’s Sonar language models.
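For orientation, here is a minimal sketch of what a raw call to Perplexity's Sonar models looks like without this toolkit, using Perplexity's OpenAI-compatible Chat Completions endpoint. The model id "sonar" and the PERPLEXITY_API_KEY environment variable are assumptions for illustration, not details taken from the repository.

```python
# Minimal sketch (not the toolkit's own interface): call a Perplexity Sonar model
# through the OpenAI-compatible Chat Completions endpoint.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["PERPLEXITY_API_KEY"],  # assumed env var name
    base_url="https://api.perplexity.ai",      # Perplexity's OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="sonar",  # assumed model id; older deployments exposed mixtral-8x7b-instruct
    messages=[{"role": "user", "content": "Summarize what a mixture-of-experts model is."}],
)
print(response.choices[0].message.content)
```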

amazon-bedrock-node-js-samples

17 stars · 3 forks

This repository contains Node.js examples to get started with the Amazon Bedrock service.
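The samples in this repository are Node.js; the following is only a rough Python/boto3 sketch to show the general shape of a Bedrock runtime call, not code from the repository. The model ID and region are assumptions and require that the model is enabled for your AWS account.

```python
# Rough sketch of invoking a Mixtral model on Amazon Bedrock via the Converse API.
# Assumptions: the model ID and region below; not taken from the repository's samples.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")  # region is an assumption

response = bedrock.converse(
    modelId="mistral.mixtral-8x7b-instruct-v0:1",  # assumed Bedrock model ID for Mixtral 8x7B Instruct
    messages=[{"role": "user", "content": [{"text": "Give one use case for a mixture-of-experts LLM."}]}],
    inferenceConfig={"maxTokens": 256, "temperature": 0.5},
)
print(response["output"]["message"]["content"][0]["text"])
```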

pasllm

17 stars · 1 fork · 17 watchers

PasLLM - an LLM inference engine written in Object Pascal (synced from my private work repository)