mixtral-8x7b topic
Aurora
🐳 Aurora is a Chinese-version MoE model. Aurora is further work based on Mixtral-8x7B that activates the model's chat capability in the Chinese open domain.
LlamaIndex-RAG-WSL-CUDA
Examples of RAG using LlamaIndex with local LLMs: Gemma, Mixtral 8x7B, Llama 2, Mistral 7B, Orca 2, Phi-2, Neural 7B
ChatGPT-Telegram-Bot
🤖️ An AI chat Telegram bot with web search, powered by GPT, Claude 2.1/3, Gemini, and Groq, written in Python and deployable on Zeabur, fly.io, and Replit.
gdGPT
Train LLMs (BLOOM, LLaMA, Baichuan2-7B, ChatGLM3-6B) with DeepSpeed pipeline-parallel mode. Faster than ZeRO/ZeRO++/FSDP.
machinascript-for-robots
Build LLM-powered robots in your garage with MachinaScript For Robots!
tinychat
🔮 TinyChat is a lightweight GUI client for modern Language Models, designed for straightforward comprehension. Supports OpenAI, Anthropic, Meta, Mistral, Google and Cohere APIs.
Chinese-Mixtral-8x7B
Chinese Mixtral-8x7B (Chinese-Mixtral-8x7B)
IALab-Suite
Tool for testing different large language models without writing code.
fltr
Like grep, but for natural language questions. Based on Mistral 7B or Mixtral 8x7B.
fiddler
Fast Inference of MoE Models with CPU-GPU Orchestration