
Mistral + Haystack: build RAG pipelines that rock 🤘


A collection of notebooks and resources for building Retrieval-Augmented Generation (RAG) pipelines with Mistral models and Haystack.

💻 For other great Haystack notebooks, check out the 👩🏻‍🍳 Haystack Cookbook

📓 Notebooks

| Model | Haystack version | Link | Details | Author |
| --- | --- | --- | --- | --- |
| Mistral-7B-Instruct-v0.1 | 1.x | 🎸 Notebook | RAG on a collection of Rock music resources, using the free Hugging Face Inference API | @anakin87 |
| Mixtral-8x7B-Instruct-v0.1 | 1.x | 📄🚀 Notebook | RAG on a PDF file, using the free Hugging Face Inference API | @AlessandroDiLauro |
| Mixtral-8x7B-Instruct-v0.1 | 1.x | 🛒 Notebook<br>📊🔍 Blog post | RAG from a CSV file, product description analysis | @AlessandroDiLauro |
| Mixtral-8x7B-Instruct-v0.1 | 2.x | 🕸️💬 Notebook | RAG on the Web, using the free Hugging Face Inference API | @TuanaCelik |
| Zephyr-7B Beta | 2.x | 🪁 Article and notebook | Article on how to make this great model (fine-tuned from Mistral) run locally on Colab | @TuanaCelik @anakin87 |
| Mixtral-8x7B-Instruct-v0.1 | 2.x | 🩺💬 Article and notebook | Healthcare chatbot with Mixtral, Haystack, and PubMed | @annthurium |
| Mixtral-8x7B-Instruct-v0.1 | 2.x | 🇮🇹🇬🇧🎧 Notebook | Multilingual RAG from a podcast | @anakin87 |
| Mixtral-8x7B-Instruct-v0.1 | 2.x | 📰 Notebook | Building a Hacker News Top Stories TL;DR | @TuanaCelik |

📚 Resources

  • Mixture of Experts Explained

    In-depth blog post by Hugging Face on the Mixture of Experts architecture, which is the basis of Mixtral 8x7B.

  • Zephyr: Direct Distillation of LM Alignment

    Technical report by the Hugging Face H4 team. They explain how they trained Zephyr, a strong 7B model fine-tuned from Mistral.

    The main question it tackles: ⚗️ how to effectively distill the capabilities of GPT-4 into smaller models. The report is insightful and well worth reading. I have summarized it here.