haystack-core-integrations
Support Llama API as a Chat Generator
This pull request introduces the initial implementation of the `meta-llama-haystack` integration, enabling the use of Meta's Llama API for text generation within the Haystack framework. The changes include the addition of the core `LlamaChatGenerator` component, example usage, project configuration, and documentation.
Core Features and Implementation:

- `LlamaChatGenerator` Component:
  - Added the `LlamaChatGenerator` class to enable text generation using Meta's Llama API. This class extends `OpenAIChatGenerator` and supports features like streaming responses, customizable generation parameters, and integration with the Haystack framework. (`integrations/meta_llama/src/haystack_integrations/components/generators/llama/chat/chat_generator.py`)
- Example Usage:
  - Added a Python example demonstrating how to create a Retrieval-Augmented Generation (RAG) pipeline using the `LlamaChatGenerator` and Haystack components. (`integrations/meta_llama/examples/rag_with_llama.py`)
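The inheritance pattern the component uses can be sketched without Haystack installed. Note this is a dependency-free illustration, not the PR's actual code: the stand-in base class, the endpoint URL, and the default model name below are all assumptions.

```python
# Dependency-free sketch of the pattern the PR describes:
# LlamaChatGenerator reuses an OpenAI-compatible client, overriding only
# the default base URL (and, in the real component, the API-key env var).
# All names, URLs, and defaults below are illustrative assumptions.

class OpenAIChatGenerator:
    """Stand-in for Haystack's OpenAIChatGenerator."""

    def __init__(self, model: str, api_base_url: str = "https://api.openai.com/v1"):
        self.model = model
        self.api_base_url = api_base_url


class LlamaChatGenerator(OpenAIChatGenerator):
    """Points the OpenAI-compatible client at a Llama API endpoint."""

    def __init__(
        self,
        model: str = "Llama-3.3-70B-Instruct",  # assumed default model
        api_base_url: str = "https://api.llama.com/compat/v1/",  # assumed endpoint
    ):
        super().__init__(model=model, api_base_url=api_base_url)


generator = LlamaChatGenerator()
print(generator.api_base_url)  # the Llama endpoint, not OpenAI's
```

Because only defaults change, the subclass inherits streaming, generation kwargs, and serialization from the parent for free, which is why the review below suggests aligning it with `OpenRouterChatGenerator`, another thin subclass of the same parent.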
Project Setup and Configuration:

- Project Metadata:
  - Added `pyproject.toml` to define project metadata, dependencies, and build configuration for the `meta-llama-haystack` package. (`integrations/meta_llama/pyproject.toml`)
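As a rough illustration of what such a `pyproject.toml` covers (the field values and build backend here are assumptions, not the contents of the PR's file):

```toml
# Illustrative sketch only -- names, backend, and pins are assumptions.
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "meta-llama-haystack"
dynamic = ["version"]
dependencies = ["haystack-ai"]
```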
- Documentation and Changelog:
  - Added `README.md` with installation instructions, API overview, and licensing information. (`integrations/meta_llama/README.md`)
  - Added `CHANGELOG.md` for tracking changes. (`integrations/meta_llama/CHANGELOG.md`)
- Documentation Generation:
  - Added `pydoc/config.yml` for generating API documentation using `haystack-pydoc-tools`. (`integrations/meta_llama/pydoc/config.yml`)
Thank you for opening this PR @seyeong-han! It looks quite good to me already. My first main point is about aligning the implementation more closely with the recently added OpenRouterChatGenerator, which also inherits from OpenAIChatGenerator and is very similar; the changes I suggest concern the streaming callback and the tools. My second point is about the name of the component and the module. Thinking about users importing it, and given that they installed the integration as meta-llama-haystack, it might be more intuitive to have
`from haystack_integrations.components.generators.meta_llama import LlamaChatGenerator` or `from haystack_integrations.components.generators.meta_llama import MetaLlamaChatGenerator` instead of `from haystack_integrations.components.generators.llama import LlamaChatGenerator`. I'll ask another team member for their opinion.
I agree with @julian-risch's comment here. Between the suggestions, I think

`from haystack_integrations.components.generators.meta_llama import MetaLlamaChatGenerator`

fits the Haystack codebase better and is more straightforward for users to understand, as it shares its prefix with the package name and represents both the company and the API name.
@julian-risch @bilgeyucel Thanks for all your reviews!
We'll add an additional Agent example after merging this PR :)
@julian-risch I signed the CLA using my Meta email, but it hasn't been confirmed. Should I have used my representative email on GitHub instead?
Looks good to me! 👍 I'll make some additions that we need for a new integration. After that, I'll approve, merge, and release.
- [x] A new label named like `integration:<your integration name>` has been added to the list of labels for this repository
- [x] There is a GitHub workflow running the tests for the integration nightly and at every PR
- [x] The `labeler.yml` file has been updated
- [x] Add to README on the repository overview page
Thanks @julian-risch !
I'll let you know after adding the meta_llama documentation in haystack-integrations repository.