
Pipeline fails validation if component uses `from __future__ import annotations`

Open wochinge opened this issue 1 year ago • 1 comment

Describe the bug

A pipeline fails validation if my component uses `from __future__ import annotations`. If I drop that line, validation passes.
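
This is likely because PEP 563 (enabled by the future import) stores annotations as plain strings rather than type objects, which would explain the quoted 'str' on the llm side of the error below. A minimal sketch outside Haystack, assuming that is the cause:

from __future__ import annotations

def run(prompt: str) -> None:
    ...

# With the future import, the raw annotations are the source strings:
print(run.__annotations__)         # {'prompt': 'str', 'return': 'None'}

# typing.get_type_hints() resolves them back to real types:
import typing
print(typing.get_type_hints(run))  # {'prompt': <class 'str'>, 'return': <class 'NoneType'>}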

Error message

E           haystack.core.errors.PipelineConnectError: Cannot connect 'prompt_builder' with 'llm': no matching connections available.
E           'prompt_builder':
E            - prompt: str
E           'llm':
E            - prompt: 'str' (available)
E            - generation_kwargs: 'Optional[Dict[str, Any]]' (available)

Expected behavior

No error

To Reproduce

Comment the top line in or out:

# from __future__ import annotations
import time
from typing import Any, Dict, List, Optional

from haystack import component, Pipeline
from haystack.components.builders import PromptBuilder, AnswerBuilder
from haystack.components.embedders import SentenceTransformersTextEmbedder
from haystack.components.retrievers import InMemoryEmbeddingRetriever
from haystack.document_stores.in_memory import InMemoryDocumentStore


@component
class SleeperGenerator:
    """Component to mock a generator component during benchmarks."""

    @component.output_types(replies=List[str], meta=List[Dict[str, Any]])
    def run(
        self, prompt: str, generation_kwargs: Optional[Dict[str, Any]] = None
    ) -> Dict[str, List[str | Dict[str, Any]]]:
        time.sleep(1.0)

        return {"replies": ["test"], "meta": [{}]}


def test_silvano():
    basic_rag_pipeline = Pipeline(max_loops_allowed=10)
    template = """"
            Given the following information, answer the question.

            Context:
            {% for document in documents %}
                {{ document.content }}
            {% endfor %}

            Question: {{question}}
            Answer:\
            """
    prompt_builder = PromptBuilder(template=template)
    text_embedder = SentenceTransformersTextEmbedder(model="sentence-transformers/multi-qa-mpnet-base-dot-v1")
    generator = SleeperGenerator()
    answer_builder = AnswerBuilder()

    # Add components to your pipeline
    basic_rag_pipeline.add_component("text_embedder", text_embedder)
    basic_rag_pipeline.add_component("retriever", InMemoryEmbeddingRetriever(InMemoryDocumentStore()))
    basic_rag_pipeline.add_component("prompt_builder", prompt_builder)
    basic_rag_pipeline.add_component("llm", generator)
    basic_rag_pipeline.add_component("answer_builder", answer_builder)

    # Now, connect the components to each other
    basic_rag_pipeline.connect("text_embedder.embedding", "retriever.query_embedding")
    basic_rag_pipeline.connect("retriever", "prompt_builder.documents")
    basic_rag_pipeline.connect("prompt_builder", "llm")
    basic_rag_pipeline.connect("llm.replies", "answer_builder.replies")
    basic_rag_pipeline.connect("llm.meta", "answer_builder.meta")

FAQ Check

System:

  • Haystack version (commit or version number): 2.0

wochinge · Apr 29 '24 13:04

@silvanocerza

wochinge · Apr 29 '24 13:04

Looks like a duplicate of https://github.com/deepset-ai/haystack/issues/7609

masci · May 10 '24 06:05