
[Proposal] Enhancements to generative queries

Open · databyjp opened this issue 1 year ago • 0 comments

Proposal: Could we create new functions that wrap generative capabilities? They would take:

  • Mandatory prompt and model parameters
  • An optional search_response parameter accepting one or more Weaviate search responses (a rough signature is sketched after the list below)

This would allow a user to:

  1. Prompt an LLM (without additional retrieved data)
  2. Perform RAG from a Weaviate search response
  3. Perform RAG from multiple Weaviate search responses
  4. Pre-process to formulate a custom LLM prompt
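
A rough signature for such a function might look like this (a sketch only; parameter names and types are illustrative, not a committed API):

def generate_text(
    model,                 # mandatory: generative model configuration, e.g. Generative.aws(...)
    prompt: str,           # mandatory: the LLM prompt
    search_response=None,  # optional: a Weaviate search response, or a list of them
): ...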

Syntax proposal:

import weaviate
from weaviate.classes.config import Generative
from weaviate.classes.generate import generate_text

client = weaviate.connect_to_local()

gen_model = Generative.aws(
    model="cohere.command-text-v14",
    region="us-east-1",
)


# 💡 >>> SCENARIO 1 <<< Standalone LLM prompt

response = generate_text(
    model=gen_model,
    prompt="What is the capital of France?",
)


# 💡 >>> SCENARIO 2 <<< RAG with a Weaviate response

wiki = client.collections.get("Wiki")
search_response = wiki.query.hybrid("African or European swallow")

response = generate_text(
    model=gen_model,
    prompt="Could a swallow carry a coconut?",
    search_response=search_response
)


# 💡 >>> SCENARIO 3 <<< RAG with TWO Weaviate responses!

wiki = client.collections.get("Wiki")
scripts = client.collections.get("Scripts")

wiki_response = wiki.query.hybrid("African or European swallow")
scripts_response = scripts.query.hybrid("African or European swallow")

response = generate_text(
    model=gen_model,
    prompt="Could a swallow carry a coconut?",
    search_response=[wiki_response, scripts_response]
)


# 💡 >>> SCENARIO 4 <<< RAG with transformed text

wiki = client.collections.get("Wiki")
search_response = wiki.query.hybrid("African or European swallow")

# Build a context string from each object's title and chunk properties
context = "\n\n".join(
    f'{o.properties["title"]}: {o.properties["chunk"]}' for o in search_response.objects
)

response = generate_text(
    model=gen_model,
    prompt="Could a swallow carry a coconut? Answer based on the following information:\n\n" + context,
)
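
Internally, scenarios 2 and 3 could reduce to scenario 4: the function would serialize each retrieved object's properties into a context string and append it to the prompt before calling the model. A minimal sketch of that behaviour (purely illustrative; the helper name and prompt formatting are assumptions, not a proposed implementation):

# Hypothetical helper: fold one or more Weaviate search responses into the prompt
def _build_rag_prompt(prompt, search_response):
    responses = search_response if isinstance(search_response, list) else [search_response]
    context = "\n\n".join(
        "\n".join(f"{name}: {value}" for name, value in o.properties.items())
        for r in responses
        for o in r.objects
    )
    return f"{prompt}\n\nAnswer based on the following information:\n\n{context}"

Whether to serialize all properties or only a user-selected subset would be an open design question.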

databyjp · Aug 19 '24 15:08