
Error: src/pipelines/retrieval/retrieval.py table_contents

Open lsky-walt opened this issue 10 months ago • 3 comments

Describe the bug When the ai-service is initialized, a table_contents error is reported.

To Reproduce Steps to reproduce the behavior:

  1. Use wren-launcher-linux to start and set custom
  2. docker logs -f wrenai-wren-ai-service-1

Expected behavior Success (no error).

Screenshots

Image

Desktop (please complete the following information):

  • OS: macOS
  • Browser: Chrome 133

Wren AI Information

  • WREN_PRODUCT_VERSION=0.15.3
  • WREN_ENGINE_VERSION=0.13.1
  • WREN_AI_SERVICE_VERSION=0.15.17
  • IBIS_SERVER_VERSION=0.13.1
  • WREN_UI_VERSION=0.20.1
  • WREN_BOOTSTRAP_VERSION=0.1.5

Additional context Using the Alibaba Cloud DashScope API, which is OpenAI-compatible.
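
For reference, the service reaches these models through LiteLLM's OpenAI-compatible path. A minimal sketch of such a call against the DashScope endpoint (model name, endpoint, and key name are taken from the config below; this is only an illustration, not the service code):

import os
import litellm

# Call Alibaba Cloud DashScope through its OpenAI-compatible endpoint via LiteLLM.
# The "openai/" prefix tells LiteLLM to speak the OpenAI protocol against api_base.
response = litellm.completion(
    model="openai/qwen-plus-2025-01-25",
    api_base="https://dashscope.aliyuncs.com/compatible-mode/v1",
    api_key=os.environ["LLM_OLLAMA_API_KEY"],  # key name as used in the config below
    messages=[{"role": "user", "content": "ping"}],
    temperature=0,
    timeout=600,
)
print(response.choices[0].message.content)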

Relevant log output

# you should rename this file to config.yaml and put it in ~/.wrenai
# please pay attention to the comments starting with # and adjust the config accordingly

type: llm
provider: litellm_llm
models:
  # put OPENAI_API_KEY=<random_string> in ~/.wrenai/.env
  - api_base: https://dashscope.aliyuncs.com/compatible-mode/v1 # change this to your ollama host, api_base should be <ollama_url>/v1
    api_key_name: LLM_OLLAMA_API_KEY
    model: openai/qwen-plus-2025-01-25 # openai/<ollama_model_name>
    timeout: 600
    kwargs:
      n: 1
      temperature: 0

---
type: embedder
provider: litellm_embedder
models:
  # put OPENAI_API_KEY=<random_string> in ~/.wrenai/.env
  - model: openai/text-embedding-v3 # put your ollama embedder model name here, openai/<ollama_model_name>
    api_base: https://dashscope.aliyuncs.com/compatible-mode/v1
    api_key_name: EMBEDDER_OLLAMA_API_KEY
    timeout: 600

---
type: engine
provider: wren_ui
endpoint: http://wren-ui:3000

---
type: document_store
provider: qdrant
location: http://qdrant:6333
embedding_model_dim: 1024 # put your embedding model dimension here
timeout: 120
recreate_index: true

---
# please change the llm and embedder names to the ones you want to use
# the format of llm and embedder should be <provider>.<model_name> such as litellm_llm.gpt-4o-2024-08-06
# the pipes may be not the latest version, please refer to the latest version: https://raw.githubusercontent.com/canner/WrenAI/<WRENAI_VERSION_NUMBER>/docker/config.example.yaml
type: pipeline
pipes:
  - name: db_schema_indexing
    embedder: litellm_embedder.openai/text-embedding-v3
    document_store: qdrant
  - name: historical_question_indexing
    embedder: litellm_embedder.openai/text-embedding-v3
    document_store: qdrant
  - name: table_description_indexing
    embedder: litellm_embedder.openai/text-embedding-v3
    document_store: qdrant
  - name: db_schema_retrieval
    llm: litellm_llm.openai/qwen-plus-2025-01-25
    embedder: litellm_embedder.openai/text-embedding-v3
    document_store: qdrant
  - name: historical_question_retrieval
    embedder: litellm_embedder.openai/text-embedding-v3
    document_store: qdrant
  - name: sql_generation
    llm: litellm_llm.openai/qwen-plus-2025-01-25
    engine: wren_ui
  - name: sql_correction
    llm: litellm_llm.openai/qwen-plus-2025-01-25
    engine: wren_ui
  - name: followup_sql_generation
    llm: litellm_llm.openai/qwen-plus-2025-01-25
    engine: wren_ui
  - name: sql_summary
    llm: litellm_llm.openai/qwen-plus-2025-01-25
  - name: sql_answer
    llm: litellm_llm.openai/qwen-plus-2025-01-25
    engine: wren_ui
  - name: sql_breakdown
    llm: litellm_llm.openai/qwen-plus-2025-01-25
    engine: wren_ui
  - name: sql_expansion
    llm: litellm_llm.openai/qwen-plus-2025-01-25
    engine: wren_ui
  - name: sql_explanation
    llm: litellm_llm.openai/qwen-plus-2025-01-25
  - name: semantics_description
    llm: litellm_llm.openai/qwen-plus-2025-01-25
  - name: relationship_recommendation
    llm: litellm_llm.openai/qwen-plus-2025-01-25
    engine: wren_ui
  - name: question_recommendation
    llm: litellm_llm.openai/qwen-plus-2025-01-25
  - name: question_recommendation_db_schema_retrieval
    llm: litellm_llm.openai/qwen-plus-2025-01-25
    embedder: litellm_embedder.openai/text-embedding-v3
    document_store: qdrant
  - name: question_recommendation_sql_generation
    llm: litellm_llm.openai/qwen-plus-2025-01-25
    engine: wren_ui
  - name: chart_generation
    llm: litellm_llm.openai/qwen-plus-2025-01-25
  - name: chart_adjustment
    llm: litellm_llm.openai/qwen-plus-2025-01-25
  - name: intent_classification
    llm: litellm_llm.openai/qwen-plus-2025-01-25
    embedder: litellm_embedder.openai/text-embedding-v3
    document_store: qdrant
  - name: data_assistance
    llm: litellm_llm.openai/qwen-plus-2025-01-25
  - name: sql_pairs_deletion
    document_store: qdrant
    embedder: litellm_embedder.openai/text-embedding-v3
  - name: sql_pairs_indexing
    document_store: qdrant
    embedder: litellm_embedder.openai/text-embedding-v3
  - name: sql_pairs_retrieval
    document_store: qdrant
    embedder: litellm_embedder.openai/text-embedding-v3
    llm: litellm_llm.openai/qwen-plus-2025-01-25
  - name: sql_pairs_preparation
    document_store: qdrant
    embedder: litellm_embedder.openai/text-embedding-v3
  - name: preprocess_sql_data
    llm: litellm_llm.openai/qwen-plus-2025-01-25
  - name: sql_executor
    engine: wren_ui
  - name: sql_question_generation
    llm: litellm_llm.openai/qwen-plus-2025-01-25
  - name: sql_generation_reasoning
    llm: litellm_llm.openai/qwen-plus-2025-01-25
  - name: sql_regeneration
    llm: litellm_llm.openai/qwen-plus-2025-01-25
    engine: wren_ui

---
settings:
  column_indexing_batch_size: 50
  table_retrieval_size: 10
  table_column_retrieval_size: 100
  allow_using_db_schemas_without_pruning: false # if you want to use db schemas without pruning, set this to true. It will be faster
  query_cache_maxsize: 1000
  query_cache_ttl: 3600
  langfuse_host: https://cloud.langfuse.com
  langfuse_enable: true
  logging_level: DEBUG
  development: true
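
As a quick sanity check that every pipe above references a model that is actually declared (the <provider>.<model_name> format mentioned in the comments), the multi-document file can be loaded and cross-checked. A rough sketch, assuming the file lives at ~/.wrenai/config.yaml (not part of WrenAI itself):

import yaml
from pathlib import Path

# Load every YAML document from the multi-document config file.
docs = list(yaml.safe_load_all(Path("~/.wrenai/config.yaml").expanduser().read_text()))

# Collect declared model names, e.g. "litellm_llm.openai/qwen-plus-2025-01-25".
declared = set()
for doc in docs:
    if doc and doc.get("type") in ("llm", "embedder"):
        declared.update(f"{doc['provider']}.{m['model']}" for m in doc.get("models", []))

# Flag any pipe whose llm/embedder does not match a declared model.
pipeline = next(d for d in docs if d and d.get("type") == "pipeline")
for pipe in pipeline["pipes"]:
    for key in ("llm", "embedder"):
        ref = pipe.get(key)
        if ref and ref not in declared:
            print(f"pipe {pipe['name']}: unknown {key} '{ref}'")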

lsky-walt avatar Feb 25 '25 12:02 lsky-walt

Hi @lsky-walt, from the log and context I can't tell why it happened. If possible, could you integrate with https://langfuse.com/ (cloud or self-hosted are both fine)? Just put the Langfuse keys in the env file and you will be able to see more detail about the execution. It will also help me understand the issue. Thank you!

LANGFUSE_SECRET_KEY="sk-xxx"
LANGFUSE_PUBLIC_KEY="pk-xxx"

If you self-host Langfuse, you need to change the host in config.yaml.

  langfuse_host: https://cloud.langfuse.com <--
  langfuse_enable: true
  logging_level: DEBUG
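
If you want to confirm the keys are picked up before restarting the whole stack, a quick check with the Langfuse Python SDK can help. Just a sketch (it reads the same values you put in ~/.wrenai/.env):

import os
from langfuse import Langfuse

# Initialize the low-level client with the same keys and host the service will use.
client = Langfuse(
    public_key=os.environ["LANGFUSE_PUBLIC_KEY"],
    secret_key=os.environ["LANGFUSE_SECRET_KEY"],
    host=os.getenv("LANGFUSE_HOST", "https://cloud.langfuse.com"),
)

# auth_check() returns True when the keys and host are valid.
print("Langfuse credentials OK:", client.auth_check())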

paopa avatar Mar 04 '25 11:03 paopa


@paopa Sorry, I don't have Langfuse. Here is the local log; is this okay?


poetry run python -m src.force_update_config
Successfully updated engine names to 'wren_ui' in all pipelines
poetry run python -m src.__main__
INFO:     Will watch for changes in these directories: ['/root/wrenai/wren-ai-service']
INFO:     Uvicorn running on http://0.0.0.0:5555 (Press CTRL+C to quit)
INFO:     Started reloader process [2832] using WatchFiles
INFO:     Started server process [2845]
INFO:     Waiting for application startup.
I0304 07:02:45.982 2845 wren-ai-service:40] Imported Provider: src.providers.document_store
{"event": "Imported Provider: src.providers.document_store", "level": "info", "timestamp": "2025-03-04T07:02:45.982927Z", "lineno": 40, "module": "wren-ai-service", "message": "Imported Provider: src.providers.document_store", "asctime": "0304 07:02:45"}
I0304 07:02:45.983 2845 wren-ai-service:64] Registering provider: qdrant
{"event": "Registering provider: qdrant", "level": "info", "timestamp": "2025-03-04T07:02:45.984074Z", "lineno": 64, "module": "wren-ai-service", "message": "Registering provider: qdrant", "asctime": "0304 07:02:45"}
I0304 07:02:45.984 2845 wren-ai-service:40] Imported Provider: src.providers.document_store.qdrant
{"event": "Imported Provider: src.providers.document_store.qdrant", "level": "info", "timestamp": "2025-03-04T07:02:45.984170Z", "lineno": 40, "module": "wren-ai-service", "message": "Imported Provider: src.providers.document_store.qdrant", "asctime": "0304 07:02:45"}
I0304 07:02:45.984 2845 wren-ai-service:40] Imported Provider: src.providers.embedder
{"event": "Imported Provider: src.providers.embedder", "level": "info", "timestamp": "2025-03-04T07:02:45.984387Z", "lineno": 40, "module": "wren-ai-service", "message": "Imported Provider: src.providers.embedder", "asctime": "0304 07:02:45"}
I0304 07:02:46.110 2845 wren-ai-service:64] Registering provider: azure_openai_embedder
{"event": "Registering provider: azure_openai_embedder", "level": "info", "timestamp": "2025-03-04T07:02:46.110959Z", "lineno": 64, "module": "wren-ai-service", "message": "Registering provider: azure_openai_embedder", "asctime": "0304 07:02:46"}
I0304 07:02:46.111 2845 wren-ai-service:40] Imported Provider: src.providers.embedder.azure_openai
{"event": "Imported Provider: src.providers.embedder.azure_openai", "level": "info", "timestamp": "2025-03-04T07:02:46.111074Z", "lineno": 40, "module": "wren-ai-service", "message": "Imported Provider: src.providers.embedder.azure_openai", "asctime": "0304 07:02:46"}
I0304 07:02:47.211 2845 wren-ai-service:64] Registering provider: litellm_embedder
{"event": "Registering provider: litellm_embedder", "level": "info", "timestamp": "2025-03-04T07:02:47.211663Z", "lineno": 64, "module": "wren-ai-service", "message": "Registering provider: litellm_embedder", "asctime": "0304 07:02:47"}
I0304 07:02:47.211 2845 wren-ai-service:40] Imported Provider: src.providers.embedder.litellm
{"event": "Imported Provider: src.providers.embedder.litellm", "level": "info", "timestamp": "2025-03-04T07:02:47.211795Z", "lineno": 40, "module": "wren-ai-service", "message": "Imported Provider: src.providers.embedder.litellm", "asctime": "0304 07:02:47"}
I0304 07:02:47.212 2845 wren-ai-service:64] Registering provider: ollama_embedder
{"event": "Registering provider: ollama_embedder", "level": "info", "timestamp": "2025-03-04T07:02:47.212654Z", "lineno": 64, "module": "wren-ai-service", "message": "Registering provider: ollama_embedder", "asctime": "0304 07:02:47"}
I0304 07:02:47.212 2845 wren-ai-service:40] Imported Provider: src.providers.embedder.ollama
{"event": "Imported Provider: src.providers.embedder.ollama", "level": "info", "timestamp": "2025-03-04T07:02:47.212752Z", "lineno": 40, "module": "wren-ai-service", "message": "Imported Provider: src.providers.embedder.ollama", "asctime": "0304 07:02:47"}
I0304 07:02:47.213 2845 wren-ai-service:64] Registering provider: openai_embedder
{"event": "Registering provider: openai_embedder", "level": "info", "timestamp": "2025-03-04T07:02:47.213964Z", "lineno": 64, "module": "wren-ai-service", "message": "Registering provider: openai_embedder", "asctime": "0304 07:02:47"}
I0304 07:02:47.214 2845 wren-ai-service:40] Imported Provider: src.providers.embedder.openai
{"event": "Imported Provider: src.providers.embedder.openai", "level": "info", "timestamp": "2025-03-04T07:02:47.214058Z", "lineno": 40, "module": "wren-ai-service", "message": "Imported Provider: src.providers.embedder.openai", "asctime": "0304 07:02:47"}
I0304 07:02:47.214 2845 wren-ai-service:40] Imported Provider: src.providers.engine
{"event": "Imported Provider: src.providers.engine", "level": "info", "timestamp": "2025-03-04T07:02:47.214259Z", "lineno": 40, "module": "wren-ai-service", "message": "Imported Provider: src.providers.engine", "asctime": "0304 07:02:47"}
I0304 07:02:47.214 2845 wren-ai-service:64] Registering provider: wren_ui
{"event": "Registering provider: wren_ui", "level": "info", "timestamp": "2025-03-04T07:02:47.214585Z", "lineno": 64, "module": "wren-ai-service", "message": "Registering provider: wren_ui", "asctime": "0304 07:02:47"}
I0304 07:02:47.214 2845 wren-ai-service:64] Registering provider: wren_ibis
{"event": "Registering provider: wren_ibis", "level": "info", "timestamp": "2025-03-04T07:02:47.214682Z", "lineno": 64, "module": "wren-ai-service", "message": "Registering provider: wren_ibis", "asctime": "0304 07:02:47"}
I0304 07:02:47.214 2845 wren-ai-service:64] Registering provider: wren_engine
{"event": "Registering provider: wren_engine", "level": "info", "timestamp": "2025-03-04T07:02:47.214785Z", "lineno": 64, "module": "wren-ai-service", "message": "Registering provider: wren_engine", "asctime": "0304 07:02:47"}
I0304 07:02:47.214 2845 wren-ai-service:40] Imported Provider: src.providers.engine.wren
{"event": "Imported Provider: src.providers.engine.wren", "level": "info", "timestamp": "2025-03-04T07:02:47.214857Z", "lineno": 40, "module": "wren-ai-service", "message": "Imported Provider: src.providers.engine.wren", "asctime": "0304 07:02:47"}
I0304 07:02:47.215 2845 wren-ai-service:40] Imported Provider: src.providers.llm
{"event": "Imported Provider: src.providers.llm", "level": "info", "timestamp": "2025-03-04T07:02:47.215084Z", "lineno": 40, "module": "wren-ai-service", "message": "Imported Provider: src.providers.llm", "asctime": "0304 07:02:47"}
I0304 07:02:47.217 2845 wren-ai-service:64] Registering provider: azure_openai_llm
{"event": "Registering provider: azure_openai_llm", "level": "info", "timestamp": "2025-03-04T07:02:47.217631Z", "lineno": 64, "module": "wren-ai-service", "message": "Registering provider: azure_openai_llm", "asctime": "0304 07:02:47"}
I0304 07:02:47.217 2845 wren-ai-service:40] Imported Provider: src.providers.llm.azure_openai
{"event": "Imported Provider: src.providers.llm.azure_openai", "level": "info", "timestamp": "2025-03-04T07:02:47.217722Z", "lineno": 40, "module": "wren-ai-service", "message": "Imported Provider: src.providers.llm.azure_openai", "asctime": "0304 07:02:47"}
I0304 07:02:47.217 2845 wren-ai-service:64] Registering provider: litellm_llm
{"event": "Registering provider: litellm_llm", "level": "info", "timestamp": "2025-03-04T07:02:47.217941Z", "lineno": 64, "module": "wren-ai-service", "message": "Registering provider: litellm_llm", "asctime": "0304 07:02:47"}
I0304 07:02:47.217 2845 wren-ai-service:40] Imported Provider: src.providers.llm.litellm
{"event": "Imported Provider: src.providers.llm.litellm", "level": "info", "timestamp": "2025-03-04T07:02:47.218018Z", "lineno": 40, "module": "wren-ai-service", "message": "Imported Provider: src.providers.llm.litellm", "asctime": "0304 07:02:47"}
I0304 07:02:47.218 2845 wren-ai-service:64] Registering provider: ollama_llm
{"event": "Registering provider: ollama_llm", "level": "info", "timestamp": "2025-03-04T07:02:47.218953Z", "lineno": 64, "module": "wren-ai-service", "message": "Registering provider: ollama_llm", "asctime": "0304 07:02:47"}
I0304 07:02:47.219 2845 wren-ai-service:40] Imported Provider: src.providers.llm.ollama
{"event": "Imported Provider: src.providers.llm.ollama", "level": "info", "timestamp": "2025-03-04T07:02:47.219050Z", "lineno": 40, "module": "wren-ai-service", "message": "Imported Provider: src.providers.llm.ollama", "asctime": "0304 07:02:47"}
I0304 07:02:47.244 2845 wren-ai-service:64] Registering provider: openai_llm
{"event": "Registering provider: openai_llm", "level": "info", "timestamp": "2025-03-04T07:02:47.244538Z", "lineno": 64, "module": "wren-ai-service", "message": "Registering provider: openai_llm", "asctime": "0304 07:02:47"}
I0304 07:02:47.244 2845 wren-ai-service:40] Imported Provider: src.providers.llm.openai
{"event": "Imported Provider: src.providers.llm.openai", "level": "info", "timestamp": "2025-03-04T07:02:47.244644Z", "lineno": 40, "module": "wren-ai-service", "message": "Imported Provider: src.providers.llm.openai", "asctime": "0304 07:02:47"}
I0304 07:02:47.244 2845 wren-ai-service:40] Imported Provider: src.providers.loader
{"event": "Imported Provider: src.providers.loader", "level": "info", "timestamp": "2025-03-04T07:02:47.244733Z", "lineno": 40, "module": "wren-ai-service", "message": "Imported Provider: src.providers.loader", "asctime": "0304 07:02:47"}
I0304 07:02:47.244 2845 wren-ai-service:15] initializing provider: litellm_embedder
{"event": "initializing provider: litellm_embedder", "level": "info", "timestamp": "2025-03-04T07:02:47.244835Z", "lineno": 15, "module": "wren-ai-service", "message": "initializing provider: litellm_embedder", "asctime": "0304 07:02:47"}
I0304 07:02:47.244 2845 wren-ai-service:91] Getting provider: litellm_embedder from {'qdrant': <class 'src.providers.document_store.qdrant.QdrantProvider'>, 'azure_openai_embedder': <class 'src.providers.embedder.azure_openai.AzureOpenAIEmbedderProvider'>, 'litellm_embedder': <class 'src.providers.embedder.litellm.LitellmEmbedderProvider'>, 'ollama_embedder': <class 'src.providers.embedder.ollama.OllamaEmbedderProvider'>, 'openai_embedder': <class 'src.providers.embedder.openai.OpenAIEmbedderProvider'>, 'wren_ui': <class 'src.providers.engine.wren.WrenUI'>, 'wren_ibis': <class 'src.providers.engine.wren.WrenIbis'>, 'wren_engine': <class 'src.providers.engine.wren.WrenEngine'>, 'azure_openai_llm': <class 'src.providers.llm.azure_openai.AzureOpenAILLMProvider'>, 'litellm_llm': <class 'src.providers.llm.litellm.LitellmLLMProvider'>, 'ollama_llm': <class 'src.providers.llm.ollama.OllamaLLMProvider'>, 'openai_llm': <class 'src.providers.llm.openai.OpenAILLMProvider'>}
{"event": "Getting provider: litellm_embedder from {'qdrant': <class 'src.providers.document_store.qdrant.QdrantProvider'>, 'azure_openai_embedder': <class 'src.providers.embedder.azure_openai.AzureOpenAIEmbedderProvider'>, 'litellm_embedder': <class 'src.providers.embedder.litellm.LitellmEmbedderProvider'>, 'ollama_embedder': <class 'src.providers.embedder.ollama.OllamaEmbedderProvider'>, 'openai_embedder': <class 'src.providers.embedder.openai.OpenAIEmbedderProvider'>, 'wren_ui': <class 'src.providers.engine.wren.WrenUI'>, 'wren_ibis': <class 'src.providers.engine.wren.WrenIbis'>, 'wren_engine': <class 'src.providers.engine.wren.WrenEngine'>, 'azure_openai_llm': <class 'src.providers.llm.azure_openai.AzureOpenAILLMProvider'>, 'litellm_llm': <class 'src.providers.llm.litellm.LitellmLLMProvider'>, 'ollama_llm': <class 'src.providers.llm.ollama.OllamaLLMProvider'>, 'openai_llm': <class 'src.providers.llm.openai.OpenAILLMProvider'>}", "level": "info", "timestamp": "2025-03-04T07:02:47.244906Z", "lineno": 91, "module": "wren-ai-service", "message": "Getting provider: litellm_embedder from {'qdrant': <class 'src.providers.document_store.qdrant.QdrantProvider'>, 'azure_openai_embedder': <class 'src.providers.embedder.azure_openai.AzureOpenAIEmbedderProvider'>, 'litellm_embedder': <class 'src.providers.embedder.litellm.LitellmEmbedderProvider'>, 'ollama_embedder': <class 'src.providers.embedder.ollama.OllamaEmbedderProvider'>, 'openai_embedder': <class 'src.providers.embedder.openai.OpenAIEmbedderProvider'>, 'wren_ui': <class 'src.providers.engine.wren.WrenUI'>, 'wren_ibis': <class 'src.providers.engine.wren.WrenIbis'>, 'wren_engine': <class 'src.providers.engine.wren.WrenEngine'>, 'azure_openai_llm': <class 'src.providers.llm.azure_openai.AzureOpenAILLMProvider'>, 'litellm_llm': <class 'src.providers.llm.litellm.LitellmLLMProvider'>, 'ollama_llm': <class 'src.providers.llm.ollama.OllamaLLMProvider'>, 'openai_llm': <class 'src.providers.llm.openai.OpenAILLMProvider'>}", "asctime": "0304 07:02:47"}
I0304 07:02:47.244 2845 wren-ai-service:176] Initializing LitellmEmbedder provider with API base: https://dashscope.aliyuncs.com/compatible-mode/v1
{"event": "Initializing LitellmEmbedder provider with API base: https://dashscope.aliyuncs.com/compatible-mode/v1", "level": "info", "timestamp": "2025-03-04T07:02:47.244977Z", "lineno": 176, "module": "wren-ai-service", "message": "Initializing LitellmEmbedder provider with API base: https://dashscope.aliyuncs.com/compatible-mode/v1", "asctime": "0304 07:02:47"}
I0304 07:02:47.245 2845 wren-ai-service:179] Using Embedding Model: openai/text-embedding-v3
{"event": "Using Embedding Model: openai/text-embedding-v3", "level": "info", "timestamp": "2025-03-04T07:02:47.245032Z", "lineno": 179, "module": "wren-ai-service", "message": "Using Embedding Model: openai/text-embedding-v3", "asctime": "0304 07:02:47"}
I0304 07:02:47.245 2845 wren-ai-service:15] initializing provider: litellm_llm
{"event": "initializing provider: litellm_llm", "level": "info", "timestamp": "2025-03-04T07:02:47.245084Z", "lineno": 15, "module": "wren-ai-service", "message": "initializing provider: litellm_llm", "asctime": "0304 07:02:47"}
I0304 07:02:47.245 2845 wren-ai-service:91] Getting provider: litellm_llm from {'qdrant': <class 'src.providers.document_store.qdrant.QdrantProvider'>, 'azure_openai_embedder': <class 'src.providers.embedder.azure_openai.AzureOpenAIEmbedderProvider'>, 'litellm_embedder': <class 'src.providers.embedder.litellm.LitellmEmbedderProvider'>, 'ollama_embedder': <class 'src.providers.embedder.ollama.OllamaEmbedderProvider'>, 'openai_embedder': <class 'src.providers.embedder.openai.OpenAIEmbedderProvider'>, 'wren_ui': <class 'src.providers.engine.wren.WrenUI'>, 'wren_ibis': <class 'src.providers.engine.wren.WrenIbis'>, 'wren_engine': <class 'src.providers.engine.wren.WrenEngine'>, 'azure_openai_llm': <class 'src.providers.llm.azure_openai.AzureOpenAILLMProvider'>, 'litellm_llm': <class 'src.providers.llm.litellm.LitellmLLMProvider'>, 'ollama_llm': <class 'src.providers.llm.ollama.OllamaLLMProvider'>, 'openai_llm': <class 'src.providers.llm.openai.OpenAILLMProvider'>}
{"event": "Getting provider: litellm_llm from {'qdrant': <class 'src.providers.document_store.qdrant.QdrantProvider'>, 'azure_openai_embedder': <class 'src.providers.embedder.azure_openai.AzureOpenAIEmbedderProvider'>, 'litellm_embedder': <class 'src.providers.embedder.litellm.LitellmEmbedderProvider'>, 'ollama_embedder': <class 'src.providers.embedder.ollama.OllamaEmbedderProvider'>, 'openai_embedder': <class 'src.providers.embedder.openai.OpenAIEmbedderProvider'>, 'wren_ui': <class 'src.providers.engine.wren.WrenUI'>, 'wren_ibis': <class 'src.providers.engine.wren.WrenIbis'>, 'wren_engine': <class 'src.providers.engine.wren.WrenEngine'>, 'azure_openai_llm': <class 'src.providers.llm.azure_openai.AzureOpenAILLMProvider'>, 'litellm_llm': <class 'src.providers.llm.litellm.LitellmLLMProvider'>, 'ollama_llm': <class 'src.providers.llm.ollama.OllamaLLMProvider'>, 'openai_llm': <class 'src.providers.llm.openai.OpenAILLMProvider'>}", "level": "info", "timestamp": "2025-03-04T07:02:47.245140Z", "lineno": 91, "module": "wren-ai-service", "message": "Getting provider: litellm_llm from {'qdrant': <class 'src.providers.document_store.qdrant.QdrantProvider'>, 'azure_openai_embedder': <class 'src.providers.embedder.azure_openai.AzureOpenAIEmbedderProvider'>, 'litellm_embedder': <class 'src.providers.embedder.litellm.LitellmEmbedderProvider'>, 'ollama_embedder': <class 'src.providers.embedder.ollama.OllamaEmbedderProvider'>, 'openai_embedder': <class 'src.providers.embedder.openai.OpenAIEmbedderProvider'>, 'wren_ui': <class 'src.providers.engine.wren.WrenUI'>, 'wren_ibis': <class 'src.providers.engine.wren.WrenIbis'>, 'wren_engine': <class 'src.providers.engine.wren.WrenEngine'>, 'azure_openai_llm': <class 'src.providers.llm.azure_openai.AzureOpenAILLMProvider'>, 'litellm_llm': <class 'src.providers.llm.litellm.LitellmLLMProvider'>, 'ollama_llm': <class 'src.providers.llm.ollama.OllamaLLMProvider'>, 'openai_llm': <class 'src.providers.llm.openai.OpenAILLMProvider'>}", "asctime": "0304 07:02:47"}
I0304 07:02:47.245 2845 wren-ai-service:15] initializing provider: qdrant
{"event": "initializing provider: qdrant", "level": "info", "timestamp": "2025-03-04T07:02:47.245200Z", "lineno": 15, "module": "wren-ai-service", "message": "initializing provider: qdrant", "asctime": "0304 07:02:47"}
I0304 07:02:47.245 2845 wren-ai-service:91] Getting provider: qdrant from {'qdrant': <class 'src.providers.document_store.qdrant.QdrantProvider'>, 'azure_openai_embedder': <class 'src.providers.embedder.azure_openai.AzureOpenAIEmbedderProvider'>, 'litellm_embedder': <class 'src.providers.embedder.litellm.LitellmEmbedderProvider'>, 'ollama_embedder': <class 'src.providers.embedder.ollama.OllamaEmbedderProvider'>, 'openai_embedder': <class 'src.providers.embedder.openai.OpenAIEmbedderProvider'>, 'wren_ui': <class 'src.providers.engine.wren.WrenUI'>, 'wren_ibis': <class 'src.providers.engine.wren.WrenIbis'>, 'wren_engine': <class 'src.providers.engine.wren.WrenEngine'>, 'azure_openai_llm': <class 'src.providers.llm.azure_openai.AzureOpenAILLMProvider'>, 'litellm_llm': <class 'src.providers.llm.litellm.LitellmLLMProvider'>, 'ollama_llm': <class 'src.providers.llm.ollama.OllamaLLMProvider'>, 'openai_llm': <class 'src.providers.llm.openai.OpenAILLMProvider'>}
{"event": "Getting provider: qdrant from {'qdrant': <class 'src.providers.document_store.qdrant.QdrantProvider'>, 'azure_openai_embedder': <class 'src.providers.embedder.azure_openai.AzureOpenAIEmbedderProvider'>, 'litellm_embedder': <class 'src.providers.embedder.litellm.LitellmEmbedderProvider'>, 'ollama_embedder': <class 'src.providers.embedder.ollama.OllamaEmbedderProvider'>, 'openai_embedder': <class 'src.providers.embedder.openai.OpenAIEmbedderProvider'>, 'wren_ui': <class 'src.providers.engine.wren.WrenUI'>, 'wren_ibis': <class 'src.providers.engine.wren.WrenIbis'>, 'wren_engine': <class 'src.providers.engine.wren.WrenEngine'>, 'azure_openai_llm': <class 'src.providers.llm.azure_openai.AzureOpenAILLMProvider'>, 'litellm_llm': <class 'src.providers.llm.litellm.LitellmLLMProvider'>, 'ollama_llm': <class 'src.providers.llm.ollama.OllamaLLMProvider'>, 'openai_llm': <class 'src.providers.llm.openai.OpenAILLMProvider'>}", "level": "info", "timestamp": "2025-03-04T07:02:47.245256Z", "lineno": 91, "module": "wren-ai-service", "message": "Getting provider: qdrant from {'qdrant': <class 'src.providers.document_store.qdrant.QdrantProvider'>, 'azure_openai_embedder': <class 'src.providers.embedder.azure_openai.AzureOpenAIEmbedderProvider'>, 'litellm_embedder': <class 'src.providers.embedder.litellm.LitellmEmbedderProvider'>, 'ollama_embedder': <class 'src.providers.embedder.ollama.OllamaEmbedderProvider'>, 'openai_embedder': <class 'src.providers.embedder.openai.OpenAIEmbedderProvider'>, 'wren_ui': <class 'src.providers.engine.wren.WrenUI'>, 'wren_ibis': <class 'src.providers.engine.wren.WrenIbis'>, 'wren_engine': <class 'src.providers.engine.wren.WrenEngine'>, 'azure_openai_llm': <class 'src.providers.llm.azure_openai.AzureOpenAILLMProvider'>, 'litellm_llm': <class 'src.providers.llm.litellm.LitellmLLMProvider'>, 'ollama_llm': <class 'src.providers.llm.ollama.OllamaLLMProvider'>, 'openai_llm': <class 'src.providers.llm.openai.OpenAILLMProvider'>}", "asctime": "0304 07:02:47"}
I0304 07:02:47.245 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:47.245313Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:47"}
I0304 07:02:47.516 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:47.517848Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:47"}
I0304 07:02:47.740 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:47.740802Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:47"}
I0304 07:02:47.989 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:47.990149Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:47"}
I0304 07:02:48.222 2845 wren-ai-service:15] initializing provider: wren_ui
{"event": "initializing provider: wren_ui", "level": "info", "timestamp": "2025-03-04T07:02:48.223329Z", "lineno": 15, "module": "wren-ai-service", "message": "initializing provider: wren_ui", "asctime": "0304 07:02:48"}
I0304 07:02:48.223 2845 wren-ai-service:91] Getting provider: wren_ui from {'qdrant': <class 'src.providers.document_store.qdrant.QdrantProvider'>, 'azure_openai_embedder': <class 'src.providers.embedder.azure_openai.AzureOpenAIEmbedderProvider'>, 'litellm_embedder': <class 'src.providers.embedder.litellm.LitellmEmbedderProvider'>, 'ollama_embedder': <class 'src.providers.embedder.ollama.OllamaEmbedderProvider'>, 'openai_embedder': <class 'src.providers.embedder.openai.OpenAIEmbedderProvider'>, 'wren_ui': <class 'src.providers.engine.wren.WrenUI'>, 'wren_ibis': <class 'src.providers.engine.wren.WrenIbis'>, 'wren_engine': <class 'src.providers.engine.wren.WrenEngine'>, 'azure_openai_llm': <class 'src.providers.llm.azure_openai.AzureOpenAILLMProvider'>, 'litellm_llm': <class 'src.providers.llm.litellm.LitellmLLMProvider'>, 'ollama_llm': <class 'src.providers.llm.ollama.OllamaLLMProvider'>, 'openai_llm': <class 'src.providers.llm.openai.OpenAILLMProvider'>}
{"event": "Getting provider: wren_ui from {'qdrant': <class 'src.providers.document_store.qdrant.QdrantProvider'>, 'azure_openai_embedder': <class 'src.providers.embedder.azure_openai.AzureOpenAIEmbedderProvider'>, 'litellm_embedder': <class 'src.providers.embedder.litellm.LitellmEmbedderProvider'>, 'ollama_embedder': <class 'src.providers.embedder.ollama.OllamaEmbedderProvider'>, 'openai_embedder': <class 'src.providers.embedder.openai.OpenAIEmbedderProvider'>, 'wren_ui': <class 'src.providers.engine.wren.WrenUI'>, 'wren_ibis': <class 'src.providers.engine.wren.WrenIbis'>, 'wren_engine': <class 'src.providers.engine.wren.WrenEngine'>, 'azure_openai_llm': <class 'src.providers.llm.azure_openai.AzureOpenAILLMProvider'>, 'litellm_llm': <class 'src.providers.llm.litellm.LitellmLLMProvider'>, 'ollama_llm': <class 'src.providers.llm.ollama.OllamaLLMProvider'>, 'openai_llm': <class 'src.providers.llm.openai.OpenAILLMProvider'>}", "level": "info", "timestamp": "2025-03-04T07:02:48.223864Z", "lineno": 91, "module": "wren-ai-service", "message": "Getting provider: wren_ui from {'qdrant': <class 'src.providers.document_store.qdrant.QdrantProvider'>, 'azure_openai_embedder': <class 'src.providers.embedder.azure_openai.AzureOpenAIEmbedderProvider'>, 'litellm_embedder': <class 'src.providers.embedder.litellm.LitellmEmbedderProvider'>, 'ollama_embedder': <class 'src.providers.embedder.ollama.OllamaEmbedderProvider'>, 'openai_embedder': <class 'src.providers.embedder.openai.OpenAIEmbedderProvider'>, 'wren_ui': <class 'src.providers.engine.wren.WrenUI'>, 'wren_ibis': <class 'src.providers.engine.wren.WrenIbis'>, 'wren_engine': <class 'src.providers.engine.wren.WrenEngine'>, 'azure_openai_llm': <class 'src.providers.llm.azure_openai.AzureOpenAILLMProvider'>, 'litellm_llm': <class 'src.providers.llm.litellm.LitellmLLMProvider'>, 'ollama_llm': <class 'src.providers.llm.ollama.OllamaLLMProvider'>, 'openai_llm': <class 'src.providers.llm.openai.OpenAILLMProvider'>}", "asctime": "0304 07:02:48"}
I0304 07:02:48.224 2845 wren-ai-service:24] Using Engine: wren_ui
{"event": "Using Engine: wren_ui", "level": "info", "timestamp": "2025-03-04T07:02:48.224228Z", "lineno": 24, "module": "wren-ai-service", "message": "Using Engine: wren_ui", "asctime": "0304 07:02:48"}
I0304 07:02:48.237 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:48.238297Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:48"}
I0304 07:02:48.290 2845 wren-ai-service:135] Loading Helpers for DB Schema Indexing Pipeline: src.pipelines.indexing.utils
{"event": "Loading Helpers for DB Schema Indexing Pipeline: src.pipelines.indexing.utils", "level": "info", "timestamp": "2025-03-04T07:02:48.290488Z", "lineno": 135, "module": "wren-ai-service", "message": "Loading Helpers for DB Schema Indexing Pipeline: src.pipelines.indexing.utils", "asctime": "0304 07:02:48"}
I0304 07:02:48.291 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:48.291263Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:48"}
I0304 07:02:48.329 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:48.329446Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:48"}
I0304 07:02:48.367 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:48.368166Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:48"}
W0304 07:02:48.406 2845 wren-ai-service:155] SQL pairs file not found: sql_pairs.json
{"event": "SQL pairs file not found: sql_pairs.json", "level": "warning", "timestamp": "2025-03-04T07:02:48.406303Z", "lineno": 155, "module": "wren-ai-service", "message": "SQL pairs file not found: sql_pairs.json", "asctime": "0304 07:02:48"}
I0304 07:02:48.406 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:48.407126Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:48"}
I0304 07:02:48.444 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:48.444998Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:48"}
I0304 07:02:48.486 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:48.486382Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:48"}
I0304 07:02:48.527 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:48.527738Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:48"}
I0304 07:02:48.566 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:48.566741Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:48"}
I0304 07:02:48.605 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:48.605557Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:48"}
I0304 07:02:48.643 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:48.643413Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:48"}
I0304 07:02:48.720 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:48.720770Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:48"}
I0304 07:02:48.757 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:48.757562Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:48"}
I0304 07:02:48.803 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:48.804016Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:48"}
I0304 07:02:48.843 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:48.843645Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:48"}
I0304 07:02:48.887 2845 wren-ai-service:405] Using Qdrant Document Store with Embedding Model Dimension: 1024
{"event": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "level": "info", "timestamp": "2025-03-04T07:02:48.887564Z", "lineno": 405, "module": "wren-ai-service", "message": "Using Qdrant Document Store with Embedding Model Dimension: 1024", "asctime": "0304 07:02:48"}
W0304 07:02:48.924 2845 wren-ai-service:155] SQL pairs file not found: sql_pairs.json
{"event": "SQL pairs file not found: sql_pairs.json", "level": "warning", "timestamp": "2025-03-04T07:02:48.924711Z", "lineno": 155, "module": "wren-ai-service", "message": "SQL pairs file not found: sql_pairs.json", "asctime": "0304 07:02:48"}
I0304 07:02:48.927 2845 wren-ai-service:291] Service version: 0.15.17
{"event": "Service version: 0.15.17", "level": "info", "timestamp": "2025-03-04T07:02:48.927269Z", "lineno": 291, "module": "wren-ai-service", "message": "Service version: 0.15.17", "asctime": "0304 07:02:48"}
{"event": "Langfuse client is disabled since no public_key was provided as a parameter or environment variable 'LANGFUSE_PUBLIC_KEY'. See our docs: https://langfuse.com/docs/sdk/python/low-level-sdk#initialize-client", "level": "warning", "timestamp": "2025-03-04T07:02:48.927372Z", "lineno": 259, "module": "langfuse"}
I0304 07:02:48.943 2845 wren-ai-service:84] LANGFUSE_ENABLE: True
{"event": "LANGFUSE_ENABLE: True", "level": "info", "timestamp": "2025-03-04T07:02:48.943918Z", "lineno": 84, "module": "wren-ai-service", "message": "LANGFUSE_ENABLE: True", "asctime": "0304 07:02:48"}
I0304 07:02:48.943 2845 wren-ai-service:85] LANGFUSE_HOST: https://cloud.langfuse.com
{"event": "LANGFUSE_HOST: https://cloud.langfuse.com", "level": "info", "timestamp": "2025-03-04T07:02:48.943999Z", "lineno": 85, "module": "wren-ai-service", "message": "LANGFUSE_HOST: https://cloud.langfuse.com", "asctime": "0304 07:02:48"}
INFO:     Application startup complete.
INFO:     172.19.0.6:47654 - "POST /v1/asks HTTP/1.1" 200 OK
I0304 07:03:01.990 2845 wren-ai-service:157] HistoricalQuestion Retrieval pipeline is running...
{"event": "HistoricalQuestion Retrieval pipeline is running...", "level": "info", "timestamp": "2025-03-04T07:03:01.990797Z", "lineno": 157, "module": "wren-ai-service", "message": "HistoricalQuestion Retrieval pipeline is running...", "asctime": "0304 07:03:01"}
I0304 07:03:01.995 2845 wren-ai-service:322] Intent Classification pipeline is running...
{"event": "Intent Classification pipeline is running...", "level": "info", "timestamp": "2025-03-04T07:03:01.995299Z", "lineno": 322, "module": "wren-ai-service", "message": "Intent Classification pipeline is running...", "asctime": "0304 07:03:01"}
INFO:     172.19.0.6:47670 - "GET /v1/asks/dd7e1391-6936-409f-ba53-889f800eaa6c/result HTTP/1.1" 200 OK
I0304 07:03:02.303 2845 wren-ai-service:160] dbschema_retrieval with table_names: []
{"event": "dbschema_retrieval with table_names: []", "level": "info", "timestamp": "2025-03-04T07:03:02.303598Z", "lineno": 160, "module": "wren-ai-service", "message": "dbschema_retrieval with table_names: []", "asctime": "0304 07:03:02"}
INFO:     172.19.0.6:47686 - "GET /v1/asks/dd7e1391-6936-409f-ba53-889f800eaa6c/result HTTP/1.1" 200 OK
INFO:     172.19.0.6:47692 - "GET /v1/asks/dd7e1391-6936-409f-ba53-889f800eaa6c/result HTTP/1.1" 200 OK
INFO:     172.19.0.6:47698 - "GET /v1/asks/dd7e1391-6936-409f-ba53-889f800eaa6c/result HTTP/1.1" 200 OK
INFO:     172.19.0.6:47714 - "GET /v1/asks/dd7e1391-6936-409f-ba53-889f800eaa6c/result HTTP/1.1" 200 OK
INFO:     172.19.0.6:47720 - "GET /v1/asks/dd7e1391-6936-409f-ba53-889f800eaa6c/result HTTP/1.1" 200 OK
INFO:     172.19.0.6:47732 - "GET /v1/asks/dd7e1391-6936-409f-ba53-889f800eaa6c/result HTTP/1.1" 200 OK
I0304 07:03:08.080 2845 wren-ai-service:490] Ask Retrieval pipeline is running...
{"event": "Ask Retrieval pipeline is running...", "level": "info", "timestamp": "2025-03-04T07:03:08.080931Z", "lineno": 490, "module": "wren-ai-service", "message": "Ask Retrieval pipeline is running...", "asctime": "0304 07:03:08"}
I0304 07:03:08.354 2845 wren-ai-service:298] db_schemas token count is greater than 100,000, so we will prune columns
{"event": "db_schemas token count is greater than 100,000, so we will prune columns", "level": "info", "timestamp": "2025-03-04T07:03:08.355071Z", "lineno": 298, "module": "wren-ai-service", "message": "db_schemas token count is greater than 100,000, so we will prune columns", "asctime": "0304 07:03:08"}
INFO:     172.19.0.6:47746 - "GET /v1/asks/dd7e1391-6936-409f-ba53-889f800eaa6c/result HTTP/1.1" 200 OK
INFO:     172.19.0.6:47756 - "GET /v1/asks/dd7e1391-6936-409f-ba53-889f800eaa6c/result HTTP/1.1" 200 OK
INFO:     172.19.0.6:47764 - "GET /v1/asks/dd7e1391-6936-409f-ba53-889f800eaa6c/result HTTP/1.1" 200 OK
INFO:     172.19.0.6:48592 - "GET /v1/asks/dd7e1391-6936-409f-ba53-889f800eaa6c/result HTTP/1.1" 200 OK
INFO:     172.19.0.6:48606 - "GET /v1/asks/dd7e1391-6936-409f-ba53-889f800eaa6c/result HTTP/1.1" 200 OK
I0304 07:03:13.206 2845 wren-ai-service:345] table_content: None
{"event": "table_content: None", "level": "info", "timestamp": "2025-03-04T07:03:13.206278Z", "lineno": 345, "module": "wren-ai-service", "message": "table_content: None", "asctime": "0304 07:03:13"}
{"event": "\n********************************************************************************\n> construct_retrieval_results [src.pipelines.retrieval.retrieval.construct_retrieval_results()] encountered an error<\n> Node inputs:\n{'check_using_db_schemas_without_pruning': \"<Task finished name='Task-77' \"\n                                           'coro=<AsyncGraphAdap...',\n 'construct_db_schemas': \"<Task finished name='Task-76' \"\n                         'coro=<AsyncGraphAdap...',\n 'dbschema_retrieval': \"<Task finished name='Task-75' coro=<AsyncGraphAdap...\",\n 'filter_columns_in_tables': \"<Task finished name='Task-79' \"\n                             'coro=<AsyncGraphAdap...'}\n********************************************************************************", "level": "error", "timestamp": "2025-03-04T07:03:13.206699Z", "lineno": 129, "module": "hamilton.async_driver", "exception": [{"exc_type": "KeyError", "exc_value": "'table_name'", "syntax_error": null, "is_cause": false, "frames": [{"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", "lineno": 122, "name": "new_fn"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", "lineno": 256, "name": "sync_wrapper"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", "lineno": 517, "name": "_handle_exception"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", "lineno": 254, "name": "sync_wrapper"}, {"filename": "/root/wrenai/wren-ai-service/src/pipelines/retrieval/retrieval.py", "lineno": 347, "name": "construct_retrieval_results"}]}]}
{"event": "-------------------------------------------------------------------\nOh no an error! Need help with Hamilton?\nJoin our slack and ask for help! https://join.slack.com/t/hamilton-opensource/shared_invite/zt-2niepkra8-DGKGf_tTYhXuJWBTXtIs4g\n-------------------------------------------------------------------\n", "level": "error", "timestamp": "2025-03-04T07:03:13.206845Z", "lineno": 373, "module": "hamilton.async_driver"}
E0304 07:03:13.207 2845 wren-ai-service:504] ask pipeline - OTHERS: 'table_name'
Traceback (most recent call last):
  File "/root/wrenai/wren-ai-service/src/web/v1/services/ask.py", line 307, in ask
    retrieval_result = await self._pipelines["retrieval"].run(
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 219, in async_wrapper
    self._handle_exception(observation, e)
  File "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 517, in _handle_exception
    raise e
  File "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 217, in async_wrapper
    result = await func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/wrenai/wren-ai-service/src/pipelines/retrieval/retrieval.py", line 491, in run
    return await self._pipe.execute(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", line 375, in execute
    raise e
  File "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", line 366, in execute
    outputs = await self.raw_execute(_final_vars, overrides, display_graph, inputs=inputs)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", line 326, in raw_execute
    raise e
  File "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", line 321, in raw_execute
    results = await await_dict_of_tasks(task_dict)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", line 23, in await_dict_of_tasks
    coroutines_gathered = await asyncio.gather(*coroutines)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", line 36, in process_value
    return await val
           ^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", line 122, in new_fn
    await fn(**fn_kwargs) if asyncio.iscoroutinefunction(fn) else fn(**fn_kwargs)
                                                                  ^^^^^^^^^^^^^^^
  File "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 256, in sync_wrapper
    self._handle_exception(observation, e)
  File "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 517, in _handle_exception
    raise e
  File "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 254, in sync_wrapper
    result = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "/root/wrenai/wren-ai-service/src/pipelines/retrieval/retrieval.py", line 347, in construct_retrieval_results
    reformated_json[table["table_name"]] = table.get("table_contents", None)
                    ~~~~~^^^^^^^^^^^^^^
KeyError: 'table_name'
{"event": "ask pipeline - OTHERS: 'table_name'", "level": "error", "timestamp": "2025-03-04T07:03:13.208063Z", "lineno": 504, "module": "wren-ai-service", "message": "ask pipeline - OTHERS: 'table_name'", "asctime": "0304 07:03:13", "exception": [{"exc_type": "KeyError", "exc_value": "'table_name'", "syntax_error": null, "is_cause": false, "frames": [{"filename": "/root/wrenai/wren-ai-service/src/web/v1/services/ask.py", "lineno": 307, "name": "ask"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", "lineno": 219, "name": "async_wrapper"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", "lineno": 517, "name": "_handle_exception"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", "lineno": 217, "name": "async_wrapper"}, {"filename": "/root/wrenai/wren-ai-service/src/pipelines/retrieval/retrieval.py", "lineno": 491, "name": "run"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", "lineno": 375, "name": "execute"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", "lineno": 366, "name": "execute"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", "lineno": 326, "name": "raw_execute"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", "lineno": 321, "name": "raw_execute"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", "lineno": 23, "name": "await_dict_of_tasks"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", "lineno": 36, "name": "process_value"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", "lineno": 122, "name": "new_fn"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", "lineno": 256, "name": "sync_wrapper"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", "lineno": 517, "name": "_handle_exception"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", "lineno": 254, "name": "sync_wrapper"}, {"filename": "/root/wrenai/wren-ai-service/src/pipelines/retrieval/retrieval.py", "lineno": 347, "name": "construct_retrieval_results"}]}]}
INFO:     172.19.0.6:48616 - "GET /v1/asks/dd7e1391-6936-409f-ba53-889f800eaa6c/result HTTP/1.1" 200 OK
INFO:     172.19.0.6:48628 - "POST /v1/question-recommendations HTTP/1.1" 200 OK
I0304 07:03:14.027 2845 wren-ai-service:151] Request 990931b9-2655-42ba-a4ca-fd062eff94c1: Generate Question Recommendation pipeline is running...
{"event": "Request 990931b9-2655-42ba-a4ca-fd062eff94c1: Generate Question Recommendation pipeline is running...", "level": "info", "timestamp": "2025-03-04T07:03:14.027487Z", "lineno": 151, "module": "wren-ai-service", "message": "Request 990931b9-2655-42ba-a4ca-fd062eff94c1: Generate Question Recommendation pipeline is running...", "asctime": "0304 07:03:14"}
I0304 07:03:14.029 2845 wren-ai-service:263] Question Recommendation pipeline is running...
{"event": "Question Recommendation pipeline is running...", "level": "info", "timestamp": "2025-03-04T07:03:14.029486Z", "lineno": 263, "module": "wren-ai-service", "message": "Question Recommendation pipeline is running...", "asctime": "0304 07:03:14"}
INFO:     172.19.0.6:48634 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:48650 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:48652 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:48654 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:48664 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:48672 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:48674 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
I0304 07:03:20.481 2845 wren-ai-service:490] Ask Retrieval pipeline is running...
{"event": "Ask Retrieval pipeline is running...", "level": "info", "timestamp": "2025-03-04T07:03:20.481502Z", "lineno": 490, "module": "wren-ai-service", "message": "Ask Retrieval pipeline is running...", "asctime": "0304 07:03:20"}
I0304 07:03:20.481 2845 wren-ai-service:490] Ask Retrieval pipeline is running...
{"event": "Ask Retrieval pipeline is running...", "level": "info", "timestamp": "2025-03-04T07:03:20.481913Z", "lineno": 490, "module": "wren-ai-service", "message": "Ask Retrieval pipeline is running...", "asctime": "0304 07:03:20"}
I0304 07:03:20.482 2845 wren-ai-service:490] Ask Retrieval pipeline is running...
{"event": "Ask Retrieval pipeline is running...", "level": "info", "timestamp": "2025-03-04T07:03:20.482282Z", "lineno": 490, "module": "wren-ai-service", "message": "Ask Retrieval pipeline is running...", "asctime": "0304 07:03:20"}
I0304 07:03:20.761 2845 wren-ai-service:298] db_schemas token count is greater than 100,000, so we will prune columns
{"event": "db_schemas token count is greater than 100,000, so we will prune columns", "level": "info", "timestamp": "2025-03-04T07:03:20.761898Z", "lineno": 298, "module": "wren-ai-service", "message": "db_schemas token count is greater than 100,000, so we will prune columns", "asctime": "0304 07:03:20"}
I0304 07:03:20.788 2845 wren-ai-service:298] db_schemas token count is greater than 100,000, so we will prune columns
{"event": "db_schemas token count is greater than 100,000, so we will prune columns", "level": "info", "timestamp": "2025-03-04T07:03:20.788665Z", "lineno": 298, "module": "wren-ai-service", "message": "db_schemas token count is greater than 100,000, so we will prune columns", "asctime": "0304 07:03:20"}
INFO:     172.19.0.6:48676 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
I0304 07:03:20.830 2845 wren-ai-service:298] db_schemas token count is greater than 100,000, so we will prune columns
{"event": "db_schemas token count is greater than 100,000, so we will prune columns", "level": "info", "timestamp": "2025-03-04T07:03:20.830710Z", "lineno": 298, "module": "wren-ai-service", "message": "db_schemas token count is greater than 100,000, so we will prune columns", "asctime": "0304 07:03:20"}
INFO:     172.19.0.6:52422 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:52432 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:52446 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:52454 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:52470 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:52478 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:52480 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:52490 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
I0304 07:03:29.447 2845 wren-ai-service:345] table_content: None
{"event": "table_content: None", "level": "info", "timestamp": "2025-03-04T07:03:29.447106Z", "lineno": 345, "module": "wren-ai-service", "message": "table_content: None", "asctime": "0304 07:03:29"}
{"event": "\n********************************************************************************\n> construct_retrieval_results [src.pipelines.retrieval.retrieval.construct_retrieval_results()] encountered an error<\n> Node inputs:\n{'check_using_db_schemas_without_pruning': \"<Task finished name='Task-153' \"\n                                           'coro=<AsyncGraphAda...',\n 'construct_db_schemas': \"<Task finished name='Task-152' \"\n                         'coro=<AsyncGraphAda...',\n 'dbschema_retrieval': \"<Task finished name='Task-151' coro=<AsyncGraphAda...\",\n 'filter_columns_in_tables': \"<Task finished name='Task-155' \"\n                             'coro=<AsyncGraphAda...'}\n********************************************************************************", "level": "error", "timestamp": "2025-03-04T07:03:29.447443Z", "lineno": 129, "module": "hamilton.async_driver", "exception": [{"exc_type": "KeyError", "exc_value": "'table_name'", "syntax_error": null, "is_cause": false, "frames": [{"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", "lineno": 122, "name": "new_fn"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", "lineno": 256, "name": "sync_wrapper"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", "lineno": 517, "name": "_handle_exception"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", "lineno": 254, "name": "sync_wrapper"}, {"filename": "/root/wrenai/wren-ai-service/src/pipelines/retrieval/retrieval.py", "lineno": 347, "name": "construct_retrieval_results"}]}]}
{"event": "-------------------------------------------------------------------\nOh no an error! Need help with Hamilton?\nJoin our slack and ask for help! https://join.slack.com/t/hamilton-opensource/shared_invite/zt-2niepkra8-DGKGf_tTYhXuJWBTXtIs4g\n-------------------------------------------------------------------\n", "level": "error", "timestamp": "2025-03-04T07:03:29.447598Z", "lineno": 373, "module": "hamilton.async_driver"}
E0304 07:03:29.447 2845 wren-ai-service:129] Request 990931b9-2655-42ba-a4ca-fd062eff94c1: Error validating question: 'table_name'
{"event": "Request 990931b9-2655-42ba-a4ca-fd062eff94c1: Error validating question: 'table_name'", "level": "error", "timestamp": "2025-03-04T07:03:29.447938Z", "lineno": 129, "module": "wren-ai-service", "message": "Request 990931b9-2655-42ba-a4ca-fd062eff94c1: Error validating question: 'table_name'", "asctime": "0304 07:03:29"}
INFO:     172.19.0.6:52500 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:52516 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:36604 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:36610 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:36616 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:36622 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
I0304 07:03:35.196 2845 wren-ai-service:345] table_content: None
{"event": "table_content: None", "level": "info", "timestamp": "2025-03-04T07:03:35.196145Z", "lineno": 345, "module": "wren-ai-service", "message": "table_content: None", "asctime": "0304 07:03:35"}
{"event": "\n********************************************************************************\n> construct_retrieval_results [src.pipelines.retrieval.retrieval.construct_retrieval_results()] encountered an error<\n> Node inputs:\n{'check_using_db_schemas_without_pruning': \"<Task finished name='Task-162' \"\n                                           'coro=<AsyncGraphAda...',\n 'construct_db_schemas': \"<Task finished name='Task-161' \"\n                         'coro=<AsyncGraphAda...',\n 'dbschema_retrieval': \"<Task finished name='Task-160' coro=<AsyncGraphAda...\",\n 'filter_columns_in_tables': \"<Task finished name='Task-164' \"\n                             'coro=<AsyncGraphAda...'}\n********************************************************************************", "level": "error", "timestamp": "2025-03-04T07:03:35.196463Z", "lineno": 129, "module": "hamilton.async_driver", "exception": [{"exc_type": "KeyError", "exc_value": "'table_name'", "syntax_error": null, "is_cause": false, "frames": [{"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/hamilton/async_driver.py", "lineno": 122, "name": "new_fn"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", "lineno": 256, "name": "sync_wrapper"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", "lineno": 517, "name": "_handle_exception"}, {"filename": "/root/.cache/pypoetry/virtualenvs/wren-ai-service-eNPyrrjE-py3.12/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", "lineno": 254, "name": "sync_wrapper"}, {"filename": "/root/wrenai/wren-ai-service/src/pipelines/retrieval/retrieval.py", "lineno": 347, "name": "construct_retrieval_results"}]}]}
{"event": "-------------------------------------------------------------------\nOh no an error! Need help with Hamilton?\nJoin our slack and ask for help! https://join.slack.com/t/hamilton-opensource/shared_invite/zt-2niepkra8-DGKGf_tTYhXuJWBTXtIs4g\n-------------------------------------------------------------------\n", "level": "error", "timestamp": "2025-03-04T07:03:35.196603Z", "lineno": 373, "module": "hamilton.async_driver"}
E0304 07:03:35.196 2845 wren-ai-service:129] Request 990931b9-2655-42ba-a4ca-fd062eff94c1: Error validating question: 'table_name'
{"event": "Request 990931b9-2655-42ba-a4ca-fd062eff94c1: Error validating question: 'table_name'", "level": "error", "timestamp": "2025-03-04T07:03:35.196962Z", "lineno": 129, "module": "wren-ai-service", "message": "Request 990931b9-2655-42ba-a4ca-fd062eff94c1: Error validating question: 'table_name'", "asctime": "0304 07:03:35"}
INFO:     172.19.0.6:36632 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:36640 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:36654 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:36660 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:36676 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:36680 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:46060 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:46068 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:46072 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:46074 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:46090 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:46096 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:46102 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:46118 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:46124 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:59590 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:59592 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:59602 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:59616 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:59622 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:59626 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:59634 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:59648 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     172.19.0.6:59656 - "GET /v1/question-recommendations/990931b9-2655-42ba-a4ca-fd062eff94c1 HTTP/1.1" 200 OK
INFO:     Shutting down
INFO:     Waiting for background tasks to complete. (CTRL+C to force quit)
INFO:     Finished server process [2845]
INFO:     Stopping reloader process [2832]


lsky-walt avatar Mar 05 '25 08:03 lsky-walt

Hey @lsky-walt, the local log is a bit tricky to follow. The error message is:

    reformated_json[table["table_name"]] = table.get("table_contents", None)
                    ~~~~~^^^^^^^^^^^^^^
    KeyError: 'table_name'

If it's a KeyError on table_name, it's likely because the LLM you're using didn't produce the structured output correctly, although I'm not sure what it actually generated.
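
For reference, here's a minimal sketch of how that KeyError can arise (this is not the actual retrieval.py code, and the sample llm_output below is made up): if one of the entries in the LLM's structured output is missing the "table_name" key, the dict lookup raises. The defensive .get() check is just one illustrative way to skip malformed entries.

    # Hypothetical example of the LLM's parsed structured output;
    # the second entry is malformed and has no "table_name" key.
    llm_output = [
        {"table_name": "orders", "table_contents": "order_id, customer_id, ..."},
        {"chosen_columns": ["orders.order_id"]},
    ]

    reformated_json = {}
    for table in llm_output:
        table_name = table.get("table_name")
        if table_name is None:
            # The original code does table["table_name"] here,
            # which raises KeyError: 'table_name' for entries like the second one.
            continue
        reformated_json[table_name] = table.get("table_contents", None)

    print(reformated_json)  # {'orders': 'order_id, customer_id, ...'}

If you can capture the raw LLM response at that point in the pipeline, it should show whether the model is returning the expected keys.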

paopa avatar Mar 06 '25 03:03 paopa