
wrenai-wren-ai-service start error

Open YISIO opened this issue 9 months ago • 6 comments

Describe the bug

The container log:

    2025-03-05 16:55:04 ERROR: Traceback (most recent call last):
      File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 693, in lifespan
        async with self.lifespan_context(app) as maybe_state:
      File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
        return await anext(self.gen)
      File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
        async with original_context(app) as maybe_original_state:
      File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
        return await anext(self.gen)
      File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
        async with original_context(app) as maybe_original_state:
      File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
        return await anext(self.gen)
      File "/src/main.py", line 32, in lifespan
        app.state.service_container = create_service_container(pipe_components, settings)
      File "/src/globals.py", line 193, in create_service_container
        **pipe_components["sql_explanation"],
    KeyError: 'sql_explanation'

    ERROR: Application startup failed. Exiting.

Attachment: config.yaml.zip


Relevant log output

  • Please share config.yaml with us, it should be located at ~/.wrenai/config.yaml.
  • Please share your logs with us with the following command:
    docker logs wrenai-wren-ui-1 >& wrenai-wren-ui.log && \
    docker logs wrenai-wren-ai-service-1 >& wrenai-wren-ai-service.log && \
    docker logs wrenai-wren-engine-1 >& wrenai-wren-engine.log && \
    docker logs wrenai-ibis-server-1 >& wrenai-ibis-server.log
    

YISIO · Mar 05 '25 09:03

It seems like a mismatch between the image version and the config.yaml file. @paopa could you help with it?

onlyjackfrost · Mar 05 '25 14:03

Hi @YISIO, if you're using version 0.15.3, could you please check out this link: https://github.com/Canner/WrenAI/blob/4d6c82ca69985a7d38270b7a4b815b2276f1fd9c/wren-ai-service/docs/config_examples/config.deepseek.yaml#L96-L97 and fill in the missing pipeline entries. We removed those pipeline settings in version 0.15.4. Thanks a bunch!
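For reference, the linked lines add the entry below to the pipes: list in config.yaml. The llm value comes from the repo's DeepSeek example; substitute whichever model your own config defines:

    - name: sql_explanation
      llm: litellm_llm.deepseek/deepseek-coder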

paopa · Mar 05 '25 15:03

@paopa @onlyjackfrost thanks, the container started successfully with the new config.yaml, but when I ask a question it errors. The log:

During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "/src/web/v1/services/semantics_preparation.py", line 90, in prepare_semantics
        await asyncio.gather(*tasks)
      File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 219, in async_wrapper
        self._handle_exception(observation, e)
      File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 517, in _handle_exception
        raise e
      File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 217, in async_wrapper
        result = await func(*args, **kwargs)
      File "/src/pipelines/indexing/db_schema.py", line 370, in run
        return await self._pipe.execute(
      File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 375, in execute
        raise e
      File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 366, in execute
        outputs = await self.raw_execute(_final_vars, overrides, display_graph, inputs=inputs)
      File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 326, in raw_execute
        raise e
      File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 321, in raw_execute
        results = await await_dict_of_tasks(task_dict)
      File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 23, in await_dict_of_tasks
        coroutines_gathered = await asyncio.gather(*coroutines)
      File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 36, in process_value
        return await val
      File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 91, in new_fn
        fn_kwargs = await await_dict_of_tasks(task_dict)
      File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 23, in await_dict_of_tasks
        coroutines_gathered = await asyncio.gather(*coroutines)
      File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 36, in process_value
        return await val
      File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 91, in new_fn
        fn_kwargs = await await_dict_of_tasks(task_dict)
      File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 23, in await_dict_of_tasks
        coroutines_gathered = await asyncio.gather(*coroutines)
      File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 36, in process_value
        return await val
      File "/app/.venv/lib/python3.12/site-packages/hamilton/async_driver.py", line 122, in new_fn
        await fn(**fn_kwargs) if asyncio.iscoroutinefunction(fn) else fn(**fn_kwargs)
      File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 219, in async_wrapper
        self._handle_exception(observation, e)
      File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 517, in _handle_exception
        raise e
      File "/app/.venv/lib/python3.12/site-packages/langfuse/decorators/langfuse_decorator.py", line 217, in async_wrapper
        result = await func(*args, **kwargs)
      File "/src/pipelines/indexing/db_schema.py", line 312, in embedding
        return await embedder.run(documents=chunk["documents"])
      File "/app/.venv/lib/python3.12/site-packages/backoff/_async.py", line 151, in retry
        ret = await target(*args, **kwargs)
      File "/src/providers/embedder/litellm.py", line 144, in run
        embeddings, meta = await self._embed_batch(
      File "/src/providers/embedder/litellm.py", line 106, in _embed_batch
        response = await aembedding(
      File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 1358, in wrapper_async
        raise e
      File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 1217, in wrapper_async
        result = await original_function(*args, **kwargs)
      File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 3163, in aembedding
        raise exception_type(
      File "/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2190, in exception_type
        raise e
      File "/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 380, in exception_type
        raise BadRequestError(
    litellm.exceptions.BadRequestError: litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'code': 'InvalidParameter', 'param': None, 'message': '<400> InternalError.Algo.InvalidParameter: Value error, batch size is invalid, it should not be larger than 10.: input.contents', 'type': 'InvalidParameter'}, 'id': 'dc302215-db96-9d95-858c-9351ec93cd6a', 'request_id': 'dc302215-db96-9d95-858c-9351ec93cd6a'}

Quoting paopa's earlier comment:

    Hi @YISIO, if you're using version 0.15.3, could you please check out
    WrenAI/wren-ai-service/docs/config_examples/config.deepseek.yaml, lines 96 to 97 at 4d6c82c:

        - name: sql_explanation
          llm: litellm_llm.deepseek/deepseek-coder

    and fill in the missing pipeline entries. We removed those pipeline settings in version 0.15.4. Thanks a bunch!

YISIO · Mar 06 '25 03:03


Hi @YISIO, I noticed an error in the log that seems to be related to indexing issues when deploying your model (db schema). I can’t seem to reproduce this issue myself, so I’d love to know how you triggered it. Could you please share the steps you took and the data source you’re using? Thanks a bunch!

paopa · Mar 06 '25 12:03


@paopa My data source is PostgreSQL.

Steps:

  1. Start the containers; they run successfully.
  2. Fill in the database information and choose 7 tables from the database. A sample DDL:

         create table public.che_sales
         (
             flow_id       serial primary key,
             data_time     date,
             mapping_id    bigint,
             customer_id   bigint,
             product_id    bigint,
             seal_up       char,
             sales_context json not null,
             source        char not null,
             job_id        bigint
         );

         create index che_sales_customer_id_index on public.che_sales (customer_id);
         create index che_sales_product_id_index on public.che_sales (product_id);

  3. When I save the modeling, the containers throw the error above.

YISIO · Mar 07 '25 02:03

Hi @YISIO, thank you for providing the scenario. Could you also dump the log after performing these actions, along with the config and .env file? Thanks!

BTW, we released version 0.15.4. If possible, could you use that version? I think the first issue you encountered won't happen there.
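A minimal sketch of that upgrade, assuming the default launcher layout where the compose file and .env live under ~/.wrenai (file names and variable names here are assumptions; check your own installation):

    # See which image versions the stack is pinned to:
    grep VERSION ~/.wrenai/.env

    # After bumping e.g. WREN_AI_SERVICE_VERSION to 0.15.4, restart the stack:
    docker compose -f ~/.wrenai/docker-compose.yaml --env-file ~/.wrenai/.env up -d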

paopa · Mar 07 '25 07:03