DuckDB initial SQL statements fail
Describe the bug I am trying to run the example on an Ubuntu machine. However, at the time of DuckDB database creation, I get an error, and after some time my terminal shows a timeout.
To Reproduce Steps to reproduce the behavior: use the example from the documentation to run the services: https://docs.getwren.ai/installation Then upload the data given in the documentation here: https://docs.getwren.ai/getting_started/own_data
Expected behavior I should be able to upload the data, and query it.
Screenshots
starting the services
creating the database
Timeout after some time:
Desktop (please complete the following information):
- OS: centos
- Browser: Chrome, Safari, Firefox
Wren AI Information
- Version: [e.g, 0.1.0]
- LLM_PROVIDER= # openai_llm,
- GENERATION_MODEL= # gpt-3.5-turbo, llama3:70b, etc.
Hi @snash4, thanks for reporting. I think we need more information to check what happened. Could you provide the logs of the services?
I think the logs of wren-ui, wren-engine, and ibis-server are required. You can use the following commands to dump them.
docker logs wrenai-wren-ui-1 >& wrenai-wren-ui.log && \
docker logs wrenai-wren-engine-1 >& wrenai-wren-engine.log && \
docker logs wrenai-ibis-server-1 >& wrenai-ibis-server.log
BTW, before you upload the logs, please make sure there is no sensitive information in them. Thanks.
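To make the redaction step above concrete, here is a minimal, hedged sketch of masking strings that look like API keys before sharing logs. The `sk-` pattern is an assumption based on OpenAI-style keys; extend it for any other secrets your environment uses.

```python
import re

# Assumed pattern: OpenAI-style keys start with "sk-" followed by a long
# alphanumeric tail. This is illustrative, not an exhaustive secret scanner.
KEY_PATTERN = re.compile(r"sk-[A-Za-z0-9]{10,}")

def redact(line: str) -> str:
    # Replace anything that looks like an API key with a placeholder.
    return KEY_PATTERN.sub("sk-***REDACTED***", line)

print(redact("OPENAI_API_KEY=sk-abc123def456ghi789"))
print(redact("no secrets on this line"))
```

Piping each dumped log file through a filter like this before attaching it keeps credentials out of the issue tracker.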
I have the same problem, but my error message is "Internal server error
update project set display_name = 'e-commerce', connection_info = {"initSql":"CREATE TABLE reviews AS SELECT * FROM read_csv('./etc/data/e-commerce/reviews.csv', header = true);\nCREATE TABLE customers AS SELECT * FROM read_csv('./etc/data/e-commerce/customers.csv', header = true);\nCREATE TABLE orders AS SELECT * FROM read_csv('./etc/data/e-commerce/orders.csv', header = true);","configurations":{},"extensions":[]} where id = 22 returning * - SQLite3 can only bind numbers, strings, bigints, buffers, and null"
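The quoted driver error arises because SQLite bindings accept only scalar values (numbers, strings, bigints, buffers, and null), so passing `connection_info` as a raw object fails until it is serialized to a JSON string first. This is a hedged illustration using Python's stdlib sqlite3 driver as a stand-in for Node's sqlite3, not WrenAI's actual code; the table shape is simplified.

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE project (id INTEGER PRIMARY KEY, connection_info TEXT)")
conn.execute("INSERT INTO project (id) VALUES (22)")

connection_info = {"initSql": "CREATE TABLE reviews AS SELECT 1", "configurations": {}, "extensions": []}

bind_failed = False
try:
    # Binding the dict directly raises, analogous to Node's
    # "SQLite3 can only bind numbers, strings, bigints, buffers, and null"
    conn.execute("UPDATE project SET connection_info = ? WHERE id = ?", (connection_info, 22))
except sqlite3.Error:
    bind_failed = True

# Serializing to a JSON string first succeeds
conn.execute("UPDATE project SET connection_info = ? WHERE id = ?", (json.dumps(connection_info), 22))
stored = conn.execute("SELECT connection_info FROM project WHERE id = 22").fetchone()[0]
print(bind_failed, json.loads(stored)["extensions"])
```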
wrenai-wren-engine.log wrenai-wren-ui.log wrenai-ibis-server.log
Hi @snash4, I think the logs are incomplete. Typically, the log would look like:
2024-07-22 15:00:00 Using SQLite
2024-07-22 15:00:00 Already up to date
2024-07-22 15:00:01 ▲ Next.js 14.2.4
2024-07-22 15:00:01 - Local: http://localhost:3000
2024-07-22 15:00:01 - Network: http://0.0.0.0:3000
2024-07-22 15:00:01
2024-07-22 15:00:01 ✓ Starting...
2024-07-22 15:00:02 ✓ Ready in 1788ms
2024-07-22 15:00:14 [2024-07-22T07:00:14.003] [INFO] TELEMETRY - Telemetry initialized
2024-07-22 15:00:14 using sqlite
2024-07-22 15:00:14 [2024-07-22T07:00:14.093] [INFO] AskingService - Background tracker started
2024-07-22 15:00:14 [2024-07-22T07:00:14.101] [INFO] AskingService - Initialization: adding unfininshed thread responses (total: 0) to background tracker
2024-07-22 15:02:18 [2024-07-22T07:02:18.635] [DEBUG] ProjectService - Creating project...
2024-07-22 15:02:18 [2024-07-22T07:02:18.650] [DEBUG] DataSourceResolver - Project created
2024-07-22 15:00:14 Persisted queries are enabled and are using an unbounded cache. Your server is vulnerable to denial of service attacks via memory exhaustion. Set `cache: "bounded"` or `persistedQueries: false` in your ApolloServer constructor, or see https://go.apollo.dev/s/cache-backends for other alternatives.
If there were errors, the log would look like:
2024-07-22 15:07:18 Using SQLite
2024-07-22 15:07:19 model len:0
2024-07-22 15:07:19 Batch 1 run: 22 migrations
2024-07-22 15:07:20 ▲ Next.js 14.2.4
2024-07-22 15:07:20 - Local: http://localhost:3000
2024-07-22 15:07:20 - Network: http://0.0.0.0:3000
2024-07-22 15:07:20
2024-07-22 15:07:20 ✓ Starting...
2024-07-22 15:07:22 ✓ Ready in 2.1s
2024-07-22 15:07:32 [2024-07-22T07:07:32.947] [INFO] TELEMETRY - Telemetry initialized
2024-07-22 15:07:32 using sqlite
2024-07-22 15:07:33 [2024-07-22T07:07:33.008] [INFO] AskingService - Background tracker started
2024-07-22 15:07:33 [2024-07-22T07:07:33.017] [INFO] AskingService - Initialization: adding unfininshed thread responses (total: 0) to background tracker
2024-07-22 15:08:01 [2024-07-22T07:08:01.678] [DEBUG] ProjectService - Creating project...
2024-07-22 15:08:01 [2024-07-22T07:08:01.760] [DEBUG] DataSourceResolver - Project created
2024-07-22 15:08:01 [2024-07-22T07:08:01.978] [DEBUG] WrenEngineAdaptor - Got error when init database: AxiosError: Request failed with status code 500
2024-07-22 15:08:01 [2024-07-22T07:08:01.980] [ERROR] DataSourceResolver - Failed to get project tables {
2024-07-22 15:08:01 "message": "The initializing SQL seems to be invalid, Please check your SQL and try again.",
2024-07-22 15:08:01 "extensions": {
2024-07-22 15:08:01 "originalError": {
2024-07-22 15:08:01 "message": "Request failed with status code 500",
2024-07-22 15:08:01 "name": "AxiosError",
2024-07-22 15:08:01 "stack": "AxiosError: Request failed with status code 500\n at settle (/app/node_modules/axios/lib/core/settle.js:17:12)\n at IncomingMessage.handleStreamEnd (/app/node_modules/axios/lib/adapters/http.js:382:11)\n at IncomingMessage.emit (node:events:529:35)\n at endReadableNT (node:internal/streams/readable:1400:12)\n at process.processTicksAndRejections (node:internal/process/task_queues:82:21)",
2024-07-22 15:08:01 "config": {
2024-07-22 15:08:01 "transitional": {
2024-07-22 15:08:01 "silentJSONParsing": true,
2024-07-22 15:08:01 "forcedJSONParsing": true,
2024-07-22 15:08:01 "clarifyTimeoutError": false
2024-07-22 15:08:01 },
2024-07-22 15:08:01 "transformRequest": [
2024-07-22 15:08:01 null
2024-07-22 15:08:01 ],
2024-07-22 15:08:01 "transformResponse": [
2024-07-22 15:08:01 null
2024-07-22 15:08:01 ],
2024-07-22 15:08:01 "timeout": 0,
2024-07-22 15:08:01 "xsrfCookieName": "XSRF-TOKEN",
2024-07-22 15:08:01 "xsrfHeaderName": "X-XSRF-TOKEN",
2024-07-22 15:08:01 "maxContentLength": -1,
2024-07-22 15:08:01 "maxBodyLength": -1,
2024-07-22 15:08:01 "env": {},
2024-07-22 15:08:01 "headers": {
2024-07-22 15:08:01 "Accept": "application/json, text/plain, */*",
2024-07-22 15:08:01 "Content-Type": "text/plain; charset=utf-8",
2024-07-22 15:08:01 "User-Agent": "axios/0.28.0",
2024-07-22 15:08:01 "Content-Length": 301
2024-07-22 15:08:01 },
2024-07-22 15:08:01 "method": "put",
2024-07-22 15:08:01 "url": "http://wren-engine:8080/v1/data-source/duckdb/settings/init-sql",
2024-07-22 15:08:01 "data": "CREATE TABLE reviews AS SELECT * FROM read_csv('./etc/data/e-commerce/reviews.csv', header = true);\nCREATE TABLE customers AS SELECT * FROM read_csv('./etc/data/e-commerce/customers.csv', header = true);\nCREATE TABLE orders AS SELECT * FROM read_csv('./etc/data/e-commerce/orders.csv', header = true);"
2024-07-22 15:08:01 },
2024-07-22 15:08:01 "code": "ERR_BAD_RESPONSE",
2024-07-22 15:08:01 "status": 500
2024-07-22 15:08:01 },
2024-07-22 15:08:01 "code": "INIT_SQL_ERROR",
2024-07-22 15:08:01 "message": "The initializing SQL seems to be invalid, Please check your SQL and try again.",
2024-07-22 15:08:01 "shortMessage": "Invalid initializing SQL"
2024-07-22 15:08:01 }
2024-07-22 15:08:01 }
2024-07-22 15:08:01 [2024-07-22T07:08:01.987] [ERROR] APOLLO - == original error ==
2024-07-22 15:08:01 [2024-07-22T07:08:01.987] [ERROR] APOLLO - AxiosError: Request failed with status code 500
2024-07-22 15:08:01 at settle (/app/node_modules/axios/lib/core/settle.js:17:12)
2024-07-22 15:08:01 at IncomingMessage.handleStreamEnd (/app/node_modules/axios/lib/adapters/http.js:382:11)
2024-07-22 15:08:01 at IncomingMessage.emit (node:events:529:35)
2024-07-22 15:08:01 at endReadableNT (node:internal/streams/readable:1400:12)
2024-07-22 15:08:01 at process.processTicksAndRejections (node:internal/process/task_queues:82:21)
2024-07-22 15:07:33 Persisted queries are enabled and are using an unbounded cache. Your server is vulnerable to denial of service attacks via memory exhaustion. Set `cache: "bounded"` or `persistedQueries: false` in your ApolloServer constructor, or see https://go.apollo.dev/s/cache-backends for other alternatives.
Could you dump the logs after retrying all the steps in the doc again?
Actually, I can't reproduce this issue in my environment, but I guess the root cause is that the dataset wasn't uploaded to the container. Maybe you can double-check that the e-commerce data was copied to the target folder. Pay special attention to steps 1 to 4 in the doc.
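Since the init SQL reads the CSVs relative to the engine's working directory (`./etc/data/e-commerce/...`), a single missing file is enough to make the init-sql endpoint return a 500. The helper below is an illustrative sketch for checking the expected files, not WrenAI code; the paths and file names are taken from the init SQL quoted above.

```python
from pathlib import Path

# File names taken from the init SQL in the error log above.
EXPECTED = ("reviews.csv", "customers.csv", "orders.csv")

def missing_files(data_dir):
    """Return the expected CSV files that are absent from data_dir."""
    base = Path(data_dir)
    return [name for name in EXPECTED if not (base / name).is_file()]

# Example (path assumed from the init SQL; run inside the wren-engine container):
# print(missing_files("./etc/data/e-commerce"))
```

An empty list means all three CSVs are in place; anything else points at the copy step that likely failed.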
I have the same problem, but my error message is "Internal server error update
project set display_name = 'e-commerce', connection_info = {"initSql":"CREATE TABLE reviews AS SELECT * FROM read_csv('./etc/data/e-commerce/reviews.csv', header = true);\nCREATE TABLE customers AS SELECT * FROM read_csv('./etc/data/e-commerce/customers.csv', header = true);\nCREATE TABLE orders AS SELECT * FROM read_csv('./etc/data/e-commerce/orders.csv', header = true);","configurations":{},"extensions":[]} where id = 22 returning * - SQLite3 can only bind numbers, strings, bigints, buffers, and null"
Thanks @aiwenForGit for reporting. Could you provide the logs? Based on your error message, I guess wren-ai may be getting a null value from your data.
Hi, the error is caused by the wren-ai-service:0.7.4 Docker image. Here is the log: wren-ai-service.log
I tested the OpenAI API separately, and it works.
Hi, the error is caused by wren-ai-service:0.7.4 docker. Here is the log wren-ai-service.log
Thanks. I think it's weird that the error message shows a SQL error when it's actually an AI service error. @onlyjackfrost, maybe we should check how we handle this error.
I tested the OpenAI API separately and it works.
@cyyeh Could you check this log?
@snash4 The AI service log says it couldn't connect to OpenAI. Which LLM option did you choose when you launched WrenAI, OpenAI or Custom? If it's Custom, could you share your .env.ai file with me, with the credential key removed?