workers-sdk
🐛 BUG: D1 execution with a comment including a semicolon causes "SQL code did not contain a statement"
Which Cloudflare product(s) does this pertain to?
D1
What version(s) of the tool(s) are you using?
3.6.0
What version of Node are you using?
20.5.1
What operating system are you using?
Mac
Describe the Bug
wrangler d1 execute cannot execute SQL that contains a comment with a semicolon against a remote database. test.sql:
select 1;
-- comment;
select 2;
Result
$ pnpm wrangler d1 execute mydb2 --file test.sql
🌀 Mapping SQL input into an array of statements
🌀 Parsing 2 statements
🌀 Executing on mydb2 (837018aa-cde8-49ce-ac2a-68225f45dea8):
✘ [ERROR] A request to the Cloudflare API (/accounts/f90b16619da21adbec058f5c09b1de53/d1/database/837018aa-cde8-49ce-ac2a-68225f45dea8/query) failed.
SQL code did not contain a statement. [code: 7500]
If you think this is a bug, please open an issue at:
https://github.com/cloudflare/workers-sdk/issues/new/choose
It happens with d1 migrations too.
It only happens on remote; it works locally (--local):
$ pnpm wrangler d1 execute mydb2 --file test.sql --local
🌀 Mapping SQL input into an array of statements
🌀 Loading DB at .wrangler/state/v3/d1/837018aa-cde8-49ce-ac2a-68225f45dea8/db.sqlite
┌───┐
│ 1 │
├───┤
│ 1 │
└───┘
┌───┐
│ 2 │
├───┤
│ 2 │
└───┘
Please provide a link to a minimal reproduction
https://github.com/mizchi/repro-d1-parse-error
Please provide any relevant error logs
Logs with WRANGLER_LOG=debug and --batch-size 1 --yes.
To check the SQL splitting and batch splitting, I ran with --batch-size 1.
$ env WRANGLER_LOG=debug pnpm wrangler d1 execute mydb2 --file test.sql --batch-size 1
--------------------
🚧 D1 is currently in open alpha and is not recommended for production data and traffic
🚧 Please report any bugs to https://github.com/cloudflare/workers-sdk/issues/new/choose
🚧 To request features, visit https://community.cloudflare.com/c/developers/d1
🚧 To give feedback, visit https://discord.gg/cloudflaredev
--------------------
🌀 Mapping SQL input into an array of statements
🌀 Parsing 2 statements
🌀 We are sending 2 batch(es) to D1 (limited to 1 statements per batch. Use --batch-size to override.)
✔ ⚠️ Too much SQL to send at once, this execution will be sent as 2 batches.
ℹ️ Each batch is sent individually and may leave your DB in an unexpected state if a later batch fails.
⚠️ Make sure you have a recent backup. Ok to proceed? … yes
🌀 Let's go
Retrieving cached values for account from node_modules/.cache/wrangler
🌀 Executing on mydb2 (837018aa-cde8-49ce-ac2a-68225f45dea8):
select 1
-- START CF API REQUEST: POST https://api.cloudflare.com/client/v4/accounts/f90b16619da21adbec058f5c09b1de53/d1/database/837018aa-cde8-49ce-ac2a-68225f45dea8/query
HEADERS: {
"Content-Type": "application/json",
"User-Agent": "wrangler/3.6.0"
}
INIT: {
"method": "POST",
"headers": {
"Content-Type": "application/json",
"User-Agent": "wrangler/3.6.0"
},
"body": "{\"sql\":\"select 1\"}"
}
-- END CF API REQUEST
-- START CF API RESPONSE: OK 200
HEADERS: {
"cf-cache-status": "DYNAMIC",
"cf-ray": "801643869a2f20a1-NRT",
"connection": "keep-alive",
"content-encoding": "gzip",
"content-type": "application/json; charset=UTF-8",
"date": "Mon, 04 Sep 2023 12:33:57 GMT",
"server": "cloudflare",
"set-cookie": "__cfruid=<masked> -1693830837; path=/; domain=.api.cloudflare.com; HttpOnly; Secure; SameSite=None",
"transfer-encoding": "chunked",
"vary": "Accept-Encoding",
"x-envoy-upstream-service-time": "495"
}
RESPONSE: {
"result": [
{
"results": [
{
"1": 1
}
],
"success": true,
"meta": {
"served_by": "v3-prod",
"duration": 0.20614099875092504,
"changes": 0,
"last_row_id": 0,
"changed_db": false,
"size_after": 36864,
"rows_read": 0,
"rows_written": 0
}
}
],
"success": true,
"errors": [],
"messages": []
}
-- END CF API RESPONSE
🚣 Executed 1 commands in 0.20614099875092504ms
-- comment;
select 2
-- START CF API REQUEST: POST https://api.cloudflare.com/client/v4/accounts/f90b16619da21adbec058f5c09b1de53/d1/database/837018aa-cde8-49ce-ac2a-68225f45dea8/query
HEADERS: {
"Content-Type": "application/json",
"User-Agent": "wrangler/3.6.0"
}
INIT: {
"method": "POST",
"headers": {
"Content-Type": "application/json",
"User-Agent": "wrangler/3.6.0"
},
"body": "{\"sql\":\"-- comment;\\nselect 2\"}"
}
-- END CF API REQUEST
-- START CF API RESPONSE: Internal Server Error 500
HEADERS: {
"cf-cache-status": "DYNAMIC",
"cf-ray": "8016438bde7020a1-NRT",
"connection": "keep-alive",
"content-encoding": "gzip",
"content-type": "application/json; charset=UTF-8",
"date": "Mon, 04 Sep 2023 12:33:57 GMT",
"server": "cloudflare",
"set-cookie": "__cfruid=<masked>; path=/; domain=.api.cloudflare.com; HttpOnly; Secure; SameSite=None",
"transfer-encoding": "chunked",
"vary": "Accept-Encoding",
"x-envoy-upstream-service-time": "421"
}
RESPONSE: {
"result": [],
"success": false,
"errors": [
{
"code": 7500,
"message": "SQL code did not contain a statement."
}
],
"messages": []
}
-- END CF API RESPONSE
✘ [ERROR] A request to the Cloudflare API (/accounts/f90b16619da21adbec058f5c09b1de53/d1/database/837018aa-cde8-49ce-ac2a-68225f45dea8/query) failed.
SQL code did not contain a statement. [code: 7500]
If you think this is a bug, please open an issue at:
https://github.com/cloudflare/workers-sdk/issues/new/choose
So the final request body is "{\"sql\":\"-- comment;\\nselect 2\"}". wrangler/src/d1/splitter.ts splits the input correctly; the error comes from D1's internals, which apparently split on the semicolon inside the comment and are left with a fragment that contains no statement.
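If that is the cause, one client-side workaround is to strip comments from each statement before it is sent. A minimal TypeScript sketch, not wrangler's actual splitter; stripSqlComments and endOfStringLiteral are made-up helpers, and the string handling covers only single-quoted literals:

// Hypothetical client-side workaround (not wrangler's actual splitter):
// strip SQL comments from a statement before sending it to the API, so a
// semicolon hidden in a comment can no longer produce an "empty" fragment.
function stripSqlComments(sql: string): string {
  let out = "";
  let i = 0;
  while (i < sql.length) {
    if (sql[i] === "'") {
      // Copy a single-quoted string literal verbatim, honoring '' escapes,
      // so `--` or `/*` inside a string survives.
      const end = endOfStringLiteral(sql, i);
      out += sql.slice(i, end);
      i = end;
    } else if (sql[i] === "-" && sql[i + 1] === "-") {
      while (i < sql.length && sql[i] !== "\n") i++; // -- line comment
    } else if (sql[i] === "/" && sql[i + 1] === "*") {
      const close = sql.indexOf("*/", i + 2); // /* block comment */
      i = close === -1 ? sql.length : close + 2;
    } else {
      out += sql[i++];
    }
  }
  return out.trim();
}

function endOfStringLiteral(sql: string, start: number): number {
  for (let i = start + 1; i < sql.length; i++) {
    if (sql[i] !== "'") continue;
    if (sql[i + 1] === "'") { i++; continue; } // '' is an escaped quote
    return i + 1;
  }
  return sql.length;
}

// stripSqlComments("-- comment;\nselect 2") === "select 2"

With this, the failing batch would reach the API as just "select 2".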
Following up on this: I recently hit the same bug when running a command against a beta D1 database, but the same command with semicolon comments had no issue when applied to an alpha database.
My solution was to remove the comments, and then there were no issues.
Same here, using migrations generated by https://github.com/drizzle-team/drizzle-orm.
Did not work:
ALTER TABLE folders ADD `folder_id` integer REFERENCES folders(id);--> statement-breakpoint
Worked:
ALTER TABLE folders ADD `folder_id` integer REFERENCES folders(id)--> statement-breakpoint
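If the migrations are applied with wrangler rather than drizzle-kit, one workaround is to strip the --> statement-breakpoint markers entirely before execution, since they are only split hints for drizzle's own tooling. A minimal Node sketch; the file path is a made-up example:

// Hypothetical preprocessing step: remove drizzle's `--> statement-breakpoint`
// markers so nothing follows the semicolons when wrangler submits the file.
import { readFileSync, writeFileSync } from "node:fs";

const path = process.argv[2]; // e.g. drizzle/0001_add_folder_id.sql (made up)
const sql = readFileSync(path, "utf8");
writeFileSync(path, sql.replaceAll("--> statement-breakpoint", ""));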
Likewise. This works:
SELECT * FROM notes
-- INSERT INTO notes (note_id, title, body, created_at, updated_at)
-- VALUES ('some-uuid', 'Note Title', 'Note Content', CURRENT_TIMESTAMP, CURRENT_TIMESTAMP);
And so does this:
SELECT * FROM notes
But this doesn't:
SELECT * FROM notes;
-- INSERT INTO notes (note_id, title, body, created_at, updated_at)
-- VALUES ('some-uuid', 'Note Title', 'Note Content', CURRENT_TIMESTAMP, CURRENT_TIMESTAMP);
I also just ran into this issue! Can confirm that removing comments did the trick as a workaround
Came across this because the hello-world example is broken:
https://developers.cloudflare.com/d1/tutorials/build-a-comments-api/
I have a similar case, but with multi-line style comments used for traces.
This works (without the semicolon):
await env.MY_DATABASE.prepare("SELECT name FROM `User` /* traceparent='00-71a4cbd5fc3c2cf97b4cbbeddea28cb3-6a2c496ff1211ea4-01' */").all()
This doesn't:
await env.MY_DATABASE.prepare("SELECT name FROM `User`; /* traceparent='00-71a4cbd5fc3c2cf97b4cbbeddea28cb3-6a2c496ff1211ea4-01' */").all()
See the example from https://google.github.io/sqlcommenter/ for how it can look with more data:
INSERT INTO "polls_question"
("question_text", "pub_date") VALUES
('What is this?', '2019-05-28T18:54:50.767481+00:00'::timestamptz) RETURNING
"polls_question"."id" /*controller='index',db_driver='django.db.backends.postgresql',
framework='django%3A2.2.1',route='%5Epolls/%24',
traceparent='00-5bd66ef5095369c7b0d1f8f4bd33716a-c532cb4098ac3dd2-01',
tracestate='congo%3Dt61rcWkgMzE%2Crojo%3D00f067aa0ba902b7'*/
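Until this is fixed, a workaround for instrumented queries is to make sure the trace comment is appended to a statement with no trailing semicolon, as in the working example above. A small sketch of that wrapping step; withTraceComment is a made-up helper, not part of any library:

// Hypothetical helper: attach a sqlcommenter-style trace comment to a query,
// dropping any trailing semicolon first so the comment never ends up alone
// after a split.
function withTraceComment(sql: string, traceparent: string): string {
  const bare = sql.trim().replace(/;$/, "");
  return `${bare} /* traceparent='${traceparent}' */`;
}

// Usage in a Worker, with the MY_DATABASE binding from the example above:
// await env.MY_DATABASE
//   .prepare(withTraceComment("SELECT name FROM `User`;", "00-71a4cbd5...-01"))
//   .all();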
Same here:
This doesn't work:
CREATE TABLE IF NOT EXISTS user_account (
id integer PRIMARY KEY AUTOINCREMENT,
full_name text NOT NULL,
mobile text NULL,
email text NULL,
openai_thread_id text NULL
)
CREATE INDEX idx_user_account_mobile ON user_account (mobile);
CREATE INDEX idx_user_account_email ON user_account (email);
CREATE INDEX idx_user_account_thread ON user_account (openai_thread_id);
INSERT INTO user_account (full_name, mobile, openai_thread_id) VALUES ('Lu Co', '+5511999999999', 'thread_123');
-- wrangler d1 execute kontato-ai --file schemas/schema.sql
This works (the same file without the trailing comment line):
CREATE TABLE IF NOT EXISTS user_account (
id integer PRIMARY KEY AUTOINCREMENT,
full_name text NOT NULL,
mobile text NULL,
email text NULL,
openai_thread_id text NULL
)
CREATE INDEX idx_user_account_mobile ON user_account (mobile);
CREATE INDEX idx_user_account_email ON user_account (email);
CREATE INDEX idx_user_account_thread ON user_account (openai_thread_id);
INSERT INTO user_account (full_name, mobile, openai_thread_id) VALUES ('Lu Co', '+5511999999999', 'thread_123');
I'm trying to migrate a D1 DB that is 10,000 lines long. Is there any way to do this programmatically? I'm unsure where my issue is, but the Cloudflare docs show how to use sqlite3 to dump the DB and then import it with wrangler, and I hit this issue...
Had to remove the following:
PRAGMA foreign_keys=OFF;
BEGIN TRANSACTION;
CREATE TABLE _cf_KV (
key TEXT PRIMARY KEY,
value BLOB
) WITHOUT ROWID;
COMMIT;
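That cleanup can be scripted for large dumps. A minimal Node sketch, assuming the input is a sqlite3 .dump file; dump.sql and import.sql are placeholder names:

// Hypothetical cleanup script for importing a sqlite3 .dump into D1:
// drops the PRAGMA/BEGIN/COMMIT wrapper, the internal _cf_KV table, and
// comment lines (which trigger the bug in this thread).
import { readFileSync, writeFileSync } from "node:fs";

const lines = readFileSync("dump.sql", "utf8").split("\n");
const out: string[] = [];
let inCfKv = false; // true while inside the multi-line _cf_KV definition

for (const line of lines) {
  const t = line.trim();
  if (inCfKv) {
    if (t.endsWith(";")) inCfKv = false; // e.g. ") WITHOUT ROWID;"
    continue;
  }
  if (t.startsWith("CREATE TABLE _cf_KV")) {
    inCfKv = !t.endsWith(";");
    continue;
  }
  if (t.startsWith("PRAGMA") || t === "BEGIN TRANSACTION;" || t === "COMMIT;") continue;
  if (t.startsWith("--")) continue;
  out.push(line);
}

writeFileSync("import.sql", out.join("\n"));
// Then: wrangler d1 execute <DB_NAME> --file import.sql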