Using CAST with json_extract produces a syntax error
SQLDelight Version
2.0.2
SQLDelight Dialect
sqlite-3-38-dialect + sqlite-json-module
Describe the Bug
When compiling an .sq file with this query:
WITH
missions_data AS (SELECT value AS missions_value FROM ResponseCache, json_each(ResponseCache.data, '$.missions') WHERE ResponseCache.id = :id)
SELECT
CAST(json_extract(missions_value, "$.linkedItemId") AS INT) AS linkedItemId
FROM missions_data;
I get this error:
Compiling with dialect app.cash.sqldelight.dialects.sqlite_3_38.SqliteDialect
/path/to/ResponseCache.sq: (19, 51): AS or IN expected, got 'AS'
16 WITH
17 missions_data AS (SELECT value AS missions_value FROM ResponseCache, json_each(ResponseCache.data, '$.missions') WHERE ResponseCache.id = :id)
18 SELECT
19 CAST(json_extract(missions_value, "$.linkedItemId") AS INT) AS linkedItemId
20 FROM missions_data
Running the same query using sqlite3 binary works without issues.
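For reference, here is a minimal standalone check in the sqlite3 shell (a sketch; the inline JSON literal is made up just for illustration):
-- Stock SQLite accepts any type name in CAST; "INT" resolves to INTEGER
-- affinity, so this parses fine and returns 7 in the shell:
SELECT CAST(json_extract('{"linkedItemId": 7}', '$.linkedItemId') AS INT);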
P.S. The WITH clause is a workaround for another issue, described in #5372; the query could be simpler without it, but I expect the bug would manifest either way.
Stacktrace
🤔 Tested with the latest snapshot, since there have been many updates.
It seems to compile when using CAST ... AS INTEGER rather than AS INT.
Still a 🐛, but this may help you make progress for now.
E.g.:
selectUsingWith:
WITH
missions_data AS (SELECT value AS missions_value FROM ResponseCache, json_each(ResponseCache.data, '$.missions') WHERE ResponseCache.id = :id)
SELECT
CAST(json_extract(missions_value, "$.linkedItemId") AS INTEGER) AS linkedItemId
FROM missions_data;
selectNoWith:
SELECT CAST(json_extract(value, '$.linkedItemId') AS INTEGER) AS linkedItemId
FROM ResponseCache, json_each(ResponseCache.data, '$.missions')
WHERE ResponseCache.id = :id;
Nice! Indeed AS INTEGER works, while AS INT doesn't.
What also doesn't work, and that I'd like to see work, is casting to Kotlin/Java types, for example:
import kotlin.Boolean;
selectNoWith:
SELECT CAST(json_extract(value, '$.linkedItemId') AS Boolean) AS linkedItemId
FROM ResponseCache, json_each(ResponseCache.data, '$.missions')
WHERE ResponseCache.id = :id;
Currently this gives the same error I described in the issue.
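For comparison, the same Kotlin-type syntax is already accepted in .sq column definitions, which is why I'd expect it in CAST too. A minimal sketch (the table here is made up, not my actual schema):
import kotlin.Boolean;

CREATE TABLE example (
  -- SQLDelight exposes this column as Boolean in the generated Kotlin,
  -- while storing it as INTEGER in SQLite:
  is_active INTEGER AS Boolean NOT NULL
);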