tauri-plugin-sql
Number returns as null
When I run a request like this:
await db.select('SELECT COUNT(*) AS "count" FROM (SELECT * FROM "notes")');
The result is:
[{"count":null}]
I'm using the #release branch.
Same here, with PostgreSQL.
This is the case with sqlite3 as well
select 1 as foo
returns [{foo: null}]
please provide a reproducible example -- including your db schema -- so this can more easily be tested
@yankeeinlondon just try the query: select 1 as foo
@quolpr I'm not sure how that should evaluate, actually ... it would be better if we were querying against actual database fields, as the logic tries to determine type information from the field, and if it can't convert the value into that field's type it returns null, as you have here.
It is possible that we may be able to simplify the serde deserialization logic quite a bit here ... but I will probably need to probe a bit more for context amongst the original authors.
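To illustrate what's being described above, here is a minimal sketch of the pattern (illustrative only, not the plugin's actual code): the JSON value is chosen by matching the column's reported type name against a short hard-coded list, and anything the list doesn't know about falls through to null.

use serde_json::Value as JsonValue;

// Illustrative only: mimics the "match on the reported type name, otherwise
// null" behaviour described above. The real plugin works on sqlx rows rather
// than raw strings.
fn to_json(type_name: &str, raw: &str) -> JsonValue {
    match type_name {
        "INTEGER" => raw
            .parse::<i64>()
            .map(|n| JsonValue::Number(n.into()))
            .unwrap_or(JsonValue::Null),
        "TEXT" => JsonValue::String(raw.to_owned()),
        // A computed column such as `1 AS foo` may not report a type name the
        // list recognises, so it lands here and becomes null.
        _ => JsonValue::Null,
    }
}

fn main() {
    println!("{}", to_json("", "1")); // prints: null
}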
please provide a reproducible example -- including your db schema -- so this can more easily be tested
@yankeeinlondon In the todo example, just try to count how many todos you have in the database. Currently the demo app loads all records from the DB and counts them in JS. What about a real app with 10k records? I added the following line here:
console.log( await db.select('SELECT COUNT(*) FROM todos') );
output was:
[{COUNT(*): null}]
Also seeing this problem with postgres.
The same query used with this plugin returns null table_name values.
Hardcoding the connection string for debugging purposes:
console.log("== before load ==");
const db = await Database.load(
"postgres://postgres:[email protected]:5432/pgdb"
);
console.log("== Database instance ==");
console.log(db);
const result = await db.select(
`SELECT table_name FROM information_schema.tables WHERE table_schema = 'public'`
);
console.log("== db.select result ==");
console.log(result);
Three rows, as expected, but with null values
docker-compose.yml, if you're interested in reproducing this:
version: '3'
services:
  postgres:
    image: postgres:12.3-alpine
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=pgdb
      - PGDATA=/postgres-data
    ports:
      - 5432:5432
So, I dug into this a little bit. This bit of code that attempts to serialize types is likely very wrong. https://github.com/tauri-apps/tauri-plugin-sql/blob/dev/src/plugin.rs#L253
If the type_info returned from the database doesn't match anything in this very short list of database types, you'll get a null value back because the lib isn't able to figure out how to serialize.
If you go check out the sqlx postgres types, you'll see an exhaustive list https://github.com/launchbadge/sqlx/blob/061fdcabd72896d9bc3abb4ea4af6712a04bc0a8/sqlx-core/src/postgres/type_info.rs#L451
I don't know Rust well enough to make an update, but I would hope there's either:
- A way to use types from upstream crates like sqlx and infer and convert that way (rough sketch after this list)
- A way to use generics and convert
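For the first bullet, a rough sketch of what leaning on sqlx's own decoding could look like; this is an assumed approach, not the plugin's current code, and it is Postgres-only for brevity:

use serde_json::{json, Value as JsonValue};
use sqlx::{postgres::PgRow, Row};

// Instead of matching on the driver's type-name string, rely on sqlx's own
// Decode implementations: try progressively more general Rust types and only
// fall back to Null when nothing decodes.
fn decode_column(row: &PgRow, i: usize) -> JsonValue {
    if let Ok(v) = row.try_get::<i64, _>(i) {
        return json!(v);
    }
    if let Ok(v) = row.try_get::<f64, _>(i) {
        return json!(v);
    }
    if let Ok(v) = row.try_get::<bool, _>(i) {
        return json!(v);
    }
    if let Ok(v) = row.try_get::<String, _>(i) {
        return json!(v);
    }
    JsonValue::Null
}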
A couple of other notes:
- The broken serialization step seems like a huge performance tax to pay just to get data into TS land.
- This repo is misleading in the way that it presents itself as working. There needs to be a disclaimer at the top of the README.md in my opinion
- It would be great to have tests demonstrating some sort of functionality / stability.
Yeah, I was noticing this when I looked at this issue, and I did make this "better" in the last release I put out, but I think this may need to go further. I will try and find a moment over the weekend.
Note: if anyone else has any special insight, I'm all ears. I inherited this Rust code and don't have a real background in sqlx. I'd have thought the sqlx-to-Rust type bindings would be automatic, and from there, if we need TS types, we can use the rsts crate.
I pulled the code down a while ago. Copilot made this suggestion, which doesn't look too far from what I would expect. For each database, you would need to ensure the column data type maps to what that database supports. I don't think this scales very well, though.
If it's possible during dev mode, I would throw an error indicating a currently unsupported column (see the sketch after the match block below). It would at least allow people to open PRs / issues for missing type mappings.
let value = match info.name() {
    #[cfg(feature = "sqlite")]
    "INTEGER" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "sqlite")]
    "TEXT" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "sqlite")]
    "REAL" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "sqlite")]
    "BLOB" => JsonValue::String(base64::encode(row.try_get(i).unwrap())),
    #[cfg(feature = "mysql")]
    "TINY" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "mysql")]
    "SHORT" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "mysql")]
    "LONG" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "mysql")]
    "LONGLONG" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "mysql")]
    "INT24" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "mysql")]
    "YEAR" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "mysql")]
    "FLOAT" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "mysql")]
    "DOUBLE" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "mysql")]
    "BIT" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "mysql")]
    "TIMESTAMP" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "mysql")]
    "DATE" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "mysql")]
    "TIME" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "mysql")]
    "DATETIME" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "mysql")]
    "NEWDATE" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "mysql")]
    "VARCHAR" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "mysql")]
    "VAR_STRING" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "mysql")]
    "STRING" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "mysql")]
    "TINY_BLOB" => JsonValue::String(base64::encode(row.try_get(i).unwrap())),
    #[cfg(feature = "mysql")]
    "MEDIUM_BLOB" => JsonValue::String(base64::encode(row.try_get(i).unwrap())),
    #[cfg(feature = "mysql")]
    "LONG_BLOB" => JsonValue::String(base64::encode(row.try_get(i).unwrap())),
    #[cfg(feature = "mysql")]
    "BLOB" => JsonValue::String(base64::encode(row.try_get(i).unwrap())),
    #[cfg(feature = "mysql")]
    "ENUM" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "mysql")]
    "SET" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "mysql")]
    "GEOMETRY" => JsonValue::String(base64::encode(row.try_get(i).unwrap())),
    #[cfg(feature = "postgres")]
    "bool" => JsonValue::Bool(row.try_get(i).unwrap()),
    #[cfg(feature = "postgres")]
    "int2" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "postgres")]
    "int4" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "postgres")]
    "int8" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "postgres")]
    "float4" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "postgres")]
    "float8" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "postgres")]
    "numeric" => JsonValue::Number(row.try_get(i).unwrap().into()),
    #[cfg(feature = "postgres")]
    "text" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "postgres")]
    "varchar" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "postgres")]
    "date" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "postgres")]
    "time" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "postgres")]
    "timestamp" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "postgres")]
    "timestamptz" => JsonValue::String(row.try_get(i).unwrap()),
    #[cfg(feature = "postgres")]
    "bytea" => JsonValue::String(base64::encode(row.try_get(i).unwrap())),
    _ => JsonValue::Null,
};
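For the dev-mode idea above, a minimal sketch of what the catch-all behaviour could become; the UnsupportedDatatype error type here is hypothetical, not something the plugin currently exposes:

use serde_json::Value as JsonValue;

// Hypothetical: make an unknown column type loud in debug builds instead of
// silently mapping it to null, so missing type mappings surface as issues/PRs.
#[derive(Debug)]
struct UnsupportedDatatype(String);

fn fallback(type_name: &str) -> Result<JsonValue, UnsupportedDatatype> {
    if cfg!(debug_assertions) {
        Err(UnsupportedDatatype(type_name.to_string()))
    } else {
        Ok(JsonValue::Null)
    }
}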
Thanks for pulling this up @olingern. I'll create a feature branch with this right away, and maybe you can help me test it, as I typically only use SQLite to test with my example app.
Also wondering if we can find a mechanism where a user could define a vector of structs which could be translated into compiled (and automatically typed) code on the Rust back-end using the FromRow macro. Then we could use the rsts crate (which I've not used but am aware of) to translate these into TS types and expose them on the API exposed to the front-end:
#[derive(Serialize, Deserialize, sqlx::FromRow)]
pub struct Todo {
    id: String,
    title: String,
    completed: bool,
}
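If the crate meant here is ts-rs (an assumption on my part), the same struct could carry the TypeScript side as well; a sketch:

use serde::{Deserialize, Serialize};
use ts_rs::TS;

// Sketch: sqlx::FromRow maps query rows into the struct on the Rust side,
// while the ts-rs `TS` derive plus `#[ts(export)]` emits a matching
// TypeScript type when the crate's tests run.
#[derive(Serialize, Deserialize, sqlx::FromRow, TS)]
#[ts(export)]
pub struct Todo {
    id: String,
    title: String,
    completed: bool,
}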
We might also consider going beyond FromRow and allowing users to define a hashmap of known queries and then use the query! macro to gain compile-time type checks and bind queries, and only really require the user to know SQL (though they'd need to put it into Rust code).
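A rough sketch of the query! direction, using the todo example's table; this is standard sqlx macro usage and needs DATABASE_URL available at compile time so the SQL and result types can be checked:

// Assumes a `todos` table like the example app's; sqlx checks the SQL against
// the live schema at compile time and generates a typed anonymous record.
async fn list_todos(pool: &sqlx::SqlitePool) -> Result<(), sqlx::Error> {
    let todos = sqlx::query!("SELECT id, title, completed FROM todos")
        .fetch_all(pool)
        .await?;
    for todo in todos {
        // Fields are typed from the schema, so a typo or type mismatch fails the build.
        println!("{:?} {:?} {:?}", todo.id, todo.title, todo.completed);
    }
    Ok(())
}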
Anyway, I'm just spitballing. It would be nice to have more ergonomic ways to set up the API for the front-end, while getting the performance of bind variables and strong typing persisted through both the back-end and the front-end.
Thanks for pulling this up @olingern. I'll create a feature branch with this right away, and maybe you can help me test it, as I typically only use SQLite to test with my example app.
Sure, happy to do that. After your feature branch lands, maybe we can set up some sort of integration tests against Postgres and the others. I imagine something like a SQL migration that creates columns for every supported column type, running the migration, and then tests against insert/delete/select/update, etc. (rough sketch below).
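Something like this, very roughly; it assumes the Postgres container from the docker-compose file above, a TEST_DATABASE_URL environment variable, and tokio for the async test runtime:

#[cfg(test)]
mod tests {
    use sqlx::postgres::PgPoolOptions;

    // Round-trips a few basic column types through a real Postgres instance.
    #[tokio::test]
    async fn roundtrips_basic_column_types() -> Result<(), sqlx::Error> {
        let url = std::env::var("TEST_DATABASE_URL").expect("set TEST_DATABASE_URL");
        let pool = PgPoolOptions::new().connect(&url).await?;

        sqlx::query("CREATE TABLE IF NOT EXISTS type_check (n int8, t text, b bool)")
            .execute(&pool)
            .await?;
        sqlx::query("INSERT INTO type_check (n, t, b) VALUES (42, 'hi', true)")
            .execute(&pool)
            .await?;

        let row: (i64, String, bool) = sqlx::query_as("SELECT n, t, b FROM type_check")
            .fetch_one(&pool)
            .await?;
        assert_eq!(row, (42, "hi".to_string(), true));
        Ok(())
    }
}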
I had a look at how diesel does test setup. Not immediately obvious, but it might be reasonable for me to get something basic like that together https://github.com/diesel-rs/diesel/blob/master/diesel_derives/tests/helpers.rs#L18
OK, I've pushed the feature branch.
I am going to look around at some of the other Tauri plugins to see if there are any good examples of tests, because a library like this really does need some tests :)
Note: this feature branch's more explicit separation of serialization / type mapping into Rust is a step in the right direction, but it still doesn't solve the various issues people are having with null being returned.