data-api-client
Support PostgreSQL version of Aurora Serverless
Some compatibility issues were brought up in #25 that show conflicts with the PostgreSQL version (likely because of `sqlstring`). There are also some additional data types (like `structValue`) that are only supported by the PostgreSQL version.
Overall, I don't think the two engines are entirely interchangeable (thanks for that, AWS). We likely need to add an `engine` config option or something similar, since I don't think the library can "auto detect" the underlying version.
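If such an option were added, usage might look like the following. This is only a sketch: the `engine` option does not exist yet, and its name and values are hypothetical.

```javascript
// Hypothetical: an `engine` config option to switch between the
// MySQL and PostgreSQL flavors of Aurora Serverless.
const data = require('data-api-client')({
  secretArn: 'arn:aws:secretsmanager:us-east-1:XXXXXXXXXXXX:secret:mySecret',
  resourceArn: 'arn:aws:rds:us-east-1:XXXXXXXXXXXX:cluster:my-cluster',
  database: 'myDatabase',
  engine: 'postgres' // hypothetical; would default to 'mysql'
})
```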
I've created a new development branch (https://github.com/jeremydaly/data-api-client/tree/postgres-support) that should be the target of any pull requests regarding this. Please feel free to add comments to this issue with ideas or compatibility observations.
I recently tried to get this library to work with Postgres. I ran into several fundamental problems that are not easy to solve (if they can be solved at all), so I ended up writing my own client and tightly integrating it into my system.
If anyone is trying to tackle this, here are some of the issues I found:
- The type conversion of the parameters is the biggest problem. Only a small subset of Postgres types is supported. You could precompile the query with the params on the client side (as with `sqlstring`), but the maximum query size is 65,536 characters, so you cannot write anything bigger than that to the database.
- The parameter types cannot be inferred based on the context, as is the default for other DB connection types. As a result, all types need to be explicitly cast in the queries:

  `insert into my_table (json_column) values (:p1::json);`

  So the problem is that you have to know the types ahead of time. Most query builders (knex, for example) rely on the type inference of parameters, so this can't be used as a drop-in replacement driver.
I ended up writing the conversion logic for all the possible types in my system manually, but this approach only works because I have a limited set of types (and it isn't really shareable in its current state).
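For anyone exploring this, here is a minimal sketch of the kind of manual type mapping described above. The output objects follow the RDS Data API's `SqlParameter` shape; the function itself is hypothetical and covers only a few common cases.

```javascript
// Sketch: mapping JavaScript values onto the RDS Data API's
// `SqlParameter` shape. Anything non-scalar is serialized to a string
// and must be cast in the SQL itself (e.g. `:p1::jsonb`).
function toSqlParameter(name, value) {
  if (value === null || value === undefined) {
    return { name, value: { isNull: true } }
  }
  switch (typeof value) {
    case 'boolean':
      return { name, value: { booleanValue: value } }
    case 'number':
      return Number.isInteger(value)
        ? { name, value: { longValue: value } }
        : { name, value: { doubleValue: value } }
    case 'string':
      return { name, value: { stringValue: value } }
    case 'object':
      // JSON objects/arrays: stringify here, cast in the query
      return { name, value: { stringValue: JSON.stringify(value) } }
    default:
      throw new Error(`Unsupported parameter type for "${name}"`)
  }
}
```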
If anyone has a good idea of how to tackle this and is willing to build a solid general-purpose driver, let me know and I'd be willing to help out.
Adding a reference to this issue: #34
@jeremydaly I'm running into issues with this, specifically with my own custom types I've set up. Casting using `::jsonb` and `::uuid` has worked out fine, but this doesn't work:
INSERT INTO folders(name,asset_type,owner_id) VALUES(:name, :asset_type::asset_type, :owner_id::uuid) RETURNING *
I get an error:
Cannot find parameter: asset_type
But I'm definitely passing the proper query params:
{
name: 'My First Folder',
asset_type: 'TRACK',
owner_id: '9dbb70d7-3d17-4215-8966-49815e461dee'
}
Any advice?
Aha! Weird. I think maybe the Data API client doesn't like `:asset_type::asset_type` and is interpolating improperly. If I rename my type and do `:asset_type::asset_enum` instead, it works great.
If I understand correctly, there is no proper solution except executing a raw query?
Sorry for the late reply. This is likely due to the support for parameterizing identifiers. I think full Postgres support will require a separate config flag.
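To illustrate the collision: the library reserves a `::name` syntax for parameterizing identifiers, which is exactly the token Postgres uses for casts. A minimal sketch of why `:asset_type::asset_type` trips this up (the regex here is purely illustrative, not the library's actual parser):

```javascript
// Postgres uses `::` for casts, while the `::name` syntax is reserved
// for identifier parameters -- so a cast like `::asset_type` can be
// mistaken for an identifier parameter named `asset_type`.
const sql = 'VALUES(:asset_type::asset_type)'
const identifierParams = [...sql.matchAll(/::(\w+)/g)].map(m => m[1])
console.log(identifierParams) // the cast, misread as an identifier parameter
```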
I would be glad to help out, but I'm not familiar with how this library works under the hood. Would you be prepared to provide a short walkthrough of what should be done? Thank you.
I've created PR #57 which lets you pass casting information via parameters.
I am happy to provide a fix for #25. However, I think we need to deal with the fact that the maintainers of sqlstring will not be adding support for Postgres. Off the top of my head, there are a few options:
- Remove the dependency on sqlstring and rewrite the lone method this library currently uses (it's small)
  - Pros: simple and clean
  - Cons: if we're planning to make more use of sqlstring in the future, it will be harder
- Fork sqlstring and add Postgres support
  - Pros: we can continue to rely on other features of sqlstring if we need them in the future
  - Cons: maintaining upstream changes can become unwieldy
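To give a sense of the first option's scope, here is a sketch of a Postgres-flavored replacement, assuming the method in question is identifier escaping (an assumption on my part). Standard SQL quotes identifiers with double quotes and doubles any embedded quotes, whereas sqlstring uses MySQL-style backticks.

```javascript
// Postgres-style identifier escaping: wrap in double quotes and
// double any embedded double quotes, e.g. my"col -> "my""col".
// (sqlstring's escapeId does the equivalent with MySQL backticks.)
function escapeIdPg(identifier) {
  return '"' + String(identifier).replace(/"/g, '""') + '"'
}
```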