
Custom PostgreSQL type mapping

Open vietj opened this issue 6 years ago • 11 comments

PostgreSQL has fixed OID assignments for all built-in types. Extensions add custom types that do not have a fixed OID; their OIDs can change from one database instance to another.

We need to support such types and have the ability to provide a dynamic mapping so that custom types can be correctly encoded/decoded.

So basically we can

  1. start by providing an OID <-> Codec table to the driver, supplied by the user (easy to do?)
  2. add a utility to build such a table using SQL
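A sketch of step 1 above (all names here, `Codec` and `CodecRegistry`, are hypothetical, not part of the vertx-sql-client API):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical codec abstraction: encode/decode a custom type's wire format.
interface Codec<T> {
  byte[] encode(T value);
  T decode(byte[] bytes);
}

// Minimal OID -> codec table the driver could consult when it meets
// an OID it does not recognize. A utility (step 2) could populate it
// by querying pg_type, e.g. SELECT oid FROM pg_type WHERE typname = $1.
final class CodecRegistry {
  private final Map<Integer, Codec<?>> codecs = new ConcurrentHashMap<>();

  void register(int oid, Codec<?> codec) {
    codecs.put(oid, codec);
  }

  Codec<?> lookup(int oid) {
    return codecs.get(oid); // null -> caller falls back to text encoding
  }
}
```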

vietj avatar Jun 28 '19 07:06 vietj

@vietj https://github.com/impossibl/pgjdbc-ng is a JDBC 4.2 compliant driver built on an asynchronous Netty core; I am the author.

One of the central tenets of the driver is to use pluggable codecs in the client that match the inverse of those in the server (see https://github.com/impossibl/pgjdbc-ng/tree/develop/driver/src/main/java/com/impossibl/postgres/system/procs). I'm wondering if/how we could transfer this existing code (which is built, tested, and used in quite a bit of production) to solve the problem outlined here.

Even though I am the driver maintainer, I am quite interested in moving to some version of a reactive driver that has all the same capabilities as the JDBC implementations.

kdubb avatar Aug 14 '21 20:08 kdubb

FYI, the reason I bring this to your attention is that you specifically talked about mapping OIDs to codecs, and that's what pgjdbc-ng does.

kdubb avatar Aug 14 '21 20:08 kdubb

@kdubb that sounds interesting, can you elaborate on how the mapping works?

vietj avatar Aug 17 '21 22:08 vietj

@vietj We tried to execute the large object server-side functions using a prepared statement and values from a Tuple.

But this failed because the OID type (and also VOID?!?) is not mapped in io.vertx.pgclient.impl.codec.DataTypeCodec, with 2 DEBUG logs:

  1. "Data type OID does not support binary encoding"
  2. "Data type VOID does not support binary encoding"

The 2 types should be added to the codec.

dprincethg avatar Feb 15 '23 09:02 dprincethg

can you provide a reproducer to help us ? @dprincethg

vietj avatar Feb 15 '23 09:02 vietj

Sorry, I cannot copy the code here (secured environment).

In short, for the Large Object server-side PUT function:

lo_put ( loid oid, offset bigint, data bytea ) → void

Writes data starting at the given offset within the large object; the large object is enlarged if necessary. Example: lo_put(24528, 1, '\xaa')

The code looks like:

```java
sqlConnection.prepare("SELECT lo_put($1, $2, $3);", ar -> {
  if (ar.succeeded()) {
    PreparedStatement preparedStatement = ar.result();
    preparedStatement.query()
      .execute(Tuple.tuple().addLong(largeObjectOid).addLong(offset).addBuffer(buffer), ar2 -> {
        // ...
      });
  }
});
```

As a workaround, we use the jdbc.postgresql Large Object API inside an executeBlocking call to manage large objects.

We don't use a "simple" query, as we want to avoid building the SQL string with the byte array (buffer) encoded as a hex string (\xaa...), because of the performance impact.
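For context, the hex encoding avoided above would look like the following standalone sketch (`toByteaHexLiteral` is an illustrative helper, not part of any client API). Rendering every payload this way roughly doubles the wire size and burns CPU on large buffers, which is the performance concern:

```java
final class ByteaHex {
  // Render bytes as a PostgreSQL bytea hex literal, e.g. '\xdeadbeef',
  // as a textual "simple" query would require for binary parameters.
  static String toByteaHexLiteral(byte[] data) {
    StringBuilder sb = new StringBuilder(data.length * 2 + 4);
    sb.append("'\\x");
    for (byte b : data) {
      sb.append(String.format("%02x", b));
    }
    return sb.append('\'').toString();
  }
}
```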

dprincethg avatar Feb 15 '23 10:02 dprincethg

It is related to another issue about Large Object support: https://github.com/eclipse-vertx/vertx-sql-client/issues/46.

I think a dedicated API should be provided to manage Large Objects, like jdbc.postgresql's, but in an async way of course ;)

dprincethg avatar Feb 15 '23 13:02 dprincethg

BTW,

As an alternative to the jdbc.postgresql Large Object API, the PostgreSQL documentation suggests this approach:

```java
File file = new File("myimage.gif");
FileInputStream fis = new FileInputStream(file);
PreparedStatement ps = conn.prepareStatement("INSERT INTO images VALUES (?, ?)");
ps.setString(1, file.getName());
ps.setBinaryStream(2, fis, file.length());
ps.executeUpdate();
ps.close();
fis.close();
```

for the table: CREATE TABLE images (imgname text, imgoid oid);

As this alternative relies on a simple "INSERT" query (but with a Java InputStream as the query parameter), the Vert.x PostgreSQL client should be able to manage it.

However, with the Vert.x Tuple there is no way to add a ReadStream<Buffer> (we can't use the addBuffer() method, as we have to handle a ReadStream).
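One way a streaming upload could be organized, assuming lo_put can eventually be invoked per chunk: split the payload into fixed-size chunks, each destined for one lo_put(oid, offset, chunk) call. The sketch below shows only the offset bookkeeping (`ChunkPlanner` is a hypothetical name); the actual async execution would use the Vert.x client once OID parameters are supported:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: compute the {offset, length} pairs for writing
// a payload of totalSize bytes in chunks of at most chunkSize bytes,
// one lo_put(oid, offset, chunk) call per pair.
final class ChunkPlanner {
  static List<long[]> offsets(long totalSize, int chunkSize) {
    List<long[]> plan = new ArrayList<>();
    for (long off = 0; off < totalSize; off += chunkSize) {
      plan.add(new long[]{off, Math.min(chunkSize, totalSize - off)});
    }
    return plan;
  }
}
```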

@vietj Any help?

dprincethg avatar Feb 15 '23 14:02 dprincethg

you can buffer the stream to a buffer first I think

vietj avatar Mar 01 '23 09:03 vietj

> you can buffer the stream to a buffer first I think

No, the stream is too large to be buffered. The purpose here is to stream large uploaded file content into a DB blob (a Large Object in PostgreSQL terminology).

dprincethg avatar Mar 02 '23 08:03 dprincethg

I see, we don't currently support blobs for Postgres.

vietj avatar Mar 02 '23 09:03 vietj