spring-data-relational
Unable to create custom array codecs
Hi,

I'm using spring-data-r2dbc and the extension mechanism from r2dbc-postgresql to create a codec for a custom Postgres range type:

CREATE TYPE timetzrange AS RANGE (subtype = timetz);

Since there's no native support for Postgres range types on Java's side, I created my own class OffsetTimeRange and registered a custom codec OffsetTimeRangeCodec, which implements the interface Codec<OffsetTimeRange>.
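For context, the custom range class can be sketched roughly like this (a minimal illustration only; the real OffsetTimeRange in this thread may carry bound-inclusivity flags and validation):

```java
import java.time.OffsetTime;
import java.util.Objects;

/**
 * A minimal sketch of a Java counterpart for the timetzrange type.
 * Bounds are treated as a half-open interval [lower, upper).
 */
public final class OffsetTimeRange {

    private final OffsetTime lower;
    private final OffsetTime upper;

    public OffsetTimeRange(OffsetTime lower, OffsetTime upper) {
        this.lower = lower;
        this.upper = upper;
    }

    public OffsetTime getLower() { return lower; }

    public OffsetTime getUpper() { return upper; }

    /** True if the given time falls within [lower, upper). */
    public boolean contains(OffsetTime time) {
        return !time.isBefore(lower) && time.isBefore(upper);
    }

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof OffsetTimeRange)) {
            return false;
        }
        OffsetTimeRange other = (OffsetTimeRange) o;
        return Objects.equals(lower, other.lower) && Objects.equals(upper, other.upper);
    }

    @Override
    public int hashCode() {
        return Objects.hash(lower, upper);
    }
}
```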
So far, that worked perfectly fine, but I wanted to add a custom codec for the corresponding multirange type. From the Postgres range types documentation, we know that:

"When you define your own range you automatically get a corresponding multirange type."

So I registered another codec OffsetTimeRangeArrayCodec, which implements the interface Codec<OffsetTimeRange[]>, but this raised the following exception upon reading from/writing to the database:

java.lang.IllegalArgumentException: Unsupported array type: com.example.util.OffsetTimeRange
The problem
I did some debugging and found that the exception is thrown in the following check (line #164): https://github.com/spring-projects/spring-data-relational/blob/b691af7b305adfb0a4f8ea7585dc8f7c0abc80e1/spring-data-r2dbc/src/main/java/org/springframework/data/r2dbc/dialect/PostgresDialect.java#L156-L168

The problem seems to be that the this.simpleTypeHolder.isSimpleType(typeToUse) method checks against the local simple types, which are missing any type introduced by a custom codec.
As always, I'll be happy to help with a PR if someone can point me in the right direction 🙂
Cheers!
Spring Data has an abstraction to detect which types are simple ones (understood natively by the driver). For Spring Data R2DBC, we introduced a Dialect abstraction to keep database specifics within a single type. A dialect object can also tell which Java types are simple types via the R2dbcDialect.getSimpleTypes() method.

Subclassing PostgresDialect, enriching its simple types, and configuring the dialect in your application should allow you to use your application-specific simple types.
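A minimal sketch of that suggestion, assuming the getSimpleTypes() signature of current spring-data-r2dbc versions (class and package names are illustrative):

```java
package com.example;

import java.util.ArrayList;
import java.util.Collection;

import org.springframework.data.r2dbc.dialect.PostgresDialect;

import com.example.util.OffsetTimeRange;

/**
 * A dialect that treats OffsetTimeRange as a simple (driver-native) type,
 * so that Spring Data passes it through to the custom codec untouched.
 */
public class CustomPostgresDialect extends PostgresDialect {

    @Override
    public Collection<? extends Class<?>> getSimpleTypes() {
        // Keep the dialect's built-in simple types and add our own.
        Collection<Class<?>> simpleTypes = new ArrayList<>(super.getSimpleTypes());
        simpleTypes.add(OffsetTimeRange.class);
        return simpleTypes;
    }
}
```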
@mp911de thanks, that worked! However, it was not that easy to get there 😅. Is this just a workaround, or is it documented somewhere?
Also, it'd be great if there could be a simpler way to add custom array types. In a nutshell, these are all the steps I followed to make it work:
- Create a custom codec implementing Codec<OffsetTimeRange[]>
- Add the codec to the CodecRegistrar
- Create a custom @WritingConverter instance:
  - Implementing Converter<OffsetTimeRange, OffsetTimeRange>, where the convert(source) method returns the same source value
- Add the custom writing converter to the R2dbcCustomConversions bean
- Create a CustomPostgresDialect class that extends from PostgresDialect:
  - Override the getSimpleTypes() method
  - Add OffsetTimeRange.class to the collection without losing the types from super.getSimpleTypes()
- Create a CustomDialectProvider class that implements R2dbcDialectProvider:
  - Use the getDialect(..) method to return a CustomPostgresDialect instance
- Finally, as mentioned in the docs, create the file META-INF/spring.factories, which should contain the line:
  org.springframework.data.r2dbc.dialect.DialectResolver$R2dbcDialectProvider=com.example.CustomDialectProvider
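The dialect-provider step above can be sketched as follows (a rough illustration; this assumes the Optional-returning getDialect(ConnectionFactory) contract of DialectResolver.R2dbcDialectProvider, and a CustomPostgresDialect subclass as described in the steps):

```java
package com.example;

import java.util.Optional;

import io.r2dbc.spi.ConnectionFactory;

import org.springframework.data.r2dbc.dialect.DialectResolver;
import org.springframework.data.r2dbc.dialect.R2dbcDialect;

/**
 * Resolves the dialect used by Spring Data R2DBC; discovered via the
 * META-INF/spring.factories entry shown above.
 */
public class CustomDialectProvider implements DialectResolver.R2dbcDialectProvider {

    @Override
    public Optional<R2dbcDialect> getDialect(ConnectionFactory connectionFactory) {
        // Unconditionally resolve to the custom dialect; a stricter
        // implementation could inspect the connection factory metadata first.
        return Optional.of(new CustomPostgresDialect());
    }
}
```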
As you can see, there are a lot of steps and new files. Is there any way to improve this process?
As always, I'll be happy to contribute if possible 🙂
The CodecRegistrar is part of the ConnectionFactory configuration, and we cannot do much here. The WritingConverter shouldn't be necessary. Enums use that mechanism only to bypass the built-in enum conversion.

R2dbcDialectProvider should also not be required, as you can override AbstractR2dbcConfiguration.getDialect() when using Spring Data R2DBC without Spring Boot. With Spring Boot, you currently cannot provide a custom dialect, but that would be something for Spring Boot to improve. You could alternatively provide your own R2dbcCustomConversions bean, as R2dbcCustomConversions can be created from a dialect and a collection of converters.
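That last suggestion can be sketched as a bean definition, assuming the static factory R2dbcCustomConversions.of(R2dbcDialect, Object...); the dialect and converter class names are illustrative:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.r2dbc.convert.R2dbcCustomConversions;

@Configuration
class ConversionsConfig {

    @Bean
    R2dbcCustomConversions r2dbcCustomConversions() {
        // Build the conversions from the enriched dialect plus any custom converters.
        return R2dbcCustomConversions.of(new CustomPostgresDialect(),
                new OffsetTimeRangeWritingConverter()); // names are illustrative
    }
}
```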
Yep, the CodecRegistrar is easy to use (a great feature, IMO!). However, with Spring Boot I still need the WritingConverter, or else I get the following exception:

org.springframework.dao.InvalidDataAccessApiUsageException: Nested entities are not supported

But... the error goes away if I override AbstractR2dbcConfiguration.getDialect() with my custom dialect.
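For reference, the identity writing converter mentioned earlier can be sketched as follows (the converter class name is illustrative):

```java
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.WritingConverter;

import com.example.util.OffsetTimeRange;

/**
 * Identity conversion: it marks OffsetTimeRange as directly writable so
 * Spring Data does not try to map it as a nested entity; the registered
 * codec performs the actual encoding.
 */
@WritingConverter
public class OffsetTimeRangeWritingConverter implements Converter<OffsetTimeRange, OffsetTimeRange> {

    @Override
    public OffsetTimeRange convert(OffsetTimeRange source) {
        // Pass the value through unchanged.
        return source;
    }
}
```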
On Spring Boot, I was able to create a @Configuration class that extends from AbstractR2dbcConfiguration to make some of those overrides. I had to create the ConnectionFactory manually (using the PostgresqlConnectionFactoryProvider builder), which is not a bad thing because it also gave me easy access to register custom codecs 😁
I tried providing my own R2dbcCustomConversions bean created with my custom dialect, but it didn't work. It looks like the check for the simple types uses a dialect provided before the one from the R2dbcCustomConversions bean. That's why I had to switch to the META-INF/spring.factories solution 😅
However, I think the most straightforward solution on Spring Boot was to have a @Configuration class that extends from AbstractR2dbcConfiguration, so we can easily add more customizations. Would this be something OK to do on Spring Boot?
Thanks again for your answers, @mp911de! They have been of great help 🙂