Can't insert blob as map value
For some reason, a byte array can't be serialized when it's used as a map value. I added a failing test demonstrating this: https://github.com/nivekuil/alia/commit/3b4973711be79df5f62f15be2aa79bdda2f83f60. The insert throws `Caused by: com.datastax.oss.driver.api.core.type.codec.CodecNotFoundException: Codec not found for requested operation: [BLOB <-> [B]`.
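For context, here's a minimal Java sketch of the failing pattern at the driver level (the actual test is Clojure/alia; the table name, keyspace, and schema here are made up for illustration, assuming `m map<text, blob>`):

```java
import java.nio.ByteBuffer;
import java.util.Map;

import com.datastax.oss.driver.api.core.CqlSession;
import com.datastax.oss.driver.api.core.cql.PreparedStatement;

public class BlobMapRepro {
  public static void main(String[] args) {
    try (CqlSession session = CqlSession.builder().withKeyspace("ks").build()) {
      PreparedStatement ps =
          session.prepare("INSERT INTO t (id, m) VALUES (?, ?)");

      // Fails: codec lookup for the map's value resolves byte[] ([B),
      // and no BLOB <-> [B codec is registered by default.
      session.execute(ps.bind(1, Map.of("k", new byte[] {1, 2, 3})));

      // Works: the driver's default BLOB codec targets ByteBuffer.
      session.execute(ps.bind(2, Map.of("k", ByteBuffer.wrap(new byte[] {1, 2, 3}))));
    }
  }
}
```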
The same happens for a vector/List of byte arrays. Looking into it, I found some leads:
- https://github.com/datastax/java-driver/pull/1555
- https://github.com/datastax/spark-cassandra-connector/pull/1334
From the first link, I figured out a workaround. Quoting the explanation there:
> If you are looking for a BLOB codec, the right call is `CachingCodecRegistry.codecFor(DataType.BLOB, ByteBuffer.class)`. Indeed, there is a default BLOB codec registered in the driver that converts BLOB columns to `ByteBuffer` objects.
>
> But, as explained in the docs, this method does not accept covariant classes. So if you try, say, `CachingCodecRegistry.codecFor(DataType.BLOB, HeapByteBuffer.class)` – then the method will throw `CodecNotFoundException` – even if `HeapByteBuffer` is a subtype of `ByteBuffer`.
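In code, the distinction looks like this (a sketch against the 4.x `CodecRegistry` API; note the constant lives on `DataTypes` in driver 4.x):

```java
import java.nio.ByteBuffer;

import com.datastax.oss.driver.api.core.CqlSession;
import com.datastax.oss.driver.api.core.type.DataTypes;
import com.datastax.oss.driver.api.core.type.codec.TypeCodec;
import com.datastax.oss.driver.api.core.type.codec.registry.CodecRegistry;

public class CodecLookup {
  public static void main(String[] args) {
    try (CqlSession session = CqlSession.builder().build()) {
      CodecRegistry registry = session.getContext().getCodecRegistry();

      // Works: an exact match for the default BLOB codec's Java type.
      TypeCodec<ByteBuffer> ok = registry.codecFor(DataTypes.BLOB, ByteBuffer.class);

      // Throws CodecNotFoundException: buf.getClass() is the concrete
      // HeapByteBuffer subtype, and codecFor does not accept covariant classes.
      ByteBuffer buf = ByteBuffer.wrap(new byte[] {1});
      TypeCodec<?> broken = registry.codecFor(DataTypes.BLOB, buf.getClass());
    }
  }
}
```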
This is probably not the right way to fix it, but it unblocks me for now: https://github.com/mpenet/alia/commit/df69e5034c06ba90f489360bd039fecfed18f516
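For anyone hitting the same thing: if the driver version in use ships the extra codecs from the first PR above, a cleaner workaround may be to register the built-in BLOB <-> byte[] codec at session build time (a sketch; I haven't verified which driver release `ExtraTypeCodecs` landed in):

```java
import com.datastax.oss.driver.api.core.CqlSession;
import com.datastax.oss.driver.api.core.type.codec.ExtraTypeCodecs;

public class SessionWithByteArrayCodec {
  public static void main(String[] args) {
    // Registers a BLOB <-> byte[] codec so byte arrays are usable directly,
    // including inside collections like map<text, blob>.
    try (CqlSession session = CqlSession.builder()
        .addTypeCodecs(ExtraTypeCodecs.BLOB_TO_ARRAY)
        .build()) {
      // ... use the session as usual
    }
  }
}
```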