aws-sdk-java-v2
`DefaultAttributeConverterProvider` throws `IndexOutOfBoundsException` on `Object` class
DefaultAttributeConverterProvider of DynamoDB retrieves its matched types incorrectly.
This causes it to create the wrong converter for some generic types.
Describe the bug
The minimal reproduction scenario is:
new DefaultAttributeConverterProvider().converterFor(EnhancedType.of(Object.class));
It generates the following stacktrace:
java.lang.IndexOutOfBoundsException: Index: 0
at java.util.Collections$EmptyList.get(Collections.java:4456)
at software.amazon.awssdk.enhanced.dynamodb.DefaultAttributeConverterProvider.createMapConverter(DefaultAttributeConverterProvider.java:179)
at software.amazon.awssdk.enhanced.dynamodb.DefaultAttributeConverterProvider.findConverter(DefaultAttributeConverterProvider.java:149)
at software.amazon.awssdk.enhanced.dynamodb.DefaultAttributeConverterProvider.converterFor(DefaultAttributeConverterProvider.java:133)
Possible Solution
I think the code should be:
if (type.rawClass() == Map.class) {
Be aware this happens not only here, but also in other places in this class.
The problem is caused by the DefaultAttributeConverterProvider's use of the isAssignableFrom call. It writes:
if (type.rawClass().isAssignableFrom(Map.class)) {
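The direction of that check is the core of the bug, and it can be demonstrated with a JDK-only snippet (no SDK dependency needed):

```java
import java.util.Map;

public class AssignableDemo {
    public static void main(String[] args) {
        // The buggy check asks "can a Map be assigned to this type?".
        // For Object that is trivially true, so Object is mistaken for a map type
        // and routed into createMapConverter, which then fails.
        System.out.println(Object.class.isAssignableFrom(Map.class)); // true

        // The intended question is the reverse: "is this type a Map?".
        System.out.println(Map.class.isAssignableFrom(Object.class)); // false
        System.out.println(Object.class == Map.class);                // false
    }
}
```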
Context
Found this while writing a converter.
Your Environment
- AWS Java SDK version used: latest
@steven-aerts can you tell us more about your use case? What is your converter trying to do, specifically?
I saw this when writing unit tests for my converter.
But this is a generic problem in the DefaultAttributeConverterProvider.
So if you create a @DynamoDbBean with a property of type Object, you also get this exception saying it tried to create a map, while the real problem is that it cannot find a converter for the Object class. This is because of the isAssignableFrom call, which should not be used here.
So the DynamoDB Enhanced client moved away from the design of the Java SDK 1.x DynamoDB Mapper in favor of type safety and more fluent IDE discoverability, so it doesn't support Object types anymore. The idea is for you to have a strongly typed TableSchema.
Hi, I am perfectly aware of that.
I am just saying that if you use an Object by accident, you get the above stack trace due to a bug in the DefaultAttributeConverterProvider. Its code in a few places says:
if (type.rawClass().isAssignableFrom(Map.class)) {
While it should be:
if (type.rawClass() == Map.class) {
Or even:
if (Map.class.isAssignableFrom(type.rawClass())) {
So the code does not produce that strange stack trace, but instead hits the correct error message:
.orElseThrow(() -> new IllegalStateException("Converter not found for " + type))
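A minimal JDK-only sketch (not the SDK's actual code) of how the corrected check direction lets an unsupported type fall through to the intended error path instead of being misrouted into the map branch:

```java
import java.util.Map;
import java.util.Optional;

public class ConverterLookupSketch {
    // Hypothetical stand-in for the provider's lookup; converters are
    // represented as strings purely for illustration.
    static Optional<String> findConverter(Class<?> rawClass) {
        if (Map.class.isAssignableFrom(rawClass)) { // corrected direction
            return Optional.of("MapAttributeConverter");
        }
        if (rawClass == String.class) {
            return Optional.of("StringAttributeConverter");
        }
        return Optional.empty();
    }

    static String converterFor(Class<?> rawClass) {
        return findConverter(rawClass)
            .orElseThrow(() -> new IllegalStateException("Converter not found for " + rawClass));
    }

    public static void main(String[] args) {
        System.out.println(converterFor(java.util.HashMap.class)); // MapAttributeConverter
        try {
            converterFor(Object.class); // no longer mistaken for a map type
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage()); // Converter not found for class java.lang.Object
        }
    }
}
```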
I want to store a List in DynamoDB, and defined the class with @DynamoDbBean with List<Object> as one of the member variables. I get this IndexOutOfBoundsException. What's the solution?
@steven-aerts I stumbled on this issue because it's marked as guidance and hasn't had a lot of updates. I've re-read your description and I understand the issue now: it's about how to prevent an IndexOutOfBoundsException when providing an Object as the enhanced type. Marking this as a bug again.
@MunshiE2911 The solution is to not use Object; the DynamoDB Enhanced client does not support it.
I just ran across this problem myself. If Object is not supported (annoying but something I can deal with), then trying to map an Object field should produce an error message of "Object is not allowed" rather than a spurious unrelated exception (which, as noted above, results from using an objectively incorrect type conditional in the DefaultAttributeConverterProvider).
So the DynamoDB enhanced moved away from the design of the Java SDK 1.x DynamoDB Mapper in favor of type safety and a more fluent IDE discoverability, so it doesn't support Object types anymore. The idea is for you to have a strongly typed TableSchema.
Hi @debora-ito , thank you for these details.
Using strong types is certainly possible for new APIs, but it is a bit of a challenge for existing (legacy) APIs. For instance, some legacy apps use SDK 1.x and are not affected by this, while new apps using SDK 2.x with the enhanced client are. These apps refer to the same Java POJO structure from a parent library.
Considering these challenges, would it be possible to accept function(s) as converter(s) for a type? If no converters are supplied for overriding, the default behavior would use the existing converters. There may be better ways to allow this case to work with the Object type.
Function<T, ? extends R> myObjectTypeConverter
QueryEnhancedRequest.builder()
.typeConverter(myObjectTypeConverter)
.build()
// OR accept as a list of converters
QueryEnhancedRequest.builder()
.typeConverters(List.of(myObjectTypeConverter)) // this will allow overriding conversion behavior
.build()
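A hedged sketch of what such an override mechanism might look like: a caller-supplied registry of per-type converter functions that is consulted before any default behavior. Nothing here is actual SDK API; the class name TypeConverterRegistry and its methods are invented for illustration.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;
import java.util.function.Function;

public class TypeConverterRegistry {
    // Per-type overrides; values are converted to strings purely for illustration.
    private final Map<Class<?>, Function<Object, String>> overrides = new HashMap<>();

    // Register a converter function for one concrete type.
    public <T> TypeConverterRegistry register(Class<T> type, Function<T, String> converter) {
        overrides.put(type, obj -> converter.apply(type.cast(obj)));
        return this;
    }

    // Use the override if one exists, otherwise fall back to a default behavior.
    public String convert(Object value) {
        return Optional.ofNullable(overrides.get(value.getClass()))
            .map(f -> f.apply(value))
            .orElseGet(value::toString);
    }

    public static void main(String[] args) {
        TypeConverterRegistry registry = new TypeConverterRegistry()
            .register(Integer.class, i -> "int:" + i);
        System.out.println(registry.convert(42));      // int:42
        System.out.println(registry.convert("plain")); // plain
    }
}
```

This keeps the strongly typed defaults intact while giving legacy POJOs an escape hatch, which is the spirit of the proposal above.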
Just came here to note that there is also a problem with the documentation, alongside the weird error message that has nothing to do with the actual problem. The DefaultAttributeConverterProvider.builder() method's documentation states:
Equivalent to builder(EnhancedType.of(Object.class)).
But a builder for EnhancedType.of(Object.class) would never work without providing another attribute converter (there isn't even a builder that takes any parameter). That message should be adjusted to something that actually makes sense for users of the library. Maybe "Provides a default set of attribute converters"?
Additionally, while I understand the desire to have strongly typed objects, there is a clear use case for some objects not having strong typing, such as application preferences. For illustration only:
class User {
UUID id;
Map<String, Object> appPrefs;
String fName;
String lName;
}
Without corrupting the domain layer, there doesn't seem to be an easy way to add one attribute that's dynamic along with the other attributes. There are solutions to this in other issues which handle half of the problem such as using AttributeConverters or somehow adding an EnhancedDocument to your schema, but I think we can say that neither of those options is ideal either.
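One partial workaround along the AttributeConverter lines mentioned above can be sketched without any SDK dependency: flatten the dynamic appPrefs map into a Map<String, String> with a type tag per value, so the schema only ever sees strongly typed String attributes. This is illustrative only; PrefsFlattener is an invented helper, and a real solution would wire something like it into a custom AttributeConverter.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PrefsFlattener {
    // Flatten a dynamic map into a string-only map the Enhanced client can store.
    static Map<String, String> flatten(Map<String, Object> prefs) {
        Map<String, String> out = new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : prefs.entrySet()) {
            Object v = e.getValue();
            // Tag each value with its runtime type so it can be restored later.
            out.put(e.getKey(), v.getClass().getSimpleName() + ":" + v);
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> prefs = new LinkedHashMap<>();
        prefs.put("darkMode", true);
        prefs.put("fontSize", 14);
        System.out.println(flatten(prefs)); // {darkMode=Boolean:true, fontSize=Integer:14}
    }
}
```

It still pushes the untyped-to-typed conversion into application code, so it shares the "not ideal" caveat of the other workarounds.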