spring-data-elasticsearch
Ignore property when value is null or empty
Due to the migration from Jackson there's currently no way to ignore a property with a null or empty value when serializing. Is it possible to add this attribute ("IgnoreNullOrEmpty") to the Field annotation?
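For context, with the earlier Jackson-based entity mapping this could be expressed through Jackson's include rules. A minimal sketch of that mechanism (not code from the report, the class name here is just illustrative):

import java.util.Map;

import com.fasterxml.jackson.annotation.JsonInclude;

public class JacksonMappedPerson {

    private String name;

    // Jackson's NON_EMPTY rule suppresses the property when the value is
    // null or an empty map, so no "metadata" key appears in the JSON.
    @JsonInclude(JsonInclude.Include.NON_EMPTY)
    private Map<String, Object> metadata;

    // getters and setters omitted
}

The request is for an equivalent switch on the @Field annotation now that the MappingElasticsearchConverter is used instead of Jackson.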
What kind of serialization are you talking about? The MappingElasticsearchConverter in Spring Data Elasticsearch does not send null properties to Elasticsearch.
Please provide a minimal runnable example that reproduces the problem.
An Object's property that has a null or empty value DOES get sent as EMPTY if the property type is a Map.
A simple example is below:
public class Person {

    private String name;
    private Map<String, Object> metadata;

    // getters/setters
}
When using Spring Data Elasticsearch to save this Person class with only the name set to "foo" (elasticsearchTemplate.save(entity, IndexCoordinates.of(...))), that object gets serialized into the following JSON (which then gets sent to our ES cluster):
{
    "name" : "foo",
    "metadata" : {}
}
I'm trying to ONLY serialize the metadata property if it's not null and not empty.
I believe the heart of the issue lies within MappingElasticsearchConverter's write method. Specifically, it doesn't do a check for an empty Map.
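A minimal sketch of the kind of check being described (this is not the converter's actual code; the helper class and method names are made up for illustration):

import java.util.Map;

final class EmptyValueCheck {

    private EmptyValueCheck() {
    }

    // A value would be skipped during serialization when it is null,
    // or when it is a Map with no entries.
    static boolean shouldSkip(Object value) {
        if (value == null) {
            return true;
        }
        return value instanceof Map && ((Map<?, ?>) value).isEmpty();
    }
}

Applied per property in the converter's write path, a guard like this would keep an empty metadata map out of the resulting document.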
I have just tested this with the current version:
Using this entity:
public class Foo {

    @Id
    private String id;

    @Nullable
    @Field(type = FieldType.Object)
    private Map<String, Object> map1;

    @Nullable
    @Field(type = FieldType.Object)
    private Map<String, Object> map2;

    // getter and setter
}
and this code:
var foo = new Foo();
foo.setId("42");
foo.setMap1(null);
foo.setMap2(new LinkedHashMap<>());
operations.save(foo);
the following is sent to Elasticsearch:
{"_class":"com.sothawo.springdataelastictest.so.Foo","id":"42","map2":{}}
The property with the value null is not included.
The code you show is not the place where a property is written; the check for this is at https://github.com/spring-projects/spring-data-elasticsearch/blob/main/src/main/java/org/springframework/data/elasticsearch/core/convert/MappingElasticsearchConverter.java#L945-L955
You can use an EntityCallback (https://docs.spring.io/spring-data/elasticsearch/docs/current/reference/html/#entity-callbacks) to set an empty map property to null before it is converted:
import org.springframework.data.elasticsearch.core.event.BeforeConvertCallback;
import org.springframework.data.elasticsearch.core.mapping.IndexCoordinates;
import org.springframework.stereotype.Component;
import org.springframework.util.CollectionUtils;

@Component
public class FooBeforeConvertCallback implements BeforeConvertCallback<Foo> {

    @Override
    public Foo onBeforeConvert(Foo foo, IndexCoordinates indexCoordinates) {

        // normalize empty maps to null so the converter skips them
        if (CollectionUtils.isEmpty(foo.getMap1())) {
            foo.setMap1(null);
        }
        if (CollectionUtils.isEmpty(foo.getMap2())) {
            foo.setMap2(null);
        }

        return foo;
    }
}
Perfect example. The initialized, but empty, LinkedHashMap is the issue I am seeing. Thank you.
I still think this is a valid issue, as it will cause me to have to write a lot of callback code. Can you not update the MappingElasticsearchConverter to ignore initialized but empty Maps?
I would simply override the writeProperties method, however I cannot, due to MapValueAccessor not being visible.
The question is: which criteria define that a property is not null, but initialized and empty? This would have to work with Collections, but probably with String as well. And it would need an extra configuration on the field, as not writing such non-null values is not the default behaviour. For example, Elasticsearch differentiates between an empty ("") string and a non-existent string when an exists query is used.
Another point to consider: what happens to properties that are not nullable, for example when the application using Spring Data Elasticsearch is written in Kotlin and has a non-null Map property? On writing, such a property would not be written, and on reading there would be no value to set on the property, because we would get null from Elasticsearch.
It would be possible to add a property storeEmptyValue to the @Field annotation with a default of true. This would only be considered if the property is either a String or derives from java.util.Collection.
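If such an attribute were added, usage might look roughly like the following sketch (storeEmptyValue does not exist at this point; the attribute name and default are taken from the proposal above, and the entity is just an example):

import java.util.Map;

import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;
import org.springframework.lang.Nullable;

@Document(indexName = "person")
public class Person {

    @Id
    private String id;

    // hypothetical attribute: an empty map would not be written to Elasticsearch
    @Nullable
    @Field(type = FieldType.Object, storeEmptyValue = false)
    private Map<String, Object> metadata;

    // getters and setters omitted
}

With the proposed default of true, existing behaviour would stay unchanged unless a property opts out.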
As a work-around I've had to do the following, which ONLY sets the value to null if it's an empty Map. I then traverse through the properties and remove any null values.
ElasticsearchConfiguration.java
@Bean
public ElasticSearchClient esClient() {
    ...
}

@Bean
public ElasticsearchOperations elasticsearchTemplate() {
    return new ElasticsearchRestTemplate(esClient(), getIgnoreNullOrEmptyMapConverter());
}

@Bean
@Primary
MappingElasticsearchConverter getIgnoreNullOrEmptyMapConverter() {
    // mappingContext is available elsewhere in this configuration class
    return new IgnoreNullOrEmptyMapConverter(mappingContext);
}

private class IgnoreNullOrEmptyMapConverter extends MappingElasticsearchConverter {

    public IgnoreNullOrEmptyMapConverter(
            MappingContext<? extends ElasticsearchPersistentEntity<?>, ElasticsearchPersistentProperty> mappingContext) {
        super(mappingContext);
    }

    @Override
    protected void writeEntity(ElasticsearchPersistentEntity<?> entity, Object source, Document sink,
            @Nullable TypeInformation<?> containingStructure) {

        super.writeEntity(entity, source, sink, containingStructure);

        // remove all properties whose value ended up as null
        Iterator<Entry<String, Object>> iter = sink.entrySet().iterator();
        while (iter.hasNext()) {
            Entry<String, Object> entry = iter.next();
            if (Objects.isNull(entry.getValue())) {
                iter.remove();
            }
        }
    }

    @Override
    protected Object getWriteComplexValue(ElasticsearchPersistentProperty property, TypeInformation<?> typeHint,
            Object value) {

        Object val = super.getWriteComplexValue(property, typeHint, value);

        // replace empty maps with null so they are removed in writeEntity
        if (typeHint.isMap() && ((Map<?, ?>) val).isEmpty()) {
            val = null;
        }

        return val;
    }
}
I still think that we need the ability to remove or ignore null or empty properties when serializing. This could be accomplished through @Field or some other annotation.