schema-registry
Error deserializing json `{` to Avro of schema when the published message contains the Windows newline sequence `\r\n`.
Steps to reproduce:
- Run zookeeper
docker run -d --net=host --name=zookeeper -e ZOOKEEPER_CLIENT_PORT=32181 confluentinc/cp-zookeeper
- Run kafka
docker run -d --net=host --name=kafka -e KAFKA_ZOOKEEPER_CONNECT=localhost:32181 -e KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://localhost:29092 -e KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR=1 confluentinc/cp-kafka
- Run schema-registry
docker run -d --net=host --name=schema-registry -e SCHEMA_REGISTRY_KAFKASTORE_CONNECTION_URL=localhost:32181 -e SCHEMA_REGISTRY_HOST_NAME=localhost -e SCHEMA_REGISTRY_LISTENERS=http://localhost:8081 confluentinc/cp-schema-registry
- Run bash
docker run -it --net=host --rm confluentinc/cp-schema-registry bash
- Publish a message (fails with the exception below)
/usr/bin/kafka-avro-console-producer \
--broker-list localhost:29092 --topic bar \
--property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}'
Publish a message whose lines end with `\r\n`:
{
"f1": "value1"
}
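The console producer reads standard input one line at a time, so with the pretty-printed record above the Avro JSON decoder receives only the first line, `{`, which is not complete JSON on its own. A minimal shell sketch of that effective behaviour:

```shell
# Simulate line-by-line reading of the pretty-printed record: the first
# "message" handed to the decoder is just the opening brace.
first_line=$(printf '{\r\n  "f1": "value1"\r\n}\r\n' | head -n 1 | tr -d '\r')
echo "$first_line"   # prints: {
```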
Stack trace:
org.apache.kafka.common.errors.SerializationException: Error deserializing json { to Avro of schema {"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}
Caused by: com.fasterxml.jackson.core.io.JsonEOFException: Unexpected end-of-input: expected close marker for Object (start marker at [Source: (String)"{"; line: 1, column: 1])
at [Source: (String)"{"; line: 1, column: 2]
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportInvalidEOF(ParserMinimalBase.java:664)
at com.fasterxml.jackson.core.base.ParserBase._handleEOF(ParserBase.java:486)
at com.fasterxml.jackson.core.base.ParserBase._eofAsNextChar(ParserBase.java:498)
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._skipWSOrEnd(ReaderBasedJsonParser.java:2360)
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:671)
at org.apache.avro.io.JsonDecoder.doAction(JsonDecoder.java:482)
at org.apache.avro.io.parsing.Parser.advance(Parser.java:86)
at org.apache.avro.io.JsonDecoder.advance(JsonDecoder.java:132)
at org.apache.avro.io.JsonDecoder.readString(JsonDecoder.java:212)
at org.apache.avro.io.JsonDecoder.readString(JsonDecoder.java:207)
at org.apache.avro.io.ResolvingDecoder.readString(ResolvingDecoder.java:208)
at org.apache.avro.generic.GenericDatumReader.readString(GenericDatumReader.java:469)
at org.apache.avro.generic.GenericDatumReader.readString(GenericDatumReader.java:459)
at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:191)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:160)
at org.apache.avro.generic.GenericDatumReader.readField(GenericDatumReader.java:259)
at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:247)
at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:179)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:160)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:153)
at io.confluent.kafka.schemaregistry.avro.AvroSchemaUtils.toObject(AvroSchemaUtils.java:190)
at io.confluent.kafka.formatter.AvroMessageReader.readFrom(AvroMessageReader.java:121)
at io.confluent.kafka.formatter.SchemaMessageReader.readMessage(SchemaMessageReader.java:316)
at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:51)
at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)
The console producer (from upstream Kafka) intentionally parses its input one line at a time, so this is not a problem specific to the Avro console producer.
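Given that behaviour, one workaround is to collapse the record onto a single line before piping it to the producer. A hedged sketch (assumes a single record per invocation, since deleting all line breaks joins every line):

```shell
# Collapse the pretty-printed record onto one line by deleting CR and LF.
# The resulting single line is valid JSON the console producer can parse.
oneline=$(printf '{\r\n  "f1": "value1"\r\n}\r\n' | tr -d '\r\n')
echo "$oneline"
```

The same idea applied to the repro would be e.g. `tr -d '\r\n' < message.json | /usr/bin/kafka-avro-console-producer ...`, where `message.json` is a hypothetical file holding the record; for multiple records per run, each record must end up on its own line instead (e.g. via `jq -c .`, if `jq` is available in the image).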