avro
Avro codec and code generation for Go
We're currently only using the base Go type name as the Avro name, but this can lead to name clashes if there are two types with the same name from...
We could potentially support decoding into `interface{}` values, where what's inside the value is filled in from the writer schema. This would provide an idiomatic and consistent way to inspect...
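A minimal sketch of what such an `interface{}` decode might yield, assuming records become `map[string]interface{}` and arrays become `[]interface{}` (the value below is constructed by hand purely to illustrate the shape; the real decoder would fill it in from the writer schema):

```go
package main

import "fmt"

// sketchDecode illustrates the shape an interface{} decode could
// plausibly produce: records as map[string]interface{}, arrays as
// []interface{}, Avro longs as int64. This is a hand-built stand-in,
// not output from the actual decoder.
func sketchDecode() interface{} {
	return map[string]interface{}{
		"name": "hello",
		"ids":  []interface{}{int64(1), int64(2)},
	}
}

func main() {
	rec := sketchDecode().(map[string]interface{})
	fmt.Println(rec["name"])                          // hello
	fmt.Println(len(rec["ids"].([]interface{})))      // 2
}
```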
Currently, if an integer type is read into a smaller Go type, the value can silently overflow. We should check for that and return an error when it happens. See relevant TODO...
It would be nice to be able to check type compatibility. A possible API: ``` // CompatMode defines a compatibility mode used for checking Avro // type compatibility. type CompatMode...
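One way the `CompatMode` type hinted at above could be fleshed out; the constant names and the `Compatible` function are assumptions for illustration, not the library's actual API, and the body is a placeholder rather than real schema comparison:

```go
package main

import "fmt"

// CompatMode defines a compatibility mode used for checking Avro
// type compatibility (sketch of the type named in the proposed API).
type CompatMode int

const (
	CanRead  CompatMode = 1 << iota // reader schema can read writer data
	CanWrite                        // writer schema can read reader data
)

// Compatible reports whether two schemas (given as JSON strings) are
// compatible under the given mode. A real implementation would parse
// and structurally compare the schemas; this stub only shows the shape.
func Compatible(reader, writer string, mode CompatMode) bool {
	// Trivial placeholder: identical schemas are always compatible.
	return reader == writer
}

func main() {
	fmt.Println(Compatible("\"int\"", "\"int\"", CanRead)) // true
}
```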
Both maps and slices can already be nil, so perhaps we could encode the union `["null", {"type": "map", "values": "int"}]` as `map[string]int`, not `*map[string]int` as we do currently. But this...
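A small sketch of the idea: since a Go map value can itself be nil, the nil map can stand in for the `"null"` branch of the union without an extra pointer (the `describe` helper is hypothetical, just to show the distinction):

```go
package main

import "fmt"

// describe shows how a plain map[string]int can represent both branches
// of ["null", {"type": "map", "values": "int"}]: nil for "null", and a
// non-nil map for the map branch. Hypothetical helper for illustration.
func describe(m map[string]int) string {
	if m == nil {
		return "null" // the "null" branch of the union
	}
	return fmt.Sprintf("map with %d entries", len(m))
}

func main() {
	fmt.Println(describe(nil))                    // null
	fmt.Println(describe(map[string]int{"a": 1})) // map with 1 entries
}
```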
Currently if you've got several records with matching names (but different namespaces), you'll get an error because we'll generate duplicate Go types. We could allow `avro-generate-go` to specify a mapping...
We'd like to support OCF files, and potentially other kinds of streamed Avro record files. Possible API: ``` // NewEncoder returns a new Encoder instance that encodes // a stream...
It would be nice to support JSON marshaling and unmarshaling of data values as specified in the Avro JSON encoding. Currently we could do that via an external library (https://github.com/linkedin/goavro)...
Do we have any plan to support the decimal logical type? Then we could do something like decimal numbers with a maximum precision of 4 and a scale of 2: ```...
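One way the Go side of a `decimal(4,2)` value might be handled, as a sketch: Avro stores the unscaled integer, and the reader reconstructs the value by dividing by 10^scale. The `decimalFromUnscaled` helper is hypothetical, not a proposed library function:

```go
package main

import (
	"fmt"
	"math/big"
)

// decimalFromUnscaled rebuilds a decimal value from its Avro
// representation: the unscaled integer divided by 10^scale.
// Hypothetical helper for illustration.
func decimalFromUnscaled(unscaled int64, scale int) *big.Rat {
	denom := new(big.Int).Exp(big.NewInt(10), big.NewInt(int64(scale)), nil)
	return new(big.Rat).SetFrac(big.NewInt(unscaled), denom)
}

func main() {
	// decimal(4,2): at most 4 significant digits, 2 after the point.
	v := decimalFromUnscaled(1234, 2) // represents 12.34
	fmt.Println(v.FloatString(2))     // 12.34
}
```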
```scala
val str = """
  |{
  |  "type": "record",
  |  "name": "TestRecord",
  |  "namespace": "org.apache.avro.test",
  |  "doc": "test schema",
  |  "fields": [
  |    {
  |      "name": "name",
  |      "type": {...
```