
79 goavro issues

Tested with avro-tools 1.9.2:

```
$ avro-tools getmeta event.zstd.avro
avro.schema { "type" : "record", "name" : "Event", "fields" : [ { "name" : "body", "type" : "bytes" } ] }
```
...

enhancement

In my system I need to get the `Schema` from an Avro message, and I noticed we can get the `Schema` by calling `(*OCFReader).Metadata()`. But when generating a new `OCFReader` it has to generate...

enhancement
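For reference, the writer schema sits in the OCF header metadata under the reserved `avro.schema` key (the accessor is spelled `MetaData` in the v2 API). A minimal sketch of pulling it out; the file name is illustrative:

```go
package main

import (
	"fmt"
	"os"

	"github.com/linkedin/goavro/v2"
)

func main() {
	f, err := os.Open("event.avro") // illustrative file name
	if err != nil {
		panic(err)
	}
	defer f.Close()

	ocfr, err := goavro.NewOCFReader(f)
	if err != nil {
		panic(err)
	}

	// The writer schema is stored in the OCF header under "avro.schema".
	schema := string(ocfr.MetaData()["avro.schema"])
	fmt.Println(schema)
}
```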

When loading Avro files into BigQuery, there is a limit on the maximum size of the Avro data blocks inside an OCF file; in the BQ case it is 16 MB. `goavro` should...
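Until such a knob exists, a workaround is to rely on each `(*OCFWriter).Append` call emitting a single OCF block, so capping the batch passed per call bounds the block size. A rough sketch; the schema, file name, and batch size are all illustrative:

```go
package main

import (
	"os"

	"github.com/linkedin/goavro/v2"
)

func main() {
	f, err := os.Create("events.avro") // illustrative output file
	if err != nil {
		panic(err)
	}
	defer f.Close()

	ocfw, err := goavro.NewOCFWriter(goavro.OCFConfig{
		W:      f,
		Schema: `{"type": "record", "name": "Event", "fields": [{"name": "body", "type": "bytes"}]}`,
	})
	if err != nil {
		panic(err)
	}

	// Pretend source data; in practice this would stream from elsewhere.
	records := make([]interface{}, 50000)
	for i := range records {
		records[i] = map[string]interface{}{"body": []byte("payload")}
	}

	// Each Append call writes one OCF block, so capping the batch size
	// keeps every block under BigQuery's 16 MB limit.
	const batch = 1000 // illustrative; tune to your record size
	for start := 0; start < len(records); start += batch {
		end := start + batch
		if end > len(records) {
			end = len(records)
		}
		if err := ocfw.Append(records[start:end]); err != nil {
			panic(err)
		}
	}
}
```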

```
Status: Downloaded newer image for docker.elastic.co/beats-dev/golang-crossbuild:1.15.9-arm
>> Building using: cmd='build/mage-linux-amd64 golangCrossBuild', env=[CC=aarch64-linux-gnu-gcc, CXX=aarch64-linux-gnu-g++, GOARCH=arm64, GOARM=, GOOS=linux, PLATFORM_ID=linux-arm64]
# github.com/linkedin/goavro
/go/pkg/mod/github.com/linkedin/[email protected]+incompatible/array.go:48:34: constant -9223372036854775808 overflows int
/go/pkg/mod/github.com/linkedin/[email protected]+incompatible/array.go:86:35: constant -9223372036854775808 overflows int
/go/pkg/mod/github.com/linkedin/[email protected]+incompatible/binaryReader.go:96:27: ...
```

First, I would like to thank you for the library! From a user's perspective, we have been really successful in creating an internal Kafka implementation using `goavro`! Within the team,...

## Background

We have a very specific use case where the data is in a CSV file and there is a separate file that stores the Avro schema. We will have...
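A minimal sketch of that flow, assuming a hypothetical two-column `users.csv` and a `user.avsc` schema with `name` (string) and `age` (int) fields:

```go
package main

import (
	"encoding/csv"
	"os"
	"strconv"

	"github.com/linkedin/goavro/v2"
)

func main() {
	// Load the externally stored schema (file names are illustrative).
	schema, err := os.ReadFile("user.avsc")
	if err != nil {
		panic(err)
	}
	codec, err := goavro.NewCodec(string(schema))
	if err != nil {
		panic(err)
	}

	f, err := os.Open("users.csv")
	if err != nil {
		panic(err)
	}
	defer f.Close()

	rows, err := csv.NewReader(f).ReadAll()
	if err != nil {
		panic(err)
	}

	// Map each CSV row onto the record fields and append the encodings
	// into one buffer of concatenated binary-encoded records.
	var encoded []byte
	for _, row := range rows {
		age, err := strconv.Atoi(row[1])
		if err != nil {
			panic(err)
		}
		native := map[string]interface{}{"name": row[0], "age": age}
		encoded, err = codec.BinaryFromNative(encoded, native)
		if err != nil {
			panic(err)
		}
	}
	os.Stdout.Write(encoded)
}
```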

For fast processing code, decompression is the bottleneck. The Avro format compresses blocks independently, right? So it should be possible to run concurrent decompressions so that more CPU cores can...
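goavro does not expose raw blocks today, but since each OCF block is compressed independently, blocks could be inflated in parallel. A sketch of the pattern with `compress/flate` and a worker pool; the block payloads are illustrative and not tied to goavro's API:

```go
package main

import (
	"bytes"
	"compress/flate"
	"fmt"
	"io"
	"runtime"
	"sync"
)

// decompressBlocks inflates independently compressed blocks across all
// CPU cores; results[i] corresponds to blocks[i].
func decompressBlocks(blocks [][]byte) ([][]byte, error) {
	results := make([][]byte, len(blocks))
	errs := make([]error, len(blocks))
	jobs := make(chan int)

	var wg sync.WaitGroup
	for w := 0; w < runtime.NumCPU(); w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for i := range jobs {
				r := flate.NewReader(bytes.NewReader(blocks[i]))
				results[i], errs[i] = io.ReadAll(r)
				r.Close()
			}
		}()
	}
	for i := range blocks {
		jobs <- i
	}
	close(jobs)
	wg.Wait()

	for _, err := range errs {
		if err != nil {
			return nil, err
		}
	}
	return results, nil
}

func main() {
	// Deflate two toy blocks, then inflate them concurrently.
	var blocks [][]byte
	for _, s := range []string{"block one", "block two"} {
		var buf bytes.Buffer
		w, _ := flate.NewWriter(&buf, flate.DefaultCompression)
		w.Write([]byte(s))
		w.Close()
		blocks = append(blocks, buf.Bytes())
	}
	out, err := decompressBlocks(blocks)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out[0]), string(out[1]))
}
```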

Currently `OCFWriter` uses `DefaultCompression`. The API should expose the ability to set `BestCompression`.
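These names are the `compress/flate` levels. A small sketch comparing the two settings, to show what exposing the knob would buy; the payload is illustrative:

```go
package main

import (
	"bytes"
	"compress/flate"
	"fmt"
)

// compressedSize deflates data at the given level and reports the result size.
func compressedSize(level int, data []byte) int {
	var buf bytes.Buffer
	w, err := flate.NewWriter(&buf, level)
	if err != nil {
		panic(err)
	}
	w.Write(data)
	w.Close()
	return buf.Len()
}

func main() {
	data := bytes.Repeat([]byte("some highly repetitive avro payload "), 10000)
	fmt.Println("default:", compressedSize(flate.DefaultCompression, data))
	fmt.Println("best:   ", compressedSize(flate.BestCompression, data))
}
```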

I'd like to be able to take an OCF as input, then pull out individual binary-encoded objects one at a time. I will store the schema separately and later apply...

enhancement
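One way to approximate this with the current API is to decode each datum from the OCF and immediately re-encode it with the reader's codec, keeping the schema on the side. A sketch; the input file name is illustrative:

```go
package main

import (
	"fmt"
	"os"

	"github.com/linkedin/goavro/v2"
)

func main() {
	f, err := os.Open("input.avro") // illustrative file name
	if err != nil {
		panic(err)
	}
	defer f.Close()

	ocfr, err := goavro.NewOCFReader(f)
	if err != nil {
		panic(err)
	}

	// Keep the schema separately for later decoding.
	schema := ocfr.Codec().Schema()

	// Round-trip each datum through the codec to get standalone
	// binary-encoded objects without the OCF framing.
	var objects [][]byte
	for ocfr.Scan() {
		datum, err := ocfr.Read()
		if err != nil {
			panic(err)
		}
		buf, err := ocfr.Codec().BinaryFromNative(nil, datum)
		if err != nil {
			panic(err)
		}
		objects = append(objects, buf)
	}
	if err := ocfr.Err(); err != nil {
		panic(err)
	}
	fmt.Printf("schema: %s, extracted %d objects\n", schema, len(objects))
}
```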

nativeOutput is not identical to nativeInput:

```
codec, err := goavro.NewCodec(`
{
  "type": "record",
  "name": "name",
  "fields": [
    {"name": "a", "type": [{"items": "string", "type": "array"}]}
  ]
}`)
```
...
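For context, goavro decodes a union value into a single-key `map[string]interface{}` naming the branch, so the input must carry the same wrapper for the round trip to compare equal. A sketch using the `goavro.Union` helper with the issue's schema:

```go
package main

import (
	"fmt"

	"github.com/linkedin/goavro/v2"
)

func main() {
	codec, err := goavro.NewCodec(`
{
  "type": "record",
  "name": "name",
  "fields": [
    {"name": "a", "type": [{"items": "string", "type": "array"}]}
  ]
}`)
	if err != nil {
		panic(err)
	}

	// goavro's native form wraps union values in a single-key map naming
	// the branch; goavro.Union builds that wrapper.
	nativeInput := map[string]interface{}{
		"a": goavro.Union("array", []interface{}{"x", "y"}),
	}

	binary, err := codec.BinaryFromNative(nil, nativeInput)
	if err != nil {
		panic(err)
	}
	nativeOutput, _, err := codec.NativeFromBinary(binary)
	if err != nil {
		panic(err)
	}

	// With the union wrapper on the input side, the round trip matches:
	// map[a:map[array:[x y]]]
	fmt.Println(nativeOutput)
}
```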