Klaus Post
> I'm curious how we can move the dict.BuildZstdDict function out of the experimental phase TBH I don't see it happening. I don't have the bandwidth for it at the...
Also sorry for being a bit distracted in my last replies... Let me provide some context: `dict.BuildZstdDict(input [][]byte, o Options)` is the higher level API. What it mainly does is...
Fix in https://github.com/klauspost/compress/pull/951 Made a few tweaks to your test. A) You are writing to the input buffer when decompressing. B) EncodeAll/DecodeAll append - adding a `[:0]` to make that clear....
The threshold is mostly to filter out candidates in large sets. Though everything being discarded could indicate that there is little of actual value. I'll merge your change back into...
pgzip uses concatenated deflate blocks. Blocks back-reference previous blocks, which is why there is no practical compression loss. Therefore it is not possible to decompress these blocks even if you...
Just decompress, but check whether the size is reasonable with [DecodedLen](https://pkg.go.dev/github.com/golang/snappy#DecodedLen) first.
[s2d](https://github.com/klauspost/compress/tree/master/s2#s2d) supports snappy stream decompression.
You are using the block decompressor to decode what is probably a stream. These are distinct formats (streams contain wrapped blocks). Try with a [Reader](https://pkg.go.dev/github.com/golang/snappy#NewReader).
#19677 also related.
What would be the benefit of this? I could easily see speed-focused encoders having a predefined table - even if suboptimal, having a non-zero probability for unused codes IMO...