lz4-java
LZ4 compression for Java
I have read this article https://github.com/lz4/lz4/blob/dev/doc/lz4_Frame_format.md and it looks quite suitable for me to compress continuous incoming data and append it to a compressed local file. But unfortunately I cannot find...
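The LZ4 frame format allows independent frames to be concatenated in one file, so each incoming batch can be compressed as its own frame and appended. A minimal stand-in sketch using the JDK's GZIP streams (which have the same concatenated-member property); with lz4-java, `LZ4FrameOutputStream`/`LZ4FrameInputStream` would play the analogous roles:

```java
import java.io.*;
import java.util.zip.*;

public class AppendFrames {
    // Compress one chunk as a self-contained member and append it to the
    // underlying stream. finish() writes the trailer without closing the
    // underlying stream, so further members can follow.
    static void appendChunk(OutputStream file, byte[] chunk) throws IOException {
        GZIPOutputStream gz = new GZIPOutputStream(file);
        gz.write(chunk);
        gz.finish();
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream file = new ByteArrayOutputStream();
        appendChunk(file, "first batch ".getBytes("UTF-8"));
        appendChunk(file, "second batch".getBytes("UTF-8"));

        // GZIPInputStream transparently reads concatenated members back-to-back.
        GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(file.toByteArray()));
        System.out.println(new String(in.readAllBytes(), "UTF-8")); // first batch second batch
    }
}
```

The key point is that each append produces a complete, independently decodable frame, so a crash between appends never corrupts earlier data.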
While doing some testing I found a bug in my code related to how read(byte[]) will only return data up to the end of the current decompressed buffer. It...
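This is the standard `InputStream.read(byte[])` contract: a single call may return fewer bytes than requested (for a decompressing stream, typically stopping at the current block boundary), so callers that need a full buffer must loop. A generic sketch of that loop:

```java
import java.io.*;

public class ReadFully {
    // read(byte[], off, len) may stop at an internal buffer boundary and
    // return fewer bytes than requested; loop until the buffer is full
    // or EOF is reached, and return how many bytes were actually read.
    static int readFully(InputStream in, byte[] buf) throws IOException {
        int total = 0;
        while (total < buf.length) {
            int n = in.read(buf, total, buf.length - total);
            if (n < 0) break; // EOF
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[1000];
        // Small internal buffer forces several short reads.
        InputStream in = new BufferedInputStream(new ByteArrayInputStream(data), 64);
        byte[] buf = new byte[1000];
        System.out.println(readFully(in, buf)); // 1000
    }
}
```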
1) `StreamingXXHash32JNI` and `StreamingXXHash64JNI` should not override `finalize()`; it is bad for GC. Instead, they should use a Cleaner. 2) They should support `close()` for explicit resource release. 3) The `close()` method...
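The `java.lang.ref.Cleaner` pattern the issue asks for can be combined with `close()` so the native context is released explicitly when possible and by the cleaner as a safety net otherwise. A hedged sketch with a stand-in for the JNI-held state (the `NativeState`/`StreamingHash` names are illustrative, not lz4-java's):

```java
import java.lang.ref.Cleaner;
import java.util.concurrent.atomic.AtomicBoolean;

public class CleanerSketch {
    private static final Cleaner CLEANER = Cleaner.create();

    // The cleanup action must not hold a reference to the owning object,
    // or the owner could never become unreachable; so the native state
    // lives in its own object, which is also the Runnable.
    static final class NativeState implements Runnable {
        final AtomicBoolean released = new AtomicBoolean(false);
        public void run() { released.set(true); } // would free the JNI context here
    }

    static final class StreamingHash implements AutoCloseable {
        private final NativeState state = new NativeState();
        private final Cleaner.Cleanable cleanable = CLEANER.register(this, state);

        // Explicit release; Cleanable.clean() is idempotent, so a later
        // cleaner run after close() is a no-op.
        @Override public void close() { cleanable.clean(); }
        boolean isReleased() { return state.released.get(); }
    }

    public static void main(String[] args) {
        StreamingHash h = new StreamingHash();
        h.close();
        System.out.println(h.isReleased()); // true
    }
}
```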
There are a number of interfaces: `LZ4Decompressor`, `LZ4Compressor`, `LZ4UnknownSizeDecompressor`. These all have abstract `compress`/`decompress` methods that are marked deprecated. Could this be cleaned up? Shouldn't it be possible to just pass...
This feature enables a "very large stream" hashing use case: Instead of trying to read a large remote object in a single pass, one can fetch chunks of it at...
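The chunked-hashing pattern is the same for any streaming checksum: feed each fetched chunk into a persistent state object and read the value at the end; the result matches hashing the whole object in one pass. Sketched here with the JDK's `CRC32` as a stand-in; lz4-java's `StreamingXXHash32` exposes the analogous `update(byte[], int, int)`/`getValue()` shape:

```java
import java.util.zip.CRC32;

public class ChunkedHash {
    public static void main(String[] args) {
        byte[] object = new byte[100_000];
        for (int i = 0; i < object.length; i++) object[i] = (byte) i;

        // One-shot hash of the whole object, for comparison.
        CRC32 whole = new CRC32();
        whole.update(object, 0, object.length);

        // Same object hashed chunk by chunk, as if each chunk were a
        // separate remote fetch; the state carries across updates.
        CRC32 streaming = new CRC32();
        int chunk = 8192;
        for (int off = 0; off < object.length; off += chunk) {
            int len = Math.min(chunk, object.length - off);
            streaming.update(object, off, len);
        }
        System.out.println(whole.getValue() == streaming.getValue()); // true
    }
}
```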
[Druid](http://druid.io/) is an open source datastore which uses lz4-java for fast compression. We are moving our implementation from ByteBuffer to [Memory](https://github.com/DataSketches/memory). Memory provides the following benefits over ByteBuffer -...
If I only want to decompress some part of the file, let's say a section at the beginning and some other bits at the end, I can use skip(). But...
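One gotcha with this approach: `InputStream.skip(n)` is allowed to skip fewer than `n` bytes (and on a decompressing stream it generally still has to decompress everything it passes over), so reliable seeking forward needs a loop. A generic sketch:

```java
import java.io.*;

public class SkipFully {
    // skip() may skip fewer bytes than requested; loop until the full
    // count is consumed, falling back to read() to make progress, and
    // fail loudly if EOF arrives early.
    static void skipFully(InputStream in, long n) throws IOException {
        while (n > 0) {
            long skipped = in.skip(n);
            if (skipped <= 0) {
                if (in.read() < 0) throw new EOFException();
                skipped = 1;
            }
            n -= skipped;
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] data = {0, 1, 2, 3, 4, 5, 6, 7};
        InputStream in = new ByteArrayInputStream(data);
        skipFully(in, 5);
        System.out.println(in.read()); // 5
    }
}
```

True random access without decompressing the skipped middle needs frame/block boundaries recorded somewhere, which the plain streaming API does not provide by itself.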
I've studied the implementation of `wildIncrementalCopy` and found that it has a suboptimal approach to copying ranges that are close to each other (source and destination offsets differ by 32...
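For context, the reason overlapping ranges need care at all: LZ77-style match expansion copies from a destination that can overlap its source, so bytes written earlier in the same copy are read again, and the copy must proceed strictly front to back. A naive but correct reference sketch (fast implementations like `wildIncrementalCopy` use wider multi-byte copies, which is exactly why small offsets are the tricky case):

```java
public class IncrementalCopy {
    // Byte-at-a-time overlapping copy: when destOff - srcOff < len, the
    // loop re-reads bytes it wrote earlier in this same call, which is
    // what expands a short pattern into a long repeated run.
    static void incrementalCopy(byte[] buf, int srcOff, int destOff, int len) {
        for (int i = 0; i < len; i++) {
            buf[destOff + i] = buf[srcOff + i];
        }
    }

    public static void main(String[] args) {
        byte[] buf = new byte[12];
        buf[0] = 'a'; buf[1] = 'b'; buf[2] = 'c';
        // Offset 3, length 9: expands "abc" into three more repetitions.
        incrementalCopy(buf, 0, 3, 9);
        System.out.println(new String(buf)); // abcabcabcabc
    }
}
```

A `System.arraycopy` here would be wrong for small offsets, since it behaves as if the source were copied out first; the optimization question in the issue is how wide a chunked copy can safely be for a given offset.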
I'm going over your code while considering to use it for a Hazelcast enterprise feature. In the same effort I have done some I/O coding similar to the `write(byte[], int,...