ZstdNet
Combination of CompressionStream and Decompressor fails to Unwrap on multiple iterations
    [TestCase(new byte[0], 0, 0)]
    [TestCase(new byte[] { 1, 2, 3 }, 1, 2)]
    [TestCase(new byte[] { 1, 2, 3 }, 0, 2)]
    [TestCase(new byte[] { 1, 2, 3 }, 1, 1)]
    [TestCase(new byte[] { 1, 2, 3 }, 0, 3)]
    public void StreamCompressAndRegularDecompress(byte[] data, int offset, int count)
    {
        var tempStream = new MemoryStream();
        using (var compressionStream = new CompressionStream(tempStream))
            compressionStream.Write(data, offset, count);

        byte[] decompressedBytes;
        using (var decompressor = new Decompressor())
        {
            decompressedBytes = decompressor.Unwrap(tempStream.ToArray());
        }

        var dataToCompress = new byte[count];
        Array.Copy(data, offset, dataToCompress, 0, count);
        Assert.AreEqual(dataToCompress, decompressedBytes);
    }
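A minimal sketch of the symmetric streaming round trip, for comparison (same hypothetical `data`/`offset`/`count` as in the test above). As far as we can tell, `DecompressionStream` does not need the content size up front the way `Decompressor.Unwrap` does, so pairing it with `CompressionStream` avoids the exception:

```csharp
// Compress with the streaming API, as in the test above.
var tempStream = new MemoryStream();
using (var compressionStream = new CompressionStream(tempStream))
    compressionStream.Write(data, offset, count);

// Decompress with the streaming API instead of Decompressor.Unwrap:
// the streaming decoder consumes the frame incrementally and does not
// require the content-size field in the frame header.
tempStream.Position = 0;
var result = new MemoryStream();
using (var decompressionStream = new DecompressionStream(tempStream))
    decompressionStream.CopyTo(result);
byte[] decompressedBytes = result.ToArray();
```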
@DruSchmitt Hi, we have encountered the same problem. If compression is done using the stream approach and you then try to decompress using the Unwrap method, you get an exception:
ZstdNet.ZstdException: Decompressed content size cannot be determined (e.g. invalid magic number, srcSize too small)
at ZstdNet.Decompressor.GetDecompressedSize(ReadOnlySpan`1 src)
at ZstdNet.Decompressor.Unwrap(ReadOnlySpan`1 src, Int32 maxDecompressedSize)
at ZstdNet.Decompressor.Unwrap(ArraySegment`1 src, Int32 maxDecompressedSize)
at ZstdNet.Decompressor.Unwrap(Byte[] src, Int32 maxDecompressedSize)
In our case this caused some pain, as we were using Snowflake's DECOMPRESS_STRING feature: it supports ZSTD mode, but kept failing with an error. It took us some time to figure out that the stream and non-stream approaches are not compatible.
Is there a workaround for this? It looks like some header information is missing from the output stream — do we need to provide the content size to the stream manually?
We tried appending content-length bytes, similar to how a gzip stream does, but it did not help.
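Appending length bytes probably cannot work here: unlike gzip's trailing ISIZE field, zstd records the decompressed size as an optional Frame_Content_Size field inside the frame header, and the streaming compressor leaves it out because the total size is unknown when the header is written. One workaround, if the consumer (e.g. Snowflake) relies on that header field, is to compress the whole buffer at once with `Compressor.Wrap`, which knows the source size and so can record it — a sketch, assuming `input` is the fully buffered payload:

```csharp
// Sketch: one-shot compression writes the content size into the
// zstd frame header, so Decompressor.Unwrap (and any consumer that
// relies on the frame content size) can read it back.
byte[] compressed;
using (var compressor = new Compressor())
    compressed = compressor.Wrap(input);

byte[] roundTripped;
using (var decompressor = new Decompressor())
    roundTripped = decompressor.Unwrap(compressed);
```

The trade-off is that the whole payload must fit in memory, which is exactly what the streaming API exists to avoid.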