AV-Coding

Results 15 comments of AV-Coding

This happens when I start the script `cross_vali_data_convert_merge.py`. It no longer happens, though, since unzipping `Dataset` appears to create a `data` folder.

Thank you @Cyan4973 and @terrelln for your response. We use ZSTD to compress blocks of data up to 256KiB in size. At this point, there are billions of compressed blocks...

```
static ZSTD_CCtx *comp_context = NULL;
if (comp_context == NULL) {
    // allocate a context to help speed up back-to-back compressions under the same thread
    comp_context = ZSTD_createCCtx();
    ...
```

It's multi-threaded, but it is always the same thread that calls the routine. What we posted above is also a streamlined version of what's actually being called. The context is saved...

Compressed on 32-bit, attempting decompression on both 32-bit and 64-bit. The 32-bit system is big endian.

We agree. We already implemented that second suggestion a couple of years ago by adding an extra CRC over the compressed data. Not all users have that newer level. Unfortunately,...

@embg, the version where we ran into the offending blobs was 1.5.2. After being unable to decompress the data with v1.5.2, we attempted to decompress it with v1.2.0 and v1.5.6,...

We have an important update on the problem we are hitting. Similar blocks for the same workload have a large amount of zero data, so we were thinking perhaps these...

Here's an update on where we are: we are still unable to find the root cause of this issue, but we have made progress. After further analysis, we have determined that...

No, blocks up to 256KiB exist within the buffer on 4KiB boundaries. Each block uses one or more 4KiB chunks as needed, and the entire block (up to 256KiB)...