node-archiver
archive.on("progress") returns wrong values v4 -> v5
In "archiver": "4.0.2", when I used archive.on("progress", progress => {}), the values of progress.entries.processed and progress.entries.total were correct - two different values. For example, Math.round(p.entries.processed * 100.0 / p.entries.total) gave me the current percentage.

But in "archiver": "5.1.0" I'm getting the same value for processed and total.

Do you have example code? It could be the glob change, since it now reads the fs as needed instead of in huge swaths at a time.
@ctalkington sure, I'm attaching a code example from the Archiver documentation: you can see progress.entries.total and progress.entries.processed have the same value from the start.
const archiver = require('archiver');
const fs = require('fs');
// You can change this to something bigger!
const directory = __dirname + '/fixtures';
const destination = __dirname + '/' + Date.now() + '.zip';
const destinationStream = fs.createWriteStream(destination);
const archive = archiver('zip', {zlib: {level: 9}});
archive.on('error', function (err) {
console.error('Error while zipping', err);
});
archive.on('progress', function (progress) {
// const percent = progress.fs.processedBytes / totalSize * 100;
let percent = progress.entries.total > 0 ? Math.round(progress.entries.processed * 100.0 / progress.entries.total) : -1;
console.log("TOTAL", progress.entries.total, "PROCESSED", progress.entries.processed, "|", percent, "%");
});
archive.pipe(destinationStream);
archive.directory(directory);
archive.finalize();
@ctalkington
I noticed this today on a mac.
I noticed this as well. It seems that archive.directory doesn't append a file to the progress total until after it's been processed. Here's some code demonstrating that:
Example Code
const archiver = require("archiver");
const fs = require("fs");
const Path = require("path");
const TEST_FILES_DIR = "./random_test_files"; // A directory with a few random files to test zipping
function setup(fileName) {
const archive = archiver('zip', {
zlib: {level: 9} // Sets the compression level.
});
const writeStream = fs.createWriteStream(fileName);
archive.pipe(writeStream);
archive.on("progress", (progress) => {
console.log("TOTAL", progress.entries.total, "PROCESSED", progress.entries.processed);
});
return archive;
}
(async () => {
const folderName = Path.basename(TEST_FILES_DIR)
console.log("========================= ZIPPING WITH archive.directory =========================")
const archiveDir = setup("directory.zip");
archiveDir.directory(TEST_FILES_DIR, folderName);
await archiveDir.finalize();
console.log("========================= ZIPPING WITH MANUAL TRAVERSAL =========================");
const archiveManual = setup("manual.zip");
const files = fs.readdirSync(TEST_FILES_DIR)
for (const file of files) {
archiveManual.file(Path.join(TEST_FILES_DIR, file), {name: Path.join(folderName, file)});
}
await archiveManual.finalize();
})()
Output:
========================= ZIPPING WITH archive.directory =========================
TOTAL 1 PROCESSED 1
TOTAL 2 PROCESSED 2
TOTAL 3 PROCESSED 3
TOTAL 4 PROCESSED 4
========================= ZIPPING WITH MANUAL TRAVERSAL =========================
TOTAL 4 PROCESSED 1
TOTAL 4 PROCESSED 2
TOTAL 4 PROCESSED 3
TOTAL 4 PROCESSED 4
Looking through the source of directory, this behavior happens because it appends files one-by-one as each finishes processing (through the _append callback), rather than all at once like in the manual traversal example. While this is in line with streaming behavior, it has the unfortunate side effect of breaking the progress event.
To get around this behavior, I'd imagine the best solution is to count the number of files in the directory manually using something like recursive-readdir before calling archive.directory. Then, just use that value as the total instead of the one provided by progress.
const recursive = require("recursive-readdir");
const archiver = require("archiver");
const fs = require("fs");
const Path = require("path");
const TEST_FILES_DIR = "./random_test_files";
const folderName = Path.basename(TEST_FILES_DIR);
(async () => {
const archive = archiver('zip', {
zlib: {level: 9} // Sets the compression level.
});
const writeStream = fs.createWriteStream("test.zip");
archive.pipe(writeStream);
// Count the files up front so the total is stable from the start.
let total = 0;
archive.on("progress", (progress) => {
console.log("TOTAL", total, "PROCESSED", progress.entries.processed);
});
total = (await recursive(TEST_FILES_DIR)).length;
archive.directory(TEST_FILES_DIR, folderName);
await archive.finalize();
})();
I have the same problem when I use archive.glob to append files. The total keeps changing, so I can't calculate the correct progress.
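One way to keep a stable total with archive.glob is to compute the percentage against a count you establish up front instead of the moving progress.entries.total. A minimal sketch; the pattern expansion (using the glob npm package and a hypothetical pattern) is shown only as comments:

```javascript
// Percentage against a fixed, pre-counted total instead of the
// moving progress.entries.total reported since archiver v5.
function percentOf(processed, fixedTotal) {
  return fixedTotal > 0
    ? Math.round((processed * 100) / fixedTotal)
    : 0;
}

// Sketch of wiring it up (assumes the `glob` npm package and a
// hypothetical pattern - expand the same pattern yourself first):
// const glob = require("glob");
// const fixedTotal = glob.sync("fixtures/**/*.js", { nodir: true }).length;
// archive.on("progress", (p) => {
//   console.log(percentOf(p.entries.processed, fixedTotal) + "%");
// });
```

This sidesteps the bug entirely, since the percentage never depends on the value archiver reports for total.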
I have the same problem. [email protected] MacOS 12.5.1 Node.js 16.15.1
const fs = require('fs');
const path = require('path');
const archiver = require('archiver');
const log = console.log.bind(console);
const error = console.error.bind(console);
const platform = process.platform;
const projectRoot = path.resolve(__dirname, '..');
const nodeModulesDir = path.resolve(projectRoot, 'node_modules');
const toolkitVersion = require('toolkit/package.json').version;
const zipName = `${toolkitVersion}-${platform}.zip`;
const outputPath = path.resolve(projectRoot, 'zips', zipName);
if (fs.existsSync(outputPath)) {
log(`Removing existing zip file: ${zipName}`);
fs.unlinkSync(outputPath);
}
if (!fs.existsSync(nodeModulesDir)) {
error('node_modules not found. Please run `yarn` first.');
process.exit(1);
}
const output = fs.createWriteStream(outputPath);
const archive = archiver('zip', {
zlib: { level: 9 }
});
output.on('close', function() {
console.log(archive.pointer() + ' total bytes');
});
archive.on('warning', function(err) {
if (err.code === 'ENOENT') {
error(err);
} else {
throw err;
}
});
archive.on('error', function(err) {
throw err;
});
archive.on("progress", progress => {
const percent = progress.entries.processed * 100.0 / progress.entries.total;
log(`\rZipping: ${percent.toFixed(2)}%`);
});
archive.pipe(output)
archive.directory(nodeModulesDir, 'node_modules')
.finalize();
Does anybody have a solution?
In my situation fs.totalBytes is always 0. I solved this by calculating the total size myself (summing each file's size on append). The progress event has a property fs.processedBytes, so I just divide that by totalSize. Example:
this.zip.on('progress', (progress) => {
const newProgress = progress.fs.processedBytes / this.totalSize;
// ... do something with progress ...
});
Any updates from the devs? I hope it gets fixed soon; workarounds are fine, but they shouldn't be necessary.
progress.entries.total still has the same value as progress.entries.processed, with this code:
archive.on("progress", (progress) => {
const percentage = (progress.entries.processed / progress.entries.total) * 100;
console.log("Processed:", progress.entries.processed, "FilesTotal:", progress.entries.total, "Progress: ", percentage.toFixed(2), "%");
});
Workaround for versions >=5.0.0: calculate the total number of entries yourself (with the help of fs). With a massive number of files this is not a good solution:
function countFiles(directory) {
let count = 0;
function countFilesRecursive(currentPath) {
const contents = fs.readdirSync(currentPath);
contents.forEach(file => {
count++; // Count every file and every directory
const fullPath = path.join(currentPath, file);
if (fs.lstatSync(fullPath).isDirectory()) {
countFilesRecursive(fullPath);
}
});
}
countFilesRecursive(directory);
return count;
}
const filesTotal = countFiles(workspacePath); // Counts the files in the directory
// Add the progress event handler here: it reports progress as 0-100%, but not a true 0-100%, since every entry is weighted equally and the value is off when some files are much larger than others - still good enough. Instead of filesTotal this should really be progress.entries.total, but that has been buggy since archiver 5.0, see: https://github.com/archiverjs/node-archiver/issues/475
archive.on("progress", (progress) => {
const percentage = (progress.entries.processed / filesTotal) * 100;
console.log("Processed:", progress.entries.processed, "FilesTotal:", filesTotal, "Fortschritt: ", percentage.toFixed(2), "%");
});
archive.pipe(res); // Pipe the archive into res - it is streamed to the browser in small chunks in real time
archive.directory(workspacePath, false); // Takes all folders and subfiles and puts them at the root of the .zip
archive.finalize(); // Corks the end of the pipe - marks the archive as complete
Best solution for this problem: looking at the archiver version list on npm (https://www.npmjs.com/package/archiver?activeTab=versions), I don't see any features in the newer versions that outweigh this, so I would recommend staying on version 4.0.2 until it gets fixed.
So just change the archiver version in the package.json to "archiver": "^4.0.2" and update with npm install. Then use the normal method as described before:
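For reference, the relevant fragment of package.json would look like this (other fields omitted):

```json
{
  "dependencies": {
    "archiver": "^4.0.2"
  }
}
```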
archive.on("progress", (progress) => {
const percentage = (progress.entries.processed / progress.entries.total) * 100;
console.log("Processed:", progress.entries.processed, "FilesTotal:", progress.entries.total, "Progress: ", percentage.toFixed(2), "%"); // toFixed(2) gives two decimal places (e.g. 85.51 %); lower the number toward 0 if you want fewer.
});
archive.pipe(res);
archive.directory(workspacePath, false);
archive.finalize();
Works perfectly now:
@ctalkington Please work on this. I think it's not a big bug, just something that came up from 4.0.2 -> 5.0.0, as I tested.