ftp-deploy
Incremental Update based on a folder-hash file
Hi, it's not yet perfect but working well.
You may create a folder-hash file and save it somewhere in your files.
To use folder-hash you need to install it from npm, either locally or globally, e.g.:
```sh
npm install folder-hash
```
If that file exists, the process fetches the hash map file from the FTP server and diffs it against the local one. Based on that diff, obsolete files are deleted remotely, and new and updated files are transferred from local.
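For illustration, here is a minimal sketch of that diffing step (a hypothetical helper, not the actual ftp-deploy internals): given flat `{ relativePath: hash }` maps for the remote and local trees, files present only remotely get deleted, and files that are new or whose hash changed get uploaded.

```js
// Sketch only: diff two { relativePath: hash } maps the way an
// incremental deploy would. Not the actual ftp-deploy implementation.
function diffHashMaps(localHashes, remoteHashes) {
  // remote files that no longer exist locally -> delete on server
  const toDelete = Object.keys(remoteHashes).filter(
    (file) => !(file in localHashes)
  );
  // local files that are new or whose hash differs -> upload
  const toUpload = Object.keys(localHashes).filter(
    (file) => remoteHashes[file] !== localHashes[file]
  );
  return { toDelete, toUpload };
}

// Example:
const local  = { "index.html": "aaa", "css/site.css": "bbb" };
const remote = { "index.html": "aaa", "old.html": "ccc", "css/site.css": "xxx" };
console.log(diffHashMaps(local, remote));
// → { toDelete: [ 'old.html' ], toUpload: [ 'css/site.css' ] }
```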
You may run folder-hash to create the list of sums (my Jekyll distribution files are located in the _site/ folder). E.g., to create an updated folder-hash file:
```sh
# create a new folder-hash config and exclude just the hash_sums file itself
echo '{ "files": { "exclude": ["_folder-hash-sums*"] } }' > .folder-hash
# enter the distro path so the hash sums file contains relative paths
cd _site/
# run folder-hash locally from node_modules
../node_modules/.bin/folder-hash --config ../.folder-hash . > _folder-hash-sums
# leave the distro
cd ..
```
As an example, this is my main.js that deploys only the changed files:
```js
// Automatic deployment
var FtpDeploy = require("ftp-deploy");
var ftpDeploy = new FtpDeploy();

var config = {
  user: "user",
  password: "passwd",
  host: "hostip",
  port: 21,
  localRoot: "_site/",
  remoteRoot: "/www/html",
  // use a folder-hash hash map file to allow incremental updates
  fileFolderHashSums: "_folder-hash-sums",
  // include: ["*", "**/*"], // this would upload everything except dot files
  include: [],
  // e.g. exclude sourcemaps, and ALL files in node_modules (including dot files)
  exclude: [],
  // e.g. delete files on remote before upload
  delete: [],
  // delete ALL existing files at destination before uploading, if true
  deleteRemote: true,
  // Passive mode is forced (EPSV command is not sent)
  forcePasv: true
};

// shared progress logger for both events
function logStatus(action, data) {
  var percent = Math.round(
    ((data.deletedFileCount + data.transferredFileCount) / data.totalFilesCount) * 100
  );
  console.log(
    "ftp-deploy: processing status: " +
      ("  " + percent).slice(-3) + "% " +
      action + ": " + data.filename
  );
}

ftpDeploy.on("removed", data => logStatus("removed", data));
ftpDeploy.on("uploaded", data => logStatus("uploaded", data));

ftpDeploy
  .deploy(config)
  .then(() => console.log("Finished!"))
  .catch(err => console.log(err));
```
This is what it will look like in real life:

Thanks Tom. As discussed in https://github.com/simonh1000/ftp-deploy/issues/124 I am reluctant to start adding files on remote servers.
A while ago I used Grunt and Gulp for FTPing, and they seemed to manage changed-only uploads without extra files. I had thought about looking at their code to implement syncing but never had the time. What do you think?
Hey Simon @simonh1000
I have not dug into Grunt and Gulp yet, but from my point of view I wouldn't trust just a date and a size when comparing local and remote files. That's why I think having a hash file is a must-have. The hash file could, if necessary, also be updated on the server side by a cron task etc. Even when dates change because remote files were touched, the hashes remain reliable.
Coming to a conclusion :-) I would like to have it implemented the way I did it :-D
But I can understand if you don't want to support that update. I'm wondering whether something like plugin support could be a way to attach extensions like mine?
What do you think, how should we proceed?
Cheers Tom
So this creates a folder hash every time you run a deploy, works out which files have changed relative to the folder hash on disk, uploads the changes, and stores the new folder hash on your disk?
Hi Simon @simonh1000
exactly :-)