node-ftp

An FTP client module for node.js

Results: 101 node-ftp issues

.Pipe's Flow Control fails on slow Writeable Stream

Comment 1 (maintainer): I'm not sure I understand what you mean by "automatically cleaned up." The raw TCP socket is passed to the get() callback, and the only time that socket is destroyed is on a data-connection timeout.

Comment 2 (original poster): Yeah, that appears to be fine. However, the Readable stream that the .get() callback provides is immediately destroyed when the stream from the FTP server to the library ends. This is noticeable because stream.destroyed equals true at that point.

```javascript
var ftp = require('ftp');
var client = new ftp();
var fs = require('fs');

client.on('ready', function() {
  client.get('/www/favicon.ico', function(err, stream) {
    if (err) {
      console.log('ERROR!');
      return;
    }

    console.log('Stream Destroyed? ' + stream.destroyed);

    stream.once('close', function() {
      console.log('Stream Destroyed? ' + stream.destroyed);
      client.end();
    });

    stream.pipe(fs.createWriteStream('favicon.ico'));

    // Simulate flow-control
    setTimeout(function() {
      stream.pause();
      setTimeout(function() {
        stream.resume();
      }, 3000);
    }, 10);
  });
});

client.connect(/* Host, User and Password */);
```

I guess the expected flow would be that you push the last packet to the Readable stream, the stream emits its 'data' event, and then you clean up the stream. However, if the stream is paused at that point (say, due to .pipe()'s own internal flow control), all the data after the pause is lost.

Edit: Just to clarify, the above example has issues if the FTP stream finished (i.e. the file had been fully received) between the .pause() and the .resume().

Comment 3 (maintainer): What Node version are you using?

Comment 4 (original poster):

```
node --version
v0.10.22
```

Comment 5 (maintainer): OK, so if I understand correctly, the complaint is that when you end the control connection, the data connection is immediately severed as well? If that's the case, I don't think there is much I can do about it, because that's the server doing that.

Perhaps you should `client.end();` on the local writable file stream's 'finish' event instead of the source socket's 'close' event?

Comment 6 (original poster): I have tried not having that event at all; it was still a problem. What did solve my problem was using:

```javascript
stream.on('data', function(chunk) {
  res.write(chunk);
});
```

instead of:

```javascript
stream.pipe(res);
```

(where stream was from your library, and res is an http.ServerResponse).

And the complaint is more that when the data connection completes, the stream object you provided is severed. Meaning that if the stream object was paused at that point in time, you lose the rest of the data. If it wasn't paused, there is no issue.

Comment 7 (maintainer): I don't know then. That's a raw socket you're working with, and node-ftp (as far as I can tell) isn't meddling with it in any way that would cause something like that. :-\

Comment 8 (polidore): I have the same problem.

Comment 9 (polidore): It happens most often on very small files over very fast networks. It may be a Node issue, but I can't use pipe with node-ftp because of it. It happens almost every time on fast production hardware.

Comment 10 (Carl Furst): I think @polidore is right. I have the same issue. I'm connecting to a server and downloading a large number of files (over 4,000). Besides the EMFILE errors that pop up if I don't use graceful-fs, I notice that many of the files come down empty. My guess is that the pipe is not really piping all the data. The conditions in my situation are similar to what @polidore described: small files over a fast network (and many of them). This is probably an underlying issue with Node.

Comment 11: I've wasted a week of my life on this! It definitely seems to be a Node issue relating to 0.10.x. I've tried to replicate the bug with Node 0.11.13 and it seems fine. I _think_ it was fixed with this release of Node: http://blog.nodejs.org/2013/05/13/node-v0-11-2-unstable/.

Edit: Ignore me. I just retried it a few more times and the same issue is happening.

Comment 12 (Marcel Ferreira Batista): I am not sure the problem I was having is the same as yours. The thing is, the stream was being closed at some random point before I was able to pipe it somewhere else. It was even weirder than that, since after the piping I would get an error. After a few tests I found out this stops happening if I process only one file at a time. I simply create a bunch of tasks for all the files I want to process and call them using async.series. Hope this helps someone! =)

Comment 13 (Carl Furst): Doesn't that mean that you are opening a new connection for each file? Have you found that to be a little slow?

Comment 14 (Marcel Ferreira Batista): Sorry, I don't think I understand. What I do is list all the directories that interest me and find the files that I need. I basically store all the paths to the files. When I am done with this, I create a task for each path and then process them serially. Something like:

```javascript
var tasks = [];

filePaths.forEach(function(filePath) {
  (function(filePath) {
    var task = function(done) {
      ftp.get(filePath, function(error, stream) {
        stream.pipe(process.stdout);

        stream.on("close", function() {
          done();
        });
      });
    };
    tasks.push(task);
  })(filePath);
});

async.series(tasks, function() {
  console.log("Done processing all files");
});
```

So yes, I make a GET for each path, but using the same FTP client. Did I answer your question?

Comment 15 (Carl Furst): Yes, thanks!

Comment 16: I see similar behaviour: I call stream.pause() immediately after a get and still receive the 'end' and 'close' events for the stream, without using pipe or on('data'). According to the Node.js docs, "the end event will not fire unless the data is completely consumed": https://nodejs.org/docs/v0.10.37/api/stream.html#stream_event_close

However, with node-ftp the stream sends 'end' while no data has been consumed yet.

Looking at the code, resume() is called after the callback (https://github.com/mscdex/node-ftp/blob/master/lib/connection.js#L618), so my pause is cancelled because it happens before resume() is called in the node-ftp lib. That's very confusing; a way around it is to call pause() within a setTimeout. I wonder why resume() is called after the callback there?

Calling pause() within a setTimeout does pause the stream, but the 'end' and 'close' events still arrive before any data is consumed.

The 'end' event issue is due to node-ftp: if I comment out https://github.com/mscdex/node-ftp/blob/master/lib/connection.js#L555-L569 it behaves as expected. With that code in place, the 'end' event is never sent because of the return statement outside of the if block; as a result, the stream's consumers never receive 'end' and hang.

Comment 17: NW 0.12.3. It looks like a related bug.

```javascript
readStreamFromFTP.pipe(decryptor).pipe(writeStreamToHDD);
```

Error: error:0606506D:digital envelope routines:EVP_DecryptFinal_ex:wrong final block length

The read stream finishes and closes 'decryptor' (lib crypto, AES decryptor) before it has read all the data from FTP, so 'decryptor.final()' is called on data of the wrong length and throws. After I drove 'decryptor' from the 'data' and 'close' events of the FTP stream, it worked fine. The bug occurs on fast machines with files over 1 MB; I ran it in a loop and got the error within 1-25 transfers of a 1 MB file.

Comment 18 (Carl Furst): Maybe a queue layer using async? Put the files on a queue (like FileZilla) and throttle it. I mean, file I/O is so damn fast that forcing Node to be a bit more synchronous wouldn't hurt.

Comment 19: Perhaps related to https://github.com/mscdex/node-ftp/issues/66? There's a workaround there which may help.

Comment 20: What worked for me:

```javascript
ftpClient.get("fileNameOnFtp", function(err, stream) {
  if (err) {
    var errMessage = 'downloading the file: "' + metadata.filename + '" failed. Error message: ' + err.toString();
    error(errMessage);
  } else {
    var localDownloadedFileName = "localFileName";
    var writeStream = fs.createWriteStream(localDownloadedFileName);
    stream.on('data', function(chunk) {
      writeStream.write(chunk);
    });
    stream.on('end', function() {
      writeStream.close();
      callback();
    });
  }
});
```

Comment 21: Same problem for me. Has anyone solved it?

It appears the readable stream is automatically cleaned up when the library has finished receiving a file. This can be a problem when using .pipe() to a slower...

Can't get the whole file sometimes

Comment 1: Yes, it sometimes happens to me too. I don't know why.

Comment 2 (original poster): I resolved this with another library, but something goes wrong when connecting to the FTP server with that library.

Comment 3: Which library did you use?

Comment 4 (original poster): @Besatnias jsftp. But that library is not good at handling errors; you sometimes cannot get detailed information about an error.

Comment 5: I'm getting this issue as well. I need to download about 30 GB, split into 3 GB files, every day. When the script runs on gigabit internet, all the downloads succeed. When running on 50 Mbps internet, the downloads just completely stop if they don't download fast enough. I don't think it's the FTP server dropping me, since I can download the files through Chrome just fine using the ftp:// URL.

Comment 6: I'm also listening to every event on the stream and the client, and nothing happens. I also noticed that on slower internet, when I was downloading a 110 MB file, the download would finish and then just sit there for about 40 seconds before closing the stream and `end`ing the client. On fast internet, the client would `end` as soon as the download finished.

Comment 7 (original poster): @JourdanClark You can test it again with the jsftp library (https://github.com/sergi/jsftp). Hope it helps you.

Comment 8: For those also having issues with not getting the whole file: it seems related to small files over fast networks, as described by @polidore in https://github.com/mscdex/node-ftp/issues/70#issuecomment-34785017 (the file I was having issues with was 142 kB over a 700 Mbps connection). This snippet worked for me (based on a workaround by @ugate on https://github.com/mscdex/node-ftp/issues/66):

```javascript
const ftp = require("ftp");
const fs = require("fs");
const path = require("path");
const { PassThrough } = require("stream");

const FTP = new ftp();
const FILE = "MyFile.txt";
const PASSTHROUGH = new PassThrough();

FTP.get(FILE, (error, stream) => {
  if (error) {
    console.log(`ERROR: ${error.message}`);
    throw error;
  }

  // in my case I'm writing to a temporary file, but you can do whatever is needed at this point
  stream.pipe(PASSTHROUGH).pipe(fs.createWriteStream(path.join("/tmp", "data.tsv")));

  PASSTHROUGH.on("end", () => {
    FTP.end();
    console.log("Closed FTP connection. Goodbye!");
  });
});
```

Documentation on stream.PassThrough can be found here: https://nodejs.org/api/stream.html#stream_class_stream_passthrough

I want to download some files from an FTP server, but I cannot get the whole file when I use the `get()` method. For example, the size of the file which I want to...

I need to get report-like data to find the files that are not on the FTP server. I have the code below, but I cannot get the data....

put method creates the file but does not write to it

Comment 1: Same problem here.

Comment 2: Same problem; it worked for months and then all of a sudden stopped.

I've been trying to upload a local file to my FTP server. I've created a readable stream from the path of my file like so: `const stream =...

If an FTP directory has more than 9998 files, the list() function trims the list to just 9998 files.

FTP Timeout issue from Heroku Server

Comment 1: I have a very similar issue that I posted about here: https://stackoverflow.com/questions/58594227/release-job-on-heroku-randomly-stops-sending-files-over-ftp

Did you ever make any progress on solving this? My release job had been working fine; I've done nearly 60 releases with no problems on this FTP push. My script still works fine from my local machine, but on Heroku I keep getting a timeout error. My release script is here: https://github.com/Roaders/cineworld-planner/blob/master/src/node/release/index.ts

Hi there, I am using node-ftp in my Node.js development to connect to our company's FTP network. When I run locally, it seems to work fine. We are...

Will active mode continue to be updated?

I want to make a function that runs every 5 minutes, connects to FTP, and downloads a file. I can't use cron or setInterval with this; it just does nothing.

When I use put, I encounter 'Could not create file'

Comment 1: I have the same problem, but my WebStorm can upload files to the same remote dir.

Comment 2: I have the same problem, but it works with FlashFXP.

Comment 3: I think you simply don't have the rights to upload to the root folder. Use the cwd function:

```javascript
c.on('ready', function() {
  c.cwd('www.4399.cn', function() {
    c.put('foo.txt', 'foo.remote-copy.txt', function(err) {
      if (err) throw err;
      c.end();
    });
  });
});
```

Comment 4: I have the same problem. I cannot create a new folder. Have you solved it? Please let me know.

Comment 5: I have the same problem.

Comment 6: I had the same problem and found that I assumed the first argument was the remote file when it is actually the local file. Maybe try swapping the order of the parameters.

```
[connection] < '220 (vsFTPd 2.0.5)\r\n'
[parser] < '220 (vsFTPd 2.0.5)\r\n'
[parser] Response: code=220, buffer='(vsFTPd 2.0.5)'
[connection] > 'USER zgb'
[connection] < '331 Please specify the password.\r\n'
[parser] < '331 Please...
```

list

Comment 1 (original poster): No one?

Comment 2: Same issue here. Strangely, it only happens on one FTP server and not on another. I suspect it has something to do with different FTP server systems: most of them return an object, but some do in fact return a string.

Comment 3: Just a suggestion: I think you should provide information about your FTP server.

Comment 4: Your server does not include the owner or group name in its response, and the parser can't process that. Which server and operating system are you using?

Comment 5: Same issue. It works great at work on Windows but not on my Mac; I can't get a response from the list method.

Comment 6: Still encountering the same issue. Tried on Ubuntu and macOS.

Comment 7: If anyone stumbles upon this issue, this is how I resolved it. However, I am not sure this solution will work for every FTP server.

Code is in TypeScript:

```typescript
enum ListingType {
  Directory = "d",
  File = "-",
  Symlink = "l",
}

enum SpecialListingType {
  Directory = "*DIR",
  File = "*STMF",
  Symlink = "l",
}

// Each special type corresponds to a type character mentioned in the docs - d (dir), - (file), l (symlink)
const getListingType = (type: SpecialListingType): ListingType => {
  switch (type) {
    case SpecialListingType.Directory:
      return ListingType.Directory;
    case SpecialListingType.Symlink:
      return ListingType.Symlink;
    default:
      return ListingType.File;
  }
};

const getFtpListingElement = (entry: string | ListingElement): ListingElement => {
  if (typeof entry !== "string") {
    return entry;
  }

  // `entry` is something like "OWNER 1234 12/12/19 18:19:20 *SFTIM Path/To/The/FileOrFolder/Location"
  const [
    owner,
    size,
    lastDateModified,
    lastTimeModified,
    type, // Afaik can be of 3 types - *DIR (dir), *STMF (file) or something else that I haven't encountered but which would stand for a symlink
    name,
  ] = entry
    .split(" ")
    .map((part) => part.trim())
    .filter(Boolean);

  return {
    owner,
    group: owner,
    size,
    date: new Date(`${lastDateModified} ${lastTimeModified}`),
    name,
    type: getListingType(type as SpecialListingType),
  };
};

const listings = (await ftp.list("/my/path")).map(getFtpListingElement);
```

The problem with this is that there is no data about permissions, at least not in my case. Let me know if this works for you.

Comment 8: I find the problem occurs only when I remove the group's execute permission, but I don't know why that causes it.

```javascript
let c = new Client();
c.on('ready', function() {
  c.list(path, function(err, list) {
    if (err) throw err;
    console.log(list);
  });
});
```

Output:

```
[ '-rw-r--r-- 1 oinstall 65536 Jul 25 15:01 csxl20170724.dmp',
  'drwxr-xr-x 2 oinstall...
```