youtube-comment-scraper-cli
Resume/retry capabilities
The YouTube API may sometimes return error pages (as in #47), or a plain network error may break the whole fetching process.
To overcome this, I'd like to propose the following approach:
1. Turn the output file's schema into something like this: `{ "comments": [ /* ... */ ], "numberOfTotalComments": 2300, "nextPageToken": null }`
2. If an error is encountered at some point, save the partial data to the file with `"nextPageToken"` set.
3. If the user runs the same command again, check whether `"nextPageToken"` is non-empty, and if so resume the operation from it (see the sketch after this list).
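To make these steps concrete, here is a minimal Node.js sketch of what the save/resume flow could look like, assuming the output schema proposed above. `fetchCommentPage` is a hypothetical stand-in for the scraper's internal page fetcher, not an actual function of youtube-comment-scraper-cli.

```js
const fs = require('fs')

// Hypothetical stand-in for the scraper's internal page fetcher; replace
// with the real fetching logic. Here it just returns an empty, final page.
async function fetchCommentPage (videoId, pageToken) {
  return { comments: [], numberOfTotalComments: 0, nextPageToken: null }
}

function loadState (outFile) {
  // Reuse a previous partial result if the file exists and still has a token.
  if (!fs.existsSync(outFile)) {
    return { comments: [], numberOfTotalComments: 0, nextPageToken: null, fresh: true }
  }
  const state = JSON.parse(fs.readFileSync(outFile, 'utf8'))
  return Object.assign({ fresh: !state.nextPageToken }, state)
}

function saveState (outFile, state) {
  fs.writeFileSync(outFile, JSON.stringify({
    comments: state.comments,
    numberOfTotalComments: state.numberOfTotalComments,
    nextPageToken: state.nextPageToken // null means everything was downloaded
  }, null, 2))
}

async function scrape (videoId, outFile) {
  const state = loadState(outFile)
  if (state.fresh) state.comments = []          // start over if no token saved
  let pageToken = state.fresh ? null : state.nextPageToken

  try {
    do {
      const page = await fetchCommentPage(videoId, pageToken)
      state.comments.push(...page.comments)
      state.numberOfTotalComments = page.numberOfTotalComments
      pageToken = page.nextPageToken
    } while (pageToken)
    state.nextPageToken = null                  // completed successfully
  } catch (err) {
    state.nextPageToken = pageToken             // remember where we stopped
    console.error('Fetch failed; partial data saved. Re-run the command to resume.')
  }
  saveState(outFile, state)
}
```

A run interrupted partway leaves `nextPageToken` set in the output file, so re-running the same command would pick up from that page instead of starting over.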
And some notes about this approach:
- `"nextPageToken": null` means everything was successfully downloaded.
- On step 2 above, when there is an error, the script could actually make a second try before failing and saving the partial data.
- To decrease the chance of errors, there could be a delay between requests (I looked at the source code to check whether this is already done, but couldn't tell; the JS code is hard for me to follow). A rough retry/delay sketch follows this list.
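As an illustration of the last two notes, a small retry wrapper with a delay between attempts might look like the following. `withRetry`, the retry count, and the 2-second delay are made-up names and values for this sketch, not existing options of the CLI.

```js
// Wait for the given number of milliseconds.
function sleep (ms) {
  return new Promise(resolve => setTimeout(resolve, ms))
}

// Run `fetchFn` and, on failure, wait and try again up to `retries` more times.
async function withRetry (fetchFn, retries = 2, delayMs = 2000) {
  let lastError
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      if (attempt > 0) await sleep(delayMs) // pause before every retry
      return await fetchFn()
    } catch (err) {
      lastError = err
    }
  }
  throw lastError // give up; the caller then saves partial data with nextPageToken set
}

// Example usage inside the fetch loop from the previous sketch:
//   const page = await withRetry(() => fetchCommentPage(videoId, pageToken))
// An unconditional `await sleep(delayMs)` between successful pages would
// cover the "delay between requests" suggestion as well.
```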
@maliayas, can you download YouTube data using this code? Can you help me get the data from YouTube? I got the error `API response does not contain a "content_html" field` (#47). I want to get some data for my final-year project; I previously got data with this program. Please help me.
This is just a proposal; there is no code yet. Unfortunately, I'm not a JS dev, but I guess @philbot9 or someone else can easily implement this, and then yes, it will fix #47.