shreddit
                        Application needs to be run several times to remove all comments/posts
I don't know if it's intended functionality, but the application stops after a certain number of deletions, even if there are more posts left to delete.
I've run the command
shreddit --username MYUSERNAME --password MYPASS --client-id MYCLIENTID --client-secret MYCLIENTSECRET
As I said, it might be intended functionality, but if it isn't, I thought I'd let you know.
Thanks for mentioning this. I noticed this behavior before, actually, but I thought maybe it was a bug that would have been fixed after I'd done a refactor. I wonder if it's something about the Reddit API itself. It seems to have a lot of odd behavior. I'll look into this.
I had a workaround in place that re-ran pagination after the initial pass completed, and kept restarting it until it returned no results, but I had to remove that because it caused an infinite loop whenever any posts couldn't be deleted.
In the meantime you can just run it again after it completes.
Workaround: looping bat file.
@echo off
:loop
shreddit --YOURPARAMSHERE
goto loop
Had the same issue.
Just using this with zsh. It's not foolproof, but it gets the job done as a background task that I check up on periodically.
while true; do shreddit --YOUR-PARAMS-HERE; done
Info update after some testing 👍 I was playing around with the JSON output of Reddit and shreddit. I found that Reddit stops reporting results after 34 pages (roughly 850 comments in total, at 25 comments per page). No matter what I tried, I couldn't get a 35th page out of the JSON API that shreddit uses. This is why the issue exists. Sadly, I don't think this is something shreddit can resolve, since the server is the one not producing the data.
That said, over in #61, it appears that the GDPR export is a nice workaround to this.
Thanks for looking into this more @RFBomb ! I had suspected this was the case.
At one point I had some logic to restart the loop and keep going until the first API call returns no results. I removed that at some point, IIRC because sometimes posts can't be deleted for certain reasons, so those would cause infinite loops if you had enough of them.
We could add it back with an exit condition that stops either when the first page comes back empty or when a full pass deletes nothing (meaning everything left is undeletable), something like the sketch below.
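Just a sketch of that exit condition, not shreddit's actual code; fetch_page and try_delete are hypothetical stand-ins for the real listing and deletion calls:

// Sketch: restart pagination until nothing deletable remains.
// fetch_page and try_delete are hypothetical stubs, not shreddit's real API,
// included only so the loop logic compiles on its own.
fn fetch_page() -> Vec<String> {
    Vec::new() // stub: would return the next page of comment/post IDs
}

fn try_delete(_id: &str) -> Result<(), ()> {
    Ok(()) // stub: would call the Reddit API and report success/failure
}

fn main() {
    loop {
        let items = fetch_page();
        if items.is_empty() {
            break; // first page is empty: nothing left to list
        }

        let mut deleted_any = false;
        for id in &items {
            if try_delete(id).is_ok() {
                deleted_any = true;
            }
        }

        // If a full pass deleted nothing, everything left is undeletable;
        // stop here instead of looping forever.
        if !deleted_any {
            break;
        }
    }
}

The key point is that the loop only restarts after a pass that actually deleted something, so a pile of undeletable posts can't spin it forever.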
Still not great since it means theoretically you could have 34 pages of non-deletable posts and the 35th is deletable, but you can't get to it because the Reddit API is terrible.
But yeah, seems like we should explain to users that GDPR is the only sure way to delete everything.
More testing performed on this using the '?limit=100' parameter on the JSON request: it appears that no matter how many items per page, the limit (at least for me) is 888 comments.
2023-06-22T03:06:18.834626Z DEBUG  Page 8 contained 100 results
    at src\things\comment.rs:297
  2023-06-22T03:06:21.876126Z DEBUG  Page 9 contained 88 results
    at src\things\comment.rs:297
  2023-06-22T03:06:24.118796Z DEBUG  Page 10 contained 0 results
This jibes with Reddit's 1k listing limit: https://www.reddit.com/r/redditdev/comments/2ffide/listing_old_comments/ So basically, as we discovered, the GDPR export is the only realistic way to delete the old comments, other than iterative runs (with some time between them to allow Reddit to re-cache the next 1k).
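If anyone wants to reproduce the cutoff themselves, here is a rough sketch that pages through the public listing endpoint with limit=100 and the after cursor until it runs dry. It doesn't use anything from shreddit; it assumes the reqwest crate (blocking + json features) and serde_json, and USERNAME is a placeholder:

// Sketch: page through a user's comment listing via the public JSON API and
// count how many items Reddit will actually hand back before the listing ends.
use serde_json::Value;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Reddit tends to throttle requests without a User-Agent, so set one.
    let client = reqwest::blocking::Client::builder()
        .user_agent("listing-limit-test")
        .build()?;

    let mut after: Option<String> = None;
    let mut total = 0usize;
    let mut page = 0usize;

    loop {
        let mut url = String::from("https://www.reddit.com/user/USERNAME/comments.json?limit=100");
        if let Some(a) = &after {
            url.push_str("&after=");
            url.push_str(a);
        }

        let body: Value = client.get(url.as_str()).send()?.json()?;
        let children = body["data"]["children"].as_array().cloned().unwrap_or_default();

        page += 1;
        total += children.len();
        println!("Page {} contained {} results ({} total)", page, children.len(), total);

        // Reddit stops handing out an `after` cursor once the listing is
        // exhausted, which in practice is somewhere just under 1,000 items.
        match body["data"]["after"].as_str() {
            Some(a) if !children.is_empty() => after = Some(a.to_string()),
            _ => break,
        }
    }

    Ok(())
}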
https://github.com/j0be/PowerDeleteSuite Not sure what this repo is doing, but from what I've read, they figured out how to go past that limit. It's also a JavaScript script you run while logged in on the page, so I honestly don't know what it's doing or how it does it.
EDIT: I just ran back-to-back tests, one using Shreddit and one using the PDS linked above. Out of roughly 110 comments, PDS successfully edited about 30-40%, maybe less, after 3 runs. Shreddit (edit-only mode) edited every single comment. (Also, a PR for the rate limit is incoming, since the numbers in place were for the post-7/1 date; I'll send a quick PR for the current week, then retest with the real numbers after 7/1.)