Keep running into "_ is not a submission", and it won't download more than half of my saved posts #877
-
As the title describes: I keep running into this message when verbose is enabled, which seems like it might be the issue. I submitted a data request, and I have the links to all of my saved posts. Would it be possible to just use that list instead to specify what I would like to download? It would be easier to modify that list to set start and stop points, as well as to get my saves above 1k downloaded.
-
There is a maximum that cannot be bypassed; that's Reddit's limit, not ours. You're using the BDFR's download function, which applies only to the contents of a post: the text, the image, or the link. Comments are none of those and are ignored. If you want comments, either alone or alongside submissions, the archive and clone functions do that: they scrape the actual metadata of the comments and submissions.
-
What is the limit? It isn't going beyond half of the 1000-post limit, and I'm wondering where it stops. BDFR says it stops at 1000 because that's Reddit's limit, I think?
You can use `--include-id-file` to pull the links from a text file with one link per line. Sadly, as stated in #839, this only works with full links if you want the comments too. There are plenty of ways to extract the column with the links and create the `.txt` file; a CSV file is basically a text file with commas.
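As a minimal sketch of that CSV-to-text step: the snippet below copies one column of a CSV into a one-link-per-line text file using only the standard library. The filenames and the `permalink` column name are assumptions here; check the header row of your own data-request export and adjust them.

```python
import csv

def extract_links(csv_path, txt_path, column="permalink"):
    """Copy one column of a CSV into a text file, one link per line.

    `column` defaults to "permalink", which is a guess at the link
    column in a Reddit data-request export; pass your CSV's actual
    header name if it differs.
    """
    with open(csv_path, newline="", encoding="utf-8") as src, \
         open(txt_path, "w", encoding="utf-8") as dst:
        for row in csv.DictReader(src):
            link = (row.get(column) or "").strip()
            if link:  # skip rows with an empty link cell
                dst.write(link + "\n")
```

You could then point the BDFR at the result, e.g. something like `bdfr download ./out --include-id-file links.txt` (paths are placeholders). Editing the text file first also gives you the start/stop control asked about above: delete the lines you don't want before running.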