Bleachbooru

How to rip the entire site in case of shutdown?

Posted under General

I tried JDownloader, but it only rips the first 20 images (the first page).

I got around that by scripting the page links (basically auto-writing a whole bunch of lines, with each page number increasing by 1; there's a rough sketch of the script after this list):
posts?page=1
posts?page=2
posts?page=3
...
posts?page=1000
etc
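
Roughly, the script was just this (a minimal sketch; the base URL is a placeholder, not the site's real address):

# Write one line per page so the whole list can be pasted into JDownloader.
BASE = "https://bleachbooru.example/posts?page="  # placeholder host

with open("pages.txt", "w") as f:
    for page in range(1, 1001):  # pages 1 through 1000
        f.write(BASE + str(page) + "\n")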

Then I copy/pasted THAT list into JDownloader again.

It technically worked, everything was downloaded, but it was ONLY the thumbnail of each file... It said something about "not finding any links", so even JDownloader's deep search doesn't help.

The only way to batch-download the original images is by opening EVERY SINGLE IMAGE PAGE manually and then copy/pasting that huge list into JDownloader.
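
In theory even that slog could be scripted. A minimal sketch, assuming the site is Danbooru-based and serves the full-size picture in an img tag with id="image" (that selector is a guess; the real page HTML would need checking):

import requests
from bs4 import BeautifulSoup

def original_image_url(post_url):
    # Fetch one post page and look for the full-size image.
    # id="image" is how Danbooru-style sites usually mark it; this is an
    # assumption about Bleachbooru, not something I've verified.
    html = requests.get(post_url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    img = soup.find("img", id="image")
    return img["src"] if img else None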

But even if I did that slog, there's still the problem of bad file names. The files don't get the number the site assigned them based on upload order.

For example, the number at the end of the link:

"post #1111111111111111111111111111"

They'll have the raw file names instead:

"as2421121dkjasl24111124dkjasdkljasldkajdq3535353583euqweu9082418124981274u.jpg"

This means that posts that are related to each other end up completely out of order (ESPECIALLY when grabbing an entire tag).

The "download" button on each image at the bottom left still keeps images out of order by having the TAGS FIRST and THEN the random file name... Still no site-assigned number.

What else can I do? I don't have these issues on other rule34 sites with JDownloader...
(rule34xxx uses the site-assigned number as the file name, so it's an especially good site for ripping and keeping everything in order...)
