Wallhaven extension

Posts: 8 · Views: 192
  • 9297

    Is there any wallhaven extension that could help me download lots of wallpapers easily? I just lost my whole collection due to a hard disk error and I'm trying to back up all my uploads.

  • 9305

    Oh god... Really? :\ First of all, this question has already been asked. Second of all, GOOOOOOGLE IT. Also, why would you download all your uploaded files? They're already safe on the server. Last but not least, you could use Kali Linux's recovery tools.

  • 9308

    Also, why would you download all your uploaded files? They're already safe on the server.

    Not soon, but one eternity later there will be a wipe (fresh start).

  • 9312

    TCBfergie Mhm... Okay, I'll try and help you out. Did you find the extension? If not, send me a private message.

    So, how is that hard disk drive treating you?

  • 9362

    TCBfergie Maybe this one can help you: Wallhaven downloader

    Unfortunately I haven't released a download-by-user feature (it would be really hard: we have no API (or I haven't searched hard enough), so I'd need to parse the HTML, which is not so easy; a rough sketch of what I mean is below).

    P.S. Sorry, it's no longer available due to Wallhaven policy.
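
    The parsing I mentioned is nothing magic, by the way. Here's a quick sketch of the idea, nothing more: it assumes every thumb links to a /wallpaper/<id> page somewhere in the source, and the username is only an example.

    using System;
    using System.Net;
    using System.Text.RegularExpressions;

    class ThumbParser
    {
        static void Main()
        {
            using (var client = new WebClient())
            {
                // hypothetical listing page; swap in whatever uploads page you are after
                string html = client.DownloadString("https://alpha.wallhaven.cc/user/TCBfergie/uploads?page=1");

                // assumption: every thumb links to a /wallpaper/<id> page somewhere in the source
                // (ids can show up more than once, so dedupe if you need exact counts)
                foreach (Match m in Regex.Matches(html, @"wallpaper/(\d+)"))
                    Console.WriteLine("found wallpaper id: " + m.Groups[1].Value);
            }
        }
    }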

  • 9367

    intel777 The problem is already solved :] Also, I believe that when you visit a page that contains lots of thumbs... say TCBfergie's uploads page... the wallpaper thumbs are either hidden and shown as the user scrolls down, OR the site uses jQuery to append new data while scrolling. It is probably something like this:

    $(document).ready(function() {
        // endlessScroll plugin: run the callback whenever the user nears the bottom of the window
        $(window).endlessScroll({
            inflowPixels: 300, // trigger 300px before the bottom is reached
            callback: function() {
                // demo behaviour: clone the 5th-from-last thumbnail and append it to the list
                var $img = $('#images li:nth-last-child(5)').clone();
                $('#images').append($img);
            }
        });
    });

    Taken from: The third demo

  • 9375

    Holy... look what I found: there's no need to parse through the infinite scroll. You can simply go to /user/%username%/uploads?page=%whatever_int% and just parse one page at a time.
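
    A rough loop over those pages could look like the sketch below. The username is just an example and the stop condition (a page with no thumbs) is an assumption, not something the site guarantees.

    using System;
    using System.Net;
    using System.Text.RegularExpressions;

    class PageWalker
    {
        static void Main()
        {
            using (var client = new WebClient())
            {
                for (int page = 1; ; page++)
                {
                    // one listing page per request, exactly as described above
                    string html = client.DownloadString(
                        "https://alpha.wallhaven.cc/user/TCBfergie/uploads?page=" + page);

                    int thumbs = Regex.Matches(html, @"wallpaper/(\d+)").Count;
                    if (thumbs == 0)
                        break; // assumed stop condition: a page with no thumbs means we ran past the end

                    Console.WriteLine("page " + page + ": " + thumbs + " matches");
                }
            }
        }
    }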

  • 9382

    intel777 said:

    You can simply go to /user/%username%/uploads?page=%whatever_int% and just parse one page at a time.

    Not bad. You can also set the purity settings and require the user to log in to download NSFW material. Something like this:

    string SelectedUploader = "https://alpha.wallhaven.cc/user/" + SelectedUser + "/uploads?purity=111&page=" + N; // purity=111 asks for SFW + sketchy + NSFW

    And this is how you get how many pages there are (JavaScript):

    var ListingPageHeader = document.getElementsByClassName("thumb-listing-page-header")[0].textContent;

    Say it returns: "Page N / 16". You need to pay attention to the '/' char and the integer 16. You can then simply use (C#):

    int PageCount;
    if (Int32.TryParse(ListingPageHeader.Substring(ListingPageHeader.LastIndexOf('/') + 1), out PageCount))
        Console.WriteLine(PageCount);
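
    If you'd rather keep it all in C# instead of mixing in JavaScript, here is a sketch of pulling that page count straight from the HTML. Only the "Page N / M" text comes from above; the URL, the regex and the example user are assumptions.

    using System;
    using System.Net;
    using System.Text.RegularExpressions;

    class PageCounter
    {
        static void Main()
        {
            string SelectedUser = "TCBfergie"; // example user
            string url = "https://alpha.wallhaven.cc/user/" + SelectedUser + "/uploads?purity=111&page=1";

            using (var client = new WebClient())
            {
                // note: NSFW listings need a logged-in session; a plain WebClient request only sees public pages
                string html = client.DownloadString(url);

                // pull the "Page N / M" header text out of the listing and keep M
                Match m = Regex.Match(html, @"Page\s+\d+\s*/\s*(\d+)");

                int PageCount;
                if (m.Success && Int32.TryParse(m.Groups[1].Value, out PageCount))
                    Console.WriteLine("pages to fetch: " + PageCount);
            }
        }
    }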
