Is it possible to download a website and get as close to a 1:1 copy to browse online? How would you do this?

  1. 2 years ago
    Anonymous

    wget

    • 2 years ago
      Anonymous

      Wget

      windowscel here

      • 2 years ago
        Anonymous

        wget

      • 2 years ago
        Anonymous

        Everybody point and laugh

      • 2 years ago
        Anonymous

        wget

      • 2 years ago
        Anonymous

        So let me get this straight. You want to get into web scraping but can't be bothered to learn how to use wget on Linux?

        wget is not part of the Linux kernel. You can use it on Windows. This board is an embarrassment to humanity

        • 2 years ago
          Anonymous

          No, but the 20 GNU libraries are Linux-specific

      • 2 years ago
        Anonymous

        You can use it on Windows, or just use WSL, who cares

      • 2 years ago
        Anonymous

        HTTrack Website Copier if you are not comfortable with wget.

      • 2 years ago
        Anonymous

        Literally wget
        PowerShell has it as an alias for Invoke-WebRequest
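        A minimal sketch, assuming Windows PowerShell 5.1, where wget is only an alias for Invoke-WebRequest and fetches a single URL (example.com is just a placeholder):

            Invoke-WebRequest -Uri "https://example.com/" -OutFile "index.html"

        As far as I know the alias was dropped in newer PowerShell versions, and none of GNU wget's recursive flags exist here, so for a full mirror you still want the real wget.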

  2. 2 years ago
    Anonymous

    Wget

  3. 2 years ago
    Anonymous

    Ctrl+s

    /thread

    • 2 years ago
      Anonymous

      It doesn't work; it doesn't download the full website.

      • 2 years ago
        Anonymous

        So let me get this straight. You want to get into web scraping but can't be bothered to learn how to use wget on Linux?

      • 2 years ago
        Anonymous

        It does on Firefox; select "Web Page, complete" as the file type if it's not the default

        • 2 years ago
          Anonymous

          Not really; when I click on an image it just opens the website.

          What about the HTTrack software?
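          If you try HTTrack, a rough sketch of the command-line version (the URL and output folder are placeholders):

              httrack "https://example.com/" -O ./example-mirror

          It recursively copies the site into the output folder and rewrites the links so the copy browses locally, images included.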

  4. 2 years ago
    Anonymous

    wget, or print as PDF for single pages
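    For a single page with its images, CSS and scripts, something along these lines should work (the URL is a placeholder): -p grabs the page requisites, -k converts links for local viewing, -E saves files with an .html extension.

        wget -p -k -E "https://example.com/some/page"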

  5. 2 years ago
    Anonymous

    Back in the Good Old Days(TM) websites often offered ZIP files of their complete content to browse offline because not everyone had an internet connection and/or it was slow and expensive (usually over a modem).
    That was when people genuinely wanted to spread knowledge and not just make money by displaying ads.

    • 2 years ago
      Anonymous

      Never happened

  6. 2 years ago
    Anonymous

    wget -m -k -K -E -l 7 -t 6 -w 5 website.com
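    Roughly what those flags do, if memory serves:

        -m      mirror mode (recursive download with timestamping)
        -k      convert links so the local copy is browsable
        -K      keep the original file as .orig before converting
        -E      save pages with an .html extension
        -l 7    limit recursion depth to 7
        -t 6    retry each download up to 6 times
        -w 5    wait 5 seconds between requests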

  7. 2 years ago
    Anonymous

    s/online/offline

  8. 2 years ago
    Anonymous

    Literally just right-click and Save As

  9. 2 years ago
    Anonymous

    In mediaeval Firefox there existed an addon titled Mozilla Archive File Format (MAFF), whose purpose was to save webpages. Old browsers let you save webpages too, but the copies weren't perfect: they would neglect to save something vital to how the page worked live, so mileage varied from effectively 1:1 to broken to the point of the CSS not even rendering right.

    MAFF solved this by repurposing the idea of an older format made for Internet Explorer, MHTML: it saved the page as a ZIP archive with all of its dependencies contained inside. That basically solved the issue of fidelity, with only one problem, though a major one: all the interactive features were lost. MAFF froze the page as-is, so JavaScript events no longer worked. It was not useful for preserving webapps, and as far as I knew, there was no tool that could do that easily without some tinkering on the user's part.