How to make a static copy of a website
28 Feb 2010

wget -k -K -E -r -l 10 -p -N -F -nH -w 2 http://website.com/
-k
: convert links to relative
-K
: keep the original version of each file, without the conversions made by wget
-E
: rename HTML files to .html (if they don’t already have an htm(l) extension)
-r
: recursive… of course we want to make a recursive copy
-l 10
: the maximum level of recursion. If you have a really big website you may need a higher number, but 10 levels should be enough.
-p
: download all the files needed to render each page (CSS, JS, images)
-N
: turn on time-stamping, so files that haven’t changed aren’t downloaded again
-F
: when input is read from a file, force it to be treated as an HTML file
-nH
: by default, wget puts files in a directory named after the site’s hostname. This disables creating those hostname directories and puts everything in the current directory.
-w 2
: be a good neighbor and wait between requests (in seconds) so as not to overwhelm the server
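For reference, here is the same command written with GNU Wget’s long option names, which make the intent of each flag self-documenting. This is a sketch assuming GNU Wget; the helper function name `wget_mirror` and the `http://website.com/` placeholder are illustrative, not part of the original post.

```shell
#!/bin/sh
# Long-option equivalent of the short-flag command above (same behavior).
# Assumes GNU Wget; "wget_mirror" is a hypothetical helper name.
wget_mirror() {
    wget --convert-links \
         --backup-converted \
         --adjust-extension \
         --recursive --level=10 \
         --page-requisites \
         --timestamping \
         --force-html \
         --no-host-directories \
         --wait=2 \
         "$1"
}

# Usage: wget_mirror http://website.com/
```

Spelling the flags out also makes it easy to drop the ones you don’t need, e.g. omit --no-host-directories if you are mirroring several sites into the same parent directory.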
Via StackOverflow