Get a webpage and all the pages it links to for local viewing with wget

wget -r -l1 -p -np -k -nH -P 'outdir' http://www.aaa/bbb.html

#download bbb.html and the pages it links to
#-r: recurse into links
#-l1: recurse up to level 1, i.e. download only the pages immediately linked to by www.aaa/bbb.html
#   (could be -l2, -l3, ... for nth-level links)
#-p: also save all files needed to view each page locally (images, CSS)
#-np: never ascend to parent directories
#-k: convert links in the downloaded HTML files into local links for offline viewing
#-nH: don't create a host directory 'www.aaa', as wget would by default
#-P 'outdir': save output inside the directory 'outdir', creating it if it doesn't exist
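The effect of these flags can be observed without touching a real site by pointing the same command at a throwaway local server. Everything below is a made-up sketch: the temp paths, the page names bbb.html/ccc.html, and the port 8137 are arbitrary, and it assumes python3 is available to serve files.

```shell
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/site"

# two tiny pages: bbb.html links to ccc.html (one level deep)
cat > "$tmp/site/bbb.html" <<'EOF'
<html><body><a href="ccc.html">next</a></body></html>
EOF
cat > "$tmp/site/ccc.html" <<'EOF'
<html><body>leaf page</body></html>
EOF

# serve the directory in the background (assumption: port 8137 is free)
(cd "$tmp/site" && python3 -m http.server 8137 >/dev/null 2>&1) &
server=$!
sleep 1

# same flags as above: -l1 grabs ccc.html too, -nH drops the
# 'localhost:8137' host directory, -P puts everything under outdir
wget -q -r -l1 -p -np -k -nH -P "$tmp/outdir" http://localhost:8137/bbb.html

kill "$server" 2>/dev/null || true

ls "$tmp/outdir"
```

Both files should land directly in `$tmp/outdir` (no `localhost:8137/` subdirectory, thanks to `-nH`), and the `<a href>` in the saved bbb.html will have been rewritten by `-k` to point at the local copy of ccc.html.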