vim /scripts/download_website.sh; # create a new file and fill it with this content

#!/bin/bash
wget --recursive --no-clobber --page-requisites --html-extension --convert-links --restrict-file-names=windows --domains "$1" --no-parent "$1"

chmod u+x /scripts/download_website.sh; # make script executable
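
If you want the script to fail fast when called without an argument, a minimal hardened variant (a sketch; the usage check is an addition, not part of the original one-liner) looks like this:

#!/bin/bash
# abort with a usage message when no domain is given
if [ -z "$1" ]; then
    echo "usage: $0 domain" >&2
    exit 1
fi
wget --recursive --no-clobber --page-requisites --html-extension --convert-links --restrict-file-names=windows --domains "$1" --no-parent "$1"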

mkdir /offline_websites_workspace; # create a new dir to hold the offline copies

cd /offline_websites_workspace;

/scripts/download_website.sh google.com; # download google.com
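
With google.com as $1, the script runs the equivalent of:

wget --recursive --no-clobber --page-requisites --html-extension --convert-links --restrict-file-names=windows --domains google.com --no-parent google.com

Barring redirects to another host, wget mirrors everything into a directory named after the domain, so the copy lands under /offline_websites_workspace/google.com/.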

The options are:

--recursive: download the entire Web site.

--domains website.org: don't follow links outside website.org (here, the domain passed as $1).

--no-parent: don't ascend to the parent directory, so only pages at or below the starting URL are fetched.

--page-requisites: get all the elements that compose the page (images, CSS and so on).

--html-extension: save files with the .html extension.

--convert-links: convert links so that they work locally, off-line.

--restrict-file-names=windows: modify filenames so that they will work in Windows as well.

--no-clobber: don't overwrite any existing files (used in case the download is interrupted and resumed).
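
To verify the mirror, open the start page straight from disk (the entry file name depends on the site; with --html-extension it is typically index.html):

xdg-open /offline_websites_workspace/google.com/index.html; # or point any browser at this file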

credits: http://www.linuxjournal.com/content/downloading-entire-web-site-wget