Tuesday, July 2, 2013

A quick filler post - Downloading websites

Perhaps you need to download a website for offline use because you are going somewhere with no Internet connection, or perhaps, like me, you just like to be thorough.

When you find a website with a user manual (one that's not in a PDF you can easily download), you may want to grab the entire guide, just in case, or to browse through when you are not near an Internet connection.

On Windows you could install HTTrack (which I have used in the past and which worked great), but since Linux is my primary OS, I wanted a way to do it here. Enter wget, a great tool for downloading websites.

If you don't have it installed, open a Terminal window and type in:
sudo apt-get install wget
then enter your sudo password and press <ENTER>.
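
Chances are wget was already installed anyway; you can check with:

wget --version

If that prints a version number, you're good to go.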

Decide on a website to download, then navigate to where you want to save it. I'll use a test folder inside Downloads:

cd Downloads <ENTER>
mkdir test <ENTER> (I'm using test, you can call it something else)
cd test <ENTER>

To download the website to your test folder located in Downloads:

wget -r http://test.com (replace test.com with the website you want to download)
The -r flag tells wget to follow links and download the entire website. If you leave out the -r, you get just a single page, for example:

wget http://www.linux.com/news/embedded-mobile/mobile-linux/726708-smartphone-war-all-about-brics-emerging-markets
will download only that page. (Tip: copy the URL from your web browser, then in the Terminal type wget, right click next to it and click Paste. Now you don't have to type out the entire URL.)
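
If you're mirroring a site you actually plan to browse offline, a few extra flags make the local copy much nicer. This is just a sketch (again, replace test.com with your target site):

wget --mirror --convert-links --adjust-extension --page-requisites --no-parent http://test.com

--mirror turns on recursion with sensible defaults, --convert-links rewrites the links so they work offline, --adjust-extension saves pages with a .html extension, --page-requisites grabs the images and CSS each page needs, and --no-parent keeps wget from wandering up above the directory you started in.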

Just please don't type:
wget -r www.google.com <ENTER>
unless you want to try downloading the entire Internet! (In fairness, wget by default stays on the one host and only follows links five levels deep, but pointing -r at a huge site can still chew through your disk and bandwidth.) You have been warned!
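
If you do want -r but on a shorter leash, you can rein it in; the numbers here are just examples to adjust:

wget -r -l 2 -w 1 --limit-rate=200k http://test.com

-l 2 limits the recursion to two levels of links, -w 1 waits a second between requests, and --limit-rate=200k caps the download speed, both of which keep you polite to the server you're downloading from.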

I hope this helps someone :)
