How to download files from the Linux command line

Wget is a very cool command-line downloader for Linux and UNIX environments. Don’t be fooled by the fact that it is a command-line tool: it is powerful and versatile and can match some of the best graphical downloaders around today. It has features such as download resumption and bandwidth control, it can handle authentication, and much more. I’ll get you started with the basics of using wget and then show you how to automate a complete backup of your website using wget and cron.

Let’s get started by installing wget. Most Linux distributions come with wget pre-installed, but if you manage to land yourself a Linux machine without a copy of wget, try the following. On a Red Hat based system such as Fedora you can use:

# yum install wget

or if you use a Debian based system like Ubuntu:

$ sudo apt-get install wget

One of the above should do the trick for you. Otherwise, check your Linux distribution’s manual to see how to get and install packages. wget has also been ported to Windows. Download the following packages: ssllibs and wget. Extract and copy the files to a directory such as C:\Program Files\wget and add that directory to your system’s PATH so you can access it with ease. Now you should be able to run wget from the Windows command line.
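On Linux, a quick way to confirm wget is available before going any further is to check for it on your PATH (a minimal sketch; the echoed messages are my own):

```shell
# Check whether wget is available before going further
if command -v wget >/dev/null 2>&1; then
    echo "wget is installed"
else
    echo "wget is missing"
fi
```

Running wget --version will additionally print the exact version you have installed.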

The most basic operation a download manager needs to perform is to download a file from a URL. Here’s how you would use wget to download a file (the URL here is just a placeholder):

# wget http://example.com/file.zip

Yes, it’s that simple. Now let’s do something more fun: let’s download an entire website. Here’s a taste of the power of wget. When you download a website you can specify the depth to which wget must fetch files. Say you want to download the first-level links of Yahoo!’s home page. Here’s how you would do that:

# wget -r -l 1 http://www.yahoo.com/

Here’s what each option does. The -r activates recursive retrieval of files. The -l stands for level, and the number 1 next to it tells wget how many levels deep to go while fetching the files. Try increasing the number of levels to two and see how much longer wget takes.

Now, if you want to download all the “jpeg” images from a website, a user familiar with the Linux command line might guess that a command like “wget *.jpeg” would work. Well, unfortunately, it won’t. What you need to do is something like this:

# wget -r -l1 --no-parent -A.jpeg http://example.com/
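The -A option is essentially a suffix filter on filenames: anything that doesn’t match the accepted suffix is discarded. As a rough local sketch of the same matching idea (the filenames here are made up for illustration):

```shell
# Illustrate suffix filtering, the idea behind wget's -A.jpeg option
for f in photo.jpeg index.html logo.jpeg; do
    case "$f" in
        *.jpeg) echo "keep $f" ;;
        *)      echo "skip $f" ;;
    esac
done
```

You can also pass a comma-separated list to -A, such as -A jpg,jpeg, to accept several extensions at once.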

Another very useful option in wget is download resumption. Say you started downloading a large file and lost your Internet connection before the download could complete. You can use the -c option to continue the download from where you left off.

# wget -c http://example.com/largefile.iso

Now let’s move on to setting up a daily backup of a website. The following command will create a mirror of a site on your local disk. For this purpose wget has a specific option, --mirror. Try the following command, replacing the example URL with your website’s address.

# wget --mirror http://example.com/

When the command is done running you should have a local mirror of your website. This makes for a pretty handy backup tool. Let’s turn this command into a cool shell script and schedule it to run at midnight every night. Open your favorite text editor and type the following. Remember to adapt the backup path and the website URL to your requirements.


#!/bin/bash

YEAR=`date +"%Y"`
MONTH=`date +"%m"`
DAY=`date +"%d"`

BACKUP_PATH="/home/backup" # replace with the path of your backup directory
WEBSITE_URL="http://example.com/" # replace with the address of the website you want to back up

# Create and move to today's backup directory
# (-p also creates the year and month directories when they don't exist yet)
mkdir -p $BACKUP_PATH/$YEAR/$MONTH/$DAY
cd $BACKUP_PATH/$YEAR/$MONTH/$DAY

wget --mirror ${WEBSITE_URL}

Now save this file as something like backup.sh and grant it executable permissions:

# chmod +x backup.sh

Open your cron configuration with the crontab -e command and add the following line at the end:

0 0 * * * /path/to/backup.sh
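If cron’s five time fields are new to you, here is that line annotated (a commented sketch; backup.sh stands in for whatever you named your script):

```shell
# min hour day-of-month month day-of-week  command
# 0   0    *            *     *            /path/to/backup.sh
# i.e. run the script at 00:00 (midnight), every day of every month
```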

You should have a copy of your website in /home/backup/YEAR/MONTH/DAY every day. For more help using cron and crontab, see this tutorial.
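To preview which directory a given day’s run will use, you can expand the same date pieces by hand (a sketch reusing the script’s variable names):

```shell
# Assemble the dated backup path the same way the script does
YEAR=`date +"%Y"`
MONTH=`date +"%m"`
DAY=`date +"%d"`
BACKUP_PATH="/home/backup"
echo "$BACKUP_PATH/$YEAR/$MONTH/$DAY"
```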

There’s a lot more to learn about wget than I’ve mentioned here. Read up on wget’s man page (man wget).

25 comments
  • azeddine December 21, 2008, 4:57 pm

To open

  • marco December 29, 2008, 11:58 pm

Another good download manager for the console is Axel:

    -multiple connections,
    -multiple mirrors,
    -resuming (if the server supports it),
    -no dependencies.

    If you have big downloads (CD/DVD images), I suggest using axel instead of wget; it’s WAY faster than wget.

  • zzz March 25, 2009, 3:53 am

# yum install wget – it’s cool.

  • Info Gaptek May 25, 2009, 8:39 pm

    This page is very useful.
    Thanks for your help!

  • Fernando June 5, 2009, 5:51 pm

The right command on Fedora is # yum install [program name]
    or alternatively # rpm -Uvh packagename.rpm

  • Abhishek June 17, 2009, 2:03 am

My first ever comment… really very good and simple for a novice user… hats off, man!

  • ankur September 2, 2009, 4:45 am

    very clearly written!!

  • Pushparaj September 30, 2009, 6:21 am

    Written nicely. Very useful for newbies

  • Mitesh November 17, 2009, 4:50 am

How do I download a document from the Windows command prompt (CMD)?

    I have an internet plan on which I can download unlimited data from 2 a.m. to 8 a.m., so I can set an alarm to wake up, but how do I download a file without being present (using cmd)?

    Please help.


  • H. Steen July 14, 2010, 6:40 am

    That script will fail on the first day of every month or year because it doesn’t create those folders, right?

  • aris November 12, 2010, 1:15 am

Nice info, but I am still confused.

  • fgyieruigehghgu November 25, 2010, 4:30 am

My email ID is not correct; I just wanted to show appreciation. Cool work, keep it up!

  • Gu August 3, 2012, 1:31 pm

But where are the downloaded files located?

  • Sjaak June 13, 2013, 6:41 am

    my wget does not support -r :(

  • gavra July 18, 2013, 9:18 am

    Thumbs up! :)

  • uawmagic September 3, 2013, 4:27 am

    really great helping site. clean theme and detailed answer. thanks so much!

  • Bjorn June 13, 2014, 9:44 am

Old post, I know, but I’ll try to ask a question anyway: how would you go about downloading a file with wget from a service that uses Windows Authentication?


  • Ross McKillop June 14, 2014, 4:49 pm

A real quick web search shows that it was broken in wget 1.11, but that’s really old now, so I’d have thought it would be fixed by now. Is it not in the man file? Another person suggested curl (depending on what you want to do) as a viable alternative…
