
Useful Wget Command Examples in Linux System

The wget command is one of the most used and handy tools for downloading files, packages, and directories from a web server on Linux and Unix-like operating systems. You can download files of any size through wget; the tool does not impose a file-size limit. The name Wget is derived from the terms World Wide Web and Get. This handy tool was built under the GNU project.

It can download files over FTP, HTTP, and HTTPS, as well as from local servers. With proper configuration, the wget command can even reach firewall-protected servers. Since this entire post is about wget, we will occasionally use the expanded term ‘World Wide Web and Get’ instead of wget for variety.

Wget Commands on Linux


Using the standard TCP-based protocols, the wget command can access, download, and store files on the Linux file system. This handy, lightweight, yet powerful tool is written in the C programming language and communicates between the server and the host machine to retrieve data. It supports download speed limiting, pause and resume, caching, SSL, and most of the other features you would expect from a download program.
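As a small, hedged taste of those features (the URL below is only a placeholder), the following command resumes a partially downloaded file while capping the transfer speed:

$ wget -c --limit-rate=500k https://example.org/big-file.iso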

In this post, we will see how to install the wget command on Linux distributions, along with examples of the most-used wget commands that you need to know to boost your Linux experience.

Install wget command in Linux


Usually, the ‘World Wide Web and Get’ command comes pre-installed on all major Linux distributions. However, if you run into issues while executing the wget command, run the appropriate package-installer command below on your shell to install the wget tool on Linux. Please execute the right command in the terminal shell according to your distribution.

Install wget on Ubuntu/Debian Linux

$ sudo apt install wget


Get ‘World Wide Web and Get’ on Fedora/Red Hat Linux

$ sudo dnf install wget
$ sudo yum install wget

Install Wget tool on SuSE Linux

$ sudo zypper install wget

Get the ‘World Wide Web and Get’ tool on Arch Linux

$ sudo pacman -S wget

Once the wget tool is on your system, go through the syntax below to get an idea of how the wget command functions on Linux.

wget URL
wget [options] URL
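As a quick, hedged illustration of that syntax (the URL is only a placeholder), options go before the URL; here -c resumes an interrupted download and -O sets the output file name:

$ wget -c -O kernel.tar.xz https://example.org/kernel.tar.xz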

1. Download a Single File With wget


Since we have already gone through the installation and syntax of the Wget command, we can now jump straight into a real wget command to see how it works. The following command downloads a single file and stores it in your file system.


$ wget https://cdn.kernel.org/pub/linux/kernel/v5.x/linux-5.15.5.tar.xz

2. Download File with a Different Name and Directory


To save a file or package on the Linux system under a different name from the original, execute the following wget command on the shell. Here, file.tzx (passed to --output-document or -O) is the name the file will be saved under.

$ wget --output-document=file.tzx 'https://domain/foo-bar-xyz-long-file.tzx?key=xyz'
$ wget -O /tmp/file.tzx \
'https://domain/foo-bar-xyz-long-file.tzx?key=xyz'

3. Set Directory Prefix


Since we use the ‘World Wide Web and Get’ command for effortless downloading, we can also reduce our post-download work by setting a directory prefix, so files and packages land where we want them on the Linux system. The commands below show how to set the prefix directory with the -P flag.

wget -P /tmp/ url
wget -P /ubuntupit/ https://url1/freebsd.iso
wget -P /ubuntupit/ https://url2/openbsd.iso

4. Save Download Notes To a Log File


If you need to keep a note of what happened while a file was downloading, execute the following ‘World Wide Web and Get’ command. The -o (or --output-file) flag writes wget’s log messages to the named file instead of the terminal.

$ wget --output-file=log.txt https://url1/..
$ wget -o download.log.txt https://url2/..

You may now read the log with the cat or more command shown below.

cat log.txt
more download.log.txt

The wget command thus lets users keep a record of the download session for future reference. Though we have already seen one way to write the log, the below-mentioned wget command will also help if the previous one does not work for you.


wget -o /root/wget-log.txt https://mirror.aarnet.edu.au/pub/centos/7/isos/x86_64/md5sum.txt
# cat /root/wget-log.txt

5. Download Multiple Files


To download more than one file at a time through the ‘World Wide Web and Get’ command, simply list the URLs one after another, following the command pattern given below.


$ wget https://www.ubuntupit.com/download/lsst.tar.gz ftp://ftp.freebsd.org/pub/sys.tar.gz ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm

6. Read URLs From a File


Rather than listing URLs on the command line, you can read them from a file. Put the URLs in a text file, one per line, then follow the Vim command and the -i flag of the ‘World Wide Web and Get’ command to download them all.

$ vi /tmp/download.txt
$ wget -i /tmp/download.txt
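For illustration, a hypothetical /tmp/download.txt would simply list one URL per line, for example:

$ cat /tmp/download.txt
https://example.org/file1.tar.gz
https://example.org/file2.iso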

7. Resume Downloads


An accidental disconnect or a network failure can interrupt a download, and all you need is a way to resume it. You can use the following wget commands with the -c (continue) flag to resume interrupted downloads.

$ wget -c https://www.ubuntupit.com/download/lsst.tar.gz
$ wget -c -i /tmp/download.txt

8. Force wget To Download


If you face issues keeping a connection to a server or a web URL alive, you can push the download through by running wget in the background. The -b flag (combined here with -c to resume) backgrounds the download, while nohup keeps the process running even after you log out.

$ wget -cb -o /tmp/download.log -i /tmp/download.txt
$ nohup wget -c -o /tmp/download.log -i /tmp/download.txt &
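Because the output now goes to the log file rather than the terminal, you can follow the progress of the backgrounded download with tail:

$ tail -f /tmp/download.log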

9. Limit the Download Speed


When downloading multiple files or packages via the ‘World Wide Web and Get’ command over a limited internet connection, you might need to cap the download speed so other traffic is not starved. Please execute the following command, which uses --limit-rate, to limit the download speed.

$ wget -c -o /tmp/susedvd.log --limit-rate=50k ftp://ftp.novell.com/pub/suse/dvd1.iso

10. Get Files From Password Protected Websites


Some files on the web sit behind a username and password. The wget command lets you supply HTTP credentials directly, so both of the below-mentioned methods are useful and handy for quickly downloading files from a password-protected site.

$ wget --http-user=vivek --http-password=Secrete https://www.ubuntupit.com/jahid/csits.tar.gz

Another way to download a file from a password-protected site is to embed the credentials in the URL itself. Note that the password is then visible to other users in the process list, which is why ps aux is shown below, so prefer a prompt-based method on shared machines.

$ wget 'https://username:password@url_server/file.tar.gz'
$ ps aux
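A safer sketch (the server name is still a placeholder) prompts for the password interactively so it never shows up in the process list:

$ wget --user=username --ask-password 'https://url_server/file.tar.gz'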

11. Download Mp3 And Pdf


If you intend to download a specific type of file, such as mp3 audio or pdf documents, you can execute the following commands. Note that wildcard patterns like this work on FTP URLs; for HTTP or HTTPS sites, use a recursive download with an accept list instead (see example 31).

$ wget ftp://somedom-url/pub/downloads/*.mp3
$ wget ftp://somedom-url/pub/downloads/*.pdf

12. Get File to The Standard Output via Wget Command


The following ‘World Wide Web and Get’ command writes the downloaded file to standard output instead of saving it to disk, so it can be piped straight into another program; here tar unpacks the archive into /tmp/data/.

$ wget -q -O - 'https://url1/file.tar.xz' | tar -xJf - -C /tmp/data/

13. Create A Mirror of A Site with Wget Command


To make a duplicate mirror of a website, you can use the following ‘World Wide Web and Get’ commands. The -m and --mirror options do the same task, so you can choose either of the following commands for mirroring a site.


$ wget -m https://url/
$ wget --mirror https://url/

14. Find HTTPS (TLS) Errors


If you’re a professional web developer who works with SSL/TLS certificates, this ‘World Wide Web and Get’ flag is useful for testing a site whose certificate is invalid, expired, or self-signed. By default, wget verifies the certificate and reports any TLS errors; adding --no-check-certificate skips that verification so the request can still go through.

$ wget --no-check-certificate \
https://www.ubuntupit.com/robots.txt

15. Enable Timestamps on Wget Command


By default, wget stamps the downloaded file with the timestamp reported by the server. Run the following ‘World Wide Web and Get’ command with --no-use-server-timestamps if you want the file to carry the local download time instead.

wget --no-use-server-timestamps https://mirror.aarnet.edu.au/pub/centos/7/isos/x86_64/CentOS-7-x86_64-NetInstall-1511.iso

16. Change Progress Bar


If you’re not satisfied with the default progress bar that wget shows in the shell while downloading a file or package, you can use the following command to switch to the dot-style progress indicator.

wget --progress=dot https://mirror.aarnet.edu.au/pub/centos/7/isos/x86_64/CentOS-7-x86_64-NetInstall-1511.iso

17. Enable Partial Download in Wget Command


If only part of a file has been fetched from a web server, you may run the following wget command with the -c flag; it picks up the partial download and continues it from where it stopped instead of starting over.

# wget -c https://mirror.aarnet.edu.au/pub/centos/7/isos/x86_64/CentOS-7-x86_64-NetInstall-1511.iso

18. Retry Failed Download in Wget Command


The following command shows how you can make wget retry a failed download on the Linux shell. The -t 10 option tells wget to retry up to 10 times, which usually recovers from network failures, server timeouts, and other transient errors.

# wget -t 10 https://mirror.aarnet.edu.au/pub/centos/7/isos/x86_64/CentOS-7-x86_64-NetInstall-1511.iso

19. Download From URL File List


If you need to download a bunch of files with a single wget command, write the web addresses in a text file, then pass that text file to wget with -i to download everything on the list. This saves time and the hassle of starting each download by hand.

# cat to-download.txt
# wget -i to-download.txt
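For illustration, the hypothetical to-download.txt above could be created with a couple of placeholder URLs like this:

# printf '%s\n' 'https://example.org/file1.iso' 'https://example.org/file2.tar.gz' > to-download.txt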

20. Make Time Delay After Failed Download


Since we have already discussed retrying failed downloads through the wget command, we can also tell wget to pause between requests instead of hitting the server again immediately. The following command sets a 10-second gap (-w 10) between successive downloads from the URL list.

# time wget -w 10 -i to-download.txt
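If you specifically want the pause to grow after failed attempts, wget also offers --waitretry, which backs off linearly up to the given number of seconds; a hedged sketch with the same hypothetical URL list:

# wget --waitretry=10 -t 5 -i to-download.txt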

21. Non-Interactive Download


In a Linux shell, you can run the wget command daemon-style so it keeps working in the system background. Execute the following command with the -b flag to run wget non-interactively; its output is written to a wget-log file in the current directory.

wget -b https://mirror.aarnet.edu.au/pub/centos/7/isos/x86_64/CentOS-7-x86_64-NetInstall-1511.iso

22. Hide Output in Wget Command


To download a file or package via the ‘World Wide Web and Get’ command without showing any output in the terminal shell, use the -q (quiet) flag shown below. wget finishes the download silently and simply returns you to the prompt when it is done.


wget -q https://mirror.aarnet.edu.au/pub/centos/7/isos/x86_64/md5sum.txt
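Because quiet mode prints nothing at all, it is handy to check wget’s exit status when scripting; a small sketch with the same URL:

wget -q https://mirror.aarnet.edu.au/pub/centos/7/isos/x86_64/md5sum.txt && echo "download finished" || echo "download failed"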

23. Debug Information in Wget Command


This handy ‘World Wide Web and Get’ option is helpful for developers: with -d, wget prints debugging information such as the request and response headers. The second command uses -nv (no-verbose) to print only the essential details instead.

# wget -d https://mirror.aarnet.edu.au/pub/centos/7/isos/x86_64/md5sum.txt
# wget -nv https://mirror.aarnet.edu.au/pub/centos/7/isos/x86_64/md5sum.txt

24. View Server Response


If you’re a server administrator, this command will be very helpful for you. A simple ‘World Wide Web and Get’ command with -S prints the HTTP response headers the server sends back. This is not the same as the Ping command; rather than measuring how long packets take to reach the server, it shows you exactly how the web server itself responds.


wget -S https://www.ubuntupit.com/

25. Set Timeout in Wget Command


Getting a timeout while reaching a website or downloading a file is not a rare issue. By default, wget keeps waiting for the server to respond. To avoid continuously hanging on the URL, you can execute the following wget command with -T to give up a connection attempt after the given number of seconds, here 30.

# wget -T 30 https://mirror.aarnet.edu.au/pub/centos/7/isos/x86_64/md5sum.txt
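To bound the total waiting time as well, you can combine the timeout with a limited number of retries; a hedged sketch with the same URL:

# wget -T 30 -t 3 https://mirror.aarnet.edu.au/pub/centos/7/isos/x86_64/md5sum.txt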

26. Use Credentials


If your download URL is protected with a username and a password, please run the following command to provide the user credentials for downloading through the wget command.

# wget --user=username --ask-password https://localhost/file.txt

27. Download Non Cached File


Proxies and servers often serve cached copies of files, which is usually helpful. When you need the freshest version, however, the following wget command with --no-cache asks intermediate caches to fetch the file from the origin server instead; the first command is shown for comparison without the flag.

# wget -d https://mirror.aarnet.edu.au/pub/centos/7/isos/x86_64/md5sum.txt
# wget -d --no-cache https://mirror.aarnet.edu.au/pub/centos/7/isos/x86_64/md5sum.txt

28. Download a Full Website


Since the wget command is a powerful tool, it also allows you to download an entire website and store it in a local directory. Run the following command to download the full site.

$ wget --mirror -p --convert-links -P ./LOCAL-DIR WEBSITE-URL
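For reference, here is the same pattern with each flag spelled out in comments (LOCAL-DIR and WEBSITE-URL are still placeholders to fill in):

# --mirror        : recursive download with time-stamping, infinite depth
# -p              : also fetch page requisites such as images and CSS
# --convert-links : rewrite links so the local copy browses offline
# -P ./LOCAL-DIR  : save everything under ./LOCAL-DIR
$ wget --mirror -p --convert-links -P ./LOCAL-DIR WEBSITE-URL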

29. Abandon Specific File Types with Wget Command


While downloading files recursively through the wget command, you might need to skip a certain type of file, for example for security or storage reasons. The first command below rejects GIF files during the download; the second simply writes the download log to download.log.

$ wget --reject=gif WEBSITE-TO-BE-DOWNLOADED
$ wget -o download.log DOWNLOAD-URL

30. Discontinue Download After Specific Size


Sometimes, we need to cap the total amount of data that wget is allowed to fetch. For that, wget has a dedicated quota option, -Q, shown below. For instance, the command below stops fetching new files once the 5 MB quota is exceeded; note that the quota applies when downloading multiple files or recursively, and never cuts off a single file mid-download.

$ wget -Q5m -i FILE-WHICH-HAS-URLS

31. Only Download Specific File Types


To download only a certain type of file from among many on a server or website, you can execute the following wget command. For instance, the command below recursively downloads only the pdf files from the given page.

$ wget -r -A.pdf https://url-to-webpage-with-pdfs/
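The -A option also accepts a comma-separated list, so a hedged variation (the URL is just a placeholder) could collect both pdf and mp3 files in one pass:

$ wget -r -A pdf,mp3 https://url-to-webpage-with-files/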

32. FTP Download With Wget Command


The ‘World Wide Web and Get’ command is not limited to HTTPS, HTTP, or public servers; it also lets you grab files from an FTP server, including one hosted on your local area network.

$ wget ftp-url

If there is a username and a password set for logging into the FTP server, please follow the below-provided command.

$ wget --ftp-user=USERNAME --ftp-password=PASSWORD DOWNLOAD-URL

33. Increase Retry Attempts in Wget Command


If your download fails because of a network issue, a server failure, or long delays communicating with the server, you can raise the number of retry attempts with the --tries option, as in the wget command below.

$ wget --tries=75 https://download.opensuse.org/distribution/leap/15.3/iso/openSUSE-Leap-15.3-DVD-x86_64-Current.iso

34. Download and Extract tar File By Wget Command


Sometimes, we might need to download a compressed file and extract it into a directory in one step. To make that task effortless, you can pipe wget’s output straight into tar, as in the following command, on your Linux system.

# wget -q -O - https://wordpress.org/latest.tar.gz | tar -xzf - --strip-components=1 -C /var/www/html

35. Help And Manual


The ‘World Wide Web and Get’ commands above are far from everything the tool can do; there is much more to discover. If you’re keen on the wget command, please go through the wget manual and help pages.


$ man wget
$ wget --help

Ending words


Downloading through the ‘World Wide Web and Get’ command on Linux is really fun, but it is not a typical point-and-click downloader. Most often this command is used for fetching compressed packages, tools, and other software-related files through the terminal shell on Linux. The entire post has been a series of wget commands that might help you become a Linux power user.

Please share it with your friends and the Linux community if you find this post useful and informative. You can also write down your opinions regarding this post in the comment section.

Mehedi Hasan
Mehedi Hasan is a passionate enthusiast for technology. He admires all things tech and loves to help others understand the fundamentals of Linux, servers, networking, and computer security in an understandable way without overwhelming beginners. His articles are carefully crafted with this goal in mind - making complex topics more accessible.
