
How to Download File on Linux Terminal

Last Updated : 21 Apr, 2025

Downloading files on Linux is a routine operation, and the right command-line tool makes a big difference in speed, efficiency, and control. While a graphical interface works for simple downloads, CLI programs like wget, curl, axel, and aria2 add parallel downloading, authentication, bandwidth limiting, and redirect handling.

This article describes how to download files using wget for basic downloads, curl for specialized data transfers, axel for accelerated downloads, and aria2 for parallel, multi-threaded downloads. Whether you need to retrieve a single file, download in bulk, or get past authentication, these tools cover all the common cases.

Download Files on Linux Terminal Using wget

wget is a command-line tool for downloading files off the internet for Unix and Linux-based operating systems. It can work with HTTP, HTTPS, and FTP protocols, which means that users can download files without using a graphical interface.

1. Installing wget

Most Linux distributions ship with wget by default. If it’s missing, install it with your package manager:

On Debian/Ubuntu-based systems, run the command:

sudo apt install wget

For CentOS/RHEL-based systems:

sudo yum install wget

On Arch Linux:

sudo pacman -S wget

2. Basic wget Usage

The most straightforward command for downloading a file with wget is:

wget <URL>

Example: Run the command below to download samplefile.zip from the specified URL into the current working directory.

wget https://example.com/samplefile.zip

3. Downloading Files to a Specific Folder

By default, wget saves the downloaded file in the current working directory; use the -P (directory prefix) option to save it somewhere else.

Example: The command below will ensure that samplefile.zip is saved in /home/user/Downloads/

wget -P /home/user/Downloads https://example.com/samplefile.zip

4. Downloading Multiple Files

To download several files in one go, create a text file (for example, urls.txt) with one URL per line:

https://example.com/file1.zip
https://example.com/file2.zip
https://example.com/file3.zip

Use wget with the -i option to read the list from the text file and download all files sequentially:

wget -i urls.txt

Note: This approach is useful when downloading bulk files, such as software packages or datasets.
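The list itself doesn't have to be typed by hand. Below is a minimal sketch (the host and file names are the same placeholders used above) that generates urls.txt with printf; the actual wget line is commented out because it needs network access:

```shell
# Generate a URL list with one entry per line (placeholder host/names).
printf 'https://example.com/file%d.zip\n' 1 2 3 > urls.txt
cat urls.txt

# With network access, download the whole list into ~/Downloads:
# wget -i urls.txt -P "$HOME/Downloads"
```

Generating the list from a pattern like this is handy when the files follow a predictable naming scheme, such as numbered archive parts.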

Download File on Linux Terminal using curl

curl is a command-line tool for transferring data to and from a server over protocols such as HTTP, HTTPS, FTP, and SCP. Where wget focuses on downloading, curl is more versatile: it can upload as well as download, and it lends itself to automating web-related tasks.

1. Installing curl

In case curl is not already installed in your Linux distribution, you can use the following commands to install curl:

For Debian/Ubuntu-based distributions

sudo apt install curl

For CentOS/RHEL-based distributions

sudo yum install curl

2. Basic curl Usage for Downloading Files

The most basic way to download a file with curl is:

curl -O <URL>

Example: The command below saves samplefile.zip in the current directory, keeping its original file name (that is what -O does).

curl -O https://example.com/samplefile.zip
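To see -O in action without touching the network, you can point curl at a local file through its file:// protocol support; the paths below are throwaway examples:

```shell
# Set up a throwaway source file, then "download" it with curl -O.
mkdir -p /tmp/curl_demo/src /tmp/curl_demo/dest
printf 'hello from curl\n' > /tmp/curl_demo/src/sample.txt
cd /tmp/curl_demo/dest
curl -s -O file:///tmp/curl_demo/src/sample.txt   # -O keeps the name sample.txt
cat sample.txt                                    # prints: hello from curl
```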

3. Handling Redirects While Downloading

Some URLs redirect to another location before the file is actually served. The -L option tells curl to follow those redirects:

curl -L -O https://example.com/redirectedfile.zip

The command above lets curl obtain the file from its final location even if the site issues HTTP 301 or 302 redirects.


4. Downloading Multiple Files at Once

By passing a separate -O flag for each URL, you can download multiple files with one command:

curl -O https://example.com/file1.zip -O https://example.com/file2.zip

The command above downloads file1.zip and file2.zip into the current directory.


Alternatively, save multiple URLs in a single file (urls.txt) and feed it to curl through xargs:

xargs -n 1 curl -O < urls.txt      # This command reads each URL from urls.txt and downloads them one by one.
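xargs can also launch several curl processes at once via its -P flag. The sketch below is a dry run: each command is prefixed with echo so you can inspect what would execute, and you would drop the echo to download for real (the URLs are placeholders):

```shell
# Build a small URL list (placeholder host/names).
printf 'https://example.com/file1.zip\nhttps://example.com/file2.zip\n' > urls.txt
# -n 1: one URL per curl invocation; -P 4: run up to 4 downloads in parallel.
xargs -n 1 -P 4 echo curl -O < urls.txt
```

With -P, the output order may vary since the processes run concurrently; that is harmless for downloads because each file is written independently.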

5. Downloading Files with Authentication

If a file requires a username and password, supply them with the -u option:

curl -u username:password -O https://example.com/protectedfile.zip

Note: The above command is useful when downloading files from protected servers, APIs, or private directories.

6. Setting Download Speed Limits

To control how much bandwidth is consumed, cap the download speed with the --limit-rate option:

curl --limit-rate 200k -O https://example.com/largefile.zip

The command above restricts the download speed to 200 KB/s, preventing excessive network usage.

Accelerating Downloads on Linux Terminal Using axel

axel is a lightweight command-line download accelerator that speeds up file transfers by splitting the download into multiple parts and fetching them simultaneously using multiple connections.

1. Installing axel

To use axel, first install it with one of the following commands:

For Ubuntu/Debian-based distributions

sudo apt install axel

For CentOS/RHEL-based distributions

sudo yum install axel

For Arch Linux:

sudo pacman -S axel

2. Downloading a File on Linux Terminal with axel

Download a file with axel using:

axel -n 10 https://example.com/largefile.zip     # -n 10 tells axel to use 10 parallel connections for faster downloads.

3. Limiting Download Speed with axel

You can also cap the download speed to avoid excessive bandwidth consumption with the -s option, which takes a maximum speed in bytes per second.

axel -n 10 -s 512000 https://example.com/largefile.zip         # This limits the speed to about 500 KB/s while using 10 connections.

Download Files in Parallel on Linux Terminal Using aria2

aria2 is an advanced download utility that supports HTTP, HTTPS, FTP, SFTP, BitTorrent, and Metalink, and can pull a file over multiple connections or from multiple sources at the same time. It is designed to maximize download speed by splitting files into several segments.

1. Installing aria2

To install aria2 on different Linux distributions:

For Ubuntu/Debian:

sudo apt install aria2

For CentOS/RHEL:

sudo yum install aria2

For Arch Linux:

sudo pacman -S aria2

2. Basic File Download Using aria2

To download a specific file, use the following command.

aria2c https://example.com/samplefile.zip       # Fetches the file into the current directory

3. Downloading Multiple Files

If you have multiple files to download, create a text document (urls.txt) with all the addresses on separate lines.

https://example.com/file1.zip
https://example.com/file2.zip
https://example.com/file3.zip

Then download them all with:

aria2c -i urls.txt
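aria2 also lets you cap how many files from the list download simultaneously with -j (--max-concurrent-downloads; the default is 5). Because the hosts above are placeholders, the sketch below builds the list and just echoes the command as a dry run:

```shell
# Build the URL list (placeholder host/names).
printf 'https://example.com/file%d.zip\n' 1 2 3 > urls.txt
# With aria2 installed and network access, limit concurrency to 3 files;
# echo makes this a dry run so nothing is actually fetched here.
echo aria2c -i urls.txt -j 3
```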

4. Downloading Files in Parallel

To maximize speed by splitting a file into multiple segments, use:

aria2c -x 16 -s 16 https://example.com/largefile.zip
  • -x 16: This parameter opens up to 16 connections to the server at once.
  • -s 16: This parameter splits the file into 16 segments that are downloaded in parallel.

Conclusion

Downloading files from the Linux command line provides speed, flexibility, and automation, which makes it valuable for anyone operating servers, writing scripts, or moving large amounts of data. wget is an easy and dependable way to download files, while curl offers extra control over data transfers and API interactions, such as authentication and speed limiting.

For those who need faster downloads, axel speeds things up with multiple connections, and aria2 goes further with segmented, multi-source downloads. Each tool fills a distinct niche, so users can tailor their downloads to their needs. With these tools, Linux users can boost efficiency, cut download times, and take finer control of network bandwidth so that files transfer smoothly.

