Curl Command

This document provides an overview of the cURL command line tool and its various capabilities such as downloading files and data from URLs using different protocols, uploading files to FTP servers, passing authentication, limiting transfer rates, resuming interrupted downloads, and more. It also describes how to use cURL to fetch content from web services, check server health, and write testing/monitoring scripts.

Introduction
• cURL is a software package which consists of a command-line tool and a library (libcurl) for transferring data using URL syntax.
• cURL supports various protocols: DICT, FILE, FTP, FTPS, Gopher, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMTP, SMTPS, Telnet and TFTP.
• We can use the curl command to call web services right from the UNIX command prompt, receive the response, and check whether our server is healthy or not.
• We can write testing and monitoring scripts using the curl command.
• It is similar to the wget command, which is also used for HTTP requests, but curl is much more powerful and supports many more protocols.
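The health-check and monitoring idea above can be sketched as a small shell function. This is a minimal sketch: the function name, the "only HTTP 200 counts as healthy" rule, and the example URL are all illustrative choices, not part of the slides.

```shell
#!/bin/sh
# Minimal health-check sketch. Treats only HTTP 200 as healthy;
# adjust the check to your service's conventions.
check_health() {
  url="$1"
  # -s: no progress meter; -o /dev/null: discard the response body;
  # -w '%{http_code}': print only the HTTP status code.
  code=$(curl -s -o /dev/null -w '%{http_code}' "$url")
  if [ "$code" = "200" ]; then
    echo "UP ($code)"
  else
    echo "DOWN ($code)"
    return 1
  fi
}

# Example (requires network):
# check_health https://example.com
```

A script like this can be run from cron to alert when a server stops answering.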
Download a Single File

• The following command will get the content of the URL and display it on STDOUT (i.e. on your terminal).
• # curl url_name
• To store the output in a file, you can redirect it as shown below. This will also display some additional download statistics.
• # curl url_name > filename
Save the cURL Output to a File

• We can save the result of the curl command to a file by using the -o/-O options.
• -o (lowercase o): the result will be saved in the filename provided on the command line.
• # curl -o filename url_name
• -O (uppercase O): the filename in the URL will be taken and used as the filename to store the result.
• # curl -O url_name
Fetch Multiple Files at a Time

• We can download multiple files in a single shot by specifying the URLs on the command line.
• # curl -O url_name1 -O url_name2
• The command will download the content of both URLs and save each under its own name in the current directory.
Follow HTTP Location Headers with the -L Option

• By default, cURL doesn't follow HTTP Location headers, also termed redirects.
• When a requested web page has moved to another place, an HTTP Location header is sent in the response, indicating where the actual web page is located.
• # curl http://www.google.com
• # curl -L http://www.google.com
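To see where a chain of redirects finally lands, -L can be combined with curl's -w '%{url_effective}' output variable. A small sketch; the function name is our own, and the Google URL is just the example from the slides.

```shell
#!/bin/sh
# Print the final URL after curl has followed all redirects.
final_url() {
  # -s: silent; -L: follow Location headers; -o /dev/null: discard body;
  # -w '%{url_effective}': print the URL curl ended up at.
  curl -sL -o /dev/null -w '%{url_effective}\n' "$1"
}

# Example (requires network):
# final_url http://www.google.com
```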
Continue/Resume a Previous Download
• Using the curl -C option, we can continue a download which was stopped for some reason.
• This is helpful when we download large files and the download gets interrupted.
• If we say '-C -', curl will find out by itself where to resume the download.
• We can also give an offset, '-C <offset>'. The given number of bytes will be skipped from the beginning of the source file.
• Start a big download using curl, press Ctrl-C to stop it in the middle of the download, then resume it:
• # curl -O url_name
• # curl -C - -O url_name
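The '-C -' behaviour lends itself to a simple retry loop for flaky connections. A sketch only: the function name and the 5-second pause are arbitrary choices of ours.

```shell
#!/bin/sh
# Keep retrying an interrupted download, resuming from where it stopped.
resume_download() {
  url="$1"
  # 'until' loops while curl exits non-zero; '-C -' makes each retry
  # continue from the bytes already saved by '-O'.
  until curl -C - -O "$url"; do
    echo "Transfer interrupted; resuming in 5s..." >&2
    sleep 5
  done
}

# Example (requires network):
# resume_download url_name
```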
Limit the Rate of Data Transfer

• We can limit the rate at which data gets transferred using the --limit-rate option.
• We specify the maximum transfer rate as its argument.
• # curl --limit-rate 1000 url_name
• The above command limits the data transfer to 1000 bytes/second.
Download a File Only If It Is Modified Before/After the Given Time

• We can get files that are modified after a particular time using the -z option in curl.
• This works for both FTP and HTTP.
• # curl -z date url_name
• # curl -z 21-Apr-2019 http://ftp.reva.edu.in
• The above command will download the content only if it was modified later than the given date.
• # curl -z -date url_name
• With a dash before the date, the command will download the content only if it was modified before the given date.
Pass HTTP Authentication in cURL
• Sometimes, websites will require a username and password to view their content.
• With the help of the -u option, we can pass those credentials from cURL to the web server.

• # curl -u username:password url_name

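Passing the password directly on the command line makes it visible in the process list. One standard alternative is curl's --netrc option, which reads credentials from ~/.netrc. The sketch below assumes such an entry exists; the function name and URL are illustrative.

```shell
#!/bin/sh
# Fetch a protected URL without putting the password on the command line.
# curl reads credentials for the host from ~/.netrc when --netrc is given;
# a ~/.netrc entry looks like:
#   machine example.com login myuser password mypass
fetch_with_netrc() {
  curl --netrc -s "$1"
}

# Example (requires network and a matching ~/.netrc entry):
# fetch_with_netrc https://example.com/protected
```

Giving only the username (`curl -u username url_name`) also works: curl then prompts for the password interactively.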
Download Files from an FTP Server

• cURL can also be used to download files from FTP servers.
• If the given FTP path is a directory, by default it will list the files under that directory.

• # curl -u ftpuser:ftppass -O ftp_server_name

Upload Files to an FTP Server

• cURL can also be used to upload files to the FTP server with the -T option.
• # curl -u ftpuser:ftppass -T myfile.txt ftp_server_name
• The above command will upload the file named myfile.txt to the FTP server.
Get the Definition of a Word Using the DICT Protocol

• We can use cURL to get the definition of a word with the help of the DICT protocol.
• We need to pass a dictionary server URL to it.
• # curl dict://dict.org/d:bash
• # curl dict://dict.org/d:bash:foldoc
• The second command looks up the word in one specific dictionary (here, foldoc).
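The two DICT examples above can be wrapped into a tiny lookup helper. A sketch; dict.org is the public server from the slides, the function name is ours, and the optional second argument selects a dictionary such as foldoc.

```shell
#!/bin/sh
# Look up a word via the DICT protocol; an optional second argument
# restricts the lookup to one dictionary.
define() {
  if [ -n "$2" ]; then
    curl -s "dict://dict.org/d:$1:$2"
  else
    curl -s "dict://dict.org/d:$1"
  fi
}

# Examples (require network):
# define bash
# define bash foldoc
```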