JP Linux Guru

Joined: 07 Jul 2025 Posts: 6670 Location: Central Montana
Posted: Fri Apr 27, 2007 4:33 pm Post subject: wget and curl
How can you download at the CLI without losing your download to a broken connection (which has been happening here a lot lately)? How about either wget or curl? I can use either one of these for downloading ISOs from the net. Crouse got me interested in this when we were doing FAD, but I couldn't remember what he told me. Recently, I have been trying to get a download manager (one that will resume broken downloads) on this Ubuntu 6.06 Dapper Drake box, but it can't find the kget package (or kdenetwork-kget either). So I started looking into wget.
I found that wget and curl are similar to each other, but each has its own special applications.
WGET: http://www.gnu.org/software/wget/
Quote: | GNU Wget is a free software package for retrieving files using HTTP, HTTPS and FTP, the most widely-used Internet protocols. It is a non-interactive commandline tool, so it may easily be called from scripts, cron jobs, terminals without X-Windows support, etc.
GNU Wget has many features to make retrieving large files or mirroring entire web or FTP sites easy, including:
- Can resume aborted downloads, using REST and RANGE
- Can use filename wild cards and recursively mirror directories
- NLS-based message files for many different languages
- Optionally converts absolute links in downloaded documents to relative, so that downloaded documents may link to each other locally
- Runs on most UNIX-like operating systems as well as Microsoft Windows
- Supports HTTP and SOCKS proxies
- Supports HTTP cookies
- Supports persistent HTTP connections
- Unattended / background operation
- Uses local file timestamps to determine whether documents need to be re-downloaded when mirroring
GNU Wget is distributed under the GNU General Public License. |
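That recursive mirroring looks handy too. From the wget docs, something like this should grab a whole directory tree and resume anything that gets cut off -- just a minimal sketch, and example.com is a placeholder here, not a real mirror: Code: | # -m mirrors recursively with timestamping, -np stays below the starting directory, -c resumes partial files
jp@ubuntu:~$ wget -m -np -c http://example.com/pub/isos/
|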
CURL: http://curl.haxx.se/
Quote: | curl is a command line tool for transferring files with URL syntax, supporting FTP, FTPS, HTTP, HTTPS, SCP, SFTP, TFTP, TELNET, DICT, FILE and LDAP. curl supports SSL certificates, HTTP POST, HTTP PUT, FTP uploading, HTTP form based upload, proxies, cookies, user+password authentication (Basic, Digest, NTLM, Negotiate, kerberos...), file transfer resume, proxy tunneling and a busload of other useful tricks.
Curl is free and open software that compiles under a wide variety of operating systems. Curl exists thanks to efforts from many authors.
The most recent stable version of curl is version 7.16.2, released on 11th of April 2007. Currently, 69 of the listed archives are of the latest version.
Use the cURL command line tool or use libcurl from within your own programs. |
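From what I've read, the curl equivalent of wget's resume trick is -C - (the - lets curl figure out the offset from the partial file on disk) plus -O to save under the remote filename. A sketch with a placeholder URL, since I haven't actually installed curl here: Code: | # -C - resumes where the partial file left off; -O keeps the remote filename
jp@ubuntu:~$ curl -C - -O http://example.com/pub/isos/some-distro.iso
|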
To use wget to download the new Mandriva One CD, I opened a terminal and created a special directory for my ISO downloads: Code: | jp@ubuntu:~$ mkdir distro_downloads
|
Then I went to the site where I wanted to download from: http://ftp.ale.org/pub/mirrors/mandrake/official/iso/2007.1/
I scrolled down to find the ISO that I wanted:
Code: | [ ] mandriva-linux-2007-spring-one-KDE-cdrom-i586.iso 11-Apr-2007 14:45 693M |
Then I right-clicked the ISO link to get the menu and selected Copy Link Location. Then I went back to my terminal and typed in: Code: | jp@ubuntu:~/distro_downloads$ wget -c | The -c just tells the operation to continue the download where it left off if the link is broken. Then with a right mouse-click again I selected Paste and voila! Code: | jp@ubuntu:~/distro_downloads$ wget -c http://ftp.ale.org/pub/mirrors/mandrake/official/iso/2007.1/mandriva-linux-2007-spring-one-KDE-cdrom-i586.iso
| Now I've got a download manager without a whole lot of whistles, bells, and buzzers, and my downloads are Screaming!
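And the best part is what happens when the connection does break: I just run the exact same command again in the same directory, and wget picks up from the partial file instead of starting over from zero: Code: | jp@ubuntu:~/distro_downloads$ wget -c http://ftp.ale.org/pub/mirrors/mandrake/official/iso/2007.1/mandriva-linux-2007-spring-one-KDE-cdrom-i586.iso
# wget sees the partial .iso already on disk and continues from its current size
|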
With wget, I have seen as high as 61.+K/s and as low as 49.+K/s --- a big improvement over the usual 23.+K/s to 27.+K/s that I seem to get. Anyway, thanks to crouse for the nudge a while back, I've got a new (to me) way of downloading at the CLI. Next download, I'm going to try curl.
NOTE: I have made some corrections due to mistakes in the articles I have read -- ALSO: curl does not come pre-installed on Ubuntu, so I will just use wget.
If I made any mistakes in this, let me know, or if you know of a better way to d/l at the CLI, I'm open to suggestions. I couldn't find anything that really explained it in my Linux books, so I've spent a few nights surfing the Internet to get all of this figured out.
BTW, this isn't meant to be a tutorial or anything like that; I'm just "crowing" about learning something new (to me).
_________________ Dell Box - Arch Linux
Dell Lappy - DreamLinux 3.5 - Default OS
Mepis 8.0 - Backup
Last edited by JP on Sun Apr 29, 2007 4:11 am; edited 1 time in total
masinick Linux Guru

Joined: 03 Apr 2025 Posts: 8615 Location: Concord, NH
Posted: Fri Apr 27, 2007 4:51 pm Post subject: wget, curl, and rsync
When I am doing command line retrievals, I use wget more often than anything else. FYI, though, Mandriva's own urpmi package handling, which is written in either Perl or Python (the latter, I think), uses curl when it finds a package that needs to be retrieved from its repositories.
There is another retrieval method that is also handy: rsync. It is most useful when you already have a file, directory, or group of files, some of which may need to be updated. rsync examines what you have, looks at what needs to be copied, makes supposedly intelligent decisions about how much data actually needs to be transferred, and then does it. That's all if I understand it correctly! A sketch follows below.
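Something like this, assuming the mirror actually runs an rsync server -- the host and path here are made up for illustration: Code: | # -a preserves attributes, -v is verbose, -P keeps partial files and shows progress, so an interrupted copy can resume
$ rsync -avP rsync://mirror.example.org/pub/isos/some-distro.iso .
|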
Most of the time I just use the "Save As..." feature in Firefox and perform an HTTP or FTP file transfer into a predetermined directory. If the connection is lousy, then I use one of these other alternatives. When I am doing distro-specific work, I use whichever of these tools ties in most closely with the application or system I am working with. It is nice to know something about all of them; each has its advantages and specific applications, but all of them get the job done well.