wget is a non-interactive network downloader supporting HTTP, HTTPS, and FTP. Because it runs unattended, it is well suited to downloading files, mirroring websites, recursive downloads, and use in scripts and cron jobs.
Basic Usage
- wget <url> - Download file
- wget -O filename <url> - Save with custom name
- wget -P /path/ <url> - Save to directory
Download Options
- -O filename - Output filename
- -P prefix - Directory prefix
- -c - Continue partial download
- -N - Timestamping (re-download only if remote file is newer)
- -nc - No clobber (skip existing)
- -b - Background download
- -q - Quiet mode
- -v - Verbose
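The flags above combine freely; a typical unattended fetch might look like the following (the host and file path are placeholders):

```shell
# Quietly re-download only if the remote copy is newer,
# resuming any partial transfer and saving into ~/Downloads
wget -N -c -q -P ~/Downloads https://example.com/data/report.pdf
```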
Recursive Downloads
- -r - Recursive download
- -l depth - Recursion depth (default 5)
- -k - Convert links for local viewing
- -p - Download page requisites
- -E - Save HTML as .html
- -np - No parent (don't go up)
- -nH - No host directories
- --cut-dirs=N - Ignore N directories
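-nH and --cut-dirs control the local directory layout of a recursive fetch. A sketch, with example.com/pub/docs/ as a placeholder path:

```shell
# Fetch everything under /pub/docs/ without recreating the
# example.com/pub/docs/ tree locally:
# -nH drops the hostname directory, --cut-dirs=2 drops pub/docs/
wget -r -np -nH --cut-dirs=2 https://example.com/pub/docs/
```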
Filtering
- -A list - Accept extensions (comma-separated, e.g. jpg,png)
- -R list - Reject extensions (comma-separated)
- -I list - Include directories
- -X list - Exclude directories
- --accept-regex - Accept regex pattern
- --reject-regex - Reject regex pattern
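Accept/reject rules can be mixed in one command; here is a hedged example (the path and the "draft" pattern are placeholders):

```shell
# Recursively fetch only PDFs, skipping any URL that
# matches the regex "draft"
wget -r -np -A pdf --reject-regex 'draft' https://example.com/papers/
```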
HTTP Options
- --header="Header: Value" - Custom header
- -U agent - User agent
- --referer=url - Set referer
- --post-data="data" - POST data
- --post-file=file - POST from file
- --method=METHOD - HTTP method
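These options let wget act as a simple HTTP client; a sketch of a JSON POST (the endpoint, token, and payload are placeholders):

```shell
# Send a JSON POST with custom headers, saving the response body
wget --header="Content-Type: application/json" \
     --header="Authorization: Bearer TOKEN" \
     --post-data='{"query": "status"}' \
     -O response.json https://example.com/api/search
```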
Authentication
- --user=user - Username
- --password=pass - Password
- --ask-password - Prompt for password
- --http-user=user - HTTP auth user
- --http-password=pass - HTTP auth pass
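--ask-password keeps the secret out of shell history and the process list, unlike --password on the command line (user name and URL are placeholders):

```shell
# Prompt interactively for the password instead of passing it
# as a visible command-line argument
wget --user=admin --ask-password https://example.com/protected/report.pdf
```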
Cookies
- --save-cookies file - Save cookies
- --load-cookies file - Load cookies
- --keep-session-cookies - Keep session cookies
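A common pattern is logging in once, persisting the session cookie, then reusing it for authenticated downloads. A sketch; the form field names and URLs are placeholders:

```shell
# Step 1: log in and save cookies, including session-only ones
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data='user=admin&pass=secret' \
     https://example.com/login
# Step 2: reuse the saved cookies for a protected download
wget --load-cookies cookies.txt https://example.com/members/file.zip
```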
SSL/TLS
- --no-check-certificate - Skip certificate validation (insecure; last resort only)
- --ca-certificate=file - CA cert file
- --certificate=file - Client cert
- --private-key=file - Private key
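For servers that require mutual TLS, the client certificate options combine as follows (the certificate and key file names are placeholders):

```shell
# Authenticate with a client certificate, validating the server
# against a specific CA bundle
wget --certificate=client.pem --private-key=client.key \
     --ca-certificate=ca.pem https://example.com/secure/file.zip
```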
Proxy
- -e use_proxy=yes - Enable proxy
- -e http_proxy=url - HTTP proxy
- -e https_proxy=url - HTTPS proxy
- --no-proxy - Disable proxy
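The -e flag passes wgetrc-style commands, so a proxy can be enabled for a single download without editing any config file (the proxy address is a placeholder):

```shell
# Route one download through an HTTP proxy
wget -e use_proxy=yes -e http_proxy=http://proxy.local:3128 \
     https://example.com/file.zip
```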
Speed & Limits
- --limit-rate=rate - Limit download speed
- -w seconds - Wait between requests
- --random-wait - Randomize the wait to 0.5-1.5x the -w value
- -t tries - Retry count (0=infinite)
- -T timeout - Timeout seconds
- --dns-timeout=seconds - DNS lookup timeout
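Put together, these options make a recursive fetch polite and robust; one possible combination (the site is a placeholder):

```shell
# Throttle bandwidth, pause between requests with jitter,
# retry up to 3 times, and give up on a stalled connection after 30s
wget -r -np --limit-rate=100k -w 2 --random-wait -t 3 -T 30 \
     https://example.com/docs/
```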
Common Examples
Simple Download
wget https://example.com/file.zip
Download single file.
Resume Download
wget -c https://example.com/large-file.iso
Continue interrupted download.
Download to Directory
wget -P ~/Downloads https://example.com/file.zip
Save to specific directory.
Mirror Website
wget -m -k -p -E -np https://example.com
Create offline mirror of site.
Download All Images
wget -r -A jpg,jpeg,png,gif -np https://example.com/images/
Download only images recursively.
Background Download
wget -b https://example.com/large-file.iso
Download in background.
Rate Limited
wget --limit-rate=200k https://example.com/file.zip
Limit to 200KB/s.
With Authentication
wget --user=admin --password=secret https://example.com/protected/
Download with credentials.
From File List
wget -i urls.txt
Download URLs from file.
Spider Mode
wget --spider https://example.com/file.zip
Check if file exists (no download).
Ignore Robots
wget -e robots=off -r https://example.com
Ignore robots.txt.
Custom User Agent
wget -U "Mozilla/5.0 (Windows NT 10.0)" https://example.com
Spoof browser user agent.
Mirror Options Explained
wget -m -k -p -E -np https://example.com
- -m (--mirror) - Mirror mode (-r -N -l inf --no-remove-listing)
- -k (--convert-links) - Convert for local viewing
- -p (--page-requisites) - Get images, CSS, JS
- -E (--adjust-extension) - Save as .html
- -np (--no-parent) - Don't go to parent directory
wget vs curl
- wget - Better for downloads, recursive, mirroring
- curl - Better for API calls, more protocols, more flexible
Tips
- Use -c to resume interrupted downloads
- Use -b for large files (runs in background)
- Use --limit-rate to be polite to servers
- Use -np to stay within target directory
- Use --random-wait to avoid being blocked
- Check wget log with tail -f wget-log
- Great for scripting and automation
- Be respectful when mirroring sites