
GALLERY-DL(1)                    gallery-dl Manual


NAME
gallery-dl - download image galleries and collections


SYNOPSIS
gallery-dl [OPTION]... URL...


DESCRIPTION
gallery-dl is a command-line program to download image galleries and collections from a number of image hosting sites. It is a cross-platform tool with many configuration options and powerful filename formatting capabilities.


OPTIONS
-h, --help
    Print this help message and exit
--version
    Print program version and exit
-i, --input-file FILE
    Download URLs found in FILE ('-' for stdin). More than one --input-file can be specified
-f, --filename FORMAT
    Filename format string for downloaded files ('/O' for "original" filenames)
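The filename format string is based on Python's str.format() field syntax, with metadata keywords supplied by the extractor. A minimal sketch of how such a template expands, using hypothetical metadata values (real keyword names are site-specific and can be listed with -K):

```python
# Hypothetical metadata for one file; real keys are extractor-specific.
metadata = {"category": "example", "id": 12345, "extension": "jpg"}

# A filename format string in str.format() style.
template = "{category}_{id}.{extension}"

print(template.format(**metadata))  # example_12345.jpg
```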
-d, --destination PATH
    Target location for file downloads
-D, --directory PATH
    Exact location for file downloads
-X, --extractors PATH
    Load external extractors from PATH
--proxy URL
    Use the specified proxy
--source-address IP
    Client-side IP address to bind to
--user-agent UA
    User-Agent request header
--clear-cache MODULE
    Delete cached login sessions, cookies, etc. for MODULE ('ALL' to delete everything)
--cookies FILE
    File to load additional cookies from
--cookies-from-browser BROWSER[+KEYRING][:PROFILE][::CONTAINER]
    Name of the browser to load cookies from, with optional keyring name prefixed with '+', profile prefixed with ':', and container prefixed with '::' ('none' for no container)
-q, --quiet
    Activate quiet mode
-v, --verbose
    Print various debugging information
-g, --get-urls
    Print URLs instead of downloading
-G, --resolve-urls
    Print URLs instead of downloading; resolve intermediary URLs
-j, --dump-json
    Print JSON information
-s, --simulate
    Simulate data extraction; do not download anything
-E, --extractor-info
    Print extractor defaults and settings
-K, --list-keywords
    Print a list of available keywords and example values for the given URLs
--list-modules
    Print a list of available extractor modules
--list-extractors
    Print a list of extractor classes with description, (sub)category, and example URL
--write-log FILE
    Write logging output to FILE
--write-unsupported FILE
    Write URLs that are emitted by other extractors but cannot be handled to FILE
--write-pages
    Write downloaded intermediary pages to files in the current directory to debug problems
-r, --limit-rate RATE
    Maximum download rate (e.g. 500k or 2.5M)
-R, --retries N
    Maximum number of retries for failed HTTP requests, or -1 for infinite retries (default: 4)
--http-timeout SECONDS
    Timeout for HTTP connections (default: 30.0)
--sleep SECONDS
    Number of seconds to wait before each download. This can be either a constant value or a range (e.g. 2.7 or 2.0-3.5)
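When a range such as 2.0-3.5 is given, a delay between those bounds is used before each download. A minimal sketch of that behavior, assuming a uniform random choice per download (an interpretation, not a quote from the implementation):

```python
import random

def parse_sleep(value: str) -> float:
    """Return a sleep duration from a constant ('2.7') or a range ('2.0-3.5')."""
    if "-" in value:
        lo, hi = (float(part) for part in value.split("-"))
        return random.uniform(lo, hi)  # assumed: uniform pick within the range
    return float(value)

print(parse_sleep("2.7"))            # 2.7
duration = parse_sleep("2.0-3.5")
assert 2.0 <= duration <= 3.5        # always lands inside the range
```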
--sleep-request SECONDS
    Number of seconds to wait between HTTP requests during data extraction
--sleep-extractor SECONDS
    Number of seconds to wait before starting data extraction for an input URL
--filesize-min SIZE
    Do not download files smaller than SIZE (e.g. 500k or 2.5M)
--filesize-max SIZE
    Do not download files larger than SIZE (e.g. 500k or 2.5M)
--chunk-size SIZE
    Size of in-memory data chunks (default: 32k)
--no-part
    Do not use .part files
--no-skip
    Do not skip downloads; overwrite existing files
--no-mtime
    Do not set file modification times according to Last-Modified HTTP response headers
--no-download
    Do not download any files
--no-postprocessors
    Do not run any post processors
--no-check-certificate
    Disable HTTPS certificate validation
-o, --option KEY=VALUE
    Additional options. Example: -o browser=firefox
-c, --config FILE
    Additional configuration files
--config-yaml FILE
    Additional configuration files in YAML format
--config-toml FILE
    Additional configuration files in TOML format
--config-create
    Create a basic configuration file
--config-ignore
    Do not read default configuration files
-u, --username USER
    Username to log in with
-p, --password PASS
    Password belonging to the given username
--netrc
    Enable .netrc authentication data
--download-archive FILE
    Record all downloaded or skipped files in FILE and skip downloading any file already in it
-A, --abort N
    Stop the current extractor run after N consecutive file downloads were skipped
-T, --terminate N
    Stop the current and parent extractor runs after N consecutive file downloads were skipped
--range RANGE
    Index range(s) specifying which files to download. These can be either a constant value, range, or slice (e.g. '5', '8-20', or '1:24:3')
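The three '--range' forms map naturally onto index tests. A sketch of that mapping, assuming the slice form follows Python's slice semantics with an exclusive stop (an interpretation, not a quote from the implementation):

```python
def in_range(index: int, spec: str) -> bool:
    """Check a 1-based file index against one range spec: '5', '8-20', or '1:24:3'.

    Illustrative only; the slice form is interpreted here with Python slice
    semantics (stop is exclusive), which is an assumption about the tool.
    """
    if ":" in spec:                      # slice: start:stop[:step]
        parts = spec.split(":")
        start = int(parts[0]) if parts[0] else 1
        stop = int(parts[1]) if parts[1] else None
        step = int(parts[2]) if len(parts) > 2 and parts[2] else 1
        if index < start or (stop is not None and index >= stop):
            return False
        return (index - start) % step == 0
    if "-" in spec:                      # inclusive range: first-last
        first, last = (int(part) for part in spec.split("-"))
        return first <= index <= last
    return index == int(spec)            # constant index

print([i for i in range(1, 30) if in_range(i, "1:24:3")])
# [1, 4, 7, 10, 13, 16, 19, 22]
```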
--chapter-range RANGE
    Like '--range', but applies to manga chapters and other delegated URLs
--filter EXPR
    Python expression controlling which files to download. Files for which the expression evaluates to False are ignored. Available keys are the filename-specific ones listed by '-K'. Example: --filter "image_width >= 1000 and rating in ('s', 'q')"
--chapter-filter EXPR
    Like '--filter', but applies to manga chapters and other delegated URLs
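Since '--filter' takes a Python expression evaluated against each file's metadata, its effect can be sketched as follows. The metadata dictionaries and the bare eval() environment here are illustrative assumptions, not the tool's actual evaluation context:

```python
# Hypothetical per-file metadata; real keys are site-specific (see -K).
files = [
    {"image_width": 800,  "rating": "s"},
    {"image_width": 1200, "rating": "q"},
    {"image_width": 1600, "rating": "e"},
]

expression = "image_width >= 1000 and rating in ('s', 'q')"

# Evaluate the expression with each file's metadata as the variable
# namespace; files where it evaluates to False are dropped.
kept = [f for f in files if eval(expression, {}, f)]
print(kept)  # [{'image_width': 1200, 'rating': 'q'}]
```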
--zip
    Store downloaded files in a ZIP archive
--ugoira-conv
    Convert Pixiv Ugoira to WebM (requires FFmpeg)
--ugoira-conv-lossless
    Convert Pixiv Ugoira to WebM in VP9 lossless mode
--ugoira-conv-copy
    Convert Pixiv Ugoira to MKV without re-encoding any frames
--write-metadata
    Write metadata to separate JSON files
--write-info-json
    Write gallery metadata to an info.json file
--write-tags
    Write image tags to separate text files
--mtime-from-date
    Set file modification times according to 'date' metadata
--exec CMD
    Execute CMD for each downloaded file. Example: --exec "convert {} {}.png && rm {}"
--exec-after CMD
    Execute CMD after all files were downloaded successfully. Example: --exec-after "cd {} && convert * ../doc.pdf"
-P, --postprocessor NAME
    Activate the specified post processor
-O, --postprocessor-option KEY=VALUE
    Additional '<key>=<value>' post processor options


EXAMPLES
gallery-dl URL
    Download images from URL.
gallery-dl -g -u USERNAME -p PASSWORD URL
    Print direct URLs from a site that requires authentication.
gallery-dl --filter "type == 'ugoira'" --range "2-4" URL
    Apply filter and range expressions. This will only download the second, third, and fourth files whose 'type' value is equal to "ugoira".
gallery-dl r:URL
    Scan URL for other URLs and invoke gallery-dl on them.
gallery-dl oauth:SITE
    Gain OAuth authentication tokens for deviantart, flickr, reddit, smugmug, and tumblr.


FILES
/etc/gallery-dl.conf
    The system-wide configuration file.
~/.config/gallery-dl/config.json
    Per-user configuration file.
~/.gallery-dl.conf
    Alternate per-user configuration file.



AUTHORS
Mike Fährmann <>



2023-04-30 1.25.3