Experimental. Convenience command that combines map + scrape to save an entire site as local files.
Maps the site first to discover pages, then scrapes each one into nested directories under `.firecrawl/`. All scrape options also work with download. Always pass `-y` to skip the confirmation prompt in automated flows.
| Flag | Description |
| --- | --- |
| `--limit` | Max pages to download |
| `--search` | Filter URLs by search query |
| `--include-paths` | Only download matching paths |
| `--exclude-paths` | Skip matching paths |
| `--allow-subdomains` | Include subdomain pages |
| `-y` | Skip confirmation prompt (always use in automated flows) |
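As a sketch of how these flags combine, a typical automated invocation might look like the following. The flags are the ones documented above; the exact subcommand form (`firecrawl download <url>`) and the URL and path patterns are assumptions for illustration.

```shell
# Hypothetical example: download up to 50 pages of a docs site,
# restricted to /docs/ paths, skipping the confirmation prompt.
firecrawl download https://docs.example.com \
  --limit 50 \
  --include-paths "/docs/*" \
  --exclude-paths "/docs/legacy/*" \
  -y
```

The downloaded files land in nested directories under `.firecrawl/`, mirroring the site's URL path structure.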
Download an entire website as local files — markdown, screenshots, or multiple formats per page. Use this skill when the user wants to save a site locally, download documentation for offline use, or bulk-save pages as files, or says "download the site", "save as local files", "offline copy", "download all the docs", or "save for reference". Combines site mapping and scraping into organized local directories. Source: firecrawl/cli.