The scrapy command-line tool
scrapy [command] [OPTIONS] ...
Scrapy is controlled through the scrapy command-line tool. The script provides several commands for different purposes. Each command supports its own particular syntax, that is, its own set of arguments and options.
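For example, to see which arguments and options a given command accepts, use the --help option described below:
    scrapy --help
    scrapy fetch --help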
fetch [OPTION] URL
Fetch a URL using the Scrapy downloader
--headers
Print response HTTP headers instead of body
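For instance, to print only the response headers of a page (the URL below is a placeholder):
    scrapy fetch --headers http://www.example.com/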
runspider [OPTION] spiderfile
Run a spider
--output=FILE
Store scraped items to FILE in XML format
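For instance, assuming myspider.py is a placeholder spider file, this runs it and stores the scraped items in items.xml:
    scrapy runspider --output=items.xml myspider.py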
settings [OPTION]
Query Scrapy settings (see the examples after the option list below)
--get=SETTING
Print raw setting value
--getbool=SETTING
Print setting value, interpreted as a boolean
--getint=SETTING
Print setting value, interpreted as an integer
--getfloat=SETTING
Print setting value, interpreted as a float
--getlist=SETTING
Print setting value, interpreted as a list
--init
Print initial setting value (before loading extensions and spiders)
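For instance, using the standard Scrapy settings BOT_NAME and DOWNLOAD_DELAY as examples:
    scrapy settings --get BOT_NAME
    scrapy settings --getfloat DOWNLOAD_DELAY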
shell URL
Launch the interactive scraping console
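For instance, to open the console on a fetched page (the URL below is a placeholder):
    scrapy shell http://www.example.com/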
startproject NAME
Create a new project with an initial project template
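For instance, assuming myproject as a placeholder project name:
    scrapy startproject myproject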
--help, -h
Print command help and options
--logfile=FILE
Log file. If omitted, stderr will be used
--loglevel=LEVEL, -L LEVEL
Log level (default: None)
--nolog
Disable logging completely
--spider=SPIDER
Always use this spider when arguments are URLs
--profile=FILE
Write Python cProfile stats to FILE
--lsprof=FILE
Write lsprof profiling stats to FILE
--pidfile=FILE
Write process ID to FILE
--set=NAME=VALUE, -s NAME=VALUE
Set/override setting (may be repeated)
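The options above apply to any command. For instance, combining them with fetch and runspider (the URL, file names, and setting value are placeholders):
    scrapy fetch --nolog http://www.example.com/
    scrapy runspider myspider.py --logfile=run.log -s DOWNLOAD_DELAY=1.0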
Scrapy was written by the Scrapy Developers.
This manual page was written by Ignace Mouzannar <[email protected]>, for the Debian project (but may be used by others).