Crawler Limitations and Request Throttling
This section covers the options used to limit the depth of exploration, throttle the number of requests, and cap the overall scan time.
-r PARAMETER, --remove PARAMETER
Remove the specified parameter from URLs before scanning.
--skip PARAMETER
Skip attacking the given parameter(s) in URLs or forms.
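For example, assuming a typical invocation where -u gives the base URL to scan, and with a placeholder target and illustrative parameter names, a session identifier could be stripped from crawled URLs while a CSRF token is left unattacked:

    wapiti -u "http://example.com/" -r sid --skip csrf_token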
-d DEPTH, --depth DEPTH
Set how deep the scanner should explore the website. A higher depth means the scanner will follow more links.
--max-links-per-page MAX
Limit the number of (in-scope) links the scanner will extract for each page.
--max-files-per-dir MAX
Set how many files or pages the scanner should explore within each directory.
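As a sketch of how these crawl limits combine (the target URL and the numeric values are illustrative only), the crawl could be kept to three levels deep, with at most 50 links per page and 20 files per directory:

    wapiti -u "http://example.com/" -d 3 --max-links-per-page 50 --max-files-per-dir 20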
--max-scan-time SECONDS
Set the maximum time (in seconds) the scan should take. You can provide fractional values.
--max-attack-time SECONDS
Limit the time each attack module should run (in seconds). You can use fractional values for more precision.
--max-parameters MAX
If a URL or form has more than MAX input parameters, Wapiti will not attack it.
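For instance, the whole scan could be capped at ten minutes with at most five minutes per attack module, skipping anything with more than 20 input parameters (the target and the values are illustrative):

    wapiti -u "http://example.com/" --max-scan-time 600 --max-attack-time 300 --max-parameters 20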
-S FORCE, --scan-force FORCE
This is an easy way to adjust the scanning and attack intensity. Valid choices are: paranoid, sneaky, polite, normal, aggressive and insane.
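For example, a quieter scan could use the sneaky preset (the target URL is a placeholder):

    wapiti -u "http://example.com/" -S sneaky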
--tasks tasks
Specify the number of concurrent tasks to use for crawling the target, which affects how fast the scan proceeds.
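For example, crawling could be spread over ten concurrent tasks (the target URL is a placeholder, and the benefit depends on how much load the target can handle):

    wapiti -u "http://example.com/" --tasks 10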