On this page, select the parameters used when scanning a web application. Click each type of setting to view its fields and their details.
This section defines how the scanner submits requests to forms, and the crawl scope.
Form Submission
Select the type of requests the web crawler submits to forms.
- Select None if you want no requests submitted to forms unless application authentication is requested, in which case only the login form will be tested.
- Select Post to limit web crawling to POST forms.
- Select Get to limit web crawling to GET forms.
- Select Post & Get for the web crawler to submit request for all forms. This is the default request type. For authentication, this option is recommended best practice to ensure maximum vulnerability analysis and the most comprehensive scan results.
Form Crawl Scope
Use the Form Crawl Scope settings to prevent forms with the same fields from being ignored during the scan.
If you select the Include form action URI in form uniqueness calculation check box, the form action URI is used in addition to the form fields to calculate form uniqueness.
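As an illustration of the idea (not the product's actual algorithm), a uniqueness key might be derived from the sorted field names, optionally including the action URI:

```python
# Hypothetical uniqueness key: forms whose keys match are treated as
# duplicates. Adding the action URI keeps same-field forms that submit
# to different URLs from being deduplicated.
import hashlib

def form_key(field_names, action_uri="", include_action=False):
    parts = sorted(field_names)
    if include_action:
        parts.append(action_uri)
    return hashlib.sha256("|".join(parts).encode()).hexdigest()

# Same fields, different action URIs:
key_a = form_key(["email", "user"], "/register", include_action=True)
key_b = form_key(["email", "user"], "/update", include_action=True)
print(key_a != key_b)  # True -> both forms are crawled
```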
Maximum Links to crawl
Enter the maximum number of links and forms to crawl during the scan. The initial setting is 300 links per web application. The maximum is 8,000 links per web application.
Specify a user agent for scans using this option profile if your web application requires a specific user-agent string for access. Note that the value entered here overrides any user agent specified in the web application settings under Header Injection.
If you do not specify a user agent, the scan uses the following default:
user-agent: Mozilla/5.0 (iPhone; U; CPU like Mac OS X; en) AppleWebKit/420+ (KHTML, like Gecko) Version/3.0 Mobile/1A543a Safari/419.3
Note: If your web application expects a specific user-agent string and you do not specify it, the web application may not load during the scan.
To find the user-agent string for a web application, access the application from your preferred browser and copy the user-agent string from the request headers.
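For example, you can verify that the application responds to a given user-agent string before saving it to the option profile. The sketch below uses Python's requests library with a placeholder URL; the UA string shown is the documented default.

```python
# Placeholder URL; the UA string is the default quoted above.
import requests

ua = ("Mozilla/5.0 (iPhone; U; CPU like Mac OS X; en) "
      "AppleWebKit/420+ (KHTML, like Gecko) "
      "Version/3.0 Mobile/1A543a Safari/419.3")

resp = requests.get("https://app.example.com/", headers={"User-Agent": ua})
print(resp.status_code)  # a non-2xx status may mean the app rejects this UA
```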
Request Parameter Set
A parameter set defines the request parameter values that are injected into your web applications during scanning. These parameters are used when identifying forms and other request parameters used by the web application being scanned. (These settings are not used for authentication or Selenium scripts.)
Use the default parameter set that is provided or create a new parameter set. To configure a new parameter set, go to Configuration > Parameter Sets.
Select the Ignore common binary files based on file extensions check box to exclude files with .pdf, .zip, and .doc extensions during the scan.
By default, the check box is selected.
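Conceptually, the exclusion works like the following sketch; the scanner's actual list of common binary extensions may be broader than the three named above.

```python
# Sketch of extension-based exclusion; the real extension list may
# include more than the three shown here.
from urllib.parse import urlparse

BINARY_EXTENSIONS = (".pdf", ".zip", ".doc")

def should_skip(url):
    return urlparse(url).path.lower().endswith(BINARY_EXTENSIONS)

print(should_skip("https://app.example.com/reports/q3.pdf"))   # True -> skipped
print(should_skip("https://app.example.com/reports/q3.html"))  # False -> crawled
```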
This section defines enhanced crawling parameters.
Enhanced Crawling
Select the Enhanced Crawling check box to enable enhanced crawling for scans using this profile.
With enhanced crawling enabled, we can cover links that are not detected organically but could still contain vulnerabilities. We re-crawl the individual directories present in the links found during crawling, using a directory-chopping approach: when a crawled link returns a response, that link becomes a candidate for chopping. Each directory in the link is chopped off in turn, and all the newly generated links are added to the crawl queue as requests.
For example, let's assume the following link is found during crawling:
https://www.example.com/foo/abc/xyz/register.php
With enhanced crawling enabled, the scanner first makes a request to https://www.example.com/foo/abc/xyz,
then removes the directory "xyz/" from the URL and crawls https://www.example.com/foo/abc/,
and finally removes "abc/" and crawls https://www.example.com/foo/.
All the links found during this process of removal and re-crawling are added to the crawl queue, thus improving scan coverage.
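The following sketch illustrates the directory-chopping step on the example URL above; it is a simplified model, not the scanner's implementation.

```python
# Simplified model of directory chopping: strip one path segment at a
# time from a discovered link and queue each intermediate URL.
import posixpath
from urllib.parse import urlsplit, urlunsplit

def chopped_urls(url):
    parts = urlsplit(url)
    path = posixpath.dirname(parts.path)  # drop the file name first
    urls = []
    while path not in ("", "/"):
        urls.append(urlunsplit((parts.scheme, parts.netloc, path, "", "")))
        path = posixpath.dirname(path)    # chop the next directory
    return urls

for u in chopped_urls("https://www.example.com/foo/abc/xyz/register.php"):
    print(u)
# https://www.example.com/foo/abc/xyz
# https://www.example.com/foo/abc
# https://www.example.com/foo
```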
Enable SmartScan
Select the Enable SmartScan check box to enable SmartScan for scans using this profile. By default, SmartScan is enabled.
SmartScan adds additional, more advanced scanning capabilities for testing web applications based on these frameworks: Angular JS, AJAX, Bootstrap, DWR and GWT.
This section defines the thresholds for consecutive timeout and unexpected errors allowed during a scan. Once a threshold is exceeded, the scan is terminated.
By default, the Timeout Error Threshold and Unexpected Error Threshold check boxes are selected and default values are populated.
Timeout Error Threshold
Select the check box and enter the number of consecutive timeout errors allowed during the scan. If the count of consecutive timeout errors exceeds the threshold, the scan is terminated.
Unexpected Error Threshold
Select the check box and enter the number of consecutive unexpected errors allowed during the scan. If the count of consecutive unexpected errors exceeds the threshold, the scan is terminated.
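Conceptually, a consecutive threshold behaves like the sketch below: the counter resets on any successful request, so only an unbroken run of errors terminates the scan. The threshold value shown is illustrative.

```python
# Illustrative threshold value; a success resets the counter, so only
# an unbroken run of errors exceeds the threshold and stops the scan.
TIMEOUT_THRESHOLD = 5

consecutive_timeouts = 0
for request_ok in [True, False, False, True] + [False] * 6:
    if request_ok:
        consecutive_timeouts = 0          # success breaks the run
    else:
        consecutive_timeouts += 1
        if consecutive_timeouts > TIMEOUT_THRESHOLD:
            print("Scan terminated: timeout error threshold exceeded")
            break
```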
Select the overall intensity of the web application scan. You can select predefined values or add custom values. By default, the Pre-Defined option is selected.
Pre-Defined scan intensity
Select a value from the Scan Intensity list. By default, it is set to Low.
- Maximum - Scan performance is optimized for maximum bandwidth use. Scans at the Maximum intensity level may complete faster but could overload your network, web server, or database. In addition, scanning an under-powered web application at this setting could result in the scan ending with a "Service Errors Detected" status.
- High - Scan performance is optimized for high bandwidth use.
- Medium - Scan performance is optimized for medium bandwidth use.
- Low - Scan performance is optimized for low bandwidth use.
- Lowest - Scan performance is optimized for the lowest possible bandwidth use.
Custom scan intensity
Enter values in the following fields:
- # of HTTP Threads to define how many threads should be used to scan each host.
- Delay Between Requests in milliseconds (the delay WAS introduces between the scanning engine requests sent to the application server).
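To see how the two values interact, the sketch below models each HTTP thread pausing for the configured delay between requests, so the aggregate request rate is roughly threads × (1000 / delay in ms) per second. This is a simplified model, not WAS internals; the URLs are placeholders.

```python
# Simplified model (not WAS internals): each HTTP thread sleeps for the
# configured delay between requests, so the aggregate request rate is
# roughly HTTP_THREADS * (1000 / DELAY_MS) requests per second.
import threading
import time

HTTP_THREADS = 4   # "# of HTTP Threads"
DELAY_MS = 500     # "Delay Between Requests" in milliseconds

def worker(urls):
    for url in urls:
        # a real scanner would send the request here
        time.sleep(DELAY_MS / 1000.0)  # throttle between requests

# Split 20 placeholder URLs across the threads
urls = [f"https://app.example.com/page{i}" for i in range(20)]
threads = [threading.Thread(target=worker, args=(urls[t::HTTP_THREADS],))
           for t in range(HTTP_THREADS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```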
Select the Use password bruteforcing check box to check for vulnerabilities related to password-cracking techniques.
By default, the Use password bruteforcing check box is selected, with System List set to the Minimal option.
If you choose the password bruteforcing option, select User list to use a custom, user-defined password bruteforce list, and then select the list.
To use a service-provided list, select System List and then select one of the following options:
- Minimal (empty passwords + UID = password). Test the user name as a password and the empty password.
- Limited (+ 10 most common passwords). Test the user name as a password, the empty password, plus the 10 most common passwords from our passwords list.
- Standard (+ 20 most common passwords). Test the user name as a password, the empty password, plus the 20 most common passwords from our passwords list.
- Exhaustive (will increase scan time). Test the user name as a password, the empty password, plus all passwords from our passwords list.
- Custom. Test a custom number of passwords in addition to the user name and the empty password.
Tip - Use this option if you have a lockout mechanism that triggers after a number of failed attempts. When selected, enter the maximum number of passwords to be tested. If you enter 10, we'll test the user name as a password, the empty password, plus the 8 most common passwords from our password list.
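As an illustration of the arithmetic above, a Custom value of N yields the user name, the empty password, and the (N - 2) most common passwords. The sketch below uses a stand-in common-password list, not the service-provided one.

```python
# COMMON_PASSWORDS stands in for the service-provided list; a Custom
# value of N yields the user name, the empty password, and the
# (N - 2) most common passwords.
COMMON_PASSWORDS = ["123456", "password", "12345678", "qwerty", "abc123",
                    "letmein", "monkey", "dragon", "111111", "baseball"]

def candidates(username, max_passwords):
    return [username, ""] + COMMON_PASSWORDS[:max_passwords - 2]

print(candidates("admin", 10))  # 10 candidates: "admin", "", plus 8 common
```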
Next Step: Option Profile - Search Criteria