Re: Level of Intrusiveness caused by Scans (535 Views)
Frequent Advisor
AutoDan
Posts: 48
Registered: ‎12-11-2011
Message 1 of 6 (627 Views)
Accepted Solution

Level of Intrusiveness caused by Scans

Hi,

 

I'm quite new to WebInspect and AMP, and was hoping somebody could help me with the following queries.

 

My team has been conducting scans in our test and pre-prod environments using a customised scan policy, which I believe is based on the default OWASP Top 10 template (it was created before I started with the team).

 

By default we always use the "Crawl and Audit" option and all our scans have the option "Auto fill web forms during crawl" selected (using the default values).

 

What I would like to know is: given the above settings, what level of intrusiveness/footprint will be left on the servers and databases of the applications we scan? For instance, will our scans create records in the database and potentially falsify test data used for manual testing? What other potential problems could be introduced to the applications/servers we are scanning?

 

Also, is there a way to determine the level of intrusiveness associated with a scan template, i.e. which scan templates will generate database transactions, etc.?

 

Many thanks,

 

Dan

 

Frequent Advisor
Atman_1
Posts: 31
Registered: ‎01-04-2011
Message 2 of 6 (604 Views)

Re: Level of Intrusiveness caused by Scans

Hello Dan,

 

Here are the answers to your questions.

 

1) Will our scans create records in the database and potentially falsify test data used for manual testing?

- Possibly. It depends on your custom policy: if it includes the SQL Injection checks, the scan will attempt to write to the database. You just need to run the scan with your custom policy and see what it does.

  

2) What other potential problems could be introduced to the applications/servers we are conducting these scans on?

- That depends on your site. For example, if your site has a file-upload page, then WebInspect (WI) will attempt to upload files.

  

3) Is there a way to determine the level of intrusiveness associated with a scan template, e.g. which scan templates will generate database transactions?

 

Yes. If you go to Tools -> Policy Manager (File -> New), you can see the different types of policies. For example, Passive Scan and Quick are less intrusive, while Assault and Aggressive SQL Injection are more intrusive. Browsing the Policy Manager will give you a better idea of the different types of policies and attacks.

 

Let me know if this helps.

Frequent Advisor
AutoDan
Posts: 48
Registered: ‎12-11-2011
Message 3 of 6 (596 Views)

Re: Level of Intrusiveness caused by Scans

Hi Atman, thanks for your reply. That does clear up quite a few things, but I still have some questions.

 

Given your answer to my first question, can it be assumed that SQL Injection is the only attack within the OWASP Top 10 that will create records in the database? If that attack were excluded from the scan policy, would this mean no records would be created in the database?

 

Also, given the default selection in the scan wizard to auto-fill form fields, would this mean that records will be created in the database regardless of the scan policy?

 

 

 

 

Valued Contributor
jfapple
Posts: 44
Registered: ‎01-07-2011
Message 4 of 6 (584 Views)

Re: Level of Intrusiveness caused by Scans

No, you can't assume, isolate, or even associate an audit engine or attack type such as SQL Injection with adding records to the database. If you exclude SQLi from the policy, WI won't send any SQLi attacks to the site, but you are correct that regardless of scan policy, other WI activities can still create records.

 

Depending on the site, WI can add records even on a simple crawl. For example, if your site has a forum page and WI discovers the web form that posts to the forum, WI will attempt to submit the form using the default WebForm values. After the scan, if you see records with default values like 12345, foo, or Jack Frost, then you will know those records came from WI. I have seen WI add several hundred records to a forum site after only one Standard-level scan.
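To make the post-scan cleanup concrete, here is a minimal sketch of how you might hunt for scanner-created rows afterwards. The `forum_posts` table and its columns are hypothetical, and the value list just reuses the defaults mentioned above; adjust all of them to whatever your application and Web Forms configuration actually use.

```python
import sqlite3

# Default values WebInspect's auto-fill is said to submit (from the post above).
WI_DEFAULTS = ("12345", "foo", "Jack Frost")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE forum_posts (id INTEGER PRIMARY KEY, author TEXT, body TEXT)")
conn.executemany(
    "INSERT INTO forum_posts (author, body) VALUES (?, ?)",
    [
        ("alice", "Real test data entered by hand"),
        ("Jack Frost", "12345"),   # looks like a scanner submission
        ("bob", "foo"),            # likely another scanner submission
    ],
)

# Flag rows where any field matches a known scanner default value.
suspect = conn.execute(
    "SELECT id, author, body FROM forum_posts "
    "WHERE author IN (?, ?, ?) OR body IN (?, ?, ?)",
    WI_DEFAULTS + WI_DEFAULTS,
).fetchall()
print(suspect)  # rows 2 and 3 are candidates for cleanup
```

Reviewing the flagged rows by hand before deleting anything is still wise, since a real user could legitimately type "foo".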

 

For this reason, scanning a production site is never recommended. I also recommend capturing a pre-scan snapshot (or backup) of the site/database before the scan; then you can revert to the pre-scan state if you need to run multiple scans, or at the conclusion of your pen testing. We often install the site/database in a VM, where reverting to the pre-scan snapshot is a simple five-minute process. In that environment, WI writing records to the database is not a concern.
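The snapshot-and-revert workflow can also be sketched at the database level. This is only an illustration, using Python's built-in sqlite3 backup API as a stand-in for whatever snapshot or backup mechanism your VM or database platform actually provides:

```python
import sqlite3

live = sqlite3.connect(":memory:")
live.execute("CREATE TABLE users (name TEXT)")
live.execute("INSERT INTO users VALUES ('real-test-user')")
live.commit()

# 1. Capture the pre-scan snapshot before the scanner touches the site.
snapshot = sqlite3.connect(":memory:")
live.backup(snapshot)

# 2. Simulate a scan writing junk records through the application's forms.
live.execute("INSERT INTO users VALUES ('Jack Frost')")
live.execute("INSERT INTO users VALUES ('foo')")
live.commit()

# 3. Revert: restore the snapshot over the live database after the scan.
snapshot.backup(live)

rows = [r[0] for r in live.execute("SELECT name FROM users")]
print(rows)  # back to the pre-scan state
```

The same three steps (snapshot, scan, restore) apply whether the mechanism is a VM snapshot, a database dump, or a filesystem backup.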


Hope this helps,

Jeff

Respected Contributor
HansEnders
Posts: 585
Registered: ‎07-01-2008
Message 5 of 6 (575 Views)

Re: Level of Intrusiveness caused by Scans

jfapple gave the best response. If you open WebInspect's Help guide, there is an article titled "Preparing Your System For Audit" (under the Getting Started branch) which speaks to these risks.

 

In my experience, it is generally the Crawler that causes the most havoc, not the audit engines as many assume. The Crawler's job is to exercise everything on the site, and if a page asks the user to "Set Back To Factory Defaults" or "Change password?" (ChangePassword.do), that is exactly what the Crawler will do, many times over, as it submits the links on the page. These events can be mitigated through scan configuration as needed: Session Exclusions, Request Filters, List-Driven or Workflow-Driven Audit-Only scans, a non-administrative testing account, Restrict to Folder, et al.

 

Also, our SQL Injection attacks only make TRUE/FALSE probes and insert SELECT statements. Damage caused within the database will come not so much from our queries as from how the web app is designed to feed those form inputs into the database.
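For illustration, here is roughly what a boolean (TRUE/FALSE) probe pair looks like against a deliberately vulnerable, string-concatenated query. The schema and payloads here are made up for the sketch and are not WebInspect's actual checks; the point is that the probes read data to compare responses rather than modify anything:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT)")
conn.execute("INSERT INTO items VALUES ('widget')")

def vulnerable_search(term):
    # Vulnerable on purpose: user input concatenated straight into the SQL text.
    query = "SELECT name FROM items WHERE name = '" + term + "'"
    return conn.execute(query).fetchall()

baseline = vulnerable_search("widget")                 # normal request
true_probe = vulnerable_search("widget' AND '1'='1")   # TRUE condition appended
false_probe = vulnerable_search("widget' AND '1'='2")  # FALSE condition appended

# If the TRUE probe behaves like the baseline while the FALSE probe differs,
# the input is likely injectable -- and no data was modified to find out.
print(true_probe == baseline, false_probe == baseline)
```

This is why the probes themselves are low-risk; the danger comes from what the application does with submitted form values afterwards, as in the example below.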

 

For example, I once had to field a complaint from an Asian client whose server's root password was edited by WebInspect. What actually occurred was that the Crawler submitted a multi-part form. The page appeared to be an upload page, so WebInspect did what it normally does and attempted to create a TXT file with a standard sentence as its content. On the back-end, this form input was used as the new root password, which became "File created by WebInspect", something that could not have been determined by reviewing the HTTP Request.

 

So when approaching an unknown or production system for the first time, you will want to use a pattern of testing that gradually increases the level of intrusiveness of the testing.  Here is a brief description of that process.

 

 

 

+++++

When testing systems from lightest to heaviest, I would use the following pattern of increasing intrusiveness. Depending on your needs, you might want to drop back to step #1 whenever changing the user credentials or authentication for WebInspect, such as when going from a non-authenticated assessment to one with user-level credentials, or from user-level to administrative-level credentials. Bear in mind that we recommend using only plain, user-level permissions; they will generate an adequate security review without the risk of crawling the site with administrative privileges.

 

1.  Crawl-Only with the Web Forms submission disabled (page four of the WebInspect Scan Wizard).

 

2.  Crawl-Only with the Web Forms submission enabled.  Or keep Web Forms submission disabled and use the Interactive scan method by enabling the "Prompt me for web forms" box found within the Method panel of the scan settings.
 Use your review of assessment #1 to determine if there are specific pages to exclude (Session Exclusions scan setting), or forms submissions to redirect/neutralize (Filters scan setting).

 

3.  Crawl-and-Audit using all of the settings from scan #2, and specifying the Passive scan policy on the third screen of the Scan Wizard.

 

4.  Crawl-and-Audit using all of the settings from scan #2, but using another light-weight scan policy such as Safe or Quick.

 

5.  Crawl-and-Audit using all of the settings from scan #2, but using a more "normal" scan policy such as Application (Only), OWASP Top Ten, XSS Only, or the Standard.

 

6.  Crawl-and-Audit using all of the settings from scan #2, but using a hazardous scan policy such as Assault or All Checks.  The All Checks policy includes 100% of our vulnerability database, including older, less accurate checks, so this scan will take the longest to complete and has the highest probability for False Positives and other chaff.

+++++

 

 

Here are two additional methods for controlling the inputs used by WebInspect.

 

1.  Custom form inputs - Found on the fourth page of the Scan Wizard, or from the Tools menu as the Web Forms Editor, this permits you to augment the dummy values used by the Crawler. This may help you locate or clean up records later.

 

2.  Customized attack engines - Within the Policy Manager's Tools menu, or the Attack Exclusions scan settings, you will find the Audit Inputs Editor. This little-known tool permits the user to modify the inputs for select attacks that offer this customization.


-- Habeas Data
Frequent Advisor
AutoDan
Posts: 48
Registered: ‎12-11-2011
Message 6 of 6 (535 Views)

Re: Level of Intrusiveness caused by Scans

Sorry for the delayed reply.

Thank you both, jfapple and Hans, for your replies; this clears things up quite a lot. Now to change our current procedures for conducting scans :o)