12-09-2013 11:41 PM
During a Web Application Vulnerability Assessment using HP WebInspect, has anyone come across a state where the number of Crawls is much higher (about 50 times) than the number of Audits? If so, what might be the reason for this?
According to HP's recommendation, the number of Audits should be roughly 3x the number of Crawls on average. That was what we saw earlier. Now we see a remarkable drift in the way the scan progresses. The time it takes to complete a scan has also grown dramatically (close to 10 days, compared to 2 days earlier), and we are not scanning a huge web server.
Does anyone have a suggestion or recommendation on how to optimize the usage of HP WebInspect?
12-16-2013 11:25 AM
I believe I am the one who commented on the Audit counts being 3x the Crawl counts. That ratio is typical. What you are describing is very different and not expected. I will assume that you are reporting the final counts once the scan has completed, not at some point while the scan was Paused or halted.
The first thing I would investigate is the Session Exclusions. Perhaps you defined a Session Exclusion that permits Crawling but no Auditing of the specified URI/page?
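To illustrate why that kind of exclusion skews the counts, here is a minimal sketch (this is not the WebInspect API or its actual rule format, just a hypothetical model): a session-exclusion rule can apply to crawling, to auditing, or to both, and a rule that excludes a URI from auditing only still lets the crawler follow it, so Crawl counts keep growing while Audit counts stay flat.

```python
import re

# Hypothetical exclusion rules: (URL pattern, exclude_from_crawl, exclude_from_audit).
# A rule with exclude_from_audit=True but exclude_from_crawl=False is the
# pattern that inflates Crawl counts relative to Audit counts.
EXCLUSIONS = [
    (re.compile(r"/static/"), False, True),   # crawled but never audited
    (re.compile(r"/logout"),  True,  True),   # skipped entirely
]

def should_crawl(url):
    """True unless some rule excludes this URL from crawling."""
    return not any(pat.search(url) and skip_crawl
                   for pat, skip_crawl, _ in EXCLUSIONS)

def should_audit(url):
    """True unless some rule excludes this URL from auditing."""
    return not any(pat.search(url) and skip_audit
                   for pat, _, skip_audit in EXCLUSIONS)
```

With rules like these, every page under `/static/` would be crawled (and its links followed) but never audited, which is exactly the kind of asymmetry that could produce far more Crawls than Audits.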
Finally, I must say that if your scan is running 10 days, then you are doing it wrong. ;-) Unless it is dealing with a truly large system, a WebInspect scan should generally not even run overnight. My suggestion is that you contact Fortify Support and review the scan results and the Scan Settings together. I am sure there are features that could be enabled or configured to make your scan much more efficient. Key settings that come to mind include Redundant Page Detection, Session Exclusions, and a deep-linked Start URL, among others.
-- Habeas Data