A Quantitative Look at Penetration Testing

Since 2004 Matta has been running a project to test the technical competence of security consultants. With what is probably the largest collection of data on the methodologies and technical approaches taken by some of the world's best-known security companies, some interesting conclusions can be drawn. The tests started when a client asked for our assistance in running an RFP, and the program grew from there as other companies became interested in doing the same thing. In 2006, Matta provided Royal Holloway University with a version of Sentinel, creating the country's first vulnerable network for teaching Penetration Testing to its students.

As there is a lot of data, the conclusions you reach will depend on what you are looking for in the first place. In this article I have documented some of the findings that I feel may be of general interest.

I would like to preface this article by saying that it is human nature to find the bad news more interesting than the good news. In our tests, we saw many impressive consultants. We tested companies which acted professionally and competently throughout, and there are consultancies who we admire and respect as a result of working either directly or indirectly with them. Many other companies could have set up the Sentinel program, and we don't place ourselves higher than our peers. It just so happened that it was Matta that was asked to do it. Typically, the clients who have run Sentinel programs are either looking for a global Penetration Testing supplier (which Matta is not) or they are running internal accreditation schemes. Our reports have always been considered objective, and if we have something subjective to say, it goes on a separate page of the report, marked as a subjective observation.

Looking at the findings, then, the first and perhaps most startling fact of all is that every consultant who has gone through the test has found vulnerabilities with their tools that then failed to make it onto their final report.

We sniff and log all the network traffic during the test, and are often required to demonstrate to the vendor that they did indeed find the issue that was then absent from their report.
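
To be clear about how that kind of arbitration can work in practice, the sketch below is an illustration only, not Matta's actual tooling. It uses Python with Scapy to write every packet seen during a test window to a pcap file for later review; the interface name and output filename are assumptions.

  # Illustrative sketch of full-session traffic logging, of the kind
  # used to settle "the scanner found it but the report omitted it".
  # Assumes Scapy is installed and the capture interface is "eth0".
  from scapy.all import sniff, wrpcap

  CAPTURE_IFACE = "eth0"            # hypothetical test-network interface
  PCAP_FILE = "sentinel_test.pcap"  # hypothetical output file

  def log_packet(pkt):
      # Append each packet to the pcap as it arrives, so nothing is
      # lost if the capture process dies mid-test.
      wrpcap(PCAP_FILE, pkt, append=True)

  # Capture everything crossing the test interface until interrupted.
  sniff(iface=CAPTURE_IFACE, prn=log_packet, store=False)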

Clearly, there is a real problem with time-limited tests and the work required to go through reams of unqualified data to sort the real issues from the false positives. Things just get missed. Importantly, at least in our tests, something has been left out on every occasion. Our tests are intense and time-limited, so perhaps a fair conclusion is that if the consultant is similarly under pressure, whether internally or from the client, you should expect incomplete results.
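
Our data does not say exactly why each finding was dropped, but one plausible mitigation is mechanical rather than heroic: funnel every raw tool finding into a single ledger and require each entry to be explicitly marked as reported or a false positive before the report ships. A minimal sketch in Python; the finding structure is my own assumption:

  # Minimal sketch of a findings ledger: every raw tool finding must be
  # explicitly dispositioned before the report is written, so nothing
  # silently falls between the scan output and the final document.
  findings = {}  # keyed by (host, port, issue)

  def ingest(host, port, issue):
      # Record a raw finding from any tool; duplicates collapse.
      findings.setdefault((host, port, issue), "UNDISPOSITIONED")

  def disposition(host, port, issue, status):
      # Mark a finding "reported" or "false positive".
      findings[(host, port, issue)] = status

  # Hypothetical raw output from two tools, overlapping on one issue.
  ingest("10.0.0.5", 443, "weak TLS ciphers")
  ingest("10.0.0.5", 443, "weak TLS ciphers")   # duplicate collapses
  ingest("10.0.0.9", 22, "SSH protocol 1 enabled")

  disposition("10.0.0.5", 443, "weak TLS ciphers", "reported")

  # Anything still undispositioned is a finding at risk of being missed.
  for key, status in findings.items():
      if status == "UNDISPOSITIONED":
          print("At risk of omission:", key)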

In some cases, though, we also feel that the consultants would probably have missed the vulnerabilities regardless of the time limit. We believe that the output from some common tools is not easy to read for those with less experience. Second, and for me the most baffling observation, is that some consultants, a small minority but enough to be significant, don't seem to read their briefing notes. This is really concerning. Each test we run has an engagement protocol: the vendor is given a set of briefing notes with key information, including perhaps some login credentials for a web application, or a request to treat a database exactly as if it were a production system.

So whilst most consultants had no trouble executing the tests with these instructions, one consultant repeatedly crashed the database to get debug information. Not something you would want to do on a production database! On a similar note, we heard a real-life story from a client in which a penetration tester had tried to drop a database to prove he had effected a compromise. Fortunately, due to mitigating factors, he was unable to drop it, but the client was less than happy, and I don't believe they required his services again.

Another consultant on our test ran the password-cracking tool John the Ripper on a system he was required to treat as production. He used 100% of the CPU for 24 hours on our 'production' server trying to crack the password. The sad thing was that the password was blank, and he never cracked it. His report stated that our password policy was very robust.
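
The general lesson is to test the degenerate cases before burning a day of CPU on a cracker. A minimal sketch, assuming an unsalted MD5 hash recovered from the target (the hash below is simply the MD5 of the empty string; real systems will vary, with salted crypt(3), NTLM and so on):

  # Minimal sketch: test trivial candidates before launching a full
  # cracking run. Assumes an unsalted MD5 hash pulled from the target.
  import hashlib

  # Hypothetical recovered hash: this is the MD5 of the empty string.
  target_hash = "d41d8cd98f00b204e9800998ecf8427e"

  # Degenerate and top-of-the-list candidates worth trying first.
  quick_candidates = ["", "password", "admin", "123456", "letmein"]

  for candidate in quick_candidates:
      if hashlib.md5(candidate.encode()).hexdigest() == target_hash:
          label = "<blank>" if candidate == "" else candidate
          print(f"Cracked in milliseconds: {label}")
          break
  else:
      print("No trivial match; a longer cracking run may be justified.")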

A further password example was the consultant who spent hours trying to crack a password on an application, when the objective was privilege escalation and the username and password were given to him in the briefing document. If only he had read it!

Most consultants, of course, do read the briefing notes and follow the instructions as you would expect, but if you're engaging a new vendor, it certainly pays to make no assumptions.

Third, every vendor has a methodology statement, and clearly some follow it, but we find that many do not. This is one area where I believe we as an industry can do much better. The old UK government CHECK approach is a good one, and anyone can follow it whether they have CHECK accreditation or not. I believe that many vendors are not active enough in ensuring their adopted methodology is followed. Typically, the issues we have seen include:

  • missing issues, because the consultant has not stepped through the methodology in a logical, progressive manner
  • going in too 'deep', because the consultant gets excited about some vulnerability they've found, but then forgets, or runs out of time, to do some of the basics
  • running exploits, changing passwords, and failing to clean up afterwards; in the real world we have been on incident response calls where the 'hacked host' was just the result of a previous security consultant failing to clean up after an assessment (see the sketch after this list)
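
A low-tech discipline that addresses the last point is to journal every state-changing action during the engagement and walk the list in reverse at the end. A minimal sketch; the log format and example actions are hypothetical:

  # Minimal sketch of an engagement change journal: every account
  # created, password changed, or file dropped is recorded with a
  # reversal note, so post-test cleanup can be verified line by line.
  import json, time

  JOURNAL = "engagement_journal.jsonl"  # hypothetical log file

  def record_change(host, action, reversal):
      # Append one state-changing action and how to undo it.
      entry = {"ts": time.time(), "host": host,
               "action": action, "reversal": reversal}
      with open(JOURNAL, "a") as f:
          f.write(json.dumps(entry) + "\n")

  # Usage during a test:
  record_change("10.0.0.5", "created local user 'pt_tmp'",
                "delete user 'pt_tmp'")
  record_change("10.0.0.7", "uploaded webshell to /tmp/x.php",
                "remove /tmp/x.php")

  # At the end of the engagement, print the cleanup checklist in reverse.
  with open(JOURNAL) as f:
      for entry in reversed([json.loads(line) for line in f]):
          print(f"[{entry['host']}] undo: {entry['reversal']}")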

As I mentioned before, there are companies out there who we admire and respect. We have worked with companies who were pinging our network, waiting for us to open the firewall to them and start the test. They worked round the clock, were courteous, communicated with us when necessary, and didn't stop until we closed the connection at the end of the test. Then there were those that started late and finished at 5 p.m. on the dot, even though they still had much more to do. There were those that read the briefing notes, and those that didn't. Those which scanned all 65,535 TCP ports, and those which did a quick scan of common ports only.
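
The difference matters, because services bound to high ports are invisible to a quick scan of well-known ports. As a rough illustration only (not any vendor's actual method), a full TCP connect() sweep of all 65,535 ports is straightforward, if slow and noisy; the target address below is a hypothetical in-scope host:

  # Minimal sketch of a full TCP connect() sweep across all 65,535
  # ports, versus the "quick scan" habit of probing only common ports.
  import socket
  from concurrent.futures import ThreadPoolExecutor

  TARGET = "192.0.2.10"  # hypothetical in-scope host (TEST-NET address)

  def probe(port):
      # Return the port if a TCP handshake completes, else None.
      with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
          s.settimeout(0.5)
          if s.connect_ex((TARGET, port)) == 0:
              return port
      return None

  # Sweep the full range; a quick scan would stop far short of this.
  with ThreadPoolExecutor(max_workers=200) as pool:
      open_ports = [p for p in pool.map(probe, range(1, 65536)) if p]

  print(f"Open TCP ports on {TARGET}: {open_ports}")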

All consultants and vendors are not equal. Some of the less competent vendors are nevertheless good at selling their services to clients who may not know how to judge the difference. More often nowadays we see companies choosing their Penetration Testing vendors based on incorrect metrics, such as accreditations of varying value, and of course on price. My hope is that an independent body of technically competent people with experience in Penetration Testing, but who are not vendors, will set up a program which works in a way similar to how we have run Sentinel, and award technical accreditations to individual consultants, not companies, in a range of technical security assessment areas. Until then, as a vendor, we'll continue to be put under pressure to 'buy' every new PCI, CISSP, CREST, CEH, et al. accreditation to remain competitive in the market, and most companies will continue to operate in the dark without a set of good, industry-standard technical metrics to guide them.
