One of the side effects of podcasting is that I read a lot of infosec news and industry reports on a daily basis. Sometimes I see an odd overlap. For instance, I was reading this article about a McAfee survey on how long IT professionals believe it would take them to detect a breach. The numbers were all over the map, and the article describes them like this:
“… 22 percent thought they’d need a day to recognise a breach, with one in twenty offering a week as a likely timescale.
Just over a third said they would notice data breaches in a matter of minutes, which counts as real-time by today’s standards.
In terms of general security, three quarters confidently reckoned they could assess their security in real-time, with about the same number talking up their ability to spot insider threats, perimeter threats and even zero-day malware.”
The article raises the point that the polled population seems overly optimistic; however, I think the point needs to be explored a little deeper. Compare those answers against what the major industry reports actually found:
- Mandiant’s 2013 annual report claims that data breaches take an average of 243 days to detect.
- Trustwave’s report finds the average to be 210 days.
- Verizon’s DBIR finds that 66% of breaches in the scope of their report took “months” to “years” to discover.
This is not a minor miss. This is not “being overly optimistic”. This is a fundamental lack of understanding of the world we live in.
What concerns me most about this disconnect is how these beliefs feed into risk management processes. If internal authoritative sources, such as the 500 people McAfee polled, tell organizations that breaches will be detected quickly, when in reality detection takes months, there will be little appetite to invest in improving detection capabilities.