
Cybersecurity experts already know the “security best practices”, so what’s preventing them from being implemented?

Richard Pain, Cybersecurity Specialist, CIO Asia | Sept. 21, 2017
We examine why following best practices continues to be a challenge and what can be done about it.

This article is sponsored by RSA



The number of record-breaking data breaches is increasing every year, the most recent being the Equifax data leak, which may have exposed the personal data of 143 million customers. Although every incident has its own unique factors, what often becomes apparent in incidents like this is a basic failure to apply cybersecurity best practices.

One of the first people to comment on the attack was John Pescatore, Director of the SANS Institute, who assessed that: "Failure of application security, including possible failure to mitigate the Apache Struts vulnerabilities enabled the breach. Other failures in basic security hygiene obviously led to a time to detect of almost 2 months."

One of the big red flags Pescatore highlights is the lag in detecting this breach. However, this is a major issue in cybersecurity worldwide, not just in this specific instance. Gartner estimates that it takes on average 205 days to detect a breach, whereas RSA estimates that it's closer to 250. Other organisations give similar figures, but no matter whose numbers you use, the message is unanimous: there is serious dwell time and a lag in detection.

Extended dwell time allows attackers to scout networks, move up the cyber kill chain and ultimately increase the cost of the eventual breach. It then follows that if dwell time can be reduced, the impact of breaches will also be reduced.
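To make the dwell-time arithmetic concrete, here is a minimal Python sketch. The `dwell_time_days` helper and the incident dates are illustrative assumptions, not figures from any of the reports cited above; they simply produce a number in the same 200-250 day range the analysts describe.

```python
from datetime import date

def dwell_time_days(compromised: date, detected: date) -> int:
    """Days an attacker spent inside the network before being detected."""
    return (detected - compromised).days

# Hypothetical incident: compromised in January, detected in mid-August.
print(dwell_time_days(date(2017, 1, 10), date(2017, 8, 15)))  # 217
```

Tracking this single number per incident is one simple way for a security team to measure whether its detection capability is actually improving over time.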

However, for anyone who has worked in cybersecurity for any length of time, it's a near guarantee that they are already aware of the best practices they should be following. Read any cybersecurity guide or attend any conference and you will hear many of the same messages repeated, as though that settles it, problem solved. This raises the question: if the best practices are already known and can prevent an attack or mitigate the impact of cyberattacks, why are they not uniformly applied?

The short answer is that it's not quite that simple. Applying best practices is a challenge in itself, so to explore this point, let's examine a few and see what's preventing them from being implemented:


Best practice: Diligently patch your systems when patches are released.

Barrier: Patching critical software can cause unacceptable downtime, and so many patches are released that there is often a lag between when a new patch becomes available and when it is installed, creating a window of vulnerability. Another issue is that certain software and firmware updates are not monitored at all: for example, when was the last time anyone updated their printer or router firmware?
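The "window of vulnerability" described above can be quantified per system. The following Python sketch is a hypothetical illustration (the `exposure_window_days` helper and the dates are assumptions, not part of any real patching tool): it measures how long a system stays exposed between a patch's release and its installation, treating a never-installed patch as a window that is still open today.

```python
from datetime import date
from typing import Optional

def exposure_window_days(patch_released: date,
                         patch_installed: Optional[date],
                         today: date) -> int:
    """Days a system remains exposed after a patch is released.
    If the patch has not been installed, the window runs up to today."""
    end = patch_installed if patch_installed is not None else today
    return (end - patch_released).days

# Hypothetical fleet: a server patched two weeks after release,
# versus a router whose firmware update was never applied.
print(exposure_window_days(date(2017, 3, 6), date(2017, 3, 20), date(2017, 9, 21)))  # 14
print(exposure_window_days(date(2017, 3, 6), None, date(2017, 9, 21)))  # 199
```

Reporting the second kind of figure, an ever-growing open window on unmonitored devices, is often what finally gets forgotten firmware onto the patching schedule.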

