We keep talking about shiny new, and increasingly fragile, controls that will prevent attacks, or fiendishly clever algorithms and AI to which we can outsource all the hard or fast thinking we’re not good at. Yet we are all still staring down the barrel of a loaded data breach gun, waiting for it to go off. The thing is, we seem to be holding that gun to our own heads, and it’s not as if we don’t realise it. All the talk of ‘basics’, ‘essentials’ and ‘foundations’ points at a relatively common set of issues, usually focused on some combination of the following:
- IT maintenance (patching, replacing end-of-life platforms, inventories, baseline builds, etc.)
- Network security (internal segmentation)
- Access management (efficient joiners, movers and leavers processes; privileged user management)
- Security monitoring (effective visibility)
- Incident response (tested plans, exercised staff)
I think many of us would be hard pushed to say that effective delivery of those areas wouldn’t dramatically reduce our residual cyber risks, and I think many security leaders would grudgingly agree there is a lot of room for improvement in them. I am interested in why that is. Most of these have been known as security ‘basics’ for a very long time. We have government advice (Ten Steps to Cyber Security [PDF]) and even government certification (Cyber Essentials) that speak to their importance, as well as speeches from our regulators. So why are these the areas that let firms down when they are tested, or when they are breached?
I’ve spoken before about the constraints placed on cyber teams by technology teams. Once again, I think we are looking at a failure to achieve positive cyber outcomes because the processes critical to those outcomes are delivered by technology teams outside the security reporting line, teams who have much less of a stake in effective management than in hitting a minimal service level and getting the cyber team off their backs.
To be clear, security monitoring and incident response are much more within the bailiwick of security, but IT maintenance (especially), network security and access management are often security requirements that security teams don’t deliver directly.
When boards of directors hold a security leader to account on cyber security, many of the performance issues that are surfaced lie outside the direct control of that individual. When a public breach occurs and the root cause is reported to be something in the ‘basics’, there is usually an outcry from talking-head experts that this was a ‘schoolboy error’, a basic lapse in security that anyone could see would lead to disaster, if only they had been asked for their opinion…
This is why security leaders with strong soft skills, such as influencing, are increasingly effective compared with technical experts who struggle to relate to professionals outside the discipline. The core of the game is now getting people who are not measured on security, and who don’t come to work to be security professionals, to deliver the security outcomes that materially influence the organisation’s cyber risk exposure.
There is great hope in some of the flashy new approaches, and not just those in cyber but those in technology too. The (slow) adoption of immutable systems and the (faster) adoption of automated build and deployment are likely to start cutting down on exposures caused by poor maintenance. The (on the horizon) adoption of zero-trust networking, facilitated by cloud desktops and by automated, cloud-based flow control rules replacing complex and hard-to-maintain manual internal segmentation, will reduce the opportunity for small beachheads to turn into full invasions. My worry is that, as William Gibson said, “The future is already here — it’s just not very evenly distributed.” We will have hybrid environments for many years to come, and we are reliant on technology teams to adopt and implement these approaches effectively: the same teams that currently struggle to deliver IT maintenance effectively.
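To make the ‘zoning rules as code’ idea a little more concrete, here is a minimal sketch in Python. The zone names, policy structure and rule format are illustrative assumptions for this post, not any particular cloud provider’s API; the point is that the rule set becomes something you can generate, review and regenerate rather than a pile of hand-maintained firewall entries.

```python
# Minimal sketch: internal segmentation ("zoning") expressed as code so that
# flow-control rules can be generated and reviewed automatically instead of
# being maintained by hand. Zone names, CIDRs and ports are illustrative.

from itertools import product

# Declarative policy: which source zones may reach which destination zones, on which ports.
ZONE_POLICY = {
    ("user-desktops", "app-tier"): [443],
    ("app-tier", "database-tier"): [5432],
    ("management", "app-tier"): [22, 443],
}

# Hypothetical inventory mapping zones to network ranges.
ZONE_CIDRS = {
    "user-desktops": ["10.10.0.0/16"],
    "app-tier": ["10.20.0.0/16"],
    "database-tier": ["10.30.0.0/16"],
    "management": ["10.99.0.0/24"],
}


def expand_rules(policy, cidrs):
    """Expand the zone-level policy into explicit allow rules (everything else denied)."""
    rules = []
    for (src_zone, dst_zone), ports in policy.items():
        for src, dst in product(cidrs[src_zone], cidrs[dst_zone]):
            for port in ports:
                rules.append({"src": src, "dst": dst, "port": port, "action": "allow"})
    return rules


if __name__ == "__main__":
    for rule in expand_rules(ZONE_POLICY, ZONE_CIDRS):
        print(rule)
```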
Our security metrics are important, but so, increasingly, are our technology metrics; how many security leaders get a seat at the table when those are being designed? Something for second-line technology risk managers to consider, perhaps.
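As an illustration of what a security lens on technology metrics might look like, here is a small hypothetical sketch; the inventory fields and the 30-day patch service level are assumptions for the example, not anything standard.

```python
# Illustrative sketch: technology metrics with obvious security relevance,
# computed from a (hypothetical) asset inventory. Field names and the
# 30-day threshold are assumptions for the example.

ASSETS = [
    {"name": "web-01", "end_of_life": False, "days_since_last_patch": 12},
    {"name": "web-02", "end_of_life": False, "days_since_last_patch": 95},
    {"name": "legacy-erp", "end_of_life": True, "days_since_last_patch": 400},
]

PATCH_SLA_DAYS = 30  # assumed service level for applying patches


def technology_metrics(assets):
    """Summarise end-of-life exposure and patch-SLA breaches across the estate."""
    total = len(assets)
    eol = sum(1 for a in assets if a["end_of_life"])
    overdue = sum(1 for a in assets if a["days_since_last_patch"] > PATCH_SLA_DAYS)
    return {
        "pct_end_of_life": round(100 * eol / total, 1),
        "pct_outside_patch_sla": round(100 * overdue / total, 1),
    }


if __name__ == "__main__":
    print(technology_metrics(ASSETS))
```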
Update:
A weekend spent ensuring patches were installed and mitigations were deployed in the face of the WannaCry worm has really highlighted our dependence on technology teams patching effectively in order to deliver good security. I don’t think finger-pointing is useful here, and technology teams can’t be blamed for a worm, but once again the front line of cyber defence was security practitioners pushing technology practitioners to take action.
Many organisations have placed patching within technology teams because poorly tested patches posed a risk to system uptime. That risk to technology delivery was deemed great enough that they were seen as the obvious home for the process. However, we now find that the effectiveness and timeliness of technology patching processes is a significant risk to security delivery. Is it time for patching to move into security? Can we highlight and explain the shared accountability between security and technology so that the priority for patching is maintained in technology teams with competing priorities? Is it time to carve out the security essentials from general technology management? Some interesting conversations coming up, I think.
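One way to make that shared accountability visible is a joint view of outstanding patches broken down by the team that owns each system, measured against severity-based service levels. The sketch below is hypothetical: the severities, SLAs, team names and data are made up for illustration.

```python
# Hypothetical sketch of a shared security/technology view: outstanding patches
# grouped by owning team, checked against severity-based SLAs. Severities,
# SLA values, team names and data are illustrative assumptions.

SLA_DAYS = {"critical": 2, "high": 14, "medium": 30}

OUTSTANDING_PATCHES = [
    {"host": "file-srv-03", "owner": "Wintel team", "severity": "critical", "days_outstanding": 5},
    {"host": "hr-app-01", "owner": "App support", "severity": "high", "days_outstanding": 3},
    {"host": "print-srv-07", "owner": "Wintel team", "severity": "medium", "days_outstanding": 45},
]


def breaches_by_owner(patches, sla_days):
    """Return, per owning team, the hosts with patches outside their SLA."""
    breaches = {}
    for p in patches:
        if p["days_outstanding"] > sla_days[p["severity"]]:
            breaches.setdefault(p["owner"], []).append(p["host"])
    return breaches


if __name__ == "__main__":
    for owner, hosts in breaches_by_owner(OUTSTANDING_PATCHES, SLA_DAYS).items():
        print(f"{owner}: {len(hosts)} patch(es) outside SLA on {', '.join(hosts)}")
```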