How to Assess Security Maturity and Make Improvements

Security maturity matters: you wouldn’t ask a small child to ride a bike without training wheels, or to drive a car before their legs could reach the brake pedal. Yet all too often, the public assumes organizations can quickly adopt sophisticated, effective defense-in-depth cybersecurity schemes where no such capabilities existed before. They can’t. Just as every person has to grow up, groups of people working together in organizations have to take their security programs through a maturation process. They have to crawl and then walk before running.

That is why we use maturity models to measure and benchmark clients’ information security maturity during assessment consulting engagements. As described in our Security Assessments, we can assess an entire security program or specific security domains. We calibrate our recommendations against the organization’s maturity level in each area, then knit the recommendations together into a comprehensive security improvement roadmap designed to optimize the client’s future efforts and investments.

Figure: security maturity and cybersecurity maturity model

The figure above displays the maturity model we use. Gratefully acknowledging prior research from Carnegie Mellon, Fred Cohen, and others, we have developed our own descriptions and assessment criteria for the maturity levels.

Using over 400 criteria that we map to ISO or NIST control frameworks, we can go broad or deep in any given maturity assessment. We can develop maturity scores, benchmark a client against peer organizations in the same vertical industry, and track improvement in scores through multiple assessments over time.
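As an illustration only, the sketch below shows one way a criteria-based scoring model like this can be rolled up: criteria are scored, averaged per domain, and then averaged into an overall maturity score. The domain names, weights, and values are hypothetical and are not our actual assessment criteria.

```python
from statistics import mean

# Hypothetical criteria scores (1-5) grouped by security domain.
# A real assessment maps several hundred criteria to ISO or NIST controls;
# these values are invented for illustration.
criteria_scores = {
    "Governance & Policy":      [2, 3, 2, 3],
    "Identity & Access Mgmt":   [2, 2, 3, 2],
    "Network Security":         [3, 3, 2, 3],
    "Vulnerability Management": [2, 3, 3, 2],
    "Monitoring & Response":    [1, 2, 2, 3],
}

# Score each domain as the mean of its criteria, then average the
# domain scores into an overall maturity score.
domain_scores = {d: round(mean(s), 2) for d, s in criteria_scores.items()}
overall = round(mean(domain_scores.values()), 2)

for domain, score in domain_scores.items():
    print(f"{domain:26s} {score:.2f}")
print(f"{'Overall maturity':26s} {overall:.2f}")
```

Running the same calculation after each repeat assessment gives a simple trend line for tracking improvement over time.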

Learn More

The following paragraphs capture our thoughts on the characteristics of each maturity level, in a broad sense.

Initial Maturity Level

Organizations at the Initial level lack formal security policies or functioning security governance, or they score very low in multiple security domains. Their average score comes out between 1.0 and 2.0. Although most organizations score above the Initial level, we still find some there – including large companies, government agencies, and universities.

In today’s cybersecurity climate, scoring at the Initial level is unacceptable for any organization that owns and manages IT assets and owes duties to shareholders, investors, regulators, or taxpayers. On the Security Roadmap for Initial scorers, we recommend a crash program to establish a strong CISO (or similar) position, develop or refresh the security charter, establish security governance committees, and add staff to both security policy and security infrastructure development roles.

Developing Maturity Level

The typical organization at the Developing stage has a functioning security program, some security processes and infrastructure elements operating effectively, and multiple security initiatives under development. But it tends to be weak in basic domains such as network zoning and perimeters (most organizations at this stage still have relatively “flat” or even chaotic networks of IT assets) and to score relatively low on overall identity and access management (IAM). It is also unusual to find good levels of accountability and automation in security processes and technology, or to find advanced vulnerability management, data loss prevention (DLP), or security information and event management (SIEM) technologies operating at the Developing stage.

On the Security Roadmap for Developing scorers, we typically recommend fixing any “glaring gaps” found during the assessment and, for the rest, taking a building-block approach. That is, get the basics right in the early roadmap stages before attempting to fully deploy more advanced capabilities. By the way, other maturity models use the term Repeatable instead of our term, Developing. Can you see how creating repeatable processes and reusable building blocks is critical to security program and infrastructure maturation?

For example, implementing sophisticated DLP and SIEM features, or deploying privileged access management (PAM) comprehensively, tends to be prohibitively difficult until basic IT functions like service ticketing, asset management, and IAM are mature enough to support them. That doesn’t mean you should never start projects like PAM, SIEM, or DLP during the Developing stage, but you have to be extra careful to define a narrow scope for the initial stages of a phased effort.

Defined Maturity Level

Organizations at the Defined stage have established a comprehensive set of organization-wide security processes, policies, and documented technical controls. However, they typically remain over-reliant on individual efforts. Processes such as change management, audit, and supply chain security still need to improve. More work is also needed on increasing role-appropriate security knowledge and awareness, and on advancing the sophistication of security monitoring, analytics, and privileged access management controls.

On the Security Roadmap for organizations at the Defined level, we emphasize audit, change management, advanced security monitoring, and metrics to enhance verification and accountability as well as protection. With basic process and technology infrastructure in place, risk management, vulnerability management, SIEM, PAM, and other programs can go into high gear.

We also encourage organizations to focus even more on metrics; early key performance indicator (KPI) and key risk indicator (KRI) measures introduced during the Initial or Developing stages should now be reviewed, improved, and expanded. Not only do better metrics improve accountability (from the risk and finance perspective), they also help answer a critical question:

Figure: security maturity

Unfortunately, there’s no good framework in the industry to answer this question in a fully general manner, nor much hard data on costs, benefits, or control effectiveness. That’s why organizations have to devise their own consensus approach. For what it’s worth, Security Architects Partners and others believe that something between a high “3” and a low “4” represents the sweet spot – that is, the level up to which most organizations continue to perceive rising marginal returns from improving security program maturity.
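To make the idea of a consensus approach a little more concrete, here is a purely hypothetical sketch: a team agrees on a rough marginal benefit and marginal cost estimate for each half-level of maturity improvement, and sets its target at the last step where the benefit still exceeds the cost. Every number below is invented for illustration; this is not a framework or a recommendation, just one way an organization might structure the conversation.

```python
# Hypothetical consensus estimates: for each half-level step up in maturity,
# the team agrees on an expected marginal benefit (risk reduction, relative
# units) and a marginal cost (effort/investment, same units). Values invented.
steps = [
    # (target level, marginal benefit, marginal cost)
    (2.5, 10, 4),
    (3.0,  9, 5),
    (3.5,  8, 7),
    (4.0,  6, 8),
    (4.5,  4, 10),
]

target = None
for level, benefit, cost in steps:
    if benefit < cost:
        break          # marginal benefit no longer exceeds marginal cost; stop here
    target = level

print(f"Consensus target maturity level: {target}")   # -> 3.5 with these numbers
```

With these invented figures the exercise lands on 3.5, which happens to fall in the high-“3” to low-“4” sweet spot described above; a different organization’s estimates would land somewhere else.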

Managed Maturity Level

Every organization loves to claim its security program is “well-managed,” and organizations that progress into the Managed maturity level can say so with assurance. At level 4, organizations have defined and built a comprehensive set of people, process, and technology controls. However, they remain reliant on manual processes and face challenges sustaining the security program in the face of continuous change in the threat, regulatory, technology, and business landscapes.

Therefore, security roadmaps at the Managed level must focus on increasing automation across infrastructure and processes in order to make the program more cost-sustainable and scalable. For example, complex functions such as vulnerability management and security monitoring can be increasingly orchestrated across hybrid public/private cloud environments, and access management, security monitoring, and DLP can be advanced to support traditional, virtual, mobile, and cloud endpoints. Organizations should also continue working to broaden coverage to additional business units, geographies, and applications.

Optimized Maturity Level

At the Optimized maturity level, organizational security programs (almost) have it all. They have raised the bar on organization-wide security process and technology infrastructure to a high level, including accountability, metrics, and automation. But let’s clarify: not all organizations at this level are so intent on continuous improvement that they’ll keep throwing increased funding and resources at security for security’s sake.

Moreover, organizations that reach the Optimized level face challenges staying there. Sustaining many, if not most, Optimized security programs without unusually high levels of investment requires continuous risk, business, technology, and financial analysis in the face of change.

So what does a Security Roadmap for a level 5 organization look like? It focuses on sustainability and adaptability through continued work on architectural approaches that abstract and future-proof both process and technology interfaces. It seeks to establish an organizational culture that supports continuous improvement in security and risk management-related skills, processes, and technologies.

More on Benchmarking and Industry Perspectives

As part of a maturity assessment, many clients like to benchmark their scores against industry peers. Although we can’t reveal specific clients’ names or scores, we maintain average numerical maturity levels from engagements performed over the years for many clients across several vertical industries. We’ve benchmarked the overall average for all industries and domains at 2.75.

Clients also want recommended numerical targets for improving their maturity scores over a 1-, 2-, or 3-year time frame. As noted above, there is no industry-standard framework or formula for determining the ideal maturity level for a given organization. We use a risk-based approach in consultation with our clients to help them arrive at organization-specific maturity goals. Nevertheless, we believe that the 2.75 overall industry average is much too low.

Financial services firms have been on the front lines of threat and regulatory pressure. In many cases, they’ve had to mature their security programs as a matter of survival. We’ve benchmarked the average large financial services organization at about the 3.5 level. We believe organizations in other industries that score lower – such as government, health care, high tech, manufacturing, and higher education – need to get closer to that 3.5 level. And many of them, along with financial services firms themselves, need to improve beyond 3.5 as the threat and regulatory landscape evolves.
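As a purely illustrative sketch of how benchmarking and target-setting can be combined, the snippet below compares a hypothetical client score against the 2.75 overall average and the 3.5 financial services benchmark cited above, and lays out evenly spaced annual targets toward a goal. The client score and the glide-path logic are invented for illustration; real targets come out of a risk-based discussion, not a formula.

```python
def glide_path(current: float, target: float, years: int) -> list[float]:
    """Evenly spaced annual maturity targets from the current score to the goal."""
    step = (target - current) / years
    return [round(current + step * (y + 1), 2) for y in range(years)]

client_score = 2.4       # hypothetical client overall maturity score
industry_avg = 2.75      # cross-industry average cited above
fin_svcs_avg = 3.5       # large financial services benchmark cited above

print(f"Gap to industry average:        {industry_avg - client_score:+.2f}")
print(f"Gap to financial services peer: {fin_svcs_avg - client_score:+.2f}")
print("3-year targets:", glide_path(client_score, fin_svcs_avg, 3))  # [2.77, 3.13, 3.5]
```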

So – happy maturing! Please check out more information on how we can help with Security Assessments or also with Cybersecurity Readiness Exercises. And contact us if you have any questions about our process, or would be interested in exploring a security assessment engagement.
