Outcome Over Output - Measuring Change in Security Education

“100% of employees completed the annual security awareness training 🥳”, “Yay, we are secure 🎊”… 🙃

Fostering and enabling a positive security culture within an organisation takes a lot of work; it relies on understanding the business well and aligning with the goals of teams, people and leaders. But how do we measure the effectiveness of our efforts within Security Education? Simply tracking training completion rates only tells a small part of the story. This post explores a data-driven approach that goes beyond “outputs” to assess the true “outcomes” of security education and product security enablement initiatives.

Moving Beyond Blame:

We should, and must, advocate for a “restorative, just and learning” culture: focus on understanding the root causes of security incidents and risk, and build systems that minimise dependence on perfect human behaviour, stepping well away from language such as “people are the weakest link”. I was introduced to the Cynefin Framework and like the way it helps us categorise issues as “clear”, “complicated”, “complex” or “chaotic”. This allows for tailored responses, from clear-cut policies for simple problems to fostering collaboration for complex situations. You can learn more about the Cynefin Framework here: https://thecynefin.co/our-thinking/, and there’s also a great paper on The Theory of Change for Complex Systems.

Understanding Barriers and Cognitive Load:

  • Analyse Common Security Behaviour Barriers: Before implementing change, identify what’s preventing employees from adopting secure practices. Common barriers include a lack of understanding, unclear policies, and usability issues.
  • Measure Cognitive Load: It’s worth calling this out specifically. Consider the mental effort required to perform a secure behaviour. Can the ‘task’/‘ask’ be completed at all under the current load for a person or team? Change needs space to occur; can we simplify things to increase the likelihood of adoption?

Measuring Secure Development Practices and Outcomes:

  • Developer-Side Scans and Actions: Track metrics like the number of hardcoded secrets found in code scans. Analyse the “why” behind these occurrences to identify patterns and learning opportunities.
  • Trend of Insecure Coding Issues: Monitor the frequency of insecure coding practices per application or across project codebases. This helps identify areas needing focused tooling, process or education improvements.
  • Tool and Paved Path Advocacy Measurement: What does the take-up look like after the release of new procedures, tools and guidance? What’s the mean time from release to take-up? Are the code review tools easy to use and integrate into the workflow?
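The scan metrics above can be aggregated very simply. Below is a minimal sketch, using entirely hypothetical findings data; in practice the records would come from your own SAST or secret-scanning tool’s export, and the repo and finding names are made up for illustration.

```python
from collections import defaultdict
from datetime import date

# Hypothetical scan findings: (repo, finding_type, date_found) tuples.
findings = [
    ("payments-api", "hardcoded-secret", date(2024, 1, 12)),
    ("web-frontend", "hardcoded-secret", date(2024, 1, 20)),
    ("payments-api", "sql-injection", date(2024, 2, 3)),
    ("payments-api", "hardcoded-secret", date(2024, 2, 18)),
]

# Trend: count of insecure-coding findings per repo per month.
trend = defaultdict(int)
for repo, finding_type, found_on in findings:
    trend[(repo, found_on.strftime("%Y-%m"))] += 1

for (repo, month), count in sorted(trend.items()):
    print(f"{month}  {repo}: {count} finding(s)")
```

A rising count for one repo is a prompt to ask “why” with that team, not a scorecard to rank teams by; the per-month grouping is only there to make the trend visible.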

Measuring Security Behaviour Change:

  • A/B Testing and Behavioural Science: Utilise A/B testing and behavioural science principles to determine the most effective ways to influence secure behaviours. This could involve testing different training delivery methods or messaging approaches.
  • Patching Time: Ah, patching. There is a lot of talk here, and you’ll see many conversations in the community about how achievable patching really is, but nonetheless we should understand and measure it. Measure the average time to patch vulnerabilities, and understand whether your efforts affect it.
  • Security Team Reach Out: Measure which teams and people reach out to security and what sparks that engagement (positive or negative). Incident reporting rates can be included here.
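For the A/B testing point above, a standard two-proportion z-test is one way to check whether a difference between two variants is more than noise. This is a minimal sketch with invented numbers (a hypothetical phishing-simulation campaign where variant A got a training video and variant B an interactive workshop); it is not a prescribed method, just one common statistical approach.

```python
from math import sqrt

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Z statistic for comparing two proportions, using a pooled standard error."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical data: how many recipients reported the simulated phish?
z = two_proportion_z(success_a=42, n_a=400, success_b=68, n_b=410)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests significance at roughly the 95% level
```

The same framing works for any binary behaviour (reported/didn’t report, enrolled in MFA/didn’t); just make sure the two groups are randomly assigned, or the comparison tells you about the groups rather than the intervention.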

Effective Training Delivery:

  • Engagement Metrics and Delivery Channels: Measure engagement with security campaigns across various delivery channels (e.g., face-to-face, webinars, LMS, blogs). Track the basics, such as completion rates, view times and click-through rates, to assess each channel’s effectiveness.
  • Learning Through Discovery: Move from lecture-style training to interactive formats that encourage discovery and problem-solving. By guiding teams to find solutions and understand the impact on security, you leverage the “original idea” principle to increase ownership and adoption.
  • Goal Setting: Clearly define the primary goal of each campaign (e.g., knowledge gain, behaviour change).
  • Measurement Methods: Select measurable and verifiable methods for assessing impact, such as training data, security logs, or employee surveys. Learning Management Systems should have built-in real-time feedback mechanisms, as should workshops and live events; don’t just send a survey at the end.
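The channel engagement metrics above reduce to a few simple ratios. A minimal sketch, with hypothetical per-channel numbers (the channel names and counts are made up):

```python
# Hypothetical campaign stats per channel: audience reached, completions,
# and clicks on a follow-up resource.
channels = {
    "webinar": {"reached": 180, "completed": 120, "clicked_followup": 35},
    "lms":     {"reached": 900, "completed": 810, "clicked_followup": 90},
    "blog":    {"reached": 450, "completed": 300, "clicked_followup": 60},
}

for name, s in channels.items():
    completion = s["completed"] / s["reached"]
    ctr = s["clicked_followup"] / s["reached"]
    print(f"{name:8s} completion {completion:.0%}  follow-up CTR {ctr:.0%}")
```

These are still “outputs” and “outtakes” in the terminology below, not outcomes, but comparing channels side by side helps you decide where delivery effort is worth it.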

SMART Criteria for Metric Selection:

The SMART criteria for selecting metrics have worked well across more than just cybersecurity. https://www.tableau.com/learn/articles/smart-goals-criteria

  • Specific: Does the metric align with a specific security goal?
  • Measurable: Can the metric be quantified or demonstrate progress?
  • Actionable: Can the results inform future security initiatives?
  • Relevant: Does the metric address your organisation’s security risk profile?
  • Time-Bound: Can the metric be measured at different points in time?

Measuring Communication Effectiveness in Security and Privacy

I strongly suggest reading Melanie Ensign’s post here on measuring communication: https://medium.com/discernible/measuring-communication-effectiveness-in-security-privacy-f72a90c8334a. Below are some short outtakes, some of which have already been mentioned above. For security communication in particular, Melanie and the team at Discernible are exceptional, with great blog posts, resources and a newsletter.

Who ➡️ Says what ➡️ To whom ➡️ With what effect?

  • Output: the number of communication artefacts produced and/or distributed. It’s a measure of what the organisation does rather than its impact.
  • Outtakes: measurement and analysis of how stakeholders received your communication such as awareness, recall, understanding, and retention.
  • Outcomes: the effect, consequence, or impact of communication activities, ultimately representing the perspective of stakeholders with a quantifiable change in attitude or behaviour.

Institute for Public Relations Communicators Guide to Research, Analysis and Evaluation: https://instituteforpr.org/wp-content/uploads/IPR-Guide-to-Measurement-v13.pdf

Measuring Security Culture Maturity:

This is a topic for a full post on its own; understanding and finding ways to measure “culture” is complex. Security culture maturity requires metrics that go beyond participation rates. We need to understand employee perceptions of security and the overall security culture within the organisation (from top to bottom). What influences it? Is there space for change, or ways to adapt and work with the existing cultures across the business and teams?

I like the Cyber Security Culture guide from Cygenta: https://www.cygenta.co.uk/cyber-security-culture-guide .


There is no one-size-fits-all: the metrics available, and what matters to each person, team and company, will differ, so everything needs adjusting, and that’s why generic blog posts like this one can fall short. For me, this is a guide with key elements to remember and not lose sight of. Shallow metrics don’t reveal risk. By focusing on the “why” behind security incidents and behaviours, we can gain a real understanding of what we can change, whether we can change it, and how effective we really are.
