The training fallacy
Every year, employees across thousands of organisations sit through security awareness training. They learn not to click suspicious links, not to share passwords, and not to plug in unknown USB drives. They pass a quiz. They receive a certificate. The compliance team records the completion.
And very little changes.
This is not because the training is wrong. The advice is sound. It is because training — especially annual, module-based training — is a remarkably weak mechanism for changing behaviour. Research in organisational psychology consistently shows that knowledge alone does not drive behaviour change. People already know they should not click phishing links. The problem is not knowledge. It is environment, incentives, and systems.
Security culture is not an awareness problem. It is a design problem.
What we mean by culture
Culture, in the organisational sense, is the set of implicit norms that govern how people actually behave — as distinct from how policy says they should behave. It is the aggregate of thousands of small decisions made when no one is watching and no policy specifically applies.
A strong security culture is one where engineers consider security implications during design reviews without being prompted. Where a developer who notices a suspicious log entry escalates it immediately rather than assuming someone else will. Where "we should think about the threat model" is a normal part of technical discussion, not an interruption of it.
A weak security culture is one where security is perceived as someone else's responsibility — the security team's problem, the compliance team's burden. Where security controls are seen as obstacles to velocity rather than enablers of trust. Where incidents are met with blame rather than learning.
The difference between these two states has almost nothing to do with what people were told in a training session.
Systems over sermons
If you want to change security behaviour, change the systems that shape behaviour. Here are five mechanisms that are more effective than any training module.
1. Make the secure path the easy path
The single most powerful principle in security culture is this: the default behaviour should be the secure behaviour. If developers have to go out of their way to do the right thing, many will not — not because they are negligent, but because they are busy, under pressure, and optimising for the most immediate problem.
Embed security into the tools and workflows engineers already use. Pre-configured CI/CD pipelines that include static application security testing (SAST) and dependency scanning. Infrastructure-as-code templates that enforce encryption, least-privilege IAM roles, and network segmentation by default. Secret management tools that are easier to use than hardcoding credentials.
When a developer pushes code, the pipeline should automatically check for vulnerabilities, exposed secrets, and misconfigured infrastructure. The feedback should be immediate — in the pull request, not in a quarterly report. The secure path should require less effort than the insecure one.
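A pipeline gate of this kind can be sketched in a few lines. The snippet below is a minimal illustration, not a real scanner integration: the finding fields ("rule", "severity", "file") and the severity threshold are assumptions standing in for whatever your SAST or dependency-scanning tool actually emits.

```python
# Minimal sketch of a CI gate that fails the build on blocking findings.
# The finding shape and severity names are assumptions, not a real tool's API.
BLOCKING = {"critical", "high"}

def gate(findings):
    """Return the findings that should fail the pipeline.

    `findings` is a list of dicts as a scanner might emit, e.g.
    {"rule": "hardcoded-secret", "severity": "high", "file": "app.py"}.
    """
    return [f for f in findings if f.get("severity", "").lower() in BLOCKING]

def main(findings):
    blockers = gate(findings)
    for f in blockers:
        # Immediate feedback in the pull request, not a quarterly report:
        # print where and why, in a form suitable for a PR comment.
        print(f"BLOCKED {f['file']}: {f['rule']} ({f['severity']})")
    # A non-zero return maps to a failing CI step, so merging insecure
    # code requires more effort than fixing it.
    return 1 if blockers else 0

if __name__ == "__main__":
    demo = [
        {"rule": "hardcoded-secret", "severity": "high", "file": "app.py"},
        {"rule": "weak-hash", "severity": "low", "file": "util.py"},
    ]
    print("exit code:", main(demo))
```

The point is the shape of the feedback loop: the check runs on every push, the result lands in the pull request, and the build fails rather than filing a ticket somewhere.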
2. Integrate security into existing rituals
Every engineering team has rituals: sprint planning, design reviews, retrospectives, on-call rotations. Security should be woven into these existing ceremonies rather than bolted on as separate meetings.
Add a "security considerations" section to your design document template. Not as a checkbox, but as a prompt: What are the trust boundaries in this design? What data crosses them? What happens if this component is compromised? When teams see these questions in every design review, they become habitual.
Include security metrics in engineering dashboards. Mean time to remediate vulnerabilities alongside deployment frequency and incident response time. When security data sits next to performance data, it is treated with the same seriousness.
Incorporate security items into sprint planning. Vulnerability remediation, dependency updates, and access reviews should compete for prioritisation alongside feature work — not be relegated to a separate backlog that is perpetually deprioritised.
3. Blameless incident culture
How an organisation responds to security incidents — and near-misses — is the most visible signal of its security culture. If incidents are met with blame, people stop reporting them. If near-misses are ignored, they become incidents.
Adopt a blameless post-incident review process. The goal of the review is to understand what happened and why the system allowed it to happen — not to identify who is at fault. This is not about absolving responsibility. It is about recognising that in complex systems, incidents are usually the product of systemic conditions rather than individual failures.
When an engineer accidentally exposes a credential in a commit, the blameless question is not "why did they do that?" but "why did our system allow a credential to reach a public repository without detection?" The answer leads to better controls — pre-commit hooks, secret scanning, automated revocation — rather than better memos about being careful.
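A pre-commit secret check can be surprisingly small. The sketch below shows the idea only; the patterns are illustrative assumptions, and production tools such as gitleaks or detect-secrets use far richer rule sets, entropy analysis, and allowlists.

```python
import re

# Illustrative patterns only; real secret scanners maintain much larger
# rule sets plus entropy checks and per-repository allowlists.
SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # shape of an AWS access key ID
    re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
    re.compile(r"(?i)(api[_-]?key|password)\s*=\s*['\"][^'\"]{8,}['\"]"),
]

def scan(text):
    """Return (line_number, pattern) pairs for suspected secrets in `text`.

    In a pre-commit hook, `text` would be the staged diff; any hit blocks
    the commit before the credential ever reaches a repository.
    """
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for pat in SECRET_PATTERNS:
            if pat.search(line):
                hits.append((lineno, pat.pattern))
    return hits
```

Paired with automated revocation for anything that does slip through, this is the kind of systemic control a blameless review tends to produce: the mistake becomes hard to make, rather than the person becoming the fix.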
Organisations with strong blameless cultures report more incidents and near-misses. This is a feature, not a bug. You cannot fix what you cannot see, and you will not see what people are afraid to report.
4. Security champions, not security police
The traditional model — a centralised security team that reviews everything and approves changes — does not scale. It creates a bottleneck, fosters an adversarial relationship between security and engineering, and concentrates security knowledge in a small group.
The champion model distributes security responsibility across engineering teams. Each team designates a security champion — an engineer with an interest in security who receives additional training and serves as the team's security liaison.
Champions do not replace the security team. They extend it. They participate in threat modelling sessions, triage vulnerability findings for their team, and bring a security perspective to design discussions. They are the first point of contact for security questions within their team, reducing the load on the central team and increasing the speed of security feedback.
The champion role should be voluntary, recognised, and resourced. Allocate time for champions to attend training, participate in security community activities, and contribute to security tooling. Recognise their contributions publicly. The role should be seen as a career-enhancing opportunity, not an additional burden.
5. Measure what matters
You cannot improve what you do not measure, and you cannot sustain culture without feedback loops. Define security metrics that reflect actual behaviour, not just compliance status.
Mean time to remediate — how quickly are vulnerabilities fixed after they are identified? This measures operational responsiveness.
Percentage of deployments with security checks — are CI/CD pipelines consistently running security scans, or are teams bypassing them? This measures process adherence.
Phishing simulation click-through rates — not as a gotcha, but as a trend line. Improvement over time matters more than any single result.
Time from vulnerability disclosure to patching — for third-party dependencies, how quickly does the organisation respond to new CVEs?
Incident reporting rate — an increase in reported near-misses is a positive signal, indicating that people feel safe raising concerns.
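Two of these metrics can be computed from data most trackers already hold. The sketch below assumes a particular export shape (datetime pairs for vulnerabilities, a boolean flag per deployment); those shapes are illustrative, not any specific tool's format.

```python
from datetime import datetime
from statistics import mean

def mttr_days(vulns):
    """Mean time to remediate, in days, over closed vulnerabilities.

    `vulns` is a list of (identified_at, fixed_at) datetime pairs; an
    assumed export shape. Open items (fixed_at is None) are excluded here,
    though a real dashboard should surface their age separately.
    """
    closed = [(found, fixed) for found, fixed in vulns if fixed is not None]
    if not closed:
        return None
    seconds = mean((fixed - found).total_seconds() for found, fixed in closed)
    return seconds / 86400  # seconds per day

def scan_coverage(deployments):
    """Fraction of deployments whose pipeline ran the security checks.

    `deployments` is a list of dicts with an assumed boolean
    "security_scanned" flag; values below 1.0 suggest teams are
    bypassing the checks.
    """
    if not deployments:
        return 0.0
    scanned = sum(1 for d in deployments if d.get("security_scanned"))
    return scanned / len(deployments)
```

Trend direction matters more than absolute values here: an MTTR drifting upward, or coverage drifting below 1.0, is the signal worth investigating.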
Review these metrics regularly with engineering leadership. Celebrate improvements. Investigate deterioration. Make the data visible and actionable.
The role of leadership
Culture flows downward. If engineering leadership treats security as a priority — allocating time for remediation, attending security reviews, asking about threat models in design discussions — engineers will follow. If leadership treats security as a tax — something to be minimised and delegated — that message propagates just as effectively.
The most impactful thing a CTO or VP of Engineering can do for security culture is to visibly prioritise it. Approve the sprint time for vulnerability remediation. Attend the post-incident review. Ask "what are the security implications?" in the architecture review. These small acts of attention signal what matters far more effectively than any policy document or training module.
Culture is a practice
Security culture is not a state you achieve. It is a practice you maintain. It degrades when neglected and strengthens with consistent attention. It is built in the design review where someone asks about trust boundaries, in the incident response where blame is replaced with curiosity, in the CI/CD pipeline where security checks run on every commit.
Annual training is not the answer. Systems, incentives, rituals, and leadership attention are. Build the systems that make security the default, and the culture will follow.