Attackers Engineer Behaviour. Defenders Explain It.

Amy Stokes-Waters

Attackers are not doing something fundamentally different to security awareness teams. They are simply better at it. Both are trying to influence behaviour, interrupt habit, and push people away from their default responses. The difference is that attackers openly social engineer, while defenders insist on explaining social engineering as if behaviour changes through understanding alone. Until training accepts that it too is an act of influence, not education, the gap will remain, and attackers will keep exploiting it.

Let's cut the crap. We're going to stop pretending that attackers are doing something mystical, elite, or morally exotic. They are not wizards. They are not mind-control ninjas. They are not practicing a secret dark art in a hoodie-lined cave. What they are doing is far more mundane, and far more uncomfortable for us to admit: they are simply influencing people to do specific things, at specific moments, under specific conditions.

That's it. That's the trick.

Strip away the hacker cosplay and social engineering becomes embarrassingly simple. Get a human to behave the way you want them to behave. Nothing more. Nothing less. It is persuasion with timing, behaviour change with intent, influence applied deliberately. And on its own, it is neither good nor evil. Intention is the only thing doing the moral heavy lifting here.

Attackers use social engineering to influence people to do specific things, at specific moments, under specific conditions. A victim clicks a link while they are on the phone. They reset a password while flustered. They approve a request while pressure is applied and authority is implied. None of this is accidental. It is designed. It is timed. It is contextual. And it works because it targets behaviour, not knowledge.

Security awareness teams are trying to do the exact same thing, whether they admit it or not. They want to influence people to do specific things, at specific moments, under specific conditions. Not click the link. Pause when urgency spikes. Interrupt politeness. Question authority. Report early instead of quietly fixing it. The goal is not understanding in the abstract. The goal is action in the moment.

The only real difference is that attackers are honest about what they are doing, while defenders keep pretending they are just “raising awareness”.

Social Engineering

Attackers understand urgency not as a red banner in a slide deck, but as a physical sensation. They understand authority not as a box on an org chart, but as a tone that shuts down debate. They understand fear, curiosity, embarrassment, politeness, fatigue, and overload because these are not abstract ideas to them. They are levers, and attackers pull them relentlessly and without apology. They know when to sound helpful, when to sound irritated, when to escalate, and when to disappear.

What they don't do is hope someone remembers guidance from six months ago, or care if someone passed a quiz. They design moments that override habit and hijack attention right when it counts.

That is social engineering. And it works. Not because people are stupid. (Please, stop saying that.) It works because attackers design behaviour change into the experience itself. They don't ask people to act, they make people act. Which, perhaps awkwardly, brings us to marketing.

Pressure

Marketing has been using social engineering for decades, and no one, not one single person, is clutching their pearls about it. Click this. Buy that. Limited time only. Last chance. Don’t miss out. Your competitors already have this. Just one more step. Free trial. Cancel anytime. We could go on.

Marketing does not run awareness programmes about persuasion. It does persuasion. It plays with urgency, authority, social proof, scarcity, curiosity, fear of missing out, ego, identity, convenience, and friction intentionally, strategically, and proudly.

No one in marketing says, “We simply hope users calmly reflect on the value proposition and make a rational choice six weeks later.” They design funnels. They design CTAs. They design journeys that nudge behaviour. They social engineer, and you call it good marketing.

Funny how that works.

Marketing

Now look at security training. When we talk about training, awareness, or culture, what we actually want is behaviour change. We want people to stop clicking on autopilot, we want people to interrupt politeness, question authority, slow down when urgency is screaming at them, and report early instead of quietly “fixing it”. That is not education. That is behaviour design. That is social engineering.

But for some unknown reason (seriously, answers on a postcard), we refuse to call it that. Instead, we hide behind softer language: awareness, understanding, learning outcomes, knowledge transfer. We deliver slides. We explain theory. We outline risks. We describe attacker tactics from a safe, emotionally neutral distance, then bolt on a quiz and declare victory. This is, honestly, the least persuasive possible way to change human behaviour.

Meanwhile, attackers are not teaching people about social engineering. They are using it. They do not explain cognitive bias; they trigger it. They do not lecture about authority; they impersonate it. They do not warn about urgency; they manufacture it. And we act shocked every single time.

Cyber security understands behavioural influence perfectly well, in theory. It writes papers about it, quotes psychology, loves behavioural economics, and name-drops decision science like it is at a dinner party. But the moment it has to design training, influence suddenly becomes manipulative, emotion becomes unprofessional, and experience becomes hard to measure. So everyone defaults to what is safe, tidy, and auditable: slides that will not upset anyone, messages that are technically correct and psychologically useless, and training that looks nothing like the situations people actually fail in.

And when it does not work, we blame the human. Obviously.

Training

Here is the bit everyone avoids saying out loud (well, except us, we've been saying it for ages). Awareness programmes do not fail because people do not care. They fail because we refuse to accept the nature of the task. We cannot ask people to override instinct, habit, and social pressure using calm explanations and perfect recall.

We teach social engineering in ways no social engineer would ever use. We describe pressure without applying it, discuss authority without embodying it, and warn about manipulation without ever letting people feel it safely and recognise it in themselves. Then we act surprised when, under real pressure, the theory evaporates. Poof.

Let's cut the crap (again): if you want people to resist social engineering under pressure, you have to social engineer them first. Ethically. Deliberately. Transparently. Not to trick them, but to prepare them. Not to deceive, but to rehearse. Behaviour does not change through explanation. It changes through experience.

Until we can accept that training is itself a form of social engineering, attackers will keep winning. They will keep designing experiences that meet people where they actually are: tired, busy, polite, distracted, not where policies wish they were.

Everyone is social engineering. Marketing knows it. Attackers own it. Mums and dads do it every day. Only security awareness teams are still pretending they are not, while wondering why nothing changes.

And that is the real vulnerability.
