People are not the problem in cyber-security

November 5, 2019

Those of us older than a certain age will recall an excellent British television series, Yes, Minister, and its successor, Yes, Prime Minister: they were required viewing for young and enthusiastic public servants in Canberra.

One of the more memorable episodes, ‘The compassionate society’, involved a hospital with no patients, though it did have 500 administrators and ancillary workers to ensure things ran smoothly. Not having patients both prolonged the life of the facility and cut running costs, ensuring it was one of the best-run hospitals in Britain.

Unfortunately, we find the same attitudes in other walks of life. The ‘socio-technical divide’ is a well-known phenomenon in technology. If only people weren’t involved, or just did what we told them, we’d have perfect systems.

That also reflects a lot of thinking and commentary around cybersecurity—‘people are the problem’. After all, most intrusions and attacks start with people being persuaded or misled into visiting disguised or infected sites, handing over details or otherwise compromising their own systems.

One estimate is that 94% of malware is delivered via email—phishing—which requires someone to preview or open a message, or to click on a link or file. If only people—users, clients, members of the community—didn’t do what people naturally do, we’d all have much more secure and efficient systems.

That’s muddled thinking.

First, people—thankfully—are messy. They’re changeable. They work in a variety of different styles, with different information. They desire both privacy and openness—mediated on their own terms, and with people whom they get to choose.

They’re impatient, focused, curious, biased, distracted, inspired and sometimes plain lazy. They’re driven by motivations that generally have little to do with the incentives of the engineers and managers who build and oversee systems and data.

They’re also the reason why we build technologies, systems, organisations and institutions to start with—to enable, support, help and entertain. We need more than cybersecurity specialists; we need good people to conceive, construct and care for good, adaptable, human-centred, secure, resilient systems that account for the people who use or are supported by them.

Technology is inherently reductionist: that it fails to capture the complexity of humans and their systems should be unsurprising. Despite attempts to map ‘personas’ or ‘life journeys’, technological systems will always struggle to keep up—there’s no universal, singular human experience.

The fact that systems and system design either neglect or fail to accommodate the messiness of people and their underlying needs is fundamentally a human concern. That’s even more so as those same technologies become increasingly embedded in our daily lives.

Second, the line that ‘people are the problem’ usually applies to the faceless masses. It tends to overlook that systems designers, engineers, managers, vendors and government ministers are also people and no less prone to the same messiness, incomplete decision-making, changeability and biases as anyone else.

They make mistakes, too. They respond to their own set of incentives, not necessarily others’, or society’s for that matter. And they tend to build systems that favour their own position and point of view.

Third, the systems themselves that people design, approve and administer are no less flawed. Moreover, there’s a case to be made that the resulting misunderstandings, misinterpretations and misconfigurations run so deep in both practice and theory that everything is broken.

The result is that the whole, inevitably, is deeply complex. Complex systems have a variety of properties that make them unpredictable and less tractable to top-down, centralised control.

In such systems, cause and effect are rarely linear. Simple actions may have extraordinary effects, while massive effort may yield little change. Failures may cascade across systems through networks, dependencies and behaviours that were previously unsurfaced, and so unrecognised by managers, administrators and technical staff.

Systems and stakeholders—and their expectations—co-evolve, so that what worked once, or in the last funding cycle, may not work again. And preventive action may easily create increased rigidity and tension in the system. It may be better to allow small failures—analogous to backburning, for example—if they dilute the prospects for a major conflagration or collapse.

In such circumstances, attacking officials for a failure—‘heads will roll’—or blaming the victim does little to encourage staff to learn and to build resilience. Instead, it may well increase the chances of greater systemic failure later. The slow work of building human capability and a systems-level understanding is likely to yield better results than firing staff or micromanaging.

Last, we live in a liberal, Western, free-market democracy. Democracies give preference to individuals and their rights, freedoms and opportunities.

Democracies are being challenged by authoritarian regimes and illiberal thinking, including within the democracies themselves. To many—disillusioned by the failure of democratic government to cope with the turmoil caused by digital disruptions, by inequality, by discrimination and biases, and by failures in corporate governance—a populist, or even an authoritarian or strongman approach, looks attractive.

It offers simple and direct action and solutions. The lack of consultation, and of transparency, is seen not as an impediment but as an enabler. And authoritarian systems are often able, at least initially, to generate more immediate and focused outcomes.

But a fundamental strength of democracies is that, by investing in individuals and giving them the freedom, and the ability, to create, build, prosper and take a large measure of responsibility for their own wellbeing, they build both legitimacy and resilience that authoritarian societies lack.

Increasingly, that’ll include their online activities as well. In a complex system, government has little hope of ensuring safety and security for all.

Technological fixes will be quickly overtaken; social fixes, including legislation, risk impairing people’s ability to look after themselves; and both, imposed with little understanding or consultation, will erode trust, legitimacy, resilience and the government’s own ability to enact change.

The discussion paper released by the Department of Home Affairs on a new cybersecurity strategy raised a multitude of questions—literally. That reflects the sheer complexity of the problem at hand. And if everything is indeed broken, reaching for quick technological or even legislative fixes is likely to add further complexity, increase fragility, increase insecurity and impair resilience.

And so, rather than assuming that ‘fixing’ cybersecurity and improving safety are matters for the central government—which has stretched resources and fewer levers than it once had to enact change—decision-makers would do well to place citizens at the centre of the challenge, and to ask themselves, ‘How can we best help our people to help themselves?’

Now more than ever is the time to double down on our democratic impulses, rather than seeking false shelter in an omnipotent government.

This article was published by The Strategist.