SOME weeks ago, my flight was delayed due to a malfunction of the braking computer. A little later, the plane had to wrestle its way through some turbulence generated by a pre-alpine convection cloud that had eluded the weather radar. While I contemplated my tomato juice, the on-board Traffic Collision Avoidance System (TCAS) purred away down in the intestines of the fuselage, preventing a collision with another aircraft.
Evading the flight path of another plane in a contingency situation is a complex business. And as Samuel Arbesman points out in Overcomplicated, the systems we have built to handle such problems have become black boxes. The rules by which the TCAS functions have evolved over decades and are so massively interwoven today that they escape the understanding of everyone, apart from the ubiquitous yet elusive “handful of experts”.
This challenge is not unique to aviation. It applies to finance, infrastructure, power plants, launch vehicles and other technological systems. They share some key traits: their elements and interactions may make sense individually, but their wider interdependencies are dynamic and unpredictable.
Even if our individual and collective cognitive faculties were up to the task of understanding massive complexity and its emergent behaviour – and they’re not – then there is the question of legacy. Much of what we use today has been designed incrementally and has been operating for a long time. It has been upgraded, patched, repaired and maintained. So, on top of everything else, the insoluble puzzle we have set ourselves is always changing.
The fact that Arbesman uses the term “overcomplicated” in reference to such systems should invite us to listen carefully. He is not someone who didn’t read the manual. A trained computational biologist, Arbesman uses quantitative models to explore the chaos around us. And because the “entanglement” he diagnoses today is more akin to an evolving ecology than a carefully configured and managed machine, Arbesman encourages us to adopt the attitudes and methods of field biologists.
These people delight in anomaly and embrace diversity. They derive intrinsic satisfaction from the observation, description and collection of “bundles of facts”, even if a full picture or generalised model is not immediately evident. By invoking the repertoire of the field biologist, Arbesman argues, we are in a better position to confront our systems.
This practice – observing and making sense of the sometimes contradictory interplay between actors and processes – smacks strongly of another field-based approach: ethnography. And it’s no coincidence that ethnographic field research has become an essential tool in understanding our relationship with technology, and a key player in applied domains as distinct as healthcare design and bespoke defences. Ethnographers have explored and explained institutions ranging from weapons laboratories to particle accelerators, utility regulators and Mars mission control rooms.
Overcomplicated is not an advertisement for ethnography. It does not explicitly address human agency at all. Even systems with evident political dimensions, such as tax law or the Challenger shuttle loss, are understood as technological rather than sociotechnical. This is an uncanny omission: do humans not inadvertently contribute to, passively allow, or even actively promote overcomplication?
Not any more: Arbesman suspects that our tech truly has outgrown us. This is a big claim, and many readers may balk at the idea of discounting the role of humans in how technology works. But at the very least, they will have to concede that the approach is entertaining, and provides us with the necessary external vantage point from which to observe the subtle imperfections and, here and there, the fundamentally flawed logic of our systems.
“Many of our systems were designed incrementally: they are insoluble puzzles that are always changing”
True to the remit of the field biologist, Arbesman stops short of calling for the decommissioning or prevention of overcomplicated systems. One can see why. There is much value to be had in looking at technology through a naive lens. Suggesting how one should actually respond to the burgeoning and powerful machine ecology Arbesman describes is a task for another book.
Still, how we respond is an urgent issue. We face some consequences of complicated tech today – think of the debris lacing Earth orbits, or the world’s stockpiles of nuclear weapons – and these are likely to haunt our political lives for generations, needing more than a field biologist’s inquisitive tinkering and cautious optimism to solve.
To its credit, Overcomplicated gives the reader the tools necessary to make this very argument. The governance of technology, so often an arcane business, is dissected here with aplomb, as Arbesman strings together the key concepts that describe overcomplicated systems. Readers armed with notions of interoperability (the ability of different systems to talk to each other and exchange information), kludgeyness (the relative likelihood that a system fix will cause trouble later) and other arcana can at least begin to argue on equal terms with systems analysts and designers. This is important: I would argue that acquiring fluency in systems-speak is fast becoming a civic duty.
In any event, Arbesman’s freshly elucidated concepts are excellent field tools: they are the translucent sampling containers you take with you as you wade through the glitch-infested shallows of an algorithm; the night vision camera you employ when stalking incompatibilities through the primordial thickets of a code forest; the head torch for abseiling into the crevasses of operating systems; the depth-meter for a cave dive into the murky world of automation. You will only catch a glimpse of what’s going on. But you will begin to discern and respect patterns, and orient yourself in a landscape that would otherwise remain opaque.
The warning implicit in Overcomplicated is clear: if you ignore the intricacies of intractable systems, refuse to engage with the anomalous underbelly of concealed electronic complexity, or fail to attribute due importance to minute but critical parts, then those ever-so-fleeting “edge cases” will sooner or later resurface as freakishly bizarre incidents, and catastrophes as inevitable as they are unanticipated.
This article appeared in print under the headline “When systems go feral”