Overcomplicated

Published in IEEE Spectrum Magazine, Jan 2017


I had enjoyed Samuel Arbesman’s previous book, “The Half-Life of Facts,” which discussed the exponential pace of change as exemplified by Moore’s Law.  When I saw the title of his recent book, “Overcomplicated,” I immediately assumed it would be a warning that we technologists had gone too far in creating complex systems, and that it would advocate moving to simpler ones, just as an overweight person might be advised to go on a diet.  I was prepared to argue against such a conclusion, but as I subsequently discovered, Arbesman does not say that complexity is necessarily bad or that we should seek simplicity.  In short, he maintains that systems are now unknowably complex, that it will get worse, and that we should … just get over it.

Much of the book is spent discussing why complexity is inevitably increasing.  Arbesman writes that “almost everything we do in the technological realm seems to lead us away from elegance and understandability, and toward impenetrable complexity and unexpectedness.”  He cites three main reasons for increasing complexity – accretion, interconnection, and edge cases.  Accretion is the result of large systems being built on top of smaller and older systems, often by the incorporation of legacy code, producing what we call “kluges.”  When these subsystems become interconnected, the entanglement can change what was merely complicated into something truly complex.  Finally, complexity is exacerbated by the inevitable existence of edge cases – the myriad exceptions and rarities that constitute the long tail of cases which, though individually negligible, must all be accounted for in system design.  The complexity resulting from these factors has passed a tipping point where no single person can fully understand a complete system.

At this point in reading the book, I’m thinking that this is all familiar territory to us engineers.  The question is – what do we do about it?  I begin to feel let down.  Arbesman passes quickly by our main strategies for managing complexity – abstraction of subsystems to hide the complexity of lower levels, and “good hygiene” in coding.  These are good things to do, but they won’t solve the ultimate problem.  So what else is there?

Arbesman argues for two approaches.  The first is that we need to create more generalists – ideally “T-shaped” people who pair a deep specialty with a more superficial knowledge of a broad area.  This has become particularly important now that most systems involve divergent fields of specialty.  However, he does acknowledge that generalists are relatively useless working alone, unaccompanied by specialists.  Moreover, the market does not at present support generalists.  Obviously, it was easier to be a Renaissance man in the Renaissance!

The other approach, which is dealt with at some length, is that we need to think more like biologists than physicists.  A physicist would be inclined to attempt a mathematical analysis of the system, hoping for an elegant solution that explains and predicts behavior.  A field biologist, on the other hand, faced with the overwhelming complexity wrought by evolution, would work bottom up, cataloging the appearance and behavior of the creatures he finds, hoping perhaps to identify a new species.  I’ve been thinking about this biological approach, but I’m not sure how much insight it is likely to provide.  Some of our bugs do represent new species, like the “GOTO” trap in programming, but the overwhelming majority might be one-offs, with little general relevance.

I agree with Arbesman’s final conclusion – that we should celebrate the functionality and sophistication of our creations.  Our achievements will necessarily move beyond the understanding of a single human.  After all, evolution has created us, with astounding functionality and resilience, but inevitably with esoteric fragilities.  Perhaps it will be the same in technology.