The Whole and Its Parts
Why complex isn't just more complicated and how to deal with complexity.
Is a mechanical clockwork complicated? Or is it complex? And how about the human brain? Is it complex or complicated? Or maybe even both? When we don't (immediately) understand things, we call them complicated, and very complicated things we call complex. But is this actually accurate, or is our sense of language misleading us here?
Let us first approach this question from familiar territory. The boundary between the obvious or simple and the complicated is the horizon of one's own knowledge. Whether someone sees something as obvious or complicated always depends on their knowledge. A mechanical clockwork is undoubtedly complicated for me, but not for a watchmaker. It is probably the other way around with computers: less complicated for me than for the watchmaker. Therefore, the boundary between these two categories runs between the “known knowns” and the “known unknowns,” as Dave Snowden puts it in his Cynefin framework [1]. A complicated situation can be understood in principle, and there are answers. Still, they may not always be the standard answers or best practices but may require a more profound expert analysis to find them.
If my car suddenly starts running erratically, I first go through the possibilities obvious to a layman: Does it still have enough fuel? Do all the tires have enough air? If none of this solves the problem, my knowledge and experience are insufficient, and I must see a mechanic. If my computer runs slowly, I go through the list of processes, kill one or two of them, or restart the computer (the best practice par excellence with Windows for decades). But if none of that helps, I need the advice of an expert (or get a Mac).
But how do I recognize that I have reached my wits' end and should consult an expert? How do I know what I know or don't know? And can I admit to myself that a situation is beyond my capabilities? These are tricky questions that humans are reluctant to answer honestly.
We tend to overestimate ourselves dramatically based on an initial basic understanding. Our incompetence is fundamentally unconscious, as David Dunning explains [2]: “If you are incompetent, the skills you need to give a correct answer are the very skills you need to recognize what a correct answer is.” This Dunning-Kruger effect, named after its two discoverers [3], encourages fruitless trial and error with supposed panaceas in situations that have long since required a more profound analysis by real experts.
Even though the two terms are often used interchangeably in everyday language, complicated is not simply the little sister of complex. The difference between complicated and complex is as fundamental as that between a Ferrari and the rainforest. The Ferrari is a complicated device that can be disassembled, and—this is crucial—it can be understood analytically via the individual functions of its components. The behavior of a Ferrari is deterministic. When I press the accelerator pedal, the relevant components interlock in a planned way and accelerate the car. If this is not the case, it is broken and needs to be repaired. Essentially, the whole is the sum of its parts. The behavior of a complicated system is predictable and unsurprising [4].
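The contrast can be made tangible in a few lines of code. Here is a minimal sketch of a “complicated” system, with the Ferrari reduced to a chain of components (all names and numbers are invented for illustration): analyze each part, compose them, and the behavior of the whole follows without surprises.

```python
def engine_torque(throttle: float) -> float:
    """Torque rises deterministically with throttle position (toy linear model)."""
    return 250.0 * throttle                                   # Nm

def transmission(torque: float, ratio: float = 3.5) -> float:
    """The gearbox scales torque by a fixed, known ratio."""
    return torque * ratio

def acceleration(wheel_torque: float, mass: float = 1500.0, radius: float = 0.3) -> float:
    """Force = torque / radius, acceleration = force / mass."""
    return (wheel_torque / radius) / mass

# Same input, same output, every time: the whole is the sum of its parts.
for throttle in (0.2, 0.5, 1.0):
    a = acceleration(transmission(engine_torque(throttle)))
    print(f"throttle {throttle:.0%} -> acceleration {a:.2f} m/s^2")
```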
The rainforest also consists of many individual elements and influencing variables. Still, the dynamics of these elements with each other are entirely different from those of the Ferrari. And that is precisely what makes the difference. It gets complex when the quantity, arrangement, and relationship of largely autonomously acting system elements are constantly in flux [5]. The behavior of such a complex system can no longer be explained by its components; instead, patterns emerge from their interaction [6]: “Its performance is never equal to the sum of the actions of its components; rather, it is a function of their interactions.”
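How differently a complex system behaves can be shown with an equally small simulation. The following sketch borrows the idea of Thomas Schelling's segregation model (my choice of illustration, not taken from the text above): every agent follows one trivial local rule, yet the grid develops large-scale clusters that no individual rule ever mentions.

```python
import random

random.seed(0)
SIZE, EMPTY_SHARE, SIMILAR_WANTED = 20, 0.2, 0.5

def make_grid():
    cells, weights = ["A", "B", None], [0.4, 0.4, EMPTY_SHARE]
    return [[random.choices(cells, weights)[0] for _ in range(SIZE)] for _ in range(SIZE)]

def unhappy(grid, r, c):
    """One trivial local rule: move if fewer than half of my neighbors are like me."""
    me = grid[r][c]
    if me is None:
        return False
    neighbors = [grid[(r + dr) % SIZE][(c + dc) % SIZE]
                 for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]
    occupied = [n for n in neighbors if n is not None]
    return bool(occupied) and sum(n == me for n in occupied) / len(occupied) < SIMILAR_WANTED

grid = make_grid()
for _ in range(50):                                   # let the interactions play out
    movers = [(r, c) for r in range(SIZE) for c in range(SIZE) if unhappy(grid, r, c)]
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE) if grid[r][c] is None]
    for r, c in movers:
        if not empties:
            break
        er, ec = empties.pop(random.randrange(len(empties)))
        grid[er][ec], grid[r][c] = grid[r][c], None   # the vacated cell opens up
        empties.append((r, c))

# The macroscopic pattern: visible clusters, although no rule says "form clusters".
print("\n".join("".join(cell or "." for cell in row) for row in grid))
```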
The Cynefin framework describes complex situations as the area of “unknown unknowns” [7]. In other words, we do not know what we do not know and must first explore what there is to know. Cause and effect can only be understood retrospectively and macroscopically based on behavioral patterns, not analytically and microscopically from the individual elements. Although neuroscientists understand the structure and functioning of the brain very well, they are no better than I am at predicting thoughts or actions. Psychological experiments can nevertheless demonstrate various thought patterns on a macroscopic level, such as the Dunning-Kruger effect mentioned earlier.
Therefore, the boundary between complicated and complex runs between predictable and surprising, static and dynamic, and ultimately between dead and alive. Dealing successfully with complexity requires a change of method from analysis to empiricism. Behavior and causalities can no longer be determined analytically, i.e., by breaking them down into components. They can only be explored, understood, and described using suitable hypotheses and experiments that verify or falsify them.
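What such an empirical procedure looks like in its barest form can be sketched as a loop (everything here is illustrative; `run_experiment` stands in for any real-world probe, and the response rates are invented): formulate a hypothesis, probe the system repeatedly, and keep only what the observations support.

```python
import random

random.seed(42)

def run_experiment(hypothesis: dict) -> bool:
    """Stand-in for a real probe: the system's true response is unknown to us."""
    return random.random() < hypothesis["hidden_response"]    # hidden ground truth

hypotheses = [
    {"name": "shorter release cycles help", "hidden_response": 0.7},
    {"name": "more upfront planning helps", "hidden_response": 0.2},
    {"name": "cross-functional teams help", "hidden_response": 0.6},
]

for h in hypotheses:
    observations = [run_experiment(h) for _ in range(20)]     # repeat the probe
    rate = sum(observations) / len(observations)
    verdict = "keep" if rate >= 0.5 else "falsified"          # verify or falsify
    print(f"{h['name']}: observed {rate:.0%} -> {verdict}")
```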
Complicated projects are carried out by and with people. This human factor quickly turns a technically challenging problem into a complex situation. Because people are involved and affected, a project is almost always complicated and complex at the same time.
As companies generally consist of many people and functional units and make products for global markets, which in turn consist of many players and are subject to many influencing factors, the vast majority of situations will be complex or at least have complex components to a significant extent. Comprehensive analysis is, therefore, only of limited help. Instead, an empirical approach is needed to validate or falsify ideas quickly in the real environment.
This is the sweet spot of agility with its short feedback loops. Product increments are delivered at short intervals and ideally tested on the market immediately to better understand what works and what doesn't. However, learning and improvement are not just about the what but also the how. Agility is, therefore, characterized by the continuous improvement of collaboration by the people involved. With these two learning loops, agility offers a sound basis for dealing with complexity in terms of collaboration between people on the one hand and in terms of interaction with the market and customers on the other.
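A toy model can make the two loops explicit (all names and the “market” rule below are invented for illustration): the first loop tests each increment against feedback and decides what survives; the second adapts the way of working after every iteration.

```python
def market_feedback(increment: str) -> bool:
    """Stand-in for a real market test of one product increment."""
    return "simple" in increment          # toy rule: the market happens to like simple things

backlog = ["complex dashboard", "simple export", "simple onboarding"]
practices = {"batch_size": 3}             # one aspect of *how* the team works

shipped = []
for sprint, idea in enumerate(backlog, start=1):
    works = market_feedback(idea)         # learning loop 1: does the *what* work?
    if works:
        shipped.append(idea)
    elif practices["batch_size"] > 1:     # learning loop 2 (retrospective): adapt the *how*
        practices["batch_size"] -= 1      # e.g., smaller batches to get feedback sooner
    print(f"sprint {sprint}: {idea!r} -> {'ship' if works else 'discard'}, "
          f"batch size {practices['batch_size']}")

print("shipped:", shipped)
```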
When the Method Becomes the Problem
Abraham Maslow wrote in 1966, “It is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail” [8]. This tendency to apply tools and methods to any problem, no matter how unsuitable, simply because they are available is also known as the “law of the instrument” or simply “Maslow's hammer.” Combined with the Dunning-Kruger effect mentioned in the previous section, it also explains many aberrations in one agile transformation or another, such as committees that suddenly organize their work in sprints even though they are neither a team nor working on a joint product.
However, Maslow's hammer is not a problem for laypeople; it was meant to describe a blind spot of experts. This blind spot becomes painfully noticeable when dealing with complex issues in areas where experts primarily work on complicated things. Engineering, for example, is mainly concerned with building something complicated. Accordingly, the approach is analytical: experts break down the problem, analyze the parts and various aspects, and then put the solutions together. This is how factories, cars, and airplanes are built.
Regular airplanes, at least.
The art of aircraft construction was already well-advanced by 1976. That year, the Concorde became the first supersonic passenger aircraft to commence regular flight operations. It carried its passengers from London or Paris to New York at more than twice the speed of sound in a record time of 3 to 3.5 hours—twice as fast as before. However, a wholly different and, at first glance, much simpler challenge of aircraft construction was still unsolved at this time.
The year 1959 did not only see the start of preliminary development work for the Concorde in France and Great Britain. That same year, the British industrialist Henry Kremer donated a prize of 5,000 British pounds for the first human-powered aircraft that could fly a horizontal figure eight around two posts half a mile (about 805 meters) apart within 8 minutes. These were the rules of the Kremer Prize. In 1967, Kremer doubled the prize money and finally increased it to 50,000 British pounds in 1973. Despite this impressive sum, equivalent to around 780,000 US dollars in today's purchasing power, many teams failed to solve this problem over the years [9].
The American physicist Paul MacCready had earned his doctorate studying atmospheric turbulence and was a passionate glider pilot, but not an aircraft engineer. His only experience came from building indoor model airplanes in his youth and hang gliders with his sons. In the summer of 1976, he was 100,000 US dollars in debt because he had guaranteed a loan for a friend's failed start-up. At the exchange rate of the time, this sum corresponded almost precisely to the 50,000 British pounds of the Kremer Prize, which is why Paul MacCready became interested in the problem of human-powered flight.
Lacking prior knowledge of the “right” way to design airplanes and lacking the budget for a large team of experts and expensive equipment, Paul MacCready did not spend much time analyzing and planning like the other professional teams. Having studied the flight of vultures during his summer vacation, he came up with the idea of trying his luck with a lightweight “model aircraft” with an enormous wingspan of 29 meters, about the size of a DC‑9. Within just two months, the first version of the Gossamer Condor, consisting of aluminum tubes, wire ropes, and rigid foam covered with a polyester film, was ready for a test flight. This ended—like so many afterward—with a crash.
But that was precisely the point.
The Gossamer Condor was the naïve work of an amateur who did not care how professionals constructed airplanes according to the state of the art at the time. That state-of-the-art technology led the competitors to very nice-looking and relatively fast airplanes. Still, it also made them quite complex and heavy: too heavy to be kept aloft by human muscle power for any length of time. The real competitive advantage of Paul MacCready's design was not its lightness or other technical refinements but the fact that the Gossamer Condor was simple to build and repair. This allowed the team to learn from failures more quickly than the competition.
The success of this tactic was not long in coming. Within a few months, the small team around Paul MacCready overtook the competition, improving the Gossamer Condor from failure to failure to such an extent that, with professional cyclist Bryan Allen as pilot, they finally managed to fly the figure eight around the two posts half a mile apart in a relatively leisurely 7 minutes and 27.5 seconds on August 23, 1977. And just two years later, on June 12, 1979, the same team crossed the English Channel with the Gossamer Albatross, the successor to the Condor, and was awarded the second Kremer Prize, worth 100,000 British pounds [10]. Paul MacCready was out of debt and went down in the annals of aviation.

The professional teams before him had followed the rules of engineering. If this art had made supersonic flight and the landing on the moon possible, then surely it could also solve this seemingly simple problem. However, the problem was more complex than it first appeared. The same analytical approach that had continually improved aircraft construction over the decades could not cope with the complexity of human-powered flight.
Paul MacCready followed his instincts and approached the problem more empirically (also for lack of alternatives) than his more analytical competitors. He concentrated his minimal resources on the essentials and omitted everything else. The airplane didn't have to be fast or nice-looking; it just needed a large wingspan for a lot of lift with as little weight as possible because human muscle power was the limiting factor. To learn quickly from experiments and try out modified designs, it had to be simple to build and easy to repair, as Greg McKeown summarizes the approach, quoting a lecture by Paul MacCready at MIT [11]:
The real challenge was not to build an elegant plane that could fly the figure eight around the two posts on the field but to develop a large, lightweight plane, “no matter how ugly it is,” that could be “repaired, modified, changed and redesigned again after a crash—and quickly.” At that moment, he suddenly realized: “There's an easy way to do it.”
Anyone who has spent many years of their education and professional life successfully using a hammer on different types of nails will find it challenging to recognize the screw and change the tool. At first glance, the Kremer Prize appeared to be a merely complicated problem—or at least the experienced engineers took it for one. In reality, however, its complexity was dominated by a single constraint: the aircraft had to be powered solely by the muscle power of one person. It was only through the empirical approach of trial and error that Paul MacCready, a layman, succeeded where the experts before him had failed for so long.
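The arithmetic behind this advantage is simple enough to write down (the numbers are invented; this is my back-of-the-envelope model, not MacCready's): what counts is not the progress per attempt but the number of attempts a design affords.

```python
def design_points(days: int, days_per_attempt: int, gain_per_attempt: float) -> float:
    """Total learning: each crash-and-repair cycle is one attempt."""
    return (days // days_per_attempt) * gain_per_attempt

season = 180  # days of development time

# A polished, state-of-the-art build: a month per iteration, big gains each time.
polished = design_points(season, days_per_attempt=30, gain_per_attempt=10.0)
# A crude but easily repaired build: three days per iteration, modest gains each time.
scrappy = design_points(season, days_per_attempt=3, gain_per_attempt=2.0)

print(f"polished team: {polished:.0f} design points")   # 6 attempts  -> 60
print(f"scrappy team:  {scrappy:.0f} design points")    # 60 attempts -> 120
```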
1. David J. Snowden and Mary E. Boone, “A Leader's Framework for Decision Making,” Harvard Business Review, November 2007, https://hbr.org/2007/11/a-leaders-framework-for-decision-making.
2. Errol Morris, “The Anosognosic's Dilemma: Something's Wrong but You'll Never Know What It Is (Part 1),” Opinionator (blog), June 20, 2010, https://opinionator.blogs.nytimes.com/2010/06/20/the-anosognosics-dilemma-1/.
3. Justin Kruger and David Dunning, “Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments,” Journal of Personality and Social Psychology 77, no. 6 (1999): 1121–1134, https://doi.org/10.1037/0022-3514.77.6.1121.
4. Snowden and Boone, “A Leader's Framework for Decision Making.”
5. Mark Lambertz, Die intelligente Organisation: Das Playbook für organisatorische Komplexität, 2nd ed. (Göttingen: BusinessVillage, 2019), 40f.
6. Russell L. Ackoff, “Systems Thinking and Thinking Systems,” System Dynamics Review 10, no. 2–3 (June 1, 1994): 180, https://doi.org/10.1002/sdr.4260100206.
7. Snowden and Boone, “A Leader's Framework for Decision Making.”
8. A. H. Maslow, The Psychology of Science: A Reconnaissance, Gateway Edition (Harper & Row, 1966).
9. “Kremer Prize,” Wikipedia, February 9, 2024.
10. “Kremer Prize,” Wikipedia.
11. Greg McKeown, Effortless: Make It Easier to Do What Matters Most, 1st ed. (New York: Currency, 2021), 126.