The Costly Confusion
The highway can be built. Fewer accidents require something else.
There is a difference between designing a highway system and designing fewer accidents.
The highway system is designable. You can specify it, model it, hand it to engineers, and what emerges will closely resemble what you drew. You can point to the blueprint and say: that is what we are building. The materials are known, the steps are known, the outcome is knowable. This is a complicated problem — hard, expensive, requiring enormous coordination — but ultimately solvable through planning.
Fewer accidents is something else entirely. Accidents happen at the intersection of driver behavior, road condition, weather, fatigue, enforcement, signage, vehicle design, the particular stress someone was under that morning, and a hundred other variables that are never all in view at the same time. You can change the speed limit. You can redesign an interchange. You can add a guardrail. Each of these influences the system. None of them controls it. The moment you start treating “fewer accidents” like a highway — as if you can blueprint your way to it — you have already misunderstood the problem.
This distinction, between the highway and the accidents, is what Complexity Design is about.
We use the word “complex” to mean difficult or elaborate. In complexity science, it means something more specific: a system whose behavior cannot be predicted or designed from its parts. Traffic is complex. A conversation is complex. A team is complex. A city is complex. These systems have emergent properties — behaviors that arise from the interaction of elements that nobody authored and nobody fully controls.
A car engine is not complex in this sense. It is complicated. So are a computer program, an org chart, and a project plan. Complicated things can be difficult to understand, but they yield to analysis. You can take them apart, understand each piece, and predict how they will behave. Complexity resists this. The interesting behavior is precisely what you can’t get to by taking the thing apart.
The confusion between complicated and complex isn’t just a semantic problem. It causes real damage. Organizations hire consultants to solve complex problems with complicated tools — frameworks, matrices, process diagrams — and then wonder why nothing changes. Managers write detailed plans for situations that will require constant improvisation the moment they begin. Designers prototype solutions to problems that are actually ecosystems, and ship things that solve the wrong layer.
One of the axioms I teach is:
Plans are always complicated, execution is always complex.
Think about any project you have been part of. The plan, at the moment it was written, existed in a controlled space. It assumed a stable environment, predictable people, and information that would stay accurate. Then execution began, and the plan met the world. Someone left. A dependency shifted. A stakeholder changed their mind. A user behaved in a way that wasn’t in the brief. This is not a failure of planning. It is the nature of complex systems. The world does not hold still for your blueprint.
This doesn’t mean planning is useless. It means planning and executing are different activities, requiring different orientations. One is about building something in a controlled space. The other is about staying responsive inside a system that is always changing.
Another axiom:
Emergence is the opposite of design.
Design is intention made visible. You have an idea of the outcome, and you build toward it. Emergence is what happens when you don’t — when a system is left to interact with itself, and something appears that nobody specified or predicted. The best team cultures are emergent. So are the best conversations, the most interesting cities, the insights that come from a room of people who are genuinely listening to each other.
You cannot design emergence directly. But you can create conditions for it. This is one of the core skills in Complexity Design: learning to work with conditions rather than outcomes. It means asking not “how do I produce this result?” but “what kind of environment makes this result more likely?” The shift sounds subtle. In practice it changes almost everything about how you work.
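A toy illustration of this, borrowed from outside the essay: Conway’s Game of Life. The rules are a few lines of code — fully complicated, fully specified — yet patterns like the “glider,” a five-cell shape that travels across the grid, appear nowhere in those rules. The sketch below is one minimal implementation of the standard rules, not anything proposed in the text itself:

```python
from collections import Counter

def step(live):
    """Advance one generation. `live` is a set of (x, y) cells."""
    # Count how many live neighbours each cell has.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next generation if it has exactly 3 live
    # neighbours, or 2 live neighbours and was already alive.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# The classic glider: five cells, nothing more.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}

state = glider
for _ in range(4):
    state = step(state)

# After four generations the glider reappears intact, shifted
# diagonally by (1, 1) -- a moving "object" that no rule mentions.
moved = {(x + 1, y + 1) for (x, y) in glider}
print(state == moved)  # True
```

Nobody designed the glider into the rules; the conditions (local interaction, simple thresholds) made it possible, and it emerged. That is the distinction between producing an outcome and creating an environment in which the outcome can appear.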
Then there is the question I put to my students every semester: can an AI system be complex?
The answer the course leads toward is no — and the reasoning is worth sitting with. Complex systems are open-ended, meaning there is always more context, always another variable that wasn’t in the model. A human being navigating a complex situation is never working from a complete picture. They are reading the room, adjusting in real time, bringing the weight of every experience they have ever had to bear on a moment that has never happened before.
An AI system, however sophisticated, is a complicated system — an elaborate stacking of known constructs, trained on a fixed world, operating within the dimensions it was given. It can simulate the outputs of complexity. It can find patterns in complex data. What it cannot do is be inside the open-endedness the way a person is. That is not a gap more processing power will close. It is structural.
This matters not because AI is therefore bad or limited in some dismissive sense, but because the confusion about what AI can and cannot do is itself a complicated/complex confusion. Treating AI as a solution to complex problems is like designing the highway when you need to address the accidents.
Complexity Design is not a methodology. It doesn’t offer a framework you can apply to any situation and get a result. What it offers is a different relationship with what can and cannot be controlled — a set of orientations that make you more useful inside systems that won’t hold still.
The highway can be built. Fewer accidents require something harder: the willingness to stay inside the problem without resolving it too soon, to design conditions rather than outcomes, and to trust that what emerges from genuine complexity is usually more interesting — and more true — than anything you could have drawn in advance.

