Charles Bonini was a business professor who noticed a problem with the way people understand complex situations. When he investigated, he found that it wasn't a problem so much as an unresolvable paradox.

These days, it's popular to compare the brain to a computer. After all, a brain and a computer both work via a series of signals that turn individual components on and off. Combine enough of these signals, and they can perform computations. More complex functions of the brain can be explained by more complex processes in a computer.


Now what about brain injuries? What is a loss of blood circulation in the brain comparable to, or the buildup of fluid that causes brain damage through pressure? Well, the loss of circulation is like a loss of electricity to the computer that somehow damages it permanently. And the buildup of pressure is, I don't know, like someone's butt when they accidentally sit on their iPhone after drunkenly stumbling into a taxi. It's a good model, but after a while, it doesn't hold up.

All models break down, eventually. Whether they are physical models, mathematical models, or long, wordy descriptions, no model can completely describe what an object actually is. And that, according to business professor Charles Bonini, is only half the problem. As you try to make a model more and more accurate, you run into another kind of trouble.

When you want to find the best route to a destination, you grab a map. But perhaps the map isn't enough. You need to know which streets won't let you turn left. You want to know if there's going to be a roadblock, or traffic. Of course you can get a more accurate map, but there's going to be more visual clutter on it. You'll have to sort through more information to set your course.


And what if you need even more accurate information? Perhaps you're walking, and you need to know where impassable boundaries are, or what the elevation changes are like. Get a map with that information, and there's even more clutter. Collect enough information, and you need an expert to interpret it for you. Eventually, you reach a degree of accuracy that makes it easier just to strike out on your own and hope for the best.

And this is Bonini's Paradox: The less information a model carries about its subject, the less useful it's going to be in helping someone understand that subject. And yet, the more information a model carries about its subject, the less useful it's going to be in helping someone understand any single point of that subject. Any sufficiently detailed map of a region is going to be just as dense and difficult as the region itself. Any sufficiently detailed model of a brain is going to be a brain.

This sounds spurious, or like a scientist's version of a Magritte painting. It does, however, make an important point. Any model of anything is an act of editing. It picks out what we think is important about the subject, and directs our attention to how that important thing can be manipulated.


We're not looking at reality, but at a tool for helping us shape reality, if only in our minds. When we think of models this way, the idea of playing with the model, or building a new one entirely, is easier to comprehend. Want to change the way people look at the world? Just find a better (more useful, more unique, more accurate, more specific) way for people to imagine it.

Via Simulation of Information and Decision Systems in the Firm