The Wisdom of Restraint
Res Extensa #74 :: Why doing nothing about a problem is often the best bet
"If you see ten troubles coming down the road, you can be sure that nine will run into the ditch before they reach you." —Calvin Coolidge
We've all experienced the situation where something we're worried about, a task we need to get done, or even a project that we're actively working on gets overtaken by events — made obsolete or irrelevant without any intervention.
If you pay close attention, you see this happen all the time in the workplace. Problems are made irrelevant before they get any attention. These days we're often much better at finding problems we could spend time on than we are at fixing them. Just read the news — existential threat after crisis after pending catastrophe. Then pay attention to how many we never really do much about: interest ebbs, and we go back to business as usual.
It turns out not all problems are worthy of solving right away. In fact, fixing the problem sometimes makes things worse than simply leaving it alone.
In his book Antifragile, Nassim Taleb has a chapter on "iatrogenics": the idea, taken from medicine, that "naive intervention" causes more harm than the affliction it treats. From the Greek iatros (healer) and genesis (origin): "caused by the healer".
He references examples from the history of health care where doctors killed the patients they were trying to help, through naive intervention and a poor understanding of the underlying ailments. From thalidomide for morning sickness to bloodletting, lobotomies, and arsenic treatments, countless doctors attempted to help, only to have their naiveté about reality cause more harm than good.
Intervention isn't always healthy, particularly when the second- and third-order consequences are hard to predict.
Taleb reflects on what can be learned from iatrogenic failures in fields outside of medicine. Scientific forestry creates monocultured, unhealthy tree crops. Helicopter parenting creates unstable, fragile children. Central economic planning stunts economic growth.
There's wisdom in the ability to recognize the risks of causing more problems than you're solving.
"The art of being wise is knowing what to overlook." —William James
Why do we see naive intervention in companies? And how would attempting to solve a business problem make the problem worse? Seems counterintuitive.
Iatrogenics is most relevant when working with complex, dynamic systems, where the system's behavior is opaque and problems are often multicausal and hard to get detailed visibility on. Downstream effects are hard to predict as the change generated by a solved problem flows through the system.
Misidentifying, misunderstanding, underestimating problems
Thanks to the "business engineering" mindset that pervades modern organizational planning, there's a bias to see every problem as not only solvable, but also worthy of solving. We've got the expertise — let's use it! But this bias might push a team to jump to a conclusion about a problem before understanding how or why it's a problem, or (sadly) whether it's even a problem in the first place.
The bias to act like a "fixer" leads us to misidentify things as problems. "Everything looks like a nail", as they say. Then to make matters worse, inside a complex organization it's impossible to predict the scope and downstream impacts of proposed solutions.
Overmeasurement
One of the things I notice in 21st century business — especially in technology — is that we are utterly drowning in data, metrics, analytics, and a business culture that screams every chance it gets about the value of being "data-driven". If you aren't a data-driven culture, you're a failure.
And of course data is an asset — it would be stupid not to use the telemetry we have on a business as feedback for driving decision making. Imagine the shock of showing an 1880s Andrew Carnegie the sorts of real-time data about global operations we have today. There was a time when the day to day goings-on in a company were nearly totally opaque, with many days of lag time between actions and feedback.
I'd contend that the levels of detailed insight we have today overexpose us to too much of a good thing. Since we can investigate every single biomarker in the organization so granularly — and our culture is constantly telling us to "care about data" — the temptation to highlight and catalog every single aberration is too great. We notice every tiny inconsistency, blip, or outlier, and the engineering mindset can't help but attempt to intervene and suppress the blips. We're sitting at a dashboard flooded with "problems" that historically would've "run into the ditch" before we ever noticed them. This is the same reason physicians recommend against full-body MRI scans to screen for disease: the risk of false positives and unnecessary treatment is too high.
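A quick back-of-the-envelope Bayes calculation shows why. The numbers below are illustrative assumptions, not clinical figures, but the shape of the result holds whenever the condition you're screening for is rare:

```python
# Illustrative Bayes calculation: why screening everyone for a rare
# condition yields mostly false alarms. All numbers are assumptions
# chosen for illustration, not real clinical figures.

prevalence = 0.001           # 1 in 1,000 people actually has the condition
sensitivity = 0.99           # the test catches 99% of true cases
false_positive_rate = 0.05   # the test wrongly flags 5% of healthy people

true_positives = prevalence * sensitivity
false_positives = (1 - prevalence) * false_positive_rate

# Positive predictive value: P(actually sick | test says sick)
ppv = true_positives / (true_positives + false_positives)
print(f"P(sick | positive test) = {ppv:.1%}")  # ~1.9%: most alarms are false
```

Swap "disease" for "anomaly on the dashboard" and the same math applies: when genuine emergencies are rare and you screen for everything, most of what lights up is noise.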
The problem isn't with metrics themselves. If handled with caution, data can raise alarms about impending problems, or generate interesting insights and new ideas. If we instrument the organization for the important, bleeding problems, it's easier to avoid fixating on the million little nicks that aren't worth any attention.
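What might that look like in practice? Here's a minimal sketch of one approach (the function, thresholds, and numbers are all hypothetical, not anything the essay prescribes): only page a human when a metric stays outside its historical control band for several consecutive periods, and let single blips roll into the ditch on their own.

```python
# Hypothetical sketch: alert on sustained deviations, not single blips.
# A metric must stay outside the control band for several consecutive
# periods before it pages anyone.

from statistics import mean, stdev

def sustained_alert(history, recent, sigmas=3.0, min_consecutive=3):
    """Return True only if every point in `recent` deviates from the
    historical baseline by more than `sigmas` standard deviations."""
    mu, sd = mean(history), stdev(history)
    breaches = [abs(x - mu) > sigmas * sd for x in recent]
    return len(recent) >= min_consecutive and all(breaches)

baseline = [100, 98, 103, 101, 99, 102, 100, 97, 101, 100]
print(sustained_alert(baseline, [131]))            # False: one blip, ignore it
print(sustained_alert(baseline, [130, 133, 129]))  # True: a bleeding problem
```

The specific rule matters less than the design stance: alerts should encode "this is bleeding", not "this is different from yesterday".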
When you know too much, the temptation to fix every flaw leads to overindexing on inconsequential imperfections.
Getting ahead of yourself
Another variation on this theme is trying to pre-solve for concerns before they become real problems. You've heard the sayings: get ahead of the curve, "go where the puck is moving", "prepare for scale". I'd agree that solid situational awareness of your business means having your attention biased toward the future rather than the past. On where you're going versus where you've already gone. But it's possible to go too far.
Leaning too far forward means you're solving for problems you have less (or no) fidelity on yet. Any attention you put on preparing for an unpredictable future means less of it on problems in the present.
One example of solving for future problems is hiring too early. There's danger in hiring roles you don't need (or at least don't need in a full-time capacity). Once a person has caught up on any obvious glaring problems in their domain, they might then go looking for places to apply their talent where it's not needed, or even harmful.
I think there's a relationship between the kinds of skills you hire and the tendency to fall into iatrogenic traps. In other words, if you have a small set of problems that needs solving but you hire a whole FTE to work on it so you can "get ahead of similar problems in the future", that person will naturally seek out problems that match their expertise and raise attention to them, even when those problems are minor or not particularly urgent.
With a mess of problems bombarding the company every day, there's also a push to processify things, to treat the messiness as a problem itself. But there should be restraint on how many processes, checks, and bureaucratic layers you create. Each new process implemented to address the mess starts the clock on its own ossification. It sets in amber practices that may need to evolve with the business so quickly that they become obsolete before they deliver any benefit. You've got to keep an eye on the balance between efficiency and effectiveness.
Bureaucracy's purpose is risk reduction. But in business you need to be experimenting and trying new things at least as much as you bureaucratize the things you're already doing. With too much process creep, a company becomes less agile and can't respond to changing customer demands or shifting levels of product-market fit. And product-market fit is a moving target. You have to be continuously innovating somewhere.
When I look through a list of problems we could work on, or even my personal to-do list, my very first question about each item is "what happens with this if I don't do anything?"
If you're honest with yourself on this question, tons of problems are more optional than you think, especially in a chaotic environment with hundreds of critical things to pay attention to. Every problem you allow to dilute your attention is an implicit decision that it's just as important as the thing you're focused on.
I think "just leave it alone" is a much more effective strategy than we often like to admit. It requires getting okay with imperfection, and letting the little problems float by without distracting you. And there's wisdom in that kind of restraint, patience, and focus.
Sometimes you don't need automated analytics instrumentation to make decisions. You have more "data" in your head than you think.