Why optimisation fails in complex systems
Why does a single-minded focus on one variable fail?
Why does optimising just one variable (e.g. efficiency, robustness or evolvability) fail to achieve even that limited objective? The short answer is that the system adapts to subvert the attempted optimisation. Optimisation requires control, and control requires metrics and measures. All metrics, simple or complex, are only a proxy for the objective itself. They therefore inevitably fall prey to Goodhart’s Law: when a measure becomes a target, it ceases to be a good measure.
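The proxy problem can be sketched with a toy model (the product form of the true objective and the budget split are illustrative assumptions, not from the essay): a system produces true value from two inputs, but only one of them is measurable, so optimising the measurable proxy drives the unmeasured input to zero and collapses the true objective.

```python
def true_value(measured: float, unmeasured: float) -> float:
    """Hypothetical true objective: the inputs are complements, so both are needed."""
    return measured * unmeasured

def allocate(budget: float, weight_on_proxy: float) -> tuple[float, float]:
    """Split a fixed budget between the measured and the unmeasured input."""
    measured = budget * weight_on_proxy
    return measured, budget - measured

budget = 10.0

# Optimising the proxy pushes the whole budget into the measured input...
proxy_optimal = allocate(budget, weight_on_proxy=1.0)
# ...while, for this product form, the true optimum is a balanced split.
balanced = allocate(budget, weight_on_proxy=0.5)

print(true_value(*proxy_optimal))  # 0.0: the proxy is maximised, the true value collapses
print(true_value(*balanced))       # 25.0
```

The point of the toy is only that the proxy and the objective come apart exactly when the proxy is pushed to its limit; any functional form in which the metric omits a necessary input behaves the same way.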
Goodhart’s Law, however, explains only a subset of the ways in which optimisation fails. The failure is not restricted to social systems, and it does not require conscious adaptation. Natural selection can achieve everything that adaptation can, as the failure of forest fire suppression shows: suppressing every small fire allows flammable material to accumulate until a catastrophic fire becomes inevitable, with no human adaptation required.
Within the human body, the failure is often simply a logical consequence of the biological processes of homeostasis and allostasis: the brain, for example, undergoes compensatory adaptations in response to psychiatric medication. And in some cases, like river flood control, all it takes is the inexorable force of gravity and silt buildup to cause systemic deterioration, as levees confine the river and the channel bed silts up until each flood is more dangerous than the last. Complex systems inevitably adapt to subvert a single-minded focus on one legible, codified objective.
Another characteristic of this failure is that it is often delayed and preceded by success. Human adaptation takes time, and natural selection usually takes even longer. In ‘Seeing Like A State’, James Scott describes how “scientific” forestry in 18th and 19th century Germany transformed German forests into monocultures created with the sole purpose of maximising the volume of wood that could be extracted. Although the exercise ended in catastrophic failure (“forest death”), the extent of the failure became clear only after a century, and what preceded it was, in Scott’s own words, “a resounding success”.
The same pattern of initial success followed by gradual deterioration and rising catastrophic risk recurs in complex systems across domains. Forest fire suppression and river flood control both follow it, and monetary and macroeconomic policy in the inflation-targeting era will probably trace the same arc, with failure arriving after decades of apparent success.
The history of modernity is the history of the consequences of this sort of optimisation. The Soviet economic system began with a single-minded focus on maximising total output (‘val’) and ended up with a system that was fragile, unable to innovate, and did not even come close to maximising output. Sooner or later, firms that focus solely on maximising shareholder value fail to achieve even that limited objective: managers furiously optimise the short-term share price at the expense of longer-term shareholder value, and eventually even the share price itself is lower than it would otherwise be.
Even what seem like unambiguous successes of modernity may simply be failures waiting to happen. Is modern industrial agriculture an unambiguous success, or are we still living through the initial high?
This is not an argument to abandon modern technology and control; it is an argument for a different approach to intervening in complex systems. In future essays, I will look at the interventions that work best in complex systems. But first, I will describe the pathology itself: the nature of modern complex systems and the patterns that define them.