A few days ago I wrote about an example from a presentation by Don Reinertsen on the benefits of small batch sizes. Nassim Taleb brings up similar ideas in Antifragile. He opens one chapter with the following rabbinical story.
A king, angry at his son, swore that he would crush him with a large stone. After he calmed down, he realized he was in trouble, as a king who breaks his oath is unfit to rule. His sage advisor came up with a solution. Have the stone cut into very small pebbles, and have the mischievous son pelted with them.
The harm done by being hit with a stone is a nonlinear function of the stone’s size. A stone half the size does less than half the harm. Cutting the stone into pebbles makes it harmless.
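To make the nonlinearity concrete, suppose, purely as an illustration, that harm grows like the square of the stone's mass. A stone of mass m then does harm proportional to m², while n pebbles of mass m/n do total harm proportional to

n (m/n)² = m²/n,

which shrinks toward zero as n grows. The square is just a stand-in; any convex harm function with the "half the size, less than half the harm" property leads to the same conclusion.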
Related post: Appropriate scale
I’m unconvinced. At best, this seems like an analogy that works both ways. You and Taleb are treating the energy as the batch size, but my brain intuitively reads the first case as a small batch of data that is very difficult to “analyse” (survive), whilst the second is a much larger batch of easily digestible pieces. So it’s a win for larger batches of easier data analysis.
The king, being not very bright, had a large stone broken up into pebbles — which he then piled on top of his son, who died.
I see that behavior all the time in “risk mitigation plans” claiming that once-frightening project risks have been addressed through systematic identification and monitoring. Their authors seem to think that identifying and tracking the potential hazards somehow magically reduces their aggregate risk, even when nothing is done to change the individual probabilities of occurrence or likely impacts.
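To put rough numbers on that, with figures made up purely for illustration: if a project carries ten risks, each with a 10% chance of a $100k impact, the expected loss is 10 × 0.10 × $100k = $100k whether or not the risks sit in a tracking spreadsheet. Identification and monitoring only pay off when they lead to actions that actually lower those probabilities or impacts.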