I was recently reminded of an episode of 99pi - one of my favourite podcasts - and of a story it recounts from Galileo's last book, Two New Sciences.
Before he gets into the famed dialogues between Salviati, Sagredo and Simplicio, he tells us a simple story about a construction site: a tall column that would presumably play an important structural role in the finished building, supporting either the roof or a portico, was delivered long before it was needed.
The builders calculated that it would have to be stored for many months, stretching across the winter, and they feared that if they left it lying flat on the ground it would sink into the mud and be very hard to lift the following year. So they decided to create two wooden supports, one at either end: the column would lie across them, and it would be easy enough to get ropes underneath when they were ready to lift it again.
But then they became worried that all of the column's weight would bear down on its unsupported centre point, and that it might well break there over the winter - it was not designed to lie horizontally after all. So - not unreasonably - they came up with the idea of a third wooden support at the centre point.
But when they returned to work in the spring, they discovered that one of the end supports had rotted away - and that the column had broken cleanly, right across the centre support.
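A rough back-of-the-envelope sketch - my own, not Galileo's or 99pi's - shows why the middle support is the dangerous one. Treat the column as a uniform beam of length $L$ with weight $w$ per unit length. Resting on its two end supports, the largest bending moment is a sagging moment at mid-span:

$$ M_{\text{mid}} = \frac{wL^{2}}{8} $$

Once the middle prop is added and one end support rots away, the half of the column beyond the prop becomes a cantilever: its weight $w \cdot \tfrac{L}{2}$ acts at a lever arm of $\tfrac{L}{4}$, producing a hogging moment directly over the prop of

$$ M_{\text{prop}} = \frac{wL}{2}\cdot\frac{L}{4} = \frac{wL^{2}}{8} $$

The magnitude is the same as before, but it is now concentrated right at the hard prop - and, crucially, had there been no middle support at all, the failure of one end would simply have let the column settle to the ground with almost no bending. The extra support turned a harmless settlement into a fulcrum.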
What moral does Galileo (and 99pi) take from this story? That sometimes the little extra precautions we take to avoid disaster add complexity to a system - and that it is that very complexity that creates the disasters we were trying to avoid.
They provide many other examples. An early nuclear reactor - Fermi 1 in Michigan - narrowly avoided a catastrophic meltdown after the last-minute addition of an extra filter to the coolant system: the filter came loose and blocked the very system it was supposed to protect.
Also in the nuclear world, the Three Mile Island disaster was made worse by its own warning systems, which at one point were flashing 750 warning lights at the handful of operators - who could make no sense of the overload of information they were being given.
In a simpler tale, the 2017 Oscars disaster - where the wrong Best Picture winner was announced - was caused by too many safety features: the duplicate set of envelopes designed to prevent that very situation.
There is an interesting link, too, with the 2008 banking crisis, where financial instruments designed to spread and reduce risk added the very complexity that helped bring the system down.
I first heard this podcast years ago and have listened to it a few times since - and I have often deferred to its wisdom in both school and domestic settings. Simplicity is a virtue, I have learned. Trying to achieve too much often means we achieve nothing at all.
And I thought of the underlying issue again recently, when much of the country was left without power by Storm Éowyn. The lack of electricity was bad enough, but for many it came along with the loss of heat, water and phone signal: we have developed systems that are wonderfully efficient - until something goes wrong. And then the enormously complex interactions between all of our systems can make things much worse than they would once have been.