Book Review – To Forgive Design: Understanding Failure


Good nature and good sense must ever join;

To err is human, to forgive, divine.

— Alexander Pope, “An Essay on Criticism”

WARNING: Engineering geek alert.

Bear with me here, and I’ll eventually get around to personal finance. If you’d rather skip all that engineering stuff and just read about financial independence, then jump down to the final paragraphs.

Henry Petroski wrote “To Engineer Is Human” 25 years ago. His latest book is “To Forgive Design: Understanding Failure”. He’s a professor of civil engineering and history at Duke University, and he makes the most of the overlap between his two fields.

This book sings to me because I started the Navy’s nuclear power school nearly 30 years ago. It seemed that as soon as we understood the basics of a system, whether it was nuclear or electrical or mechanical, the very next thing we were taught was how it would fail.

The same pattern continued through our land-based military nuclear plant training, and out in the submarine fleet it ramped up several notches. Everyone expected you to know how the systems were designed and how they worked together, but you really had to understand how they failed.

You trained a dozen times a week on these failures and the emergency procedures. Even during the world’s most boring midwatches you were monitoring for signs of impending failure and doing preventive maintenance before the gear broke down. You were never happy when something managed to fail without warning, but you were rarely surprised.

It’s a little ironic. The Navy’s submarine force has the world’s most expensive equipment and most highly trained personnel, yet we spent much of our time chasing down broken gear. Every day we pushed our gear (and ourselves) to the limits of the safety margins, and we needed to know how to handle failure. Imagine how much worse it would have been if we hadn’t bought quality equipment and spent all that time training.

My family will confirm that three decades of watchful paranoia has made me a little difficult to live with, but I immediately recognize the failure philosophy in Petroski’s book. He describes one design and construction triumph after another, and in the very next paragraph he explains how they turned into miserable failures.

Probably the most famous example of a civil-engineering disaster is the 1940 Tacoma Narrows Bridge collapse. Millions of high-school physics students have seen the film of its undamped torsional oscillations (usually presented as resonance, though it’s better described as aeroelastic flutter), and I’m sure that many of them have snickered at our ancestors’ engineering hubris. Who could design such a vulnerable system, and why didn’t they see the problem coming while they were building it?

Petroski digs into the details and leads us down the primrose path to destruction. Dozens of these bridges were built during the 1930s, and their physics and construction were quite well understood. However, the Tacoma Narrows design pushed the envelope, and its construction clearly showed that it was susceptible to oscillations.

Construction crewmembers actually suffered from motion sickness, and on the drive across it was common for the roadway to flex enough that drivers would lose sight of the cars ahead of them. For several months it was the country’s most popular bridge adventure, and it generated huge toll revenue. But in retrospect, the failure analysis showed that the span behaved more like a wing than a bridge, and once a small component on one side eventually failed, it took very little time for the Tacoma Narrows winds to generate the lifting forces that pushed the rest of the bridge to collapse.

Another example is the de Havilland Comet, the world’s first passenger jet. It was a marvel of 1950s aerospace technology, carrying passengers faster, higher, and farther than any propeller-driven aircraft. Everything went fine at first, but then several Comets broke apart during routine flights, killing everyone on board and spreading debris over many square miles.

It seemed impossible to reconstruct the cause of the disasters, let alone fix the problem. One engineer suspected fatigue cracking and eventually managed to carry out a series of horribly expensive full-scale tests on Comet fuselages. The results finally showed that cracks began in the squared-off corners of windows and hatches where metal stress was highest, and where fatigue cycles had a critical effect. The entire industry redesigned aircraft and maintenance to minimize this danger and to reduce the impact of failure.

Petroski writes chapter after chapter of one failure after another, from 18th-century bridges to the space shuttles. In each case he shows how designs and operating procedures were declared safe mainly because nothing bad had happened– yet. Even mathematical analysis, full-scale models, and computer simulations could only predict known effects on materials and designs. All of the world’s engineering expertise couldn’t account for failure from effects that weren’t even recognized, let alone understood. If we blindly (and optimistically) pushed the parameters of materials and their lifecycles, we couldn’t accurately forecast the results.

Part of his book is a warning. Many of the disasters occurred during what appeared to be routine operations or inspections, but later analysis showed that “normal” had evolved over the years to “extreme”. Other disasters happened from events that seemed like a good idea at the time, like staging construction materials on a bridge without accounting for the cumulative concentrated effect of the heavy weight. Crane operators became accustomed to minor equipment annoyances and learned to work around them, not appreciating that a “minor” safety violation would have catastrophic effects when the cranes were loaded to their capacity.

Petroski’s harshest criticism is reserved for flawed corporate culture. A disconnect between engineering and operations, or poor communications and rivalries, will hasten failure. Confused reports, or even alerts from the trenches, are explained away (or ignored!) by upper management, with predictable results. It happens in the best of organizations, and in a few entrenched bureaucracies it can happen more than once.

He also explains how institutional memory seems to last for only a generation before the warning signs are forgotten. Rules of thumb for design and construction are lost, or new materials & techniques appear to make the “old” guidelines obsolete. A disaster may resonate through the industry for decades, but eventually a new generation of engineers has only a faint appreciation of the history– and they don’t see how the lessons of an old building collapse could apply to their “modern” understanding of the technology.

Back to the personal finance theme.

The author writes for engineers, but the lessons are also painfully applicable to personal finance. It’s all too easy to ignore warning signs in our budget or our spending, let alone in Wall Street’s latest “sure thing” financial engineering. Every analyst assures us that “it’s different this time”– until it isn’t. No new generation comprehends how truly awful the last recession felt until it experiences its own. Every retirement plan is susceptible to dangers whose symptoms may be misunderstood or, even worse, ignored until it’s too late.

Will we ever learn from our flawed design process? If two centuries of history is any indication… no. However, Petroski’s book is a text in my daughter’s civil-engineering curriculum, and she’s ready to sign up for the submarine force. I hope that all of our design and operating experience will make us skeptical of the process, painfully aware of how far we can push our equipment, and sensitive to signs of trouble. We may not be able to avoid failure by design, but hopefully we can minimize its effects during operation.

Consider failures when you save for financial independence, too. Be aware that you can’t predict every catastrophe, let alone avoid them. Diversify your investments so that a failure in one or two areas won’t wipe out everything. Have a safe source of income for a subsistence level of spending– even if your safety net is “just” Social Security. Read as much financial history as you can so that you’re ready to stay calm when it tries to repeat itself. And once you’ve achieved your financial independence, keep watching for those little alerts that a problem might be starting. You can’t eliminate failure, but you can deal with it.
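The diversification point is just arithmetic, and a toy sketch makes it concrete (this is an illustration, not investment advice, and the equal-weight portfolio is my own simplifying assumption):

```python
def loss_from_failures(num_holdings, num_failed):
    """Fraction of an equal-weight portfolio lost if `num_failed` holdings go to zero."""
    return num_failed / num_holdings

# A total loss in one of only two holdings wipes out half the portfolio...
print(loss_from_failures(2, 1))   # 0.5
# ...but the same single failure in a 20-holding portfolio costs just 5%.
print(loss_from_failures(20, 1))  # 0.05
```

Real holdings rarely go to zero all at once, and they’re rarely equal-weighted or uncorrelated, but the principle stands: spread the risk so no single failure can sink the whole design.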

Related articles:
Problems with retirement calculators
Is the 4% “safe” withdrawal rate really safe?
Retirement planning: “Just tell me what to do!”
How much cash in a retirement portfolio?


WHAT I DO: I help you reach financial independence. For free. I retired in 2002 after 20 years in the Navy's submarine force. I wrote "The Military Guide to Financial Independence and Retirement" to share the stories of over 50 other financially independent servicemembers, veterans, and families. All of my writing revenue is donated to military-friendly charities.

2 Comments
  1. Spot on. You’ve found the connection between engineering and personal finance. I read Petroski as a college senior (civil engineering), and never forgot him. Diversification, redundancy, margin of safety, dependency, design, failure analysis — all engineering principles that contribute to financial success as well. Thanks Doug!

    • Thanks, Darrow! I enjoy reading my college daughter’s civil engineering textbooks. She said yesterday that her engineering economics professor started talking to her class about personal finance…
