Failing at the First Rule of Risk Management

by Gordon Graham | February 16, 2022

Editor’s note: This article is part of a series. Click here for the previous article.

Gordon Graham here to close out my thoughts on Family Six of the 10 Families of Risk—Information Risks. Thanks for your continued feedback on these ramblings. Your comments make me think, which is always good. As I thought about where I wanted to take this article, I knew I’d have to give Madame Editor a heads-up: It will go over my allotment of words. Fortunately, she agreed to indulge me.

I was in Thatcher, Ariz., a couple of years ago doing my first class of the new year, and I wrapped it up with my signature comments on the “three basic rules of risk management.” The undersheriff of the county chatted with me after the program and reminded me that he first heard the three rules in one of my programs in the late ’80s.

As I drove back from Thatcher to Phoenix Sky Harbor airport in my Hertz sled (what a beautiful part of the world that is), I got to thinking that for 40 years, I have been wrapping up many of my programs with these “three rules.”

With this in mind, let me present “rule one” of these three rules. The source of this rule is the great risk management guru of the ’40s, Archand Zeller. I never met him, but I was introduced to his work when I was in graduate school at the University of Southern California’s Institute of Safety and Systems Management. It goes like this:

“The Human does not change. During the period of recorded history, there is little evidence to indicate that man has changed in any major respect. Because the man does not change, the kinds of errors he commits remain constant. The errors that he will make can be predicted from the errors he has made.”

Or more simply stated, we must learn from the mistakes of the past. Let me expand on this a bit: We must learn from our personal mistakes, but that is a relatively small number. The better idea is to learn from the mistakes that others who are “similarly situated” (same profession and/or job description) have made. That’s where I am headed today.

Let me start with the end of the story: We are not doing a good job of learning from the past mistakes made in public safety operations.

As I reread that sentence, I want to modify it to emphasize something—we are not even aware of most of the past mistakes made by public safety personnel—especially in law enforcement—let alone learning from these mistakes. And this needs to be fixed.

Time for more digressing! As I get older and look at what I need to do for my profession while I am still able to do things, the need to preserve institutional knowledge is number one on my list—seriously! Those of you who view the Lexipol Today’s Tip (you can access individual tips or sign up to receive one each week) may remember that I did a tip on this important issue. More recently, Lexipol and Police1 started the Institutional Knowledge Project to create a repository of lessons learned.

I’ve advocated for projects like these because we are throwing away too much institutional knowledge and we are not learning as much as we should from the past. Finally, he is getting to the point.

You are probably aware that when a plane crashes, the NTSB shows up and does an in-depth investigation of the crash. They have a template for how to do the investigation, they have a variety of people with different areas of expertise look at all the facts, and they finalize a report that then goes on the NTSB website—where ANYONE and EVERYONE can read it, study it, share it and teach the findings to others so the errors and omissions that caused the crash will not recur.

When a firefighter dies, in many cases NIOSH shows up. They have a template for how to investigate the death, they have a variety of people on that team, and they do an in-depth investigation that gets beyond the proximate cause to look for what really caused the death. The report is then finalized and put on the NIOSH website, where ANYONE and EVERYONE can learn what caused the tragedy—so that it will not recur. On the fire website I co-host with Chief Billy Goldfeder, we link to these reports because they’re critical for learning from and preventing mistakes.

So let’s talk about line-of-duty deaths (LODDs) in law enforcement. I am a huge fan of Officer Down Memorial Page (ODMP)—they have a fantastic app that should be on your phone. I respect their great work in gathering information about LODDs in American law enforcement. Having said that, they publish what they know about the incident.

Take the LODD of Officer Paul Dunn from Florida in 2020. ODMP reported what they knew about the event. Their summary was three paragraphs. The first dealt with identification of the officer and the third dealt with a bit of the officer’s life. Paragraph two focused on what happened:

He was en route to the police station on his department motorcycle when he struck the raised median of the roadway. He was thrown from the motorcycle and sustained fatal injuries.

So what caused this terrible tragedy? Well, he hit a raised median and was thrown from his bike! What do you learn from that summary? In no way do I mean to minimize this death—nor is it my intent to criticize the good people at ODMP—but all they can publish is what they know about the event. They don’t have a team on the ground to investigate and reconstruct the event and determine root cause. And as a result, all we will learn from this event is the officer hit a raised median and was thrown from his bike.

Can you imagine if that was a plane crash that killed the pilot? Do you think the NTSB report would be a bit more detailed than “the pilot hit a tree and was ejected from the plane”?

I GUARANTEE you there would be a lot of other factors involved in that NTSB investigation, including fatigue (which we rarely talk about in public safety), training, type of equipment, maintenance of equipment, weather conditions, personal issues concerning the pilot (going through divorce, bankruptcy, death of a child) and all the other related issues that may have contributed to the tragedy.

I am fearful this type of investigation is not always done on law enforcement LODDs, and when it is done, the distribution is very limited. As a result, other similarly situated people will make the same mistakes and we will have another LODD.

If you do not yet hate what I am saying here, let’s take it a step further. Every year cops in America kill about 1,000 people. Personally, I think that number is remarkable when you think about how many daily contacts (including dealing with felons) American law enforcement makes—and yet only 1,000 deaths.

What do we learn from these (mostly) shootings? In my opinion most of them are “good shoots” where the officer/deputy involved was forced to use deadly force. But you and I know that a small number of them are questionable—and some are flat out wrong.

What do we learn from the “bad shoots”? A report is done on the event but is it published? And does it provide necessary information to help us learn from the event?

I am well over my word allocation at this point, so I will have to sum things up. This piece is my final effort on information risks. We make decisions based on information. I am concerned that the amount of information we have on public safety LODDs and critical incidents is limited and we need to better learn from past tragedies.

With all of this in mind, I am a big fan of standardizing the process of investigating LODDs and officer-involved shootings. After these investigations are completed, the reports should be released so every member in our profession can learn “what really happened”—and have a chance to prevent similar tragedies from occurring.

I recognize some people will criticize my thinking—and I won’t take it personally (that is the nice thing about being old). Some will argue that we shouldn’t make the information available to everyone. And why not? Our citizens are demanding (and deserve) full transparency in our operations. And our personnel have the right to learn from these tragedies so as to have a better chance of going home safe at the end of every shift.

In my next article, I will talk about Family Seven of the 10 Families of Risk: Human Resources. Until then, please work safely.

TIMELY TAKEAWAY—Take a look at what is going on in Wisconsin, where a statute requires an independent investigation whenever the action or inaction of a law enforcement officer results in the death of an individual. When the Wisconsin Division of Criminal Investigation serves as the independent agency, it provides a complete report on its website. Other states are starting to consider this approach. Take a look and ask yourself, what could we learn as a profession if every state and every agency followed this approach? What tragedies could we prevent?

GORDON GRAHAM is a 33-year veteran of law enforcement and the co-founder of Lexipol, where he serves on the board of directors. Graham is a risk management expert and a practicing attorney who has presented a commonsense risk management approach to hundreds of thousands of public safety professionals around the world. Graham holds a master’s degree in Safety and Systems Management from the University of Southern California and a Juris Doctor from Western State University.
