In law enforcement, confidence is treated as a survival trait and uncertainty as a liability. We tell ourselves that failure is not an option, that hesitation gets people hurt, and that decisiveness separates professionals from amateurs. After enough time in this profession, though, a pattern becomes hard to ignore.
The people who are the most certain they are correct are often the least willing to examine why, quick to dismiss what they do not understand and slow to question familiar answers. What begins as necessary confidence can quietly harden into something far more dangerous.
A growing problem in modern policing and high-liability training conversations is how confidently people reject things they do not actually understand, speaking with certainty about subjects of which they have little knowledge or comprehension.
It appears in policy debates, training arguments, use-of-force discussions, and conversations about the foundational principles underlying each of them. The pattern is familiar. A complex system, a hard-earned body of expertise, or a technical explanation is offered. Instead of curiosity, the response is rejection. Instead of engagement, there is certainty. If it does not fit neatly into an existing mental framework, it must be wrong, corrupt, outdated, or intentionally misleading.
This tendency does not exist in a vacuum. It is reinforced by a broader societal precept that failure is not an option. From early education through professional life, we are conditioned to believe that uncertainty signals weakness and that admitting error carries reputational cost. In high-consequence professions like law enforcement, this belief is amplified.
The stakes are real, the scrutiny is constant, and mistakes are unforgiving. Over time, the pressure to appear certain becomes a pressure to be certain, even when the evidence does not justify it.
This is not simple ignorance. Ignorance can be corrected. What we are dealing with here is something more entrenched, a mix of cognitive bias and cultural conditioning that actively resists correction. Psychology and cognitive science offer several concepts that help explain this behavior.
Three of the most relevant are epistemic arrogance, the Dunning-Kruger Effect, and the Einstellung Effect. These terms are often used loosely or interchangeably, but they describe distinct failures of judgment. Each operates differently, and each plays a role in how individuals and institutions make poor decisions while remaining convinced they are acting rationally.
I have seen all three at work in policing, in training culture, in administrative decision-making, and in public commentary. I have also seen how damaging they become when they overlap. Understanding the differences among them is not an academic exercise. It is a practical necessity in a profession where mistakes carry legal, moral, and sometimes irreversible consequences.
Epistemic arrogance is the most visible and, in many ways, the most corrosive of the three. It is not a formal diagnosis or a narrowly defined cognitive bias. It is an attitude. Epistemic arrogance occurs when a person treats their own understanding as the outer boundary of what can reasonably be true.
If they cannot comprehend an idea, or if it conflicts with their intuitive sense of how the world works, they assume the problem lies with the idea rather than with their own limitations.
As Tom Nichols has argued, this posture has become increasingly common in modern institutions, where personal conviction is often treated as equivalent to expertise (Nichols, The Death of Expertise).
In practice, epistemic arrogance often disguises itself as skepticism. It is not the same thing. Skepticism asks questions and withholds judgment. Epistemic arrogance issues verdicts. It presents itself as common sense and often relies on phrases like “that doesn’t make sense to me” or “I don’t see how that could be true,” as though personal comprehension were an objective standard. Research on motivated reasoning shows that people are especially likely to reject information that threatens their sense of competence or status, particularly in environments where failure is stigmatized rather than treated as instructive (Kunda, “The Case for Motivated Reasoning”).
This mindset thrives in environments where expertise is unevenly distributed and institutional trust is low. Law enforcement occupies that space daily. Officers operate at the intersection of lived experience, policy mandates, legal standards, and research.
When someone without depth in one of those domains confidently dismisses it, epistemic arrogance is often at work. It shows up when neuroscience is rejected because it complicates training doctrine, when data is waved off because it conflicts with anecdote, or when established legal standards are dismissed as out of touch because they feel counterintuitive. Nichols notes that this rejection of expertise is rarely about evidence and more often about perceived threats to autonomy and authority.
Epistemic arrogance is not limited to nonexperts. Intelligent, capable professionals can exhibit it outside their areas of competence and sometimes within them. The issue is not being wrong. It is refusing to seriously consider that one’s understanding may be incomplete. In a culture where failure is not an option, intellectual humility becomes risky, and arrogance becomes a defensive posture.
The Dunning-Kruger Effect functions differently. Where epistemic arrogance is attitudinal, Dunning-Kruger is metacognitive. It describes a phenomenon in which individuals with low competence in a specific domain overestimate their ability because they lack the skills needed to accurately assess themselves.
The original research by Justin Kruger and David Dunning demonstrated that participants performing poorly on objective measures consistently rated their performance as above average, precisely because they lacked the insight needed to recognize their deficiencies (Kruger and Dunning, “Unskilled and Unaware of It”).
This is not a moral failing. It is a structural limitation of human cognition. When people do not know what expertise looks like, they mistake familiarity for mastery. Someone who has completed minimal training or absorbed surface-level information may sincerely believe they understand a subject as well as those who have spent years studying it. The problem is not arrogance in the conventional sense. It is miscalibration.
In law enforcement, this effect often appears early in a career, but it does not always disappear with experience. It can persist in environments where feedback is weak, where correction is discouraged, or where confidence is consistently rewarded over competence. The officer who believes they have “seen it all” after a few years on the street may speak with absolute certainty, unaware that their experience represents only a narrow slice of a much larger operational reality. When failure is treated as unacceptable, there is little incentive to reassess one’s own limitations.
What separates the Dunning-Kruger Effect from epistemic arrogance is awareness. The epistemically arrogant individual dismisses expertise because it threatens their worldview or status. The Dunning-Kruger subject believes they already possess the expertise being challenged. One is defensive. The other is blind. Both are reinforced by a professional culture that punishes error rather than treats it as feedback.
The Einstellung Effect adds another layer of complexity. Unlike the other two, it is not primarily about ignorance or inflated self-confidence. It is about rigidity. The Einstellung Effect occurs when a familiar solution prevents recognition of a better one. Once a method has worked, the brain latches onto it.
Past success becomes an obstacle to adaptation. This phenomenon was first demonstrated by psychologist Abraham Luchins, who showed that people often fail to see simpler solutions once they have learned a particular problem-solving method (Luchins, “Mechanization in Problem Solving”).
This effect is especially relevant in professional environments. Experience builds efficient mental shortcuts, but those shortcuts can narrow perception. In policing, the Einstellung Effect shows up when agencies cling to outdated tactics because they worked in the past, when training curricula remain static despite changes in law or science, or when supervisors insist on doing things the way they have always been done. This is not ego-driven arrogance. It is cognitive inertia reinforced by institutional risk aversion. When failure is not an option, innovation feels dangerous.
The real danger emerges when these three forces combine. A decision-maker may rely on outdated practices due to the Einstellung Effect. If that same person exhibits epistemic arrogance, they may dismiss new research or external critique because it does not align with their established understanding.
If they are also affected by the Dunning-Kruger Effect outside their core expertise, they may sincerely believe they have evaluated the evidence competently when they have not.
Research on organizational failure shows that serious breakdowns are rarely caused by a single mistake, but by layered cognitive and cultural blind spots that prevent early correction (Reason, Human Error).
At that point, meaningful correction becomes nearly impossible. New information is rejected. Expertise is discounted. Self-assessment remains inflated. The individual or organization becomes insulated from reality by its own confidence. In law enforcement, where legitimacy depends on both performance and public trust, that insulation is dangerous.
This matters because policing does not operate in a forgiving environment. Errors are public, consequences are severe, trust is fragile, and both the courts of law and the court of public opinion are unforgiving in their verdicts. Cognitive biases that might be inconsequential elsewhere can become catastrophic when embedded in policy, training, or operational decisions. James Reason’s work on high-risk systems makes clear that organizations obsessed with avoiding visible failure often create conditions for larger, systemic failure down the line.
One of the most frustrating dynamics I have observed is how often these failures are mistaken for strength. Confidence is praised. Decisiveness is rewarded. Doubt is treated as weakness. Yet studies of high-reliability organizations consistently show that resilience depends on the ability to surface errors early and correct them before they cascade (Reason; Nichols).
The Dunning-Kruger Effect is rarely addressed directly because doing so requires honest feedback and a culture that tolerates correction. It is easier to allow overconfidence than to confront it. Unfortunately, overconfidence scales. When promoted, it multiplies. Leadership failures often trace back not to lack of intelligence, but to unchecked certainty.
The Einstellung Effect is perhaps the most insidious because it looks like professionalism. Standardization, consistency, and tradition all have value. But when they harden into doctrine, they stop serving the mission. The challenge lies in distinguishing proven practice from habitual repetition, especially in a profession that equates deviation with risk.
What connects all three is a failure to accurately locate oneself in relation to knowledge. Epistemic arrogance assumes understanding is sufficient. Dunning-Kruger assumes ability is high. The Einstellung Effect assumes the existing solution is optimal. Each represents a different way of declaring the learning process finished. In a culture where failure is unacceptable, declaring learning finished feels safer than admitting uncertainty.
The corrections are not complicated, but they are uncomfortable. They require the willingness to be wrong in public, to defer to expertise without resentment, and to question default solutions deliberately. They require systems that reward accuracy over bravado and adaptability over tradition.
In my experience, the most effective professionals are not those who claim to know the most, but those who remain aware of how much they do not know. They treat expertise as a resource, not a threat. They understand that unfamiliarity is not evidence of falsehood. They abandon familiar solutions when those solutions no longer fit the problem.
This is not an argument for indecision or endless doubt. It is an argument for aligning confidence with competence and belief with evidence rather than comfort. In a profession built on judgment, that alignment is essential.
If law enforcement is serious about professionalism, legitimacy, and effectiveness, it must confront these cognitive realities honestly. Not by naming them in training slides, but by building cultures and systems that account for them. The human mind will always take shortcuts; heuristics are a psychological certainty. The question is whether we acknowledge that fact or pretend it does not apply to us.
The most dangerous phrase in armed professions is not “I don’t know.” It is “I already know what I need to know.”