For several years now, this question has lingered in the background of my thinking, never quite leaving my mind. On a handful of occasions it has surfaced in conversation with other instructors, usually briefly and almost cautiously.
But it’s a question I keep returning to: Do we really need a focus on morality in firearms training? Not safety rules. Not policy. Not legal updates. Morality. In a profession defined by measurable standards and hard outcomes, is that even an appropriate focus?
The skepticism makes sense. Firearms training lives in a world of irreversible consequences. It is built on precision, documentation, and defensibility. We are accountable for what we teach and for what our students do with what we teach.
When someone introduces language that sounds philosophical, the reflex among many instructors is protective. Keep it practical. Keep it measurable. Keep it operational. Don’t let the range turn into a seminar.
And if morality meant ideological debates, virtue signaling, or political framing, I would agree with the skepticism. That has no place in a high-liability skill domain. But that’s not what I’m talking about. What I’m talking about is far more practical and far more uncomfortable.
Whether we acknowledge it or not, firearms instruction is already moral instruction. It always has been.
It is embedded in what we reward, what we tolerate, what we excuse, and what we normalize. The range is not morally neutral. The way we teach lethal capability is never morally neutral. The only question is whether we address that reality intentionally or allow it to operate implicitly through culture and drift.
When someone says, “We don’t need morality, we need safety and standards,” I understand the impulse. Safety rules are clear. Standards are measurable. Qualifications are documentable. They feel objective. But safety rules and standards are not self-executing.
They rely on human beings to interpret and apply them under imperfect conditions. And human beings, especially under stress, do not operate like checklists.
They conserve effort. They respond to social pressure. They defend identity. They rationalize. They drift. They comply when watched and cut corners when they believe the consequences are low. That is not a moral indictment. It is a recognition of how the human brain works. If we train firearms skill while pretending the human operating system doesn’t matter, we are building competence on top of fragility.
Firearms training is a capability amplifier. It increases a person’s ability to impose force. In a low-liability domain, mistakes are learning experiences. In a high-liability domain, mistakes are often irreversible. That reality changes the equation. It means we cannot afford illusions, including the illusion that technical skill can be separated from judgment.
Early in my career, I leaned heavily into the technical side. You can measure hits. You can measure time. You can enforce safety rules. You can document compliance. That feels solid. It feels defensible. And in a controlled environment like a square range, it often looks sufficient. Students can perform well. They can recite policy. They can demonstrate safety habits. But performance in a controlled environment is not the same thing as readiness in an uncontrolled one.
The learning science literature has been clear on this for years. What looks fluent in training can be fragile in retention and transfer. Robert and Elizabeth Bjork describe how conditions that make practice feel easy often create weak long-term learning, while conditions that feel more effortful can produce more durable retention and adaptability (Bjork & Bjork, 2011). In other words, smooth performance during training can be deceptive.
That insight applies not just to mechanics, but to moral reasoning. A student can say the right things in a classroom. An instructor candidate can echo the right policy language. But under stress, people default to internalized patterns, not rehearsed answers. When the rules get fuzzy, when incentives conflict, when time compresses and social pressure intensifies, what governs behavior is not the slide deck. It is the internal architecture that training has built.
This is where two developmental models become useful, not as academic ornaments, but as practical lenses: Kohlberg’s stages of moral development and the Dreyfus model of skill acquisition.
Kohlberg describes how people reason about right and wrong. At early stages, behavior is guided by punishment avoidance or reward. Later, it may be guided by approval or conformity. More mature stages involve duty, respect for systems, and eventually principled reasoning that can resist group pressure and authority gradients (Kohlberg, 1981).
If you’ve instructed long enough, you’ve seen these differences. You’ve seen the student who behaves safely because they don’t want to be corrected, and you’ve seen the student who behaves safely because they view it as a duty to others.
Dreyfus, on the other hand, describes how people acquire complex skills. Novices rely on rigid rules. They think step by step. Their competence is brittle. As they gain experience, they begin to recognize patterns and respond more fluidly. Experts operate with contextual awareness and adaptability, drawing on internalized models rather than explicit instructions (Dreyfus & Dreyfus, 1986).
What makes these models especially relevant in firearms training is that they describe the same underlying shift from different angles. Dreyfus describes the shift from external rules to internalized performance models. Kohlberg describes the shift from external consequences to internalized ethical models. In firearms instruction, both are developing at the same time.
And here is the key: the skill can kill someone, and the moral reasoning determines what someone does when the rules get fuzzy.
We love rules in firearms culture. We should. The four safety rules are foundational. SOPs matter. Qualifications matter. But rules are scaffolding. They are not the structure itself. In real-world conditions, rules can conflict. Policy does not anticipate every scenario. Time pressure narrows options. Social dynamics distort perception. In those moments, behavior is driven by internal models, not by laminated cards.
On the performance side, that means the shooter will default to whatever motor programs and perceptual habits are most deeply ingrained. Under stress, attention narrows, working memory is taxed, and automated behaviors dominate. If training has built brittle competence that only functions in predictable environments, that brittleness will surface when context shifts.
On the moral side, the same principle applies. If compliance is rooted in punishment avoidance, behavior may degrade when supervision is low. If compliance is rooted in group approval, behavior may drift when the culture rewards shortcuts. Only when moral reasoning is internalized at the level of duty and principle does it reliably resist pressure.
This is where I’ve come to use the idea of dual-track progression. Every shooter and every instructor develops along two tracks: performance architecture and ethical architecture. The performance track determines what the person can do under variable conditions. The ethical track determines what the person chooses to do under conflicting incentives.
If you advance one track and neglect the other, you create risk.
High skill with low moral maturity is not a virtue in a high-liability domain. It is a liability multiplier. The shooter who performs impressively but cuts corners, normalizes deviance, or prioritizes identity over duty exports that instability into the culture. Students learn what is truly rewarded. Over time, standards erode not through open rebellion, but through quiet drift.
Drift is one of the most underestimated threats in firearms training. It is the gradual normalization of deviance that occurs when shortcuts become routine, when near misses are dismissed as luck, and when instructors inherit methods without examining their purpose. Drift is rarely malicious.
It is usually the product of cognitive bias and institutional pressure. People want efficiency. They want to finish on time. They want to keep students engaged. They want to avoid confrontation. In a high-liability domain, those ordinary motivations can create extraordinary risk.
At the same time, moral seriousness without durable skill is also dangerous. The conscientious officer whose performance architecture collapses under cognitive load can make catastrophic errors, not because of bad intent, but because of fragile training. Stress alters perception and decision-making. Situation awareness can degrade. Without deliberate training for adaptability and transfer, even well-intentioned professionals can fail at the worst moment.
So do we need morality in firearms training? If by morality we mean political discourse or ideological conformity, no. That has no place on a professional range. But if by morality we mean principled restraint, defensible decision-making, and the internalization of duty under pressure, then yes, we absolutely do.
The reason is simple. Firearms instructors are not merely teaching mechanics. They are shaping decision-makers who carry lethal capability into morally complex environments. Every drill, every correction, every tolerance sends a signal about what matters. Culture is moral instruction, whether we call it that or not.
Consider how often real problems arise in gray zones rather than clear violations. A student engages in borderline unsafe behavior. An instructor hesitates to correct it because the student is influential or because the class is behind schedule. That hesitation is not a technical issue. It is a moral decision point. And the choice made in that moment trains everyone watching what the true standard is.
Or consider the instructor candidate who designs drills that reward speed in ways that gradually erode safety margins. The students love it. The drills look impressive. But the incentive structure quietly shifts from disciplined control to performance display. That is not a technical oversight. It is moral immaturity riding on technical competence.
Kohlberg’s framework helps explain why surface compliance is not enough. Two individuals may follow the same rule for very different reasons. One may do so to avoid punishment. Another may do so out of duty. Under pressure, those reasons matter. If the incentive structure changes, the first individual may drift. The second is more likely to hold.
Dreyfus’s framework complements this by reminding us that visible performance is not equivalent to expertise. Expertise involves pattern recognition, contextual awareness, and adaptability. It requires exposure to variability and feedback that challenges assumptions. Instructors who mistake repetition for mastery risk building shooters who are fluent but fragile.
Cognitive psychology has also demonstrated how biases distort judgment, especially under uncertainty (Kahneman, 2011). Instructors are not immune. They can overestimate competence based on confidence. They can confuse charisma with capability. They can overlook warning signs in favored students. If moral reasoning and decision-making are not part of instructor development, these biases operate unchecked.
There is also an operational dimension that cannot be ignored. Critical incidents are evaluated not only on technical grounds but on moral narratives: reasonableness, proportionality, restraint, duty. Public trust is shaped by those narratives. Agencies are scrutinized through them. If we fail to prepare instructors and officers to understand the moral dimension of lethal force, we leave them vulnerable in courtrooms, in media coverage, and in the broader social environment.
This does not mean training for optics. It means recognizing that moral reasoning is inseparable from professional survivability. An officer can be legally justified and still face intense moral condemnation. An instructor can be policy-compliant and still have cultivated a culture that encourages avoidable risk. Ignoring morality does not remove it from the equation. It simply ensures we are unprepared to engage it.
If morality is to have a place in firearms training, it must be approached with the same rigor as skill acquisition. It must focus on decision-making under ambiguity, not on abstract virtue. It must examine how incentives shape behavior. It must challenge instructors to evaluate what they reward and what they tolerate. It must reinforce that stewardship, not performance, is the core of the instructor role.
When dual-track progression is taken seriously, instructor development changes. Candidates are evaluated not only on how they shoot, but on how they respond to conflicting incentives. Not only on how they run drills, but on how they handle gray zones. Not only on their ability to perform, but on their willingness to enforce standards when it costs them something.
That is not soft. It is disciplined. It recognizes that in high-liability domains, internal architecture determines external outcomes. Dreyfus reminds us that performance must be adaptive and resilient. Kohlberg reminds us that moral reasoning must be principled and stable. Together, they describe the full architecture required for professionals who carry lethal capability.
So do we need a focus on morality in firearms training? If we are honest about what the profession demands, the answer is yes. Not as ideology. Not as sermon. But as an intentional effort to develop principled restraint and defensible decision-making alongside technical skill.
The range will always require safety rules, standards, and measurable performance. Those are foundational. But they are not sufficient. If we stop at mechanics, we build half a structure and hope the rest holds. In a profession defined by irreversible outcomes, hope is not a strategy.
Morality, approached professionally and cognitively, is not a distraction from firearms training. It is the stabilizing force that determines how skill is applied when the environment becomes uncertain and the rules are no longer clear. In a high-liability world, that is not optional. It is part of the work.
References
Bjork, R. A., & Bjork, E. L. (2011). Making things hard on yourself, but in a good way: Creating desirable difficulties to enhance learning. In M. A. Gernsbacher, R. W. Pew, L. M. Hough, & J. R. Pomerantz (Eds.), Psychology and the real world: Essays illustrating fundamental contributions to society (pp. 56–64). Worth Publishers.
Dreyfus, H. L., & Dreyfus, S. E. (1986). Mind over machine: The power of human intuition and expertise in the era of the computer. Free Press.
Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
Kohlberg, L. (1981). The philosophy of moral development: Moral stages and the idea of justice (Vol. 1). Harper & Row.
