What Is the Banality of Evil in Psychology?

The banality of evil is the idea that horrific acts are often carried out not by monsters or psychopaths, but by ordinary people who simply stop thinking critically about what they’re doing. The phrase was coined by political philosopher Hannah Arendt in 1963 after she observed the trial of Adolf Eichmann, a Nazi official who helped organize the Holocaust. What struck her wasn’t his malice. It was his shallowness.

Where the Concept Came From

Arendt traveled to Jerusalem in 1961 to cover Eichmann’s trial for The New Yorker. She expected to encounter a fanatic or a sociopath. Instead, she found a man who spoke almost entirely in clichés, couldn’t form an original thought, and seemed genuinely incapable of seeing the world from anyone else’s perspective. “The deeds were monstrous,” she wrote, “but the doer was quite ordinary, commonplace, and neither demonic nor monstrous.” The only distinctive trait she could identify was “a curious, quite authentic inability to think.”

This wasn’t stupidity, Arendt insisted. It was something more unsettling: a total absence of reflection. Eichmann followed procedures, used official language, and carried out orders without ever pausing to consider what those orders meant in human terms. When confronted with situations that didn’t have a standard protocol, he was helpless. His cliché-ridden speech, Arendt noted, produced “a kind of macabre comedy” in the courtroom.

From this, Arendt developed a broader philosophical claim. Evil, she argued, is not deep or demonic. It is a surface phenomenon. “It can overgrow and lay waste the whole world precisely because it spreads like a fungus on the surface,” she wrote. “Thought tries to reach some depth, to go to roots, and the moment it concerns itself with evil, it is frustrated because there is nothing. That is its ‘banality.’” The antidote, in her view, was thinking itself: the habit of examining whatever happens to come to pass, regardless of the outcome.

What Psychology Found

Arendt was a philosopher, not a psychologist, but her observations launched decades of experimental research into how ordinary people end up doing terrible things. Three landmark studies form the backbone of this work.

Obedience to Authority

In the early 1960s, psychologist Stanley Milgram designed an experiment at Yale University that tested whether people would inflict pain on a stranger simply because someone in a lab coat told them to. Participants were instructed to deliver increasingly powerful electric shocks to another person (actually an actor) every time that person answered a question incorrectly. The shocks weren’t real, but the participants believed they were. Sixty-five percent of participants went all the way to the maximum 450-volt shock, a level clearly marked as dangerous.

Milgram’s conclusion mirrored Arendt’s insight almost exactly: “It is not so much the kind of person a man is as the kind of situation in which he finds himself that determines how he will act.” Participants who delivered the shocks weren’t aggressive or sadistic. They did it, Milgram found, “out of a sense of obligation, a conception of his duties as a subject, and not from any peculiarly aggressive tendencies.” They became so focused on following instructions that they lost sight of what those instructions actually produced.

The Power of Roles

Philip Zimbardo’s 1971 Stanford Prison Experiment took the question further. College students, screened for psychological health, were randomly assigned to play either guards or prisoners in a mock prison built in a university basement. Within days, many of the guards began acting in cruel, dehumanizing, and even sadistic ways. Prisoners broke down emotionally. The study, planned to run two weeks, was shut down after six days because it was spiraling out of control, all among people who had been “normal, healthy, ordinary young college students less than a week before.”

Zimbardo framed the lesson as a question of “bad apples” versus “bad barrels.” The conventional explanation for cruelty is that bad people do bad things. His experiment suggested that bad environments can make ordinary people do bad things. The situation (the roles, the uniforms, the power structure) can override individual character in ways most people don’t expect and wouldn’t predict about themselves.

The Pull of the Group

Even without authority figures, group pressure alone can distort individual judgment. Solomon Asch’s conformity experiments in the 1950s showed that about one third of people will give an obviously wrong answer to a simple visual question, just because everyone else in the room gave that wrong answer first. Later replications have confirmed this rate, finding conformity levels of roughly 33% for factual questions and 38% for political opinions. The rest of the participants resisted, but the fact that a third of people will override their own perception to match a group consensus reveals how powerful social pressure is, even on questions with a clear correct answer.

How People Disconnect From Their Actions

Psychologist Albert Bandura identified eight specific mental strategies people use to justify harmful behavior without feeling guilty. He called this process moral disengagement, and it helps explain the cognitive machinery behind banal evil.

  • Moral justification: reframing harm as serving a higher purpose (“I’m protecting the nation”)
  • Euphemistic labeling: using sanitized language to disguise what’s actually happening (“enhanced interrogation” instead of torture)
  • Advantageous comparison: making your actions seem minor next to someone else’s worse behavior
  • Displacement of responsibility: believing you’re not truly responsible because someone above you gave the order
  • Diffusion of responsibility: feeling less accountable because many people are involved
  • Distortion of consequences: minimizing, ignoring, or denying the harm caused
  • Dehumanization: stripping victims of human qualities so harming them feels less wrong
  • Blaming the victim: convincing yourself the victim brought it on themselves

These mechanisms don’t require conscious deception. People genuinely believe their own rationalizations. And bureaucratic structures make several of these strategies almost automatic. When your job involves filling out forms, following protocols, and reporting to a chain of command, it’s easy to displace responsibility upward (“I was just doing my job”), diffuse it sideways (“everyone else was doing it too”), and lose sight of consequences entirely because you never see the people affected by your decisions.

Recent neuroscience research supports this. A 2024 study found that when people follow orders, the brain activity associated with moral conflict is measurably reduced compared to when they make the same choice freely. Obeying authority doesn’t just give people a social excuse for harmful acts. It appears to genuinely diminish the internal alarm that would normally make them hesitate.

Where the Original Theory Falls Short

Arendt’s concept remains enormously influential, but the specific case that inspired it has come under serious scrutiny. Historians who have examined documents and recordings unavailable during the trial, including tapes of Eichmann speaking candidly in Argentina before his capture, have reached a starkly different conclusion about the man himself. The available evidence, according to scholars like David Cesarani and Bettina Stangneth, strongly suggests that Eichmann was a committed antisemite who actively desired the extermination of Jews, not a passively thoughtless bureaucrat.

In other words, Eichmann was likely performing banality in the courtroom. He adopted the role of clueless functionary because it was his best legal defense. Arendt, critics argue, took the performance at face value. The judges at his trial did not. They concluded he was an ideologically motivated antisemite, and later historical research corroborates their judgment.

This matters because it highlights a real limitation of the theory: some perpetrators of atrocity are true believers, not just thoughtless rule-followers. The banality of evil explains one pathway to mass harm, but not the only one. Ideology, hatred, and deliberate cruelty are also real forces. The concept works best as a description of how systems enable harm through ordinary people, rather than as a universal explanation for all evil.

Why It Still Matters

Even if Eichmann himself was a poor example, the psychological pattern Arendt identified shows up repeatedly in modern contexts. The Volkswagen emissions scandal is a frequently cited case: engineers and managers across multiple levels of the company participated in a scheme to cheat pollution tests, each using moral disengagement strategies appropriate to their position in the hierarchy. No single person needed to be a villain. The organizational structure distributed responsibility so effectively that the system produced massive harm while individuals told themselves they were just doing their jobs.

Research on state-sponsored torture in apartheid-era South Africa found the same pattern. Perpetrators operated from within normal bureaucratic organizations, not as deviant outsiders. They displaced responsibility to superiors and subordinates alike, always pointing somewhere else in the chain of command. The torture wasn’t a breakdown of the system. It was the system working as designed.

The core insight of the banality of evil, then, is not really about evil people. It’s about the conditions that allow people who think of themselves as decent to participate in harm without recognizing it as harm. Bureaucratic distance from consequences, obedience to authority, conformity with peers, and the mental tricks of moral disengagement can combine to produce outcomes that no individual participant would have chosen on their own. The most dangerous thing about this kind of evil is that the people carrying it out often genuinely don’t see themselves as doing anything wrong.