Obedience in psychology is the tendency to follow instructions from a person perceived as an authority figure, even when those instructions conflict with your own judgment or moral beliefs. It differs from other forms of social influence because it involves a clear power dynamic: someone with authority gives a direct or implied order, and someone with less power carries it out. The study of obedience became one of the most important (and controversial) areas of social psychology after a series of experiments in the 1960s revealed just how far ordinary people would go when told to by an authority figure.
How Obedience Differs From Conformity and Compliance
Obedience is one of three major types of social influence studied in psychology, and the distinctions matter. Conformity is a change in your beliefs or behavior caused by the real or imagined presence of other people. You might adopt the opinions of your friend group or dress like your coworkers without anyone explicitly asking you to. Compliance is going along with a specific request or demand even when you privately disagree, like voting with the group when a vote is held in public but differently when it's private. Both conformity and compliance can happen between equals.
Obedience is different because it specifically involves a hierarchy. There’s an authority figure (a boss, a doctor, a military officer, an experimenter in a lab coat) and someone who perceives themselves as lower in that hierarchy. The person obeying typically feels they have less choice in the matter, and the pressure comes from the authority’s perceived legitimacy rather than from peer influence.
The Milgram Experiment
Nearly everything psychologists understand about obedience traces back to Stanley Milgram’s experiments at Yale University in the early 1960s. Milgram recruited ordinary people from the general public and told them they were participating in a study about learning and memory. Each participant was assigned the role of “teacher” and was instructed to administer increasingly powerful electric shocks to a “learner” (actually an actor) every time the learner answered a question incorrectly. The shocks weren’t real, but the participants didn’t know that.
The shock generator had thirty switches marked from 15 volts up to 450 volts, with verbal designations ranging from "slight shock" to "danger: severe shock." As the voltage increased, the actor would cry out in pain, bang on the wall, and eventually go completely silent. When participants hesitated or wanted to stop, the experimenter, a calm man in a lab coat, would issue a series of prompts: "Please continue," "The experiment requires that you continue," "You have no other choice, you must go on."
The results stunned the scientific community. A full 65% of participants administered the maximum 450-volt shock to an unresponsive, possibly dying person simply because an experimenter told them to. Before the study, a group of psychiatrists predicted that fewer than 1 in 1,000 participants would go that far. The actual rate exceeded their prediction by a factor of more than 600.
Why People Obey: The Agentic State
To explain why so many people complied, Milgram proposed a concept he called the “agentic shift.” In everyday life, people operate in an autonomous state where they feel personally responsible for their actions and decisions. But when placed under the direction of an authority figure, many people shift into what Milgram called an agentic state. In this state, they see themselves as agents carrying out someone else’s wishes rather than as individuals making their own choices. The responsibility for what happens feels like it belongs to the authority, not to them. Participants in his experiments often said they were “only following orders.”
This shift isn't just self-justification. Research on a phenomenon called intentional binding, which uses precise timing judgments, has found that when people act under orders, they perceive a longer gap between their action and its outcome than when they act voluntarily. In other words, the brain processes the connection between pressing a button and causing harm differently when someone else gave the order. The sense of personal agency shrinks.
What Changes How Much People Obey
Milgram didn’t run just one experiment. He ran dozens of variations to understand which factors increased or decreased obedience. Several situational variables made a dramatic difference.
- Proximity to the victim: When the teacher and learner were in separate rooms, obedience was highest. When participants had to be in the same room as the person they were shocking, or physically hold the learner’s hand onto a shock plate, obedience dropped significantly. Distance made it easier to follow orders.
- Proximity of the authority: When the experimenter left the room and gave instructions by phone, obedience fell sharply. Some participants even lied, saying they were increasing the voltage when they weren’t. Authority is much less compelling when it’s not physically present.
- Legitimacy of the setting: The Yale University environment lent credibility to the experiment. When Milgram moved the study to a run-down office building with no university affiliation, fewer people obeyed.
- Presence of dissenters: When other “teachers” (actually actors) refused to continue, participants were far more likely to stop as well. Seeing someone else defy authority gave people permission to do the same.
The Stanford Prison Experiment
Philip Zimbardo’s 1971 Stanford Prison Experiment pushed the study of obedience in a different direction. Instead of testing responses to direct orders, Zimbardo wanted to see what happened when people were simply placed into roles with built-in power dynamics. College students were randomly assigned to be either guards or prisoners in a mock prison built in the basement of Stanford’s psychology department. No one was told to be cruel.
The guards became abusive so quickly that the experiment had to be shut down after just six days, though it had been planned for two weeks. Zimbardo concluded that people don’t always need explicit orders to behave in harmful ways. Sometimes the role itself, along with the authority and power it carries, is enough. The uniform, the title, the social expectation of what a “guard” does, all combined to produce behavior that none of the participants would have predicted of themselves.
Personality and Obedience
Situational factors explain a lot, but not everyone in Milgram’s studies obeyed. About 35% of participants in the original experiment refused to continue at some point. This raises the question of whether certain personality traits make some people more prone to obedience than others.
Research on the “authoritarian personality,” developed by Theodor Adorno and colleagues in the 1950s, offers one framework. Using a measurement tool called the F-scale, Adorno identified a cluster of traits that predict a general disposition toward submission to authority. People who score high tend to show “authoritarian submission,” a broad attitude of deference to authority figures including parents, leaders, and institutions. They are more likely to agree with statements like “Obedience and respect for authority are the most important virtues children should learn.” They also tend to be preoccupied with power and dominance, identifying strongly with power figures and placing high value on toughness and strength. According to Adorno, the high-scoring authoritarian “achieves his own social adjustment only by taking pleasure in obedience and subordination,” often driven by a deep fear of being perceived as weak.
Obedience in the Real World
The reason psychologists care so much about obedience isn’t because of what happens in labs. It’s because of what happens in hospitals, cockpits, military units, and workplaces. In healthcare, cases have been documented where nurses administered drugs they knew were potentially harmful because a physician ordered it. One well-known case, the death of Elaine Bromiley during a routine surgical procedure, illustrated how medical staff can fail to challenge a physician’s decisions even when monitors are showing life-threatening readings and colleagues are raising concerns. The social pressure to defer to the person “in charge” can override training and professional judgment.
In military settings, the link between obedience and atrocity has been studied extensively. The defense of “following orders” has appeared in virtually every war crimes tribunal of the last century. Research on how orders are formulated during wartime shows that commands are often vague or indirect, allowing those who carry them out to use the same psychological distancing Milgram observed in his lab.
Aviation safety improved dramatically once the industry recognized that co-pilots and crew members were reluctant to challenge captains, even when they noticed serious errors. Crew Resource Management training, now standard in commercial aviation, was designed specifically to counteract the obedience dynamic in cockpits by encouraging junior crew members to speak up.
Modern Replications
Milgram’s original studies could never be repeated exactly as designed today because of ethical protections that his own work helped inspire. But in 2009, psychologist Jerry Burger conducted a partial replication that retained nearly all of Milgram’s methods while stopping participants at the 150-volt mark, the point where the learner first demands to be released. Burger found that 70% of participants were willing to continue past that threshold, compared to 82.5% in Milgram’s original version at the same voltage level. The results were strikingly similar, suggesting that the human tendency toward obedience hasn’t changed much in nearly half a century.
How Obedience Research Changed Ethics
The distress that Milgram’s participants experienced, visible as trembling, sweating, and nervous laughter during the experiments, raised serious questions about the ethics of psychological research. Many participants later reported lasting guilt and anxiety about what they had been willing to do. The backlash helped push psychology toward the formal ethical guidelines that now govern all human research.
The foundation for these protections actually predates Milgram. The Nuremberg Code, created in 1947 in response to Nazi medical experiments, established that participants must give informed consent, must be free to end an experiment at any time, and must be protected from unnecessary physical and mental suffering. But Milgram's work made it impossible to ignore how poorly psychology was applying those principles. His studies became a catalyst for institutional ethics review boards and for specific requirements that research must not risk participants' psychological wellbeing, personal values, or dignity. Today, research involving human participants must typically be reviewed and approved by an ethics board before it begins, with explicit protections for the right to withdraw without penalty.