The Milgram Experiment: Why Good People Follow Bad Orders
In 1961, Stanley Milgram ran an experiment that would haunt psychology for decades.
The setup was simple: a “teacher” (the real participant) would administer electric shocks to a “learner” (an actor) whenever they answered questions wrong. The shocks started mild — 15 volts — and increased with each wrong answer, up to a potentially lethal 450 volts.
The learner would cry out in pain. Beg to stop. Eventually go silent.
The question: How far would ordinary people go?
The Horrifying Results
65% of participants delivered the maximum 450-volt shock.
Not psychopaths. Not sadists. Regular people from New Haven, Connecticut — teachers, salesmen, engineers, laborers. Two-thirds of them administered what they believed were potentially fatal electric shocks to a stranger, simply because a man in a lab coat told them to continue.
Why This Happened
Milgram identified several factors that drove obedience:
1. Gradual Escalation
The shocks didn’t start at 450 volts. They started at 15. Each step was just slightly more than the last. Once you’ve agreed to 15, why not 30? Once you’ve done 300, what’s 315?
This is the foot-in-the-door technique — small commitments lead to larger ones.
2. Authority Legitimacy
The experimenter wore a lab coat. The study was conducted at Yale. These signals screamed legitimate authority. We’re trained from childhood to obey authority — teachers, doctors, bosses. That training doesn’t switch off.
3. Diffusion of Responsibility
“I was just following orders.” When someone else is giving commands, we feel less personally responsible for outcomes. The authority figure absorbs the moral weight.
4. Physical Distance
The “learner” was in another room. Participants couldn’t see the suffering directly. When Milgram moved the learner closer — or had participants hold their hand on the shock plate — obedience dropped dramatically.
The Uncomfortable Mirror
What makes Milgram’s experiment so disturbing isn’t what it says about those people. It’s what it says about all of us.
Every participant who delivered the full shock believed they were a good person. They showed visible distress — sweating, trembling, nervous laughter. They wanted to stop. But they didn’t.
The situation was stronger than their character.
Modern Implications
Milgram ran his experiments to understand the Holocaust — how ordinary Germans participated in mass murder. But obedience to authority didn’t end in 1945:
- Corporate scandals: Employees who “just followed orders”
- Medical settings: Nurses administering harmful doses because a doctor said so
- Tech ethics: Engineers building surveillance tools because the company asked
- Social media: Content moderators reviewing traumatic content for $15/hour
Authority isn’t inherently bad. We need structure. We need expertise. But blind obedience is dangerous — and we’re more susceptible than we think.
Defense Mechanisms
- Question legitimacy. Is this authority actually legitimate in this context? A lab coat doesn't make someone an ethics expert.
- Notice escalation. If you're being asked to do something slightly worse than before, pause. The pattern itself is a warning sign.
- Restore distance. When you feel moral responsibility slipping away, bring it back. "Am I personally okay with this?"
- Practice disobedience. Small acts of resistance build the muscle for larger ones.
The Real Lesson
Milgram didn’t prove that people are evil. He proved that situations can make ordinary people do terrible things — and that understanding the situation is the first step to resisting it.
The experiment wasn’t really about the shocks.
It was about you.
The Milgram Experiment is one of 12 psychology experiments explored in the Sleight app. Download free on the App Store to learn more.
Want more patterns like this?
Sleight teaches you 44 psychological patterns with real examples and ethical applications.
Download the app free →