Rethinking the Infamous Milgram Experiment in Authoritarian Times
It’s usually cited as showing that people will follow dubious orders under social pressure—but a more important lesson may be that some people will refuse
by Jacob M. Appel

Ever since social psychologist Stanley Milgram published his “Behavioral Study of Obedience” in 1963, it has become almost de rigueur to reinterpret the validity and significance of his findings every few years. The experiment has woven itself into the American cultural fabric; its macabre setup and unsettling results likely remain well known to a wide swath of educated laypersons who can describe no other work in the field.
In brief, Milgram, at the time a 26-year-old assistant professor at Yale University, recruited subjects to participate “in a study of memory and learning,” which entailed presenting an associative learning task to another subject (actually an accomplice in the study) and then administering a painful electric shock of progressively higher voltage for each incorrect answer. The purported goal was to study human obedience in the wake of the atrocities of Nazi Germany when, as Milgram described it, “millions of innocent persons were systematically slaughtered on command.” The results proved “surprising” in “the sheer strength of obedient tendencies”; in this first reported experiment, 26 of 40 American subjects shocked the victims at the highest level. Twenty variations with more than 600 additional subjects yielded similar outcomes.
Milgram’s conclusions have been confirmed in multiple settings, with both men and women, with dogs as the victims, and even publicly on the French game show Le Jeu de la Mort, where 80 percent of participants imposed maximum shocks in a similar setup. Professor Jerry M. Burger of Santa Clara University has done a remarkable job of replicating Milgram’s work while imposing necessary ethical safeguards, including careful participant screening, lower voltages and immediate disclosure after the sessions; he found obedience rates in 2006 “only slightly lower” than Milgram’s in the early 1960s.
Moreover, Milgram’s work dovetails with evidence from independent studies. These include Solomon Asch’s conformity experiments, in which subjects reported gross misjudgments of the length of a line when under social pressure, and Philip Zimbardo’s highly controversial Stanford Prison Experiment, where college-age subjects showed increased brutality when arbitrarily assigned the role of “guards” in a mock prison.
But what should the takeaway be from Milgram’s research? For more than a half century, investigators—most prominently Thomas Blass—have sought to explain why Milgram’s subjects proved so obedient. Although correlates have been found with personality, internal versus external locus of control, underlying belief systems and situational factors, no answer has proven entirely satisfactory.
Instead, the public is generally left with Milgram’s own impression as explained in his book Obedience to Authority: An Experimental View (1974): “Tyrannies are perpetuated by diffident men who do not possess the courage to act out their beliefs.” Or, even more broadly, in the subtitle of his Harper’s article from the previous year: “A social psychologist’s experiments show that most people will hurt their fellows rather than disobey an authority.”
But maybe that’s not the striking result. Blass has noted that there must be “individual differences in obedience ... because in most obedience studies, given the same stimulus situation, one finds both obedience and disobedience taking place.” In other words, some people do disobey. Some of Milgram’s subjects did defy the experimenter. Like Jan Rensaleer, a Dutch immigrant who responded to the experimenter’s warning at 255 volts that he had no choice but to continue with the following memorable declaration:
“I do have a choice. Why don’t I have a choice? I came here on my own free will. I thought I could help in a research project. But if I have to hurt somebody to do that, or if I was in his place, too, I wouldn’t stay there. I can’t continue. I’m very sorry. I think I’ve gone too far already, probably.”
In some cases, the subject stood up during the experiment and walked away.
So maybe it is a mistake to view Milgram’s work as an “obedience experiment”—although he clearly did. Maybe what he actually conducted was a disobedience experiment, showing that some people will not follow orders no matter how strong the social pressure.
They are out there, awaiting the moment when history calls upon them to disobey. We should not lose sight of them in the weeds of social psychology. They are Stanley Milgram’s unheralded legacy, and we may even stand among them.