Gina Perry, an Australian psychologist, once regarded Milgram as a “misunderstood genius who’d been penalized in some ways for revealing something troubling and profound about human nature.” But her research on Milgram’s experiments changed her mind: “By the end of my research I actually had quite a different view of the man and the research” (NPR, 2013).
Perry interviewed several study participants decades after the experiment; one unnamed participant told her, “The thought of quitting never… occurred to me” (NPR, 2013).
Two issues Perry addresses are what she calls the “powerful parable” and the way a single statistic, 65 percent, has come to stand for the results. “Perry believes that despite all its ethical issues and the problem of never truly being able to replicate Milgram’s procedures, the study has taken on the role of what she calls a ‘powerful parable’” (Cherry, 2008). In other words, Milgram’s work inspired other researchers to explore what makes people follow orders and what leads them to question authority. While the parable is not necessarily a criticism, Perry does criticize the statistic that is most widely publicized. Milgram ran 18 experiments in total, each producing a different rate of obedience and disobedience, yet 65 percent obedience is the figure always cited when his work is discussed. Perry’s objection is that this 65 percent applies only to the first, baseline study, not to the other 17.
In 1964, Milgram responded in the American Psychologist to Baumrind’s concerns about the ethics of the experiment, his ambition, and his breach of participants’ trust. Milgram conceded that the experiment upset and distressed some of the participants, but still defended it as ethical. He made clear in his reply that it had not been his intention to induce stress, and to support this claim he presented the results of some follow-up procedures. Milgram had sent each participant a report on the experimental procedure, with a questionnaire appended asking participants to reflect on their experience. Ninety-two percent of subjects returned the questionnaires; almost 84 percent said they were glad to have participated, and only 1.3 percent said they were sorry they had (Blass, 2004, pp. 125-127).
One of the most important controversies regarding his research dealt “with the ethics of immersing participants in a highly stressful situation without their prior consent and deceiving them into believing that they had hurt, and possibly harmed, an innocent human being” (Blass, 1998, pp. 50-51). Half a century later, the controversy over the ethics and meaning of Milgram’s experiments continues. For example, the 2013 Obedience to Authority Conference, a three-day academic gathering at Nipissing University in Canada, was convened to discuss issues that still arise from the experiments (Chin, 2013).
Participation in the Stanley Milgram Experiments (Photo Credit: Derek Gregory)
An ethical issue that received particular attention was Milgram’s use of deception. Subjects were told they were participating in a study of the effects of punishment on learning and memory; in fact, Milgram was studying obedience. Not until what Milgram called the debriefings did participants learn that they had not actually hurt anyone. Many critics believe the debriefing was not enough, however, because it could not prevent subsequent psychological damage: the realization that they were capable of administering potentially lethal shocks to another human being could have long-term negative effects on the subjects (Controversy in Ethics of Obedience Research). Milgram’s experiment ignited a debate, particularly in the social sciences, about what it was acceptable to put human subjects through (NPR, 2013).
Some critics doubted the findings themselves, suggesting “that the participants knew that no shocks were being administered, but they played along so as not to ruin the study” (Forsyth, 2010, p. 248). Many social psychologists felt that trust was a large factor in the experiment: the participants trusted the experimenters, and that very trust may have led them to conclude that the shocks could not be real.
In October 1963, the Journal of Abnormal and Social Psychology published a nine-page article by Milgram, titled “Behavioral Study of Obedience,” which presented his obedience experiments. “By his fourth sentence he was already referencing Nazi death camps and their ‘daily quotas of corpses,’ implying that the Holocaust was something his 9-page paper would help the world understand” (Baker, 2013). Milgram was drawn to obedience research by the atrocities of World War II. He drew parallels between the behavior of the subjects he saw in the lab and the willingness of ordinary Germans to slaughter Jews and other minorities during the Holocaust. Milgram did not think there was something wrong with the Germans in particular; rather, he suspected there was something wrong with humanity, and he wanted to find the answer to this problem.
German Nazis during WWII (Photo Credit: Becket Adams)
A participant of Milgram’s experiment (Photo Credit: Saul McLeod)
What Milgram and others thought they would discover versus what they truly discovered
Milgram was certain that very few participants would actually carry out the experimenter’s orders to the full 450 volts. Before the experiment, he asked a panel of psychiatrists, college students, and middle-class adults to predict the results; only a few predicted that anyone would give a shock greater than 180 volts. “So he was surprised when 26 of the 40 (65 percent) individuals who served as teachers in the initial experiment administered the full 450-volts to the presumably helpless learner” (Forsyth, 2010, p. 244). “Most people, including both experts and laypersons alike, were surprised by the level of obedience Milgram discovered in his research” (Forsyth, 2010, p. 247).
The Shock Generator (Photo Credit: Jeffry Ricker, Ph.D.)
The baseline study (by which I mean his first experiment), which was followed by seventeen variations, showed that 65 percent of participants would obey authority to the end. Milgram had 40 men come to Yale, where they were “assigned” either the “learner” or the “teacher” role; the teachers did not know that the learner was actually a confederate. In the baseline study, the teacher sat in a room with a man in a lab coat and administered a memory test to the learner. For each wrong answer the learner gave, the teacher was to deliver a shock. “The generator had 30 different switches running in 15-volt increments from 15 to 450-volts. The higher levels of shock were labeled in big letters as ‘Intense Shock’, ‘Extreme Intensity Shock’, ‘Danger: Severe Shock’, and, ominously, ‘XXX’” (Jones, 2006, p. 397). Once the shocks reached 300 volts, the learner pounded vigorously on the laboratory wall; he pounded again at 315 volts, but was not heard from after that level.
Milgram’s 17 variation experiments duplicated the baseline experiment with slight changes. Experiment 2 was a voice-feedback condition, in which the teacher could hear the learner’s complaints from an adjacent room. Experiment 3 focused on proximity, placing the learner in the same room as the teacher, only a few feet away. Experiment 4 was a touch-proximity condition, in which the teacher and learner sat in the same room and the teacher had to physically force the learner’s hand onto the shock plate.
These variations in proximity altered how many participants obeyed to the end. “35 percent of the subjects defied the experimenter in the Remote condition, 37.5 percent in Voice-feedback, 60 percent in Proximity, and 70 percent in Touch-Proximity” (Milgram, 1974, p. 53). Other variations changed whether the man in the lab coat (played by a high school biology teacher) was present in the room, and the location of the experiment itself (it was moved off Yale’s campus).
Stanley Milgram is well known today for his controversial experiments on obedience. During the 1960s, while he was a professor at Yale, Milgram conducted a series of experiments on obedience. His findings were shocking: most people, Milgram found, will obey authority figures when instructed to harm others, even when such actions conflict with their own personal beliefs. Milgram’s experiments had enormous implications for understanding how so many people could come to take part in the mass murder of Jews and other ethnic minorities in Nazi Germany. However, the experiments were questionable from an ethical perspective, and they have been heavily criticized, both at the time and since, by social psychologists and other scholars. This paper will discuss Milgram’s experiments and the criticisms they drew. Because Milgram’s experiments have been central to the study of genocide, it is important to understand the objections that people have raised to them as well as the ways they have been, and continue to be, justified.
Milgram’s obedience experiments have had a mixed reception in psychology.
On one hand there is recognition of the importance of the work but this is tempered by real concerns about the ethics of his procedure, doubts about the meaning of the results and particularly an almost disregard of Milgram’s attempts to explain his results (Lunt, 2009, p. 63).
Many of the criticisms Milgram received were extreme, but they may have missed the main point: his experiments demand our attention, provoke us to think, and raise important questions about power and subjectivity (Lunt, 2009, p. 63).
Whether or not the ethical and other arguments against Milgram’s experiments are valid, one thing is certain: he taught everyone something about obedience. His interest in the Holocaust sparked his initial interest in obedience to authority, and his work suggests that nearly anyone is susceptible to it. “To a remarkable degree, Milgram’s early research has come to serve as a kind of all-purpose lightning rod for discussions about the human heart of darkness” (Baker, 2013).
Adolf Eichmann’s Trial (Photo Credit: Timothy Nunan)
Adolf Eichmann was the head of the Department for Jewish Affairs in the Gestapo from 1941 to 1945 and chief of operations in the deportation of three million Jews to extermination camps. It was Eichmann who organized the Wannsee Conference of January 1942, which focused on issues related to the “Final Solution of the Jewish Question.” He then began to organize the mass deportation of Jews from Germany and Bohemia, in accordance with Hitler’s order to make the Reich free of Jews as rapidly as possible.
At the end of World War II, Eichmann was arrested and confined to an American internment camp, but he managed to escape unrecognized. He fled to Argentina and lived under the alias Ricardo Klement for ten years, until Israeli Mossad agents abducted him in 1960 to stand trial in Jerusalem. Eichmann’s trial ran from April to August of 1961. On December 11, 1961, Eichmann was convicted on 15 criminal charges, including crimes against humanity, crimes against the Jewish people, and membership in an outlawed organization. He was sentenced to death on December 15, 1961. Two minutes before midnight on May 31, 1962, Eichmann was hanged in Ramleh, Israel; he was cremated and his ashes were scattered at sea, beyond Israel’s territorial waters. This remains the only time Israel has carried out a death sentence (Adolf Eichmann, 1997).
At his trial, Eichmann expressed surprise at being hated by Jewish people, stating that he had “merely obeyed orders, and surely obeying orders could only be a good thing.” He was declared sane by six psychiatrists and was described at his trial as a very average man (McLeod, 2007). The New Yorker sent the writer Hannah Arendt to cover the trial, and it was Eichmann’s dull bureaucratic demeanor that led Arendt to coin the phrase “the banality of evil” (Perry, 2013).
Adolf Eichmann (Photo Credit: Krusty 1960s History Story)