
The experiment requires you to continue

Authority plays a prominent – perhaps obvious – role in the legal system. Police, lawyers, and judges all wield prestige and authority. Psychologists have been interested in people’s tendency to obey authority for decades, and much of our understanding of the psychology of authority is built on the work of one man. If you’ve taken an introductory psychology class, you have probably heard about Stanley Milgram’s studies on obedience to authority. Milgram’s program of obedience research is some of the most famous – if not the most famous – work in psychology. Many people have heard of Milgram’s work, but most people don’t know the details of his experiments or about the follow-up work of researchers who came after him.

In the most well-known version of Milgram’s obedience experiment procedure [1], an experimenter greets two participants and explains that they are to take part in a study on the role of punishment in learning. The experimenter then (apparently) randomly assigns the participants to the roles of “teacher” and “learner.” However, one of the participants is actually a member of the research team (what researchers call a “confederate”), and the assignment procedure is rigged so that the confederate is always the learner. The learner is taken to a separate room (equipped with a microphone and speaker so the teacher and learner can communicate) and hooked up to an electric shock generator. The teacher’s task is to guide the learner through a series of memorization tasks, and each time the learner answers a question incorrectly, the teacher must administer an electric shock to the learner, with the voltage increasing each time. The maximum voltage switch is labeled with a vague but ominous “XXX.” With each shock, the learner cries out in pain and obviously increasing distress. The experimenter does relatively little in this procedure, but if the teacher expresses reluctance to shock the learner, he delivers a series of scripted verbal prods requesting that the teacher continue to administer the shocks. For example, the teacher is told, “The experiment requires you to continue.”

[Figure: A diagram of the typical obedience experiment.]
The results were striking: More than 60% of the participants proceeded to administer shocks to the learner up to the maximum voltage, despite his vocal objections. Milgram conducted studies with 23 variations on the procedure, and each variation produced different levels of obedience [2]. For instance, in some variations, the learner claimed to have a heart condition and, after receiving several shocks, simply stopped responding (having apparently collapsed and possibly died). Despite the apparent gravity of the situation, this addition did not change the rates of obedience very much. One version, in which the teacher was only indirectly responsible for the shocks (the teacher read out the questions, and another person gave the shocks), elevated obedience rates to staggeringly high levels of more than 90%. Two variations in particular produced notably low rates of obedience: (1) when the experimenter did not directly instruct the participant to increase the shocks and (2) when the learner was a friend or relative of the teacher. Even in these versions, however, obedience rates were not zero; they were around 12-15% – remarkably high considering the possibly fatal consequences if the procedures had been real.

There are times in Milgram’s writing when he seems deeply unsettled by his results – that under circumstances that should intuitively have restrained people from obeying immoral commands, people continued to administer shocks. In 1965, he wrote,

What is the limit of such obedience? At many points we attempted to establish a boundary. Cries from the victim were inserted; not good enough. The victim claimed heart trouble; subjects still shocked him on command. The victim pleaded that he be let free, and his answers no longer registered on the signal box; subjects continued to shock him. At the outset we had not conceived that such drastic procedures would be needed to generate disobedience, and each step was added only as the ineffectiveness of the earlier techniques became clear. The final effort to establish a limit was the Touch-Proximity condition [in which the teacher had to physically hold the learner against the shock generator]. But the very first subject in this condition subdued the victim on command, and proceeded to the highest shock level. A quarter of the subjects in this condition performed similarly. The results, as seen and felt in the laboratory, are to this author disturbing [3].

As compelling and disturbing as Milgram’s results are, there is some disagreement among psychologists about how exactly to interpret them. There are numerous challenges for researchers trying to explain why the rates of obedience vary under different circumstances. First, Milgram did not seem to have a clearly charted plan for the different variations of the procedure that he tested. That is, he seemed to be testing situations that he intuitively thought might influence rates of disobedience, rather than sequentially testing a particular theory. Second, the standards of research ethics have changed since Milgram’s time. People are understandably concerned about the distress participants experience when they believe they have hurt someone during the experiment and, after the experiment is done, from knowing that they are capable of hurting and possibly killing someone. Because of this, it is challenging for researchers to use variations of Milgram’s procedures today. Some researchers have replicated the obedience experiment with modified protocols that stop the procedure early, at around the middle of the voltage scale, sparing participants the possibility of thinking they have killed someone. Although these replications have been informative (and have largely obtained results very similar to what Milgram found [4]), they remove what is arguably the most interesting part of the study. Because Milgram’s original data are somewhat limited and because ethical concerns restrict our ability to conduct follow-up studies, we can’t easily collect the data we would probably need to settle the arguments of interpretation. However, we’ve still learned a lot from Milgram’s work and from the efforts to make sense of it.

One of the most interesting and prominent explanations for Milgram’s findings is that rather than blindly following authority, people are motivated to comply by identifying with the experimenter, and they are motivated to disobey by identifying with the learner [5]. That is, people in an obedience study have two competing sources of responsibility: they are ostensibly a critical part of a legitimate scientific study, but they also have a responsibility not to unduly harm the learner. Under circumstances that lead them to identify more with the experimenter – for example, when the experiment takes place under the auspices of a prestigious university or when they are relatively detached from shocking the learner – they obey at high rates. Under circumstances that lead them to identify more with the learner – for example, when they have a prior relationship with the learner or when they must administer the shocks with their own hands – they are less likely to obey. Another way of thinking about this is that people are more likely to do things they believe are wrong if they think they are doing them for a purpose they also believe in and feel close to.

These tendencies toward obedience can have devastating results. People “following orders” have committed atrocities such as the My Lai massacre and the liquidation of the gulags [6]. Milgram himself set out to study obedience because he was interested in explaining how so many ordinary people could have helped carry out the Holocaust. His writing makes his concerns about the perils of obedience quite clear: “If… an anonymous experimenter could successfully command adults to subdue a fifty-year-old man, and force on him painful electric shocks against his protests, one can only wonder what government, with its vastly greater authority and prestige, can command of its subjects” [3].

Obedience may have more subtle, smaller-scale (but still grave) consequences, too. To our knowledge, there is no experimental research on these issues, but close examination of wrongful conviction cases suggests that obedience may play a role in some false confessions. Many proven false confessors were under the impression that they were serving an important role and assisting the police. Sometimes this impression was formed because of lies told by interrogators. For example, Adrian Thomas, who falsely admitted to murdering an infant, was told that if he could describe exactly how he caused the injuries, doctors might be able to save the child’s life*. Thomas provided a video-recorded demonstration of how he had hurt the infant – even though he hadn’t actually done it. Exonerees Jeff Deskovic and Amanda Knox have also described how they were led to believe they were important sources of information for murder investigations. Both of them ultimately provided false incriminating statements to the police. For at least part of these investigations, all three of these exonerees seemed to believe they were assisting in a legitimate and just cause by bending to the authority of interrogators. This is admittedly speculation, but it’s possible that the willingness to confess – even to things you didn’t do – is increased by the tendency to obey authorities whose motives you believe in and with whom you identify.

Milgram’s work shone a light on a dark and troubling part of human behavior. Although it is decades old, the research continues to be relevant, to inform researchers, and to help us understand puzzling behavior. Research on obedience to authority is difficult to do, but given the damage obedience can cause, it is well worth the effort.  

The post was written by Timothy Luke and edited by Will Crozier.

Notes

* The infant was, in fact, already dead when the interrogators told Thomas his confession was so urgently required. It was later found that the child had not been injured and had actually died of a separate medical issue.

References

[1] Milgram, S. (1974). Obedience to authority: An experimental view. New York: Harper & Row.

[1] Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology, 67, 371-378.

[2] Haslam, N., Loughnan, S., & Perry, G. (2014). Meta-Milgram: An empirical synthesis of the obedience experiments. PLoS ONE, 9(4), e93927.

[3] Milgram, S. (1965). Some conditions of obedience and disobedience to authority. Human Relations, 18(1), 57-76.

[4] Burger, J. M. (2009). Replicating Milgram: Would people still obey today? American Psychologist, 64(1), 1-11.

[4] Doliński, D., Grzyb, T., Folwarczny, M., Grzybała, P., Krzyszycha, K., Martynowska, K., & Trojanowski, J. (2017). Would you deliver an electric shock in 2015? Obedience in the experimental paradigm developed by Stanley Milgram in the 50 years following the original studies. Social Psychological and Personality Science, 8(8), 927-933.

[4] Dolinski, D., & Grzyb, T. (2016). One serious shock versus gradated series of shocks: Does “multiple feet-in-the-door” explain obedience in Milgram studies? Basic and Applied Social Psychology, 38(5), 276-283.

[5] Reicher, S. D., Haslam, S. A., & Smith, J. R. (2012). Working toward the experimenter: Reconceptualizing obedience within the Milgram paradigm as identification-based followership. Perspectives on Psychological Science, 7(4), 315-324.

[5] Haslam, S. A., Reicher, S. D., & Birney, M. E. (2014). Nothing by mere authority: Evidence that in an experimental analogue of the Milgram paradigm participants are motivated not by orders but by appeals to science. Journal of Social Issues, 70(3), 473-488.

[6] Glover, J. (1999). Humanity: A moral history of the twentieth century. Yale University Press.
