Originally posted at the Independent Bloggers’ Alliance
From Thursday night’s Daily Show, in which Jon Stewart interviewed Dr. Philip Zimbardo, who recently wrote a book entitled The Lucifer Effect. I recognize him from those Discovering Psychology videos, which I have been showing to my psych classes for years. (Demetrius still maintains that he looks uncannily like The Master from Doctor Who.)
Philip Zimbardo: It’s great to be here–nothing I’ve done in my whole career is going to earn me more “pizazz units” than being on your program. As my students say, “It’s totally awesome, man!”
Jon Stewart: Students at Stanford talk that way?
Philip Zimbardo: Students everywhere talk that way.
Jon Stewart: I’m going to start talking that way! Your book, it’s called The Lucifer Effect. Now, I was a psychology major–
Zimbardo: What did you get in Introductory Psych?
Stewart: Introductory Psych 101, I got “Yes, your essay was long enough.” (Laughter.) But the two famous experiments are the Stanford Prison Experiment and the Milgram shock experiment, from which we’re all taught that people are much more evil than they would appear to be on the outside. Tell us about the Stanford experiment.
Zimbardo: No, people are not more evil than they would appear to be on the outside. The Stanford Prison Experiment that I detail at great length in The Lucifer Effect really describes the gradual transformation of a group of good boys, 24 college students who volunteered to be in the experiment. Only the normal, healthy ones were included, randomly assigned by a flip of a coin to be guards or prisoners. What we see is how quickly the good boys–and that’s important, they start off good–become brutal guards, and the normal kids become pathological prisoners–
Stewart: Now when you say “the slow descent from good to evil,” it took a week, did it not?
Zimbardo: No, it actually took 36 hours. (Laughter) We were counting in minutes. (More laughter.) No, at 36 hours, the first prisoner had an emotional breakdown, and each day after that, another one followed suit. So the study was supposed to go two weeks; I had to end it in 6 days because it was out of control. These good guards were totally into the role of being sadistic, controlling, and dominant. The prisoners rebelled and they got their asses kicked, and the guards just dominated them, and we ended the study because it was out of control.
Stewart: Are those people now running the country? (Laughter and applause.)
Zimbardo: Some of them got jobs at Enron. (Laughter.)
Stewart: It boils down to this–I get the sense that we’re in the trouble we’re in because of this idea that there is good and there is evil.
Zimbardo: Right.
Stewart: And it doesn’t mix, and we are locked in some sort of Hobbit-like battle between the two, but what you’re suggesting is, it’s pretty much in flux.
Zimbardo: Oh, absolutely. Essentially what The Lucifer Effect is, is a celebration of the human mind. The human mind is this exquisite organ, which has the infinite capacity for any of us to be kind or cruel, selfish or destructive, villains or heroes, and because of that capacity, it really is the situation that moves us along a path to be perpetrators of evil–most of us do not, but are innocent bystanders. And the good thing that comes out of my research is, some of us get moved to be heroes. And so the question is, why do good people turn evil, and how can we get ordinary people to be more heroic?
Stewart: Well, here’s an interesting thing. We had a kid on the show named Ishmael Beah, from Sierra Leone. As a teenager, they gave him a mixture of gunpowder and cocaine, gave him a gun and told him, “These people killed your family,” and turned him into a killer. And then he, himself, worked out of that and has become somewhat heroic. So each person has that same capacity. But in the so-called “death cult,” is that how they get people to be this way? Is there a certain kind of technique to turn people into that?
Zimbardo: Yes, but that’s extreme. In Rwanda, it was enough to have the local government announce on the radio that starting today, the Tutsi are our enemy. And they went around giving each Hutu family a machete and a club, and they say, “We want to destroy the enemies, because they are a threat to our national security”–you’ve heard that song before. And in three months, Hutus killed 800,000 of their neighbors. And the “weapon of mass destruction” was what? It was a machete and a club.
Stewart: Is there an inoculation?
Zimbardo: Yeah, of course. None of these things happen–in the Stanford Prison study, I draw the parallels to Abu Ghraib, which are identical. I mean, the things that happened in our study all happened on the night shift. The worst things that happened at Abu Ghraib were on the night shift, where there was no supervision and no oversight. You want to eliminate evil in institutions, you have to have strong oversight. You have to have leadership that says, “This is what we will do, this is what you can’t do. We will do no harm, and if you do harm, here’s what’s going to happen. You’re going to get in trouble, you’re going to lose your job.”
Usually in situations like that, the leadership backs off–they let you do whatever you want. In Abu Ghraib, those abuses went on for three months–who was watching the store?
Stewart: So you’re saying they should have stopped Abu Ghraib after 36 hours.
Zimbardo: And the same thing with the war we’re in–maybe 48 hours.
Stewart: Absolutely. It’s unbelievably fascinating, even for someone who did very poorly in psychology. The Lucifer Effect, it’s on the bookshelves now–Philip Zimbardo.
Nothing shockingly new here–the Stanford Prison study took place way back in 1971. The insane thing, in my mind, anyway, is that so many still don’t get what studies like that showed us about human nature.
This link takes you to some of the things Zimbardo has said about the parallels between the prison experiment and Abu Ghraib.
http://www.prisonexp.org/links.htm#iraq
Thanks for the link…an incredible number of links there on so many related topics.
I think many people simply don’t want to comprehend the implications of that study, plain and simple; it’s just too scary for them to believe.
A paper that is relevant but wasn’t cited is Linnda R. Caporael, Robyn M. Dawes, John M. Orbell, and Alphons J. C. van de Kragt, “Selfishness Examined: Cooperation in the Absence of Egoistic Incentives,” Behavioral and Brain Sciences 12 (1989): 683–739. Unfortunately, given the barbaric attitude of our society to knowledge, this paper is not available on the Web.
The main point of the paper is that humans are wired to exhibit altruistic behavior to people they consider to belong to their own group, but not to people who they consider to fall outside of their group. As far as I recall, the paper also points out that who you consider to belong to your own group can be manipulated.
That explains why our corporate media goes out of their way to portray Muslims and poor people in general as not belonging to what we are, to our group. Our brains are wired so that if we don’t believe that someone is “one of us”, we don’t care what happens to them.
Bittorrent links:
The Daily Show 03.29.07
The Daily Show 2007 03 26 – Week of March 26 2007
I would’ve loved to have gone to Stanford and taken psychology from Zimbardo…one of many regrets…
Thanks for sharing this — I’m tempted to head over to the iTunes store and lay down a couple of bucks for this episode…
I’m tempted to head over to the iTunes store and lay down a couple of bucks for this episode
Why not just download one of the torrents I linked to above for free?
Fascinating — and disturbing.
The only thing I would quibble with is that Zimbardo says that most of us are innocent bystanders in the face of evil actions. Bystanders, yes. Innocent? I’m not so sure. Don’t we have a responsibility to do what we can to stop it? Most of us don’t think of ourselves as heroic and don’t take the big risks heroes do.
But doesn’t decency call us to take at least the smaller risks in the face of evil? If enough people react, maybe one heroic action won’t be necessary. Or maybe a few people who start out small will develop their hero muscles.
In any case, the study invites us all to get off our moral high horses.