In the last lecture, we discussed categorical thinking, which is an extremely valuable aspect of human thought. When it comes to social groups, however, categorical thinking can give rise to certain problems if people see sharp divisions between themselves and others or between ingroups and outgroups. In social psychology, the term "ingroup" simply refers to a group that you're a member of, and the term "outgroup" refers to a group that you're not a member of. So, one person's ingroup may be another person's outgroup, and vice versa. At any given moment, we're all members of many, many ingroups and outgroups, and in some cases, our membership changes over time. For example, a few years ago I joined the "over 50 years old" group, which had been an outgroup up until that point. But here's the interesting thing: as soon as people start categorizing and focusing on groups—whether they're a member of the group or not—certain psychological processes come into play. First, focusing on a group can change how we see the individuals or elements within that group. And second, we often end up seeing one group as better or worse than another, which is one reason why the U.S. Supreme Court ultimately gave up on the idea of separate but equal treatment under the law. As soon as you've got two or more groups—especially social groups—it's very hard to treat them as equal. So, let's examine each of these two points, beginning with the first one. In 1963, Henri Tajfel and Alan Wilkes published a very thought-provoking, and now classic, study in which people were shown eight lines and asked to estimate, in centimeters, how long the lines were. The lines were all between 16 and 23 centimeters—that is, roughly 6 to 9 inches long. Here's a reproduction of the lines in the exact proportions used. In some variations of the experiment, the lines were shown individually, and in others they were shown at the same time. That turned out not to make a big difference. What did make a big difference was grouping the lines. In some experimental conditions, the four shortest lines were each shown with the letter A and the four longest lines were each shown with the letter B. In other words, they were grouped by letter; the A group had shorter lines than the B group. And when the lines were shown with letters, even though they were the very same lines, people tended to overestimate the difference in length between the longest line in group A and the shortest line in group B by 100% or more—that is, they came to regard the lines in B as long lines, and they rated the shortest line in that group a little bit longer than it really was. Now, why should we care? Well, just as Solomon Asch used a line judgment task to study conformity, Tajfel used a line judgment task to study stereotyping. Stereotyping? Really? What do line lengths have to do with stereotyping? Here's what Tajfel and Wilkes wrote in their original report, and I think it's interesting enough that I'd like to read you just a few lines. They wrote: "These findings may possibly have some fairly wide implications for a variety of judgment situations. They represent, in a sense, a simplified exercise in stereotyping. An essential feature of stereotyping is that of exaggerating some differences between groups classified in a certain way and of minimizing the same differences within such groups." In other words, what Tajfel is saying is that the same dynamic that takes place with respect to line judgments also applies to judgments of people.
And even though there are certain exceptions, to a great extent he was right. According to social psychologist David Wilder, who published a major review on the topic, people "often assume similarities within groups and differences between groups to a greater degree and across a broader range of characteristics than is warranted by objective evidence." Once people are grouped, we tend to exaggerate differences between groups and similarities within groups. And I should add one historical footnote: Tajfel, who was a world-renowned social psychologist, became interested in stereotyping in large part because he was a Polish-born Jew whom the Germans took prisoner during World War II, and who lost every member of his immediate family to the Holocaust. So once again, there's a link between social psychology and the Holocaust. In the span of just 15 or 20 years, we have Tajfel, Milgram, Asch, Moscovici, and others all trying to make sense of what happened in the Holocaust and to find ways to prevent similar tragedies in the future. Now, remember, I said that the formation of groups not only alters how people think about the members, but that it also leads us to see one group as better or worse than another, making it hard to have groups that are seen as separate but equal. Well, in addition to his research on the first topic, Henri Tajfel made major contributions to our understanding of the second topic. And the way he did this was by developing a very creative research procedure that's now known as the "minimal group paradigm." To see how it works, I'd like you to take five seconds and estimate the number of dots on the following screen. Don't count them—just estimate what the number feels like. Congratulations! Based on your answer, it looks like you're an overestimator, and we all know it's better to be an overestimator than an underestimator, right? Well, no—we don't. To the best of my knowledge, there's no difference in life outcomes between people who overestimate and people who underestimate the number of dots on a screen, but what Tajfel found is that when people were randomly assigned to receive feedback that they were overestimators or underestimators, regardless of the estimates they actually gave, they tended to show "ingroup bias." That is, they tended to favor members of their own group over members of the outgroup. For example, when participants were asked to allocate money to other overestimators and underestimators, excluding themselves, Tajfel found that they tended to give more money to members of their own group than to members of the outgroup. Likewise, a Canadian study found that if you ask people to go through a photo album and pick five strangers who look like supporters of each major political party, liberals and conservatives both guess that the relatively attractive people are supporters of their own party. That's ingroup bias. And by the way, the Wizard of Oz replicated this study—got the exact same results. Joking aside, what makes Tajfel's results so striking is how little it takes to trigger ingroup bias. What the minimal group procedure reveals is that even when groups have no history of conflict and the members don't even know each other, they still show ingroup favoritism. In the words of social psychologist Marilynn Brewer, "Many forms of discrimination and bias may develop not because outgroups are hated, but because positive emotions such as admiration, sympathy, and trust are reserved for the ingroup and withheld from outgroups."
Even a chance event, such as one group getting something that another doesn't as a result of a coin toss, is enough to trigger ingroup bias. This result was found in a wonderful little study published many years ago by Jacob Rabbie and Murray Horwitz, the last study that I'll mention in this video. In the study, the experimenter divided junior high school students in the Netherlands into four-person "green groups" and four-person "blue groups" for what he said were administrative reasons only. And even though the students didn't interact with each other, he referred to them as greens or blues so that they had a group label. Then, after the students completed several surveys, the researcher told students in the experimental conditions that he wanted to give them each a transistor radio as a token of appreciation for participating in the study, but that, unfortunately, he only had enough radios for the members of one group. There were three experimental conditions. In the chance condition, he said: "Perhaps the best thing we can do is flip a coin to decide which group gets the radios and which does not. Okay? You want heads or tails?" The experimenter then tossed a coin, gave radios to the winning group, and apologized to the losing group. In the experimenter condition, the experimenter simply said: "Let's see—I'll give the radios to this group." He then gave the radios to a randomly selected group. And in the group condition, the experimenter allowed a randomly chosen group to vote by secret ballot to decide who would get the radios, and regardless of the actual vote, he announced that they had voted to receive the radios themselves. You can imagine how this would make the other group feel! And the experimenter then delivered the radios as before. In addition to these three conditions, there was also a control condition in which students were divided into groups but no mention was made of transistor radios. What Rabbie and Horwitz found is that students in the experimental conditions (who either received or were denied transistor radios) displayed ingroup bias, whereas students in the control condition did not. Students in the experimental conditions—even in the chance condition—rated the ingroup as less likely to be hostile and more desirable to belong to than the outgroup. And they also saw members of their own group as more responsible and more desirable as friends than members of the outgroup. So even a coin toss was enough to trigger ingroup bias—a finding that suggests we need to be careful when we create groups or teams. But we also need to be careful not to overinterpret or misinterpret the lesson from minimal group research. First, the results do not mean that prejudice and discrimination are merely a matter of superficial group dynamics; clearly, prejudice is also a function of culture, politics, history, and economics. And second, the fact that ingroup bias is so easy to trigger doesn't mean that it's unavoidable or inevitable. There are very effective techniques to reduce prejudice, stereotyping, and discrimination. This week's reading assignment on prejudice suggests some ways to do this, and thanks to the American Psychological Association, there are also some translations of this particular reading, in slightly shortened form, posted on the web at understandingprejudice.org/apa. For this week's assigned reading, you're welcome to read any of these translations or the original in English.
Before you do that reading, though, let me suggest that you watch the next video, which will set up some of the topics in that reading, because there's more to the story than ingroup bias.