Intellectual humility is a genuine openness to the evidence, not just the evidence you have, but the evidence that's available in the context. Often we ignore that evidence because of our feelings, because of our fears, because of our overconfidence that we're right, or because of our underconfidence that we're in a position to know. Humility requires us to look for the available evidence and form our beliefs on that basis. This is the point of view argued for by Ian Church and Peter Samuelson in their book, which is the companion to this course, "Intellectual Humility: An Introduction to the Philosophy and Science."

There are a few outstanding examples in the book that I want to go through, because they bear on issues of believing on the basis of what other people tell you. Remember, the truth is in the middle. We often need to know whether the person we're talking to is trustworthy or not, and we need to rely on our own experience, our own evidence, and our ability to sift through the evidence, to ask the speaker questions, and to follow up when there is disagreement. One of the obstacles to actually forming our beliefs on the basis of the available evidence, our background evidence, the evidence the speaker happens to have, the evidence out there that supports or undermines their reliability, is our emotions: our fears, and sometimes our biases, biases that can come from our culture.

So let me give you a couple of examples involving our fears. One of the most controversial issues in America at this time is climate change. And that's surprising, because when you look at the science, the evidence is overwhelming that our climate is in fact changing. Whether it's man-made or not, it's happening, and it's something we're all going to have to face collectively, not just country by country but as a species around the entire globe. But many people won't believe it.

So suppose you haven't even thought about it, but someone starts to describe it to you. And you think: oh my goodness, that means enormous changes to the environment, and it could mean enormous changes to agriculture, to our food supply, to where we live in cities, to migration patterns, and so on. It is scary, extremely scary! And you don't want it to be true. I don't want it to be true. None of us wants it to be true. Our fear could then lead many of us to think it's not going to happen. Because we want it to be true that it's not going to happen, our fear could lead us to look for evidence that it's not going to happen. That's how confirmation bias often works: we want something to be true, and so we look for evidence supporting what we want to be true, so that we're not scared anymore.

But suppose you're talking to a scientist, and suppose this scientist has nothing to gain, while we all have so much to lose if climate change is real. And you ask them: what's the evidence? And they say the evidence is overwhelming that the climate is going to change in radical ways, maybe in our lifetimes, certainly in the lifetimes of our children and grandchildren. The world is going to be a very different place. And suppose you don't believe the scientist, and you think: well, you know, scientists are sometimes mistaken, they're not always right, and maybe this one is lying to me. And you do that out of your fear, because you don't want it to be true. Then your emotions are driving your response to the evidence, in particular, to the available evidence.
In a way, you're being overconfident that you're in the right when here is someone who's an expert in the sciences and you're simply ignoring them. Think about it: we rely on scientists all the time. We get on airplanes that fly us around the world, and that's physics, that's engineering, that's computer science. We rely on scientists in so many ways that it's impossible to count them all. But when we don't want what they're telling us to be true, we let our emotions drive our response. That's not the intellectually humble response; in a way, it's a kind of arrogant response, as if perhaps you're in a better position to know than the other person. So when we're responding to testimony, it's true that we often have to be on the lookout for possible signs of insincerity, when we're trained to see them, or for signs that the speaker doesn't know what they're talking about, in which case we need to ask them for their reasons. But in some cases there's such a strong asymmetry between our position and theirs, because they're genuine experts on the topic, that we would need extremely good reasons not to believe them.

Let's think about another case from the book by Church and Samuelson. Suppose you're going to buy a car, and you're not very good at telling whether one car is better than another, but there's a particular kind of car you really want to buy. And you find a used-car salesman with a lot who's selling the kind of car you've always wanted. Maybe you've always wanted a 1984 Mazda GLC. I don't know why you'd want one of those, but you might. And you see that this salesman has one. So you go, and the salesman tells you it's only ever had one owner, that person hardly ever drove it, it's a fantastic car, and you really should buy it at this great price. And because you're anxious to have it, since that's the kind of car you really want, you ignore all the evidence you've been given, all the advice from your friends that you should not simply believe a used-car salesman, and you buy it, thinking not only did you get a great car but you got a great deal.

These are cases where, oftentimes, we think you need to get a second opinion. Drive that car to a mechanic and get the mechanic's opinion. There's available evidence, namely that used-car salesmen often have strong motives to mislead you, and there's additional evidence you can acquire: a second opinion from a mechanic you trust about whether the car is a good one or not. If you simply insist on believing you bought a good car because you don't want to feel bad about being fooled, then you're not being intellectually humble. You're making the wrong response by not going out and making sure you've got all of the available evidence. Those are both cases where the hearer is responding in the wrong way to the speaker.

In their book, they also discuss another case, one discussed by Miranda Fricker in her book "Epistemic Injustice," and it's a great case because you can watch the movie: the case is from "The Talented Mr. Ripley." In that movie there are four main characters: Mr. Ripley; his friend Dickie; Dickie's girlfriend, Marge; and Dickie's father, Mr. Greenleaf. Mr. Ripley is extremely charming. He comes from a non-privileged background, but he assimilates into this group of very privileged friends. Mr. Ripley wants to have the life that Dickie has, and he becomes Dickie's good friend, or so Dickie believes.
And he charms Mr. Greenleaf, who comes to think Ripley is a wonderful character. Then one day, Ripley kills Dickie to take over the kind of life that Dickie has been living. Dickie has disappeared. Watching the film, we know that Dickie disappeared because he's been killed. The police start their investigation. Who should they be looking for? Well, they're looking for Dickie, and if Dickie is dead, they're looking for his killer, and we know it's Ripley. But Mr. Greenleaf doesn't think it's Ripley, because Ripley has so charmed him. Marge, however, figures it out. She is aware of all the available evidence that suggests it's Ripley, and she takes this evidence to Mr. Greenleaf. But does he believe her? No. He doesn't believe her at all. He's been charmed, on the one hand, by Ripley, and on the other hand, he's full of prejudices and biases that prevent him from believing a woman, especially on a controversial topic, a topic she cares deeply about. He starts to believe that she is simply being hysterical and that women can't be trusted on these topics. He doesn't believe her at all because of his biases.

It turns out there are many biases like these, biases that make us downgrade the trustworthiness of the person we're talking to. We think they're not in a position to know. Maybe it's a gender bias, maybe it's a racial bias, maybe it's a class bias. In all of these cases there's a kind of arrogance that we possess, perhaps without even knowing that we possess it. Because of our biases, we will not believe certain kinds of people; we will not form beliefs on the available evidence, the evidence they possess, because we think they're not in a position to have it. That's a kind of failure to be intellectually humble. It's a failure to respond to the available evidence, driven by our biases.

And again, it's difficult to be aware of our biases and to overcome them. But more and more research is being done in the social sciences, in particular in social psychology, that helps expose these biases, and we're developing techniques to train ourselves so that we're not tricked by them. It doesn't feel good to be told that we've got these biases; often they are gender biases or racial biases or class biases. But what intellectual virtue requires is that we overcome our fears, that we overcome those emotions, in order to grow. Just as we should go see the doctor to find out whether we're sick, because that's the best thing to do, so too we should be open to the possibility that there are biases clouding our judgments and preventing us from forming beliefs on the basis of the available evidence. Because when we do, we are better off overall. We acquire more information, which gives us more knowledge about the world and so lets us make plans and judgments that are more likely to be effective, and people with that kind of information will become our partners in getting a better understanding of the world.