This lecture is about heuristics and biases and
the ways that they create intellectual humility or intellectual arrogance.
And we'll start by defining intellectual humility and
arrogance so we can have a working definition that will take us forward.
We can define intellectual humility in three ways.
First of all, we can think of it as
an appropriate sense of one's present knowledge limits.
This means knowing the extent to which one doesn't know.
One example is knowledge of facts: what facts one does know and what one doesn't know.
Another is one's ability to give explanations of how things work
and one's sense of how well one understands various phenomena in the world.
That's one sense. A different sense is an
appropriate sense of one's future knowledge limits, that is
to say, what one is likely to know in the future about how
the world works or about how various phenomena occur.
It can also be what one could know, and this means
not what one is likely to know but what one could know if one chose to know it.
So, for example, one might think one could fully master
quantum mechanics even though one might choose never to do so.
And finally, another sense of intellectual humility,
or of intellectual arrogance, is having an appropriate sense of one's cognitive abilities.
This overlaps with knowledge but also includes some sense of problem-solving skills,
calculation skills, and any other performance-based estimates.
What we mean here is that one might think one can think on
one's feet very quickly and with great facility.
But one might also think that one doesn't necessarily know more than another person; one can
just simply process information more quickly.
Now intellectual arrogance we'll think of
as roughly the opposite of intellectual humility.
It implies an inflated sense of one's present knowledge:
one thinks one knows more than one really does.
It implies over-optimism about what one will know,
or what one could know, in the future.
So this implies expecting the future to hold a great deal
of knowledge for oneself, but not being accurate about that.
And it could also include overconfidence about
one's abilities, thinking one is much more able cognitively than one really is.
But these aren't precise opposites.
Intellectual arrogance may imply pride in social comparison,
something that's not typically implied by intellectual humility.
That's to say if one is intellectually arrogant,
one tends to think one's much better than one's peers and one's very proud of that fact.
Intellectual arrogance also has a certain confidence or certainty associated with it.
One is quite confident or certain that one is intellectually superior to one's peers.
But if one is intellectually humble, one may not be sure how bad one is; one may
have a great lack of confidence in one's skills or be confident
that one is actually quite inferior intellectually.
So confidence and certainty correlate strongly with
intellectual arrogance and not so much with intellectual humility.
Now one question that arises here is why there should be intellectual arrogance.
It seems so socially undesirable that you can ask why it is so common.
If it's so unpleasant to be arrogant, and it alienates others, why do we see it?
And there seem to be two factors.
One has to do with social motivational factors, namely that various tendencies
to self-enhance one's own impression lead one to inflate one's sense of competence.
But there are also, even if one is not motivated to think
one is better than others, potential cognitive
biases and heuristics that cause one to inflate one's sense of what
one knows about the world, even though one really doesn't
have that knowledge. And these cognitive biases and heuristics
will be the focus of our talk,
because they're particularly interesting.
It could be that one really doesn't want to be arrogant but one nonetheless falls into
various forms of arrogance because of these biases and
heuristics working in ways that often occur outside awareness.
These two causes can interact.
That is to say, social motivational factors may reinforce the biases, and
the biases may make one feel better about oneself, but the two causes are also clearly distinct.
So again, as I said, our focus will be on cognitive biases and heuristics.
And we want to define these briefly to get a better sense of what they mean.
And we'll think of heuristics as shortcuts in thinking
that entail quicker computation and
less intense processing, but sometimes at the cost of making errors.
So they're shortcuts; they're desirable because they reduce
cognitive load, but they do often have costs associated with them.
When the errors that we make are consistently in
one direction, they're often called biases.
Now most researchers consider most heuristics to have some kind of adaptive value.
They're there for a reason.
We need those shortcuts to get by in our daily lives.
One way of thinking about what's happening is what is
sometimes called the attribute substitution model of
heuristics, where difficult questions are often
answered by substituting an easier one instead.
And so one can think of this as trying to swap out a very hard problem for
a simpler one, but at a cost, because sometimes we'll make errors.
And we'll refer to social motivational factors when they heavily interact with
these cognitive biases, which is often true in developmental accounts.