To study this phenomenon of overconfidence, Daniel Kahneman and Amos Tversky gave subjects factual statements to complete in writing, for example: “I am 98% sure that the air distance between New Delhi and Beijing is more than… miles but less than… miles” (Kahneman & Tversky, 1979). Most subjects showed overconfidence: in about 30% of cases, the correct answer lay outside the range about which they were 98% sure. (The air distance between New Delhi and Beijing is 2,500 miles.)
To find out whether overconfidence extends to social judgments, David Dunning and his colleagues devised the following scenario (Dunning et al., 1990). They asked Stanford University students to guess how a stranger would answer a number of questions, including: “Would you prefer to prepare for a difficult exam alone or together with friends?” and “Are your lecture notes neat or messy?” Knowing the type of questions, but not the specific questions they would have to predict, the interviewer subjects first questioned their future targets about their level of education, hobbies, academic interests, aspirations, astrological sign, and anything else that, in their opinion, might prove useful. Then, while the target subjects answered 20 such questions in writing, choosing one of two proposed alternatives, the interviewers predicted their targets’ answers and rated how confident they were in each prediction.
The interviewers’ predictions were correct 63% of the time, beating chance (50%) by 13 percentage points. On average, however, they felt 75% confident in their predictions. Predicting the answers of their own roommates in the dorm, they were 78% sure and proved right 68% of the time. What is more, the most confident subjects were also the most likely to be overconfident. Studies have likewise found only a slight positive correlation between people’s confidence and their accuracy in recognizing whether someone is telling the truth or lying (DePaulo et al., 1997). And when estimating their romantic partner’s sexual experience or their roommates’ favorite activities, people are overconfident as well (Swann & Gill, 1997).
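In studies like these, overconfidence is simply the gap between average stated confidence and the actual hit rate. The short sketch below, using made-up illustrative numbers rather than the study’s data, shows how that gap is computed.

```python
# A minimal sketch of how overconfidence is quantified: mean stated confidence
# minus actual accuracy. The numbers below are illustrative, not the study's data.

predictions = [
    # (stated confidence, prediction turned out correct?)
    (0.90, True), (0.80, False), (0.70, True), (0.60, False), (0.75, True),
]

mean_confidence = sum(conf for conf, _ in predictions) / len(predictions)
hit_rate = sum(correct for _, correct in predictions) / len(predictions)

print(f"mean confidence: {mean_confidence:.0%}")              # 75%
print(f"accuracy:        {hit_rate:.0%}")                     # 60%
print(f"overconfidence:  {mean_confidence - hit_rate:+.0%}")  # +15%
```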
The irony is that the less a person knows, the more confident they tend to be. As Justin Kruger and David Dunning note, it takes competence to recognize competence (Kruger & Dunning, 1999). Students who scored lowest on tests of grammar, logic, and humor were the most likely to overestimate their talent in these areas. People who do not know what good logic or good grammar is often do not even realize that they lack it. If ignorance can breed confidence, then we are entitled to ask: what is it that we ourselves do not know we do not know?
In Chapter 2 we noted that people are strikingly inaccurate at predicting their long-term emotional reactions to good and bad events. Are they any better at predicting their own behavior? To answer this question, Robert Vallone and his colleagues asked students in September to predict whether they would continue their studies, what subject they would major in, whether they would live on campus the following year, and so on (Vallone et al., 1990). Although the students were, on average, 84% confident in these self-predictions, almost half of the predictions turned out to be wrong. Moreover, even among the forecasts they were 100% sure of, they erred 15% of the time.
When assessing their chances of success on a task such as a major exam, people are most confident of a successful outcome when the “moment of truth” is still far off. As exam day approaches, the possibility of failure looms larger and confidence typically ebbs (Gilovich et al., 1993).
Roger Buehler and his colleagues report that most students confidently underestimate how much time they will need to complete term papers and other major assignments (Buehler et al., 1994). They are not alone.
— Planners routinely underestimate project costs and completion times. In 1969, Montreal Mayor Jean Drapeau proudly announced that a stadium with a retractable roof would be built in the city for the 1976 Olympic Games at a cost of $120 million. That sum ended up covering only the roof, which was not completed until 1989.
— Investment experts market their services in the confident belief that they can beat the stock market average, forgetting that at any given stock price, for every broker or buyer who says, “I’m selling!” there is always someone who says, “I’m buying!” A stock’s price is the balance point between these mutually confident judgments. Incredible as it may seem, economist Burton Malkiel (1999) reports that securities portfolios assembled by investment analysts have performed no better than portfolios selected at random.
<Wise men are too aware of their weaknesses to consider themselves infallible; and the one who knows the most understands better than others how little he knows. Thomas Jefferson, Writings>
— Editors, too, make astonishing mistakes when evaluating submitted manuscripts. The writer Chuck Ross (1979), using a pseudonym, mailed a typewritten copy of Jerzy Kosinski’s novel “Steps” to 28 major publishers and literary agencies. All of them rejected it, including Random House, which had published the novel in 1968, after which it won the National Book Award and sold more than 400,000 copies. The publisher Houghton Mifflin, which had brought out three of Kosinski’s novels, came closest to accepting the manuscript: “Several of us read your untitled novel and admired its style, which can only be compared with Jerzy Kosinski’s… The drawback of the manuscript is that it does not offer anything new.”
<About the atomic bomb: This is the stupidest thing we’ve ever done. I’m telling you as an explosives expert: it will never explode. Admiral William Leahy to President Truman, 1945>
— When people prone to overconfident decisions hold power, they can plunge the world into chaos. An overconfident Adolf Hitler waged war across Europe from 1939 to 1945. An overconfident Lyndon Johnson sent the American army to salvage democracy in South Vietnam in the 1960s. An overconfident Saddam Hussein attacked Kuwait in 1990, and an overconfident Slobodan Milosevic declared in 1999 that he would never allow peacekeeping troops into Kosovo.
What causes overconfidence? Why doesn’t life experience teach us more realistic self-assessment? There are several reasons. First, people tend to remember their erroneous judgments as occasions on which they were almost right. This phenomenon was described by Philip Tetlock, who in the late 1980s asked a number of scholars and political experts to project, from their standpoint at the time, the future of the Soviet Union, South Africa, and Canada (Tetlock, 1998, 1999). Five years later, communism had collapsed, South Africa had become a multiracial democracy, and Canada remained united. Experts who had been more than 80% sure had correctly predicted this course of events only about 40% of the time. Yet the experts who had erred, reflecting on their judgments, remained convinced that they had been mostly right. “I almost hit the nail on the head,” many of them said. “The hardliners almost succeeded in their coup attempt against Gorbachev.” “The Quebec separatists almost won the referendum on secession.” “If de Klerk and Mandela had not come to terms, the transfer of power to the black majority would have been far bloodier.” Excessive self-confidence is hard to shake, not only among political experts but also among psychotherapists and the compilers of stock and sports forecasts.
{President Lyndon Johnson in Vietnam (1966). Overconfidence like the kind he displayed in committing the army to an unwinnable war underlies many blunders, both great and small}
People have another characteristic: they are not inclined to seek out information that might refute what they believe. Peter Wason demonstrated this (you can repeat his experiment yourself) by presenting people with the three-number sequence 2, 4, 6, which obeyed a simple rule he had in mind: the numbers are in ascending order (Wason, 1960). To identify the rule, each subject proposed sets of three numbers of their own, and each time Wason told them whether or not the proposed set satisfied his rule. Once subjects felt sure they had grasped the rule, they were to stop and state it aloud.
The result? Seldom right, but never in doubt: 23 of the 29 subjects who announced a rule had convinced themselves of a wrong one. As a rule, they preferred not to test their guesses against refutation; instead they formed a mistaken belief about the rule (for example, that it involved only even numbers) and then sought confirmation of that assumption by offering the experimenter triples such as 8, 10, 12. People are eager to confirm their beliefs but in no hurry to seek evidence capable of refuting them. We call this phenomenon confirmation bias.
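Wason’s procedure is easy to simulate. The following sketch (a loose illustration, not Wason’s actual materials) shows why a confirmation-only strategy never exposes a wrong hypothesis: every triple chosen to fit the guess “even numbers increasing by 2” also satisfies the real rule, so the tester hears nothing but “yes.”

```python
# The experimenter's hidden rule in Wason's 2-4-6 task: any three ascending numbers.
def conforms(triple):
    a, b, c = triple
    return a < b < c

# A participant who guesses "even numbers increasing by 2" and tests only triples
# that FIT that guess gets nothing but "yes" answers, because such triples are
# also ascending, so the wrong hypothesis is never challenged.
confirming_tests = [(8, 10, 12), (20, 22, 24), (100, 102, 104)]
print([conforms(t) for t in confirming_tests])   # [True, True, True]

# Only potentially disconfirming tests separate the guess from the hidden rule:
print(conforms((1, 7, 19)))  # True  -> refutes "even, +2", since this triple breaks it
print(conforms((6, 4, 2)))   # False -> shows that order matters, narrowing the rule
```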
Our preference for information that confirms our beliefs helps explain the remarkable stability of our self-images. In experiments conducted at the University of Texas at Austin, William Swann and Stephen Read found that students seek out, elicit, and recall information that supports their self-image (Swann & Read, 1981; Swann et al., 1994, 1999a, 1999b). We choose as friends and spouses people who share our opinion of us, even when we do not rate ourselves very flatteringly (Swann et al., 1991, 1992, 2000). Swann and Read compare this self-verification to the behavior of someone with a domineering self-image at a party. From the first moment, he seeks out those acquaintances who, he knows, acknowledge his dominance. In conversation, he then presents his views in a way that guarantees him the expected deference. After the party, he finds it hard to recall conversations in which his influence was minimal and much easier to recall how persuasive he was in those in which he “played first fiddle.” The impressions he carries away from the party thus confirm his self-image.
<If you know something, to hold that you know it, and if you do not know something, to admit that you do not know it: this is knowledge. Confucius, Analects>
The Cure for Overconfidence
What lessons can we draw from research on overconfidence? One is to be wary of accepting other people’s dogmatic pronouncements. Even people who are absolutely sure they are right are sometimes wrong. Confidence does not always correspond to competence.
Two techniques are known to successfully reduce the bias born of overconfidence. One is prompt feedback (Lichtenstein & Fischhoff, 1980). In everyday life, weather forecasters and those who set the betting odds at the races receive clear feedback every day. As a result, experts in both groups do quite well at estimating how likely their predictions are to be correct (Fischhoff, 1982).
When people think about why an idea might be true, it begins to seem true (Koehler, 1991). This suggests the second way to reduce overconfidence: get people to think of at least one good reason why their judgment might be wrong, that is, force them to consider information that contradicts it (Koriat et al., 1980). Managers could foster more realistic judgments by insisting that every proposal and recommendation include the reasons why it might not work.
Nevertheless, we must take care not to undermine people’s self-confidence to the point where they spend too much time in self-analysis or let doubt paralyze their resolve. In moments when their wisdom is needed, those who lack self-confidence may stay silent or shrink from hard decisions. Overconfidence can cost us dearly, but well-grounded self-confidence is adaptive.
Source: Myers D. “Social Psychology”
Photo: dreamstime.com