* This article, by Hugo Mercier, a researcher at the CNRS in Paris, was originally published in Aeon, an online magazine that asks big questions in search of new answers and fresh perspectives on society, science, philosophy and culture. NEWS 24/7 reposts one such story every week for readers who love original thinking on issues old and new.
We all know people who have suffered from trusting too much: cheated clients, jilted lovers, shunned friends. Indeed, most of us have been burned by misplaced trust at some point. These personal and vicarious experiences lead us to believe that people are too trusting, often verging on naivety.
Actually, we don’t trust enough.
Take the data on trust in the United States (the picture would be similar in most wealthy democracies). Interpersonal trust, a measure of whether people believe that others are generally trustworthy, is at its lowest level in nearly 50 years. Yet people are unlikely to be any less trustworthy than before: the huge drop in crime over recent decades suggests otherwise. Trust in the media is also at rock bottom, even though mainstream media have an impressive (if not flawless) record of accuracy.
Meanwhile, trust in science has held up relatively well, with most people trusting scientists most of the time. However, in at least some areas, from climate change to vaccination, part of the population does not trust science enough, with disastrous consequences.
Social scientists have a variety of tools for studying how trusting and how trustworthy people are. The most popular is the trust game, played by two participants, usually anonymously. The first participant receives a small amount of money, say $10, and must decide how much of it to transfer to the second participant. The transferred amount is then tripled, and the second participant chooses how much to send back to the first. In Western countries at least, trust is rewarded: the more money the first participant transfers, the more money the second participant sends back, and thus the more money the first participant ends up with. Even so, first participants transfer on average only half of the money they received. In some studies, a variant was introduced in which participants knew each other's ethnicity. Prejudice led participants to be wary of certain groups – Israelis of Eastern origin (Asian and African immigrants and their descendants born in Israel) or black students in South Africa – transferring less money to them, even though these groups proved to be just as trustworthy as more esteemed groups.
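The payoffs in the trust game are simple arithmetic, and a small sketch makes the incentives concrete (the function name and the 50% reciprocation figure below are illustrative assumptions, not values from any particular study):

```python
# Minimal sketch of the trust game described above. The 0.5 return
# fraction is an illustrative assumption, not a figure from the studies.

def trust_game(sent: float, return_fraction: float, endowment: float = 10.0):
    """One round of the trust game.

    The first participant sends `sent` out of `endowment`; the transfer
    is tripled; the second participant returns `return_fraction` of the
    tripled amount. Returns (first participant's payoff, second's payoff).
    """
    tripled = 3 * sent
    returned = return_fraction * tripled
    first_payoff = endowment - sent + returned
    second_payoff = tripled - returned
    return first_payoff, second_payoff

print(trust_game(sent=0, return_fraction=0.5))   # (10.0, 0.0)
print(trust_game(sent=5, return_fraction=0.5))   # (12.5, 7.5)
print(trust_game(sent=10, return_fraction=0.5))  # (15.0, 15.0)
```

The numbers show why trust pays here: sending nothing guarantees the first participant keeps the $10, while full trust met with an even split of the tripled pot leaves both participants better off.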
If people and institutions are more trustworthy than we think, why do we get it so wrong? Why don't we trust more?
In 2017, the social psychologist Toshio Yamagishi was kind enough to invite me to his apartment in Machida, a city in the Tokyo metropolitan area. The cancer that would take his life a few months later had weakened him, but he retained his youthful enthusiasm for research and his sharp mind. On that occasion, we discussed one of his ideas with profound implications for the question of trust: the informational asymmetry between trusting and not trusting.
When you trust someone, you eventually find out whether your trust was justified. An acquaintance asks if he can stay at your place for a few days. If you accept, you will learn whether he is a good guest. A colleague advises you to try a new software application. If you follow his advice, you will find out whether it works better than what you are used to.
Conversely, when you don’t trust someone, most of the time you never know if you should have trusted them. If you don’t invite your acquaintance, you won’t know if he would be a good guest or not. If you don’t follow your colleague’s advice, you won’t know if the new software is actually better, and therefore if your colleague is giving good advice in this area.
This informational asymmetry means that we learn more by trusting than by not trusting. Moreover, when we trust, we learn not only about specific individuals but, more generally, about the kinds of situations in which we should or shouldn't trust. By trusting, we get better at trusting.
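The asymmetry can be illustrated with a toy simulation (entirely hypothetical, not Yamagishi's actual experiment): an agent only observes whether a partner behaves reliably when it chooses to trust, so a more trusting agent accumulates more evidence and ends up with more accurate beliefs about its partners:

```python
# Toy illustration of the informational asymmetry: feedback only arrives
# when you trust. Partner names and reliability values are made up.
import random

random.seed(42)

# True (hidden) probability that each partner honors trust.
true_reliability = {"A": 0.9, "B": 0.7, "C": 0.5, "D": 0.3, "E": 0.1}

def learned_error(trust_probability: float, rounds: int = 2000) -> float:
    """Mean absolute error of an agent's reliability estimates.

    Each round the agent meets a random partner. Only if it trusts does
    it observe an outcome; a round without trust teaches it nothing.
    Unobserved partners are estimated at 0.5 (a pure guess).
    """
    successes = {p: 0 for p in true_reliability}
    trials = {p: 0 for p in true_reliability}
    for _ in range(rounds):
        partner = random.choice(list(true_reliability))
        if random.random() < trust_probability:
            trials[partner] += 1
            if random.random() < true_reliability[partner]:
                successes[partner] += 1
    estimates = {
        p: successes[p] / trials[p] if trials[p] else 0.5
        for p in true_reliability
    }
    return sum(abs(estimates[p] - true_reliability[p])
               for p in true_reliability) / len(true_reliability)

print(learned_error(trust_probability=0.9))   # trusting agent's error
print(learned_error(trust_probability=0.05))  # wary agent's error
```

Running this, the trusting agent's estimation error is typically several times smaller than the wary agent's, simply because it collects far more feedback per round of interaction.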
Yamagishi and his colleagues demonstrated the learning benefits of trust. Their experiments resembled trust games, except that participants could interact with one another before deciding whether to transfer money. The most trusting participants were better at figuring out who could be trusted – that is, to whom they should transfer money.
The same pattern is found in other areas. People who trust the media more are better informed about politics and current affairs. The more people trust science, the more scientifically literate they are. Even if this evidence remains correlational, it stands to reason that people who trust more should get better at working out whom to trust. In trust, as in everything else, practice makes perfect.
Yamagishi’s insight gives us a reason to trust more. But then the puzzle only deepens: if trusting offers such opportunities to learn, we should be trusting too much rather than too little. Ironically, the very reason we should trust more – the fact that we gain more information by trusting than by not trusting – may incline us to trust less.
When our trust is betrayed – when we trust someone we shouldn’t have – the costs are salient, and our reaction ranges from annoyance to anger and despair. The upside – what we learned from our mistake – is easy to overlook. By contrast, the costs of not trusting someone we could have trusted are, as a rule, invisible. We never learn of the friendship we could have formed (had we let that acquaintance stay over). We never realize how useful certain advice would have been (had we followed our colleague’s tip about the new software).
We don’t trust enough because the costs of misplaced trust are all too obvious, while the (learning) benefits of trust, as well as the costs of unjustified mistrust, are largely hidden. We should keep these hidden costs and benefits in mind: think of what we learn by trusting, the people we could befriend, the knowledge we could acquire.
Giving people a chance is not just the moral thing to do. It’s also the smart thing to do.