We probably all know someone who is extremely distrustful of fellow human beings. In fact, in some ways this has also become a feature of some countries’ political landscapes — deep distrust of the other.
So how do you get them to be more trustful? Well, the interesting thing researchers have uncovered is that those who distrust humans the most are the ones most likely to trust AI!
This was part of a study by researchers at Penn State University — they recruited 676 participants and told them they were evaluating a new moderation tool for online content, one that helped to identify hate speech and suicidal ideation.
Participants were then shown posts that either had or had not been flagged as falling into one of those categories, and were told the flagging had been done by a human, by AI, or by both. They then completed a survey measuring individual differences, including distrust of others, political ideology, experience with technology, and trust in AI.
Surprisingly, or not surprisingly, those who most distrusted their fellow human beings trusted AI the most. This also included those with a stronger conservative ideology. The converse applied too: the more trust people had in human beings, the less they trusted AI.
There was also a group of “power” users, those with the most experience of technology, who trusted AI less. They thought AI wouldn’t be able to identify the nuances of human language, perhaps because they are more aware of its limitations than others.
So, who would have thought: trust in AI and trust in humans are negatively correlated, and political ideology predicts it too!