Wednesday, June 21, 2023

Some academic research on why people accept misinformation - or why some people just don't care if information is true or not

From Nature: Accuracy and social motivations shape judgements of (mis)information.

The extent to which belief in (mis)information reflects lack of knowledge versus a lack of motivation to be accurate is unclear. Here, across four experiments (n = 3,364), we motivated US participants to be accurate by providing financial incentives for correct responses about the veracity of true and false political news headlines. Financial incentives improved accuracy and reduced partisan bias in judgements of headlines by about 30%, primarily by increasing the perceived accuracy of true news from the opposing party (d = 0.47). Incentivizing people to identify news that would be liked by their political allies, however, decreased accuracy. Replicating prior work, conservatives were less accurate at discerning true from false headlines than liberals, yet incentives closed the gap in accuracy between conservatives and liberals by 52%. A non-financial accuracy motivation intervention was also effective, suggesting that motivation-based interventions are scalable. Altogether, these results suggest that a substantial portion of people’s judgements of the accuracy of news reflects motivational factors.


Partisans are more likely to entrench their beliefs in misinformation when political outgroup members fact-check claims.

The spread of misinformation has become a global issue with potentially dire consequences. There has been debate over whether misinformation corrections (or "fact-checks") sometimes "backfire," causing people to become more entrenched in misinformation. While recent studies suggest that an overall "backfire effect" is uncommon, we found that fact-checks were more likely to backfire when they came from a political outgroup member across three experiments (N = 1,217). We found that corrections reduced belief in misinformation; however, the effect of partisan congruence on belief was 5x more powerful than the effect of corrections. Moreover, corrections from political outgroup members were 52% more likely to backfire, leaving people with more entrenched beliefs in misinformation. In sum, corrections are effective on average, but have small effects compared to partisan identity congruence, and sometimes backfire, especially if they come from a political outgroup member. This suggests that partisan identity may drive irrational belief updating.

For more: Confirmation Bias

The tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. People display this bias when they select information that supports their views while ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs. Confirmation bias cannot be eliminated, but it can be managed, for example, by education and training in critical thinking skills.