False information on Facebook generates six times more interaction than content from official or more trustworthy sources. That is the finding of a study conducted by New York University and Université Grenoble Alpes.
The researchers reviewed posts published by more than 2,500 Facebook pages between August 2020 and January 2021, according to The Washington Post.
The engagement generated by false information was far higher, up to six times greater, than that of posts from more reputable sources.
Engagement refers to the degree of interaction an account or piece of content receives on social networks. It is measured by the number of “likes,” shares, and comments a post attracts.
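The study's exact scoring method is not described in the article; as a minimal sketch, a simple engagement metric just sums those interaction counts, as in this hypothetical example:

```python
def engagement(likes: int, shares: int, comments: int) -> int:
    """Hypothetical engagement metric: total interactions a post received."""
    return likes + shares + comments

# Illustrative numbers only, not data from the study:
reputable_post = engagement(likes=120, shares=30, comments=50)
misleading_post = engagement(likes=700, shares=300, comments=200)

print(misleading_post / reputable_post)  # 6.0, i.e. six times the engagement
```

Real platforms weight these signals differently, but a raw interaction count is enough to illustrate the kind of comparison the researchers made.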
According to the researchers, disinformation at both ends of the political spectrum draws more engagement. However, far-right (conservative) publishers post more false information than far-left ones.
A Facebook spokesperson responded to Business Insider about the study's findings.
“It represents a very small amount of all the content on the platform,” he said. “Engagement also should not be confused with the number of people who actually see the content on Facebook.”
“When you look at the content that gets the most reach on Facebook,” notes the spokesperson, “it is nothing like what this study suggests, as shown in our content report.”
Facebook and the major conflicts of 2020-21
First during the social protests, then during the presidential election between Donald Trump and Joe Biden (the platform even suspended Trump’s page), and throughout the coronavirus pandemic, the role of social networks has come under criticism.
Many see Facebook as a hotbed of false information, and criticism of Mark Zuckerberg’s platform has even come from President Biden.
The U.S. president said that information on the network “kills people,” referring to what he considers insufficient measures against coronavirus misinformation.
However, Facebook responded with statistics on pages blocked for spreading misinformation about the disease and vaccines, as well as its contributions to health institutions.
The network announced in mid-August that it had deleted more than 3,000 accounts and 20 million pieces of content for disinformation since the start of the pandemic in 2020.
“COVID-19 continues to be a major public health problem,” said Guy Rosen, Vice President of Integrity at Facebook. “We are committed to helping people obtain authoritative information, including information about vaccines.”
“We continue to remove harmful misinformation about the coronavirus and ban advertisements that attempt to exploit the pandemic for financial gain,” he stressed.