Meta’s Election Research Opens More Questions Than It Answers
“We don’t know what would have happened had we been able to do these studies over a period of a year or two years,” Guess said at a press briefing earlier this week. More importantly, he said, there is no accounting for the fact that many users have had Facebook and Instagram accounts for upwards of a decade now. “This finding cannot tell us what the world would have been like if we hadn’t had social media around for the last 10 to 15 years.”
There’s also the issue of the specific timeframe that the researchers were able to study—the run-up to an election in an atmosphere of intense political polarization.
“I think there are unanswered questions about whether these effects would hold outside of the election environment, whether they would hold in an election where Donald Trump wasn’t one of the candidates,” says Michael Wagner, professor of journalism and communication at the University of Wisconsin-Madison, who helped oversee Meta’s 2020 election project.
Meta’s Clegg also said that the research challenges “the now commonplace assertion that the ability to reshare content on social media drives polarization.”
Researchers weren’t quite so unequivocal. One of the studies published in Science found that resharing elevates “content from untrustworthy sources.” The same study showed that a substantial amount of news on Meta’s services is consumed exclusively by conservative users, and that most of the misinformation caught by the platform’s third-party fact checkers is concentrated among, and exclusive to, this group, which has no equivalent on the opposite side of the political aisle, according to an analysis of about 208 million users.
Another study found that while participants whose feeds excluded reshared content did end up consuming less partisan news, they also ended up less well informed in general. “We often see that polarization and knowledge kind of move together,” says Guess. “So you can make people more knowledgeable about politics, but then you’ll see an increase in polarization among the same set of people.”
“I don’t think the findings suggest that Facebook isn’t contributing to polarization,” says Wagner. “I think that the findings demonstrate that in 2020, Facebook wasn’t the only or dominant cause of polarization, but people were polarized long before they logged on to Facebook in 2020.”
The studies released today represent just the first tranche of research. Thirteen more are expected over the coming months, focusing on topics including the impact of political advertisements and attitudes toward political violence around the January 6 insurrection at the Capitol.
Meta spokesperson Corey Chambliss told WIRED that the company does not have plans to allow similar research in 2024. When asked whether Meta would be funding further research, Chambliss pointed to the company’s newly announced research tools, particularly the Meta Content Library and API. “The Library includes data from public posts, pages, groups, and events on Facebook,” he says. “For Instagram, it will include public posts and data from creator and business accounts. Data from the Library can be searched, explored, and filtered on a graphical user interface or through a programmatic API.”
Notably, the newly published studies did not investigate ways to specifically depolarize users. As a result, researchers say that while there are reasons to be concerned about social media’s impact on politics, it is not yet clear what the policy solutions might be.
“It would have been nice for the public, for lawmakers, for regulators and for social science to have a better idea of what kind of interventions might make things better,” says Wagner.