WSJ News Exclusive | Meta Officials Cite Security Concerns for Failing to Release Full India Hate-Speech Study
Executives at Meta Platforms Inc. privately told rights groups that security concerns prevented the company from releasing details of its investigation into hate speech on its services in India, according to audio recordings heard by The Wall Street Journal.
Meta, the parent company of Facebook, in July released a four-page summary of a human-rights impact assessment on India, its biggest market by users, where it has faced accusations of failing to adequately police hate speech against religious minorities. The India summary was part of the company’s first global human-rights report. The 83-page global report offers detailed findings of some previous investigations; it included only general descriptions of its India assessment, which disappointed some rights advocates.
“This is not the report that the human-rights team at Meta wanted to publish, we wanted to be able to publish more,” Iain Levine, a Meta senior human-rights adviser, said during private online briefings with rights groups in late July after the summary was released, according to the recordings.
“A decision was made at the highest levels of the company based upon both internal and external advice that it was not possible to do so for security reasons,” he said.
The company said at the time of the report’s release that it wouldn’t publish the full India assessment. It also said United Nations guidelines for companies reporting on human-rights issues caution against releasing details that could imperil stakeholders, a term that generally refers to people such as staff and external researchers involved in the reporting process.
Representatives from the rights groups contended in their meeting with Meta executives that the company wasn’t being transparent in its human-rights efforts, that it appeared not to take the undertaking seriously and that the groups had participated in good faith only to see Meta bury the findings, according to the recordings.
The fact that Meta isn’t releasing the full assessment is “a slap in my face and my people’s face who have endured so much hate speech on this platform,” said a person in the briefing who identified herself as an Indian Muslim researcher, according to the recordings. “We want a release of this report—now,” she said.
Mr. Levine and Miranda Sissons, Meta’s human-rights director, said they understood those complaints and wished they had been able to release more details, according to the recordings.
The executives said during the briefings that the effort represented an important first step in Meta addressing human-rights concerns. They said the summary was written after consulting the guidance on human-rights impact assessments for digital companies from the Danish Institute for Human Rights.
“This is the beginning of a reporting process where I think no activist, no human-rights defender of any kind would ever think that any of the work any company, or probably any entity, that is done is good enough and this team would agree,” Ms. Sissons said in one briefing, the recordings show.
Mr. Levine, who worked for more than three decades for global human-rights groups before joining Meta in 2020, told attendees of the briefings that 120 people at Meta reviewed the report, and that it was approved by president of global affairs Nick Clegg and chief legal officer Jennifer Newstead.
A Meta spokesman declined to comment.
Meta has for years faced criticism from rights groups and has been probed by authorities regarding the presence of hate speech on its platforms in India, where more than 300 million people use Facebook and more than 400 million are on its WhatsApp messaging service.
Meta has said it invests significantly in technology to find hate speech across languages in India.
In 2020, Meta’s safety team concluded that a Hindu nationalist organization in India supported violence against minorities and likely qualified as an organization that should be banned from Facebook, the Journal reported that year. Facebook didn’t remove the group following internal security-team warnings that doing so might endanger both its business prospects and staff in India.
Then in mid-2020, Meta hired an American law firm, Foley Hoag, to undertake what the company called an independent human-rights impact assessment on its operations in India. Rights groups said the company was stifling the effort, the Journal reported in November 2021. Meta said at the time that it was trying to be thorough rather than meet any deadline for its release.
Meta has in the past released executive summaries of assessments on its operations in Indonesia, Sri Lanka and Cambodia, as well as the full version of one it commissioned on Myanmar.
The India summary said that among the human-rights risks in India were restrictions of freedom of expression and information and third-party advocacy of hatred that incites hostility, discrimination or violence. Other risks include violations of rights to privacy and the security of individuals, the report found.
Deborah Brown, a researcher at Human Rights Watch in New York who didn’t participate in the briefings, said the U.N. guidelines stipulate that companies should provide enough information for the public to determine whether companies are actually addressing human-rights concerns.
Meta’s decision not to provide details undermines relationships with civil-society groups and creates the perception that the company doesn’t take human-rights responsibilities seriously, she said.
“This is our first human-rights report,” Mr. Levine said in one briefing, according to the recordings. “It’s been a steep learning curve for the human-rights team and for the company more broadly.”
Write to Newley Purnell at [email protected]
Copyright ©2022 Dow Jones & Company, Inc. All Rights Reserved.