Facebook starts effort to boost equity in housing ads
Meta Platforms Inc. said Monday that it has begun implementing technology designed to improve the equity of housing advertising displayed to Facebook users as part of a June settlement agreement with federal officials.
The adoption of new online advertising practices was a crucial component of a settlement among the Justice Department, federal housing officials and Meta regarding housing discrimination charges against the tech company.
Meta’s new Variance Reduction System, as it is formally called, relies on machine-learning technology that is designed to show housing ads to audiences that more closely reflect the eligible target audience for that ad.
“We will continue to make this work a priority as we collaborate with stakeholders to support important industry-wide discussions around how to make progress toward more fair and equitable digital advertising,” said Roy Austin, Meta’s vice president of civil rights and deputy general counsel, in a written statement.
Meta said it plans to expand use of the new machine-learning system to employment and credit ads. It is illegal to deny someone housing or employment based on federally protected characteristics such as race, religion and sex.
Meta’s new ad-distribution system was developed after more than a year of collaboration with the Justice Department and federal housing officials, the company said. The system works by showing an ad to a large group of people and measuring the aggregate age, gender and estimated race and ethnicity of those who see it. That measurement is then compared with the demographic makeup of the audience that was eligible to see the ad, and if the two diverge, the system corrects for the difference as the ad is shown to more people.
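Neither Meta nor the Justice Department has published the system’s code, but the description above amounts to a measure-and-rebalance feedback loop. The Python sketch below illustrates that general idea only; the group labels, the numbers and the reweighting rule are hypothetical and are not drawn from Meta’s actual Variance Reduction System.

```python
# Hypothetical sketch of the feedback loop described above -- not Meta's code.
# It compares the demographic mix of people who have seen an ad so far against
# the mix of the eligible audience, then nudges delivery toward under-served
# groups. Group labels, data, and the adjustment rule are all illustrative.

from collections import Counter

def distribution(counts: Counter) -> dict:
    """Convert raw counts per group into proportions."""
    total = sum(counts.values()) or 1
    return {group: n / total for group, n in counts.items()}

def adjustment_weights(shown: Counter, eligible: Counter) -> dict:
    """Weight > 1 means a group is under-represented among viewers so far."""
    shown_dist = distribution(shown)
    eligible_dist = distribution(eligible)
    return {
        group: eligible_dist[group] / max(shown_dist.get(group, 0.0), 1e-9)
        for group in eligible_dist
    }

# Example: the eligible audience splits 50/30/20 across three made-up
# aggregate groups, but early delivery skewed toward group A.
eligible = Counter({"A": 500, "B": 300, "C": 200})
shown_so_far = Counter({"A": 70, "B": 20, "C": 10})

weights = adjustment_weights(shown_so_far, eligible)
for group, w in sorted(weights.items()):
    print(f"group {group}: adjust remaining delivery by x{w:.2f}")
# Groups B and C get weights above 1, so delivery of the remaining
# impressions would shift toward them until the two distributions converge.
```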
In June, the Justice Department said the settlement marked the first time that Meta would be subject to court oversight for its ad-targeting and -delivery system. The complaint from the federal government said Meta enabled and encouraged advertisers to target housing ads by relying on race, color, religion, sex, disability, familial status and national origin.
The Justice Department called the agreement requiring Meta’s new technology a groundbreaking resolution that “sets a new standard for addressing discrimination through machine learning.”
“This development marks a pivotal step in the Justice Department’s efforts to hold Meta accountable for unlawful algorithmic bias and discriminatory ad delivery on its platforms,” Kristen Clarke, assistant attorney general of the Justice Department’s Civil Rights Division, said.
In addition to agreeing to build the new ad-delivery system, Meta in June also agreed to pay a civil penalty of $115,054, the maximum available under the Fair Housing Act, federal officials said. The company also agreed to stop using a tool for housing ads called “Special Ad Audience” that used a machine-learning algorithm to target Facebook users who shared similarities with groups of individuals selected by advertisers.