Government Lending

Meta to settle with DOJ over Fair Housing Act claims

Meta must change the way it decides who sees housing ads before Dec. 31

Meta Platforms, formerly known as Facebook Inc., agreed to settle allegations that its in-house advertising system discriminates in choosing which users receive housing ads, the Department of Justice announced on Tuesday.

The department claims that Meta’s housing advertising system uses a “discriminatory algorithm” that is preferential toward certain demographics over others. The DOJ said that this violates the Fair Housing Act.

Per the settlement, the social media giant has until Dec. 31, 2022 to overhaul its “special ad audience” tool and develop a new tool that is more inclusive. The new system will be subject to approval from the DOJ.

Additionally, Meta will be required to pay a civil penalty of $115,054 for violating the Fair Housing Act.

A Meta spokesperson said in a statement that, in light of the settlement, the company will change how its ads are delivered to users.

“We will be building a novel machine learning method within our ads system that will change the way housing ads are delivered to people residing in the US across different demographic groups,” the Meta spokesperson said.

The government’s lawsuit, filed earlier this week, alleges that the company “enabled and encouraged” advertisers to target housing ads by relying on the race, color, religion, sex, disability, familial status and national origin of a Facebook user as metrics to decide who is eligible and ineligible to receive housing ads.

The DOJ also said that Meta’s ad delivery system uses machine-learning algorithms that help determine which subset of an audience receives a housing ad, relying on race, national origin and gender. These characteristics are protected by the Fair Housing Act, the department said.

Kristen Clarke, assistant attorney general at the DOJ, said in a statement that going forward, companies need to be mindful of how they implement algorithmic tools.

“This settlement is historic, marking the first time that Meta has agreed to terminate one of its algorithmic targeting tools and modify its delivery algorithms for housing ads in response to a civil rights lawsuit,” Clarke said. “The Justice Department is committed to holding Meta and other technology companies accountable when they abuse algorithms in ways that unlawfully harm marginalized communities.”

The Department of Housing and Urban Development referred the matter to the Justice Department for litigation, the DOJ noted. In 2019, HUD alleged that Meta’s ad delivery system violated the Fair Housing Act.

The current agreement between Meta and the DOJ is contingent on approval from the U.S. District Court for the Southern District of New York, where the government filed its lawsuit against the media giant earlier this week.
