
Meta rolls out new system to address housing ad discrimination

The Variance Reduction System is part of a settlement with the DOJ for the lawsuit claiming Meta violated the Fair Housing Act

Meta Platforms, formerly known as Facebook Inc., announced on Monday the rollout of a new system to address algorithmic discrimination in housing advertisements after more than a year of negotiations with the Department of Justice (DOJ). 

The Variance Reduction System (VRS), required as part of the settlement made public on June 27, resolves a lawsuit filed in the U.S. District Court for the Southern District of New York. In the lawsuit, the DOJ claimed that Meta’s housing advertising system used a “discriminatory algorithm” that favored certain demographics over others, in violation of the Fair Housing Act.

The parties announced Monday that they had agreed on the new Variance Reduction System’s compliance targets. However, Meta will be subject to court oversight and regular review of its compliance with the settlement through June 27, 2026.
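Public descriptions of the settlement frame the VRS around measuring how far an ad's actual audience drifts from the audience eligible to see it, and keeping that gap within agreed targets. The sketch below illustrates that general idea only; the function names, data shapes, and the 10% threshold are illustrative assumptions, not Meta's actual system or the settlement's real compliance targets.

```python
# Illustrative sketch of a variance-reduction check: compare the demographic
# mix of users who actually saw an ad against the mix of the eligible
# audience, and flag deliveries that drift past a compliance threshold.
# All names, data, and thresholds are hypothetical.

def demographic_shares(counts):
    """Convert raw counts per demographic group into fractional shares."""
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def max_variance(eligible, delivered):
    """Largest absolute gap in share between eligible and delivered audiences."""
    e = demographic_shares(eligible)
    d = demographic_shares(delivered)
    return max(abs(e[g] - d.get(g, 0.0)) for g in e)

def within_target(eligible, delivered, threshold=0.10):
    """True if delivery stays within the (assumed) variance target."""
    return max_variance(eligible, delivered) <= threshold

# Hypothetical example: the eligible audience is split 50/50 between two
# groups, but delivery skewed slightly toward group_a (52.5% vs 47.5%).
eligible = {"group_a": 500, "group_b": 500}
delivered = {"group_a": 420, "group_b": 380}
print(within_target(eligible, delivered))  # gap of 0.025 is under 0.10 → True
```

In a real deployment the interesting engineering questions are where the threshold sits, which demographic groupings are measured, and how delivery is adjusted mid-flight when the gap grows; none of those details are specified in the public reporting summarized here.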

The company will extend the system’s use to U.S. employment and credit ads over the coming year, according to Roy L. Austin Jr., vice president of civil rights and deputy general counsel at Meta.

“Across the industry, approaches to algorithmic fairness are still evolving, particularly as it relates to digital advertising,” Austin said in a statement. “But we know we cannot wait for consensus to make progress in addressing important concerns about the potential for discrimination.” 

The executive also noted that Meta has discontinued its Special Ad Audiences tool, an additional commitment under the settlement.

The DOJ’s lawsuit alleged that Meta “enabled and encouraged” advertisers to target housing ads to Facebook users by relying on race, color, religion, sex, disability, familial status and national origin to decide who was eligible or ineligible to receive those ads.

The lawsuit also said that Meta’s ad delivery system used machine-learning algorithms that help determine which subset of an audience receives a housing ad, relying on race, national origin and gender. The Fair Housing Act protects these characteristics, the department said.

Regarding the new system, Kristen Clarke, the assistant attorney general of the Justice Department’s Civil Rights Division, said, “This development marks a pivotal step in the Justice Department’s efforts to hold Meta accountable for unlawful algorithmic bias and discriminatory ad delivery on its platforms.”  

“Federal monitoring of Meta should send a strong signal to other tech companies that they too will be held accountable for failing to address algorithmic discrimination that runs afoul of our civil rights laws,” the assistant attorney general added. 
