Meta, the parent company of Facebook, has agreed to correct its discriminatory housing advertising practices, which violate US fair housing law, in order to avoid prosecution, judicial sources announced Tuesday.
According to the lawsuit, brought on behalf of the US Department of Housing and Urban Development, Meta uses algorithms on Facebook to choose the recipients of housing ads in a way that discriminates on the basis of race, color, religion, sex, disability, familial status and national origin.
The agreement the parties reached requires, as of December 31, "for the first time, Meta to change its ad delivery system to address algorithmic discrimination," said Damian Williams, US Attorney for the Southern District of New York.
Mark Zuckerberg's company promised to stop using an advertising tool known as "Special Ad Audience" for housing ads, which, according to the complaint, relies on a discriminatory algorithm that targets users who "look alike" based on characteristics protected by the Fair Housing Act (FHA).
Meta must develop a new system to address, among other things, racial disparities in its personalized housing advertising. This will be the first time a court has overseen Meta's personalized ad delivery system.
If Meta fails to comply with the agreement, the case will return to court; for now, the company must pay a penalty of $115,054, the maximum contemplated by the Fair Housing Act.
The government's lawsuit seeks to change the algorithm Facebook uses to find users who share similarities with groups of individuals selected by an advertiser, through options the social network provides that determine which users are eligible or ineligible to receive the advertising.