Judge Approves Settlement in AI Discrimination Lawsuit Over Rental Scoring Algorithm
A federal judge has approved a settlement in a class action lawsuit claiming that an algorithm used by SafeRent Solutions to evaluate rental applicants discriminated based on race and income.
The lawsuit, led by plaintiff Mary Louis, argued that the third-party service used by landlords unfairly rejected her apartment application, a case that highlights the growing concern over the role of algorithms in rental screening.
In the spring of 2021, Louis, who had a housing voucher, was shocked when her application for a Massachusetts apartment was denied by an algorithm-based service. The lawsuit accused the company of failing to account for housing vouchers, which are critical for low-income applicants, and placing too much weight on credit scores—factors that disproportionately affected Black and Hispanic renters.
The settlement, approved on Wednesday, requires SafeRent Solutions to pay over $2.2 million and implement changes to its screening processes. The company did not admit any wrongdoing, maintaining that its scores comply with applicable laws. Nonetheless, SafeRent has agreed to revise its practices to ensure fairer evaluation of applicants, including a provision that prohibits the inclusion of its scoring feature in reports when an applicant uses a housing voucher.
Louis’ attorneys argue that these AI systems, although not intentionally discriminatory, can lead to outcomes that perpetuate systemic inequalities, particularly for marginalized groups. “Management companies and landlords need to know that they’re now on notice, that these systems that they are assuming are reliable and good are going to be challenged,” said Todd Kaplan, one of Louis’ attorneys.
The case is a landmark step in holding companies accountable for algorithmic bias in housing, with broader implications for other industries where AI impacts individuals' lives.
As AI technology continues to influence critical decisions, from job applications to medical treatments, legal experts stress the need for stronger regulation. In Louis' case, the court's approval of the settlement reinforced that companies whose algorithms affect access to housing can be held accountable for discriminatory outcomes. The lawsuit alleged that SafeRent's screening service, which assigns scores to applicants, violated fair housing standards, particularly with regard to housing voucher holders.
The management company, which used SafeRent's screening service, replied: “We do not accept appeals and cannot override the outcome of the Tenant Screening.”
Louis expressed a sense of defeat, remarking that the algorithm did not understand her. “Everything is based on numbers. You don’t get the individual empathy from them,” said Louis. “There is no beating the system. The system is always going to beat us.”
Louis' son eventually found her an apartment on Facebook Marketplace, where she now lives, though it costs $200 more a month and sits in a less desirable neighborhood.
“I’m not optimistic that I’m going to catch a break, but I have to keep on keeping, that’s it,” said Louis. “I have too many people who rely on me.”