A groundbreaking class action lawsuit over alleged racial and economic discrimination by an AI-based tenant-screening algorithm has ended in a $2.2 million settlement. The case, led by plaintiff Mary Louis, centered on SafeRent Solutions’ scoring system, which allegedly penalized housing voucher users and disproportionately affected low-income Black and Hispanic renters.
The settlement requires SafeRent to stop applying its scoring feature to applicants who use housing vouchers and mandates independent third-party validation of any future algorithmic screening tools. While SafeRent did not admit fault, the case underscores growing concern about bias in AI-driven decision-making in housing, employment, and other critical areas.
Louis recounted her struggle to find housing after SafeRent’s algorithm denied her application despite her 16-year history of timely rent payments. Her attempt to appeal the decision was rejected, highlighting the lack of recourse for renters affected by such technology.
With minimal regulation governing these AI systems, the lawsuit sets a precedent for legal challenges to algorithmic discrimination, even as calls for stricter oversight remain stalled in many states. The resolution of this case may signal a shift toward greater accountability for technology companies deploying AI tools with far-reaching consequences.