Understanding the Settlement
SafeRent, a tenant-screening tool used by landlords, has agreed to stop generating AI-based scores for applicants who use housing vouchers. The decision follows a class action lawsuit filed in Massachusetts alleging that the scoring system discriminated against Black and Hispanic applicants. A US district judge approved a $2.3 million settlement to resolve the allegations and promote fair housing practices.
Key Details of the Settlement
- SafeRent will no longer provide a tenant screening score for those using housing vouchers.
- The company cannot use its “affordable” SafeRent Score model to score these applicants.
- Landlords must now assess voucher users based on their complete rental history instead of relying on a score.
- The settlement funds will support Massachusetts rental applicants who faced discrimination due to the scoring system.
Significance of the Change
This settlement is significant for promoting equal access to housing. By eliminating AI-generated scores for voucher holders, the agreement is intended to ensure that these applicants are evaluated fairly. The change reflects growing scrutiny of transparency and fairness in tenant screening, and it underscores the importance of assessing applicants on their overall qualifications rather than on potentially biased algorithms. The outcome of this case could influence tenant-screening practices across the United States.