How to Keep AI Bias Out of the Housing Sector

A great deal of concern has been raised about bias creeping into AI because models are trained on already biased information. After all, if AI learns from previously created content, it inherently has some bias baked in: that content was written and created by people, and people are biased. This became evident early on, as AI caught mainstream attention, most visibly in the racial bias found in AI-generated imagery.

Recently, a significant case involving algorithmic bias in advertising was settled in New York. The U.S. Department of Justice (DOJ) filed a lawsuit against Meta Platforms, Inc., formerly known as Facebook, alleging that the company's advertising algorithms for housing ads were discriminatory. The lawsuit claimed that Meta's algorithms allowed advertisers to target users based on characteristics protected under the Fair Housing Act (FHA), such as race, religion, sex, and national origin. This practice led to some users being unfairly excluded from seeing certain housing advertisements (https://www.justice.gov/usao-sdny/pr/united-states-attorney-resolves-groundbreaking-suit-against-meta-platforms-inc-formerly).

Under the terms of the settlement, Meta agreed to cease using its "Special Ad Audience" tool, which was found to discriminate based on FHA-protected characteristics. Furthermore, Meta must develop a new system to address these disparities and ensure fairer ad delivery. The settlement also includes the appointment of an independent third-party reviewer to monitor Meta's compliance with the new system.

This case highlights the complexities of using AI in advertising and the potential for bias within these systems. It also sets a precedent for holding technology companies accountable for algorithmic discrimination, emphasizing the need for transparent and fair AI practices in all sectors.

Bias in the housing sector runs deep; landlords and housing providers may discriminate against individuals based on race, either consciously or unconsciously. This can result in minorities facing higher rent prices, being denied housing, or being steered toward less desirable neighborhoods. Women, particularly single mothers, might be discriminated against in rental applications or housing opportunities. Landlords often prefer tenants with higher incomes, which can disadvantage low-income individuals, even if they have reliable payment histories. Families with children may be discriminated against due to assumptions about noise or property wear and tear. Individuals with disabilities might face difficulties finding accessible housing or might be discriminated against based on assumptions about their ability to maintain the property.


How AI Could Alleviate Bias (With Good Algorithms, of Course)

Automated Application Processing: AI can be used to process rental applications, ensuring that all applicants are evaluated based on the same criteria without human bias. For example, algorithms can focus on objective factors like credit scores, rental history, and income verification.
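As a rough illustration of this idea (not any particular vendor's system), an application-scoring function can be written so that it only accepts objective fields, meaning protected attributes such as race, sex, or family status never enter the evaluation at all. The field names and thresholds below are hypothetical examples, not recommended underwriting criteria.

```python
from dataclasses import dataclass

@dataclass
class Application:
    # Only objective, verifiable fields; no protected attributes exist here
    credit_score: int            # e.g. 300-850
    monthly_income: float        # verified monthly income in dollars
    monthly_rent: float          # rent for the unit applied for
    on_time_payment_rate: float  # fraction of past rent payments made on time

def score_application(app: Application) -> bool:
    """Apply the same objective criteria to every applicant."""
    income_ok = app.monthly_income >= 3 * app.monthly_rent
    credit_ok = app.credit_score >= 620
    history_ok = app.on_time_payment_rate >= 0.90
    return income_ok and credit_ok and history_ok

print(score_application(Application(700, 4500.0, 1400.0, 0.97)))  # True
```

Because the `Application` type simply has no field for a protected characteristic, every applicant is scored on the same inputs; of course, proxies for protected attributes (such as ZIP code) would still need separate auditing.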

Fair Pricing Models: AI can analyze market data to set fair rental prices, helping to eliminate discriminatory pricing practices based on race, gender, or other biases.
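A minimal sketch of this approach: anchor the asking rent to the market median of comparable units, so the price is a function of the unit and its market, never of who is applying. The function below is a simplified illustration, not a full pricing model.

```python
from statistics import median

def suggest_rent(comparable_rents: list[float]) -> float:
    """Suggest a rent anchored to the median of comparable listings,
    so pricing depends on the market, not on the applicant."""
    if not comparable_rents:
        raise ValueError("need at least one comparable listing")
    return round(median(comparable_rents), 2)

print(suggest_rent([1350.0, 1425.0, 1390.0, 1500.0, 1410.0]))  # 1410.0
```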

Predictive Analytics: AI can predict tenant behavior and reliability based on data-driven insights rather than subjective judgments, reducing biases related to income, family status, or other personal characteristics.

Monitoring and Auditing: AI can continuously monitor rental practices and flag potential discriminatory patterns, allowing for proactive measures to address bias.
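One widely used audit statistic for this kind of monitoring is the adverse impact ratio: the approval rate of one group divided by the approval rate of another. Under the common "four-fifths rule" of thumb, a ratio below 0.8 is often treated as a flag for possible disparate impact. The sketch below assumes simple approval counts per group; a real audit would use proper statistical testing.

```python
def adverse_impact_ratio(approved_a: int, total_a: int,
                         approved_b: int, total_b: int) -> float:
    """Ratio of group A's approval rate to group B's.
    Values below 0.8 (the 'four-fifths rule') are commonly
    flagged for review as possible disparate impact."""
    rate_a = approved_a / total_a
    rate_b = approved_b / total_b
    return rate_a / rate_b

ratio = adverse_impact_ratio(30, 50, 45, 50)  # 0.60 / 0.90
print(f"{ratio:.3f}", "flag" if ratio < 0.8 else "ok")
```

Running such a check continuously over rental decisions is one concrete way an AI system can surface discriminatory patterns early rather than after the fact.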

Rental Property Housing Quality Standards (HQS) Inspections: Inspective has developed an AI-based application to conduct HQS inspections, verifying that rental properties meet state, local, and federal guidelines for basic living conditions and satisfy minimum safety and habitability standards. This helps ensure that vulnerable groups, including people who receive social assistance and people with disabilities, are protected, and that:

  • Low-Income Families have access to safe and affordable housing by minimizing the risk of substandard living conditions.
  • Elderly Residents can live in environments that accommodate their specific needs, promoting health and safety.
  • Veterans receive the quality housing they deserve, recognizing their service and sacrifices.
  • Minority Communities are protected from discriminatory practices that might otherwise lead to unequal housing opportunities.
  • Tenants with Health Issues are guaranteed housing that does not exacerbate their conditions, ensuring homes are free from hazards like mold and poor ventilation.

By leveraging AI, thorough, unbiased inspections can be conducted, providing reliable assessments that help landlords maintain compliance, give Public Housing Authorities and agencies better transparency, and support tenants in securing quality housing.

