Image: Vitalii Vodolazskyi / stock.adobe.com

To enforce the provisions of the Fair Housing Act and protect Americans from discrimination in obtaining housing, the Department of Housing and Urban Development (HUD) has released two new documents on the use of artificial intelligence (AI) and algorithms in housing processes. One addresses the potential misuse of AI algorithms in tenant screening; the other focuses on housing availability advertisements, two activities where research indicates AI software can introduce bias and potential discrimination, Route Fifty reports.

HUD's tenant screening guidance explains that housing providers cannot discriminate against applicants based on race, color, religion, sex or gender identity, national origin, disability, or familial status, and that providers will be held accountable for discriminatory actions whether carried out internally or by third-party algorithms.

“A housing provider or a tenant screening company can violate the Fair Housing Act by using a protected characteristic — or a proxy for a protected characteristic — as a screening criterion,” the document reads. “This is true even if the decision for how to screen applicants is made in whole or in part by an automated system, including a system using machine learning or another form of AI.”

Similarly, HUD stipulates that entities or individuals posting targeted advertisements for housing opportunities and services covered by the Fair Housing Act can be held liable if those ads are found to discriminate based on the same protected characteristics.
