More landlords and lenders are using AI. Fewer regulators are checking them for bias.
By Cassandra Dumay
Published on April 11, 2026.
The housing industry is rapidly adopting artificial intelligence tools to help decide who gets a home loan or lease, but the Trump administration is rolling back protections meant to ensure those evaluations are fair. The use of AI to predict outcomes, such as a home's selling price or how likely someone is to afford their rent, has fueled interest in expanding the role of computerized systems in housing. However, some fear these new systems could inadvertently reinforce discrimination in society.

Federal Reserve Governor Michael Barr, an outspoken critic of the administration's deregulatory agenda, said that because AI models are trained on data reflecting historic patterns of discrimination, they can produce unintentional bias.

The Department of Housing and Urban Development initially acknowledged disparate impact methods as an important tool to root out potential discrimination, but in President Donald Trump's second term it has changed its stance, arguing that disparate impact enforcement by federal agencies was unfair to businesses and led to illegal racial preferences. Critics counter that government agencies already lack the resources to effectively police these tools.