Colin Lecher reports in The Verge:
Landlords and lenders are pushing the Department of Housing and Urban Development to make it easier for businesses to discriminate against possible tenants using automated tools. Under a new proposal that just finished its public comment period, HUD suggested raising the bar for some legal challenges, making discrimination cases less likely to succeed. Fair housing advocates have cried foul, arguing that the change will open the door for companies to discriminate with algorithms and get away with it.

Like most modern industries, the housing market relies on automation. In deciding whether to rent or sell someone a home, businesses run background checks, calculate insurance costs, examine credit, and generally take account of an applicant’s history. The tools that are used are largely hidden from public view, but they can have a devastating cost: a faulty or biased algorithm won’t just harm a single person, but can shut people out of housing in entire neighborhoods.

To help ensure communities are all treated equally by those tools, HUD finalized a rule in 2013 known as the disparate impact standard. Under the rule, if a protected group of people is harmed by a policy — even if that policy isn’t directly targeted at that group — then the company or government agency that implemented the policy can be held liable. If a zoning algorithm disproportionately harms people of color, for example, the city might face a lawsuit under the rule.

The standard has proven to be a crucial aid for advocates dealing with algorithmic discrimination. In one recent case out of Connecticut, a fair housing group has used the policy to sue over an automated background check system. Under the new rule, attorneys would have to jump through new legal hoops to make a disparate impact case.
The proposed change has generated tens of thousands of comments, and a review of them shows a clear divide, as fair housing and civil rights advocates square off against private industry.

Housing, mortgage, and insurance companies have said the old rules are too burdensome. A mortgage subsidiary of DR Horton, which bills itself as the largest home-builder in America, said in a comment to HUD that the new plan could “reduce frivolous and arbitrary claims.” Another mortgage company told HUD that the revised rule would provide “clarity and uniformity for those who seek to comply with their legal responsibilities.” One insurance company argued to the agency that the changes would “more appropriately position insurers to defend against disparate impact challenges.”

But many local and national advocates have said that the changes would completely upend their work on behalf of vulnerable people. National groups including the Electronic Frontier Foundation and the American Civil Liberties Union have pushed back on the HUD plan. In a comment to the agency, the nonprofit Center for Democracy and Technology called the proposal an “unprecedented departure from decades of HUD and federal court precedent” and said that the agency’s reasons for the proposal “have no basis in law or in data or computer science.”

The Greater New Orleans Fair Housing Action Center, a nonprofit group in Louisiana, has used the disparate impact standard to challenge an algorithm that unfairly distributed less money to black families to rebuild their homes after Hurricane Katrina.
In a comment to HUD, the organization said the agency was proposing a “safe harbor” for housing companies that use algorithms to determine policy, and in the process, setting up the housing market to be “rife with discrimination.”

Cashauna Hill, executive director of the center, told The Verge that HUD’s changes would make similar cases “all but impossible” to pursue in the future.

“People have not only gotten smarter about how to discriminate,” she says, “but we also know that housing providers are outsourcing a lot of the work to data companies, and algorithms are doing a lot of the work.”