In the News

Comment: CoreLogic's Use of Algorithms to Screen Housing Candidates Challenged in Approaching Trial

MLex

January 13, 2022

If a bank, landlord or employer illegally discriminates against a person based on an automated decision by a computer algorithm, who is guilty of discrimination — the algorithm, or the individual or organization that denies the job, apartment or service?

A US District Court trial in early 2022 may decide whether there's any crack of daylight between the responsibility of the algorithm and the landlord, based on claims by a Connecticut housing advocate that sued a tenant-screening service over allegations that its “CrimSAFE” algorithm violated fair housing and financial privacy laws in denying housing to a disabled Latino man.

That bench trial, slated to begin March 14 in New Haven, Connecticut, will pit defendant CoreLogic Rental Property Solutions against the Connecticut Fair Housing Center, which said in a 2018 complaint that CoreLogic's algorithm violated the Fair Housing Act by disqualifying “African-American and Latino applicants from securing housing based on discriminatory use of criminal records as rental criteria.”

— Denied housing —

The case involves Carmen Arroyo, who attempted to move her adult son, Mikhail, into the Willimantic, Connecticut, apartment complex where she lived, following a 2015 accident that left Mikhail unable to walk, speak, or care for himself. Mikhail Arroyo was denied housing based on a “disqualifying record” returned by the CrimSAFE algorithm, which did not provide any further detail on the alleged offense, the suit says. CrimSAFE pulls data from a national database of criminal arrest and incarceration records to screen applicants for rental housing.

“Typically, people talk about whether the algorithm can be liable. This will, I believe, be one of the first cases to show that the vendor providing these algorithmic reports is responsible,” said Christine Webber, a lawyer who represents the Arroyos in the case.

“You can always litigate over whether an individual landlord relied on criminal history improperly, but here it’s getting at the underlying system that allows this to happen, not just to the Arroyos, but to thousands of people across [Connecticut] and many more nationwide,” Webber said.

— RPS View —

. . .

The CrimSAFE algorithm is too simple to truly be labeled artificial intelligence, Webber said, though it is an example of automated decision-making. Webber said the goal of the trial is to win injunctive relief that would block use of the algorithm in the way it was used against Arroyo.

“That is the heart of what the trial is going to be about,” she said. “It's hard for anybody to deny there is adverse impact, and so it's really a question of ‘can the defendant justify the adverse impact as a business necessity, and if so, can we show there are less discriminatory alternatives to the current system?’”

There are alternatives, she said. “This is so much about the injunctive relief and getting the system to change. They don’t want to change the way they do business,” she said.

Claims under both the US Fair Housing Act and the Fair Credit Reporting Act, which protects the privacy of information collected by consumer reporting agencies, will be at issue in the trial, which is due to end March 31. Data cited by US District Judge Vanessa Bryant in her 2020 summary judgment ruling showed that Black people comprised nearly 30 percent of those arrested in Connecticut in 2016, though they made up only about 11 percent of the state's population.

That disparity is the key reason why an algorithm used to screen tenants based on criminal arrest records alone will discriminate against racial minorities, Webber said.

“When you decide to automatically exclude people from housing opportunities based on the simple existence of a criminal record, that is an across-the-board policy that is going to have disparate impacts on African-Americans; it’s going to have a disparate impact on Latinos,” she said.