In the News

Your Data Is Discriminating…Against You

Marie Claire

October 1, 2020

For some, privacy infringement doesn’t just mean annoying ads; it could mean being denied a job or housing. Prachi Gupta investigates big data’s big problem.

In 2016, Carmen Arroyo’s 22-year-old son, Mikhail, regained consciousness from a six-month coma. He had been electrocuted while atop an electrical pole and had fallen nearly 30 feet, leaving him unable to walk, speak, or take care of himself. Arroyo, then 44, filed an application with her landlord requesting permission for her son to move in with her at her apartment in Willimantic, Connecticut. According to court records, the application was quickly denied without explanation, and Mikhail was sent to a rehabilitation facility, where he would remain for more than a year while his mother searched for a reason why.

Arroyo contacted the Connecticut Fair Housing Center (CFHC), a nonprofit that provides free legal services to alleged victims of housing discrimination. In the process of filing a complaint against the landlord, Arroyo and her lawyers discovered that the landlord didn’t know why the application had been denied either; the decision hadn’t been made by him but by an algorithm used by CoreLogic, a software company he had enlisted to screen potential tenants. After Arroyo filed her complaint, the landlord allowed Mikhail to move in with his mother. Arroyo’s lawyers kept digging and ultimately determined what caused the rejection: a citation for shoplifting from 2014 (which had since been withdrawn), according to court documents. “He was blacklisted from housing, despite the fact that he is so severely disabled now and is incapable of committing any crime,” says Salmun Kazerounian, a staff attorney from CFHC who represents Arroyo.

What happened to the Arroyo family is just one example of data leading to discrimination. Automated data systems—technology like CoreLogic’s—use collected intel (public data, such as DMV and court records, that may also be packaged with information scraped from the Internet, like social-media activity) to make life-altering decisions, including whether applicants get jobs, the cost of their insurance, or how a community is policed. In theory, these systems are built to eliminate bias present in human decision-making. In reality, they can fuel it.

That is in part because algorithms are built on biased data and often don’t consider other relevant factors. Because low-income people have more contact with government agencies (for benefits like Medicaid), a disproportionate amount of their info feeds these systems. Not only can this data fall into corporate hands, but the government itself uses it to surveil. For example, when UC Berkeley law professor Khiara Bridges interviewed pregnant women applying for prenatal care under Medicaid, she found that they had to reveal their sexual histories and incidents of domestic violence—details that can then be shared with other public agencies. “I talked to pregnant women who came to the clinic just to get prenatal care, and then the next day they would get a call from Child Protective Services,” Bridges says. When people seek support from the state, “that can end up penalizing them later,” adds University of Baltimore law professor Michele E. Gilman. A person applying for a public benefit can be flagged as a risk, which limits future housing or employment opportunities. People who don’t need to apply for public benefits are exempt from these injustices.

The problem is pervasive, invisible, and cyclical: Biased data is used to justify surveillance, creating an endless feedback loop of discrimination. In 2012, the Chicago Police Department began using predictive analytics, reliant mostly on arrest-record data, to increase surveillance on certain individuals it considered more likely to commit, or be victims of, gun violence. The program was shelved in 2019 after findings showed it was ineffective at reducing homicides. Civil-rights groups said it perpetuated racial bias. “The algorithm is developed from what you give it,” says Brandi Collins-Dexter, senior campaign director for racial-justice-advocacy group Color of Change. “If you give it trash, it’s going to give you trash.” Feed an algorithm biased information and it will enable future bias.

Read the complete article here on Marie Claire.

Cohen Milstein is partnering with the Connecticut Fair Housing Center and the National Housing Law Project in representing Carmen Arroyo.