Automating Inequality
How High-Tech Tools Profile, Police, and Punish the Poor
Virginia Eubanks
-
According to court documents, in December 2007 just over 11,000 documents were unindexed. By February 2009, nearly 283,000 documents had disappeared, an increase of 2,473 percent. The rise in technical errors far outpaced increased system use. The consequences are staggering if you consider that any single missing document could cause an applicant to be denied benefits.
-
The automation's impacts were devastating for poor and working-class Hoosiers. Between 2006 and 2008, the state of Indiana denied more than a million applications for food stamps, Medicaid, and cash benefits, a 54 percent increase compared to the three years prior to automation.
-
Coordinated entry is based on two philosophies that represent a paradigm shift in the provision of homeless services: prioritization and housing first. Prioritization builds on research by Dennis Culhane from the University of Pennsylvania, which differentiates between two different kinds of homelessness: crisis and chronic. Those facing crisis homelessness tend to be experiencing “short-term emergencies [such as] eviction, domestic violence, sudden illness, or job loss, or reentering communities after incarceration.” The crisis homeless, Culhane argues, often self-correct.
-
While there is little research on the subject, a study of data from the 1998 Canadian Incidence Study of Reported Child Abuse and Neglect found that approximately 4 percent of reports of child maltreatment were intentionally false. Of the 15,139 total reports of child abuse and neglect Allegheny County received in 2016, we can conservatively estimate that 605 were intentionally false. It is illegal to call a malicious report into a child abuse and neglect hotline. But Pennsylvania currently accepts reports from anonymous callers, so there is little a parent can do if a neighbor, relative, or acquaintance decides to harass or intimidate them in this way. The AFST has no way of recognizing or screening out nuisance calls. Call referral is a deeply problematic proxy for maltreatment. It can be easily manipulated. CYF's own research shows that it creates nearly all the racial disproportionality in the county's child protective system. In other words, the activity that introduces the most racial bias into the system is the very way the model defines maltreatment.
-
African American poverty decreased dramatically during the 1960s and the African American share of AFDC caseloads declined. But the percentage of African Americans represented in news magazine stories about poverty jumped from 27 to 72 percent between 1964 and 1967.
-
Ronald Reagan's 1976 stump speech about the lavish lifestyle of “welfare queen” Linda Taylor was intended to make the face of welfare both Black and female. “There's a woman in Chicago,” he said during the New Hampshire Republican presidential primary contest. “She has 80 names, 30 addresses, 12 Social Security cards and is collecting veterans' benefits on four non-existing deceased husbands. She's got Medicaid, getting food stamps and she is collecting welfare under each of her names. Her tax-free cash income alone is over $150,000.” Ms. Taylor was eventually charged with using 4 aliases, not 80, and collecting $8,000, not $150,000, but Reagan's overblown claims found fertile ground, and the image of the welfare queen has remained central to our country's understanding of public assistance.
-
Models are useful because they let us strip out extraneous information and focus only on what is most critical to the outcomes we are trying to predict. But they are also abstractions. Choices about what goes into them reflect the priorities and preoccupations of their creators.
-
In its suit against IBM, the state charged that the company had misrepresented its ability to modernize complicated social service programs and failed to meet the performance standards contained in the contract. Automated counties lagged behind “as-is” counties in almost every area of performance: timeliness, backlogs, data integrity, determination errors, and number of appeals requested.
-
In his famous novel 1984, George Orwell got one thing wrong. Big Brother is not watching you, he's watching us. Most people are targeted for digital scrutiny as members of social groups, not as individuals. People of color, migrants, unpopular religious groups, sexual minorities, the poor, and other oppressed and exploited populations bear a much higher burden of monitoring and tracking than advantaged groups.
-
While coordinated entry may minimize some of the implicit bias of individual homeless service providers, Blasi reflected, that doesn't mean it is a good idea. “My objection to [coordinated entry] is that it has drawn resources and attention from other aspects of the problem. For 30 years, I've seen this notion, especially among well-educated people, that it's just a question of information. Homeless people just don't have the information.”
-
In short, when poor and working people in the United States become a politically viable force, relief institutions and their technologies of control shift to better facilitate cultural denial and to rationalize a brutal return to subserviency. Relief institutions are machines for undermining the collective power of poor and working-class people, and for producing indifference in everyone else.
-
If sleeping in a public park, leaving your possessions on the sidewalk, or urinating in a stairwell are met with a ticket, the great majority of the unhoused have no way to pay resulting fines. The tickets turn into warrants, and then law enforcement has further reason to search the databases to find “fugitives.” Thus, data collection, storage, and sharing in homeless service programs are often starting points in a process that criminalizes the poor.
-
Despite their cruelty and high cost, county poorhouses were the nation's primary mode of poverty management until they were overwhelmed by the Panic of 1873.
-
Removing human discretion from public assistance eligibility may seem like a compelling solution to the continuing discrimination African Americans face in the welfare system. After all, a computer applies the rules to each case consistently and without prejudice. But historically, the removal of human discretion and the creation of inflexible rules in public services only compound racially disparate harms.
-
Widespread reproductive restrictions were perhaps the inevitable destination for scientific charity and eugenics. In the Buck v. Bell case that legalized involuntary sterilization, Supreme Court Justice Oliver Wendell Holmes famously wrote, “It is better for all the world if, instead of waiting to execute degenerate offspring for crime or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind. The principle that sustains compulsory vaccination is broad enough to cover cutting the Fallopian tubes.” Though the practice fell out of favor in light of Nazi atrocities during World War II, eugenics resulted in more than 60,000 compulsory sterilizations of poor and working-class people in the United States.