
The use of predictive analytics, data mining, and automated decision-making in managing welfare in America fails the impoverished and creates an oppressive surveillance system that deters the vulnerable from seeking assistance, argues Virginia Eubanks in her 2018 book Automating Inequality. A professor of political science at the University at Albany, SUNY, Eubanks adds to a growing chorus of concern challenging the idea that computer-based algorithms and systems are unbiased and accurate.

Part of the problem is that the mathematical formulas and programming in these systems are opaque and lack complete information about each person; the programs substitute other variables, or “proxies,” for the missing information. In predictive analytics, for example, a program might declare a person unlikely to repay a loan because of where they live, their formal education level, or even their language patterns. Such decisions are not fully informed and rest on inaccuracies or assumptions. These systems are called opaque because the variables and proxies – the information that informs the programs – are not known to the average user. How an automated system reaches its decisions is virtually unknowable to the public: every system and program uses different criteria, information, and substituted proxies.
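The substitution Eubanks describes can be sketched as a toy scoring model. The proxy names, weights, and threshold below are hypothetical and purely illustrative – they do not come from the book – but they show how a decision can be driven entirely by stand-ins rather than by the missing facts:

```python
# Toy illustration of proxy substitution in an automated decision.
# The model never sees actual repayment history; it guesses from
# correlated stand-ins ("proxies") such as neighbourhood and education.

# Hypothetical weights, chosen only for illustration.
PROXY_WEIGHTS = {
    "zip_code_risk": -0.6,    # where the applicant lives
    "education_years": 0.05,  # formal education level
    "formal_language": 0.3,   # language pattern in the application
}

def repayment_score(applicant):
    """Return a score in [0, 1]; higher means 'more likely to repay'."""
    raw = sum(PROXY_WEIGHTS[name] * applicant[name] for name in PROXY_WEIGHTS)
    # Clamp to [0, 1] so the output reads like a probability.
    return max(0.0, min(1.0, 0.5 + raw))

def decision(applicant, threshold=0.5):
    """An automated yes/no built on proxies, not on the missing facts."""
    return "approve" if repayment_score(applicant) >= threshold else "deny"
```

Note that the weights and threshold are invisible to the applicant, which is exactly the opacity the paragraph above describes: two people with identical repayment histories can receive different decisions because their proxies differ.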

The other part of the problem is the faith people place in automated decision-making: the assumption that the program or system is accurate and fair is not always true. The belief that the computer must be right can cause people to second-guess their own experience, knowledge, and judgement. Fortunately, as Eubanks explains, hybrid systems that grant workers greater decision-making power than the programs can reduce these instances, and such systems are already in use.

Instead of helping people in need, these unintended failings may worsen the problem. These systems compound the misery of poverty, create systemic racism and bias, and perpetuate the established notion that poverty is less a matter of social commitment and more a personal fault. In much the same way as the industrial-era Poorhouses of the 19th and 20th centuries acted as a moralized deterrent to seeking poverty relief – separating families and exploiting the poor as a labour force – the use of automated systems in modern poverty management continues to deter, exclude, and dehumanize the vulnerable. These modernized systems discourage and divert users from public benefits and erode civil liberties such as the right to mobility. The moralized suffering of poverty – the idea that poverty is a personal fault or character flaw – becomes grounds for racist and classist hierarchical structures that would not be tolerated by the wealthier classes.

Using ethnographies and interviews involving the management of automated welfare systems and the lived-experience of users, Eubanks shows how these modern systems create what she calls “the Digital Poorhouse.”

From Poorhouse to Database
The Poorhouse was a communal institution of last resort for the poverty-stricken. The infamous Rensselaer County House of Industry – the first Poorhouse built in Troy, New York – was an institutionalized nightmare, where the mentally ill slept on urine-soaked hay, lacked access to sanitation, and were confined in 4.5-by-7-foot cells for upwards of six months at a time.

The economic depression of 1819 fuelled a growing upper- and middle-class fear of the public’s increasing reliance on social assistance, or “pauperism”. This concern led to Josiah Quincy III’s idea of separating the needy into two primary groups: those deserving of charity, and those who could work for a living, the “able poor.” In Quincy’s mind, the worthy were the elderly and infirm, infants, and those with a “corporeal disability”; the rest could work to some degree or another. Those deemed able who nonetheless failed to work were heaped with scorn.

The advent of the scientific charity movement created the perspective that families in need of public benefits were curious cases in need of solving, further increasing public distrust and suspicion of welfare users.

Collecting information and creating databases about poor people is not a new development. During the eugenics movement in America, social scientists began interviewing, photographing, finger-printing, and evaluating the poor across the country. They inventoried children, mapped out family trees, and measured people’s heads. They investigated and identified people’s sexual histories and preferences and their personal habits and behaviours. The notes from these studies were moralized, labelling poor people as “imbecilic”, “feeble-minded”, “harlot”, and “dependent”.

Modern welfare in America still reflects some of the values and practices of the Poorhouse and of the scientific charity and eugenics movements. Service workers are still referred to as ‘caseworkers’ and clientele are ‘cases’; databases of information about the personal lives of the poor continue to be generated, compiled, evaluated, and shared with others. Like the institutional deterrent of the Poorhouse, the use of predictive analytics, data mining, and automated decision-making diverts and discourages the poor and vulnerable from seeking and getting the help they need.

Automating Eligibility in the Heartlands
Turning her attention to the state of Indiana, the author surveys the problematic evolution of automated welfare services, from system-wide crashes to court cases and protests. The tragic failures of the system are also featured, such as the case of Omega Young, whose medical and public benefits were cancelled because she failed to attend a meeting; at the time, she was hospitalized and receiving treatment for terminal cancer.

The reason thousands in Indiana were denied assistance or purged from the welfare rolls, Eubanks argues, lies in the classist and racist assumptions about the poor that persist today: the suspicion that people needing help are lazy and dishonest, and that they should be discouraged.

Although the system has improved since adopting a hybrid model rather than a more fully automated option, poverty continued to increase even as public benefit use declined.

High-Tech Homelessness in the City of Angels
Los Angeles, California, has had a long-standing problem with homelessness. A coordinated entry system – connecting the homeless with shelter – was intended to make sheltering the homeless easier by cutting red tape and promoting stabilized housing. However, a lack of stable rent history and low or no credit kept landlords from housing those in need. The homelessness problem continues.

Data collected on users of the program – including personal history, photograph, Social Security number, and known hang-outs – is then shared with 168 different organizations ranging from charities to government agencies to police services.

An additional concern is the lax or nonexistent security around this data. Before 1996, access to state welfare records such as these required due legal process: police and other services needed warrants and approval to see the files. After the welfare reforms of 1996, the homeless lost the due process afforded to other Americans, in that police could access a person’s welfare history on request. One example is Operation Talon, in which American authorities matched data on food stamp recipients against a database of outstanding warrants. Recipients with any warrant were called in under the pretense of a meeting and were instead arrested.

This erosion of civil rights is only perpetrated against the poor and vulnerable, Eubanks carefully notes. Other sensitive data, such as mortgage histories or student loans, still require warrants for the authorities to access them. The more affluent classes would not tolerate these intrusions and infringements of rights.

The Allegheny Algorithm
The Child, Youth, and Family Services of Allegheny County, Pennsylvania – which includes Pittsburgh – used predictive analytics to estimate the likelihood and severity of alleged child maltreatment. Included in the algorithm were two proxy variables: the frequency of reports of mistreatment, and whether the child was with their biological parents or in the care of another. Although touted for its “fair to good” accuracy, the predictions had a 70% error rate.

That inaccuracy created false positives, resulting in state surveillance and intervention where none was required, potentially exacerbating already-difficult times. More dangerous still were the cases where the program underrated the severity of the maltreatment, possibly overlooking children in real need.
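The tension between a "fair to good" accuracy claim and a high error rate among flagged families can be illustrated with simple confusion-matrix arithmetic. The counts below are hypothetical – they are not Allegheny's actual figures – and are chosen only to show how both statements can be true at once:

```python
# Toy confusion-matrix arithmetic: a screening score can look accurate
# overall yet be wrong for most of the families it actually flags.
# All counts are hypothetical, for illustration only.

def rates(tp, fp, fn, tn):
    """Compute summary rates from true/false positives and negatives.

    tp: correctly flagged cases      fp: wrongly flagged (false positives)
    fn: missed real cases            tn: correctly left alone
    """
    total = tp + fp + fn + tn
    return {
        "accuracy": (tp + tn) / total,      # can look "fair to good"
        "flag_error_rate": fp / (tp + fp),  # share of flagged families wrongly flagged
        "miss_rate": fn / (tp + fn),        # share of truly at-risk children missed
    }

# Example: 1,000 screened families, only 50 with real maltreatment.
r = rates(tp=30, fp=70, fn=20, tn=880)
```

With these illustrative numbers, overall accuracy is 91% (910 of 1,000 calls correct), yet 70% of the flagged families are false positives and 40% of the genuinely at-risk children are missed – the two failure modes the paragraph above describes. Because real maltreatment is rare, overall accuracy is dominated by the easy negatives and says little about either danger.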

Like LA’s homeless, families in need in Pennsylvania faced a deeper data investigation than wealthier families. Social workers mine social media sites and public benefit databases for a more detailed evaluation of the family. What is problematic for the author is that while public benefit usage records are used to inform the caseworker, private or out-of-pocket service records are not included. The issue is not that Eubanks thinks all service records – counselling, therapy, and the like – should be made available. Rather, because people in need access supportive services through public channels, those private details enter the worker’s evaluation, while people who can afford private services have no such history included in the caseworker’s assessment. This both potentially endangers the children and privileges the appearance of the more affluent family; the family in need, by previously seeking help, looks worse because it left a paper trail of supportive services. As Eubanks argues, the system thereby demonstrates both the lack of privacy afforded the poor and the biased idea that richer families are more deserving of privacy. In essence, this creates an algorithmic bias against the poor.

The inclusion of data from previously accessed public benefit services can itself be a deterrent to seeking further aid. The growing – and lasting – history of use and the erosion of privacy may cause people to refrain from getting help.

The Digital Poorhouse (and How to Dismantle It)
For Eubanks, the primary roadblock to creating a more equal and equitable society is that our ethical treatment and regard for the poor has not evolved nearly as quickly as technology has. In the context of the welfare safety net, this means that society cares less about the suffering of the impoverished and more about the perceived threat they may pose to the wealthier middle and upper classes.

Eubanks offers two recommendations for dismantling the Digital Poorhouse and improving outcomes for people needing social assistance: a declarative “Principles of Non-Harm for Big Data,” and the implementation of a guaranteed basic income. The former, intended for the digital age, is an oath encompassing respect for and the consent of end-users, the acknowledgement and removal of biases and barriers to seeking aid, and the overall use of the welfare state as a mechanism to help people rather than surveil them.

A guaranteed basic income is precisely what its name suggests: a modest income provided by the government without conditions, as a replacement for the current welfare system. Advocates claim that a ‘strings-free’ income could reduce financial stress in low-wage households, encourage educational pursuits, and eradicate the stigma associated with welfare.

Critical Reception
Writing about all the issues involved in poverty in contemporary America – including race and racism – would make for a very thick book indeed. Although Eubanks does write specifically about the disproportionate representation of and impact on minorities, some have argued that the connection between discrimination and race was not explored critically enough. The book has also been criticized for failing to explore the relationship between social services data and police action.

Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor successfully bridges the gap between academic research and mainstream reading thanks to Eubanks’s accessible writing. Described as “riveting”, the book has raised questions and concerns about the invasive and expansive web of surveillance that service users find themselves under – and, more directly, the fear that what is applied to the vulnerable may just as easily be applied to the rest of the populace.