[IS/MDIA 590]Yohta's Workspace-Community Data

Week 12 (4/10)

Automating Inequality (2018・Virginia Eubanks)

[Chapter 4. THE ALLEGHENY ALGORITHM]

Probabilities are based on past facts; models can't take recent trends into consideration. Things change, and people often make unreasonable decisions, influenced by various factors.

"The AFST, like all risk models, offers only probabilities, not a perfect prediction. Though it might be able to identify patterns and trends, it is routinely wrong about individual cases."(p.117)

"But they(models) are also abstractions. Choices about what goes into them reflect the priorities and preoccupations of their creators. Human decision-making is reflected in three key components of the AFST: outcome variables, predictive variables, and validation data."(p.118)


If this is the case, more residents would game the public service system, and the resulting decline in public service use would lead to a negative spiral.
"The AFST sees the use of public services as a risk to children. A quarter of the predictive variables in the AFST are direct measures of poverty: they track use of means-tested programs such as TANF, Supplemental Security Income, SNAP, and county medical assistance."(p.128)

"Families avoid CYF if they can afford to because the agency mixes two distinct and contradictory roles: provider of family support and investigator of maltreatment."(p.129)

If resistance to authority is considered a negative indicator, fewer people would have the incentive to assert themselves, which consequently leads to a negative spiral.
"Research by University of Denver sociologist Jennifer Reich shows that, as police officers, many child welfare caseworkers see resistance as an indicator of guilt"(p.136)


"I find the philosophy that sees human beings as unknowable black boxes and machines as transparent deeply troubling. It seems to me a worldview that surrenders any attempt at empathy and forecloses the possibility of ethical development. The presumption that human decision-making is opaque and inaccessible is an admission that we have abandoned a social commitment to try to understand each other"(p.138)
 

<Term>

Allegheny Family Screening Tool (AFST):
A new predictive risk model the county is using to forecast child abuse and neglect.
 

[Chapter 5. THE DIGITAL POORHOUSE]

It is our own belief about our efficacy, not anything about the man himself, that makes us avoid eye contact with him.
"When we passed the anguished man near the Los Angeles Public Library and did not ask him if he needed help, it was because we have collectively convinced ourselves that there is nothing we can do for him."(p.144)

This quote relates to the discussion we had on 4/3.
The system seems to attribute responsibility to the individual (for lacking literacy, or for being lazy by nature), while there is little or no discussion of the system itself.

"Our public policy fixates on attributing blame for poverty rather than remedying its effects or abolishing its causes."(p.144)

These classifications are more likely to reinforce the status quo, where the poor stay poor, and vice versa.
"Classification measures the behavior of individuals to group like with like. Prediction is aimed instead at networks. The AFST is run on every member of a household, not only on the parent or child reported to the hotline. Under the new regime of prediction, you are impacted not only by your own actions, but by the actions of your lovers, housemates, relatives, and neighbors."(p.149)

Power dynamics of the system: the system might be designed to protect the interests of the upper-middle class, not to improve conditions for the poor.


"No poverty regulation system in history has concentrated so much effort on trying to guess how its targets might behave. This is because we, collectively, care less about the actual suffering of those living in poverty and more about the potential threat they might pose to others."(p.149)


"We measure human worth based only on the ability to earn a wage, and suffer in a world that undervalues care and community."(p.154)

"Asked by Julia Carrie Wong of The Guardian how he felt about his role in Uber’s future, Rob Judge, who had been driving for the company for three months, said, “It feels like we’re just rentals. We’re kind of like placeholders until the technology comes out.”"(p.157)

"But bias is introduced through programming choices, data selection, and performance metrics. The digital poorhouse, in short, does not treat like cases alike."(p.159)

<Term>

Cultural denial:
The process that allows us to know about cruelty, discrimination, and repression, but never openly acknowledge it.

Digital poorhouse:
The digital poorhouse is part of a long American tradition. We manage the individual poor
in order to escape our shared responsibility for eradicating poverty.

Questions

It seems to me that fundamental trust in public services is broken due to a negative spiral:
  1. The digital poorhouse
  2. Blind dependence on the system for decision-making
  3. Resistance by targeted people is considered a negative indicator
  4. Their sense of efficacy diminishes, and hopelessness leads fewer people to raise their voices
I assume this is caused by a lack of incentive for service providers to improve their algorithms.
1. Is there any legal approach that forces service providers to take community members' qualitative satisfaction into consideration?
2. Most residents can't choose where they live. But is there any way to promote a free-market system for service providers, such as replacing
    the provider if resident satisfaction falls below a certain threshold?