Rage Against the Algorithm

Dated Jul 4, 2021; last modified on Mon, 05 Sep 2022

The lack of explainability is a common theme: higher-ups claim the machine is unbiased, while the workers on the ground say, “It’s not me; it’s the computer”.

Automating Inequality: How High-tech Tools Profile, Police, and Punish the Poor should be an enlightening read.

Computers Can Solve Your Problem. You May Not Like the Answer

The Boston Public Schools (BPS) start-time algorithm had four guiding principles (a toy scoring sketch follows the list):

  • Increase # of high school students starting after 8am
  • Decrease # of elementary school students dismissed after 4pm
  • Accommodate the needs of special education students
  • Generate transportation savings
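
To make the trade-offs concrete, here is a minimal sketch of how such competing objectives might be collapsed into a single score. The metric names, weights, and numbers are illustrative assumptions on my part, not BPS’s actual model.

```python
# Sketch: combining BPS-style objectives into one score. All names,
# weights, and figures here are assumptions for illustration.
from dataclasses import dataclass


@dataclass
class ScheduleMetrics:
    hs_starting_after_8am: int      # high schoolers starting after 8am
    elem_dismissed_after_4pm: int   # elementary students dismissed after 4pm
    sped_needs_met: int             # special education needs accommodated
    transport_savings_usd: float    # projected bus-cost savings


# Hypothetical weights. Choosing these is the hard, political part:
# they encode whose inconvenience counts for how much.
WEIGHTS = {
    "hs_starting_after_8am": 1.0,
    "elem_dismissed_after_4pm": -1.0,   # penalize late dismissals
    "sped_needs_met": 2.0,
    "transport_savings_usd": 0.001,     # dollars vs. students needs a scale
}


def score(m: ScheduleMetrics) -> float:
    """Higher is better under this (arbitrary) weighting."""
    return (
        WEIGHTS["hs_starting_after_8am"] * m.hs_starting_after_8am
        + WEIGHTS["elem_dismissed_after_4pm"] * m.elem_dismissed_after_4pm
        + WEIGHTS["sped_needs_met"] * m.sped_needs_met
        + WEIGHTS["transport_savings_usd"] * m.transport_savings_usd
    )


print(score(ScheduleMetrics(4200, 900, 350, 1_000_000.0)))  # 5000.0
```

Whatever weights one picks, the solver dutifully optimizes them; shifting a weight shifts which families win, and nothing in the math says what the weights should be.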

The algorithm’s solution met unprecedented opposition: angry parents signed petitions and stormed the school committee, and BPS dropped the solution.

Complications:

  • Younger students were being forced into earlier start hours.
Politically connected families were trying to get better start times for themselves, as opposed to an equitable distribution amongst neighborhoods.
Black/brown parents tend to have lower-wage jobs that are inflexible to schedule changes; \(\approx\) 85% would be affected.

It was people who made the final call. This was a fundamentally human conflict, and all the computing power in the world couldn’t solve it. [David Scharfenberg]

I like that quote from the article. Specifying the objectives for the algorithm was hard. The computer did what it was told!

To Gmail, Most Black Lives Matter Emails Are ‘Promotions’

70% of political and racial justice emails to Gmail accounts are placed in the “Promotions” tab, which isn’t as noticeable as the primary inbox.

Sorting email looks like a hard software problem. If the sender isn’t in the recipient’s contacts, and Gmail has seen the same sender in thousands of emails to other Gmail accounts, how likely is it that the sender is sending a personal email?
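
A toy version of that heuristic, below. This is not Gmail’s actual (proprietary) classifier; the signals and threshold are assumptions for illustration, but they show why a fundraising appeal and a marketing blast look identical to a bulk-sender rule.

```python
# Toy heuristic: guess a tab from two signals. Names and threshold
# are assumptions, not Gmail's real model.

BULK_SENDER_THRESHOLD = 1000  # distinct recipients before a sender looks "bulk"


def predicted_tab(sender: str, contacts: set[str],
                  recipients_seen: dict[str, int]) -> str:
    """Is the sender in this user's contacts, and how many accounts
    has the sender mailed overall?"""
    if sender in contacts:
        return "primary"
    if recipients_seen.get(sender, 0) >= BULK_SENDER_THRESHOLD:
        return "promotions"  # a newsletter and an activism appeal look alike
    return "primary"


contacts = {"friend@example.com"}
recipients_seen = {"news@racialjustice.example.org": 250_000}
print(predicted_tab("news@racialjustice.example.org", contacts, recipients_seen))
# -> "promotions": volume alone can't tell marketing from activism.
```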

It seems that the onus is on mailing list maintainers to instruct users on how to make sure that future emails get routed to the “Primary” tab, e.g., “create a rule that routes our emails to your primary inbox, so that you don’t miss our emails.”

A Healthcare Algorithm Started Cutting Care, and No One Knew Why

Slashdot’s editor posted The Guardian’s story [Erin McCormick], which is how I came to learn of this issue, but The Verge’s piece [Colin Lecher] is a more comprehensive account that provides multiple viewpoints.

The program was used to apportion home care assistance. The underlying problem is insufficient resources. The algorithm aims to divvy up what is available as equitably as possible, without falling to the subjectivity of care assessors.

The work by Prof. Fries on the Minimum Data Set (MDS) for Resident Assessment and Care Screening is probably influential in InterRAI. The items in the MDS include cognition, hearing, vision, mood, social functioning, informal support, physical functioning, continence, disease diagnosis, health conditions, nutrition/hydration, dental, skin, environment, service utilization and medications.

The needs assessor administers an annual questionnaire, and the algorithm sorts patients into levels of need, with each level affording a standard number of hours of care.
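
A minimal sketch of that pipeline, assuming made-up item weights, tier cutoffs, and hour allotments; InterRAI’s actual RUG-style grouping is far more involved.

```python
# Sketch: questionnaire -> need level -> standard weekly hours.
# Weights, cutoffs, and hours are illustrative assumptions.

ITEM_WEIGHTS = {          # a subset of MDS-like domains
    "cognition": 3,
    "physical_functioning": 4,
    "continence": 2,
    "nutrition": 2,
    "informal_support": -2,  # more family help -> less paid care needed
}

TIER_CUTOFFS = [(30, 3), (20, 2), (10, 1), (0, 0)]   # (min score, tier)
HOURS_PER_TIER = {0: 0, 1: 8, 2: 20, 3: 40}          # hours per week


def weekly_hours(assessment: dict[str, int]) -> int:
    """Score the questionnaire, bucket into a tier, return its hours."""
    total = sum(ITEM_WEIGHTS[item] * value
                for item, value in assessment.items())
    tier = next(t for cutoff, t in TIER_CUTOFFS if total >= cutoff)
    return HOURS_PER_TIER[tier]


print(weekly_hours({"cognition": 2, "physical_functioning": 3,
                    "continence": 2, "nutrition": 1,
                    "informal_support": 2}))  # score 20 -> tier 2 -> 20 hours
```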

Given the number of factors considered, determining how to weight and rank them looks like a difficult problem. Some implementations even ignored some of the data points collected by the care assessors.

However, there were flaws, e.g. failing to factor in cerebral palsy or diabetes (although Fries’ theoretical model considered these, the 3rd-party software was not updated with the developments), and double amputees being marked as mobile because they use wheelchairs. The algorithm was also unstable for people at the margins: a small change in one answer could tip a person into a different level of need, with a large swing in allotted hours.
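
The instability is easy to see in a stripped-down version of the sketch above. With a hard cutoff (again, an assumed value), a one-point change in a single answer can halve the hours, even though the person barely changed.

```python
# Stripped-down demo of instability at a tier boundary. The cutoff
# and hour values are assumptions for illustration.

CUTOFF = 20                      # assumed tier boundary
HOURS = {True: 20, False: 8}     # weekly hours above vs. below the cutoff


def hours_for(score: int) -> int:
    return HOURS[score >= CUTOFF]


print(hours_for(20))  # 20 hours this year...
print(hours_for(19))  # ...8 hours next year, after one answer shifts by a point
```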

The pieces highlight the plight of those who had their care cut. But doesn’t that also mean that some people started getting care because of the algorithm? Where is their story? Are they likely to be part of communities that tend not to have an online voice?

Idaho’s in-house system was opaque, and in court proceedings its data was found to be deeply flawed and mostly discarded. Its decisions were inexplicable even to the people handling appeals. Its abrupt introduction led to undesirable drastic change. The system was discontinued.

References

  1. Computers Can Solve Your Problem. You May Not Like the Answer. What happened when Boston Public Schools tried for equity with an algorithm. David Scharfenberg. apps.bostonglobe.com. Sep 21, 2018.
  2. To Gmail, Most Black Lives Matter Emails Are 'Promotions'. Adrianne Jeffries; Leon Yin. themarkup.org. Jul 2, 2020.
  3. What Happened When a 'Wildly Irrational' Algorithm Made Crucial Healthcare Decisions. Erin McCormick. www.theguardian.com. Jul 2, 2021. Accessed Jul 4, 2021.
  4. Brant E. Fries, PhD. sph.umich.edu. scholar.google.com. Accessed Jul 4, 2021.
  5. A Healthcare Algorithm Started Cutting Care, and No One Knew Why. Colin Lecher. www.theverge.com. Mar 21, 2018. Accessed Jul 4, 2021.