The paper was written at a time when NLP practitioners were producing ever-bigger language models (LMs), in both # of parameters and size of training data, and pushing benchmark scores ever higher.
Environmental Risks
Large LMs consume a lot of resources, e.g. training a single BERT base model on GPUs was estimated to use as much energy as a trans-American flight.
Marginalized communities are doubly punished: they bear the brunt of the environmental costs, yet are least likely to benefit from LMs, e.g. …
The lack of explainability is a common theme. Higher-ups claim the machine is unbiased, while the workers on the ground say, “It’s not me; it’s the computer”.
Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, by Virginia Eubanks, should be an enlightening read.
Computers Can Solve Your Problem. You May Not Like the Answer
The algorithm had four guiding principles:
- Increase # of high school students starting after 8am
- Decrease # of elementary school students dismissed after 4pm
- Accommodate the needs of special education students
- Generate transportation savings

Unprecedented opposition to the algorithm’s solution …
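To make concrete how guiding principles like these get turned into something an optimizer can act on, here is a minimal toy sketch: a handful of hypothetical schools, made-up weights and time windows, and a brute-force search over bell-time assignments. This is not the actual Boston Public Schools / MIT model, just an illustration of the four principles as a weighted score.

```python
"""Toy sketch: encode the four guiding principles as a weighted score and
brute-force the best bell-time assignment for a few hypothetical schools.
All names, numbers, and weights are made up for illustration."""
from itertools import product

# Hypothetical data: (name, type, has_special_ed_program)
SCHOOLS = [
    ("North High", "high", False),
    ("East Elementary", "elementary", True),
    ("West Elementary", "elementary", False),
]
CANDIDATE_STARTS = [7.5, 8.0, 8.5, 9.0]  # start times, in hours after midnight
DAY_LENGTH = 6.5                         # hours from start bell to dismissal

# The weights are value judgments -- changing them changes who wins and loses.
W_HIGH_AFTER_8 = 3.0   # reward: high schools starting at or after 8am
W_ELEM_LATE_OUT = 2.0  # penalty: elementary dismissal after 4pm
W_SPED = 5.0           # penalty: special-ed school outside an 8:00-8:30 window
W_BUS_TIERS = 1.0      # penalty per distinct start time (fewer tiers = bus savings)

def score(assignment):
    """Higher is better; one term per guiding principle."""
    s = 0.0
    for (_name, kind, sped), start in zip(SCHOOLS, assignment):
        if kind == "high" and start >= 8.0:
            s += W_HIGH_AFTER_8
        if kind == "elementary" and start + DAY_LENGTH > 16.0:
            s -= W_ELEM_LATE_OUT
        if sped and not (8.0 <= start <= 8.5):
            s -= W_SPED
    # Staggered tiers let buses be reused across schools.
    s -= W_BUS_TIERS * len(set(assignment))
    return s

best = max(product(CANDIDATE_STARTS, repeat=len(SCHOOLS)), key=score)
for (name, _, _), start in zip(SCHOOLS, best):
    print(f"{name}: start bell at {start:.1f}h")
```

Even in this toy version, the “best” schedule hinges entirely on the chosen weights, which is roughly where the fight over the real schedule played out.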