…Similar faults have seen voters expunged from electoral rolls without notice, small businesses labeled as ineligible for government contracts, and individuals mistakenly identified as “deadbeat” parents. In a notable example of the latter, 56-year-old mechanic Walter Vollmer was incorrectly targeted by the Federal Parent Locator Service and issued a child-support bill for the sum of $206,000. Vollmer’s wife of 32 years became suicidal in the aftermath, believing that her husband had been leading a secret life for much of their marriage.
Equally alarming is the possibility that an algorithm may falsely profile an individual as a terrorist: a fate that befalls roughly 1,500 unlucky airline travelers each week. Those flagged in the past as a result of data-matching errors include former Army majors, a four-year-old boy, and an American Airlines pilot—who was detained 80 times over the course of a single year.
Many of these problems stem from the new roles algorithms play in law enforcement. As slashed budgets lead to staff cuts, automated systems have moved from being simple administrative tools to becoming primary decision-makers.
You’re a 16-year-old Muslim kid in America. Say your name is Mohammad Abdullah. Your schoolmates are convinced that you’re a terrorist. They keep typing Google queries like “is Mohammad Abdullah a terrorist?” and “Mohammad Abdullah al Qaeda.” Google’s search engine learns. All of a sudden, auto-complete starts suggesting terms like “Al Qaeda” alongside your name. You know that colleges are looking up your name, and you’re afraid of the impression they might get from that auto-complete. You are already getting hostile comments in your hometown, a decidedly anti-Muslim environment. You know that you have nothing to do with Al Qaeda, but Google gives the impression that you do. And people are drawing that conclusion. You write to Google, but nothing comes of it. What do you do?
This is guilt through algorithmic association…