…Similar faults have seen voters expunged from electoral rolls without notice, small businesses labeled as ineligible for government contracts, and individuals mistakenly identified as "deadbeat" parents. In a notable example of the latter, 56-year-old mechanic Walter Vollmer was incorrectly targeted by the Federal Parent Locator Service and issued a child-support bill for $206,000. Vollmer's wife of 32 years became suicidal in the aftermath, believing that her husband had been leading a secret life for much of their marriage.
Equally alarming is the possibility that an algorithm may falsely profile an individual as a terrorist: a fate that befalls roughly 1,500 unlucky airline travelers each week. Those fingered in the past as the result of data-matching errors include former Army majors, a four-year-old boy, and an American Airlines pilot who was detained 80 times over the course of a single year.
Many of these problems stem from the new roles algorithms play in law enforcement. As slashed budgets force staff cuts, automated systems have moved from simple administrative tools to become primary decision-makers.
Algorithms have consequences.
But I'm not quite sure that without the neutral side of the Internet (the livestreams whose "packets" moved as fast as the commercial, corporate and moneyed speech that travels on our networks, the Twitter feeds determined not by an opaque corporate algorithm but by my own choices) we'd be having this conversation.
-Zeynep Tufekci (University of North Carolina)
The importance of algorithms in our lives today cannot be overstated. They are used virtually everywhere, from financial institutions to dating sites. But some algorithms shape and control our world more than others, and these ten are the most significant.
Just a quick refresher before we get started. Though there's no formal definition, computer scientists describe algorithms as a set of rules that define a sequence of operations. They're a series of instructions that tell a computer how it's supposed to solve a problem or achieve a certain goal.
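To make that concrete, here is one of the oldest algorithms on record, Euclid's method for finding the greatest common divisor: a short, unambiguous sequence of steps that always terminates with the right answer.

```python
def gcd(a, b):
    """Euclid's algorithm: repeat one simple rule until done."""
    while b != 0:
        a, b = b, a % b  # replace (a, b) with (b, a mod b)
    return a

print(gcd(48, 18))  # -> 6
```

Every algorithm on the list below is, at heart, the same kind of thing as these five lines, just scaled up enormously.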
- Google Search
- Facebook’s News Feed
- OKCupid Date Matching
- NSA Data Collection, Interpretation, and Encryption
- “You may also enjoy…”
- Google AdWords
- High-Frequency Stock Trading
- MP3 Compression
- IBM’s CRUSH
When was the last time you read an online review about a local business or service on a platform like Yelp? Of course you want to make sure the local plumber you hire is honest, or that even if the date is a dud, at least the restaurant isn't lousy. A recent survey found that 76 percent of consumers check online reviews before buying, so a lot can hinge on a good or bad review. Such sites have become so important to local businesses that it's not uncommon for scheming owners to hire shills to boost themselves or put down their rivals.
To protect users from getting duped by fake reviews, Yelp employs an algorithmic review reviewer, which constantly scans reviews and relegates suspicious ones to a "filtered reviews" page, effectively de-emphasizing them without deleting them entirely. But of course that algorithm is not perfect, and it sometimes de-emphasizes legitimate reviews and leaves actual fakes intact. Oops. Some businesses have complained, alleging that the filter can incorrectly remove all of their most positive reviews, leaving them with a lowly one- or two-star average.
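Yelp has never published how its filter actually works, but a toy version of the general idea might flag reviews that pair an extreme rating with a thin account history. Every field name and threshold below is invented for illustration; this is emphatically not Yelp's real logic.

```python
def looks_suspicious(review):
    """Toy heuristic (NOT Yelp's algorithm): flag reviews that combine
    an extreme star rating with a thin or anonymous account."""
    extreme = review["stars"] in (1, 5)
    new_account = review["reviewer_review_count"] < 3
    no_profile = not review["reviewer_has_photo"]
    return extreme and (new_account or no_profile)

reviews = [
    {"stars": 5, "reviewer_review_count": 1, "reviewer_has_photo": False},
    {"stars": 4, "reviewer_review_count": 40, "reviewer_has_photo": True},
]
flagged = [r for r in reviews if looks_suspicious(r)]
print(len(flagged))  # -> 1
```

Even this crude sketch shows why false positives are inevitable: a genuine first-time reviewer who loved a restaurant looks exactly like a hired shill to signals this shallow.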
This is just one example of how algorithms are becoming ever more important in society, for everything from search engine personalization, discrimination, defamation, and censorship online, to how teachers are evaluated, how markets work, how political campaigns are run, and even how something like immigration is policed. Algorithms, driven by vast troves of data, are the new power brokers in society, in the corporate world as well as in government.
You're a 16-year-old Muslim kid in America. Say your name is Mohammad Abdullah. Your schoolmates are convinced that you're a terrorist. They keep typing Google queries like "is Mohammad Abdullah a terrorist?" and "Mohammad Abdullah al Qaeda." Google's search engine learns. All of a sudden, auto-complete starts suggesting terms like "Al Qaeda" as the next term in relation to your name. You know that colleges are looking up your name, and you're afraid of the impression they might get based on that auto-complete. You are already getting hostile comments in your hometown, a decidedly anti-Muslim environment. You know that you have nothing to do with Al Qaeda, but Google gives the impression that you do. And people are drawing that conclusion. You write to Google but nothing comes of it. What do you do?
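The "learning" at work here is, at its simplest, frequency counting: the more often a continuation follows a prefix in the query logs, the higher it ranks as a suggestion. A minimal sketch of that mechanism, using a made-up log on a neutral topic (real engines aggregate billions of searches and many more signals), might look like this:

```python
from collections import Counter

# Made-up query log for illustration only.
query_log = [
    "weather in boston",
    "weather in boston tomorrow",
    "weather in chicago",
    "weather in boston",
]

def suggest(prefix, log, k=1):
    """Rank continuations of `prefix` by how often they appear in the log."""
    tails = Counter(
        q[len(prefix):].strip()
        for q in log
        if q.startswith(prefix) and len(q) > len(prefix)
    )
    return [tail for tail, _ in tails.most_common(k)]

print(suggest("weather in", query_log))  # -> ['boston']
```

The trouble described above falls straight out of this design: the counter has no notion of truth, only of popularity, so a campaign of malicious queries is indistinguishable from genuine interest.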
This is guilt through algorithmic association…