From Code-Dependent: Pros and Cons of the Algorithm Age, a new report from the Pew Research Center and Elon University’s Imagining the Internet Center, based on responses from “technology experts, scholars, corporate practitioners and government leaders” who were asked this question: “Will the net overall effect of algorithms be positive for individuals and society or negative for individuals and society?”
Algorithms are step-by-step procedures for turning inputs into outputs, and they are the basis for all machine learning and so-called artificial intelligence. As Merriam-Webster notes, the word “was formed from algorism ‘the system of Arabic numerals,’ a word that goes back to Middle English and ultimately stems from the name of a 9th-century Persian mathematician, abu-Jafar Mohammed ibn-Musa al-Khuwarizmi, who did important work in the fields of algebra and numeric systems.”
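The idea is easier to see in a concrete case than in a definition. Euclid’s method for finding the greatest common divisor of two numbers, a procedure far older than any computer, is a minimal sketch (not drawn from the report itself) of what a fixed sequence of steps turning inputs into an output looks like:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeat one simple rule until done.
    Any person or machine following these steps gets the same answer."""
    while b:
        a, b = b, a % b  # replace (a, b) with (b, a mod b) until b reaches 0
    return a

print(gcd(48, 18))  # → 6
```

The same is true of the algorithms the report’s respondents describe: each is just such a mechanical recipe, only with our clicks, purchases, and locations as the inputs and a ranking, price, or recommendation as the output.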
Algorithms are ubiquitous in a wired world, used to track us and sell us on products and ideas, yet they themselves remain hidden from view as they harvest data about our likes and dislikes, habits, hobbies, driving patterns, and much, much more.
Here are some of the responses we think are particularly telling:
Chris Showell, an independent health informatics researcher based in Australia, said, “The organisation developing the algorithm has significant capacity to influence or moderate the behaviour of those who rely on the algorithm’s output. Two current examples: manipulation of the process displayed in online marketplaces, and use of ‘secret’ algorithms in evaluating social welfare recipients. There will be many others in years to come. It will be challenging for even well-educated users to understand how an algorithm might assess them, or manipulate their behaviour. Disadvantaged and poorly educated users are likely to be left completely unprotected.”
Writer James Hinton commented, “The fact the internet can, through algorithms, be used to almost read our minds, means those who have access to the algorithms and their databases have a vast opportunity to manipulate large population groups. The much-talked-about ‘experiment’ conducted by Facebook to determine if it could manipulate people emotionally through deliberate tampering with news feeds is but one example of both the power, and the lack of ethics, that can be displayed.”
An anonymous president of a consulting firm said, “LinkedIn tries to manipulate me to benefit from my contacts’ contacts and much more. If everyone is intentionally using or manipulating each other, is it acceptable? We need to see more-honest, trust-building innovations and fewer snarky corporate manipulative design tricks. Someone told me that someday only rich people will not have smartphones, suggesting that buying back the time in our day will soon become the key to quality lifestyles in our age of information overload. At what cost, and with what ‘best practices’ for the use of our recovered time per day? The overall question is whether good or bad behaviors will predominate globally.”
This consultant suggested: “Once people understand which algorithms manipulate them to build corporate revenues without benefiting users, they will be looking for more-honest algorithm systems that share the benefits as fairly as possible. When everyone globally is online, another 4 billion young and poor learners will be coming online. A system could go viral to win trillions in annual revenues based on micropayments due to sheer volume.
Example: The Facebook denumerator app removes the manipulative aspects of Facebook, allowing users to return to more typically social behavior.”
Several respondents expressed concerns about a particular industry – insurers. An anonymous respondent commented, “The increasing migration of health data into the realm of ‘big data’ has potential for the nightmare scenario of Gattaca writ real.”
An executive director for an open source software organization commented, “Most people will simply lose agency as they don’t understand how choices are being made for them.”
One respondent said, “Everything will be ‘custom’-tailored based on the groupthink of the algorithms; the destruction of free thought and critical thinking will ensure the best generation is totally subordinate to the ruling class.”
Another respondent wrote, “Current systems are designed to emphasize the collection, concentration and use of data and algorithms by relatively few large institutions that are not accountable to anyone, and/or if they are theoretically accountable are so hard to hold accountable that they are practically unaccountable to anyone. This concentration of data and knowledge creates a new form of surveillance and oppression (writ large). It is antithetical to and undermines the entire underlying fabric of the erstwhile social form enshrined in the U.S. Constitution and our current political-economic-legal system. Just because people don’t see it happening doesn’t mean that it’s not, or that it’s not undermining our social structures. It is. It will only get worse because there’s no ‘crisis’ to respond to, and hence, not only no motivation to change, but every reason to keep it going – especially by the powerful interests involved. We are heading for a nightmare.”
A scientific editor observed, “The system will win; people will lose. Call it ‘The Selfish Algorithm’; algorithms will naturally find and exploit our built-in behavioral compulsions for their own purposes. We’re not even consumers anymore. As if that wasn’t already degrading enough, it’s commonplace to observe that these days people are the product. The increasing use of ‘algorithms’ will only – very rapidly – accelerate that trend. Web 1.0 was actually pretty exciting. Web 2.0 provides more convenience for citizens who need to get a ride home, but at the same time – and it’s naive to think this is a coincidence – it’s also a monetized, corporatized, disempowering, cannibalizing harbinger of the End Times. (I exaggerate for effect. But not by much.)”