My last essay was about communication. More specifically, it was about how technology has invaded and overtaken our ways of communicating with each other, to the point that it has removed much of what makes humans really connect. As I was drafting that essay, another theme began to emerge around my concerns with all this computer code that envelops us every day. And that’s where we are going to go today.
First, a disclaimer or two. Although I am 57 years old at the time of this writing in 2019, I do not consider myself a Luddite. (For any of you millennials or Gen Zers out there who might be reading this, a Luddite is someone who distrusts and resists new technology. Look it up. A little history would be good for you.) Over my lifetime, I have seen computers go from room-sized machines regarded with suspicion to something you carry in your pocket. Everywhere. And use for everything from a calculator to a phone to a navigator to an entertainment source. I’ve seen the Internet move from a military project to a ubiquitous, unseen force that invades every aspect of our lives. And I’ve seen the world become connected in a broad, instantaneous, rule-less way that is equal parts amazingly beneficial and amazingly destructive. I use a good many of these tools. However, while I have Twitter and Insta accounts, I follow without posting, and Alexa has not made her way into my home. Why? Because of my fear of the Tyranny of the Algorithm.
An algorithm, very simply put, is computer code that provides rules and instructions for some action and output. Algorithms are how Amazon recommends things you may like, how YouTube and Netflix decide what you’ll see next, and how Facebook picks the ads in your feed. These algorithms take the information you provide through your browsing habits and other publicly (we hope) available information about you, and the code produces the output you see. Those rules are influenced, as well, by advertising dollars. We all get annoyed at ads that pop up on Facebook for the shoes we were looking at on some website yesterday, but with the explosion in artificial intelligence it’s time to get very wary. (Another disclaimer: I know just enough about coding and artificial intelligence to be dangerous. If this blog had a broader readership, I’m sure I would be slammed for my ignorance of nuance, but I think I have the basics right.)
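For the more hands-on among you, here is a little sketch, in Python, of what I mean. It is purely illustrative: the categories, the numbers, and the “ad boost” are all made up, and no real site is anywhere near this simple. But the shape is the idea: rules, plus your browsing data, plus some advertising dollars, equals the list you see.

```python
# A toy "recommendation algorithm" -- entirely made up, not how any real site
# works. Fixed rules take what you've browsed (plus a little advertiser money)
# and turn it into the list you see next.

# Hypothetical data about one shopper: categories browsed, and how often.
browsing_history = {"running shoes": 5, "mystery novels": 2, "gardening": 1}

# Hypothetical advertiser payments that boost certain items.
ad_boost = {"running shoes": 2.0, "kitchen gadgets": 3.0}

# Everything the site could show you.
catalog = ["running shoes", "mystery novels", "gardening",
           "kitchen gadgets", "poetry", "world history"]

def score(item):
    # Rule 1: the more you've clicked on something, the higher it scores.
    interest = browsing_history.get(item, 0)
    # Rule 2: paid placement gets a bump, whether you wanted it or not.
    return interest + ad_boost.get(item, 0.0)

# The "output you see": the catalog re-ordered by the rules above.
for item in sorted(catalog, key=score, reverse=True):
    print(f"{item}: {score(item)}")
# Note what never makes it to the top: poetry and world history, the things
# you haven't clicked yet and nobody paid to promote.
```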
Artificial Intelligence, or AI, is essentially machines (computers and what they control) learning to do things on their own. Algorithms on digital steroids. The beneficial side of AI includes anticipating your needs—everything from populating your grocery list to improving the accuracy of surgical robots. The bad side is the tendency for AI to “suggest” you into deeper and narrower subsets of what life has to offer by assuming that if you like a little of something then clearly you would like a lot more. And in the absence of other stimuli, we forget that there is more out there.
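Here is that narrowing in miniature, again as a made-up Python sketch rather than anything a real platform actually runs. The only “learning” rule is “show more of whatever got clicked the most,” and watch what happens to everything else.

```python
# A toy feedback loop -- hypothetical topics, hypothetical numbers -- showing
# how "a little of something" becomes "a lot more of only that thing."
import random

random.seed(1)  # so the example is repeatable

# You start out mildly interested in several things (one extra click on tomatoes).
clicks = {"tomatoes": 2, "travel": 1, "poetry": 1, "history": 1}

for week in range(1, 7):
    # The "algorithm": recommend whatever you've clicked the most so far.
    recommended = max(clicks, key=clicks.get)
    # The human: mostly clicks whatever is put in front of them.
    if random.random() < 0.9:
        clicked = recommended
    else:
        clicked = random.choice(list(clicks))
    clicks[clicked] += 1
    print(f"week {week}: recommended {recommended}, you clicked {clicked}")

print("after six weeks:", clicks)
# The gap only widens: the more tomatoes you're shown, the more tomatoes you
# click, and the less reason the loop ever has to show you poetry again.
```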
Let’s take an example as simple as browsing in a bookstore. We all have favorite genres, but most of us are still willing to pick up something totally new and different—if it catches our eye. I have always loved going to bookstores and browsing their bargain bins. Not just because I do love a bargain, but because of the joy I get from stumbling across a book I never would have actively searched for based on what I had previously read. And reading things I never would have gone looking for is a big part of how I have grown and learned and, most importantly, evolved my thinking over the years. If algorithms had their way, though, I would read nothing but Jodi Picoult novels and social-science non-fiction. Broaden this example to all of media—TV, magazines, news—and you get a sense of the source of my angina.
This narrowing and deepening of existing thinking concerns me for everyone, but especially for people who have grown up in this Age of the Algorithm. So many people my own age seem to have lost the ability to be curious and to seek knowledge outside their comfort areas, but at least we had decades of broad exposure before the algorithms arrived. What about young people who are forming their first impressions of the world? If they just keep going narrower and deeper into their first perspectives, our current polarization on any given topic is only going to get worse. Try an experiment. Go to YouTube, watch the first video that pops up, and then just let autoplay do its thing. You might start on a video about how to grow tomatoes, then recipes with tomatoes, then other things you can do with tomatoes, and end up a couple of hours later on videos showing you how to build acid-filled pipe bombs. And then you’ll get a visit from the FBI.
While sending you down a narrow and deep pathway is scary enough, remember something else: while algorithms may seem like highly complex gifts from the gods, they are code written by humans. Yes, they can learn, but the rules they use for “learning” are the ones written by their human creators. There are already concerns about bias unintentionally built into today’s algorithms by the highly homogeneous “bro” coding culture. And unintentional bias can be insidious. I remember crayon boxes when I was a kid. There was a pale pink color named “skin”. Imagine being a non-white kid with that crayon box. There was no intention of marginalizing non-white people by putting that label on that color, just as there is most likely no intention of coding bias into algorithms, but the effect is still the same—creating barriers for some and advantages for others.
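Here is a tiny, hypothetical example of how that kind of bias sneaks into code with nobody intending it. The sign-up rule below is invented for illustration, but the pattern is one that really does happen: someone writes a check that “looks reasonable” against the names they and their teammates happen to have, and everyone else hits a wall.

```python
# Hypothetical "name check" for a sign-up form. No one set out to exclude
# anyone; the author just tested it on names like their own.
import re

# Letters, spaces, hyphens, apostrophes -- seemed reasonable to the person
# who wrote it. Plenty of real names don't fit.
NAME_OK = re.compile(r"^[A-Za-z][A-Za-z \-']*$")

def can_register(name):
    return bool(NAME_OK.match(name))

for name in ["Mary O'Brien", "Jean-Luc Picard", "José García", "Nguyễn Thị Minh"]:
    print(f"{name}: {'accepted' if can_register(name) else 'rejected'}")
# The first two sail through; the accented names bounce. A barrier for some,
# an advantage for others, and no ill intent anywhere in the file.
```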
So, we’ve seen that algorithms can take you down a narrow and deep pathway without your conscious knowledge, and they can pave that pathway with rocks of micro-bias. While few of us will let these algorithms take us so far as to earn a visit from the authorities, I fear they are already taking away our creativity and curiosity about new thinking. And creativity and curiosity are the backbone of our innovative culture. Take them away and we are on a slippery slope of decline as a nation. In patent law, an invention is only patentable if it would not be obvious to “a person having ordinary skill in the art.” If we let algorithms encourage us to ignore and discredit anything outside our narrow comfort zones, how are we going to keep making the sort of breakthrough discoveries that keep our country and economy strong?
What is it that I want you to do? Don’t give up your phones or your Alexas. They are wonderful tools! But don’t just accept their output. Be curious. Search out answers outside your comfort zone or area of expertise. Go browse the bargain bin at a bookstore. Chemists can put all the “ingredients” of life into a flask and subject them to the conditions of the early Earth, and amino acids and simple proteins will form. But we can’t make life. There is something more to it than the parts and the rules—more than the algorithm. Similarly, AI and machine learning may be starting to approximate the mechanics of thinking, but they are not human. Far from it. Remember, though, that you ARE human. Stay human. And beware the tyranny of the algorithm.
I agree with your concerns. Artificial Intelligence and the algorithms that power it can be downright scary. I especially appreciated your pointing out that even when these algorithms “work” they often have unintended consequences, such as perpetuating bias in the workplace, discouraging exploration of new things, or, worst of all, slowly carrying people into extremism. The latter is especially frightening because it doesn’t have any apparent political bias: conservative, liberal, or whatever. People are just gradually moved farther and farther from the mainstream, until they’re associating largely with people at one of the edges of society, who inevitably propose terrible things that almost no one would sanction. But in a very large sample of anything there are always outliers . . . .