What do algorithms really want? ASU’s Ed Finn investigates in a new book


March 31, 2017

From finding a movie to watch on Netflix and navigating traffic with Waze or Google Maps to fine-tuning your household budget with apps like Mint or finding a date on Tinder, we find ourselves relying on algorithms more and more every day. But do we really understand them? What exactly are we buying into when we offload our data analysis, decision-making, and memory onto these mysterious systems? And how is the proliferation of algorithms shaping the world around us, from high finance to pop culture?

Enter "What Algorithms Want: Imagination in the Age of Computing," a new book by Ed Finn, director of ASU’s Center for Science and the Imagination. The book delves deeply into the history of algorithms, investigating the foundations of computing in early mathematics and rarefied philosophical discourse. Finn, who is also an assistant professor in the School of Arts, Media and Engineering and Department of English, argues that algorithms are mediators between our idealized dreams about perfect knowledge of the universe (and ourselves), on one hand, and the messy realities of people and organizations in the real world, on the other.

Finn’s journey into the elusive hearts and minds of algorithms creates unexpected bedfellows: economist and philosopher Adam Smith and the Facebook game "FarmVille," or Apple’s Siri and Denis Diderot, one of the creators of the world’s first encyclopedia. We caught up with him to discuss the history of algorithms, common misconceptions about how they work, the kind of future we’re building with and through them, and how we can get to know them better. (This conversation has been lightly edited for length and clarity.) 

Question: How did you get interested in writing about algorithms, as someone with a background in literature?

Answer: I’ve always been interested in how computers are changing the way we read and write. Something as simple as spell check has changed the way we spell by settling on a single spelling of words like “judgment” and, infamously, auto-correcting “cooperation” to “Cupertino” in many public reports and papers. Sometimes spell check feels like it’s encouraged generations of students to stop trying to spell things at all! Today when we think about what it means to be human, which is the fundamental question of the humanities, we have to consider our complicated relationships with algorithms.

Q: The jumping-off point for the book is that people tend to think of code and algorithms as purely objective and rational. Where does that idea come from? What are its historical roots?

A: Scientists and philosophers have talked about a universal mathematical language for the universe for centuries. One example I discuss in the book is Gottfried Wilhelm von Leibniz, who imagined a mathesis universalis: a language that would perfectly describe the scientific laws of the universe. Computers emerge out of mathematics, and so we want to believe that they carry the perfection of that ideal world with them. But the more we bring computation into the world, the messier things get, because reality is not an ideal space; we can only pretend to understand what’s going on most of the time.

Portrait of Ed Finn

Q: Are algorithms tools to expand and intensify the power of the people and groups who create them? Or do they exercise their own unique force in the world, beyond the interests of their creators?

A: It depends on what algorithm you’re talking about. Algorithms are ways for us to extend and magnify our thinking, our ideas, our intelligence, and that can be transformative in good and in bad ways. We’ve all seen instances where algorithms enable groups of people to act in ways that weren’t possible before, like Facebook and Twitter’s roles in the Arab Spring. On the other hand, the pyramid of wealth erupting out of Silicon Valley only seems to be getting pointier, and we need to ask ourselves how we can use computation to truly make the world a better place, and not just a more efficient profit center.

Q: Are we in danger of being swamped by algorithms? After all, they’ve already wiped out video stores, undermined print newspapers and magazines, and they’re moving in on taxi companies and loan officers. 

A: I do think the sea change of computation is just beginning, from automation in the workplace to algorithms that are fundamentally changing dating, finance, music production, and many other spheres of activity. These are arenas in which the pace of social and technological change is so fast that it sometimes feels like we don’t even have the words to describe what is happening. That said, humans are incredibly adaptable, so I think the question is not whether algorithms are going to take over so much as how we are going to change as we do more of our thinking and our work in collaboration with algorithms.

Q: One of your major conclusions in the book is that we need to understand algorithms better. But we’re using algorithms more and more all of the time, for an ever-expanding range of tasks. So how can we have this gap in literacy that you talk about? What are we missing about algorithms, as expert users of them?

A: One of the great seductions of algorithms in culture is their capacity to simplify our choices. If you and I both rate a movie four stars out of five on Netflix, we might mean completely different things, but those distinctions can quickly get lost in the numbers. One basic form of literacy is to reflect on all of the choices that are not on the elegant menus and interfaces presented to us. In the book I talk about the word abstraction, a very important idea in computer science. Every abstraction also involves leaving out some context – some of the messiness of life – and so we need to become more astute about the abstractions we buy into.
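Finn’s point about abstraction discarding context can be made concrete with a toy sketch (the data and field names here are my own invention, not an example from the book): two viewers with very different reactions collapse into identical data points once a recommender keeps only the star rating.

```python
# Toy sketch of abstraction: a recommender keeps only the rating
# and throws away the messy context behind it.

reviews = [
    {"user": "A", "stars": 4, "note": "Loved the acting, hated the ending"},
    {"user": "B", "stars": 4, "note": "Fell asleep, but the soundtrack was nice"},
]

# The abstraction step: strip everything but the number.
ratings = [r["stars"] for r in reviews]

print(ratings)  # two very different opinions become the same data point: [4, 4]
```

Once the notes are gone, no downstream computation can recover the difference between the two opinions; that loss is exactly the trade-off the abstraction buys its simplicity with.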

Q: What can people do if they want to increase their algorithmic literacy?

A: There are a few easy places to start. One is to get a primer on how symbolic logic works. By and large, code is built on a fairly simple set of logical operators that, when you come right down to it, are all about switching gates between on and off positions. The fact that from the bottom up, algorithms are defined by true or false – and not “maybe” – is one basic thing to understand. Another thing I’ve found really helpful is to learn something about the hardware, not just at the level of an individual computer, but the networks, huge server farms, and physical infrastructure that make the internet possible. People think the internet is everywhere, like some kind of spiritual presence, but it is very real physical infrastructure, and there are places like these centralized data centers that you can go to see it.
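The “true or false, and not maybe” idea can be sketched in a few lines of code (a minimal illustration of symbolic logic, not an example from the book): every familiar logical operator can be built from a single two-input gate, NAND, and every input combination yields a definite True or False.

```python
# Toy sketch: Boolean logic built up from a single gate (NAND).
# Each value is strictly True or False -- there is no "maybe".

def nand(a: bool, b: bool) -> bool:
    """The universal gate: False only when both inputs are True."""
    return not (a and b)

def not_(a: bool) -> bool:
    """NOT built from NAND."""
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    """AND built from NAND."""
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    """OR built from NAND via De Morgan's law."""
    return nand(not_(a), not_(b))

# Exhaustively check every input combination: the answer is always
# a definite True or False.
for a in (False, True):
    for b in (False, True):
        print(a, b, "->", and_(a, b), or_(a, b))
```

That NAND alone suffices to express all the others is one reason logic gates map so cleanly onto physical switching hardware.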

Q: Some of the most memorable analyses in the book focus on popular culture, from "House of Cards" and the movie "Her" to poetry and science fiction. How are algorithms changing the actual content and aesthetics of popular culture? It’s obvious that Netflix and Spotify revolutionize the delivery of content, but are they also changing what we actually see, hear, and read?

A: One idea I’m noodling on is what happens when every art form has its auto-tune. If you think about the art of digital photography, from high-end professional cameras to the lenses we all carry around in our pockets now, you see not only an explosion in the quantity of photographs, but also, I suspect, a rising level of quality. Most new smartphones automatically correct and improve images as soon as you take them, and that raises interesting questions about what it means to be a photographer or an artist.

Q: What algorithm do you rely on most in your own life?

A: Google. It’s not even a contest. For me, as I suspect for many other people, almost every knowledge-based question I have starts with a Google search or a dive into the terrifying array of archives Google has of my life: notes, emails, photos. Google seems committed to empowering and extending the minds of its users in very concrete ways, and I think it behooves everyone involved in so-called “knowledge work” to think about how much our thinking and even the horizons of our possible thoughts are shaped by platforms like Google.

Joey Eschrich

program manager, Center for Science and the Imagination

480-442-2682