Surveillance Capitalism In Today's Digital Age

November 18, 2019 00:00 UTC
- Updated September 25, 2020 16:48 UTC

A look at how the commodification of data can change human behavior, widen the inequality gap, and reshape the future of capitalism with Shoshana Zuboff, Author of The Age of Surveillance Capitalism.

Thank you, Brian. And welcome, Shoshana.

Thank you, Adam. And hello, everybody.

The Guardian called your book a chilling exposé of the business model that underpins the digital world. You brought a copy of your book on stage with you, the paperback. Thank you for doing that. What is, um, Shoshana, what is surveillance capitalism?

Surveillance capitalism is a new market form, born digital, that in many ways diverges from everything that we've learned about capitalism for the last few centuries. But in this critical way, it emulates the old pattern: it takes something that lives outside the market dynamic and brings it into the market dynamic for production and sales. That's called making a commodity. The thing is, we're all familiar with the idea that industrial capitalism claimed nature and turned it into land, real estate, things that could be sold and purchased. In the case of surveillance capitalism, it takes a dark and startling twist, because what it claims for the market dynamic now is private human experience. Private human experience is re-envisioned as a free source of raw materials for new processes of production and sales, specifically translation into behavioral data, behavioral data that fills new data supply chains on their way to new computational factories that we call AI, machine learning. In those factories, products are made. They're computational products. They predict future human behavior. These prediction products are sold not back to us, from whence they came, but to a new group of business customers with a profound interest in understanding what we will do soon and later. These new markets trade exclusively in human futures. Of course, all of this began with the click-through rate as the first globally successful prediction product, and the online targeted advertising market is the first globally successful human futures market. Now it's spread beyond tech, beyond the sector, into the normal economy.

But you've been researching this before
GDPR, before Cambridge Analytica. You've made the case, and you make it very specifically about Google being the first and the best, that they took our data and sold it to others without our permission.

All right, so let's fine-tune that a little bit. May I?

Please. Of course. Of course.

They took our data obviously without our permission, because most of the taking happens without our knowledge. When they write about these... I'm getting over a pneumonia. I'm not contagious or sick anymore, but I've got a little cough.

There's water right here if you need it.

Yeah, thank you so much. So when they write about their mechanisms and methods, the patent scientists, the data scientists within the company, the researchers, they celebrate this one key factor: that they learned how to take our personal information, various aspects of our personal experience, while bypassing our awareness. This is celebrated. This is why I call it surveillance capitalism. This is what puts the surveillance in surveillance capitalism. All right, so they're taking our personal information, our experience, without our knowing it, translating it into data without our knowing it. They're not selling the data. They're putting it through the factory. They're analyzing it, they're computing it. This was Google's original secret sauce, where they produced the quality score that allowed them to predict click-through rates from the so-called data exhaust that no one was using. These were the leftover data in their servers that were more than what was needed for product and service improvements. So it's now these computational products coming out of the factories, Adam, that's what's getting sold. And that's why in hundreds of videos, believe me, I've watched them all,
Eric Schmidt can sit on a stage like this and talk to you folks and say, "We're not selling your data," and no one comes in to arrest him for possibly not telling the truth, because they're not actually selling our data. They're selling the proprietary meaning that they create from the data about our future behavior.

So let me ask you in all earnestness where the harm is, and I'll ask sort of facetiously, by way of asking the question. I'll say, for example, I love Google Maps. I just love it, because it's so helpful to me. You understand why I'm making that statement?

Of course I do. Of course. There are a couple of key things to say about this. Number one, nothing that I say or write is an argument against the digital. We are citizens of a democratic society in the digital century. We deserve the services that have been created, that have made our lives better in many, many ways. Our societies deserve big data. We should be using big data for new remedies and treatments for all kinds of cancers, the problems we're desperate to solve. We should be using big data to solve the climate catastrophe and to eliminate the plastic particles that are now even in the Arctic snow. That's not what we're doing. The most powerful companies in this sector, with the material infrastructure and the near-corner on the market of data scientists, the ones who really know how to do big data, are entirely trained on one thing, and that's clickstream science. They are entirely trained on these computational products that move into their markets for their revenues and their profits. They are not using big data to solve our needs or our society's needs.

You've said that what they're doing is antithetical to democracy. How so? I understand that from your perspective they're craven, they're just trying to make money.

All right. So, Adam, a minute ago you said, "I like those maps," and we started to talk about that.
You should be able to have the map without the overhang of the negative externalities that come with this market form. So let me talk quite briefly about what the key externalities are that I'm referring to. Number one, we're talking about an economic logic, not about bad people and just rude, bad behavior. We're talking about an economic logic that compels certain forms of corporate behavior. In order to have great predictions, remember, that's what they're selling, they're selling certainty to business customers, they figured out three critical imperatives. Number one: to feed AI to get great predictions, you need to feed it a lot of data. Economies of scale. Intensification of competition: it turns out scale's not enough. We also need scope, varieties of data. Ultimately, in this competition, they came to understand that the most predictive data comes from intervening in our behavior, learning how to coax, tune, and herd our behavior in the direction of their guaranteed outcomes. Good for revenue, good for profit, good for them, good for their business customers. This is how the game works. This is called economies of action. These are designed to utilize the digital architecture, to work their will through the medium of digital instrumentation, to intervene in our behavior in ways that we are not aware of, to push our behavior in the directions that serve the bottom line. Economies of action are a new zone of experimentation. We saw it being developed in Facebook's massive-scale contagion experiments, where they were successful in getting more people to go vote in the 2010 midterm elections. They were also successful in creating an emotional contagion, making people happier and sadder. These things were published in 2012 and in 2014, and when they published these findings, they celebrated:
we know we can manipulate cues and social comparison dynamics online to get people to change their behavior and feeling in the real world, and we know we can do it without them ever detecting our presence; they never know we're doing it. So that's number one. These experiments continued, Adam, under Pokémon Go. Anybody in this room ever play Pokémon Go with your friends and family? Okay. Pokémon Go incubated in Google over many years, brought to you by the same gentleman who brought you Street View and Google Earth, who has spent most of his career filling Google's supply chains with surplus behavioral data for these new computations. In the case of Pokémon Go, there were businesses, from McDonald's and Starbucks to Joe's Pizza, paying Pokémon Go for guaranteed footfall. And what Pokémon Go, this group Niantic Labs that came out of Google, learned how to do was use the rewards and punishments of gamification to herd our behavior through the city to the places that were going to pay them fees for our footfall, footfall in the real world being the absolute equivalent of click-through in the online world. Now, the next zone of experimentation, Adam, is set to be the city. This is the war that's going on in Toronto right now, where Sidewalk Labs, slash Google, slash Alphabet, has its sights set on the Toronto waterfront. And only two weeks ago the citizens of Toronto, a group of them, in a highly democratic process that they went through, good slow democracy, beat back the first really extreme aims of this proposal, and the elected officials stood up to constrain the proposals.

Where are we with democracy?

Let me answer you very succinctly. Number one, surveillance capitalism is on a collision course with democracy, eroding it from below and completely restructuring it from above. From below, it takes aim at human agency, human autonomy, at what we consider to be individual sovereignty, the things that allow us to have decision rights, to be self-determining.
This now-global means of behavior modification takes aim at human autonomy, without which a democratic society is impossible to imagine. From above: we are now entering the third decade of the digital century, Adam. This was supposed to be the apotheosis of democratization, this digital century. Instead, what we see is a new kind of society marked by extreme forms of social inequality, a new social inequality. What is this social inequality? Concentrations of knowledge in these companies, best expressed in the growing abyss between what we know and what can be known about us. This is intolerable for a democratic society. But that's not all. These extreme concentrations of knowledge produce extreme concentrations of power, so that what we can do is now no match for what can be done to us using these extreme concentrations of knowledge through the medium of the digital.

So I couldn't help thinking, the picture that you've painted of these companies is like, in the industrial age, the factories that dumped polluted sludge into the river because nobody was stopping them. Is that your analysis?

Yes.

And then the regulations came up to stop them from doing it. You're already a philosopher. Now, for about a minute more, I'd like you to be a philosopher king and say exactly what democracy should do about this.

First of all, this thing has had 20 years to root and flourish largely unimpeded by law. That's to say, we're at the beginning of this, not at the end. So we have to understand that the road forward now moves through politics. Only democracy, only law, only a new regulatory vision will rein in surveillance capitalism. By the way, in the massive Pew Research study published just this past week, 81% of the U.S. sample says that the risks of corporate surveillance outweigh the benefits. 81%. If you want to see what's coming in the future, pay attention to that number.
We need laws that interrupt supply and demand, that disassemble the incentives for the surveillance dividend. That means we interrupt supply: taking our experience without our knowledge, and therefore without our consent or the right to combat it, must simply be illegal. Let's call it what it would be called in any first or second or third grade classroom: it would be called stealing. Now let's go to the other end of that, the supply and demand. Let's talk about demand. We outlaw markets that trade in human futures, because we know that they have predictably destructive consequences for human autonomy and for democracy. We outlaw them the same way we outlaw markets that trade in human organs or babies or slaves, because we know that they have destructive consequences and are incompatible with our aspirations as a democratic society. That's where we are today.

Shoshana, I cannot imagine you are very popular in Mountain View or Menlo Park, California, but I want to thank you for coming here and giving us your message today.