Transcript: GC Data Conference 2024: Fireside Chat with Shoshana Zuboff and Jim Balsillie
[00:00:01 The CSPS logo appears onscreen.]
[00:00:04 The title “GC Data Conference 2024, Fireside chat with Shoshana Zuboff and Jim Balsillie” appears on screen with the conference graphics.]
[00:00:11 The screen fades to Erika-Kirsten Easton and Phil Gratton.]
Erika-Kirsten Easton: Up next, we are going to heat things up with a little fireside chat with Shoshana Zuboff and Jim Balsillie on the evolving global landscape of data-driven business models and the ethical and civic considerations.
[00:00:25 Taki Sarantakis is shown sitting with Shoshana Zuboff and Jim Balsillie.]
Taki Sarantakis: I am not a pinch me kind of guy, but today, pinch me. I am up here with two people that I admire enormously and who I learn from all the time. One in book form and one in person. So, the first is Professor Shoshana Zuboff, who has written the book of the digital age. And we're going to talk a little bit about the book going forward throughout the next hour or so. To me, this book is analogous to Adam Smith's The Wealth of Nations. In 1776, Adam Smith wrote a book that basically was describing what was happening in the world. We were changing from small-scale merchants who controlled the production of their goods, and we were moving to the beginning, the early beginning of the industrial era, through specialization of labour, through trade, etc., and to be able to kind of see stuff like that before it actually happens, within 20 years, 30 years, 100 years, railways everywhere, factories everywhere, small-scale merchants almost gone relative to people doing tasks over and over and over again. He saw ahead and Professor Zuboff has seen ahead. She is, again, without a doubt in my mind at least, she is the Adam Smith of our era. And to her right is one of her former students. For real. When were you at Harvard Business School, Jim?
Jim Balsillie: I'd like to make a proposal to Professor Zuboff that I won't say the year if she doesn't say the year.
(Laughter)
Shoshana Zuboff: I was hired when I was 16. I was a prodigy.
Taki Sarantakis: So, Jim, as you all know, is the co-founder of what was back then called Research In Motion, but really it was the BlackBerry. And I can't tell you the extent to which a Canadian company from Kitchener-Waterloo for a moment in time took over the world. It was everywhere. It changed the way we worked, it changed the way we lived, and it was a Canadian invention. And the spinoffs from that continue to this day. There continues to be a kind of Research In Motion mafia that continues to produce companies and products across the world. Jim has had a second act as a philanthropist, most notably through the Council of Canadian Innovators, which promotes small Canadian enterprises to become big Canadian enterprises. But today he is here because he is also one of the world's brightest lights on digital, on data and what is happening to us. So, we're going to kick this off by having an unaged, or a Jim of indeterminate age, give us a presentation, and then we'll draw Professor Zuboff into the presentation. So, with that, Jim, over to you, my friend.
Jim Balsillie: Thanks so much, Taki. It's a pleasure to be with you here today and especially delightful to share the stage with my teacher, someone who has taught and mentored the brightest thinkers of our time on the promise and perils of the data-driven economy. So, welcome. And then, to do it here with the Canada School of Public Service, our pillar and our hope for advanced capacity of Canada's civil service.
The topic of our fireside chat is the defining policy issue of our time and especially relevant to Canadians with the proposed privacy and AI legislation before us, specifically C-27 and the Artificial Intelligence and Data Act. I will take a few minutes to share some facts and thoughts to frame our discussion today. The rapid evolution of technology over the past three decades has ushered in a new era where wealth and power stem from owning valuable intellectual property and controlling valuable data and artificial intelligence engines. Unlike tangible goods, IP, data and AI exhibit unique characteristics, demanding distinct considerations for both innovators and policymakers at the national and international levels. This digital transformation is also creating a new kind of social and political space, reshaping both public and private spheres. IP, data and AI are now the world's most valuable business and national security assets. In 1976, 17% of the value of the S&P 500 was intangibles. Today, intangibles comprise over 91% of the S&P 500's $40 trillion total value. IP, data and AI, with their strong public good characteristics, play pivotal roles in the 21st century economy. They influence not just economic aspects, but also social security, health and geopolitical realms. The data-driven economy naturally tends towards monopolies due to features like economies of scale and scope, network externalities and information asymmetries, posing challenges to competitive markets, business dynamism and sovereignty for smaller nations.
Understanding the structural economics of the data economy is crucial for shared prosperity, protecting human rights, public good considerations, security and democracy. Patents are measurable indicators of innovative output, directly impacting wealth and power at the firm and national levels. With the emergence of the knowledge-based economy, advanced innovation countries have strategically and systematically focused on owning and protecting these assets and then embedding them into value-added products and services. The patent system was originally designed as an incentive-reward framework for genuine inventions, but it has turned into the monopolization of knowledge and information, requiring new strategies for countries that own few of them, like Canada. As part of the digital transformation, the cross-cutting considerations around IP, data and AI involve values, wealth generation and distribution, competitive markets, rights such as privacy, health, democratic processes and national security.
I believe data governance is the most important public policy issue of our time. Whoever controls the data and requisite compute power controls who and what interacts with it. Furthermore, any data collected can be reprocessed and analyzed in new ways in the future that are unanticipated at the time of collection. This has major implications for personal autonomy, security, democracy and the global economy. The effects of data and AI are increasingly widespread across larger demographics, particularly among vulnerable populations, including children. Abuse of data and AI can compromise not just information ecosystems, but democratic processes. There's a global race underway by large firms and nation states to own critical AI IP by filing AI patents, as shown on the chart on the left. A recent article noted that the world's smartest companies and investors are pouring billions into all-in bets on AI. ChatGPT achieved a million users in just five days, outpacing other platforms by months or years. This underscores the high consumer willingness to embrace new products or platforms, despite the potential risks.
The litany of adverse effects of surveillance capitalism's business model is well-documented and worsening. This slide summarizes some of the most insidious harms, including the hollowing out of the private sphere, a mental health crisis for youth, the undermining of fair markets and economic dynamism, the erosion of democracy, and growing cybersecurity harms. Yet, in the face of such obvious harms, linked vested interests resist the need for protections. These distinct vested interests are private sector monopolists, those who manipulate the arena for democracy, and state security agencies. And for my final slide, the root cause of surveillance capitalism's downstream harms lies in the undermining of human rights, which must be restored to fit our information society and our contemporary realities. This includes rights to privacy, reality, knowledge, agency, society and political democratic integrity. This is indeed a fight for our future at the frontier of new power and an existential fight for democracy. Shoshana, thank you for your tireless intellectual leadership, which has inspired so many and shaped so much of my thinking and advocacy.
Taki Sarantakis: Welcome. Thank you so much, Jim.
Shoshana Zuboff: Thank you, Jim.
Taki Sarantakis: So, Professor Zuboff, we are at a data conference, but your former student is talking about rights, politics, monopolies. Why? Why is he talking about rights today?
Shoshana Zuboff: All right. So, we're going to start off with a really hard question. To answer the question, I'm going to tell you a little story, and it's taken me a while to figure out this story. I'm going to try to make it short. Back around the time when Jim was a student of mine, just before that, when the Harvard Business School was courting me for hire, they sent me around to meet with a bunch of old professors, all men, no ladies' rooms at all. So, I was having a very interesting conversation with a professor whom I was meeting for the first time, and we were talking about the prospects of capitalism. And he told me that he was very worried about the prospects of capitalism because he and many other economists considered that there was nothing left to commodify. This is important. So, I will confess to you, this was the year 1980. Now, let us understand that even by 1986, only 1% of the world's information was stored digitally. 1986, which to me, for better or worse, feels like 5 minutes ago. So, back then in 1980, it wasn't crazy to say: what is there left to commodify?
Everything that could be mass produced, mass marketed, has been mass produced, mass marketed. Where do we go from here? Marx called that primitive accumulation. Where do you find stuff that nobody knows about that you can just take for free and then turn it into something that can be mass produced and mass marketed? So, this professor was saying to me, hey, we've like exhausted all of that. Where's capitalism going to go? Okay. At the time I thought this can't be right. But I thought about it for a very long time, as you can see, like 45 years. And he had a point. So, now let's fast forward from 1980 to, well all right, let's just say the year 2000, 2000-2001, right in that zone, something called the dot-com bust is happening in Silicon Valley. All these brilliant little startups are tanking, going bankrupt, investors are withdrawing their money. The smartest guys in the room were considered to be the Google founders, and they are facing exactly the same tragedy and fears as everyone else in Silicon Valley, despite how smart they are, despite having what everyone thought was the best search engine so far.
Okay, what was the problem? They couldn't figure out how to monetize data. And that's what they said to themselves. Like, we don't know how to monetize it. And they had various plans on the table. Subscriptions, selling services to big corporations, various things, all of which could have worked and given them a nice, tidy business. But something else happened. They were very motivated, by the way, because Larry Page, one of the founders, as you know, had an obsession with Tesla, the inventor, who died penniless despite having been brilliant and a genius, and he had a mortal fear of dying like Tesla. And Sergey Brin said publicly, because I was able to quote it in my book, I don't want to turn out to be just another schmuck in Silicon Valley, like who had a company but couldn't make money. So, they were very motivated. Two things happened, and they happened in quick succession, and this changed everything. Number one, they made a discovery that no one had anticipated, and that discovery was that every time anyone interacts with the Internet or anything Internet-enabled in any way, they leave behind behavioural traces, a little trail. People have called it data exhaust, digital exhaust, digital breadcrumbs. A trail of what they've been doing.
And what they discovered was that not only did people leave behind these signals of their activity, their behaviour, but that those signals were highly predictive of future behaviour. If you put them together and computed them in a special way, you could begin to predict not only what they're doing now, but what they're doing next and what they're likely to be doing later. This was a discovery. Okay. Matching that discovery, they came up with a new idea, an audacious idea that would solve the problem of capitalism in our time. Not that they knew it, but they were soon to learn it. That idea was that now that we have these traces and nobody knows they're leaving them, by the way, did I mention that? And nobody knows that we can see them and collect them and compute them. But now that we have them, we can actually make predictions of behaviour and sell them just like any other…
Taki Sarantakis: Commodity.
Shoshana Zuboff: Commodity. We can package them up like barrels of oil and tons of wheat, and we can sell them like commodities. This solved the problem that my soon-to-be professor friend at Harvard Business School was worried about in 1980. But guess what? It created a lot of new problems. And I'm going to just say that and then stop talking for a moment until you ask me another question. The point is: what are we commodifying? This is the part that has taken me a long time to really integrate. It's one thing to be commodifying wheat and oil. When we are commodifying bets on the future of human behaviour, it means that every single aspect of our business has to be oriented to making that behaviour maximally predictable. Another thing that Larry Page said way back at the beginning, and he said it to somebody later, but he was reflecting on their mental zeitgeist when they started Google. He said the societal goal was always our main goal. Because for that commodity business to be really lucrative, like trillions of dollars of market capitalization lucrative, which is what they've achieved, it means that we need to make a society in which the forces that are directed toward collectives and individuals work to make behaviour more predictable. That is a different kind of society than an open, free, democratic society. And this is the key point now: what it means to have a new era of capitalism, surveillance capitalism, which, as you know, has spread now far beyond the tech giants. In fact, maybe some of you saw the Mozilla research that was published very recently on the car industry? Anybody see that?
Taki Sarantakis: Yeah.
Shoshana Zuboff: The 25 leading car brands globally now all make their money on that right hand, is it the right hand? Yeah, the right-hand side of Jim's picture where all the knowledge intangibles are. Because now, when you buy a truck, or a car or whatever, that thing is a loss leader for data extraction. And it turns out that the 25 biggest brands are extracting your sex life, and what was the other really good one? Your sex life, oh, and your genetic material. And it has to do with sensors and all kinds of things that are built into the cars. So, they're getting everything, your eye gaze, everything. But they're also now combining that with telematics, with everything you do in relationship to their websites, everything you do when you walk into their showrooms, everything. And of course, everything available about you already in the world. And this is true of every industry. It's true of education, it's true of health care, it's true of real estate, it's true of finance. It's true of every industry, every product and every service, and certainly, every product called smart and every service called personalized has been repurposed as a loss leader for data extraction.
Turns out that when human behaviour and predictions of human behaviour, when the human is what is commodified, when that is what becomes the foundation for capitalism, some things happen that are fundamentally different than anything that has ever happened before. Because what that means is not only is commodification driving the concentration and the monopolization that Jim showed you, the concentration of economic power, it means that those very same operations are driving concentrations of social power because social power is what is required to be able to intervene in the flow of free, open human life, and shift it and manipulate it, herd it, tune it, condition it, reinforce it in particular ways to make it more predictable for the sake of our commercial outcomes.
And the third thing that happens is once we have economic power and then we have social power which destabilizes societies, we take advantage of that destabilization, we take advantage of these new pockets of weakness in order to accrue more governance functions within our private domain so that democracy gets weaker every moment that surveillance capitalism becomes more deeply and densely institutionalized. That is the world that we are in today and it all rides on the back of the question of…
Taki Sarantakis: Rights and data.
Shoshana Zuboff: Rights and data. Because if it weren't for the fact that we have now commodified the human and that nothing escapes this net, we would not have to be talking about rights.
Taki Sarantakis: So, that was a pretty cool answer.
(Laughter)
And I want to read a quote from your book that really hit this home for me. I've been reading about data and AI and things for many, many years, but I didn't really absorb the consequences of it until I read your book. And I want to read this quote, and I think I'm going to read it twice because it's so powerful. "Power was once identified with the ownership of the means of production, but it is now identified with ownership of the means of behavioural modification." And that, to me, is when I really got it. Which is to say, if you have enough data, if you know what somebody listens to, what somebody watches, who their friends are, where they're going, when they speed, when they don't speed, when they jaywalk, when they don't jaywalk, you know the things about them. But more importantly, you know what they're about to do, because you then become a molecular data point, a data point that people then use to modify your behaviour. You were going to do X, but because they know what you have just done, they know what you're about to do.
So, that's fascinating. So, Jim, let's bring you into this. In your first life, you were a businessman. You were making billions of dollars. You used…
Jim Balsillie: Still making money, just to let you know, so…
(Laughter)
Taki Sarantakis: Yeah, but you've started talking to us, you started waving the red flags long before anybody else in Canada did, to my knowledge. Why is it that you were raising alarms? I mean, you were a tech guy. You were like the ultimate tech guy in Canada.
Jim Balsillie: I think the part that's particular domestically is that if you're going to do global business in the knowledge economy, you're doing public policy all day, every day. And there's a relationship between those two. And I was on the US Business Council, I was deeply involved in these things globally, and you're always working the policy framework because the intangibles are a function of principally negative rights that are established by the state. So, these are the frameworks, and they change hundreds of times a day, and we had a sort of acute form of neoliberalism domestically. The thing I say about the United States is they talk Jeffersonian but rule Hamiltonian. In Canada, we talk Hamiltonian, but rule Jeffersonian. And I'd like to see a little more Hamiltonian in there, in that with the extreme neoliberalism we did super hands-off. And I saw this as a hands-on game because negative rights are granted by the government. They induce friction, which is the opposite of tangible production, where you eliminate friction. And so, my interest was principally through an economic lens that we were trying to grapple with productivity realms. And then this was something we had to pay attention to. And I was resisting the prevailing domestic orthodoxy. So, that's how I got interested in the public policy aspects of it.
And then, with the data economy that was emerging, many of us got surprised a little later - you were prescient, of course, Shoshana - by what it was doing in the electoral realm. So, I knew the actors very well in the economic realm. And that's where with RIM, Mike and I parted, where I just thought the cellphone business was dead because it was two economic models, one a closed e-commerce system and the other a data-driven surveillance system. And therefore, we had to shift to the services. And so, in all of this, the data model was just coming fast and furious. And then of course, you start to see in Brexit and those kinds of things, it was crossing over into other realms. You start to see the child mental health. And so, my pulling into it, as I knew it was a public policy realm for prosperity, but what shocked me was how it became much more cross-cutting into non-economic realms. And I think this is for all the marbles, as Shoshana said, that when you control the agency of the individual, you really are controlling so many aspects of rights and proper functioning of society. And it's very profitable to erode those things. So, that's what drew me into it, it was a natural evolution of a commercial life. And while I stayed with the economic part, on the non-economic parts, I think most of us were shocked by Brexit and by the fact that these manipulation systems, with Cambridge Analytica and so on, were at work. I didn't see them coming with that. That's where it shocked me.
And I think the first articles came out in October, and then, for some reason, there was a delayed reaction of about six months, and in March, with all the articles, everyone was freaking out. But I remember when that happened, I thought, this is a new realm. This isn't Kansas anymore. And so, you just heighten the engagement. We served on the International Grand Committee together, but I thought domestically we have to be far more front-footed and not passive in these realms, not accepting that. Because in the tangible production economy, there's comparative advantage. Generally, a rising tide raises all boats, whereas this is a win-loss rentier game economically between states and within states, but it's also trading off public good for private gain. So, it's a very consequential, very predatory, very technical game, and there's a lot of gaslighting that goes on to confuse people. So, I think because I was a commercial protagonist, I understood the nature of the game and…
Taki Sarantakis: You were in the arena, you were fighting…
Jim Balsillie: Yeah, I know the tricks. I mean, I see a press release, in the first line you can see the trick that's being played, that you're trying to steer it commercially.
Taki Sarantakis: So again, we're at a data conference. The first, kind of strange thing to be talking about rights. The second strange thing I want to talk about, Professor Zuboff, I want you to talk to us about September 11th. Or maybe even better, September 10th, 11th and 12th.
Shoshana Zuboff: Okay, did you have like a particular, because I know we've talked about it, so…
Taki Sarantakis: Yeah. What was happening in the data, digital surveillance world kind of metaphorically on September 10th.
Shoshana Zuboff: Okay. So, I'm going to tell that little piece of history.
Taki Sarantakis: Yeah.
Shoshana Zuboff: Okay. Another story? Okay, another story. So, my headline for this story is Surveillance Capitalism is What Happened When Democracy Stood Down. And by that, I mean our democratically elected political leaders sold us out to Silicon Valley for the sake of total information awareness. That was the headline. And then what's the thing they, what do they call it right under the headline? The deck?
Audience member: The super!
Shoshana Zuboff: The super? Okay, so anyway, you got the headline and the super sub-headline. Okay, the super and the sub. Now, let's tell the story. So, 1997, in my world people are still writing, hopefully about the Internet and the democratization of knowledge. How many in this room are old enough to remember that discourse? Okay.
Taki Sarantakis: It was more than that, right? It was hands off the Internet, government stay away!
Shoshana Zuboff: That's right. Remember, it was Stewart Brand, this was the new utopia, the democratization of knowledge. And good people worked on this and good people believed in this, and I was myself very animated and hopeful. 1997, that was also the year Clinton was the President, Gore was the Vice President. They came out of the White House one day with a big press conference. All the people we're familiar with now in Silicon Valley were like sitting in the audience when you look at the video, but they all looked sort of chubby cheeked and pimply. But they were all there and this was to announce the new white paper on e-commerce that was going to set the rules for the Internet. And since Silicon Valley is in the United States of America and Silicon Valley represented the Internet, essentially it was up to US political leaders to set the rules for the Internet, which was soon to be a global force.
And what they said in their very first line of this very interesting document was the private sector must lead. And this is what I call the democratic self-evisceration, where government is slow, government is stupid, democracy is too slow, we don't know anything, we can't possibly understand all this stuff, we don't know what computers are. And so, only the people in these Internet companies understand this well enough to run this show, so it must be self-regulation. And these boys, pardon my English, are going to have to lead because the rest of us, and they specifically said in this document government must get out of the way, we're not going to pass any laws that obstruct anything they want to do. And they said, by the way, if there are any laws already on the books that might obstruct something they want to do, we are going to retract those laws. So, this was all the gift of Milton Friedman. Milton Friedman, radical libertarian economics. Milton Friedman, the man who doesn't believe that a government should invest in public education. That Milton Friedman. This is who they were channeling, and they channeled him very well.
But it didn't take long for our Federal Trade Commission, since we had nobody in the United States to worry about things like privacy, the Federal Trade Commission is the closest we have to people who are interested in consumers and protecting consumers. So, the Federal Trade Commission is tracking all of this. That happened in 1997. By the year 2000, the Federal Trade Commission, the Commissioner and many members of the commission stood together and announced self-regulation on the Internet will not work. Look at all these cookies, look at all these web bugs. These guys are doing nothing but secretly extracting data from everyone everywhere and every webpage. We've got to do something about this. We've got to stop self-regulation. We need comprehensive privacy legislation. They wrote down the legislation, they walked it over to Capitol Hill. And if you were alive in Washington, D.C. in the year 2010, anywhere on Capitol Hill, all the conversations were about not whether or not to have privacy legislation, but what kind and how much.
Taki Sarantakis: Sorry, you said 2010. Did you mean 2000?
Shoshana Zuboff: Oh… if you were alive in the year 2010, yes. Sorry.
Taki Sarantakis: Oh 2010? Okay, sorry. My apologies.
Shoshana Zuboff: I mean, no, 2000. I'm sorry, 2001.
Taki Sarantakis: Yes, that's what I thought.
Shoshana Zuboff: Sorry, sorry, sorry. I get lost in my own story. 2000, if you were on Capitol Hill, everything is a debate about privacy. It didn't take long for that debate to end. In fact, after the Twin Towers were hit and the Pentagon was hit on September 11, 2001, people who were on site, people who were eyewitnesses to these discussions, these spaces, said that within 24 hours, the entire conversation flipped 180 degrees. No one was talking about privacy. Everyone was talking about…
Taki Sarantakis: Security.
Shoshana Zuboff: Total information awareness. Total information awareness was a phrase that had been floated by a particularly zealous leader of the intelligence community, and the idea was that, no, no, no, we're not going to do total information awareness. But what the US government actually did was institutionalize total information awareness.
Taki Sarantakis: Wait, the home of the free, the land of the brave? They did that?
Shoshana Zuboff: The home of the formerly free…
(Laughter)
And just temporarily, the brave people are sort of quivering in the closet. But it's up to us to create the spaces for all the brave people to come together and demand what is seemingly impossible. But we'll get to that later, right? Okay. So, right now to wrap up this story, we know that this story is true not just because scholars have written about it, and legal scholars have written about it and eyewitnesses have written about it, but because the CIA has told us that it's true. And you can look this up on YouTube, where in 2013, a man named Gus Hunt, who was at that time the Chief Technology Officer of the CIA, went to a public conference in an American city, and anyone, for the price of registration, could be in that conference. It was a nerdy computer conference, GigaOm. And Gus Hunt went there as just another presenter in a big conference hall, another windowless, airless room, where he gave his presentation. And in his presentation, he explained that the riddle that the CIA had to solve since 9/11 was the riddle of how do we connect the dots? Because you may recall that on that fateful day, the CIA, and the FBI, and many of the agencies, NSA, they were blamed for not having connected the dots.
So, Mr. Hunt says if we're going to connect the dots, we have to have the dots, and if we're going to have the dots it means that we have to collect everything. And that, he said, is what we've successfully done. And then he said: I want to thank Facebook, I want to thank Google, I want to thank AOL, I want to thank Twitter, I want to thank Fitbit, and I want to thank the telecoms. And he had a long list. I want to thank all these companies because they are collecting all of the data from all of the people. And thanks to them, he said in 2013, the CIA is now on the cusp of being able to, quote, "compute on all human generated data." Again, this is why we're talking about rights. So, should I tell the end of that story? Do you want me to…
Taki Sarantakis: Tell us who was in the room, in the back?
Shoshana Zuboff: All right. So, interesting thing. And if you do watch the video, you can see glimpses of this in the video. But the thing is, like he was sitting in a room. I'm not going to say the room was packed, I'm not going to say there was standing room only. It was a sort of scattershot crowd. I'm not going to say they were really paying attention. People were looking at their watches. He had the misfortune of giving his talk just before lunch, and just watching this thing you could feel like nobody really gives a shit about what he's saying. In fact, Gus Hunt says to the MC, "I'll be happy to stay and answer questions for as long as the audience would like." And the MC says to him, "Well, you know, Gus, it's lunchtime. Everybody's hungry. Why don't we break for lunch? And then if anybody really wants to ask you a question, they'll find you in the hallway or something. No big deal." Gus Hunt exploded a nuclear bomb in the airless conference room at this little public conference space for computer nerds, kind of a big conference, and nobody even heard it except for one guy. One guy was sitting in the room in the back row, one guy whose hair was on fire. That guy's name was Ed Snowden.
By that time, this was, I believe, March 25th, 2013, by that time, Mr. Snowden had already been researching these documents within the NSA, and Mr. Snowden had an already deep appreciation of the collusion between the private sector and the intelligence community, the fact that all of these data flows, the human data flows were thanks to the Internet companies, the private companies later to be revealed as the PRISM program. You'll remember that, perhaps, if you read the Snowden documents. In any case, he had a deep appreciation of it, but he had still not made up his mind whether or not he was going to blow the whistle on the intelligence community in the United States of America. And it was that day, hearing Gus Hunt, and that Hunt was willing to get up in this public conference where there was a camera that was going to put him on YouTube and have all of this information available to what Ed described as all the normies in the room, and those that would be watching it from afar. That was what helped him decide that actually he needed to tell the world about this because it wasn't happening in that conference room. So, to finish one sentence, 9/11 produced what I call surveillance exceptionalism. We're a democracy except when we are determined to achieve total information awareness for the sake of national security. And we put national security against all democratic values, norms, laws and principles, and we make an exception, and we do it by commandeering, and nurturing, and nourishing and giving wide berth to this Internet sector. They collect the data. We've got the data, we use the data, we don't have to violate the constitution and they're not bound to the constitution.
Taki Sarantakis: Yeah.
Shoshana Zuboff: Surveillance exceptionalism.
Taki Sarantakis: And remember that as public servants, because when you go through a big crisis there is that temptation to say, oh, let's set the rules aside, let's set our values aside. It's an emergency, let's go do this. And you're even seeing it play out in the front page of The Globe and Mail every day on our last public emergency. So, Jim, that's where kind of we were, where we are, for better, for worse. So, what do we do? How do we start getting out of this?
Shoshana Zuboff: I got to write this down.
(Laughter)
Jim Balsillie: Okay. Well, first of all, I think the judicial branch is a good one. I'm a big believer in using legal strategies to press these issues. I think you can advance things there. I think there's an enormous legislative issue, but it's going to take civil society. We intensely reconnected over the Sidewalk Labs issue, and that was the case where three levels of government and the most powerful corporation in the world lined up to privatize government without any legislative act, and civil society stood up. So, yeah, I'm a big believer in civil society, I'm a big believer in public policy. I think the courts are going to be important. I think we're going to have to get upstream on these issues, which is why I encouraged getting to the rights part at the end and then framing all of this.
But I do think the stakes are high. I think the regulatory capture is very high. I think the gaslighting is a special skill of confusing the heck out of people when in fact it's not that confusing. So, you see all this attention to existential risks, which you can't quantify, you can't define, you can't time, or let's have a six-month coding moratorium, or let's have a global agreement on stuff that embeds norms where there's no concurrence, where there are strategic interests. What do all those things have in common? It means that it's an impossibility that takes you away from the near-term attention of really governing the harms. And so, I think there's a tremendous amount to be done, and I think it's going to take good public policy, the courts, civil society, education.
Taki Sarantakis: Yeah. And just on that point, I'm going to quote Professor Zuboff again through another book. Somebody is quoting her, but I'm going to read the prelude of what she says before she quotes Shoshana Zuboff. "So, as the digital fuses with the physical, data increasingly becomes the built environment around us and creates a space that we collectively inhabit and cannot individually escape. As Shoshana Zuboff puts it in The Age of Surveillance Capitalism, ‘Individuals each wrestling with the myriad complexities of their own data protection will be no match for surveillance capitalism's staggering asymmetries of knowledge and power. The individual alone cannot bear the burden of this fight at the new frontier of power.'" It seems to me that that's exactly what you just said, that absent kind of legislative collective action, it's, we're not going to be able to get there. We're not going to be able to say, well, we'll give you data protection, you'll have rights over your own data, we'll give you recourse. But it's no longer data. It's the data in us, and us and the data have fused. So, for example, if somebody goes into your phone and publicizes all of your pictures, they haven't done a data breach, they've done a Jim Balsillie breach. They have violated you and your privacy.
Jim Balsillie: Yeah. One other thing which I should have mentioned is that when you look at everything we enjoy today, whether you walk into this room and have confidence that the ceiling won't fall in, or the elevator works, or that the water you drink, those are all products of institutions and public policy, water authorities, electricity authorities, building codes, and so on and so forth. And I can't see how we're going to rebalance these asymmetries. I mentioned these other things, but I think we have to rethink institutions, I think you have to have space for novelty and creativity. You want to chew on data with algorithms, but you don't want those to be appropriation machines. So, I think you're going to have to start to think of public good institutions to rebalance this, and you have to have space for experimentation. There's novel ideas of part-time or gig job boards that aren't the property of a platform, but really, public good. And so, I think there's, the other one I would say is you have to think institutions. So, I think it's going to take a series of activities, but we need public policy more than ever.
Taki Sarantakis: Now, Professor Zuboff, everybody knows the title of your book, The Age of Surveillance Capitalism. I read the subtitle of your book, The Fight for a Human Future at the New Frontier of Power. Talk to us a little bit about that. That's a dramatic statement.
Jim Balsillie: That was the last line of my comments, just to let you know. Did you notice I put that in?
Shoshana Zuboff: I did.
Taki Sarantakis: The Fight for a Human Future at the New Frontier of Power.
Shoshana Zuboff: Well, this is what I was kind of referring to at the beginning of our discussion when I said to you, it's really taken me a long time to integrate, and like, to feel like I really got to the bottom of what all this means. The key word in all of this is power. This is not a story about data, and it's not even a story about technology, not even a little bit. Forget all of that. This is a story about power. And power is about politics. And that's why I said to you, surveillance capitalism is what happened when democracy stood down, because democratic leaders decided that certainty and control were more important than freedom, agency, self-governance, the flawed but still best idea ever to have been produced by humanity, that people can and should have the right to govern themselves. When the book first came out, that year, 2019, I left the house for a few days of a book tour. My publisher would only pay for a couple of days. And I didn't get home for 14 months, and they nickel and dimed me on every single receipt that whole time.
Taki Sarantakis: Were they the Government of Canada?
Shoshana Zuboff: What?
Taki Sarantakis: Were they the Government of Canada? No, I'm joking.
(Laughter)
Shoshana Zuboff: They were talking to each other, I guess. But the point is that I met so many young people during these big events. Everybody sitting in the aisles, on the stairs, hanging from the rafters, these big events in every kind of venue in every kind of country. And over and over, I had this conversation. I would stay late to sign books, the line would snake out the door down the street, and I'd stay to the last person and talk to each one, and I heard this question so often. How can we possibly save democracy? This is too big, it's too late, we can't undo it. Like, democracy is, like no match for this. And every time, I want to tear my hair out and like shake that person. And I started to understand how that generation, how this, younger than not all of us, but many of us, were seeing the world. Like, many of them were born during the Obama era. And even though I, I have completely reevaluated everything I ever thought about Obama at that time, nonetheless, if you're growing up coming of age in the Obama era, things are looking pretty good about democracy and you feel like democracy is something like the Alps. It's like this big rock and it was there when you were born, and it will be there when you die and nothing can move it.
There was a lapse of historical understanding. Democracy is not a rock. Democracy is the most fragile, delicate idea and it requires every generation to be called to defend it, to rebuild it, to protect it, to fight for it. Every generation. I picture those hoop games that kids played before there were toys. Like, you roll the hoop, and like you got to run after it, and it starts to wobble, and you got to get there and make sure it gets more momentum before it falls over or you lose. That's democracy. And it's wobbling, and we have to run after it. This is politics. Just like we had to run after Jay Gould and J.P. Morgan over 100 years ago when they said the only rights that matter are property rights, and we own it. You don't, you lose. And everybody thought we can never change that. It's the cartels, it's the trusts, it's the monopolies. They have all the wealth, all the power. How do we ever fight back against that? But we did because we're democracies, and we came together in something called collective action, and we learned how to create pressure and put pressure, and develop political power, which was then institutionalized in new kinds of governmental democratic institutions that hold and stand, are ruled by democratic law no matter who is elected to the Presidency or the Prime Ministership, no matter who is in Congress or in Parliament, these institutions stand and protect transparently, usually, the rule of law throughout, under democratic governance.
We did that ultimately for our industrial civilization. Now, we are called in an entirely unprecedented world, unprecedented conditions in which we must exist, now we are called to create the new charters of rights, the new legal frameworks, and the new institutions to oversee them that will protect us in a new century and thereby protect the very possibility of a digital and democratic information civilization. We don't have that yet. But we did it before, we can do it again.
Taki Sarantakis: In the course of those words, what you said was that word, power, again. But you put the word democratic before it. Democratic power. And democracy does take work. And we forget that. And I think I can take the first question from the audience, but I'm going to vary it slightly to bring in democracy a little bit more. About half the planet is going to be having elections in the next little while, including your little part of the blue dot, and democracy will face, I think, its first real test of the digital age. And that's disinformation, that's deepfakes, you all know about ChatGPT. I don't know if you've tried Sora yet, which is kind of the video version of ChatGPT made by OpenAI. What do you see in the next little while when people are no longer able to immediately trust what they see and what they hear?
Shoshana Zuboff: Well, in the long run, no democracy can survive those conditions. That I think is clear. But having said that, we're 20 years into this. Surveillance capitalism, first it had to figure out, oh, we're going to extract all the human data, and then do it, and how to compute it. By the way, they've been calling that AI since at least the year 2000. Even in Google, they referred to their search engine as, "our AI." So, this whole AI thing of the past year, well, that's another discussion. But AI has become marketing, consumer marketing, that whole thing. But the point is that this thing has built over time, this is an, surveillance capitalism is an institution that has developed in stages over time, founded on this idea and discovery that we talked about, the extraction of human data. And then they had to build the whole, all the computing infrastructures, and corner all the data science and all the data scientists, and all the great microchips, and all the servers, and all the data centres and all of that…
Taki Sarantakis: All the energy, and all the water and all the cooling capacity.
Shoshana Zuboff: All the energy, and all that water and all the stuff that they needed so that they control the entire global market of artificial intelligence, computational knowledge production. And then from there now, we've got, now we've got knowledge, now we've got knowledge that never existed before. We've got it. You don't. We're private about it, but your privacy is gone. What they did was they stole it from us, but everybody was, under the conditions of surveillance exceptionalism, no one was going to call it theft. But now we know it was theft, because when you take something from someone secretly, obviously without asking, and use it for your own advantage and they don't even know about it, that's usually called stealing. So, they stole it. So, the fact that they stole it…
Taki Sarantakis: But we clicked, "I Accept."
Shoshana Zuboff: Yeah, we clicked, "I Accept." So, we're not going to, that's a whole misdirection, gaslighting discussion, but I think we've already said enough about that. We all know that this whole thing is smoke and mirrors, right? We all know that. But the point is that once they had all the knowledge, and that took about most of the first decade, it was in the second decade that the real work began. And that was turning knowledge into… Audience participation! Turning knowledge into power. Knowledge, power, knowledge, power, chasing each other always like those figures on a Greek vase, inseparable. So, it's now in the second decade that they've really learned how to intervene in our behaviour, how to shape our way, because they have so much knowledge for targeting, for subliminal cues, for… social, the social comparison dynamics, all of the tricks of the trade that they use to get us to read something that we wouldn't read, associate with people that we would never associate with, think ideas that we would never think, engage in behaviours that we would never engage in.
For young people, we see that expressed in the mental health issues that Jim talked about, and the self-harm and even the suicide, stuff that those same children would never have engaged with. But in the adult world, that's where we see the polarization, the extremes, the political extremes. And it begins with the fact that we have ceded our information and communication spaces to private corporations who have an agenda that is not our own. It is for them and not for us, and it is fundamentally, as we have seen, anti-democratic.
Taki Sarantakis: I think what I'm hearing you say is we have made choices and we, advertent choices, inadvertent choices. The system, the political system made choices.
Shoshana Zuboff: The political, our leaders made choices.
Taki Sarantakis: Yes, that's, sorry, that's what I… yeah.
Shoshana Zuboff: But those choices, we were deprived of those choices.
Taki Sarantakis: Right.
Shoshana Zuboff: Because we were told it was this, it was this beautiful fiesta. And in fact, it was this death match.
Taki Sarantakis: Right.
Shoshana Zuboff: With the values and principles that, as citizens of democratic societies, we hold dear.
Taki Sarantakis: Yeah. Now, Jim, I can take the second and last question from the audience. We have choices coming up, whether they're from citizens, or from politicians, or legislatures or courts. And what choice are we going to make on data and AI? And specifically, the question says, "Who should govern data and AI? Is it every organization? Is it every province? Is it a country? Is it an international body?" How do we, who has, in Shoshana's words, the power here?
Jim Balsillie: Well, who has the power and who should have the power…
Taki Sarantakis: Exactly.
Jim Balsillie: Is the question that is appropriately being put forward. And clearly, it should be the sovereign body politic. I think there's a couple of things that I'd like to bring into this as we wind through this. One, Shoshana talked about that two-headed Cerberus, I'm going to put the third head back in there, which is the political parties have an incentive to manipulate democracy now, and how can they regulate these forces if they benefit from these commodification forces? And if you see domestically, tremendous effort by the political parties to be completely absent in any form of governance of political activity. So, it shows they all think they're the better dopers in the Tour de France equivalent of the political race. And there's also a political economy of these machinations and manipulation. And so, they talk about worry of misinformation, and the old expression, I've seen the enemy and the enemy is us, that I'm not worried about foreign misinformation, I'm much more worried about domestic misinformation because that's the prime actor with the direct resources that's trying to pull them to their side. And all political parties are playing that. So, I would say one thing that's extraordinarily important, and that's a litigation file that's going on in British Columbia right now, is should there be even a minimal governance of the data for political parties? And the national parties are fighting for an absolute no. And so, I think that's a core aspect of, because how can the politics invoke when they're the beneficiary of the whole system of commodification?
I think a second, and clearly, they need to be governed like the United Way as a minimum, if not more seriously, like the EU does it. The second thing, which I think is a critical public policy issue when you talk about who governs, the way I sort of say this, there's four blocs in this world, there's the EU bloc, there's the Chinese bloc, there's the US bloc, and there's everybody else who's scared like bejesus and doesn't know what to think. And so, the smaller open economies, I think, have to go through a bit of an intellectual journey, a bit of a policy journey. What does a small open economy do in the nature of these forces? Because those three can throw their weight around, they can do their quasi-sovereign thing. They're debating between internal forces, but we're whipsawed by external forces. And I think that game is unfolding right now and how you achieve national approaches in this is not clear, but I think it goes back to sovereign integrity of democracy, where political parties don't have an incentive to the game. I think it goes back to the civil society, it goes back to the capacity of the civil service to actually deliver what you normatively want, but you need to develop the capacity to do it.
So, that's why I think an event like this is so important. We need you to be better and we need the political parties to smarten up. And that's, I think if you weaken that, then their incentive to protect the other two heads is not as intense. So, that's how I characterize the antidote to, where one is very clear and one is, which is the political integrity of the parties, and the other is a journey where I think this is going to get a little bumpier before it gets smoother.
Taki Sarantakis: So, do you want to take the last one?
Shoshana Zuboff: Can I make a tiny comment?
Taki Sarantakis: Absolutely.
Shoshana Zuboff: Just, and with respect and gratitude for the work that you do, there's one tiny comment I wanted to make about the idea of democracy itself. And it goes back to your asking me about this subtitle, because when we talk about the denigration of democracy in this tech discourse, we're talking about the denigration of the human. And I gather, possibly not everyone in this room, but many of you in this room have dedicated your lives to serving democratic institutions. And I think it's really important to remember that democracy may be old, and it may be slow, but it has power that is unmatched by any other power, including the power we've been talking about. Two things. Only democracy can inspire human action. And if we ever doubted that, I think the people of Ukraine have taught us that even in this cynical modern age, democracy inspires selfless action. It did for my parents' generation and it still does so. The other thing is that only democracy can make, impose and enforce law. And that is the one thing that this private global institution of surveillance capitalism is deathly afraid of. Law. And it's why they want to make you feel bad. The denigration of democracy is the denigration of the human, and none of us are going to buy into that. So, your work is not only possible, but necessary, and you, believe it or not, have the upper hand.
Taki Sarantakis: Professor Shoshana Zuboff, Jim Balsillie, thank you for spending the last hour and 15 minutes with us, thank you for being a friend of the federal public service and thank you for having us fire up our neurons on important issues that will define the lives of our children, and our grandchildren, our grandchildren's grandchildren. Thank you so much.
(Applause)
Shoshana Zuboff: Thank you.
[01:15:51 The CSPS logo appears onscreen.]
[01:15:57 The Government of Canada logo appears onscreen.]