Big Data, Virginia Woolf, and the Right to be Forgotten

Sep 5, 2014

As a society, we're still developing vocabulary to talk about data technology and the moral questions it raises. In this first of a series of podcasts on data and privacy, we'll explore how big data is used and the underlying moral questions that impact how our global economy—and society—develops in this world of increasingly data-driven commerce.

JULIA TAYLOR KENNEDY: You're listening to Impact from the Carnegie Council. I'm Julia Taylor Kennedy, and I'm really excited about this episode of the podcast. It's the first of three installments we're doing on data, privacy, business, and society.

JOSHUA ROTHMAN: My name is Joshua Rothman, and I write about ideas and books for NewYorker.com.

JULIA TAYLOR KENNEDY: Rothman's also the archivist at The New Yorker. He thinks and writes a lot about data and identity online, and he's kindly kicking things off for us today with a story from Virginia Woolf's classic novel, Mrs Dalloway.

Now, you might wonder why we're consulting a dead feminist novelist for her insights on data privacy. Well, few have considered issues of private and public spheres as deeply as Virginia Woolf did—after all, her other classic is called A Room of One's Own.

Okay . . . back to Joshua Rothman and Mrs Dalloway.

JOSHUA ROTHMAN: The set-up of Mrs Dalloway is that the protagonist is this woman Clarissa. In the novel she's middle-aged, but a lot of it is told in flashback to when she's much younger. In one such flashback, she's a teenager, and she's hanging out with some friends, basically. One of the friends is a girlfriend named Sally, who is another girl her age, who's sort of a free spirit. I think it's fair to say Clarissa is sort of a buttoned-up person, and Sally is sort of bohemian, and you never know what will happen.

They're outside for a walk on the terrace of a fancy house, and Sally picks a flower from an urn and turns and kisses Clarissa. Clarissa, as I think happens to many people—you're a teenager and you sort of have a crush on a friend. It's a hyper-intense friendship that's almost about love but not quite, but it could be. It's sort of like, you just don't know—you don't have categories for that kind of stuff at that time in your life. And that's the way it is for Clarissa.

JULIA TAYLOR KENNEDY: Clarissa never told anyone about that kiss.

Just think: What if Clarissa or Sally, her freewheeling friend, had posted a photo of the kiss on Instagram? Or published it on the school paper's website? That might affect Clarissa's memory of the kiss, or her future relationships with men, or her job prospects.

As a society, we're still developing vocabulary to talk about data technology and the moral questions it raises. Part of our challenge? Our ability to collect, store, combine, and analyze data from different sources gets more sophisticated by the day. And even though we all spend so much time online, few of us truly understand the impact these data collection tools can have.

Over the next half hour, we'll explore how our data is used once we click "accept" on those service agreements, or when we search for anything on Google, where our cost of using the site isn't money—it's data. On this podcast, we'll also explore the underlying moral questions that impact how our global economy—and society—develops in this world of increasingly data-driven commerce.

VIKTOR MAYER-SCHÖNBERGER: Viktor Mayer-Schönberger. I am professor of Internet governance and regulation at Oxford University.

JULIA TAYLOR KENNEDY: Mayer-Schönberger is a software developer turned scholar who's been writing and thinking about data far longer than most of us have. In 1986 he developed a popular Austrian anti-virus program. And he's pretty glad the rest of society is finally catching up.

VIKTOR MAYER-SCHÖNBERGER: It's exhilarating. It's wonderful. It's wonderful because I believe that our path forward in this digital future is not one that is designed for us, that we have no influence over. No, I strongly believe that it is ours to choose the path, and it is ours to influence the direction in which we are going.

And so, in order to choose the direction in which we are going, we need to know what directions we have available, what options we have to choose from, and what implications these options have.

JULIA TAYLOR KENNEDY: In 2009, Mayer-Schönberger published a book called Delete: The Virtue of Forgetting in the Digital Age.

VIKTOR MAYER-SCHÖNBERGER: As I researched the role of forgetting in human society, I discovered that forgetting plays an enormously important role in how we make decisions, in how we act, and live in the present.

JULIA TAYLOR KENNEDY: He found that in order to generalize—to move out from seeing the trees to understand the forest—we need to let go of the details.

VIKTOR MAYER-SCHÖNBERGER: We need to forget.

JULIA TAYLOR KENNEDY: So, forgetting serves an important cognitive function when a person steps back to analyze or comes up with a new idea. It also serves an important social function—

VIKTOR MAYER-SCHÖNBERGER:—because as we forget, we also forgive.

It was a shock to me, initially, when I discovered that forgetting and forgiving are intertwined. I thought that we can easily forgive, but not forget. It turns out that as we forgive, our mind labels the memories of what we have forgiven others for as irrelevant. Therefore we begin to forget them.

So if we cannot forget anymore, we may also turn into unforgiving individuals, and an unforgiving society.

JULIA TAYLOR KENNEDY: Think about it: If you're six years old, and your sister pulls your hair, and then later in the afternoon she gives you a daisy, which do you remember?

If you're a forgiving/forgetting person, you remember the daisy. If not, you remember both—and you might not be able to move past the hair-pull.

VIKTOR MAYER-SCHÖNBERGER: As we move from an analog age to a digital age, that whole, old analog balance between remembering and forgetting is uprooted. Because in a digital age, forgetting is costly and hard, and remembering is the default. It's built into most of the digital devices that we use, and it's almost costless.

Many of the Internet companies advertise that we can store everything. All of our computers now make automatic backup copies. We store our pictures in the cloud, and our camera phones do that automatically. So, getting rid of something now, suddenly, is very hard.

That's a situation that we human beings never had to deal with. We never had to deliberately forget.

JULIA TAYLOR KENNEDY: And deliberately forgetting is pretty difficult.

VIKTOR MAYER-SCHÖNBERGER: If I tell somebody something, and then I say, "And now I want you to forget that," they surely will remember rather than forget. It has the opposite result.

JULIA TAYLOR KENNEDY: But courts and legislatures in the European Union are currently trying to figure out how to build the "ability to forget" into our digital landscape. Commonly known as "the right to be forgotten," this data privacy movement made significant progress in a Spanish case last spring.

MARK STEPHENS: I'm Mark Stephens, and I am a lawyer at Howard Kennedy FSI in London.

JULIA TAYLOR KENNEDY: Stephens has been big in the data privacy space for years—after all, in 2010 he represented Julian Assange, founder of the whistleblower website WikiLeaks.

Stephens is also the independent chairman of a non-profit organization called the Global Network Initiative. It's a group of Internet and communications technology companies, scholars, investors, and civil society organizations that together develop guidelines for companies to protect and advance users' rights to freedom of expression and privacy.

Understandably, Stephens has been watching the "right to be forgotten" case pretty closely.

MARK STEPHENS: There was a man called Mr. Gonzalez, who lived in Spain.

JULIA TAYLOR KENNEDY: His full name is Mario Costeja Gonzalez, and he's a 58-year-old calligrapher and lawyer.

MARK STEPHENS: He'd been made bankrupt as a result of some property problems that he was having.

JULIA TAYLOR KENNEDY: He hadn't paid taxes on a house that he owned. Many years later, Gonzalez sold his house and made good on his tax debt. But the local newspaper article reporting that his home had once been repossessed by the bank still showed up prominently when anyone Googled his name. Gonzalez considered the information obsolete, and six years ago he decided to sue Google.

MARK STEPHENS: He applied to the courts in Spain, under their "right to be forgotten" laws, which derive partly from their local civil law code and partly from European Union data protection laws, to have Google remove from its search engine references to this report in his local newspaper. So if you put his name into the search engine, you would not be directed to the report of his earlier bankruptcy. The court took the view that sufficient time had passed, and that this should have fallen into the collective amnesia of society.

JULIA TAYLOR KENNEDY: Stephens works closely with Google and other major tech companies that belong to the Global Network Initiative, and he disagrees with the ruling.

MARK STEPHENS: If you're looking at freedom of expression on the one hand, and the right to privacy on the other, then clearly, privacy is being given greater primacy. I think that's a balance which is of concern, where we're looking to see truthful information have free currency around the world. So I think that's the first thing.

JULIA TAYLOR KENNEDY: From a more practical standpoint, Stephens thinks the genie is out of the bottle once something has been catalogued anywhere on Google—especially given the limitations of the EU court's jurisdiction.

MARK STEPHENS: If you are in America and you look up Mr. Gonzalez's name—and you will find it—you find a direction, a link taking you to the URL for the newspaper report. If you are in London, you won't get that link and that report. So, we're serving different returns for people from different countries.

JULIA TAYLOR KENNEDY: I tested this out online by searching "Mario Costeja Gonzalez" on Google in the United States, the United Kingdom, and Spain. But, of course, there's now been so much press coverage of the case itself that it's not a perfect test. The name now retrieves hundreds of thousands of results in all three countries. In the United Kingdom and Spain, though, a notice also pops up that "Some results may have been removed under data protection law in Europe."

My favorite search result was an Atlantic magazine article entitled "Will Europe Censor this Article?" I picked that one up on Google Spain, so I guess the answer is no.
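That cross-border comparison is easy to script, at least in principle. The sketch below is purely illustrative: Google discourages and often blocks automated queries, and the country domains, request headers, and exact notice wording here are assumptions rather than details from the case.

```python
# Illustrative sketch: fetch Google results pages from different country
# domains and look for the EU removal notice. Assumes the notice wording
# and that Google serves the request; automated queries are often blocked.
import requests

NOTICE = "Some results may have been removed"
DOMAINS = ["www.google.com", "www.google.co.uk", "www.google.es"]

for domain in DOMAINS:
    resp = requests.get(
        f"https://{domain}/search",
        params={"q": "Mario Costeja Gonzalez"},
        headers={"User-Agent": "Mozilla/5.0"},  # bare scripts are usually refused
        timeout=10,
    )
    print(f"{domain}: removal notice shown = {NOTICE in resp.text}")
```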

But to go back to Stephens' point, a typical request to remove a link from Google search results won't garner the kind of press attention that Gonzalez's case did.

MARK STEPHENS: Now, of course, people from the European Union can put their query into the Google search engine that's based in America, and so you will effectively get American returns. So it doesn't have the practical effect that one would have hoped, if you were a judge in this particular case.

JULIA TAYLOR KENNEDY: After all, the news report itself remains online. A Google search of the article's author, for example, will still retrieve a link to the article. Only the search result linking the person who wants to be forgotten to the article has been removed.

Now, the decision has set a precedent. And Google has hired a whole team of folks to respond to other requests that have started pouring in to have links removed from search results. The EU is currently working on broader legislation that will more clearly codify rules that govern the "right to be forgotten," which will likely take effect in 2018.

But when he steps back to consider the underlying question of the right to be forgotten, Stephens thinks this is a wrongheaded approach to privacy.

MARK STEPHENS: Most people are reasonable and rational enough to understand that we all have done things in our teenage years, in our youth, that would embarrass us today.

Those of us that think for a moment about it—forgive and forget—and look at the person as they are today, not as they were when they were feeling and learning their way in the world—it's about a question of maturity. I think it's a bit sort of "nanny knows best" to start saying, as the Europeans have done, what you can't know about that information, because you are incapable of making a proper judgment about how to treat that information. That information is too complex for you to be able to determine what weighting should be given to it.

JULIA TAYLOR KENNEDY: Stephens prefers the current approach to scrubbing away a youthful indiscretion or past offense.

MARK STEPHENS: Basically, you try and bury it with good articles, so you move it back to page three, four, five of the Internet search engines, to a place where it's never going to be read or seen—or not by anyone that's significant. Ultimately, it doesn't really matter if you're famous or not. It helps if you're rich, because you can afford the PR and the Reputation.com people to come in and assist you.

MICHAEL FERTIK: My name is Michael Fertik. I am the founder and CEO of Reputation.com.

JULIA TAYLOR KENNEDY: When I reached out to find out more about this practice, I got something a lot deeper than a sales pitch of reputation management techniques. Fertik didn't really want to go into that part of his business too much. He did share that he's a huge supporter of the decision in the EU "right to be forgotten" case.

MICHAEL FERTIK: I've been unequivocal in my vocal support of the ruling, even though I thought it would be bad for my business. This is so correctly the right thing that it doesn't matter what the impact on my business is.

JULIA TAYLOR KENNEDY: After all, even if it might make a piece of his service offerings obsolete, the "right to be forgotten" is consistent with Fertik's raison d'être.

MICHAEL FERTIK: I started Reputation.com because I believe that you should have a right to have some substantial control over your data.

JULIA TAYLOR KENNEDY: Like any good entrepreneur, Fertik sees an ocean of business opportunity for Reputation.com beyond helping people improve their Google search results.

MICHAEL FERTIK: The Faustian bargain of the Internet has been that, as soon as you log on to any website that does anything, your entire life and entire data set that they accumulate can be used for any purpose whatsoever. You have been opted into the machine as soon as you turn on your computer and you don't even know it.

There is a great myth about the Internet: that the Internet is about you. In fact, it's not. It's about the companies that control your data. The greatest beneficiaries of the Internet are the companies that control your data. They have concentrated data and they have concentrated wealth in their hands.

JULIA TAYLOR KENNEDY: To give some control back to users, Fertik argues there should be a sort of "data report," similar to a credit report, that gives each of us insight into the data that is being accumulated and consolidated based on our digital footprints.

MICHAEL FERTIK: It's very, very critically hyper-important that your listeners not think that if they don't participate in Facebook, they're okay. That's the biggest rookie mistake that we have to just avoid.

Your data file is enormous, probably 10,000 data points right now, and that's before you opened a Facebook page. It's because everything you do on the web is catalogued. All the data from all the places you go on the web—email services included—are aggregated by third parties, which are in the business of buying, trading, and selling data. Then profiles of you are manufactured and sold, and then sold again, for evaluative purposes: for money, health care, romance, professional opportunity, discounts, marketing.

JULIA TAYLOR KENNEDY: Fertik compares the way this "sea of data" works to a prison designed by a philosopher two centuries ago. Oxford's Viktor Mayer-Schönberger makes the same comparison. So we'll bring him back now to explain just how this philosopher's prison worked.

VIKTOR MAYER-SCHÖNBERGER: Almost 200 years ago, a Brit, Jeremy Bentham, had a rather ingenious idea, he thought. The idea was to construct a prison where the prison guards could watch the prisoners without the prisoners knowing when and whether they are being watched. So they had to assume that they are being watched all the time, and therefore, they had to behave.

JULIA TAYLOR KENNEDY: Bentham called his prison a Panopticon. He designed it to get the prisoners to behave all of the time because they never knew when they could be caught. Some say the eyes on our online data are similar to the eyes in the Panopticon. By tracking our digital footprints, companies and the government can better understand our strengths and weaknesses.

By looking at our status updates and other information that we voluntarily share, our friends, colleagues, and bosses also have insights, even if they're more narrow. So the Panopticon has moved beyond the physical prison that Bentham designed.

VIKTOR MAYER-SCHÖNBERGER: I think that the Internet is actually worse, in a way. It's not just a Panopticon in which we have to assume that we are being watched in what we are doing now, today; we also have to assume that we are being watched in the future. That is, whatever we do on the Internet today may be held against us many years down the road.

JULIA TAYLOR KENNEDY: That could lead us to censor our online behavior and to censor the way we express ourselves.

VIKTOR MAYER-SCHÖNBERGER: We self-censor as individuals, as well as a society—our public debate might lack in the future the robustness that it needs in order to stimulate democratic reflection. So I'm quite worried that we are undermining the very foundation of democracy by creating this temporal Panopticon, in which we are tempted to self-censor whatever we do online because it might be held against us by a potential future employer or a potential future life partner.

JULIA TAYLOR KENNEDY: One solution might be more "forgetful" applications like Snapchat, which erases posts after a given period of time.
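To make the idea of "forgetful by design" concrete, here is a minimal sketch of a post store that expires items after a time-to-live. It's a toy illustration of the general technique, not how Snapchat or any real service is actually built.

```python
# A toy "forgetful" post store: items expire after a time-to-live (TTL).
# Purely illustrative; not modeled on any real service's implementation.
import time

class ForgetfulStore:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._posts = {}  # post_id -> (created_at, content)

    def add(self, post_id: str, content: str) -> None:
        self._posts[post_id] = (time.time(), content)

    def get(self, post_id: str):
        entry = self._posts.get(post_id)
        if entry is None:
            return None
        created_at, content = entry
        if time.time() - created_at > self.ttl:
            del self._posts[post_id]  # expired: forgetting is the default
            return None
        return content

# Usage: a post readable now is gone once the TTL elapses.
store = ForgetfulStore(ttl_seconds=2)
store.add("kiss", "a moment best left unexamined")
print(store.get("kiss"))  # -> the content
time.sleep(3)
print(store.get("kiss"))  # -> None: the store has "forgotten"
```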

But how we present ourselves on social media is no longer the biggest worry. Of deeper concern are those indelible digital footprints we often don't even realize we're making and which are now being analyzed incredibly effectively.

VIKTOR MAYER-SCHÖNBERGER: In the past, we were somewhat safe from some of those crazy, big data surveillance programs because most of them were stovepipes. They were silos where the data was collected but not shared with others. The beauty, but also the danger—the challenge of big data—is that value comes from combining different data sources, not just one or two of them, but hundreds of thousands of them. That way, people and their behavior get identified very easily.

There is a saying in software design: "Given enough eyeballs, all bugs are shallow." Similarly, here, with enough data, everybody is surveilled. Every behavior, every little attitude, every little utterance is being captured. In that sense, as these different data sources are coming together—and put together—we are creating, really, the Orwellian 1984 of a comprehensive surveillance society.
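Mayer-Schönberger's point about combining sources is easy to demonstrate. Below is a minimal sketch, using invented data, of how two datasets that are each "anonymous" on their own can be joined on shared quasi-identifiers (zip code, birth date, sex) to re-identify individuals. It mirrors well-known re-identification studies, not any specific company's practice.

```python
# Minimal sketch of re-identification by joining data sources.
# All records and field names here are invented for illustration.

# An "anonymized" dataset: no names, only quasi-identifiers plus a sensitive field.
health_records = [
    {"zip": "02139", "birth": "1961-07-31", "sex": "F", "diagnosis": "asthma"},
    {"zip": "10027", "birth": "1975-03-12", "sex": "M", "diagnosis": "diabetes"},
]

# A public dataset: names attached to the same quasi-identifiers.
voter_roll = [
    {"name": "Jane Doe", "zip": "02139", "birth": "1961-07-31", "sex": "F"},
    {"name": "John Roe", "zip": "10027", "birth": "1975-03-12", "sex": "M"},
]

KEYS = ("zip", "birth", "sex")

def link(records, roll):
    """Join the two datasets on their shared quasi-identifiers."""
    index = {tuple(p[k] for k in KEYS): p["name"] for p in roll}
    for rec in records:
        name = index.get(tuple(rec[k] for k in KEYS))
        if name is not None:
            yield name, rec["diagnosis"]

for name, diagnosis in link(health_records, voter_roll):
    print(f"{name} -> {diagnosis}")  # names recovered from "anonymous" data
```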

JULIA TAYLOR KENNEDY: Scary stuff. The "wormholes of paranoia"—or maybe realism—start to pile up. If I Google a celebrity, will some data company assume I'm a fan of that person? Or if I Google a news story, will some news media outlet assume I have a strong opinion or interest in the topic?

Facebook tracks the keystrokes of the names you type into its search bar, even if you delete them without hitting "enter." What does that mean about the ads it shows me, or the people it suggests are my "friends"?

Can Amazon know me better than I know myself? Can this data profiling start shaping my identity based on who the websites think I am, or should be?

JOSHUA ROTHMAN: In a way, the danger is in mistaking this data that we have for knowledge about people.

JULIA TAYLOR KENNEDY: The New Yorker's Joshua Rothman points out that as sophisticated as big data has become, it has its limitations. Think of the videos that YouTube or Netflix recommends for you. Some of them may be of interest, but some not so much.

JOSHUA ROTHMAN: On many Google-powered websites—YouTube, for example—you can log into a panel, and you can sometimes see a little pop-up that will say something like, "Help us improve our targeted advertising. Tell us what kinds of ads you'd like to receive."

JULIA TAYLOR KENNEDY: If you click on the panel, suggested advertisers appear.

JOSHUA ROTHMAN: You'll see who YouTube, for example, thinks you are. Very often the "you" that they think you are is very different from who you actually are. The reality is that the persona that's being communicated through these data traces online is just a small window into who you are.

JULIA TAYLOR KENNEDY: Perhaps there's an essence of identity no amount of data could capture.

JOSHUA ROTHMAN: I'm as scared about Big Brother as the next person. There are good reasons to be concerned. But I think sometimes the story of big data, and the story of the erosion of our privacy because we're sharing more and more, can, in a way, create its own kind of danger.

It creates the idea that people are deeply knowable, that their characters are accessible, that who they are is something we can deduce from collections of facts about them, that we can predict their future behavior based on their past behavior, that we can put them into boxes based on things that they've shared or things that they've said. That seems to me to sort of violate a fact about people, which is that they're unknowable—you don't know what they're going to do, you don't know how they're going to act. In a way, that's as big a danger as the sharing of this information in the first place: our trust in the information and in its power.

JULIA TAYLOR KENNEDY: What you buy on Amazon, what you watch on YouTube, may not betray who you really are. To Rothman, certain essential, identity-forming experiences—ones that give us depth and humanity—may be hidden from every database in the world.

Think back to that story of Clarissa Dalloway, whose stolen kiss with a female friend named Sally opened our podcast.

JOSHUA ROTHMAN: Clarissa had two men who were interested in her when she was this age, when this happened. One was a friend, Peter, who is this sort of hipster kind of guy. He wants to be a writer, he's very serious. He's very interested in her thoughts, and everything like that, and he's always asking her questions. He's a very probing personality, in other words. He's always saying, "Why do you think this? Why do you think that? Why do you feel this?"

JULIA TAYLOR KENNEDY: In other words, Peter wanted to be inside Clarissa's head, to know her completely.

JOSHUA ROTHMAN: Then the other boy who's into her is the man that she eventually marries. He's named Richard Dalloway, and he's the opposite. He's kind of not that fun to talk to, he's athletic, charming in a friendly way, sort of like a golden retriever sort of a guy, as opposed to this other boyfriend who will never leave her alone, basically.

JULIA TAYLOR KENNEDY: Richard gives Clarissa space. He respects her privacy.

JOSHUA ROTHMAN: This kiss between Clarissa and Sally is exactly the kind of thing that Peter would want to know all about, and want to get into—"What's up with that?" It's the kind of thing that Richard will say, "I guess that's a thing that happened."

JULIA TAYLOR KENNEDY: And that's one reason why Clarissa chooses Richard over Peter.

JOSHUA ROTHMAN: So a lot of Mrs Dalloway is about the idea that there are these experiences that it's best to appreciate and hold onto without ruining them by opening them, by investigating them too much, or getting too deeply into what they might mean. To get into what they might mean is to ruin them, it's to over-determine them, or to define them too much.

JULIA TAYLOR KENNEDY: As these predictive algorithms are still developing, and as governments are still determining how to balance privacy against free expression, we have an opportunity to shape whether we want our Internet companies to take more of a lovingly invasive "Peter approach," or a more distant—even if it's a little bit colder—"Richard approach." That is, whether we value a highly tailored, personalized experience or a more respectful, service-oriented one.

If we do choose the proverbial Richard over Peter, that has implications for the role of technology in society. And according to Oxford professor Viktor Mayer-Schönberger, it puts new onus on data-gathering companies.

VIKTOR MAYER-SCHÖNBERGER: What we really need to do in going forward, is to make sure that the personal data that is being kept and collected is used legitimately. So we need to place a much higher burden of responsibility and accountability on the users of data, whether they are private sector users, or whether they are government users. We really need to reform data protection and privacy laws swiftly, both in Europe, as well as in the United States, and elsewhere in the world because, otherwise, it's too late.

JULIA TAYLOR KENNEDY: Clearly the issue of big data is thorny, but the answers are in our hands. We have choices and we have resources. Whether we turn to Virginia Woolf or more contemporary thinkers, we can find guidance as we construct our personal and societal approaches to the ethics of big data.

Thanks for listening to Impact from the Carnegie Council. Join us next time for a deep dive into cybersecurity.

A special thanks to our production team: Mel Sebastiani, Terence Hurley, Deborah Carroll, and Amber Kiwan. I'm Julia Taylor Kennedy. You can find out more about this podcast—along with links to many of the books and articles mentioned in today's episode—at carnegiecouncil.org.
