How do companies such as Yahoo! and YouTube decide on whether disturbing material should be banned from their sites? What are the free speech and human rights issues involved? What guidelines do they use? This fascinating workshop discusses specific cases.
JULIA KENNEDY: Today we're going to get a little bit of background about human rights and business,
and some of the issues that are going on, from our wonderful representatives
from NGOs [non-governmental organizations] in the room.
Then we're going to run a few mini-cases from Yahoo!
and YouTube, as well, later in the program.
Our first speaker is Rachel Davis. She's an attorney and an adviser to
John Ruggie.
And she also helped write the Guiding
Principles on Human Rights and Business, which was adopted by the UN
Human Rights Council over the summer of 2011. Rachel is taking a lead role
in creating a new NGO called Shift, which will help companies implement
the new Guiding Principles.
I've asked her to give some background about the Guiding Principles, ways they
can help businesses address dilemmas around human rights, and some of the challenges
of implementation.
RACHEL DAVIS: I want to do two things very quickly. The first thing is just
to sketch some background for you of why the UN got involved, and what the
significance of these Guiding Principles is, now that we have them.
The second
is to talk briefly about some of the implementation issues we are seeing through
the work that we're going to be doing with Shift on a cross-sectoral and cross-regional
basis. I know the focus of today is ICT
[information and communications technology] companies, but this is just to put
it in a bit of a broader context.
So, a little bit of history. Previous efforts at the UN to address this issue—and
by "this issue," I mean setting some authoritative
standards for the business and human rights debate—really got stalled and
broke down amid deep divisions, with the business community on one side, civil
society on the other side, and states largely absent from the room.
In 2005, the then-Secretary General, Kofi
Annan, appointed John Ruggie, as Julia mentioned—John is still a
professor at Harvard Kennedy School—to set out on a process of consultation,
exhaustive and exhausting research, and stakeholder interaction on all continents,
to come up with some standards that everybody could agree on.
So that's what
we worked on for the best part of the last six years, with a lot of input from
companies, as well as from governments and other stakeholders.
In June of this year, the Guiding Principles were endorsed unanimously by the
UN Human Rights Council, which is very significant, because we now have a common
authoritative global reference point for this discussion.
These Guiding Principles do three main things:
- They clarify what states should be doing when it comes to protecting against
human rights abuses by third parties, including business, by setting clear
expectations for business, and implementing clear rules of the road.
- The second thing that they do is they really clarify for companies what
is now a global expectation, that companies respect human rights. And they
provide companies with a blueprint for what we like to call "knowing
and showing," how you are actually respecting human rights, in fact.
- The third thing that the Guiding Principles do, is that they emphasize the
importance of effective remedial mechanisms when things do go wrong, because
they do. And they give all stakeholders tools to really test company claims,
when they say that "we are respecting human rights."
A couple of points on implementation:
What the Guiding Principles really try to do, when it comes to businesses, is set out some clear management processes that businesses should have in place to prevent, mitigate, and address the risk of negative human rights impacts.
This means that there are a few challenging issues for companies to work through. As I said, the Guiding Principles were informed by actual company practice and experience, like those of the companies at the table, in trying to get this right.
The three issues I would highlight are:
The first one: It's really about risk to the impacted individual; it's about risk to the rights holder. So it goes beyond classic—and I think pretty outdated now—conceptions of just looking at the risk to the business itself. This is, though, a challenge. It involves stakeholder engagement. It involves really understanding the perspective of those the business will impact. So this is new territory for many companies.
The second challenging area of implementation we are seeing, is around the
fact that the Guiding Principles apply not just to your own activities as
a company, but also to your business relationships as well. So how do you
deal with your value chain?
Now, here there are no simple answers. I think people are aware of that. But what the Guiding Principles try to do is provide, if you like, a structured
decision-making matrix. So here are the key factors that you as a company need
to take into account when you are thinking about:
- What do I do that has been linked to something far down the supply chain?
- What are the kinds of tools?
- What kind of leverage do I have?
- Can I increase my leverage?
- How necessary is the relationship? What are the other adverse consequences of
termination?
That also applies to the third implementation challenge that we are seeing, which is around, if you like, what we very euphemistically, in UN style, call "issues of context" in the Guiding Principles. What that really means is a situation where you have a direct conflict between national law and standards, and international human rights standards. And you, as a company, are trying to operate in that national jurisdiction, and meet the international expectations, and your responsibility to respect.
A lot of what we're talking about today gets into that. But I just want to say that's obviously not something that's unique to the ICT industry.
A final point on implementation. The Guiding Principles rely on companies putting
these management processes in place, through a system of cross-functional collaboration.
So they're about dealing with your internal stakeholders, as much as dealing
with your external stakeholders. Again, that can pose its own challenges.
But ultimately, by setting out clearly the responsibilities
of government, the responsibilities of companies, and the expectations to which all
stakeholders can now hold both of those parties, the Guiding Principles are really intended to
help address collective action problems. So that you get these different parties
coming to the table, particularly when you're talking about dilemma situations,
to work together and say, "What are our respective responsibilities and
how can we find solutions?"
JULIA KENNEDY: One thing I forgot to mention about Rachel is she is a Carnegie New Leader,
here at the Carnegie Council. So we're so glad to have her here in multiple
capacities.
Next up we have Susan Morgan, who is executive director of the Global
Network Initiative, and is working very closely with a lot of the issues
we'll be talking about throughout the session today. She's got enormous energy
and drive and is bopping around the world constantly, speaking at different
conferences, and working with different participants in the Global Network Initiative
to make sure that everyone is on the same page.
She has brought major information and communications technology companies together
to agree on these principles on human rights, including Google, Yahoo!, and
Microsoft.
She'll talk about the Global Network Initiative and lessons around human rights
that the tech sector is learning.
SUSAN MORGAN: I will talk about the Global
Network Initiative,
why we came about, how we emerged, and then I'll say a little bit about what
we do as well. The mission of GNI is really to protect the freedom of expression and privacy
rights of users around the world.
We're a multi-stakeholder initiative. So we have companies, human rights organizations,
investors, and academics who are active members and participants of GNI.
Really, what we're trying to do is to address a series of issues that, I think,
started to emerge probably six, seven years ago. I would briefly describe the issues
in this way.
Firstly, there has been exponential growth in the number of people around
the world who are using information and communications technology. So we have something
like 650 million users of Facebook, about five billion people with cell phones,
and about two billion people connected to the Internet. So there's a huge number
of people who are starting to use this technology, and using the technology
in a way that's increasingly integral to their social, economic, and political
lives.
What we're starting to see, as a result of that, is a lot of government interest
around the world, in terms of the way in which these technologies are being
used, and the potential impact that they can have. So we're starting to see
companies increasingly getting requests from governments—either to share
information about users or to censor information that's on the Internet—that
might be restricting the freedom of expression and privacy rights of users.
The consequence of that is that the business decisions that companies are making
in this sector are becoming really important. So, where data is stored, where
servers are located, on what conditions companies will share information with
governments—these kinds of things really can impact on the rights of a
lot of people in many, many different places in the world. Those are the kind
of issues that GNI was established to address.
The idea of having companies working with human rights groups and with academics
and investors is not unique to GNI. There are other sectors and other
industries that have really tried to create similar kinds of models at this intersection
between business and human rights.
So you can look at the extractives industry; you have something called the Extractive
Industries Transparency Initiative. There are a number of things in the apparel
sector, the Fair Labor
Association, the Ethical
Trading Initiative. So GNI is very much modeled on these types of initiatives
that emerged some time ago in other sectors.
In terms of the work that we do, the first thing that GNI did was to
develop, using these four stakeholder groups, a set of principles and guidelines
to guide company action when they are facing requests from governments that
might impact on the rights of users.
Those guidelines are based on international
human rights standards, and were actually developed very much alongside the
UN framework that Rachel was talking about. For a number of years we had someone
from Rachel's team working alongside us as an observer. So our principles
and guidelines very much mirror the kind of protect-and-respect framework.
The principles cover
very broad questions around how companies can really operationalize, in
the way in which they do business, thinking through the implications for
freedom of expression and privacy. So that's things like the technology that
companies are developing, the products and services that they are selling, and the
markets that they operate in.
So how, as a way of doing business, can companies think about those things?
The principles then look at some very specific recommendations around how
to deal with requests from governments about censoring information, or looking
at handing user information over to governments.
Alongside the principles and the guidelines, organizations that join GNI—and
we're a membership organization—all commit to really support
and advocate for the principles.
Companies take on an additional step, which is really to implement the principles
and the guidelines within their organizations.
And then there are a number of other activities that we do.
Firstly, we have a process of independent assessment on the way in which companies
are implementing the principles. So we want to be able to establish credibility,
both for the work of GNI, and also for the actions that companies are taking
in living up to that public commitment.
And then, what we are finding now as we develop, is that we are able
to use the principles and the guidelines as a basis for policy work and policy
engagement with governments, and also as a way of providing an opportunity
for companies to work with human rights groups, investors, and academics in
a kind of safe space to work on the issues, because I think there are very new
issues that are developing very quickly.
There are so many examples in the last 12 months of what has happened, from
issues in the Middle East and North Africa to other places—there are many,
many companies facing many issues. So we really try to provide a space for organizations
to come together and to work through the issues.
JULIA KENNEDY: Let's dive into the cases portion. First up we have Ebele Okobi-Harris, who has a wonderful résumé.
She's got a J.D. from Columbia Law School here in New York and an MBA from HEC
Paris [École des Hautes Études Commerciales de Paris], which was
ranked by the Financial Times as the number one business school in Europe.
So she has this diverse career in law, consulting, and now at Yahoo!, and has
worked both in the private and NGO sectors.
But the single thread throughout that career, as she said, is to do work with
impact. I learned about that through a compelling BusinessWeek
online column that she has written.
At Yahoo! she combines this legal, business, and do-good impulse as director
of the business and human rights team. Before we jump right into the cases, why don't I ask you what it's like to be
working in human rights at Yahoo!.
EBELE OKOBI-HARRIS: It's unique. Yahoo!'s
Business and Human Rights Program is the only program of its kind, certainly
in the information and communications technology sector and, I would argue, in a
lot of other companies.
I think for my part—in doing a lot of CSR
[corporate social responsibility] work, you often have to make the case of why
it's necessary. I feel that I'm lucky, because the case was made before I was
hired. I think Yahoo! learned a lot of very difficult lessons in some of our
adventures in other countries. So I feel that I'm quite lucky because I work
for a company that takes it incredibly seriously, from senior leadership on down.
I think the other thing that I would note is that we are in a position to actually
help the company make strategic decisions. I think sometimes CSR at companies
can be about marketing, which is all fine, or about PR. I think at our company
it's very specifically placed within legal to serve as an opportunity to have
a direct impact on business decisions before and as they're being made.
So I think it's a privilege and it's a unique opportunity, and it's the best
job I've ever had.
JULIA KENNEDY: Let's talk about a specific dilemma that you've dealt with
in your time at Yahoo!. Tell me about a time that Yahoo! made a judgment call
about content posted to Flickr, its photo-sharing site.
EBELE OKOBI-HARRIS: I'm going to
try to give you a basic landscape.
I was struck by something that Rachel Davis said when she talked about
the tension between national law and international human rights standards. I
think this makes these cases very different—and you'll hear this from my counterpart
at YouTube as well—from classic CSR problems, where you're talking about differing
standards, for example, like child labor, or if you're talking about the extractive
industry.
In this case, you're talking about national laws that have clear sanctions.
So it's not a matter of, do you meet a minimum standard that's different in,
say, Thailand, than it is in the United States, when we're talking about child
labor? You're talking about a law. So that's one.
And two, if we're talking about privacy and free expression, there is absolutely
no country in the world that allows for a completely uncensored Internet. So,
a lot of times, when we're thinking about this as a problem, they say, "Well,
companies, you should never censor."
First of all, if it's a law in a country, you are bound to follow that law.
Two, no company is in a position where they can't censor, because every single
country in the world has a censorship regime of some sort. The joke that I always
make is that censorship, to people, is always what they don't agree with.
So an American would say, "Of course child pornography shouldn't be online."
We would all agree with that. That's something that you can't even argue with.
But that in itself is a government saying, "This we do not want."
The last centering point I would make, is that there is also no government in
the world that has a sophisticated police force that is not interested in solving
crimes through getting user data. So, we focus on China, we focus on the Middle
East.
But every single country in the world is interested. And if you look at
Google's map of government requests, you'll actually see that the country that
sends out the most is the United States. That's just something that grounds
this in the reality of what companies actually face.
Having said that, there are a lot of issues that I've dealt with in my role.
I think one of the most interesting—given everything that's going on in
the Middle East—is the issue with Yahoo!'s property called Flickr. It's
a photo-sharing site.
As you all know, Egypt was one of the first countries of the Arab
Spring. Right now, we feel a huge amount of enthusiasm for what the people
have managed to do. There are a lot of articles about how people were using
social media
to basically change the world—not that social media did it by itself,
but people were using it as a tool.
Flickr, as I said, is a photo-sharing
site. There is a very well-known activist in Egypt. He goes by the name 3arabawy,
but his name is el-Hamalawy. He has been a long-time
Flickr user and he uses Flickr to post a variety of photographs that he has
taken.
He also is a member of a Flickr group called Piggipedia. Piggipedia
is a group that was started years ago to identify members of Egypt's secret
police force. It was done because they wanted to make sure that these people
were brought to justice.
In the middle of the uprisings in February—this was the time when Egyptians
stormed the secret security headquarters, and they ransacked it, and they were
able to get disks—El-Hamalawy took the disks. And he posted images from
the disks onto his site. And below that he had a caption that
said: "These are secret police people. They must be brought to justice.
This is who they are," et cetera.
JULIA KENNEDY: We're in February of 2011, beginning of
the Arab Spring. You're at Flickr. You see this post. How do you handle it?
Take a look at these pink Flickr
Community Guidelines [audience was handed an excerpt from the Guidelines]. See if that
gives you any insight on how to handle this post.
What first comes to mind when you hear that this photo has been posted? What are
some of the questions you want to ask about it, or what is something you would want
to address once attention has been drawn to this photo?
PARTICIPANT: It sounds a little like Law and Order ripped from
the headlines, because they're more or less saying "You better do something
to these people." It definitely conflicts with "do play nice" [part of Flickr guidelines]. That
strikes me right away as what you're dealing with. I'd say you'd have to take
it down.
JULIA KENNEDY: What do others think? Anyone want to defend leaving it up?
PARTICIPANT: Was that decision made?
JULIA KENNEDY: We're playing it out. We'll let you know.
PARTICIPANT: Not necessarily to defend it, but the guidelines seem a little vague,
and judgment calls need to be made. So you could argue either way, whether you
take it off or leave it up there, that you're adhering to or not adhering to
some of the community guidelines. It feels like maybe they need to be defined
a little more strictly for any real decision to be made based on the guidelines.
JULIA KENNEDY: Any other thoughts?
PARTICIPANT: There's clearly a theme with Flickr: you didn't take the
photos. If they're not your photos, you mustn't upload them. So that seems
like a large part of it. There's no misunderstanding that he knew they
were not his photos. And he thought it was appropriate to use the Flickr format
to promote photos that were definitely taken by someone else.
JULIA KENNEDY: That is a key element in the decision-making process
at Yahoo!. Do you want to take us to the next step in this, and tell us what you were considering?
EBELE OKOBI-HARRIS: I'll start by saying that one of the reasons I picked
this is because this is a workshop. And I wanted to work through some things
that worked, and some things that maybe didn't work so well.
We have a Business and Human Rights Program. And
we are supposed to be contacted whenever there are these types of issues. To
your point about vagueness, the Business and Human Rights Program has always
been focused on government requests for data.
So when Flickr saw this, they thought, This is not a government request, because
it came in through a Flickr community member. Actually more than one clicked
the "abuse" button and said, "This is wrong. These should not
be up here."
So the Flickr community managers didn't escalate it because
they thought, "Not a government request."
They went through the entire decision process without telling me. I actually
found out through two different GNI members—which shows the value of being in a multi-stakeholder initiative.
Because people, nonprofits, instead of immediately going out to the public
and saying, "Oh my god, Yahoo! is a horrible company"—although
some people did say that, and that's fair—instead of doing that, they came
to me directly and said, "You know what? I know that there's someone here
who focuses on this. What happened?"
So the decision that was made by Flickr is that: Look, there are a number of
issues with this photo. Number one, it's basically threatening someone. And
also, it's not their own photograph. So the decision was made to take it down.
You pointed out one of the things to think about. The community guidelines are
vague. One of the things I would posit: there are hundreds of millions of pictures
posted on Flickr. To figure out a guideline that would address every single
layer of ingenuity of someone who's interested in posting pictures would be
completely impossible.
So one of the reasons that the guidelines are deliberately
left open, is because it's impossible to cover everything.
And the more detailed you are, the more someone can argue and say, "Ha
ha, you have not covered this; that means that it is allowed."
So there
is an aspect of judgment call being made.
Anyway, to answer your question, the decision was made to take it down. But
the public reason given was because those pictures were not his. And not only
were they not his, he explicitly said, "Hey, I stole these—"
The caption said, "I took these from the security office which we have
just ransacked. These are not my pictures. Here they are. Bring this person
to justice."
PARTICIPANT: Based on what you said, do you feel that these guidelines
could be, or have they been, strengthened to get to a higher level of principle?
Because when I look at them, I can see how they would apply to my Saturday soccer
team, and how I shouldn't put a nasty comment on someone else's soccer
team. But it feels like they could be up-leveled to a point where they speak
to some of these principles. Are you working on it, or have you learned from
those experiences?
EBELE OKOBI-HARRIS: That's an interesting question. Yes, I think we've
definitely learned from the experience.
I will say this. And this is I think what gets to the tension. Flickr wasn't
necessarily created for this particular instance. In fact, the majority of people
who use Flickr use it to do stuff like post their grandkids' pictures, and that sort of thing.
I think you point out one of the tensions, and one of the opportunities for a
company, when you realize how people are using your product. So, to your point,
the guidelines are written like this, because that's what the vast majority
of people use Flickr for.
But now there is certainly an opportunity for us to
think about addressing a different audience.
It is for this reason that we are having, at the end of October, a summit where
we're bringing together activists—we're also bringing together people from
YouTube—to talk about this specific issue.
So how can companies figure
out user guidelines that are flexible enough to cover how most people are using
these tools but are also specific enough in some cases where they need to be?
I think YouTube has done an interesting job with some of their videos, and you'll
hear Abbi talk about it. But how do we create ones that are specific enough
to cover very specific instances that do not represent how most people are using
their property?
PARTICIPANT: I actually just wanted to expand, and give you perhaps a
further hypothetical.
You say that the public reason for removing the pictures was because he made
it very clear they were not his.
If, in this case, let's say he got a file, had the names of the secret police,
and managed to go track them down himself, and take his own pictures of the secret
police. And still put the same message—"These are secret police; I've
identified them; they need to be brought to justice." You could claim
that he's putting those people directly in harm's way.
Would that have been sufficient? Or was it only the fact that the pictures were not his
that allowed you to remove them?
JULIA KENNEDY: That's a great question. Why don't we have people discuss it.
PARTICIPANT: I would refer to "don't be creepy" [part of Flickr guidelines]. That could be
the type of thing where—"creepy" is probably a term that is used
to refer to other behavior. And maybe that's again intentionally vague for the
community to report on what's creepy.
But where YouTube is clearer about stalking
and so forth, "creepy" encompasses a lot of things—threats, et cetera.
JULIA KENNEDY: He might say, "Well, this is free expression. I am a
political activist and this is free expression." I don't know, but that
might be a response.
PARTICIPANT: I think one of the things we're talking about here, and
haven't brought up on these Flickr community guidelines issue, are values.
Most
companies, if not all, have a value statement and mission statement. Possibly—these
are just dos and don'ts—but a level up from that is what is the overall
values and mission of Flickr within Yahoo!
I don't know Yahoo!'s mission statement, the way everyone knows Google's,
which is "do no harm."
But if there was a values or a mission statement
that was more specific, like "create positive change" or something
like that—if we're challenging our companies to create a difference in
the world and then putting that in our values or mission statements for those
businesses—then we can also argue that the impacts of these are both positive
and negative.
It doesn't make it any easier to solve this, but in
many respects I consider these guidelines just very legalistic. What to do/what not to do is
not the same thing as looking at your business, seeing the impact it makes in
the world, and saying, "What are our values and how do these actions live
up to these values or not?"
JULIA KENNEDY: What we have posted here is just a short excerpt of the material you'll find
on Flickr having to do with values and mission statement. So there's a lot more
on the website. I just wanted to hand out something short for the case.
EBELE OKOBI-HARRIS: I thought it was interesting you said
that Google's thing is, "Do no harm," because that actually isn't it. [It's "Don't be evil."]
But having said that, to the point, I think there
are a lot of overall mission statements. And that's great. But
that does not help you with this specific instance.
So, for example, if we talk about wanting to be a platform where people can come,
engage, and share information, that does not help you with, "Is it okay
for someone to post a picture that explicitly intimidates or harasses another
person?"
Because you could make a human rights argument either way. So,
for example, what if someone saw that and said, "Ha ha, I know who this
person is, and I will go and kill that person"? Are you as a company responsible for that behavior? Should you be responsible
for that behavior? Even if that person has done all the horrible things this
person has said, where do you stand as a company?
I say that, only to say that it is wonderful to have larger overarching statements.
But what we're talking about here—which is critical for all companies to
think about—is when you get in the weeds, what guides your decision-making
when you're there? And understanding the overall impact of what those
decisions could be.
So if we left that up and said, "Okay, we don't think there's anything wrong with it," and something happened,
would we then as a company be responsible for it? I think this
is what makes it really complicated decision-making.
PARTICIPANT: I think that's a great point. Because one of the things
I know from working in tech for a long time, is that there's always going to
be a new technology that you didn't anticipate. Government, law, and regulation, or even standards, cannot keep up with that. Technology
is going to move faster than government.
So I think one of the really interesting things that you're raising is that
you are inviting a dialogue about this, because there are many places to post photos on the Internet. So
you could put them on Flickr, you could put them on any number of sites.
The conversation that you're inviting is about what is appropriate use overall
of media. And how do we as a global community have a conversation about this,
because it will happen, whether it's on Flickr or somewhere else.
PARTICIPANT: I have a question about the overall process, in terms of
how you manage these types of things, and how quickly you take them down. Is
there a process in which you can take them down while you're reviewing them? These things are so viral that if a photo is up there for 30 minutes,
it can really reach a large audience.
If it could cause harm, how do you react to that? Do you take it down and say,
"This is under review"? Or do you take the time to review it before
making that decision?
EBELE OKOBI-HARRIS: I'll start by saying, every different property at
Yahoo! has different people who are responsible for taking it down. So Flickr's
capabilities are different from, say, Yahoo! Front Page.
Flickr does have a way of hiding content without taking it down. Also, as you
can see from the guidelines, there are different levels of takedown. For example,
there are some things where if you put that up, they will close your account
immediately.
And there's absolutely no right of appeal. There are some things
they might hide, contact the person who put it up, and say, "Hey, look,
your account very broadly is violating community guidelines. You may want to
look at it and take something down."
So it depends on exactly what the
problem is with the account.
But the short answer to your question is yes, there are different ways of doing
account or issue takedowns.
PARTICIPANT: I wonder if you could describe, from a strategic perspective,
how the decisions are made—fundamental decisions on how the platform is
built.
So for instance, do you ever discuss at Yahoo! "What type of platform do
we want to be? Do we want to be the most ethical? Do we want to be the most
open? Do we want to be the biggest?"
Those very fundamental tradeoffs that really are at the heart of the matter—then
what you see down the road in a way are the result of those decisions.
But if
they aren't made intentionally early on, it's hard then to fix it when it happens.
So, can you describe the process through which you and your group engage in
those conversations?
EBELE OKOBI-HARRIS: In terms of what Yahoo! wants to be as a company, those are conversations that
are ongoing. Not just from a business
and human rights perspective, but just from a company perspective. I think certainly
business and human rights should inform that thinking.
But it's not the only question. There are a lot of other questions that go into
a company deciding that.
Having said that, I think one of the reasons Yahoo! created this department,
is because they are very much interested in getting input from a human rights
model, like, "What should we be thinking about? So as we're thinking about
who we are as a company, what are human rights issues we should be taking into
consideration?" That's one of the reasons this department was
created, so that business decisions could be informed with human rights issues.
SUSAN MORGAN: Ebele, you said that one of the issues was that this didn't
actually get escalated as you would hope it would have. That's something we
see very often in these kinds of situations.
It's really important that a sufficiently
senior level of decision-making authority comes in early enough to work out
what the problem is and to work out what your public response will be when you
get asked. Would this be dealt with differently now?
EBELE OKOBI-HARRIS: Yes. Certainly it would be escalated. As I said before,
the Business and Human Rights Program was understood as, "Oh yes,
contact them if we get a government request." So in this case, like I said previously, people
said, "Oh, not a government request, so this is not an issue."
Now the understanding of what should get escalated is much broader. So people
realize that anything that might have a potential human rights impact, that
might have an impact on freedom of expression, we should escalate that. I think
that would certainly be different. So that would be escalated. It would be discussed
at a much higher level than at the community manager level.
JULIA KENNEDY: I'm curious. You had mentioned there are issues of
internal implementation, as well as external implementation. So, say, you are
at Yahoo!—and I'm going to turn this to you next—but, first, for the
participants. Say you decide, Okay, this photo needs to come down off of Flickr,
and that decision has been made.
Then, in terms of implementing it, you get pushback from an employee. And you
get an email from a user, maybe another activist, saying, "Hey, this photo
is important, it's part of free expression"—then how do you respond
to that challenge? I guess that is the question.
PARTICIPANT: I guess I'm wondering if—rather than trying to fit all
the different objectives of the different people who are using file sharing
and photo sharing—maybe the answer will be:
"Look, it just doesn't
fit on Flickr. But you know what? Six weeks from now, we're going to be rolling out a new product
where you can share photographs, which will have a different set of guidelines
and a different set of objectives. Just be patient. This is where we're headed."
Could that be a possible answer?
JULIA KENNEDY: So maybe a new different type of community.
PARTICIPANT: I want to echo what an earlier questioner asked. If I understood
what you were saying, it sounded like you find it difficult to have principles-based
lawmaking as opposed to rules-based lawmaking.
EBELE OKOBI-HARRIS: No, no, no, no.
PARTICIPANT: These guidelines are very specific rules-based. We're trying
to understand. We would have to read this many times to understand the culture
behind Yahoo!, and what you're trying to achieve. Whereas if you had a principles-based
rule-making, it would be much clearer for people, much easier for you.
Perhaps with new products, if you had a larger ethical umbrella under which
everyone knew how your company operated?
EBELE OKOBI-HARRIS: I just want to address that really quickly. When I was
talking about the rules, I'm not saying we don't have principles by which Yahoo!
is operating. I'm saying—for purposes certainly of this conversation—if
people need to understand exactly what can I post, and what can I not post,
they actually need to refer to rules.
And I was also saying that even if you have principles, it still doesn't answer
the basic question. So, for example, if your principle is, Yahoo! says "We're
the premier digital media organization, and Flickr
is all about people being able to share information and share photos with each
other"— that doesn't answer the specific question. There are always
going to be new and emerging issues.
It doesn't answer the specific question. The joke I always make is—I don't
know how many of you are familiar with Monty Python—"Nobody expects
the Spanish Inquisition." Nobody expected the Egyptian revolution. You
can't anticipate every single potential wrinkle that is going to come.
Even if you say, "Yes, we want to be an open platform"—open for
what, open to whom? That's why I started by saying there is absolutely no way
that a company can have a completely open platform.
So yes, I think it's very important to always be referring back to what your
principles are, which are that we want as many people as possible to use our
platform. We want people to be able to use it to photo share.
The other point I would make, is that what Flickr started as is actually very
different from how this user wanted to use it. That's one of the reasons why
we want to continue to have this conversation, because Flickr is actually not
supposed to be a place that you post pictures that someone else took, so that
that person can then be arrested. Flickr is supposed to be a place that you
can post pictures that you find pretty or engaging and whatever, and people
comment on them. It's supposed to be about people who are photographers, coming
together to share things.
Now, that's not to say that Flickr can't emerge or evolve. But Flickr's original
founding principle is actually completely contrary to how human rights activists
want to use it. Now again, that doesn't mean that we can't evolve. And that's
part of the reason why we want to have these conversations. But this is actually
exactly the opposite of what Flickr meant to be and what Flickr was founded
to be.
So to your point about principles, there were principles. And this was actually
against them.
PARTICIPANT: It would be much easier to have those principles in a place
where you could read them and understand them.
EBELE OKOBI-HARRIS: They are up there.
JULIA KENNEDY: They're up there. This is not everything on Flickr. I've
gotten requests for more information and I've just been responding because there
is so much out there.
PARTICIPANT: One of the interesting elements of this example is the
question of who the request was coming from—whether it was coming from an
individual or a government.
To pick on the Egypt case—but it's quite a relevant one since the Arab
Spring, and perhaps will continue to be so—who is government and who is
the public?
The person who posted the objectionable images a few weeks from now might be
a member of government, and could make a claim that these pictures are the property
of the state, or the property of the people, et cetera.
I'm not arguing that this would actually change the decision. It's more of a
question of how did those judgment calls get made. I guess it's again going
back to the operating principles—not the values principle, but the operating
principles—where you at the end of the day still refer to the guidelines.
EBELE OKOBI-HARRIS: I don't think I totally understand.
PARTICIPANT: In part, I found it an interesting scenario in that the request
might have been coming from an individual. Or it might have been coming from
the government. But because the situation was so hazy, those lines are quite
blurred and might continue to be so, as uprisings and revolutions continue throughout
the world. That's just an aside.
But I guess, perhaps beating a dead horse of values and guidelines—but,
indeed, because you can't anticipate every single scenario, what is in the end
the operating principle? Is it keep the site as open as possible as long as
these rules are obeyed? Is that what it basically boils down to?
EBELE OKOBI-HARRIS: No. That's why I'm saying that I think this, for us,
was a very interesting thing, because Flickr is not a photo-hosting site. Flickr
has been very explicit about saying, "We don't want every single photo
that you might want to post. We don't want photos that you haven't taken. We
don't want a whole bunch of pictures, photos that you have snagged from other
sites. Or because you're a huge Lady
Gaga fan and you've put together a collection."
Flickr's values or principles are that it wants to be a site where people who
really love photography can share their pictures, can get to be better photographers.
That's what Flickr's principles are.
It's not, "We want it to be as open as possible." That's what I go
back to. So yes, that's the overarching principle. But we're realizing that
people are trying to use Flickr in different ways.
But it's a company decision—and it's separate from human rights—to
say: Well, do you want to be a place where people post any photo at all? Or,
do you really want to focus on being a site for people who are photographers,
who are really interested in getting to be better? Or for people who can share
comments like "Look, I took a picture of my grandchild"? What do you
want to be?
So again, this is the principle, and the principle I think was actually quite
clear to Flickr users. But I think it's an opportunity for us as a company to
figure out if people are using it in a different way. And, if there's a human
rights impact, how can we have a conversation with human rights activists, and
see how we can use it differently? That goes back to the question here.
So that was one of their suggestions. And I think YouTube—and Abbi will talk about it—has done a really interesting job of saying
that there are different rules, or they will review images or video that have
a human rights aim differently. I think that there are a lot of questions that
go along with that.
One of the questions that I asked, when that was recommended, is: For us, how
do you define a media site? How does that get defined? So, if Flickr says, "We
are not a media site," is that really different just for human
rights activists?
If that is the case, then how do we define who is a human rights activist? Anyway,
there are a lot of really interesting questions which I think Abbi will get
into, because I think YouTube has done interesting work figuring out what that
means.
But for us at Yahoo!, we had not reached that point, because the decision was
made on two bases. One, that the photos weren't his, and Flickr is quite clear
about not posting what isn't yours. And two, that the actual caption on them was threatening.
JULIA KENNEDY: I'm going to fast-forward a little bit because we do have
a lot of great stuff to talk about from YouTube.
I do just want to cap the story by saying that Yahoo! took down the photo. But
they've allowed him to continue posting other photos. So, if you go onto Flickr
and look up el-Hamalawy, you can see he's still posting actively on the site.
EBELE OKOBI-HARRIS: And Piggipedia is still up.
I guess the only thing I would say, is that this is an ongoing conversation
at Yahoo! and at Flickr, and that through GNI, but also separately, we have invited a further conversation
with human rights activists. So we're sitting down, having a workshop, and saying,
"Specifically, how can this work? What are the details of it?"
Because, yes, we can talk about overarching principles. But how does it work at
Flickr? And what is it that human rights activists are looking for? And how
can we evolve and engage and respond to those specific requests?
JULIA KENNEDY: Thank you so much. That was very insightful and very helpful.
Let's move on now to talk about YouTube. Abbi Tatton is here. She spent a decade
reporting on the Internet and politics for CNN. So she has been thinking about
these issues for quite a while. She joined YouTube as Manager of Global Communications
in 2010.
She has brought along two very compelling videos to share with us. I just want
to give the warning to our webcast viewers, as well as you here in the room.
Some viewers may find the content disturbing in these videos. If you prefer,
we'll let you know ahead of time and you can cover your eyes, leave the room,
or whatever. But they're also very compelling, so we thought it was important
to include them.
Before we jump into the cases, Abbi, why don't you tell me, also, what it's
like to be working in these issues of human rights and communications at YouTube,
and how the challenges you face are unique or more universal?
ABBI TATTON: Well, I think the first thing to start with, with YouTube, is
scale. And to also note—just listening to Ebele talking about this—that it's
not unique. We are all facing some similar and very new challenges.
YouTube is six years old. It was started with the motto "Broadcast Yourself."
I don't think anyone could have anticipated how much everybody would. We're
now at 48 hours of content uploaded every single minute. So by the time we're
done here, thousands and thousands more hours of content that none of us could
possibly sit through and watch will be on the site, which gets three billion views a
day.
So with the motto "Broadcast Yourself," this is a platform for free
expression, giving everybody a voice. And we believe that that's a great benefit
to society, access to more information.
But, at the same time, what we're trying to balance is a platform for free expression,
which necessarily has to have rules. And what those rules should be. And how
we strike the balance between being this place where everyone has a voice, a
place for the free exchange of ideas, and a platform that's safe for users as
well.
Sometimes this is really simple. If content is illegal—if content, for
example, contains child pornography—we will remove it. Again, as Ebele was saying,
all around the world, jurisdictions might have slightly different interpretations
of what that illegal content is. We don't. We have a zero-tolerance policy on
that. We think that's pretty simple to enforce and to implement.
On other things, as we'll look in a couple of case studies, it's really not
that easy. You'll see our community guidelines. Again, we've spelled them out.
I heard the comment that it seems vague. It seems vague from Flickr as well.
But, just as we were hearing already, when you have this much content, how can
you write every single policy to come up with every new challenge that we couldn't
possibly anticipate yet?
I think looking at a couple of videos will illustrate this, just to say that
these are tough calls. We review video carefully. And we scratch our heads,
sit around, debate, and talk about it. I'm not saying we always make the right
decision, but we look at this content very carefully.
JULIA KENNEDY: Great. So let's look at the first one. This is an excerpt from the first video. We're going to ask you to consider, if
you could put yourself in the role of a reviewer at YouTube, whether
you think this should stay up or not. As you all know, the guidelines are published.
[Video presentation: excerpt from "How to Make an Exit Bag." Warning: this is a very disturbing video about how to commit suicide.]
ABBI TATTON: This is about four and a half minutes long. I'll stop it
there.
JULIA KENNEDY: Put yourself in the place of a reviewer. What stands out to
you about this video, and does it raise any concerns for you dealing with human
rights?
PARTICIPANT: I don't see why a company is required to post that. I'm not
saying that it can't be on the Internet. But it would be up to you whether you
want to give people ways in which to kill themselves. I don't see where you'd
be violating anyone's right of free expression as a company.
ABBI TATTON: By taking it down?
PARTICIPANT: Yes.
JULIA KENNEDY: Obviously this is very disturbing content, and these are questions
to mull over. But I'm wondering if anyone has any other perspectives on it.
PARTICIPANT: I think those are issues that—there are professionals
who deal with issues like that all the time who are far better—they've
thought about this in a way that maybe we don't always consider. Doctors have
principles of "first do no harm." I'm wondering if there might not
be some guidance among such professionals on how to even think about an issue
like that.
JULIA KENNEDY: And so how would you say that should inform the reviewer at
YouTube when considering whether—
PARTICIPANT: In other words, you wouldn't necessarily want a reviewer
to make a decision purely based on a gut feeling. But, if there was a set of
issues that had been debated by professionals, like doctors. Or people who are
dealing with problems that the aged have, when they're facing a short lifespan
with tremendous pain, and there's no hope for improvement. There are such complex
issues around that—
JULIA KENNEDY: So, you would call in the support of, say, a medical ethics
board or something, and call in a doctor to see what they think?
PARTICIPANT: Yes. At least to create principles around it.
JULIA KENNEDY: Okay, great.
PARTICIPANT: I think, number one, this is a piece of a video. We don't
have the full story. I think it's important that this information is available.
I'm a health professional myself, so I can speak on that side of things. Whether
it's appropriate that it be available on YouTube, rather than—as the gentleman
down here was saying—somewhere else where there is a professional grounding
to that information, I don't know.
But this is open information and it can be found from many, many locations.
PARTICIPANT: I guess there are absolute answers, and nuanced answers.
In this case, it strikes me—just as a point of view—that there has
to be a nuanced answer. This could be seen as public health information. There
are many ways to look at it.
I think the issue is that the platform is an open platform. That you don't pay
to view this. So the access is so easy, so immediate, and so global that, in
a way, it is a company's responsibility to decide either in absolute terms, "This
is a yes or no."
And, once you have made that decision, if it's a yes, then to decide how it
will be further filtered. So that it's either a separate community, for example.
And then, even within that community—imagine that this could be a medical
library, for instance—you would still choose which documents are worth
being showcased in that library. So there are a number of criteria.
I guess the complexity for you, I'm sure, is volume, just sheer abundance of
information.
So I wonder if you could describe how do you deal with these nuances. And is
it now leading you to reconsider the openness of this platform, and the free
access to the information? What is the debate within your company now?
ABBI TATTON: I think the first point I'd want to make is what you brought
up. Which is—we just saw a snippet—there was no greater context. If
this was in a medical library, there might be more supporting information. There
might be more information around it; there might be a medical professional introducing
it.
This is YouTube. There isn't. There is a video. There is some of what we call metadata,
which is a horrible term, which just means the tags around the video that show
up when you search. So the video and that metadata are all a reviewer would see when they were
looking at this.
That's the same thing as a user sees. That user could be a 15-year-old who just
searched on the word "suicide," and this video came up. Other tags
on this video: "how to," "helium," "exit bag,"
"suicide," "euthanasia"—all these things. So that is
what the reviewer is looking at and taking into account.
The reviewer is also looking at our policies and applying every video according
to our policies. When it comes to this—we have a lot of different policies
for everything from spam to nudity. But when it comes to this one, you'd be
looking at a policy which is promoting a dangerous act that has an inherent
risk of physical harm or death, which would come down.
You're also looking at a policy we have, around acts that could be copied by
minors, because we have so many young users on the site. So, those are a couple
of things.
Shall we go on and describe what happened?
JULIA KENNEDY: Yes. It looks like we have a couple more comments before we
do that, if that's okay.
PARTICIPANT: I think it all comes down to the bottom line of a policy judgment
call based on ethics, because legally the company has no obligation to respect
your right of free expression. That right exists only vis-à-vis the government.
Companies can do that.
And then I would think that the company is protected by the statute
47 U.S.C. 230, which means the provider is shielded from liability for publishing these
things. So, legally you can. It all comes down to a policy judgment based on
ethics.
JULIA KENNEDY: I think we need to move on, but I'd love to take your comments
in the next section.
So what steps did YouTube take?
ABBI TATTON: This one was hotly debated. There was no clear answer for any
of us on this because, like you said, there's no legal reason that we had to
take it down. At the same time, we have young users.
What we ended up looking at and balancing was our principle of access to information,
leaving up as much information as possible. At the same time looking at—we
have this "harmful and dangerous" policy, which is a video that "promotes
doing something harmful and dangerous."
Was that video actually promoting it? That's something we took into account.
It wasn't a cult going out telling you to commit
suicide. It was giving you information should you be looking for that.
Another thing we were looking at is the fact that this is a highly debated political issue,
and what we would be doing if we were taking it down from YouTube.
However, what we did do is put in some extra safeguards.
We have a policy where we can what we call, "age gate" a video. That
means you can only access it if you are a logged-in user over the age of 18.
So that immediately reduces the amount of young people that could quickly access
that video.
Another thing we've done around the issue of suicide—because we have so
many young users—is that searches on "suicide"-related terms
will bring up, basically, a house ad for the National Suicide Prevention Hotline.
So it takes you to other information.
So yes, you will find this, and you can get to it if you're over 18. But the first
thing you'll see is "call this number if you're looking for help."
Another thing we've done around the issue of suicide—because we have so
Is that a perfect solution? I don't know. But that's where we arrived at, given
that we are trying to give people access to information, and keep up as much
as we possibly can.
JULIA KENNEDY: One more comment, and then we'll move on to the final case.
PARTICIPANT: Just a question in a situation like this. Obviously—and
this is for Flickr, too—how much user input or concern are you looking
at in terms of, for lack of a better term, community policing of a lot of the
content, given that you probably just can't keep up with how much content you
have?
ABBI TATTON: I wanted to take a step back just for a moment and look
at what does that look like on our end, with 48 hours of video uploaded a minute?
I'm not sitting there watching it, right? It doesn't work that way. We couldn't
possibly prescreen all the content that's going up on the site.
So we rely on our users. The reason we put up these community guidelines—so
everyone can see them—is because it's our users who are our first line
of defense.
Underneath every single video on the site is a flag button and a way to flag
that video. You can flag it for a number of different things. Whatever you flag
it for, it will go to a reviewer. A flag does not take down a video; it gives
it to a reviewer, so the reviewer can then take a look at it.
Dealing with scale, we use technologies to try and review things as quickly
as possible. For example, if you are a user, and you just flagged 30 Justin
Bieber videos, none of which should come down, maybe your 31st flag
won't be reviewed quite as quickly as someone else's flag coming in.
So there are algorithms like that that assess the flags coming in.
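[A rough sketch, in Python, of the kind of flag-triage logic just described, where a flagger's track record affects how quickly a new flag reaches a reviewer. This is a hypothetical illustration, not YouTube's actual algorithm: the accuracy smoothing, the priority-queue design, and the field names are all assumptions made for this example.]

```python
from dataclasses import dataclass, field
import heapq
import itertools

@dataclass
class Flagger:
    flags_submitted: int = 0
    flags_upheld: int = 0  # past flags that actually led to action

    @property
    def accuracy(self) -> float:
        # Laplace-smoothed so brand-new flaggers start near a neutral 0.5
        return (self.flags_upheld + 1) / (self.flags_submitted + 2)

@dataclass
class ReviewQueue:
    """Min-heap keyed on negative accuracy: reliable flaggers surface first."""
    _heap: list = field(default_factory=list)
    _tiebreak: itertools.count = field(default_factory=itertools.count)

    def submit_flag(self, video_id: str, flagger: Flagger, reason: str) -> None:
        # Every flag is eventually reviewed; accuracy only affects ordering.
        heapq.heappush(self._heap,
                       (-flagger.accuracy, next(self._tiebreak), video_id, reason))

    def next_for_review(self):
        if not self._heap:
            return None
        _, _, video_id, reason = heapq.heappop(self._heap)
        return video_id, reason

# A user whose last 30 flags were all rejected is deprioritized, not ignored.
queue = ReviewQueue()
serial_flagger = Flagger(flags_submitted=30, flags_upheld=0)
first_timer = Flagger()
queue.submit_flag("video_31", serial_flagger, "inappropriate")
queue.submit_flag("other_video", first_timer, "harmful and dangerous")
print(queue.next_for_review())  # other_video comes up for review first
```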
We have a team of reviewers who are working in different time zones around the
clock. So videos are being reviewed all the time, to the point of thousands
and thousands and thousands a day.
JULIA KENNEDY: Great. Thank you for those wonderful questions.
I also want to thank you for your bravery in sharing some of these case studies.
It's very helpful for us.
ABBI TATTON: I think three of them I suggested you wouldn't even show. You need
quite a thick skin to watch them.
JULIA KENNEDY: Yes. Speaking of which, the next one, I think, depending on your sensibility, might
be even a little bit harder to watch. With that warning, why don't we show
it?
ABBI TATTON: Some context for this video.
Again, if you are a reviewer, you don't know anything beyond the video you're
seeing, the tags below it, and any information that the user gave you. Sometimes a video won't be flagged until a couple of years after it was uploaded.
Or sometimes a video will be immediately talked about so much that it's being
flagged, and it's in everyone's mind.
Going back to the summer of 2009, in June, when we saw a flood of videos coming
in from Iran, this was the context. News organizations had been kicked out.
News organizations were setting up social media desks to look at the material
that they were finding online and assess it. So with that background—
[Video presentation: excerpt from "Iran, Tehran: wounded girl dying in front of camera, Her name was Neda." Warning: this is a very disturbing video.]
JULIA KENNEDY: The name of the young woman featured in the video is Neda.
She went into cardiac arrest and died during the 40 seconds of this video. The
question is: Is it a violation of Neda's privacy to leave this video up on YouTube?
Let's start there.
PARTICIPANT: Going back to the guidelines without any other context,
it looks like it's just showing somebody dying. So it's hard to say that it
has some other meaning beyond that. You've provided us the context now. But
that definitely is not clear from seeing the video. Perhaps given the particular
culture, it might not be appropriate to leave a video of somebody dying.
JULIA KENNEDY: Are there any signals of context that you can find? Say this
is the page that you are presented with, as a reviewer.
ABBI TATTON: Let me just point out a couple of things.
You have a title which is very straightforward. It's not alarmist in any way.
It's documenting a statement about the video. There was also extensive text
underneath this particular video describing what happened.
So, a video with absolutely no other context, showing someone dying—we have policies
against graphic violence, shocking content, all of this. With this additional
context, does that change your mind?
JULIA KENNEDY: The title is difficult to
read, but it does give context to the video.
PARTICIPANT: Obviously, from looking at your guidelines, it says, "Don't
post videos of dead bodies." I think I have seen this video, though, or
something like it, on the news.
So I wonder from the angle of this, how does the news affect what goes on YouTube,
too? Do you look at content that's posted on CNN, or other broadcast news stations,
and look at that, compared to what's on YouTube? And does that have any impact
on your policies?
ABBI TATTON: Again, going back to the fact that we have guidelines, and
the question of whether we should take this down. If this same video was shown elsewhere—say
we had taken it down and then CNN did a news report that was exactly the same
video with eight words of introduction, Wolf
Blitzer saying, "Now we're going to show you a disturbing video from
Iran," and then played it; would that change it?
And, if so, should that just be Wolf Blitzer at CNN? Who is the news? Who is
allowed? This person who shot this video, is he or she not just as important
a news documentarian as an established news network?
PARTICIPANT: Especially, given the rise of blogging, especially given
you have a situation in Iran where news reporters were kicked out—
ABBI TATTON: Four days before CNN's last—
PARTICIPANT: So, you wouldn't have a news reporter who is respected,
or understood to be one by the Western world, introducing this piece. Then,
is it de facto not news, because there is no one from the West witnessing it?
ABBI TATTON: Because no one's waving their credential around, right.
PARTICIPANT: Wasn't this on the cover of Time magazine as a still
photograph?
ABBI TATTON: Not on June 20, 2009, which is the point that the reviewer
was looking at the video, and deciding whether it should be up or not.
PARTICIPANT: Would it, then, have not been anywhere? In other words,
this was a big deal.
The minute I saw that I said, "That was the girl who became the symbol
of the revolution at that point."
ABBI TATTON: Right.
PARTICIPANT: So I don't know why—I mean, it was news from practically
the minute it happened.
ABBI TATTON: While we have policies on graphic content, on graphic violence,
and on shocking or disgusting content, I can tell you right now that if
that same video, with its full introduction of what the video was and what
happened, had instead carried the title "Wow, watch a girl die, exclamation point," it would have come down
immediately.
We're looking at the intent: what was the intent of this video? That's why we
recently did a blog post about context, asking our users to tell us more, to
help us out. That's why a video which shows a bloody body being beaten, with
no additional context, would be brought down immediately. When we know that
it is a human rights activist, and this is the only information that he has
to document what happened to him, it's different. And our users will see it differently.
What we did with this video is apply what we call the "educational-scientific-documentary-artistic"
policy. Sorry, that's
a mouthful.
But essentially, nudity comes down from YouTube; artistic nudity is a whole different
case.
Same with this. The documentary value of this video is so clear, it was basically
the turning point in this news story. It would be very difficult to take this
down.
Another challenge when we're talking about scale: search the word "Neda"
on YouTube today. We're not talking about this one video. There are 9,000. So
say we took down one. What's the policy then on the other 8,000-plus?
This video was uploaded. It was on Facebook, it was on Twitter. It wasn't just
on YouTube; it was all over the place. So this was a way that news spread, and
a person with a video camera became the news anchor, the news reporter, that
day.
So again, it is kept up. It's behind an age gate, so it has a warning interstitial.
Just as Julia told you this was going to be disturbing, YouTube tells
you that as well, and it's your choice to go on. But there are some difficult issues
certainly around that one.
PARTICIPANT: I think what is very powerful is the effect that YouTube,
Yahoo!, and other organizations are having on the world globally as a result
of documentary-value videos like this. That, I think, is at the heart of the
issue.
Basically what we're saying is that access to information is changing our
world. In the past, it was written communication online that was affecting our
world. But now the dramatic effect of the video image is going to have implications
that we cannot even foresee at the moment.
People who are potentially going through heart operations can now see on YouTube
exactly what they are going to go through. Perhaps they go into it
with full information, or are much better informed about whether they want to
go through the process at all, rather than simply trusting somebody.
I mean we are dealing in a world in which some of the tragedies of the past,
like the My
Lai massacre, the Pol
Pot massacre, the apartheid
issues, and the Nanjing
massacre—the world will know a lot more about all of these. I think it's
a big tribute to what YouTube and Yahoo! are doing in this regard, that you
are informing us much more quickly.
And we expect that you will not always exercise the best judgment in taking
things on or off. That's the price to be paid for the value that you are creating
for society. I would view it from that perspective and say leave it up because
these kind of things are changing our world.
ABBI TATTON: Just to that point, we do make mistakes. Just the sheer
volume means that that will naturally happen. There are human beings reviewing
videos. And sometimes they don't have access to all information. Or, sometimes,
something will come down for graphic content, where the person next to them
might have had a little bit more context, and understood it differently.
That's why we have an appeals process. So if your material was struck, you can
appeal that. It's not just our users. Every user can flag a video. But we also
hear about videos from journalists, from human rights organizations, from all
different kinds of people, alerting us to this material. Because, like I said,
we can't possibly prescreen everything that's going on up there.
PARTICIPANT: I just wanted to comment about the news media. Knowing what
the "good" news media has built up over decades,
in terms of editorial review, quality, and screening, I wonder if we are
at a turning point where we either need to accept that some of that has to go
out the window, because the tools now at our disposal are such that
you just can't build that into the process anymore.
Or—the answer is probably
somewhere in between—we are at a point where YouTube recognizes that you
are a news media organization. And that some of the practices that have been
recognized as best practices in news media organizations have to be much more
part of your core fabric.
Could you describe what's happening now in YouTube along these lines?
ABBI TATTON: I think we feel pretty strongly about this, even though we face some
of the challenges of what to show, what not to show, and what kind of warnings
to put up there.
We've been developing, like you said. The news media has had decades to do this,
and we are developing in a very short time frame.
At the same time, we are not a news media organization. We did not vet this
video. How could we possibly, every time a video is uploaded? When I used to
work at CNN, before you put any social media video on the air, you would do
the reporting. Just because it came from YouTube didn't mean that your reporting
was any different than it would be for a person you were interviewing in an office.
There's no way, nor would we want to do that. We wouldn't want to ask people
questions—"Before you upload this, is that really you and your dad
going to the beach today?"
It's just not possible. So that's why it's great to see news organizations having
people within their structures and their news desks who are looking, vetting.
There's a group called Storyful
that's doing a great job gathering material from news scenes, and looking at
what's coming in. And then going the extra step, and saying, "Okay, but
is this real?" That's just not something that we are equipped to do, or
even want to be doing.
JULIA KENNEDY: This has been a raft of really interesting cases. I appreciate
all of your participation.
Before I let you go, I want to pose a question that any of our speakers can
weigh in on, which is: We've been talking a lot about privacy and free expression
and Internet communication technologies today. I'm curious how the types of
community guidelines and internal processes that Yahoo! and YouTube have put
in place can be applied to other types of businesses and other industries when
approaching these types of human rights dilemmas.
ABBI TATTON: I'd rather have advice from you guys. [Laughter]
EBELE OKOBI-HARRIS: This is something that we need too. I was talking
earlier about ICT [information and communications technology] companies.
I mean, if you're in oil and gas, you're used to seeing this—I think CSR
[corporate social responsibility] just has a much older tradition there. So within the ICT
sector this is just so new.
For us this is why being a part of GNI [the Global Network Initiative] was really important, and continually engaging
with NGOs is really important—because I think the culture within a lot
of ICT companies is very young.
But it's also a culture where people feel like, just by doing our jobs, we're
changing the world. Yet if you did things a
little bit differently, what you do could actually have a greater impact on human rights.
So I think I would echo Abbi by saying I very much enjoy the opportunity to
engage, and to hear from others.
PARTICIPANT: I think there are two areas where this has significant impact
for businesses.
One is the guidelines that have been introduced in many companies for whistle-blower
provisions: to protect the whistle-blowers, to protect their anonymity, and to ensure
that they are not penalized for reporting the malpractices that are taking place.
I think that has had a very good impact.
It was very interesting that in the Dodd-Frank
bill, which was passed recently, the whistle-blower provisions were strengthened.
Perhaps that's an indication that the media awareness that your organizations
have made possible has now had some business impact, in which we are saying,
"Let's protect these people, because they are serving a public interest."
Another area is the Foreign
Corrupt Practices Act, where the guidelines that companies have in force
are very detailed, to ensure that the provisions of that law are not violated.
Again, I think this is an extension of what we are trying to do.
PARTICIPANT: One, I think there's a tremendous amount to learn from ICT
in terms of pushing the boundaries of transparency. Many, many corporations
are being asked to be more transparent about their activities, and have not
caught up with the fact that people, in general, are sharing much more information
now. They're doing it at a rapid pace, and there is a sort of public citizenry
of media—which I think is really interesting—in some of the information
that you're presenting.
One case that we've worked on at BSR
[Business for Social Responsibility]—I just want to bring it up as an example
from another industry which might be of interest—involved a very large
company that's producing low-cost health-care equipment, which on its face is
a very good thing: bringing ultrasound equipment that's portable and low-cost
into communities, globally, that don't have access to health care.
There was an unintended consequence of that, which was that the ultrasound equipment
was being used for sex selection, in India and China in particular, for the feticide
of girl fetuses. It was an interesting conundrum, because do you pull the ultrasound
equipment from this community, therefore denying access to critical health care?
Or do you grapple with the fact that it has this unintended use, and that you—as
the company that produces it—are responsible for that?
I bring that up because I think in any of the industries that we work in, there
is a real need to think about the fact that I'm producing a product or a service,
and I'm producing it with this goal in mind. I'm focused, and I'm an entrepreneur,
or I'm an engineer, and I'm looking at it from this perspective. But there is a need
to open up the dialogue and think about what the potential unintended uses of that might
be.
We can never know, but always just try to have that dialogue with groups outside
of your own product group.
EBELE OKOBI-HARRIS: I think your point about unintended consequences
is such an important one. I just want to tell a very quick story also using
Flickr.
Flickr, as I said, is used by a lot of people. It was used by a man in Iran
who started off, as he says—if you look at his page in Flickr, it says
"Flickr made me the person I am today." Because he started off as
an engineer, he never was a photographer. He started using Flickr to take pictures.
People gave him comments.
He went from taking pictures of his kids—it's the classic problem, for me,
of the accidental activist—to taking wildlife pictures
in Iran. And then the Green Revolution happened. And, boom, all of a sudden he's a journalist.
So he's taking pictures of everything going on. As you heard before, news reporters
were kicked out. So his pictures start getting used by all these news organizations.
Now, he has a Flickr account that has his name on it, that has pictures of him,
that has pictures of his children. And there he is. He has now become a very
public person.
When his pictures were first used, he was very excited about it. He posted it
on his Flickr page. Although he was upset that Der Spiegel didn't give
him credit, that they had stolen his photograph—so there was that issue.
And, so, we connected with him, and we featured him at our Business and Human
Rights Summit. A year later he was arrested because of his pictures that were
posted on Flickr.
So when you talk about unintended consequences, there is that aspect—okay,
this is amazing, he was able to use Flickr not only as a tool to become a better
photographer. But he was also able to use it as a platform to create social
change. But then, because of Flickr, he was arrested.
So then, as a company—I just point that out because we would never have thought
of all these consequences. You just don't think—when you create
a product, you don't necessarily play out everything.
To your point, it has been incredibly important for us to continue to engage
with NGOs and with other people. Because, again,
you often don't know the extent to which, and how, your products are being used
in ways that could be harmful.
JULIA KENNEDY: I do want to give the final comments to Rachel and Susan,
who started us off today.
RACHEL DAVIS: My comments, very briefly, would be that I think that there
are some real similarities with the challenges we see in other sectors, in other
companies. But that, with a lot of what you've been talking about, there are
lessons on both sides.
So, policies are a common theme. But policies—and we take it as a given
that they are informed by company values—are never going to give you the answer in every
case. So the question is always: What processes are in place, and how
good are those processes at responding rapidly?
So back to the escalation question. If you're a mine, you know when there's
a community problem: they block your road and your operations are stopped. That's
an immediate, direct impact on the company, so you pay attention. For other companies
it's not so direct. So that's something where you see lessons being learned.
So what's important is having processes in place that are also capable of assessing
new risks—so they're ongoing processes—constantly scanning for the
human rights implications of your activities and making sure you're keeping
up with them, so that you can then explain the processes that you have in place.
If you're not immediately on top of an issue, you can say, "This is how
we are addressing it. And these are the types of considerations we take into
account."
I think as far as what other industries can learn from you guys, the stakeholder
engagement and the dialogue piece is really, really critical. You won't understand
the human rights implications of what you are doing unless you are talking to
the human rights activists, as you are discovering. So I think that's key.
SUSAN MORGAN: My reflections on this industry are that there are lots
of similarities around the intersection between business and human rights, in
this industry and others.
But there are some really big differences too. I think they are probably around
scale, just the sheer amount of photos and videos.
Also, speed: the speed at which things change. Who could have imagined,
at the beginning of the year, that you would find that issue with the photos
in Egypt? And just the speed of innovation, and the complexity of the issues.
It strikes me that the important thing is to find a way of dealing with uncertainty,
and find a way of institutionally being able to react to circumstances, processes,
and things that come up. Whether that's risk assessment, whether it's having
a team who are dedicated, whether it's making sure that other stakeholders are
involved, so that you've got those connections and those relationships—I
think it's finding ways of accepting and dealing with the fact you can't work
out every imaginable consequence before you start.
JULIA KENNEDY: Great. Thank you so much for those insights.