Facebook: Cracking the Code (2017)

In 2014, Facebook scientists
published the results
of one of the biggest
psychological experiments
ever conducted.
They took almost 700,000
Facebook profiles
and deliberately skewed
their news feeds
to be either more positive
or more negative.
Then, they used the company's
sophisticated algorithms
to measure any shift
in people's mood.
Their findings?
The more positive the feed,
the happier Facebook users
seemed to be.
The more negative,
the more depressed they became.
It proved the power of Facebook
to affect what we think
and how we feel.
Facebook has very cleverly
figured out
how to wrap itself
around our lives.
It's the family photo album.
It's your messaging
to your friends.
It's your daily diary.
It's your contact list.
It's all of these things
wrapped around your life.
This is the story of how
one of the world's biggest
and most powerful
private corporations
is turning our lives and
our data into vast profits,
and in ways we have
no control over.
They are arguably
the most successful company
in human history
at just gathering people's time
and turning that time
into money.
Like his company,
Facebook's founder
hardly needs introducing.
Mark Zuckerberg started
the social media platform
13 years ago when he was just 19
as a site for Harvard
undergraduate students.
When we first launched,
we were hoping
for maybe 400 to 500 people.
Harvard didn't have a Facebook
so that was the gap
that we were trying to fill
and now we're at 100,000 people
so who knows
where we are going next.
Within its first month,
more than half the students
had joined, setting the trend
for the membership explosion
that followed.
Now, almost a quarter
of the world's population
has signed on.
It is bigger than any country.
Facebook is now
a global colossus.
It is one of the world's
most valuable corporations,
worth over 400 billion dollars.
Mark Zuckerberg
is an international powerbroker
in his own right.
He's like a king, right?
He's like a monarch.
He's making decisions
about your life on Facebook,
what the rules are,
and he's a benevolent dictator.
You can't say this is
accountable governance
or participatory governance
in any particular way.
But almost two billion users
still isn't enough for Facebook.
Mark Zuckerberg is aiming
for the next billion.
There is a limit to how much
they can grow
in established markets like
North America and Australia.
But the 32-year-old businessman
sees huge potential
in the developing world.
There are a billion people
in India
who do not have access
to the internet yet
and if what you care about is
connecting everyone in the world
then you can't do that
when there are so many people
who don't even have access
to basic connectivity.
There's a term that's being used
by folks connected to
Facebook and Google
called the last billion
where they're basically trying
to figure out a way
to spread internet access,
but the internet
that they're going to spread
is an internet that's shaped by
Facebook and Facebook's agendas.
That's actually part
of the long game here.
Most people in a lot
of the developing world
are accessing the internet
through their mobile phones
and there are these programs
that are known as zero rating
or Facebook Zero
so when you get your smartphone,
you get free data
if you're using Facebook
and so people stay on Facebook.
They don't go anywhere else,
so that their whole world
on the internet
becomes very much the same
as, you know...
they don't know
any other kind of internet.
Facebook is a free service,
but that's because
Zuckerberg has learned
how to turn our data
into dollars. Lots of dollars.
Last year his company
earned 27 and a half billion US dollars,
just under 16 dollars
for each user,
and he's buying even more
internet real estate.
Clearly there's one topic
we have to start with.
You bought WhatsApp
for 19 billion dollars.
Why did you do it
and what does it mean?
You can look at
other messaging apps
that are out there, whether it's
Kakao or Line or WeChat
that are already monetizing
at a rate of two
to three dollars per person
with pretty early efforts
and I think that shows
if we do a pretty good job
at helping WhatsApp to grow
that is just going to be
a huge business.
Facebook has WhatsApp,
Facebook has Instagram.
Facebook has Oculus Rift,
not necessarily mainstream
but these are very big
corners of the internet.
Should we be concerned
about a monopoly?
We should always
be concerned about monopoly.
We should always be concerned
about concentration of power.
We should always be concerned
about that
and we need to hold their feet
to the fire at all times.
Facebook is all about
community,
what people all around the world
are coming together to do -
connecting with friends
and family,
informing these communities.
Facebook presents itself
as a digital platform,
a neutral stage upon which
life plays out.
2016:
we all went through it together
It says it is a company that
develops digital technology,
not social engineering.
For all the talk
about community,
Facebook is neither democratic
nor transparent.
Any place we go to
that is not truly open,
that's not governed by us
as users,
that's not governed by
some democratic accountability,
is actually a place
that is not truly ours.
It's a place that we can use,
it provides great value
in many ways,
don't get me wrong,
to its users.
But it's incorrect to see it
as a neutral place.
It can do things
like a government
and indeed it has inherited
some government-like functions,
but I don't think that passes
the smell test
to imagine that Facebook
or any online platform
is truly democratic,
they're not.
If we tell the computer
to look at two numbers
and compare them and put
the larger number on one side
and the smaller one
on the other, then,
with a series of steps,
we will be able to reorder it.
To understand
how Facebook works,
we need to understand
what goes on under the hood.
The engine that drives the
system is built on algorithms -
sets of instructions
that Facebook's engineers use
to determine what we see
in our News Feed.
Dr Suelette Dreyfus,
an information systems expert,
demonstrates how
a basic algorithm works.
Typically, an algorithm might be
for processing some data
or doing some arithmetic,
summing something for example,
or it might be
to try and recreate
the decision-making process
that we use in our human brain
on a more sophisticated level.
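In code, the reordering she describes looks something like this - a minimal Python sketch of a comparison-based sort (our illustration, not Facebook's code): repeatedly compare two neighbouring numbers and move the larger one to one side until the list is in order.

```python
# A minimal sketch of the algorithm described above: compare two
# numbers at a time, put the larger one on one side, and repeat
# until the whole list is reordered (a simple bubble sort).

def reorder(numbers):
    items = list(numbers)  # work on a copy
    for _ in range(len(items)):
        for i in range(len(items) - 1):
            # compare a pair; move the larger number to the right
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
    return items

print(reorder([5, 1, 4, 2, 3]))  # prints [1, 2, 3, 4, 5]
```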
Facebook's algorithms
were originally configured
to help Harvard University
students
stay in touch with one another.
They exploited the way
the students had
a small group of close friends,
and a wider,
looser social circle.
The algorithms are now
vastly more complex,
but exactly how they work
is a closely guarded
commercial secret.
We do know that they are
designed with one aim in mind -
to keep us online
for as long as possible.
The algorithms are designed
to be helpful
and give us information
that's relevant to us,
but don't for a minute
assume that
the algorithms are just there
to help us.
The algorithms are there to make
a profit for Facebook.
And that is Facebook's genius.
It is a giant agency
that uses its platform
to deliver us advertising.
By tracking what we do,
who we associate with,
what websites we look at,
Facebook is able to make
sophisticated judgements
about the stories we see,
but also advertising that
is likely to move us to spend.
We will probably
always live in a world
with old-fashioned display ads.
Times Square simply wouldn't be
the same without them.
But these ads nudge us
towards products
with all the subtlety
of a punch in the nose.
Facebook on the other hand uses
the extraordinary
amounts of data that it gathers
on each and every one of us
to help advertisers reach us
with precision that
we've never known before.
And it gives anybody
in the business of persuasion
power that is unprecedented.
Depending on what we post
at any given moment,
Facebook can figure out
what we are doing and thinking,
and exploit that.
Facebook's very well aware
of, you know, our sentiment,
our mood and how
we talk to people
and it can put
all that data together
and start to understand
like who our exes are
and who our friends are
and who our old friends are
and who our new friends are
and that's how it really works
to incentivise another post.
What you're saying is
Facebook has the capacity
to understand our moods?
Yes.
Could that be used to influence
our buying behaviours?
Of course it can be used
to influence our behaviour
in general, not just buying.
You can be incredibly
hyper-targeted.
Can I give you an example?
We don't always act our age
or according to
our gender stereotypes.
A middle-aged woman
might like rap music.
She is sick of getting ads
for gardening gloves
and weight loss.
So she posts on her Facebook
that she likes Seth Sentry's
Waitress Song.
Now she gets ads
for a streaming music service -
something she might
actually buy.
Adam Helfgott runs a digital
marketing company in New York.
He uses a tool called
Facebook Pixel.
Facebook gives it to advertisers
to embed in their sites.
They can track anybody
who visits their site
and target them with ads
on Facebook.
Well if you've ever logged
into Facebook
with any of your browsers,
there's a good chance
it'll know it's you.
You don't have to be logged in,
you have to have been there
at some point in time.
If it's a brand new computer
and you've never
logged into Facebook,
Facebook at that moment in time
won't know it's you,
but based upon their algorithms
and your usage
they'll figure it out.
So, what you can then do
is put this piece of script
onto your website.
And then use Facebook data
to find the people
that looked at your website
and then target ads to them.
That's correct.
Through Facebook.
- Yep.
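To make the mechanism concrete, here is a toy Python sketch of what a pixel like this enables. Everything in it is hypothetical and heavily simplified (the real Facebook Pixel is proprietary): the advertiser's page loads a snippet from the platform, the visitor's browser identifies itself via its cookie, and the platform can later hand the advertiser an audience of everyone who visited.

```python
# A toy, hypothetical sketch of pixel-based retargeting.
# Real systems are far more complex; all names here are invented.

visits = []  # what the platform accumulates from pixel requests

def pixel_request(cookie_user_id, referring_site):
    # Fired when a browser loads the pixel embedded in an
    # advertiser's page; the cookie identifies the user.
    visits.append({"user": cookie_user_id, "site": referring_site})

def audience_for(site):
    # The advertiser can later target ads at everyone who visited.
    return {v["user"] for v in visits if v["site"] == site}

pixel_request("user-123", "shoes-shop.example")
pixel_request("user-456", "shoes-shop.example")
print(audience_for("shoes-shop.example"))  # {'user-123', 'user-456'}
```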
That feels a little bit creepy,
I mean...
are there privacy issues
involved with that?
From a legal point of view
there's no privacy issue,
that's just the internet today
and the state of it,
and using a product
that generates a lot
of revenue for Facebook.
For advertisers it is a boon -
giving them access to the most
intimate details of our lives.
Megan Brownlow
is a media strategist
for PricewaterhouseCoopers
in Sydney.
When you change your status,
for example,
we might see something,
a young woman
changes her status to engaged.
Suddenly she gets ads
for bridal services.
These sorts of things are clues
about what her interests
might really be.
The research from consumers
is they don't like advertising
if it's not relevant to them.
If it actually is something
that they want,
they don't mind it so much.
This is actually
not a bad thing.
Nik Cubrilovic is
a former hacker
turned security consultant.
He's been using his skills
to investigate
the way our data is tracked.
One day Cubrilovic
made a discovery
that startled the tech world.
He found that even if you're not
logged on to Facebook -
even if you're not a member -
the company tracks and stores
a huge amount
of your browsing history.
And you can't opt out.
If you don't like Facebook,
if you don't like the kinds
of things you're describing,
just close your account?
It's very difficult to opt out
of Facebook's reach on the web.
Even if you close your account,
even if you log out
of all of your services
the way that they're set up,
with their sharing buttons
and so forth,
they're still going to be able
to build a profile for you.
And it's just not going to have
the same level of information
associated with it.
They don't even tell us clearly
what they're doing.
They tell us some things
but it's not specific enough
to really answer the question,
if somebody was going
to build a dossier on me
based on what Facebook
knows about me,
what would it look like?
I should be able to know that,
so that I can make
informed decisions
about how I'm going
to use the platform.
Facebook is not just
influencing what we buy.
It's changing the world
we live in.
Sure they want to
bring their service
to everybody on the planet.
From a commercial standpoint
that's obviously a goal.
Whether it makes the world
a better place
is another question.
Not only have you built
this big business
and this big social network,
you now are possibly determining
the course of world events.
That's exactly what happened
in the streets of Cairo.
In January 2011,
millions gathered in the city
demanding the resignation
of the autocrat Hosni Mubarak.
It became known
as the Facebook revolution.
The organizers used Facebook to
rally vast crowds of protesters.
They were so effective
that the government
tried to shut down the internet.
It took just 18 days
to topple Mubarak.
So what Facebook came to stand
for, for several months I would say,
or at least in its early days
after the events of Tahrir
Square in the Arab Spring
was a symbol of people's ability
to organize and express
and share information
more widely.
It symbolised that so much so
that I like to tell stories
about how I could buy
T-shirts in Tahrir Square
which said
"Facebook, Tool of Revolution".
I understand as well as anybody
just how effective
Facebook can be.
Three years ago,
I was imprisoned in Egypt
on trumped up terrorism charges.
My family used Facebook as
a way of organizing supporters,
and keeping them informed.
It became one of the most
vital tools in the campaign
that ultimately
got me out of prison.
The Facebook page became a place
where anybody could find
the latest on our case.
The underlying algorithms
helped push it to people
who might have been interested
in supporting our cause,
even before
they knew it existed.
Peter!
How are you, man?
Good to see you.
- Good to see you, man.
Let's go inside, mate.
It's cold.
Mohamed Soltan was also
arrested for protesting.
He was imprisoned
in the same jail as me.
There was this medium
where people just wanted
to express themselves because
there was no other avenue
in the public space
to express themselves
and then they found
this outlet...
and then they found this outlet
to the outside world as well,
where they would put
how they feel
about social justice issues,
on just day to day inequalities
that they've seen
and then there was
the second phase of that
where they saw that
there's quite a few of them
that feel the same way about
a lot of different things.
It took a prolonged court case,
a 500-day hunger strike
and intense international
pressure to get him released.
For him too, Facebook became
an invaluable tool.
Facebook unlike other platforms
and social media outlets,
it allowed for us
to put out the reports,
the medical reports, it allowed
for my family to share stories,
and it established credibility.
So Facebook provides this place
that is almost ideal
for finding like-minded people,
whether it means finding people
who live in a certain place,
who are interested
in a certain thing
or people who are in the thrall
of a dangerous ideology.
In Australia, Facebook has also
become a powerful political tool
for mainstream causes,
and groups on the fringe.
They don't want
to represent you.
They want to represent the great
global international agenda
that corrupts our people
from the inside,
builds mosques
in our communities
and floods our beautiful country
with third world immigrants.
But is that what we want?
- No!
Blair Cottrell leads a group
called the United
Patriots Front -
a right-wing movement
built on Facebook,
that campaigns against Muslims
and immigrants across Australia.
Facebook's been
extremely effective for us.
That's indispensable
to the development
of our organisation.
Without it, we would probably be
a separatist cult,
where no one would be able
to relate to us,
because no one would actually
hear us directly,
they would only hear about us
through established
media corporations.
Islam can only pose a threat
to our nation
if our weak leadership,
or rather lack of leadership,
is allowed to continue.
Facebook has helped turn
a disparate group of individuals
into a political force
that some say is dangerous.
It gives us the ability
to cut out the middleman,
to go directly to the people,
to the audience
with our message,
to speak directly
to the Australian people
which is a power that hitherto
has only been held
by established
media corporations
and anybody who speaks
through such media corporations.
But now anybody has that power.
Anybody has access
to that power.
Some of the UPF's
more inflammatory statements
have been censored -
Cottrell has been prosecuted
for staging a mock beheading
that the group filmed
and posted on Facebook.
Facebook removed
some of their posts,
including the original
beheading video,
and threatened
to suspend the page.
Sometimes Facebook has removed
or censored
certain posts of ours
because we've used
the word Muslim, for example.
Not in a negative way at all.
If we'd explained an incident
or a point of view
and we've used the word Muslim,
sometimes that registers
in Facebook's computer
and they automatically delete it
for some reason.
I don't know if it's a person
who deletes it or a computer,
but that can be
a bit problematic.
We actually started altering
the way we spelt Muslim
in order to have our posts
remain up
when we were speaking about
the Muslim people
of the Islamic faith.
Facebook has been criticized
for the way it censors
controversial posts.
Whenever someone flags
a post as offensive,
it gets sent
to a human moderator
who decides
if it should be taken down.
The company says it reviews
a hundred million pieces
of content every month.
People are under
a lot of pressure
to review a great deal
of content very quickly
and I certainly hear about
what appear to be mistakes
quite frequently and some
of them are kind of ridiculous
like at the end of 2015
a bunch of women named Isis
had their accounts deactivated
because clearly somebody went
and flagged them
as being terrorists.
After an Egyptian court
convicted me
of terrorism charges,
my own Facebook page
was suspended.
We were never told
why it disappeared.
Facebook says it was
to protect my privacy.
We believed I'd been labelled
a terrorist,
violating what Zuckerberg calls
its community standards.
You can't have a common standard
for 1.8 billion people.
Our diversity is actually
our strength, right?
Part of what makes us
a global community
is the reality that
what forms a global community
are our incredibly
fundamental differences.
In one infamous example,
Facebook removed a post
showing one of the most powerful
images of the Vietnam war.
The photograph of a naked girl
violated
its community standards.
The community standards
are developed by his staff.
The community didn't develop
those standards.
They're called
community standards
but they were developed
by Facebook
and yes they've had input
here and there over time,
they also get input
from governments
about... you know, recently
a number of governments
told them, "You need to amend
your community standards
to be harder
on extremist content", you know.
And so they amended
their community standards.
It's not like the community
got together
and developed these standards.
Facebook is also transforming
politics as we know it.
Politicians have used social
media for years of course,
but in this last election
campaigners used big data
to radically transform
American politics.
In the words of some observers,
they weaponized the data.
We're on our way
to Washington DC
to find out what impact
Facebook has had
on the fight for
political power.
At the heart of political power
is information.
That's why government
security agencies
go to extraordinary lengths
to vacuum up data.
But increasingly it is also
becoming the key
to winning power.
I think that there's
a legitimate argument to be made
that Facebook influenced
the election,
the United States election
results.
I think that Facebook
and algorithms
are partially responsible,
if not the main reason why,
there's this shift towards
hyper partisan belief
systems these days.
We will soon have, by the way,
a very strong
and powerful border.
When Donald Trump became
the presidential frontrunner
in last year's US election,
few pundits predicted
that he'd actually win.
One of the Trump campaign's
secret weapons
was an ability to research
social media data
in extraordinary detail.
It helped him understand
and target his voters
with a precision
we've never seen before.
By using Facebook's
ad targeting engine for example,
they know if some of those
independent voters
have actually liked
Republican pages
or liked the Bernie Sanders page
or like a Donald Trump page.
So you can go to them
to spend money,
to target advertising
specifically to those voters
and it is a much more reliable
ultimately form of targeting
than many of the other
online vehicles out there.
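As a toy illustration of the like-based targeting described here (invented data and field names; Facebook's real ad engine is proprietary): start from a list of voters, keep the independents, and keep only those who have liked a political page.

```python
# Hypothetical, simplified sketch of like-based ad targeting.

voters = [
    {"id": 1, "registration": "independent", "likes": {"Bernie Sanders"}},
    {"id": 2, "registration": "independent", "likes": {"Gardening Tips"}},
    {"id": 3, "registration": "republican",  "likes": {"Donald Trump"}},
]

POLITICAL_PAGES = {"Bernie Sanders", "Donald Trump", "Republican Party"}

def target_audience(voters):
    # Independents who have liked at least one political page.
    return [v["id"] for v in voters
            if v["registration"] == "independent"
            and v["likes"] & POLITICAL_PAGES]

print(target_audience(voters))  # [1] - ads go only to this voter
```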
Political strategist
Patrick Ruffini
runs a company that mines big
data for the Republican Party.
He produces social media maps
that help them make sure
their political messages
hit their targets.
What it does give us is
a much greater level of certainty
and granularity and precision
down to the individual voter,
down to the individual precinct
about how things
are going to go.
It used to be we could survey
eight hundred,
a thousand registered voters
nationwide,
but you couldn't really
make projections from that
about how an individual state
would go
or how an individual voter
would ultimately go.
I, Donald John Trump,
do solemnly swear
that I will faithfully execute
the office
of President
of the United States.
It is one thing to know
your voters of course,
and quite another to get them
to change their minds.
Facebook can help with that too.
The ability to take the pools
of big data that we've got
and do really deep
analysis of it
to understand small groups
of customers' preferences,
can be applied
in a political setting
in a way that is
potentially worrying
because it allows politicians
to potentially lie better.
For instance,
you can't make a political
statement on television,
without it being disclosed
who paid for it.
Those same controls
on the web are very lax.
For instance,
I could see a story that
a certain XYZ politician
has done a great thing,
produced on a completely
different third-party news site.
I cannot know that
that ad was placed
by a political operation that
has specifically targeted me,
because that information
is not disclosed, anywhere.
I am understanding yet troubled
by the data-driven advertising
and ad targeting that occur,
but I'm even more uncomfortable
by the reality
that our elections and
how our elections are structured
and configured
can be hijacked by these forces
that are not transparent to us.
One of the most important parts
of any democracy is news -
almost half of all Americans
get theirs from Facebook.
But the last US election
also saw the explosion
in fake news,
turbocharged by sharing
on Facebook.
These things look like news,
they function like news,
they're shared like news,
they don't match up
with traditional ideas
of what news is for
and what it should do.
Facebook is in the middle
of this,
they are the company
that can see all of this
and make judgements about it,
I think they would prefer
not to have to do that.
Adam Schrader is a journalist
who used to edit stories
for Facebook's Trending News
section.
Part of his job was
to filter out fake news.
Essentially, we operated
like a newsroom,
it was structured
like a newsroom,
copy editors would make sure
that the topics met standards,
make sure that they were
unbiased, and check facts.
There were often times when
fake articles would appear
and present themselves
as possibly being
a legitimate trending topic.
And our job was
identifying those
and the original term
was blacklisting.
In the heat of the campaign,
right-wing commentators
accused the team of bias.
Facebook sacked the team and
handed the job to an algorithm.
An algorithm cannot do the job
of a trained journalist.
They don't have
the ability to reason,
artificial intelligence
hasn't gotten to the point
where it can really
function like a human brain
and determine
what has news value
and what is good for the public
and what is not.
Schrader says
after the team was sacked,
fake news really took off.
Yeah after the trending news
team was let go,
there was a big problem
with sensational
or factually incorrect
or misleading news sources
and trending topics.
It was just a disaster.
The more partisan
news sources you consume,
the less likely
you are to believe
fact checkers or experts.
And so, this can create
some really dangerous,
divisive believers
in alternative facts.
During last year's
election campaign,
a news site published a story
alleging that Hillary Clinton
and her campaign chairman
John Podesta
were running a child sex ring
out of the Comet Ping Pong
Pizza restaurant
in Washington DC.
The story was widely shared
on Facebook.
It was utterly fake,
but it gained so much traction
that a 28-year-old man
finally went to the restaurant
armed with an assault rifle,
a revolver and a shotgun
to find and rescue the children.
One of the hosts runs up
and he's like
"Did you seee that guy?
He had a big gun".
He fired several shots
into the restaurant
before police arrested him.
The story still circulates
online and on Facebook.
I don't think that I trust
the general public's ability
to identify fake news,
real news, anything like that.
One study found that
in the closing months
of the US election,
Facebook users shared
the top 20 fake news stories
over a million more times
than the top 20 stories
from major news outlets.
Fake stories are often written
either for political advantage,
or to make money.
There are a lot of people out
there who aren't journalists
and aren't publishers
who are publishing.
They don't have the same
sense of obligation
so we are really
in uncharted territory.
I think one of the most
important things
is that we actually need
a big public debate about this
because it's changed
the nature of news,
and in doing so it's changed
our reality as a society.
If you suspect a news story
is fake, you can report it.
Mark Zuckerberg initially
dismissed the notion
that fake news somehow
skewed the election,
but he is rolling out a system
that allows
the Facebook community
to flag suspect stories.
Mark Zuckerberg has said that
he's not in the news
publication business, right?
That they're not
a media company.
But I think that's a mistake,
kind of a denial, right?
So, they're definitely
a media company
and I think that they should try
and treat themselves
more as one in the future.
As more and more consumers get
their news from digital sources
and Facebook in particular,
the old-fashioned world
of newspapers
and TV stations is collapsing.
Like most news businesses,
the New York Times
is struggling.
Facebook sends plenty of readers
to their stories,
but most advertising dollars
go to Facebook.
Newsrooms are shrinking
along with the resources
for serious journalism.
You'll hear this from small
publications and large ones
that their absolute audience
is larger than it's ever been
and that surely
has to mean something
but it certainly
hasn't meant profits.
I don't think there's a major
news company in the world
that hasn't changed
its operations around Facebook
in a real way and I mean that
both in the way
that it produces stories
and approaches stories
and the way that it makes money.
If Facebook is playing
an increasingly important role
in how we understand the world,
its social mood study showed
it affects how we feel about it.
When its researchers explained
how they manipulated
the news feeds
of some 700,000 users,
they were criticized for playing
with people's psychology
without telling them.
And yet its algorithms do that
every day.
They give us stories
they know we want to see,
to keep us online, and help
advertisers send us more ads.
I don't think we can treat
Facebook as benign.
I think it has
enormous implications
for how we experience our lives,
and how we live our lives,
and I think simultaneously
that's one of the things
that makes that network
and others like it
so phenomenally interesting
and important.
But Mark Zuckerberg's plans
would have Facebook do far more.
He wants its users
to rely on the network
for the way we organize society,
discover the world
and conduct politics.
He wants the community
to inform Facebook,
while Facebook
watches over us.
The philosophy of everything
we do at Facebook
is that our community
can teach us what we need to do
and our job is to learn
as quickly as we can
and keep on getting
better and better
and that's especially true when
it comes to keeping people safe.
In a post on his own profile
in February,
he outlined his plan
for the company -
a kind of manifesto
for the central role
he wants the platform to play
in societies around the world.
We're also gonna focus
on building
the infrastructure
for the community,
for supporting us,
for keeping us safe,
for informing us,
for civic engagement
and for inclusion of everyone.
So it's a document that really
felt like an attempt
to take some responsibility
but it wasn't apologetic.
It was fairly bold
and it seems to suggest
that the solution to Facebook's
problems is more Facebook.
Zuckerberg has great ambition.
"Progress needs humanity
to come together
as a global community",
he writes.
Facebook can help build it.
It wants to
'help fight terrorism',
while its news service can
show us 'more diverse content'.
Recently he has been behaving
in ways more befitting
a politician than a CEO.
There's a lot of speculation
that he may run for office
and to my mind,
if Facebook continues to be
as successful as it is,
not just through Facebook
but through its other products,
through Instagram and WhatsApp,
if it continues to be so central
in people's lives,
he doesn't need
to run for office,
he will be presiding
over a platform and a venue
where people conduct
a real portion of their lives.
It seems there is
almost no limit
to Facebook's intrusion
into our lives,
for better or for worse.
I want to thank all of you
for joining us to hear more
about some of the things
we're working on at Facebook
to help keep
our communities safe.
Zuckerberg convened what
he called the Social Good Forum
to help people whose
Facebook posts indicate
that they might be at risk
of harming themselves
or committing suicide.
We're starting to do
more proactive work.
Like when we use
artificial intelligence
to identify things that
could be bad or harmful
and then flag them
so our teams can review them.
Or when someone shares
a post that makes it seem
like they might want
to harm themselves.
Then we give them
and their friends
suicide prevention tools
that they can share
to get the help that they need.
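A minimal sketch of the flag-then-review pipeline he describes, with invented phrases and a made-up threshold (Facebook's actual models are not public): a simple scorer flags posts for a human team rather than acting on its own.

```python
# Hypothetical triage: score each post against illustrative risk
# phrases and queue anything that matches for a human reviewer.

RISK_PHRASES = {"hurt myself", "end it all", "no way out"}  # invented

def risk_score(post):
    text = post.lower()
    return sum(phrase in text for phrase in RISK_PHRASES)

def review_queue(posts, threshold=1):
    # Only flagged posts reach the human team; the rest pass through.
    return [p for p in posts if risk_score(p) >= threshold]

posts = [
    "Great day at the beach!",
    "I feel like there's no way out anymore",
]
print(review_queue(posts))  # the second post is flagged for review
```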
The ability to get big data
and to gather data about us
in all aspects of our lives
creates a particular
type of power among the people
or organizations
that have that data.
So they can say "Oh these are
your daily habits".
There's been some research done
suggesting that nearly half of what we do
is just repeating patterns
of what we did the day before.
From that,
you can also predict potentially
how people will behave
in the near future as well.
And that's perhaps
a little bit concerning
for people who care a lot
about their privacy.
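As a toy example of that kind of prediction (invented data; real behavioural models are far more sophisticated): if so much of what we do repeats, even a naive predictor that guesses our most frequent past behaviour will often be right.

```python
# Hypothetical sketch: predict the next behaviour as the most
# frequent behaviour in the recorded history.

from collections import Counter

history = ["cafe", "gym", "cafe", "office", "cafe"]  # past check-ins

def predict_next(history):
    return Counter(history).most_common(1)[0][0]

print(predict_next(history))  # 'cafe'
```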
You use the words 'potential harm'.
That's a fairly serious phrase.
What sort of harm do you mean?
There are a couple
of factors here.
The first is the issues
that we know about already.
They are from little things,
such as ad targeting,
giving away what your girlfriend
bought you for your birthday,
and doesn't want you
to know about
through to a 14-year-old boy
who hasn't come out as gay yet,
and his parents
discover the fact
because of the advertising
that's being targeted to him,
through to potential
future harms.
One of the problems
in the privacy realm
is that we only have
one identity,
and we can't take back
what we've already handed over.
Facebook is far more intimately
involved in our lives
than any company
we've ever seen before.
Facebook has a responsibility
to inform people
of what is happening
to their data
and so then there can be
a conversation
also with "the community"
about whether people agree
that this is an appropriate use
and right now
they're not providing
enough information
for that conversation
to take place.
There's no take-back
on private data.
The implications
that are going to occur
in five or 10 years' time,
we need to protect
against that now.
And to an extent,
it almost feels like,
I'm reminded of Einstein's
letter to Roosevelt
warning about
the potential dangers
of nuclear holocaust
or whatever,
with the dangers in uranium -
not to say that
it's that severe,
but we are at the point now,
where we know
that there's danger,
but we don't know
the extent of it
and we don't know the potential
implications of it.
Our lives are now
measured in data.
What we look at, who we talk to,
what we do is all recorded
in digital form.
In handing it to Facebook,
we are making
Mark Zuckerberg's company
one of the most powerful
in history.
And the question is,
at what cost to ourselves?