Facebook’s
disclosure last week that it had tinkered with about 700,000 users’
news feeds as part of a psychology experiment conducted in 2012
inadvertently laid bare what too few tech firms acknowledge: that they
possess vast powers to closely monitor, test and even shape our
behavior, often while we’re in the dark about their capabilities.
The publication of the
study,
which found that showing people slightly happier messages in their
feeds caused them to post happier updates, and sadder messages prompted
sadder updates, ignited a torrent of outrage from people who found it
creepy that Facebook would play with unsuspecting users’ emotions.
Because the study was conducted in partnership with academic
researchers, it also appeared to violate long-held rules protecting
people from becoming test subjects without providing informed consent.
Several European privacy agencies have begun
examining whether the study violated local privacy laws.
Facebook
and much of the rest of the web are thriving petri dishes of social
contact, and many social science researchers believe that by analyzing
our behavior online, they may be able to figure out why and how ideas
spread through groups, how we form our political views and what
persuades us to act on them, and
even why and how people fall in love.
Most
web companies perform extensive experiments on users for product
testing and other business purposes, but Facebook, to its credit, has
been unusually forward in teaming with academics interested in
researching questions that aren’t immediately pertinent to Facebook’s
own business. Already, those efforts have yielded several important
social science findings.
But
there’s another benefit in encouraging research on Facebook: It is only
by understanding the power of social media that we can begin to defend
against its worst potential abuses. Facebook’s latest study proved it
can influence people’s emotional states; aren’t you glad you know that?
Critics who have long argued that Facebook is too powerful and that it
needs to be regulated or monitored can now point to Facebook’s own study
as evidence.
It
is problematic that Facebook roped users into the study without their
express consent. The company has apologized, and now says it will look
at ways to improve its guidelines for conducting research. “After the
feedback from this study, we are taking a very hard look at this
process,” said Jonathan Thaw, a Facebook spokesman.
If
Facebook figured out a way to be more transparent about its research,
wouldn’t you rather know what Facebook can do with the mountains of
information it has on all of us?
Wouldn’t
you also be interested in what other tech companies know about us? How
does Google’s personalized search algorithm reinforce people’s biases?
How does Netflix’s design shape the kinds of TV shows we watch? How does
race affect how people navigate dating sites?
After the outcry against the Facebook research, we may see fewer of these studies. That would be a shame.
“It
would be kind of devastating,” said Tal Yarkoni, a psychology
researcher at the University of Texas at Austin. “Until now, if you knew
the right person at Facebook and asked an interesting question, a
researcher could actually get collaborators at Facebook to work on these
interesting problems. But Facebook doesn’t have to do that. They have a
lot to lose and almost nothing to gain from publishing.”
If
you’ve been cast in a Google or Facebook experiment, you’ll usually
never find out. Users who are put into experimental groups are selected
at random, generally without their knowledge or express permission.
While Facebook says people agree to such tests when they sign up for the
site, users aren’t given any extra notice when they’re included in a
study.
One
problem is that obtaining consent may complicate experimental results.
“Facebook could throw up a bubble asking people to opt-in to each test,
but it would totally mess up the results, because people would be
selecting themselves into the test,” Mr. Yarkoni said. (Offline social
science and medical researchers face a similar problem.) Another option
would be for users to be periodically asked whether they wanted to take
part in research, but some research ethicists have balked at the
prospect of not giving users individual notice of each study.
Ryan
Calo, an assistant professor at the University of Washington School of
Law who studies technology policy, has called for companies that conduct
experiments on their users to create “consumer subject review boards,”
a kind of internal ombudsman who would assess each proposed experiment
and balance the potential risks to users against the potential rewards.
“There’s
enough pressure and understanding of this issue that these firms are
going to have to come up with a way to make the public and regulators
comfortable with experimenting with consumers,” Mr. Calo said.
Much
of the research that Facebook and Google conduct to improve their own
products is secret. Some is not. Google has acknowledged running about
20,000 experiments on its search results every year. It once tested
41 different shades of blue on its site, each color served to a different group, just to see which hue garnered the most engagement from users.
Over
the last few years, Facebook has expanded what it calls its Data
Science team to conduct a larger number of public studies. The company
says the team’s mission is to alter our understanding of human
psychology and communication by studying the world’s largest meeting
place. So far, it has
produced several worthy insights.
In 2012, the data team published a
study that analyzed more than 250 million users; the results shot down the theory of “the
filter bubble,”
the long-held fear that online networks show us news that reinforces
our beliefs, locking us into our own echo chambers. Like the new study
on people’s emotions, that experiment also removed certain posts from
people’s feeds.
In another experiment, Facebook
randomly divided 61 million American users into three camps
on Election Day in 2010, and showed each group a different, nonpartisan
get-out-the-vote message (or no message). The results showed that
certain messages significantly increased the tendency of people to vote —
not just of people who used Facebook, but even their friends who
didn’t.
Zeynep
Tufekci, an assistant professor at the School of Information and
Library Science at the University of North Carolina, points out that
many of these studies serve to highlight Facebook’s awesome power over
our lives.
“I
read that and I said, ‘Wait, Facebook controls elections,’ ” she said.
“If they can nudge all of us to vote, they could nudge some of us
individually, and we know they can model whether you’re a Republican or a
Democrat — and elections are decided by a couple of hundred thousand
voters in a handful of states. So the kind of nudging power they have is
real power.”
Ms. Tufekci
has offered a stirring call to arms against
Facebook, Google and other giant web concerns because of their power to
shape what we do in the world. She makes a worthy argument.
But
if every study showing Facebook’s power is greeted with an outcry over
its power, Facebook and other sites won’t disclose any research into how
they work. And isn’t it better to know their strength, and try to
defend against it, than to never find out at all?