Free gifts and forecasts: the murky world of Facebook research support

Motivations to work with outside academics are thorny, and it’s Facebook that decides who gets access to its data to examine its impact on society


The professor was incredulous.

David Craig had been studying the rise of entertainment on social media for several years when a Facebook employee he didn’t know emailed him last December, asking about his research. “I thought I was being pumped,” Prof Craig said.

The company flew him to Menlo Park and offered him $25,000 to fund his ongoing projects, with no obligation to do anything in return. This was definitely not normal, but after checking with his school, the University of Southern California, Prof Craig took the gift. “Hell, yes, it was generous to get an out-of-the-blue offer to support our work, with no strings,” he said. “It’s not all so black and white that they are villains.”

Other academics got these gifts, too. One, who said she had $25,000 deposited in her research account recently without signing a single document, spoke to a reporter hoping maybe the journalist could help explain it. Another professor said one of his former students got an unsolicited monetary offer from Facebook, and he had to assure the recipient it wasn’t a scam. The professor surmised that Facebook uses the gifts as a low-cost way to build connections that could lead to closer collaboration later. He also thinks Facebook “happily lives in the ambiguity” of the unusual arrangement. If researchers truly understood that the funding has no strings, “people would feel less obligated to interact with them”, he said.

Contorted

The free gifts are just one of the little-known and complicated ways Facebook works with academic researchers. For scholars, the scale of Facebook’s 2.2 billion users provides an irresistible way to investigate how human nature may play out on, and be shaped by, the social network. For Facebook, the motivations to work with outside academics are far thornier, and it’s Facebook that decides who gets access to its data to examine its impact on society.

Facebook CEO Mark Zuckerberg. Marcio Jose Sanchez/AP

“Just from a business standpoint, people won't want to be on Facebook if Facebook is not positive for them in their lives,” said Rob Sherman, Facebook’s deputy chief privacy officer. “We also have a broader responsibility to make sure that we’re having the right impact on society.”

The company has long been conflicted about how to work with social scientists, and it now runs several programmes, each reflecting the contorted relationship Facebook has with external scrutiny. The collaborations have become even more complicated in the aftermath of the Cambridge Analytica scandal, which was set off by revelations that a professor who once collaborated with Facebook’s in-house researchers used data collected separately to influence elections.


“Historically the focus of our research has been on product development, on doing things that help us understand how people are using Facebook and build improvements to Facebook,” Mr Sherman said. Facebook has heard more from academics and nonprofits recently who say “because of the expertise that we have, and the data that Facebook stores, we have an opportunity to contribute to generalisable knowledge and to answer some of these broader social questions,” he said. “So you’ve seen us begin to invest more heavily in social science research and in answering some of these questions.”

Facebook has a corporate culture that reveres research. The company builds its product based on internal data on user behaviour, surveys and focus groups. More than a hundred PhD-level researchers work on Facebook’s in-house core data science team, and employees say the information that points to growth has had more of an impact on the company's direction than chief executive Mark Zuckerberg’s ideas.

Wars

Facebook is far more hesitant to work with outsiders; it risks unflattering findings, leaks of proprietary information, and privacy breaches. But Facebook likes it when external research proves that Facebook is great. And in the fierce talent wars of Silicon Valley, working with professors can make it easier to recruit their students.

It can also improve the bottom line. In 2016, when Facebook changed the “like” button into a set of emojis that better captured user expression - and feelings for advertisers - it did so with the help of Dacher Keltner, a psychology professor at the University of California, Berkeley, who’s an expert in compassion and emotions. Prof Keltner’s Greater Good Science Centre continues to work closely with the company. And this January, Facebook made research the centrepiece of a major change to its news feed algorithm. In studies published with academics at several universities, Facebook found that people who used social media actively - commenting on friends’ posts, setting up events - were likely to see a positive impact on mental health, while those who used it passively may feel depressed. In reaction, Facebook declared it would spend more time encouraging “meaningful interaction”. Of course, the more people engage with Facebook, the more data it collects for advertisers.

A graphic from the Cambridge Analytica website. Mark Lennihan/AP

The company has stopped short of pursuing deeper research on potentially negative fallout of its power. According to its public database of published research, Facebook’s written more than 180 public papers about artificial intelligence but just one study about elections, based on an experiment Facebook ran on 61 million users to mobilise voters in the Congressional midterms back in 2010. Mr Sherman said, “We’ve certainly been doing a lot of work over the past couple of months, particularly to expand the areas where we’re looking.”

Prohibited

Facebook’s first peer-reviewed papers with outside scholars were published in 2009, and almost a decade into producing academic work, it still wavers over how to structure the arrangements. It’s given out the smaller unrestricted gifts. But those gifts don’t come with access to Facebook’s data, at least initially. The company is more restrictive about who can mine or survey its users. It looks for research projects that dovetail with its business goals.

Some academics cycle through one-year fellowships while pursuing doctorate degrees, and others get paid for consulting projects, which never get published.

Logos of Facebook. The firm gives gifts to academics. Loic Venance/AFP

When Facebook does provide data to researchers, it retains the right to veto or edit the paper before publication. None of the professors Bloomberg spoke with knew of cases when Facebook prohibited a publication, although many said the arrangement inevitably leads academics to propose investigations less likely to be challenged. “Researchers focus on things that don’t create a moral hazard,” said Dean Eckles, a former Facebook data scientist now at the MIT Sloan School of Management. Without a guaranteed right to publish, Mr Eckles said, researchers inevitably shy away from potentially critical work. That means some of the most burning societal questions may go unprobed.


Facebook also almost always pairs outsiders with in-house researchers. This ensures scholars have a partner who’s intimately familiar with Facebook’s vast data, but some who have worked with Facebook say it also creates a selection bias in what gets studied. “Stuff still comes out, but only the immensely positive, happy stories - the goody-goody research that they could show off,” said one social scientist who worked as a researcher at Facebook. For example, he pointed out that the company has published widely on issues related to wellbeing, or what makes people feel good and fulfilled, which is positive for Facebook’s public image and product. “The question is: ‘What’s not coming out?’” he said.

Begged

Facebook argues its body of work on well-being does have broad importance. "Because we are a social product that has large distribution within society, it is both about societal issues as well as the product," said David Ginsberg, Facebook's director of research.

Other social networks have smaller research ambitions, but have tried more open approaches. This spring, Twitter asked for proposals to measure the health of conversations on its platform, and Microsoft's LinkedIn is running a multi-year programme to have researchers use its data to understand how to improve the economic opportunities of workers. Facebook has issued public calls for technical research, but until the past few months, hadn't done so for social sciences. Yet it has solicited in that area, albeit quietly: last summer, one scholarly association begged discretion when sharing information on a Facebook pilot project to study tech's impact in developing economies. Its email read, "Facebook is not widely publicising the programme."

Cardboard cutouts depicting Facebook CEO Mark Zuckerberg during a demonstration against the firm's use of data. Francois Lenoir/Reuters

In 2014, the prestigious Proceedings of the National Academy of Sciences published a huge study, co-authored by two Facebook researchers and an outside academic, which found that emotions were “contagious” online: people who saw sad posts were more likely to make sad posts themselves. The catch: the results came from an experiment run on 689,003 Facebook users, in which researchers secretly tweaked Facebook’s news feed algorithm to show some users cheerier content than others. People were angry, protesting that they hadn’t given Facebook permission to manipulate their emotions.

The company first said people allowed such studies by agreeing to its terms of service, and then eventually apologised. While the academic journal didn’t retract the paper, it issued an “Editorial Expression of Concern.”

Omission

To get federal research funding, universities must run testing on humans through what’s known as an institutional review board (IRB), a panel that includes at least one outside expert, approves the ethics of a study and ensures subjects provide informed consent. Companies don’t have to run research through IRBs. The emotional-contagion study fell through the cracks.

The outcry profoundly changed Facebook’s research operations, creating a review process that was more formal and cautious. It set up a pseudo-IRB of its own, which doesn’t include an outside expert but does have policy and PR staff. Facebook also created a new public database of its published research, which lists more than 470 papers. But that database now has a notable omission - a December 2015 paper two Facebook employees co-wrote with Aleksandr Kogan, the professor at the heart of the Cambridge Analytica scandal.

Facebook said it believes the study was inadvertently never posted and is working to ensure other papers aren't left off in the future.

In March, Gary King, a Harvard University political science professor, met some Facebook executives about trying to get the company to share more data with academics. It wasn't the first time he'd made his case, but he left the meeting with no commitment.


A few days later, the Cambridge Analytica scandal broke, and soon Facebook was on the phone with Prof King. Maybe it was time to cooperate, at least to understand what happens in elections. Since then, Prof King and a Stanford University law professor have developed a complicated new structure to give more researchers access to Facebook’s data on the elections and let scholars publish whatever they find. The resulting structure is baroque, involving a new “commission” of scholars Facebook will help pick, an outside academic council that will award research projects, and seven independent US foundations to fund the work. “Negotiating this was kind of like the Arab-Israel peace treaty, but with a lot more partners,” Prof King said.

Problematic

The new effort, which has yet to propose its first research project, is the most open approach Facebook’s taken yet. “We hope that will be a model that replicates not just within Facebook but across the industry,” Mr Ginsberg said. “It’s a way to make data available for social science research in a way that means that it’s both independent and maintains privacy.”

Mark Zuckerberg. Academics say Facebook's approach risks the 'Matthew Effect'. Leah Millis/Reuters

But the new approach will also face an uphill battle to prove its credibility. The new Facebook research project came together under the company’s public relations and policy team, not its research group of PhDs trained in ethics and research design. More than 200 scholars from the Association of Internet Researchers, a global group of interdisciplinary academics, have signed a letter saying the effort is too limited in the questions it’s asking, and also that it risks replicating what sociologists call the “Matthew effect”, where only scholars from elite universities - like Harvard and Stanford - get an inside track.

“Facebook’s new initiative is set up in such a way that it will select projects that address known problems in an area known to be problematic,” the academics wrote.

The research effort, the letter said, also won’t let the world - or Facebook, for that matter - get ahead of the next big problem.