The fight to study what happens on Facebook
Facebook recently added a new report to its transparency center. The “Widely Viewed Content” report was ostensibly meant to shed light on a long-running debate: what is the most popular content on Facebook?
The 20-page report raised more questions than it answered. For example, it showed that the most-viewed URL was a seemingly obscure website associated with former Green Bay Packers players. It boasted nearly 90 million views even though its official Facebook page has only a few thousand followers. The report also included URLs for e-commerce sites that seemed at least somewhat spammy, such as online stores for CBD products and Bible-themed t-shirts. There was also a low-resolution cat GIF and several bland memes asking people to respond with the foods they like or dislike, or the items they had recently bought.
Notably absent from the report were the right-wing figures who regularly dominate the unofficial “Facebook’s Top 10” Twitter account, which ranks the platform’s top-performing content by engagement. In fact, there wasn’t much political content at all, a point Facebook has been eager to demonstrate. For Facebook, its latest attempt at “transparency” was proof that most users’ feeds are not polarized swamps of misinformation, but something much more mundane.
Days later, The New York Times reported that the company had prepared an earlier version of the report but chose not to publish it. The top URL in that version was a Chicago Sun-Times story suggesting that a doctor’s death may have been linked to the COVID-19 vaccine. Though the story came from a credible news source, it is also the kind of story often used to fuel anti-vaccine narratives.
Almost as soon as the initial report was published, researchers raised other issues. Ethan Zuckerman, associate professor of public policy and communication at the University of Massachusetts Amherst, called it “transparency theater.” It was, he said, “an opportunity for FB to signal to critics that they’re moving in the direction of transparency without releasing any of the data a researcher would need to answer a question like ‘is right-wing content disproportionately popular on Facebook?’”
The promise of ‘transparency’
For researchers who study how information moves on Facebook, it’s a familiar tactic: provide just enough data to claim “transparency,” but not enough to be useful. “The takeaways of the report are debatable,” says Alice Marwick, a principal researcher at the Center for Information, Technology, and Public Life at the University of North Carolina. “The results just don’t hold up; they don’t stand up to scrutiny. They don’t map onto any of the ways people actually share information.”
Marwick and other researchers have suggested this may be because Facebook chose to slice its data in an unusual way. They suspect Facebook only counted URLs that were actually pasted into the body of a post, rather than shared via the link previews people normally use. Or maybe Facebook just has a really bad spam problem. Or maybe it’s a combination of both. “There’s no way to independently verify any of it … because we don’t have access to the data that Facebook has,” Marwick told Engadget.
These concerns were echoed by Laura Edelson, a researcher at New York University. “No one else can replicate or verify the findings in this report,” she wrote in a tweet. “We just have to trust Facebook.” Notably, Edelson has firsthand experience with the limits of Facebook’s push for “transparency.”
The company recently shut down her personal Facebook account, as well as those of several NYU colleagues, in response to their research on political ads on the platform. Because Facebook does not make targeting data available in its Ad Library, the researchers recruited volunteers to install a browser extension that collects information about the ads appearing in their feeds.
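To make the approach concrete, here is a minimal sketch of how a feed-scanning content script like that might work. The selectors, the “Sponsored” label heuristic, and the collection endpoint below are illustrative assumptions for the sketch, not the NYU team’s actual implementation.

```ts
// Hypothetical content script for a volunteer research extension.
// The selectors and endpoint are illustrative assumptions, not the
// real extension's code.

const ENDPOINT = "https://example.org/collect"; // placeholder research server

// Watch the feed for newly rendered posts.
const observer = new MutationObserver(() => {
  document.querySelectorAll<HTMLElement>('[role="article"]').forEach((post) => {
    if (post.dataset.adChecked) return; // skip posts already inspected
    post.dataset.adChecked = "true";

    // Heuristic: sponsored posts carry a visible "Sponsored" label.
    const isSponsored = Array.from(post.querySelectorAll("span")).some(
      (s) => s.textContent?.trim() === "Sponsored"
    );
    if (!isSponsored) return;

    // Collect only the ad itself, never the volunteer's identity.
    const record = {
      seenAt: new Date().toISOString(),
      adText: post.innerText.slice(0, 500),
    };
    void fetch(ENDPOINT, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(record),
    });
  });
});

observer.observe(document.body, { childList: true, subtree: true });
```

The design choice matters here: because the extension runs in volunteers’ own browsers, it can see what Facebook’s Ad Library omits, which is exactly what made it both useful to researchers and objectionable to Facebook.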