
Surprising no one, Facebook fed third-party researchers incomplete data

Who could've seen this coming?


Facebook CEO Mark Zuckerberg has made it a point, in the last year or so, to remind the public that his site welcomes and encourages third-party research. The process has proven thorny for researchers in practice, though, and now some have found the social network has been sending them incomplete and inaccurate data for their studies.

This weekend, Facebook apologized to researchers for sending them flawed data for their work, The New York Times reports. The company had provided data to various research groups to study how users interact with posts across the platform, and it turns out that data actually represented only about half of Facebook’s U.S. users. Oops.

Facebook’s Open Research and Transparency team held a call with some researchers on Friday to express its ostensibly sincere apologies for what it assured them was definitely a mistake. The company is now correcting the data set, though it’ll take weeks to actually compile it for researchers. But who’s to say it’ll actually be complete this time?

Not exactly shocking — Not to be that guy, but Facebook sending researchers a massively inaccurate data set is basically the least surprising thing that’s ever happened. Facebook’s third-party research program, which has long promised to be completely, 100 percent transparent, has always come with one big caveat: Facebook has absolute say over which data is or isn’t okay for researchers to study. When researchers step outside those Facebook-devised boundaries, there is very often hell to pay.

Because Facebook’s third-party research protocols, along with the data the company chooses to provide, are so restrictive, plenty of researchers have decided to pack it in rather than deal with Zuckerberg’s cronies. Turns out their research would’ve been supremely incomplete, anyway.

The damage is done — Facebook has long insisted its restrictive research practices are a preventive measure to protect user data. The company loves to cite the FTC’s Cambridge Analytica ruling as the driver for this touchiness, but the FTC recently wrote to Facebook to let it know that its agreement “does not bar Facebook from creating exceptions for good-faith research in the public interest.”

Now Facebook finds itself apologizing once again. Researchers say they’ve lost months of work because of the error, and one on the call said the mistake had put doctoral degrees at risk.

Facebook claims this (enormous) oversight was nothing more than a mistake. Whether that’s true is unknowable from the outside, but at this point it’s essentially moot. Facebook has eliminated any remaining trust researchers might have had in the company’s ability to provide full and accurate data for their studies. An apology isn’t going to fix that.