A FOIA for Facebook

Everyone seems to be a fan of transparency these days. Politicians of every stripe are keen to preach their belief in it, and transparency has become a watchword for everything from marketing to healthcare to journalism. In debates around content moderation, calls for transparency are a ubiquitous feature. Online platforms, for their part, are eager to tout their own belief in its importance. It’s a curious thing: if everyone is so committed to transparency, why do people keep complaining about the lack of it?

“Transparency,” as it turns out, is a flexible concept: it can mean different things to different people, and it can be twisted in a number of directions. But for the platforms, at least, leaning into the idea makes sense. Much has been written about the accountability deficit underlying the unprecedented power and influence these companies wield. Transparency and openness are key ingredients in cultivating public confidence, because they foster external understanding of what is going on behind the curtain. In addition, the scope and scale of the biggest platforms leave them heavily dependent on external oversight, both to flag problematic content and, more broadly, to drive structural improvements by identifying where the system is failing. This dynamic is particularly pronounced in markets that are removed from the platforms’ main geographic areas of focus. The ability of third parties to play this oversight role depends on their access to accurate and comprehensive information about how the systems are functioning. In other words, transparency is necessary not only to foster public confidence in the platforms’ decisions, but also to allow external stakeholders to call attention to problems before they metastasize.

However, the novelty of this space makes it difficult to define exactly what strong transparency should look like for the platforms. Around the world, legions of journalists, academics, regulators, and civil society observers study the platforms’ operations, each viewing the issue through the lens of their own thematic or regional perspective, and each offering a different answer as to what should be disclosed and how. The platforms must navigate not only these competing priorities, but also concerns that disclosing too much about how they moderate might allow bad actors to game the system. There are also privacy risks to consider, not only in ensuring that disclosures are scrubbed of sensitive user information, but also insofar as expanded disclosure requirements could lead platforms to track increasing amounts of information. While there are sound policy reasons underlying the calls for transparency, figuring out how it should work in practice is more complicated.

Comparing Transparency Models

In the governmental context, a relatively well-recognized corpus of international standards has developed around what strong transparency looks like. At the core of these standards is the notion that information should be “open by default,” meaning it is created, processed, and stored under a presumption that it should be available to the public on request, subject to reasonable redactions to protect legitimate interests like national security or the efficacy of law enforcement techniques. In practical terms, this idea is mainly implemented through freedom of information, access to information, or right to information rules, which grant the public the ability to file requests for documents, ideally in a format of their choosing, at reasonable cost, and within a reasonable timeframe. As of November 2021, 136 countries, comprising around four-fifths of the world’s population, have some form of right to information or freedom of information law on the books, though their strength and quality vary enormously. The best laws typically build a degree of independence into disclosure structures, both by delegating information requests to specialized “information offices” and by establishing independent oversight bodies, such as an information commission or commissioner, to hear appeals against refusals to disclose information or other breaches of a public agency’s transparency obligations.

Experience from the public sector teaches us that relying purely on government statements or press releases is insufficient, because it allows information to be massaged to suit a desired narrative. Even statistical data can be cherry-picked to present a misleading picture. The ability to make access requests ensures that the public can get an unvarnished view of what is happening. In terms of building trust in institutions, an effective public dialogue, conducted through receiving and responding to information requests, is better at fostering positive relations with citizens than a one-way flow of information. This is true even where information requests uncover instances of incompetence or even corruption. While public trust may plummet in the immediate aftermath of such a scandal, there is consensus that an environment which allows this behavior to be brought to light is, on the whole, more conducive to improving relations between governments and their constituents.

In contrast to the public sector, where requesting mechanisms are common, transparency in the private sector is usually driven by voluntary reporting. While the scope and scale of transparency reporting among the platforms has expanded significantly over the past few years, the mechanism suffers from a key deficiency: it relies on companies’ messaging arms to provide a window into how they are operating. For outsiders, this poses a persistent challenge, since the public can never be certain whether the disclosures they receive present a complete, accurate, and unvarnished picture of what is actually going on. Another problem is that companies struggle to meet the needs of the many stakeholders invested in understanding their work, many of whom lack effective channels to communicate their interests.

In recent years, much has been written about the way the platforms are challenging existing understandings of the division between private and public exercises of power. With the enormous expansion of private sector power over the public discourse, there is a need to reconceptualize what transparency means in the context of the largest online platforms, toward the kind of rules-based, requester-driven process that mirrors what we expect from governments: a Freedom of Information Act (FOIA) for Facebook.

FOIA-ing Facebook

While the vast majority of information requesting mechanisms apply to public sector entities, their use by private entities has some precedent. A number of international financial institutions, for example, have implemented information access policies that operate under a FOIA-like structure. The Internet Corporation for Assigned Names and Numbers (ICANN), a California-based corporation that coordinates the global domain name system, has its own requesting mechanism that is roughly analogous to a governmental model.

To be clear, this is not to suggest that the FOIA should be applied directly to online platforms. For one thing, that law has severe problems and is in dire need of reform. Its disclosure exemptions are also ill-suited to the platforms’ unique context. Rather, what is needed is a new, rules-based process that the platforms could implement, allowing external parties to request information and requiring the platforms to deliver it, subject to appropriate safeguards to protect legitimate interests (e.g., user privacy or the efficacy of moderation systems). Refusals, redactions, or other potential non-compliance could be appealed to an external, arm’s-length oversight structure (say, a “Facebook Transparency Board”), which could rule on disclosures and on broader compliance with transparency rules.

Obviously, the development and implementation of such a mechanism would not be an easy lift. For one thing, the scale of the biggest platforms, and the enormous public interest in their operations, would threaten to overwhelm any such system with a flood of requests. There are a number of ways to mitigate that challenge: imposing modest user fees; allowing for the dismissal of “frivolous” or “vexatious” requests; barring retroactive requests for data that predates the policy; or even vetting requesters, on clearly enumerated grounds, to ensure that the system does not become inundated with commercially motivated queries. Even so, the system would not be cheap to implement. It would require not only staff dedicated specifically to reviewing and responding to requests, but also, likely, a significant restructuring of how information is organized and retained, in order to facilitate this new function while protecting user privacy.

It would also require the companies to accept that relinquishing complete discretion over what is disclosed will inevitably result in the release of information that paints them in a negative light. After all, the point of freedom of information systems is that they are meant to be more resistant to public relations messaging and efforts to control the narrative. For anyone working at these platforms whose job is to mitigate risk, both in terms of public relations harms and actual legal liability, such a shift must sound horrifying, even if it serves the longer-term good of promoting trust, legitimacy, and accountability.

One Step at a Time

One way to ease concerns about such a system would be to start by implementing it at quasi-independent external structures, such as the Facebook Oversight Board and the Global Internet Forum to Counter Terrorism, where the potential risks to user privacy are lower and the range of information subject to request is more limited, since these bodies are siloed off from the companies themselves. The fact that these institutions, particularly the Oversight Board, were created to bolster legitimacy and public engagement also makes them a logical starting point, and a good place to hammer out the conceptual challenges of adapting freedom of information to the platform context, particularly around procedures for access and the appropriate scope of exceptions to disclosure.

Ultimately, however, the goal should be to develop a requesting mechanism which applies to the platforms themselves, at least to the extent of their content moderation function. It may be unrealistic to expect that such a move would ever take place voluntarily. However, several regulatory bodies are already considering avenues to mandate greater transparency in this space. In the US, the conversation around reforming intermediary liability protections includes a number of proposals that would impose new transparency rules. Likewise, the EU’s proposed Digital Services Act contemplates a mechanism for vetted researchers to obtain information from the platforms.

Regardless of whether a better transparency framework is achieved through regulation, voluntary changes by the platforms, or some combination of both, it is important to move to a structure which is rules-based and responsive to external demands. The platforms’ role in the public discourse is too important for debates about their decision-making to be based on guesswork, or the companies’ largesse in delivering scraps of data to a few handpicked external stakeholders. While there are many different definitions of transparency, hopefully we can agree on that.

Michael Karanicolas is the inaugural Executive Director of the UCLA Institute for Technology, Law & Policy, a collaboration between the UCLA School of Law and the Samueli School of Engineering whose mission is to foster research and analysis to ensure that new technologies are developed, implemented and regulated in ways that are socially beneficial, equitable, and accountable. 

This post is adapted from his paper, “A FOIA for Facebook: Meaningful Transparency for Online Platforms,” available on SSRN and forthcoming in the Saint Louis University Law Journal.
