Interoperability and Platform Monopoly
The case for mandating interoperability between platforms within digital markets.
Happy Holidays! Below is my article on mandatory interoperability. While interoperability creates challenges for preserving privacy and security, it would greatly expand competition in the technology sector. As a result, the internet would become more diverse, innovative, and better for democracy. This policy issue is an emerging subject of research, and I greatly enjoyed delving into existing literature. I hope you can learn something from my exploration.
Since its emergence, the internet has been a font of innovation, individual expression, and community. But today large swathes of the digital sphere have become enclosed within the control of a few dominant platforms. These companies—Amazon, Apple, Google, Meta, and Microsoft—are monopolies, and they have acquired their power by limiting a crucial trait of the internet: its interoperability. To interoperate is to “transfer and render useful data and other information across systems, applications, or components” (Gasser 2015, 2). As the monopolists grew, they began to limit the transfer of data and other information from their platforms in order to entrench their dominance. Low levels of interoperability exclude smaller platforms from sharing in the value of the large network, and as a result, they face many barriers to competition. In the absence of these barriers, normal market processes should limit the size of any one firm, but digital service markets exhibit characteristics which distort competition. When these markets are left to their own devices, oligopolies emerge, and public policy must intervene to create competition. Policymakers have many options to this end, but my focus here is on one. I argue that regulators should mandate interoperability within digital markets to open up platforms and make them once again compete on the merits of their services. This policy is called “mandatory interoperability”. While not the only solution legislators should consider, encouraging a greater flow of information between platforms would remove many of the barriers to competition which currently haunt digital markets. The structure of my argument is as follows: I first explain the technical elements of interoperability, diagnose the anti-competitive features of digital markets, justify mandatory interoperability as a solution to several of them, address major objections, and conclude with concrete policy recommendations.
While smarter and more robust enforcement of antitrust policy has started to reduce anti-competitive mergers, mandatory interoperability must play a strong role in mitigating other limits to competition. As policy analyst Chris Riley has observed, “Competition policymakers are fully aware of this dynamic and increasingly, analyses and reports indicate that the future of regulatory travel will be towards promotion of interoperability” (2020, 1). However, “interoperability” can have many meanings in practice because it is highly contextual and technical in nature. Urs Gasser explains that for digital systems interoperability has four “layers”: data, technological, institutional, and human (2015, 3). For platforms to be interoperable, information must be able to flow between them (the data layer), and it must be understandable and usable within each platform (the technological layer). They and other relevant social systems (e.g., government) also must be able to engage effectively (the institutional layer) and ensure that end-users are able to find the information sent from other platforms meaningful (the human layer). Social media networks are a great example of digital systems which are not interoperable. Meta simply does not permit the sharing of Facebook content with competing platforms like Twitter, and there exists no coordinating body to negotiate or enforce the terms of that content sharing. As the idea of a coordinating body hints, policymakers can help strengthen platform interoperability.
The academic literature proposes two types of data/technology interoperability requirements: standardizing protocols and requiring open Application Programming Interfaces (or “APIs”) (Chao and Schulman 2020, 8). Two definitions are now in order. “Protocols” are rules which govern the way programs exchange information, and APIs are a “set of well-defined ways to interact with a system to get the system to take some action, to get some response from the system, or often both” (Chao and Schulman 2020, 8-9). Where standardization (the first type) would require common protocols across most platforms for a given market, mandating open APIs would permit platforms to have unique protocols which are made publicly available for eligible third-parties (Sharma 2019, 451). The crucial difference is the degree of autonomy platforms possess to determine the rules for exchange of information between computer systems running services in the same or a complementary digital market. Conceptually, one can imagine a spectrum of intervention ranging from giving platforms unilateral control (the status quo) to very little at all (full standardization). Currently, the monopoly problems we see in digital markets occur because dominant platforms have too much autonomy. With no external constraints, Meta can freely limit the interoperability of Facebook by restricting access to its API. The crucial question policymakers must answer when they mandate interoperability is what degree of autonomy it is desirable for platforms to have.
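To make the open-API option concrete, here is a minimal sketch in Python. Everything in it is hypothetical (the platform classes, data formats, and function names are my own illustration, not drawn from the cited sources): two platforms keep unique internal protocols for storing posts, but each publishes an API method with the same documented contract, so a third-party client can interoperate with both without knowing their internals.

```python
# Hypothetical sketch: two platforms with different internal data
# formats ("unique protocols"), each exposing the same documented
# open-API method, get_posts(author) -> list of post strings.

class PlatformA:
    """Stores posts internally as (author, text) tuples."""
    def __init__(self):
        self._posts = [("alice", "hello from A")]

    def get_posts(self, author):
        # Documented open-API contract: return a list of plain strings.
        return [text for a, text in self._posts if a == author]

class PlatformB:
    """Stores posts internally as a dict keyed by author."""
    def __init__(self):
        self._posts = {"alice": ["hello from B"]}

    def get_posts(self, author):
        # Same documented contract, entirely different internals.
        return list(self._posts.get(author, []))

def aggregate_feed(platforms, author):
    """A third-party client that interoperates with every platform
    honoring the open API, without access to their internals."""
    feed = []
    for p in platforms:
        feed.extend(p.get_posts(author))
    return feed
```

In this toy setting, `aggregate_feed([PlatformA(), PlatformB()], "alice")` merges posts from both platforms; under the status quo, a dominant platform could break such a client simply by withdrawing API access.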
My position is that regulators should make this decision on a market-by-market basis. Digital markets are far more diverse and heterogeneous than the manufacturing markets of old (Hovenkamp 2021, 1956). Sweeping policy recommendations are therefore difficult and likely counterproductive. Some digital markets are suitable for standardization while others may benefit from platforms having some freedom to administer their protocols. Decisions must be tailored to a given market because there are real tradeoffs between interoperability and other valid concerns about a greater flow of information. Chao and Schulman advocate for full standardization because it would lead to deeper, more reciprocal, and accessible interoperability (2020, 9-10), but other thinkers have reservations about this strong intervention for its potential harms to privacy, security, and long-term innovation (e.g., Gasser 2015, 13-16; Kerber and Schweitzer 2017, 42). Both sides of this debate have merit, but I see common flaws in their views. Analysts too often focus on expanding the data and technological layers of interoperability and generalize their conclusions. Because digital markets are so diverse, it is more useful for scholars and policymakers to focus their efforts on institutional and human interoperability. More precisely, instead of focusing on the specifics of protocol standards, government is better equipped to establish the infrastructure for agreement and enforcement of those standards. Policy should build institutions to ensure the effective engagement of legal and commercial systems and to hold them accountable for decisions which harm users.
I follow Wheeler and colleagues in their call for a “digital platform agency” (DPA) which would impose a common-law derived “duty to deal” upon dominant platforms and formalize a process for industry, regulators, and interest groups to cohere around “codes of conduct” for a digital market (2020). This DPA would function as a mandatory forum for platforms to negotiate the creation of interoperability standards. In the context of digital systems, a “duty to deal” will require platforms to provide non-discriminatory access to their services through “interoperable interfaces, free flow of data across service providers, and limits on preferencing dominant platforms over competitors” (Wheeler et al. 2020, 42). Though the DPA would have a role in other areas of competition policy (e.g., antitrust), it would serve as the standard-setting organization for the protocols governing information exchange between platforms. The DPA would set and enforce “codes of conduct” for businesses through a “Code Council”, a body of experts, industry representatives, government officials, and advocacy groups who would cooperatively tailor standards to digital markets (Wheeler et al. 2020, 54). This standard-setting process, made mandatory and enforced by the DPA, is the practical implementation of institutional and human interoperability. The Code Council would set standards on a market-by-market basis and tailor limits on platform autonomy to particular market circumstances. This infrastructural approach is more agile and flexible than traditional industrial era regulation (Wheeler et al. 2020, 51). The DPA would guarantee that digital markets are more interoperable, and it would do so in a way that dampens the problems associated with mandating interoperability.
However, before addressing these counterarguments it is useful to draw out the pro-competition benefits of mandatory interoperability. There are four types of barriers to competition in digital markets, and mandating interoperability can address the latter two. The first is well-known: dominant platforms employ anti-competitive business practices like acquisition or predatory pricing to kill or deter new market entrants (Hovenkamp 2021, 2001). The second is more technical: platforms benefit from increasing economies of scope. That is, the more data they collect, the more valuable they are. Counterintuitively, the value of data does not follow the law of diminishing returns (Zingales et al. 2019, 37). Mandatory interoperability can play only a limited role in reducing these barriers. Even if a new social media company can interoperate with Facebook, it would still struggle to compete with incumbents who hoard bigger data, and no level of interoperability will shield the upstart from Meta’s acquisitive treasure chest. In tandem with mandating interoperability, antitrust laws need to be updated and more often enforced, and other competition policies like data portability and pooling also merit consideration (Hovenkamp 2021). The DPA can play a pivotal role in this comprehensive package, in which mandatory interoperability would be a crucial line item. The policy essentially eliminates the third barrier to competition: network effects. These occur when a service becomes more valuable with every additional user or customer (Zingales et al. 2019, 38). Though network effects eventually diminish, the markets which exhibit them tend toward an oligopolistic structure (Zingales et al. 2019, 38). Interoperability would also blunt the effects of the fourth barrier: huge economies of scale. Digital platforms enjoy scale economies because the marginal cost of serving an additional user is close to zero, so there are few natural limits on growth (Zingales et al. 2019, 36). Because data’s value grows with scope, the quality of a service often increases with the scale of the platform (Zingales et al. 2019, 37). Bigger networks mean bigger data, and bigger data improves the quality of the machine learning algorithms which manage the service. Amazon makes better product recommendations because it gets greater and more granular data as its user base expands. Together, network effects and scale economies incentivize platforms to grow extremely quickly so that they become dominant, while smaller competitors lack access to the established network and cannot match service quality. This is the phenomenon of “market tipping”.
Mandatory interoperability would re-write the rules of competition so that market tipping is no longer possible. Higher levels of interoperability move competition onto a common network (Hovenkamp 2021, 1992). When companies cannot interoperate, they compete between networks, which leads to market concentration because only one or a few networks can survive. With greater interoperability, all companies share in a single network’s value, and this value grows as more users join. This point is important: the total value of a common network is greater than the sum value of a few smaller networks which are unable to interconnect (Hovenkamp 2021, 2032). Hovenkamp explains this idea with an example of telephone networks: the collective benefits of a single phone system with 1000 subscribers are greater than the sum value of three systems which have (e.g.) 500, 300, and 200 subscribers (Hovenkamp 2021, 2032). Network effects are the positive feedbacks which emerge from everyone being able to call everyone else, so the entire system is worth more. If the DPA required (e.g.) social media companies to interoperate, then other platforms would share in the wealth of Facebook’s giant network, and social media as a whole would grow in value. Creating a common network would reverse the competitive barrier of network effects because new entrants could establish themselves on the same foundation. A user would be able to switch to an upstart platform without losing the ability to see content from their Facebook friends; she would no longer be “locked in” to the Facebook platform by network effects. As a result, a new entrant can more easily compete for a dominant platform’s existing user base, so scale economies are no longer so devastating. Interoperability reduces the friction of switching between platforms, so no one platform can grow exponentially and leverage bigger data to provide an unmatchable service.
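Hovenkamp's telephone example can be made arithmetic with a simple heuristic: value a network by its number of possible pairwise connections, n(n - 1)/2. This Metcalfe-style measure is my simplifying assumption for illustration, not a formula from the cited sources, but it shows why the unified system is worth more than the sum of its fragments.

```python
def pairwise_connections(n):
    """Heuristic network value: a network of n subscribers supports
    n * (n - 1) / 2 distinct pairwise connections."""
    return n * (n - 1) // 2

# One interoperable network of 1000 subscribers...
unified = pairwise_connections(1000)

# ...versus three networks of 500, 300, and 200 subscribers
# that cannot interconnect.
fragmented = sum(pairwise_connections(n) for n in (500, 300, 200))

print(unified, fragmented)
```

Under this heuristic the unified network supports 499,500 connections against 189,500 for the three fragments combined, even though the total subscriber count is identical; the surplus comes entirely from cross-network connections that fragmentation forecloses.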
Though mandatory interoperability would benefit competition enormously, it is not without its critics. Generally, these analysts warn of three possible issues: harms to privacy, security, and long-term innovation (e.g., Gasser 2015, 21-25; Kerber and Schweitzer 2017, 43). Regulators must weigh these very real pitfalls against the benefits of deeper competition. I first consider the challenges of privacy and security. In the context of digital systems, privacy and security require a measure of control over the transfer of information. Standardization would accelerate and deepen the flow of information online, so the ability to control what information is shared, and with whom, is necessarily lower. Complex online ecosystems simply create more vectors of danger (Gasser 2015, 21-22). This fundamental problem is a strong reason to permit companies some degree of autonomy in determining their protocols. The greatest threats to privacy and security occur when standard protocols are in full effect. When a protocol contains an exploitable vulnerability, strong interoperability mandates systematically spread the backdoor because it is embedded within the standard protocol (Gasser 2015, 10). An important example of this issue is the so-called “Heartbleed” vulnerability in OpenSSL, an implementation of the SSL/TLS protocols which enable encrypted communication across much of the internet. OpenSSL became so popular that it was present on an estimated 66% of the internet (Gasser 2015, 10). This and other instances of privacy and security violations illustrate the real risks of standard protocols, and they may justify a measure of platform autonomy. For markets where greater information sharing might pose strong risks to the security and privacy of personal data, policymakers should impose “weaker” versions of mandatory interoperability.
Furthermore, privacy challenges are not intractable. Most advocates of interoperability argue that steps taken during implementation can reduce vulnerabilities (Gasser 2015, 21). Chao and Schulman make the valuable big-picture point that the internet itself is one vast interoperable system which creates many risks for the privacy and security of personal information (2020, 8). The internet is built on standard protocols like the Transmission Control Protocol (TCP), which governs the exchange of information between devices millisecond-to-millisecond, and the Hypertext Transfer Protocol (HTTP), which determines the way servers send documents and respond to requests (Chao and Schulman 2020, 8). Technologists have built many systems within the constraints of standard protocols like TCP/HTTP which mitigate vulnerabilities to privacy and security. Policymakers can continue this tradition when mandating interoperability. Both Gasser and Chao and Schulman, for instance, suggest that federal legislation regulating the sale of personal information could reduce the market demand for privacy violations (2015, 22; 2020, 19). This example of a complementary policy should reinforce my point about the usefulness of institutional interoperability. In the abstract, any steps which better align the legal, economic, and social systems behind the exchange of information across technical boundaries will help mitigate the potential issues of privacy and security. A DPA would be an excellent guiding voice for the careful implementation of mandatory interoperability.
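To see what a standard protocol buys in practice, consider the sketch below, which assembles a minimal HTTP/1.1 GET request by hand (no network access is made; `example.com` is only a placeholder host). Because the request format is standardized, any conforming server, built by anyone, can parse and answer it; this is the interoperability on which the systems Chao and Schulman describe are layered.

```python
def build_http_get(host, path="/"):
    """Assemble a minimal HTTP/1.1 GET request as a string.
    The format is fixed by the HTTP standard: a request line,
    header fields, then a blank line ending the header section."""
    lines = [
        f"GET {path} HTTP/1.1",
        f"Host: {host}",
        "Connection: close",
        "",  # blank line terminates the header section
        "",
    ]
    return "\r\n".join(lines)

request = build_http_get("example.com")
print(request)
```

Any client that emits this shape and any server that accepts it can interoperate without prior coordination; the protocol itself is the coordinating institution, which is the role a standard-setting body like the proposed Code Council would formalize for platform-to-platform exchange.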
The other major objection to standardization is its potential negative consequences for long-term innovation. Might exacting standards lock in existing technologies to the point of stifling the development of new ones? This problem leads scholars Wolfgang Kerber and Heike Schweitzer to warn against standard protocols, arguing “the possibilities of firms to develop their own specific products and services are limited because they have to comply with standards and interoperability requirements” (2017, 42). The strongest form of this argument holds that standardization will inhibit “disruptive” innovation, i.e. Joseph Schumpeter’s process of “creative destruction” where competitors develop entirely new generations of technologies in order to replace existing platforms and achieve market dominance (Gasser 2015, 11). Full standardization might legally entrench existing technologies and make it harder for upstarts to upend the playing field (Gasser 2015, 11). These points are well-taken, but again they are addressable through careful design of the standard-setting process. Wheeler and colleagues list mandating interoperability as one of the first actions the DPA should take (2020, 45), and they argue that its outcomes-based regulatory approach avoids the worst anti-innovative consequences of mandating interoperability. In the language of Kerber and Schweitzer, the DPA’s negotiation of “codes of conduct” is a standard-setting process, and it is light and cooperative, which allows the resulting standards to evolve with technology (Wheeler et al. 2020, 53). Empirically, this cooperative process has a successful track record of producing high-quality technological standards, especially in oligopolistic markets (Wheeler et al. 2020, 53).
Most criticisms of standard-setting assume a classical microeconomic conception of regulation. In this model, government adds friction to the adoption of new technologies because regulation is rigid and inflexible. Wheeler and colleagues explicitly contrast the DPA’s regulatory approach with this “utility regulation” of older industrial markets (2020, 52). Historically, regulators would micromanage the operations of industrial firms because innovation in these markets was simply slower. There was a large gap between the development of a new technology and its widespread adoption (Wheeler et al. 2020, 50), so the long-run costs of utility regulation were lower. In digital markets, innovation is much more rapid and inexpensive to implement, so the regulatory model needs to change (Wheeler et al. 2020, 51). This is why Wheeler and colleagues call for a strategy of “risk management”; instead of predicting and specifying technical details, regulators will identify risks that new technologies create and oblige platforms to solve them (Wheeler et al. 2020, 51). In microeconomic terms, this agile approach focuses on mitigating and internalizing negative externalities, the costs platforms dump on others as they develop new technologies in the race to compete. Privacy and security are foremost among these externalities, so a primary task of government during negotiation of interoperability standards will be obliging industry to duly weigh the consequences of innovation for privacy and security.
Wheeler and colleagues persuasively argue that the reluctance to impose regulatory oversight upon digital markets has gone too far (2020, 9-10). The reflexive idealization of “permissionless innovation” by economists and policymakers is part and parcel of the much-criticized “move fast and break things” attitude of industry (Wheeler et al. 2020, 9). Caution is a constant theme in the interoperability literature, and it is warranted. Indeed, regulators must pay close attention to the particular circumstances of a digital market before mandating interoperability; however, this caution has descended into fear. While critics of interoperability strongly emphasize regulatory particularism, their concerns about innovation have a whiff of abstraction and generalization. Of course, standards may harm certain digital markets, but three decades into the mainstream internet, many will be sufficiently mature that disruptive innovation is unlikely (Hovenkamp 2021, 1978). Interoperability mandates exist on a spectrum, and while the weight of the evidence points toward the weaker end, one must remember that the status quo is also on that spectrum. Policymakers choose to give platforms almost complete autonomy. While full standardization might be extreme, current policy also goes too far. The status quo is broken; monopolies rule the internet. Attitudes toward regulation are too fearful, and they have given platforms too much autonomy. Concentration is the result.
It is time to move fast and fix things. The extreme lack of interoperability exacerbates many of the problems that so worry critics! When dominant firms close their systems, they foreclose the innovation of complementary services (Kerber and Schweitzer 2017, 45). In addition, when digital markets are not interoperable, competing companies are prevented from offering higher quality or innovative services. In short, lack of competition impedes innovation (Zingales et al. 2019, 74-75). This is the crucial point critics so often miss. When small companies can compete, they invest more in R&D, develop new products and services, and glean better insights from the data they collect (Zingales et al. 2019, 74). Conversely, when monopolies dominate digital markets, potential competitors struggle to find funding, and investment in innovative technologies is lower and directed toward refining existing ones instead (Zingales et al. 2019, 75). Disruptive innovations occur less often because entrepreneurs get more value from being bought out than from the herculean task of competing in the market. Because markets are not competitive, the status quo is not innovative, and this harms privacy and security. New technologies are developed upon an interoperable foundation (as encryption was atop TCP/HTTP), yet these potential improvements are killed in their cradle. Because some competition remains possible within the social networking market, Snapchat was able to pioneer a privacy-protecting messaging service; in its platform, texts and images are not saved unless both sender and receiver give their consent. If Facebook had 15 percent of the market instead of the 60 it currently possesses (Hovenkamp 2021, 1987), new firms could differentiate themselves with more privacy- and security-protecting features. Furthermore, the continued existence of platform monopolies places all our eggs in one proverbial basket.
Security and privacy breaches at the scale of Cambridge Analytica are only possible when a single company is so dominant. The decentralized competition of many small platforms is a more vibrant, secure, and privacy-protecting market. Critics of interoperability gloss over this comparative fact.
As a coda, I offer a concrete example of mandatory interoperability. As has likely been obvious throughout this essay, making social media more interoperable is of particular interest to me. Both Herbert Hovenkamp and the Stigler Center’s Final Report on Digital Platforms cite social media as a strong candidate for mandating interoperability (Zingales et al. 2019, 16; Hovenkamp 2021). As Facebook displaced MySpace in the late 2000s, it began to lower its platform’s interoperability with other competing platforms (Bodle 2011, 332). (Ironically, Facebook itself grew so quickly because MySpace was interoperable with it (Doctorow 2019)). So too, Twitter grew a robust ecosystem of third-party apps like TweetDeck, Tweetie, and Summize which improved upon the existing platform (Chao and Schulman 2020, 16). In recent years, however, Twitter has systematically acquired these apps and restricted access to its API (Chao and Schulman 2020, 16). Both Facebook and Twitter took these actions to capture more value from user attention on their platforms, but they did so at the expense of security, privacy, and innovation. As dominant platforms, Facebook and Twitter do not fully internalize the social benefits of open access (Kerber and Schweitzer 2017, 45). Their anti-competitive actions demonstrate the need for policy intervention. I therefore recommend that a DPA should mandate that dominant social media companies maintain open Application Programming Interfaces. The Stigler Report calls for this exact intervention (Zingales et al. 2019, 16). The DPA should police API procedures to ensure they are non-discriminatory and do not preference dominant platforms over smaller competitors. While this policy is weaker than full standardization, it would greatly boost the interoperability of social media platforms and virtually eliminate network effects.
As a result, greater competition would see the emergence of new, higher-quality services, which could more easily acquire users (Zingales et al. 2019, 16). Users would benefit from mashups of dominant platforms and their messaging systems (Zingales et al. 2019, 16). Alternatively, the DPA could fully standardize social networks on a “social media transfer protocol” as other analysts recommend (Brown 2020). Chao and Schulman (2020) argue that standardization would enable richer feature sets and user choice. However, this stronger interoperability mandate would entail greater risks to privacy and security. Policymakers must determine whether the social networking market is suited to a standard protocol and whether greater competition is worth the tradeoff. Creating a DPA would improve the institutional capacity to make this decision.
The primary focus of technology policy must be promoting competition. When markets are competitive, companies work hard, and citizens see a more vibrant and diverse online information ecosystem. Public policy can and must intervene to make this vision possible. Because policymakers are too fearful, monopolies have come to dominate the internet. They have created closed systems which reduce privacy and security and stifle innovation. Mandating interoperability would break open these closed systems, and novel forms of entertainment, community, and social interaction would arise. People would once again rule the internet, and democracy would be better for it. The time for change is now, and it must start with mandatory interoperability.
Bodle, Robert. 2011. “Regimes of Sharing: Open APIs, interoperability, and Facebook.” Information, Communication & Society 14 (3): 320–37. https://doi.org/10.1080/1369118X.2010.542825.
Brown, Ian. 2020. “Interoperability as a Tool for Competition Regulation.” Preprint. LawArXiv. https://doi.org/10.31228/osf.io/fbvxd.
Chao, Becky, and Ross Schulman. 2020. “Promoting Platform Interoperability.” New America Open Technology Institute. https://docs.house.gov/meetings/JU/JU05/20210225/111247/HHRG-117-JU05-20210225-SD008.pdf.
Gasser, Urs. 2015. “Interoperability in the Digital Ecosystem.” Berkman Center for Internet & Society at Harvard University. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2639210.
Hovenkamp, Herbert. 2021. “Antitrust and Platform Monopoly.” Yale Law Journal 130 (8): 1952–2050.
Kerber, Wolfgang, and Heike Schweitzer. 2017. “Interoperability in the Digital Economy.” Journal of Intellectual Property, Information Technology and Electronic Commerce Law 8 (1): 39–58.
Riley, Chris. 2020. “Unpacking Interoperability in Competition.” Journal of Cyber Policy 5 (1): 94–106. https://doi.org/10.1080/23738871.2020.1740754.
Sharma, Chinmayi. 2019. “Concentrated Digital Markets, Restrictive APIs, and the Fight for Internet Interoperability.” University of Memphis Law Review 50 (2): 68. https://doi.org/10.2139/ssrn.3400980.
Wheeler, Tom, Phil Verveer, and Gene Kimmelman. 2020. “New Digital Realities; New Oversight Solutions in the U.S.: The Case for a Digital Platform Agency and a New Approach to Regulatory Oversight.” Harvard Kennedy School Shorenstein Center on Media, Politics, and Public Policy. https://shorensteincenter.org/wp-content/uploads/2020/08/New-Digital-Realities_August-2020.pdf.
Zingales, Luigi, Guy Rolnik, and Filippo Lancieri. 2019. “Stigler Committee on Digital Platforms: Final Report.” University of Chicago Booth Stigler Center. https://www.chicagobooth.edu/-/media/research/stigler/pdfs/digital-platforms---committee-report---stigler-center.pdf.