Why is antitrust so important for the digital economy? Because the digital economy is ruled by data, including personal data.
The latter relates to privacy. Privacy is not necessarily important to competition regulators in and of itself, but data is. Data can act as a barrier to entry, and it can also be used to foreclose competitors’ access to a market or to leverage market power from one market into another.
These issues go straight to the core of antitrust. It is extremely important to understand how data is collected, how it is used and combined, and how it is monitored. Data gives firms a competitive advantage in almost all markets.
Many of the big tech platforms operate with “take it or leave it” terms and conditions, which allow them to collect both first-party and third-party data. Combining first- and third-party data enables platforms to create super-profiles of users and to surveil them online. What is abusive is not the combination of data per se, but the imposition of data collection and combination as a precondition on users seeking to access a given platform’s service. This not only violates users’ fundamental human right to privacy; where the platform holds a dominant position, it is also an antitrust violation in the form of non-price exploitation and exclusion. Owing to serious underenforcement by privacy and data protection regulators, as well as by antitrust enforcers, certain tech companies have been allowed to develop into monopolies.
The ongoing global debate on antitrust law can hopefully ensure that the pendulum swings to the right level of enforcement – which, given the current regulatory stagnation, can only mean more enforcement than there is now. This, however, requires antitrust enforcers and data protection authorities to co-operate to rein in tech companies.
A Question of Consumer Behaviour
When it comes to the digital economy, it is important to ask “what kind of society do we want to live in?”
Our society is slowly evolving into a dystopia in which current conceptions of privacy are completely lost. Solely blaming the big tech companies would be unjust, as users of the platforms have been part of the problem by allowing their privacy to be eroded: for example, by using technology to monitor their children’s mobile phones or track where they are, and by constantly uploading photos of one another to the internet.
One serious problem in the digital economy is what is referred to as “behavioural surplus”: the tech platforms know what users want before users know it themselves. As a result, users can easily be manipulated, and the platforms have become very good at convincing them to part with their personal data in return for a service. For example, users could easily switch to alternative search engines to Google, such as Bing and DuckDuckGo, but they tend not to do so.
Users voluntarily harm themselves by giving away too much personal data. For this reason, we need a change in consumer behaviour. But this will likely take a very long time. In the meantime, it is clear that competition authorities need to work with privacy and data protection regulators.
Antitrust Enforcement Law in the Digital Economy
The need to create a framework of rules for digital platforms was recently recognised by European Competition Commissioner Vestager, who has stated that we need to enforce the competition rules firmly to stop digital platforms using their power to deny rivals a chance to compete. There is no doubt that enforcement in the digital space has ramped up over the last couple of years. To be fair to the regulators, when the internet and digital economy were first emerging in the 1990s and early 2000s, it was unclear what shape they would take. The explosive growth and innovation in the online economy in that period appeared to validate the idea that markets were best left to themselves. However, allowing private power to go unregulated created three problems: (1) monopolies in certain sectors, (2) surveillance and (3) disinformation – all three devastating for digital rights.
If the internet had emerged in another period, it might have looked very different. But the online economy has developed in an era when we have three chief means of keeping corporate power in check: antitrust, economic regulation and public ownership. That said, these tools have not been used to protect citizens’ digital rights.
Concerns expressed about antitrust enforcement in digital markets may at first look new. But the leveraging-based theories of harm applied in recent and ongoing enforcement against Google and Amazon correspond to well-known “traditional” theories of harm, including abusive tying, leveraging, discrimination and refusal to supply. The notable exception, which is truly novel (at least in a European context), is the use of antitrust to pursue exploitative non-pricing abuses. This can be seen in the German Facebook case, which concerned data exploitation through the company’s terms and conditions.
The Need to Address Business Models of Tech Companies
Big tech companies are often referred to as “GAFA”, “FAANG”, or some other acronym. This is not ideal, as they have very different business models. While Facebook and Google are data-driven businesses dependent on advertising revenue, Apple and Microsoft generate their revenues from hardware and software products. Amazon, on the other hand, operates as a retailer and a marketplace, driven by taking a share of sales revenues. Thus, it is important to distinguish between them: the business models of Google and Facebook are distinct from that of Microsoft, and these companies accumulate power in very different ways.
These different business models, and therefore financial incentives, are critical to both the analysis of competitive effects and remedy design. This is especially true for Google and Facebook, where allegations of anticompetitive behaviour through data use (or misuse) strike at the heart of their business model. This represents a clear paradigm shift for antitrust enforcement, which has traditionally focused on conduct that is ancillary to, rather than at the core of, the business model of investigated businesses. Imposing fines does not appear to have much deterrent effect unless they are very high, so, when intervening in these markets, enforcers need to impose remedies such as preventing Facebook and Google from combining first- and third-party data.
This demands that careful consideration be given to which remedies are appropriate, in circumstances where there is real potential to undermine business models that, notwithstanding concerns raised by authorities, are generally recognised as delivering significant consumer benefits.
Liza Lovdahl Gormsen is Senior Research Fellow in Competition Law and Director of the Competition Law Forum at the British Institute of International and Comparative Law and a Board Member at the Open Markets Institute.