by Sukanya Hosmani

Abstract
In today’s economy, with increasing scrutiny of technology’s impact on consumers and market competition, a key question is whether privacy should be treated as a dimension of competitive practices in the market. In other words, is privacy relevant to the analysis of competitive effects? Competition law incorporates many non-price dimensions of competition, including innovation, quality, variety, service and advertising. One significant non-price effect involving data is privacy. Firms may compete to offer customers better privacy terms than their competitors. However, consumers have vastly different ideas about how or when they want their data to be used. Some find targeted or behavioural advertising invasive, while others appreciate more relevant ads and the free products or services they receive in exchange. There is thus a palpable tension between competition law and privacy. Competition law enforcers generally want as much data sharing as possible, whereas privacy advocates want to limit data sharing. For example, a competition law enforcer may want to facilitate access to data to correct the information asymmetry that arises when one party to a transaction has more or better information than the other, but this may raise privacy concerns if the data includes personal information, which could be exploited or misused.
This paper aims to shed light on the questions that arise when treating data privacy as relevant to the analysis of competitive effects, and on why there is merit in treating privacy as a right and data as more than a mere economic asset.
Introduction
The Personal Data Protection Bill, 2018 was drafted by the Srikrishna Committee after the Supreme Court declared the right to privacy to be part of the right to protection of life and personal liberty in K.S. Puttaswamy v. Union of India.[1] Both the 2018 and 2019 drafts lay emphasis on concerns such as obtaining consent before accessing an individual’s data[2], penalties for violating the law[3], setting up a Data Protection Authority (“DPA”)[4] and storage of the data collected within India.
The bill came against the backdrop of the Aadhaar project, the biggest unique-identification database of citizen data in the world. In this age of datafication, every activity of ours (be it financial transactions or online behaviour) gives away not only our financial records and credit history but also derivative sensitive information pertaining to health, sexual orientation and preferences, religious and political stances, and personality traits. Such a huge database, creating a map of maps of information about individuals, raises crucial security questions. In the digital economy, many questions are raised and analysed with regard to technology’s impact on consumers and on the competitive market. A key question, to which there is no clear answer yet, is: how relevant is privacy to the analysis of competitive effects?
Considering privacy in analysing competitive effects
We know competition law encompasses many non-price elements of competition, such as innovation, quality and service. In the context of personal data protection, the non-price element we consider is privacy. Companies may compete to offer consumers better privacy policies than their competitors. However, consumers have very different ideas about data privacy: some dislike being on the receiving end of targeted advertising, while others find value in ads that are more relevant to them. To make matters murkier, there is a phenomenon known as the ‘Privacy Paradox’: while a majority of users claim to care about their privacy and the need for data protection, in reality they do not act like it.[5] Whenever asked if they value their privacy, the invariable response is a resounding ‘yes’. Yet people continue to use services that undermine this very privacy. We need look no further than Facebook for an example. In 2018, Facebook suffered a massive security breach that compromised nearly 50 million user accounts, with attackers gaining the ability to take over accounts; it is considered the largest data breach in Facebook’s history. Despite this, the number of daily active users worldwide remains huge and continues to grow, and average revenue per user is also rising.[6] Clearly, this serious security incident made no discernible difference to users’ behaviour.
The problem the privacy paradox gives rise to, which ultimately leads to market failure, is two-fold, and both outcomes relate to the role of consent.[7] The first is wilful data negligence: caring about data protection but not actively taking measures towards it (such as reading privacy policies carefully or using more privacy-enhancing tools on one’s devices). The second is a lack of transparency about what companies and organisations do with users’ data. Even users who want to take active steps to protect their data cannot, simply because they do not know what happens to it. Some of these platforms are dominant and almost indispensable to consumers, who have little choice, tend to keep using the same platforms and are unwilling to switch. Such platforms are often compared to utilities in that users feel they cannot do without them and so have little choice but to accept their terms of service.[8] These interrelated problems often combine, so that even when companies publish privacy policies, little to no difference is made to users’ data security.
Companies that make use of users’ data view data privacy laws as a threat to their business and resist them. There was resistance in the EU against the General Data Protection Regulation 2016/679 (“GDPR”), just as there is resistance against data protection reforms in the USA. It is safe to say the issue is not that users do not care about their data privacy; rather, market conditions prevent such legislation from taking hold, as the resulting competition discourages companies from adopting privacy-friendly methods. Data is a crucial component of the business models of digital platforms, and control of data confers market power on such platforms. Furthermore, the economies of scale and scope, data-driven network effects and control of data that giants like Facebook and Google possess create high barriers to entry for new entrants. Establishing a successful platform that can attract sufficient online traffic while abiding by privacy-enhancing methods, such as obtaining the affirmative consent of end-users, is a significant challenge.[9]
EU competition law acknowledges the need to protect competition while leaving room for restrictions that are necessary for the benefit of consumers. This is laid out in Article 101 of the Treaty on the Functioning of the European Union (“TFEU”). While Article 101(1) of the TFEU prohibits agreements and concerted practices between undertakings which may distort competition in the internal market, Article 101(3) exempts them where they promote consumer welfare by fulfilling the four cumulative conditions laid out in it, namely:
- the agreement produces efficiency gains;
- consumers receive a fair share of those gains;
- competition is preserved; and lastly,
- the measures are indispensable to the objective sought to be achieved.[10]
The Commission has issued Guidelines on the applicability of Article 101(3), laying down a set of criteria for determining pro-competitive effects.[11] For instance, efficiency gains may comprise a range of economic benefits, including both cost efficiencies and qualitative efficiencies.[12]
Two sides of the argument on privacy-enhancing data practices
There are two sides to the argument over whether privacy-enhancing data practices increase qualitative efficiencies and hence promote consumer welfare.[13]
* One side of the argument relies purely on economic effects in the market. While the EU Commission has invoked non-economic objectives in its analysis of Article 101(3) in a number of cases where decisions were based on public policy goals[14], a close analysis of those cases reveals that the seemingly non-economic goals are taken into account only to the extent that they can be subsumed under the conditions of Article 101(3).[15] This is best explained by the Commission’s reliance on the ruling of the Court of First Instance (“CFI”) in the Matra Hachette case, which suggests that such factors may only be taken into account in a “supererogatory” manner, where “public policy considerations have been used to supplement the economic benefits which the agreement generates.”[16]
Social media platforms, for instance, can be defined as undertakings operating in a multi-sided market that use the internet to enable interactions between two or more distinct but interdependent groups of users. Because of this interdependency, two-sided platforms need both sides on board in order to operate: without one side of the platform, the other side will not join, and vice versa. Yellow Pages, for example, are more valuable to customers if more companies provide information, and more valuable to companies if more customers see the messages[17]; similarly, free television is more valuable to advertisers if there are more viewers.[18]
There is also a strong belief that obtaining informed consent is extremely hard, especially for developing new products in this age of Artificial Intelligence (“AI”) and Machine Learning Interpretability (“MLI”). As has been stated: “Maybe informed consent was practical two decades ago, but it is a fantasy today. In a constant stream of online interactions, especially on the small screens that now account for majority of usage, it is unrealistic to read through privacy policies. And people simply don’t.”[19] The growth of AI and machine learning algorithms takes us to a new-found level of understanding of our world, and this growth requires heavy use of data. It has been argued that while privacy legislation over the last decade has developed on the assumption that the individual human experience is paramount, the sum of our data is greater than its parts, and we need these individual contributions to keep moving forward.[20]
* The other side argues that privacy protection furthers consumer welfare, as the personal data of individuals represents monetary value and is to be considered counter-performance for so-called “free” digital services. On this view, the collection and use of personal data is not so much a paid price as an objective cost imposed on consumers in the process of digital transactions. A recent proposal for an EU Directive on the supply of digital content has acknowledged that personal data in the modern digital economy can be used to pay for digital content.[21] In the same vein, the Bundeskartellamt noted in its Facebook decision that, as of December 2018, the average revenue per user was estimated at $10.98 in Europe and $17.37 worldwide.[22]
The collection and use of consumers’ personal data has become a vital feature of digital markets and has created significant efficiencies for consumers. It is therefore well accepted that when competition authorities assess the health of competition in these markets, they should consider the benefits consumers receive from digital services: online search, social networks, fast and convenient connections with relevant products, news and entertainment, and real-time information on healthier lifestyle choices.[23] Yet we should be more concerned about the consequences for consumers of each revelation of their data than about what the supplier gains from it. A critical problem for consumers and for the competitive process is that, currently, these costs are hidden and consumers have almost no power to address them. Concealed data practices create objective costs and detriment for consumers. They undermine the competitive process by chilling competition on privacy quality and by increasing inequalities in bargaining power and information asymmetries between suppliers and consumers. It is therefore imperative that competition authorities take these costs and detriments into account when assessing the state of competition.
Conclusion
It needs to be understood that, besides serving as a substitute for standard currency to pay for digital services, personal data has another underlying value, namely privacy. Disclosure of personal data increases the risks of identity theft and fraud, as we have seen in the Facebook case. Indeed, in this post-Cambridge Analytica era, citizens report high levels of concern about the protection of their personal data and the leakage of sensitive information.[24] When companies operating online use privacy-enhancing methods, they create a positive externality for the whole market; instead of focusing on the economic losses caused to a few players, it should be understood that consumer welfare is about more than price. The last decade has seen a global shift away from a purely price-driven understanding of consumer welfare[25], and there is no reason why data protection should be left out of it.
[5] Marco Botta, Klaus Wiedemann, The Interaction of EU Competition, Consumer, and Data Protection Law in the Digital Economy: The Regulatory Dilemma in the Facebook Odyssey, The Antitrust Bulletin (2019).
[6] John Naughton, The privacy paradox: why do people keep using tech firms that abuse their data?, The Guardian, May 5, 2019, https://www.theguardian.com/commentisfree/2019/may/05/privacy-paradox-why-do-people-keep-using-tech-firms-data-facebook-scandal
[7] Id at 1.
[8] Select Committee on Communications, United Kingdom, (2019), Regulating in a digital world, Second report of session 2017–19, (9 March, 2019), https://publications.parliament.uk/pa/ld201719/ldselect/ldcomuni/299/299.pdf
[9] OECD Secretariat, Big Data: Bringing Competition Policy To The Digital Era, 14 Organisation for Economic Co-operation and Development (2016).
[10] Consolidated Version of the Treaty on the Functioning of the European Union, Art. 101, May 9, 2008 O.J. (C115) 47 [hereinafter TFEU].
[11] Ibid.
[12] LawTeacher. Article 101 – Treaty on the Functioning of the EU [Internet]. November 2013. [Accessed 18 May 2020]; Available from: https://www.lawteacher.net/example-essays/article-101-treaty-on-the-functioning-of-the-eu.php?vref=1.
[13] For an effective understanding, the analysis in this section is explained through arguments advanced mainly in the context of social media platforms.
[14] Ford/Volkswagen; Jahrhundertvertrag and VIK-GVSt; EBU/Eurovision System; Fiat/Hitachi
[15] Supra at 10.
[16] Alison Jones, B.E. Sufrin, EC Competition Law: Text, Cases and Materials, pg. 273.
[17] Marc Rysman, Competition Between Networks: A Study Of The Market For Yellow Pages 1-2 (2002).
[18] David S. Evans, The Antitrust Economics of Multi-Sided Platform Markets, 20 Yale J. on Reg. 325 (2003).
[19] Jeremy Owens, The Debate Around Data Privacy is Missing the Point, Towards Data Science, (Jul. 30, 2019), https://towardsdatascience.com/the-debate-around-data-privacy-is-missing-the-point-1fcdc4effa40
[20] Ibid.
[21] European Parliament and the Council Proposal for a Directive on Certain Aspects Concerning Contracts for the Supply of Digital Content, Article 3(1) COM (2015) 634 final (Dec. 9, 2015).
[22] Facebook Decision, B6-22/16, (6 February 2019) Available at: https://www.bundeskartellamt.de/SharedDocs/Entscheidung/DE/Entscheidungen/Missbrauchsaufsicht/2019/B6-22-16.html?nn=359156
[23] See, eg, ‘Common Understanding of G7 Competition Authorities on “Competition and the Digital Economy”’ (July 2019) 3; D Daniel Sokol and Roisin Comerford, ‘Antitrust and Regulating Big Data’ (2016) 23 George Mason Law Review 1130, 1133-1135; Geoffrey A Manne and Joshua D Wright, ‘Google and the Limits of Antitrust: The Case Against the Case Against Google’ (2011) 34 Harvard Journal of Law & Public Policy 171, 203-206. See also David S Evans, ‘Attention Platforms, the Value of Content and Public Policy’ (January 2019) 3, 21-24; Alessandro Acquisti, ‘The Economics of Personal Data and Privacy: 30 Years After the OECD Privacy Guidelines’ (Organisation for Economic Co-operation and Development (OECD), 2010) 8-11.
[24] See Michael M. Harris, Greet Van Hoye and Filip Lievens, Privacy and attitudes towards internet-based selection systems: a cross-cultural comparison, 11 International Journal of Selection and Assessment, pages 230-236 (2003); Jeff H. Smith, Information privacy and marketing: What the US should (and shouldn’t) learn from Europe, California Management Review 43, pages 8–33 (2001).
[25] Ivana Kottasová, Europe is beginning to break up Facebook’s business, CNN, Feb. 8, 2019, https://www.cnn.com/2019/02/08/tech/facebook-european-regulators/index.html (“The German regulator argued that Facebook (FB) controls more than 95% of the country’s social media market, which means users have to choose between its data collection and not using social media.”); FTC Imposes $5 Billion Penalty and Sweeping New Privacy Restrictions on Facebook, FEDERAL TRADE COMMISSION (July 24, 2019), https://www.ftc.gov/news-events/pressreleases/2019/07/ftc-imposes-5-billion-penalty-sweeping-new-privacy-restrictions.
References
1. Botta, M., Wiedemann, K. (2019). The Interaction of EU Competition, Consumer, and Data Protection Law in the Digital Economy: The Regulatory Dilemma in the Facebook Odyssey, The Antitrust Bulletin
2. Naughton, J. (2019, May 5). The privacy paradox: why do people keep using tech firms that abuse their data?, from The Guardian: https://www.theguardian.com/commentisfree/2019/may/05/privacy-paradox-why-do-people-keep-using-tech-firms-data-facebook-scandal
3. Select Committee on Communications, United Kingdom, (2019), Regulating in a digital world, Second report of session 2017–19, https://publications.parliament.uk/pa/ld201719/ldselect/ldcomuni/299/299.pdf
4. OECD Secretariat (2016). Big Data: Bringing Competition Policy To The Digital Era, Organisation for Economic Co-operation and Development
5. Jones A., Sufrin B.E., (2011) Introduction to Article 102. (Oxford University Press) EC Competition Law: Text, Cases and Materials (p. 273)
6. Rysman M, (2002). Competition Between Networks: A Study Of The Market For Yellow Pages, The Review of Economic Studies
7. Evans D.S., (2003) The Antitrust Economics of Multi-Sided Platform Markets, 20 Yale J. on Reg. 325
8. Owens J., (2019, July 30) The Debate Around Data Privacy is Missing the Point, from Towards Data Science: https://towardsdatascience.com/the-debate-around-data-privacy-is-missing-the-point-1fcdc4effa40
9. Harris M.M., Hoye G.V. and Lievens P., (2003) Privacy and attitudes towards internet-based selection systems: a cross-cultural comparison, 11 International Journal of Selection and Assessment, (pp. 230-236); Smith J.H., (2001) Information privacy and marketing: What the US should (and shouldn’t) learn from Europe, California Management Review 43, (pp. 8–33)
10. Kottasová I., (2019, February 8) Europe is beginning to break up Facebook’s business from CNN: https://www.cnn.com/2019/02/08/tech/facebook-european-regulators/index.html
About the Author
Sukanya Hosmani is a final-year B.A., LL.B. (Hons.) student graduating in the midst of the Covid-19 crisis. She is interested in corporate law and in technology, media and telecommunications (TMT) laws. Her interest in data protection deepened when she worked on the white paper on a data protection framework for India and on the Aadhaar case of 2017. She believes that factoring data protection into the way the world functions is a major undertaking, and she remains hopeful that data privacy will come to be understood and practised as an individual human right, not to be superseded by commercial interest.