KEY FINDINGS of The Italian Joint Sector Inquiry Into Big Data


On February 10, 2020 the Italian Authorities for Communications[1], Competition[2] and Data Protection[3] published the findings of a joint sector inquiry (“Investigation”) into the field of Big Data, which lasted almost three years. It was launched on May 30, 2017 to study the functioning and impact of Big Data on the economic, political and social environment and to assess whether the applicable regulatory framework is suitable to address concerns for the protection of privacy, consumers and competition.

Unlike similar studies in other jurisdictions[4] – which have seen the “in silos” involvement of competition authorities separately from other regulators – the Italian Investigation is the first to feature a joint effort by the three regulators of the areas of law most affected by the transition from the analog to the digital economy. The Investigation's greatest added value rests indeed on its interdisciplinary character and on the Italian ability to mix the ingredients together, valorizing each one individually and as a whole.

Definition. The expression “Big Data” is used to describe the collection, analysis and storage of vast quantities of personal and non-personal data generated from different sources. Big Data are characterized by four main features (referred to as “the four Vs”):

  • Volume of the enormous quantities of data collected and processed
  • Variety of the many different types and sources of data gathered
  • Velocity of the processing and elaboration activities
  • Value acquired by processed and elaborated data.

Value chain. The Investigation has portrayed the value chain of Big Data as structured along three levels: (i) data collection (further divided into data generation, collection and retention); (ii) data elaboration (extrapolation, integration, analysis); (iii) data interpretation and decision-making.

The content of the Investigation (122 pages) is vast and articulated and it is thus impossible to thoroughly address all the issues raised in a brief format. In subsequent sections A, B and C we have distilled the key findings of each of the three authorities in their respective remits. In the final section D, we wrap up with a few conclusive observations.

  A. Considerations from the Italian Authority for Communications (AGCOM)

The development of digital markets and powerful platforms does not only call for an investigation of Big Data from a competition and regulatory standpoint but also requires analysis from a constitutional perspective. AGCOM addresses Big Data from that angle too. It observes that market failures in the context of Big Data depend not only on the structure of markets and the dynamics of supply (e.g. market power, network externalities, economies of scale) but also on the dynamics of users’ demand for digital content/services (e.g. framing, prominence, self-confirmation bias, default bias etc.). As illustrated by studies in behavioral economics, these novel distortions too may compromise the competitive environment in digital markets, to the detriment of consumer choice and pluralism.

Considering its competences, AGCOM’s analysis focuses on the impact of Big Data on: (i) the dissemination of information and the protection of fundamental rights; (ii) the functioning and role of online advertising in driving consumers’ information and preferences; (iii) the regulatory framework for audiovisual services and electronic communications, with a focus on regulatory asymmetries between “over the top” players (OTTs) and traditional operators.

A.1         Dissemination of distorted and biased information and threats to democratic process

The creation and supply of informative content is deeply impacted by the advent of Big Data. The Investigation highlights that the traditional journalistic and scientific criteria, which before the digital era acted as gatekeepers of objective criticism, are losing this fundamental role in society, along with their ability to curb disinformation. The concentration of digital markets in the hands of a few platforms (Big Techs or GAFA[5]), which largely distribute informative content generated by a great number and variety of non-professional editors (often users themselves, or even bots and algorithms), leads to a new paradox: the great increase in sources of information brought about by the digital revolution does not lead to an increase in the quality and variety of information but rather the opposite. It seems instead to increase the polarization of views while lessening consumers’ ability to distinguish between objective information and fake news.

The AGCOM warns that these factors threaten fundamental freedoms at the very foundation of the democratic process and must therefore be tackled as a priority through strong policy initiatives and – possibly – new regulatory tools.

The basic concept is that data-driven, zero-pricing digital businesses aim at capturing as much consumer attention as they possibly can (generating so-called “attention markets”). Digital platforms push consumers to produce a vast array of data by offering attractive content and interactive functions (like, scroll, search, etc.). As a result, platforms profile users and provide customized content strictly related to their online “history”. This may severely affect the quality and neutrality of the information users receive, as it leads to phenomena (filter bubbles, self-confirmation bias, anchor effects, echo chambers, group thinking etc.) characterized by a circular mechanism in which individual users reveal, through their actions, the information they are prone towards. The platform’s algorithm then circularly re-proposes content to the same user (or clusters of users) that confirms their opinions and beliefs.

These mechanisms isolate discussion environments and create favorable conditions for microtargeted disinformation campaigns on matters with deep political and cultural impact. Ultimately, through the massive collection and exploitation of data, such campaigns make it possible to actively influence voters, threatening the functioning of democracy. AGCOM thus rightly observes that in the digital era effective competition in communications and differentiation of sources (external pluralism) are no longer enough to ensure real information pluralism and quality. On the contrary, if these issues are not properly addressed, the uncontrolled multiplication of sources might even accentuate self-selection and polarization in the search for and dissemination of information (backfire effect).

So far, to counter these practices AGCOM has implemented, and is further promoting, co-regulatory initiatives with digital operators’ associations and experts aimed at setting best-practice codes, creating ex ante quality benchmarks combined with enhanced transparency obligations, ex post monitoring and fact-checking, and “digital literacy” initiatives against disinformation and hate speech. Regulatory tools already exist in AGCOM’s arsenal[6] and are being updated to also enable the monitoring and tracking of ownership and corporate links between platforms and websites/brands, including the financing of fake content through online advertising. AGCOM also warns that countering online disinformation requires that similar and further regulatory initiatives be tackled at EU level, and it is proactively cooperating with the European Commission and the EU network of national regulatory authorities to that end.

A.2         Big Data & Online advertising: concerns for competition and pluralism

The crucial value of data for advertising purposes – functional to creating specific profiles of consumption habits – pushes online platforms to capture as much consumer attention as possible through the promotion of appealing content. The data so gathered are then marketed to advertisers or intermediaries and monetized, sometimes in breach of privacy rules and individuals’ fundamental rights.

Against this backdrop, in July 2019 the AGCOM, in the context of the competences assigned by sectoral legislation[7], initiated proceedings aimed at assessing the existence of dominant positions in relevant markets within the online advertising sector, as capable of hindering pluralism in the wider “Integrated System of Communications” (ISC)[8]. It is the first proceeding directly concerning online platforms and the correlation between the collection and profiling of data for commercial purposes. The AGCOM has wide statutory powers to impose structural remedies removing dominant positions within the ISC (for the purpose of pluralism)[9], and it expresses willingness to use them to address competitive concerns at various levels of the online advertising value chain, including the collection and profiling of data.

In addition to the aspects of competition and pluralism, the AGCOM points out that the dynamics of the collection of data for advertising purposes can lead to the production and dissemination of disinformation and content contrary to human dignity (hate speech, racial or sexual discrimination, bullying, brutality etc.).

A.3         Regulatory asymmetries and the pursuit of a level playing field between OTTs and offline players

Zero-pricing digital businesses, combined with Big Data and the high level of concentration in digital markets, may enable certain platforms to aggressively expand into traditional markets with potentially disruptive consequences. Regulatory asymmetries between OTTs and “offline” players are often perceived as a cause of unfair competition, as they may unlevel the playing field between them. The Investigation shows that regulatory asymmetry is particularly problematic for competition in the electronic communications and audiovisual services sectors. Indeed, traditional operators in these sectors have long been calling for the approximation of their regulatory and liability regime to that applicable to OTTs.

Electronic communications services. According to the AGCOM’s findings, the need to approximate regulatory regimes in this sector has emerged mostly with respect to the following aspects.

Safeguarding open internet and net neutrality. The rules provided by EU Regulation 2015/2120[10] to safeguard net neutrality principles, which so far have been applied solely towards internet access providers, might have to be extended to guarantee transparency, fairness and neutrality also with respect to “algorithmic” platforms operating in highly concentrated markets. The AGCOM notes however that EU Regulation 2019/1150[11] may have already addressed some of the concerns.

Number-based interpersonal communications. The most widely used internet-based interpersonal communications and messaging services (e.g. WhatsApp, Messenger, etc.) usually require the indirect use of a mobile number as an identifier. This feature makes their functioning in substance equivalent to traditional telephone/SMS services, although under the EU regulatory framework they cannot be characterized as electronic communications services. Certain measures have been introduced at EU and national level to reduce this regulatory asymmetry[12], though OTTs still escape general authorization requirements and thus the access and interconnection obligations applicable to traditional telephone and network operators.

Data protection/retention asymmetry. The pervasive diffusion of powerful OTT platforms, combined with Big Data, has unveiled a source of regulatory distortion relating to different data protection regimes. In this regard, AGCOM observes that traditional communications operators are subject to more stringent obligations on data collection and processing than OTTs, disadvantaging the dynamism and quality of their offering. While traditional operators must request new consent from their customers to provide enhanced data-driven services (e.g. based on geo-localization), OTTs may request a single consent for a bundle of digital services including interpersonal communications.

Audiovisual services. Audiovisual operators have highlighted regulatory asymmetries with respect to OTT content aggregation or hosting platforms, which have no editorial responsibility and do not own the content (e.g. platforms like YouTube, social networks and other hosting providers).

Access to users’ data and issues of data ownership. Audiovisual service/content providers, unlike OTT platforms, find it difficult to access the data generated by consumers viewing their original content through OTTs. Such data remain exclusively available to the platforms, even though the platforms neither own nor have invested in producing the content. This puts traditional audiovisual content providers, where not vertically integrated with powerful OTTs, at a competitive disadvantage, given their greater difficulty in acquiring new customer data and improving their content accordingly. OTTs could indeed leverage this advantage to enter and quickly displace traditional players in the upstream content markets. AGCOM suggests that this problem could be addressed by introducing interoperability and data portability procedures, either by means of contracts between content providers and OTTs (provided that such procedures comply with the GDPR’s consent/legitimate interest requirements) or by means of new regulation on viewers’ data co-ownership.

Editorial liability of OTTs over hosted content. A further concern considered by the AGCOM is that, by organizing, promoting and ranking aggregated content based on Big Data analytics and algorithms, OTTs may also harm democracy and human dignity if not held liable for the illegal content they may diffuse. Many stakeholders have complained that the e-commerce directive[13] exempts hosting platforms from prior monitoring obligations and from liability for illegal or dangerous content they host (except where they fail to remove it ex post). The new Audiovisual Media Services Directive[14] partly recognizes the growing competitive pressure of OTTs (in terms of audience and advertising revenues) on traditional audiovisual service providers in this respect. In particular, it has extended to social media and video-sharing platforms certain rules typically applied to the audiovisual world – namely on the protection of minors and the wider public from violent, offensive or otherwise criminal audiovisual content – where the platforms’ “essential functionality” is the provision (even by mere automatic means or algorithms) of entertainment programmes and user-generated videos[15].

  B. Considerations from the Italian Privacy Authority (Garante or IPA)

From a privacy perspective, the IPA stresses that its task is to limit the possible impact of Big Data on people’s fundamental rights while, at the same time, preserving the benefits that may come from them.

B.1         To what extent the GDPR may help handle the data protection concerns raised by Big Data

The IPA notes that the GDPR[16] does not expressly deal with, and is not entirely fit to tackle, all the issues arising across the whole value chain of Big Data. Certain provisions applicable to the collection phase may effectively protect data subjects from information asymmetries and, indirectly, from the risks related to profiling and automated decisions. Further, the entire processing of personal data must be carried out in compliance (by default and by design) with the principles of lawfulness (consent/legitimate interest), fairness, transparency, purpose/data/storage minimization, accuracy and security, which are at the foundation of the GDPR (Articles 5-6). Nevertheless, the IPA provides no straight answer to the question whether and how the massive collection and processing of data through automated techniques, which intrinsically characterizes Big Data, can be made compliant with these fundamental principles under the GDPR.

B.2         The importance of preserving the quality of Big Data to ensure fair and transparent profiling

The requirements of “trustworthiness” and “quality” of personal data play a decisive role in a GDPR-compliant use of Big Data to profile individuals. In this regard, the IPA invokes effective enforcement of Recital 71 of the GDPR[17]. Indeed, the added value of Big Data, compared to traditional data, consists in the possibility of extracting from personal data more meaningful information to identify behavioral trends, address people’s needs and predict their future decisions. This is made possible by the combined use of tracking and analytics technologies (e.g. cookies and IoT combined with AI/algorithms) enabling the direct observation of data subjects and the automated collection of related data in order to infer other data/information from various datasets (by identifying recurrent statistical-probabilistic correlations).

B.3         The importance of complying with transparency requirements when processing Big Data

Big Data analytics techniques are frequently accused of being opaque: users are often unaware of which data are collected and for what purposes. Therefore, the IPA highlights the paramount relevance of the GDPR’s provisions aimed at ensuring transparency, particularly those requiring to inform data subjects “of the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject”. In this regard, the IPA notes that correct enforcement of the principle of transparency may also significantly affect competition in digital markets, considering that the information given to data subjects may influence consumer choice in the same way labels on goods do.

B.4         Complexities with anonymization and pseudonymization of Big Data

Big Data often do not involve personal data. However, the IPA stresses that it is difficult to draw a hard line between personal and non-personal data, as this dichotomy does not correspond to the actual complexity of Big Data processing. For instance, even when the processing starts from anonymized (i.e. non-personal) data, the outputs may lead to the re-identification (“singling out”) of a person by combining different datasets – and this may also occur unpredictably. In other words, Big Data widen and blur the category of personal data. It is thus crucial that data controllers implement (by default and by design) dynamic[18] technical and organizational measures to: (i) accurately assess whether collected data are personal; and (ii) organize processing systems so as to prevent voluntary or accidental re-identification. In this regard, the IPA examines the differences between existing techniques of data anonymization[19] and data pseudonymization[20].
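The re-identification risk described above can be made concrete with a minimal sketch: an “anonymized” dataset stripped of names is joined to a public register on shared quasi-identifiers, singling out a person. All names, fields and records below are invented for illustration.

```python
# Hypothetical illustration of re-identification by combining datasets.
# A health dataset with no names is linked to a public register on the
# quasi-identifiers (zip, birth_year). A unique match singles a person out.

anonymized_health = [
    {"zip": "20121", "birth_year": 1980, "diagnosis": "asthma"},
    {"zip": "00184", "birth_year": 1975, "diagnosis": "diabetes"},
]

public_register = [
    {"name": "Mario Rossi", "zip": "20121", "birth_year": 1980},
    {"name": "Anna Bianchi", "zip": "00187", "birth_year": 1990},
]

def link(records, register):
    """Join the two datasets on the shared quasi-identifiers."""
    matches = []
    for r in records:
        candidates = [p for p in register
                      if (p["zip"], p["birth_year"]) == (r["zip"], r["birth_year"])]
        if len(candidates) == 1:  # unique match -> the person is singled out
            matches.append((candidates[0]["name"], r["diagnosis"]))
    return matches

print(link(anonymized_health, public_register))  # the first record is re-identified
```

Even though neither dataset is sensitive in isolation, their combination reveals a named person's diagnosis, which is exactly why the IPA insists on measures preventing accidental re-identification.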

Anonymization seeks to prevent: (a) isolating a person within a group; (b) linking data to a certain person in a different database; and (c) inferring new personal information from another dataset. It can be achieved by combining techniques of “randomization” (e.g. shuffling) and “generalization” (e.g. crowding). However, one must find the right trade-off to preserve the utility and value of the anonymized data[21].
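The two families of techniques named above can be sketched on a toy dataset; the band width, field names and records are invented assumptions for the example, not part of the Investigation.

```python
import random

# Toy (age, city) records to be anonymized.
records = [(34, "Milan"), (36, "Milan"), (35, "Rome"), (61, "Rome")]

def generalize(age, width=10):
    """Generalization ("crowding"): replace an exact age with a coarse band,
    so each value is shared by a crowd of people."""
    low = (age // width) * width
    return f"{low}-{low + width - 1}"

def randomize(values, seed=0):
    """Randomization ("shuffling"): permute a column so its values no longer
    line up with their original records."""
    shuffled = list(values)
    random.Random(seed).shuffle(shuffled)
    return shuffled

# Apply both: coarsen the ages, then break the age/city linkage.
ages = randomize([generalize(a) for a, _ in records])
anonymized = [(age, city) for age, (_, city) in zip(ages, records)]
print(anonymized)
```

The trade-off the Investigation mentions is visible here: wider bands and more shuffling mean stronger anonymity but less analytical value in the resulting dataset.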

Pseudonymization is a less incisive and radical measure than anonymization, as it preserves a one-to-one correspondence between the pseudonymized data and the original data. This way, while the immediate identification of data subjects is prevented in the first instance, the collected data keep their statistical and informative value intact. However, the IPA warns that pseudonymized data are still personal data (and thus subject to the GDPR), so pseudonymization cannot substitute for anonymization.
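One common way to implement the one-to-one correspondence just described is a keyed hash: a minimal sketch follows, with an invented key and invented identifiers. Because whoever holds the key can re-link pseudonyms to identities, the data remain personal data, as the IPA warns.

```python
import hashlib
import hmac

# Secret key held by the controller, stored separately from the dataset.
# Key and e-mail addresses below are invented for the sketch.
SECRET_KEY = b"keep-me-separate-from-the-dataset"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed-hash pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# One-to-one correspondence: the same input always yields the same pseudonym,
# so the dataset keeps its statistical value...
assert pseudonymize("mario.rossi@example.com") == pseudonymize("mario.rossi@example.com")

# ...while distinct identifiers map to distinct pseudonyms.
assert pseudonymize("mario.rossi@example.com") != pseudonymize("anna.bianchi@example.com")
```

A plain unkeyed hash would be weaker: anyone could hash a guessed identifier and compare, which is one reason keyed constructions are preferred for pseudonymization.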

B.5         Application to Big Data of the principles of purpose limitation and data minimization

The Investigation highlights that the enforcement of the principles of purpose limitation and data minimization is particularly challenging in the digital era. While the GDPR expressly sets out these principles and the related obligations, it is difficult in practice to apply them to massive data collection and processing through automated techniques, which is typical of Big Data.

Purpose limitation. The IPA warns that anonymization cannot be a stratagem to carry out processing activities incompatible with the purpose initially declared to data subjects. Enforcing the requirement to set the initial purpose as precisely as possible may serve to curb arbitrary and uncontrolled use of Big Data (i.e. without renewed consent from data subjects). However, the IPA recognizes that the added value of Big Data largely relies on the possibility of using data for purposes that cannot be precisely foreseen from the outset.

Data (including risk) minimization. To address this issue, the IPA invokes strict application of the “accountability” principle[22] based on a “by design” approach. Data controllers should implement appropriate policies and checking systems to ensure quality, accuracy and minimized collection of data throughout the processing, based on a case-by-case analysis. To this end, the IPA suggests that controllers systematically precede the processing of Big Data with a “data protection impact assessment”[23] (“DPIA”) and make the DPIA publicly available. Further, controllers should adopt “dynamic” measures to systematically check over time whether the solutions in force are still appropriate and improve them where necessary.

  C. Considerations from the Italian Competition Authority (AGCM or ICA)

The ICA first provides an overview of the structure and main characteristics of data-driven markets. It then analyses various aspects of the complex interplay between use of personal data and privacy rules with market power, consumer protection and competition law, also pointing to the role and impact of public policy on the functioning of digital markets.

C.1         Structure and main characteristics of digital, data-driven markets

The ICA observes that the need to factor Big Data into the antitrust analysis mostly arises for services that either play a central role in the digital ecosystem, to the extent that they could not be provided without the use of Big Data (e.g. online advertising), or are characterized by relevant information asymmetries and by distribution and intermediation activities (e.g. financial and insurance markets, with growing relevance of automotive).

The ICA then stresses that while these services are usually provided in highly concentrated markets that also feature dominant players, the accumulation of Big Data is only one of several factors contributing to these characteristics. The following factors – familiar to antitrust analysis but more relevant in the digital economy – still play a critical role in explaining the market power of digital platforms: (i) the distinctive multi-sided structure of digital markets; (ii) high investments in data analytics and interpretation capabilities (i.e. AI and algorithms); (iii) economies of scale and scope; (iv) network effects; and (v) switching costs. It is the cumulative effect of Big Data together with all these factors that can strongly condition the competitive dynamics of digital markets and lead to “winner takes all” or “market tipping” effects, which ultimately tend to strongly benefit first movers and disincentivize new entrants.

C.2         Big Data as a barrier to entry

Considering the above, personal data can be characterized as a barrier to entry. However, the ICA warns that the extent to which they constitute such a barrier must be measured case-by-case against: (i) the actual relevance of Big Data for the service in question; (ii) the nature, quality and quantity of data needed to compete effectively; and, most importantly, (iii) the number/variety of sources available to attain the information and knowledge generated by such data to compete effectively.

Indeed, the Investigation indicates that, in principle, what generates greater competitive strength is not the data themselves but the additional information and knowledge that can be extracted from them. In certain markets (e.g. online search) the competitive advantage may depend on the availability of a greater volume and variety of data, which, combined with well-established algorithmic and machine learning technologies, generate new knowledge and improved services[24]. Conversely, in other contexts raw data may be easily available in large quantity and variety, and the crucial factor for gaining a competitive edge is the possession of superior and proprietary AI technologies developed through substantial investments in innovation. Some markets require both[25], while others need data of a certain quality rather than quantity. For instance, the ICA notes that in digital advertising/marketing the greater value of real-time data (which can be sourced and replicated indefinitely from various sources) makes the incumbent’s competitive advantage less significant for competition and thus barriers for new entrants less material.

Notably, the Investigation shows that most datasets are somehow replicable, as countless online and offline data are available on the market for different uses, often publicly accessible[26]. Hence, the ICA concludes that only exceptionally may Big Data constitute an “essential input” to operate on a market.

C.3         Novel characteristics of market power and new challenges for antitrust in the digital ecosystem

The ICA takes as a fact that certain strategic digital services are today controlled by dominant companies (i.e. not subject to significant competitive pressure), expressly pointing to the markets for online search, mobile operating systems, social networks and intermediation services in e-commerce. More generally, it points to “systemic” market power caused by the global dimension of these services and the strategic gateway role they play. Such a role would give certain platforms a “decisive influence” over social and economic dynamics on the Internet, by managing access to digital markets and the visibility and reputation of third-party undertakings. In particular, the Investigation identifies the following key contributors to the market power of digital platforms.

Vertical and conglomerate integration. This is considered a distinctive characteristic of leading platforms across the Big Data value chain. Supplying a whole range of services is deemed to entrench the power they hold in each single market where they operate, by combining data generated by the same subjects across different digital sources and behaviors, thus allowing better profiling. This in turn enables them to provide an even greater range of services and to enter new markets with a competitive edge from the outset. Further, the ICA points to the relevant positions held by certain platforms in supplying strategic intermediate services for the acquisition and elaboration of data (e.g. hosting, cloud, analytics etc.), and to the lock-in/lock-out effects that the integration of all these services, combined with limited interoperability, may have on third parties (both customers and potential competitors).

Persistency of market power. The network externalities typical of multi-sided digital platforms, combined with the availability of a large volume of detailed data on users’ behaviors, would make the market power of certain big players difficult to contest for an indefinite timeframe. Such perceived persistency of market power makes the high concentration of digital markets even more problematic in the eyes of the ICA, as it risks discouraging investments in innovation and preventing future market entry.

Novelty of market power. The ICA stresses that the digital economy is challenging basic and consolidated antitrust concepts such as the notions of undertaking, relevant market and market power. It points to the paradox that incumbents in traditional/offline markets (e.g. physical banks) seem to attribute actual market power to digital operators that have not yet entered their markets. Thanks to the greater availability of Big Data and unmatchable technical capabilities to pool and exploit data to their advantage in potentially any sector, certain platforms are perceived as able to disrupt existing positions and become dominant immediately after entry. Competition authorities worry that, lacking ad hoc powers and regulatory tools to intervene ex ante, they may lose the momentum to impede the creation of new dominant positions, which would be extremely difficult to remedy ex post with the existing antitrust toolkit.

C.4         Market power and control of concentrations

The ICA shares with other competition authorities the concern about a gap in merger control legislation, which allows digital incumbents to acquire innovative start-up platforms or technologies before they can thrive and threaten the persistency of their market power. The targets are businesses with little or no turnover yet that are nonetheless very valuable to incumbent platforms because of the strategic value of their data and their innovation potential. Where the aim of such acquisitions is not to give an innovative business a chance to thrive that it would otherwise lack, but rather to impede effective competition in digital services, they are called “killer acquisitions”.

Killer acquisitions may escape antitrust scrutiny because the prior notification and authorization system of most merger control regimes is based on market turnover thresholds, which trigger mandatory notification if exceeded by the parties to the concentration. These thresholds were set at a time when market turnover was a reasonable proxy for the size and potential of firms, whereas the speed of innovation in the digital economy and the value of data generated by zero-pricing services were not even conceivable. The target of a killer acquisition will often not exceed the minimum turnover threshold that triggers a prior notification obligation with any competition authority. Market shares may also be impracticable to calculate in the digital sector, considering that the target’s activity may be an innovative service or platform with no market presence yet.

From the substantive viewpoint, increasing market power by means of mergers and acquisitions in the digital sector raises competition concerns beyond the effects of the concentration on prices. Effects on the competitive level of quality, innovation and privacy of digital services seem more important and more directly affected. Analyzing these types of effects in concentrations creates new challenges for antitrust analysis. Not only is it exponentially more complicated to assess the likelihood of harm to competition; it is also difficult to distinguish between effects on competition, which is the object of antitrust law, and effects on public objectives assigned to other authorities, e.g. the protection of privacy and other fundamental rights.

In line with similar stances expressed by public institutions globally, the ICA pushes for both procedural and substantive changes to merger control regimes at national and international level. First, it supports the introduction of value-based thresholds (i.e. based on the purchase price or assets’ value) in merger control regimes to detect killer acquisitions. Secondly, it calls for a review of the European Commission’s criteria for the antitrust analysis of concentrations, giving greater relevance and clarity to “conglomerate effects” on competition in the digital sector, e.g. with respect to powerful platforms’ ability to leverage Big Data generated by various services to create and dominate new markets. New guidance is also needed for the complex analysis of the potential effects of mergers on non-price parameters (e.g. innovation, quality of services, privacy) or on the collusive potential of pricing algorithms in digital markets. All these complications are compounded by the difficulty of defining relevant markets in the digital economy: in this respect, the Investigation suggests that the centrality of market definition in EU competition law should not prevent addressing the exclusionary effects of conduct[27].

C.5         Characterizing personal data as commodity, price or quality: impact on competitive analysis

Interestingly, while the ICA observes that personal data do not usually constitute an autonomous object of trade but rather an ancillary good in the demand or supply of products/services, it nonetheless analyzes the characteristics of demand and supply of personal data as if they were the autonomous object of a relevant product market. By contrast, it warns that considering personal data themselves as a “commodity” or “currency” might conflict with the philosophy underlying the protection of privacy as a fundamental right.

However, considering personal data as a component of the price or, in the alternative, as a quality feature of digital products/services is deemed a helpful means to address the impact of the commercialization of personal data on consumer welfare and competition. Characterizing personal data as a (non-monetary) price highlights that they may be the sole consideration requested for a product/service, and this enables authorities to protect competition and consumers in “zero-pricing” digital markets.

Nevertheless, the ICA recognizes that the relationship between quality or price and the level of privacy is not univocal. For several services and consumers, a greater collection and matching of data means a more efficient service. Further, the Investigation shows that it is difficult for consumers to assign a specific value to their personal data, since that value is often hidden or unpredictable beforehand: the real value may be revealed only when the data are combined with other datasets at a given point in time, and it may be known only to data controllers.

The ICA also recognizes that the “negotiability” of privacy is at odds with (i) the low level of differentiation of privacy conditions in platforms’ offers of digital services and (ii) fundamental rights relating to the protection of personal data (e.g. the need to comply with Articles 5-6 GDPR). In addition, the ICA notes that assigning the role of price or quality to personal data may “backfire”, as it entails that digital players – assuming they might differentiate their offering based on the level of privacy – could charge real money for “premium”, i.e. more data-protective, services, thus making privacy a “luxury for a few”. In this respect, the Investigation reveals what the ICA calls a “privacy paradox”: while 93% of respondents to the ICA’s surveys declared a strong interest in protecting their personal data, only a third of them refuse to give whatever consent is requested – a distortion attributed to the “free effect” of the offer of digital services.

To conclude, while characterizing personal data as price or quality may – theoretically – help assess the effects of Big Data on competition, it remains exceptionally complicated to determine a minimum benchmark/competitive level of privacy and thus whether platforms’ conducts may cause anti-competitive effects (e.g. unfair prices or degradation of quality).

C.6         The (ambivalent) effects of privacy policy on competition

Along this line, the ICA indicates that privacy rules may have an ambivalent impact on competition. A restrictive approach to data protection may somewhat curb digital incumbents’ market power, but it may also hamper the circulation and use of data across multiple sources and thus strengthen barriers to entry. However, national experience has shown that a coordinated application of sector regulation (e.g. in energy and finance) with privacy rules can help identify the right balance in using access to data to support the dynamics of competition[28]. A full application of the GDPR’s provisions may thus contribute to leveling the playing field between digital incumbents and new entrants.

The Investigation points to three topics in privacy enforcement policy that have a crucial impact on competition: (i) portability; (ii) ownership/management of data; and (iii) purpose limitation combined with consent requirements.

Data portability. The introduction of data portability by the GDPR (Article 20) is deemed in principle a fundamental step to reduce switching costs (i.e. the lock-in risks for users deriving from the accumulation of data on a platform) and to incentivize “multihoming” across platforms. However, the ICA observes a few potential obstacles to the full exploitation of portability. Firstly, consumer surveys have revealed scarce awareness of this right and of how to use it (up to 91%). Secondly, around 40% of users show a low propensity to multihoming for certain services (e.g. social networks) because of the time/effort it would take or the lack of real alternatives. Thirdly, differing standards often make the transfer of datasets across platforms technically unfeasible or unhelpful. The ICA notes that the GDPR provides for the portability of users’ raw datasets and not of the information extracted therefrom; and it only requires “interoperability” of platforms’ systems, which does not necessarily entail “compatibility” of transferred data.
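The distinction between portable raw datasets and non-portable extracted information can be made concrete with a small sketch. This is a purely illustrative toy under our own assumptions – the record fields and function names are invented for the example, not taken from the Investigation or from the GDPR’s text: an Article 20 export would cover the data the user provided, not the platform’s derived analytics.

```python
# Toy model of the Article 20 split: the export covers user-provided raw
# observations, not the platform's inferred (value-added) information.
user_record = {
    "raw_events": [
        {"type": "post", "text": "hello", "ts": "2020-01-15T10:00:00Z"},
        {"type": "like", "target": "item-42", "ts": "2020-01-16T08:30:00Z"},
    ],
    # Derived by the platform's analytics; outside the scope of portability.
    "inferred_profile": {"interests": ["sports"], "churn_risk": 0.12},
}

def portability_export(record: dict) -> dict:
    """Return only the user-provided raw data, as a machine-readable dict."""
    return {"raw_events": record["raw_events"]}

export = portability_export(user_record)
print(sorted(export.keys()))  # ['raw_events']
```

Even a complete export of this kind illustrates the ICA’s caveat: a receiving platform with a different schema may be unable to use the transferred events, since “interoperability” of the export format does not guarantee “compatibility” of the data.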

Ownership and management of data. Some experts heard by the ICA suggest that market dynamics in connection with the circulation of personal data may be improved by defining property rights over personal data. Similarly, others have suggested developing alternative legal and technical architectures for managing personal data, e.g. by passing from the current system based on “centralized” management of data by platforms to a “decentralized” system based on users’ control. However, the feasibility of such a solution is conditional on the ability of users to identify the value of their personal data, which is often not foreseeable ex ante.

Data minimization, purpose limitation and consent requirements. The Investigation highlights that the massive and generalized collection of Big Data inherently conflicts with the principles of data minimization and purpose limitation, which require controllers to obtain new consent from data subjects to use their data for purposes exceeding those initially communicated. Nonetheless, it may be impossible to foresee from the outset how collected data might be used and the utility they may have. Therefore, how the consent requirement is applied in relation to the principles in question may crucially impact competition, as it may significantly affect the ability to swiftly develop new services and expand into new markets. Innovative solutions like “dynamic consent” (by which data subjects can give or withdraw their consent based on the actual use of their data over time) may favor the participation of individuals in the processing of their personal data and reduce information asymmetries, thus enhancing “consumer empowerment” and improving competitive dynamics. Dynamic consent may however require alternative architectures for the ownership and management of data as well as a functional separation of platforms’ activities.
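The mechanics of dynamic consent described above can be sketched as an append-only ledger of per-purpose consent events, queried at the moment of each use. This is a minimal illustration under our own assumptions – the class and method names are invented, and the Investigation does not prescribe any implementation:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentEvent:
    purpose: str
    granted: bool        # True = consent given, False = consent withdrawn
    timestamp: datetime

@dataclass
class DynamicConsentLedger:
    """Append-only, per-subject log of consent events."""
    events: list = field(default_factory=list)

    def record(self, purpose: str, granted: bool) -> None:
        self.events.append(
            ConsentEvent(purpose, granted, datetime.now(timezone.utc)))

    def is_allowed(self, purpose: str) -> bool:
        # The most recent event for a purpose governs; no event means no
        # consent, mirroring the purpose-limitation default.
        for event in reversed(self.events):
            if event.purpose == purpose:
                return event.granted
        return False

ledger = DynamicConsentLedger()
ledger.record("personalized_ads", True)
ledger.record("personalized_ads", False)  # subject later withdraws
print(ledger.is_allowed("personalized_ads"))  # False
print(ledger.is_allowed("analytics"))         # False: never consented
```

The design choice is that consent is a time-stamped state per purpose rather than a one-off checkbox, which is what lets a controller demonstrate, for any processing operation, which consent was in force at that moment.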

C.7         Possible scope of intervention of the AGCM

The Investigation confirms the ICA’s determination to make use of its wide powers to enforce competition and consumer protection law to prevent the harmful consequences that may derive from the misuse or strengthening of market power in the digital environment.

Consumer protection. In recent years the ICA has fined leading zero-pricing digital platforms for allegedly misleading and aggressive practices on the grounds that they collected consumers’ personal data in exchange for services promoted as “free”, without correctly informing them of how their personal data are used and monetized, irrespective of any breach of the consent requirements set forth in the GDPR[29]. The ICA now also seems willing to monitor the personalization of prices and discriminatory conducts based on tracking consumers’ spending behaviors and preferences through algorithms and Big Data techniques. Personalization of services may be greatly beneficial to individual consumers and consumer welfare, though it may also become unfair if it leads to harmful or obscure discrimination based on the use of personal data unknown to consumers, which obstructs their ability to make informed commercial choices. Notably, to effectively enforce consumer protection in digital markets, the ICA calls for legislative interventions to increase its maximum fines and investigative powers in the field.

Competition law. The ICA indicates that conducts entailing the use of Big Data may constitute antitrust infringements (Articles 101 and 102 TFEU or their national equivalents) in several circumstances.

Abusive conducts. According to the ICA, the three abuse investigations carried out by the European Commission against Google in recent years are perfectly consistent with traditional abusive conduct. The Commission only had to frame the peculiarities of a digital commercial practice (e.g. the discriminatory use of search algorithms) within a traditional theory of harm based on exclusionary effects. On this basis, the ICA deems antitrust law sufficiently flexible to address a whole set of conducts in digital markets through an evolutionary application of existing rules and theories of harm.

Notably, while recognizing the greater complexities connected with defining relevant markets in the digital economy, the ICA suggests that enforcement policy could focus more on exclusionary effects than on rigorous market definition. This would make particular sense for conducts leveraging the central role of “irreplicable” Big Data in the possession of a dominant platform, which could simultaneously affect a variety of markets. It however stresses that a refusal to deal or to provide access to data must in any case satisfy the “exceptional” conditions set forth by the case law on “essential facilities” to constitute an abuse. In particular, the following aspects are indicated as relevant to assess the “indispensability” of data to compete on a certain market: (i) the personal or non-personal nature of the data; (ii) whether the data have been collected by the dominant undertaking with the consent of data subjects rather than generated by analytics operations or the integration of various datasets; and (iii) the impact of the GDPR or other regulations on the portability and circulation of the data concerned.

While speculating on hypothetical exclusionary abuses in the digital environment, the ICA interestingly points to conduct consisting in “reducing rivals’ data” (as opposed to conduct “increasing rivals’ costs”). Examples include making it more difficult for competitors to access customers’ data by imposing contractual limitations on the use of certain services or by means of exclusivity agreements with third parties, or other obstacles imposed on customers’ use of competitors’ services. Exploitative abuses, though more complicated to find, are deemed to possibly arise in connection with the imposition on customers of unfair terms of use or the “excessive” collection of personal data. In this regard, despite admitting that it is problematic to assess the competitive level beyond which the collection of personal data may be deemed unfair, the ICA indicates that the category of exploitative abuses may apply to a broader set of circumstances in digital markets than in traditional markets.

Restrictive agreements. The ICA refers to mainstream theories of the potential anti-competitive effects of pricing software or algorithms used by vendors or platforms to monitor and set prices of goods and services on the Internet[30]. Such algorithms may be designed or operated by competitors with the object or effect of automatically colluding on higher resale prices, in which case traditional case law on cartels may apply unproblematically. If, however, such an effect is the result of tacit, anti-competitive interaction between sophisticated algorithms without human intervention or awareness (so-called “robot collusion”), it may be extremely complicated to construe a “meeting of minds” or “common understanding” between competitors, as required by established case law to find an infringement of Article 101 TFEU (or its national equivalent).
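How prices might drift upward with no explicit agreement can be shown with a deliberately stylized toy. The repricing rule below is our own invention for illustration – it is not an algorithm discussed in the Investigation or in the cited literature: each seller unilaterally matches a higher rival price and probes slightly higher when matched, and by reacting only to the rival’s publicly observable price the two algorithms ratchet up to the ceiling.

```python
def next_price(own: float, rival: float,
               ceiling: float = 100.0, step: float = 1.0) -> float:
    """A unilateral repricing rule: no communication with the rival,
    only a reaction to the rival's publicly observable price."""
    if rival > own:
        return min(rival, ceiling)      # follow the rival upward
    if rival < own:
        return max(rival - step, 0.0)   # undercut slightly
    return min(own + step, ceiling)     # matched: probe a higher price

# Sequential best responses: A reacts to B, then B reacts to A's new price.
a, b = 50.0, 60.0
for _ in range(100):
    a = next_price(a, b)
    b = next_price(b, a)
print(a, b)  # 100.0 100.0 — supra-competitive prices, no "meeting of minds"
```

The legal difficulty the ICA describes is visible in the sketch: each rule is individually rational and adopted unilaterally, so there is no communication or agreement to which Article 101 TFEU could straightforwardly attach, even though the joint outcome resembles a cartel price.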

D.           Main takeaways from the inquiry and what to expect

The Investigation shows that privacy law and policy represent the cornerstone of the main issues affecting the protection of competition, consumers and fundamental rights (including, ultimately, democracy) in the era of Big Data. However, privacy law has ambivalent effects on these issues and may cut both ways depending on the specific circumstances. A restrictive enforcement of data protection may raise barriers to the circulation of data and to competition to the detriment of new entrants, while a lax enforcement may favor digital incumbents. There is indeed no straightforward relation between competition and the use of personal data, and no one-size-fits-all solution.

This said, all three authorities seem to agree on introducing “dynamic consent” requirements as a possible means to address a few crucial concerns in their remits. From the perspective of privacy law, dynamic consent would help restore data subjects’ control over their personal data and enforce the principle of purpose limitation. From the perspective of media regulation, it would help level the playing field between OTTs and traditional players by approximating regulatory asymmetries on the use of customers’ data. From the perspective of competition law, dynamic consent could curb the market power of digital incumbents by constraining their ability to leverage Big Data accumulated across a variety of sources and services to cement their dominance in existing or new markets.

In this latter respect, not only would such a solution give consumers better control over their data, but it may also raise a barrier to supply-side substitutability between different digital services, contributing to the functional separation of platforms’ activities. This in turn would help shape and distinguish relevant markets in the digital ecosystem and thus preserve rigorous market definition as a foundation of EU antitrust analysis. Indeed, various authorities across the EU, including the ICA in the Investigation, have proposed to loosen the centrality of market definition to better address abuse and concentration cases in digital markets; but this should be avoided – in our view – as it would undermine the certainty and predictability of antitrust law by making it prone to non-economic or political considerations.

The Investigation also makes very clear that in-silos application of sectoral legislations by different authorities and countries is an obstacle to effectively addressing concerns raised by Big Data.

Competition law, which is often invoked to solve all these concerns, has its own limits and cannot be used to reach public goals that pertain to other political or legislative fields. From another angle, arbitrarily using consumer protection or privacy law to address competition concerns (or vice versa) may cause conflicts of competence between authorities and serious risks of infringing the fundamental ne bis in idem principle (no double jeopardy). Indeed, the very same conduct by the same offender in the context of Big Data may simultaneously breach multiple provisions in any of the three fields of law, each of which may entail quasi-criminal sanctions. Moreover, the legal interests protected by privacy, consumer and competition law (where deemed relevant to find double jeopardy)[31] may coincide or overlap when it comes to consumer welfare and the right to make free and informed choices. To avoid the risk of conflicts or double jeopardy, in the field of Big Data undertakings should be given the opportunity to defend themselves within a single proceeding against interconnected allegations spanning the remits of the three authorities. Procedural reforms in the direction of further integrating the complementary activities and competences of the AGCM, AGCOM and Garante may therefore make sense in this perspective.

The need for a case-by-case analysis also seems increasingly indispensable to avoid mistakes in the assessment of infringements, which might hamper, instead of safeguard, investments in innovation and the benefits they may bring to society. This requires that competent authorities be equipped with the right technical expertise (e.g. data scientists and analysts) to carry out such complex analysis. Finally, certainty of law, and thus consistency and predictability, must also be safeguarded as a fundamental right to be preserved – and this is probably the real challenge posed by Big Data.

In the field of media and communications, the Investigation shows that coordinating legislative and regulatory updates with changes to the GDPR may be necessary to develop new data ownership architectures, which in turn would help improve competition between OTTs and “offline” operators. For example, the rules on the portability and circulation of data must be coordinated with the consent and purpose-limitation requirements for the use of personal data, as well as with the specific consent requirements prescribed by sector regulation.

Big Data have demonstrated more than ever the need for close cooperation between privacy, regulatory and competition authorities at both national and international level, combined with self-regulatory initiatives by stakeholders. Indeed, the illustrated concerns and complications cannot be tackled through a fragmented approach along sectoral or national lines, given the global scale of digital players and, within the EU, the imperative of the Single Market. A holistic and synergistic application of different fields of law, combining sector regulation and common constitutional principles and supported by the technical capabilities and resources to carry out case-by-case analysis, has become essential to correctly and effectively remedy the negative consequences of undertakings’ conduct on competition, consumer welfare and data protection.

As to Italy, the ICA is investigating Amazon and Google for anticompetitive practices in the markets for e-commerce intermediation services and e-mobility services respectively, on the grounds that they may be leveraging Big Data acquired on markets they allegedly dominate to cement the prominence of their own or third partners’ services, thus marginalizing potential competitors[32]. In the near future, the AGCM will also consider possible competitive threats raised by Big Data in the banking, insurance and automotive sectors. Finally, the AGCOM has strong ex ante regulatory powers in the field of the protection of pluralism, by which, theoretically, it could impose both structural and behavioral remedies in the online advertising sector currently under inquiry. These are all crucial cases for understanding future competition enforcement policy in Italy on matters such as market definition and remedies in the field of Big Data.

[1] Autorità per le Garanzie nelle Comunicazioni (AGCOM).

[2] Autorità Garante della Concorrenza e del Mercato (AGCM).

[3] Garante per la Protezione dei Dati Personali (Garante).

[4] Including, inter alia, in UK, Germany, France, Spain and Benelux countries.

[5] Common acronym to refer to Google, Apple, Facebook and Amazon. GAFA(M) includes Microsoft.

[6] Like the “Informativa Economica di Sistema” (IES) – a mandatory monitoring system provided for by national law.

[7] Legislative Decree N. 177/2005 (as amended from time to time) and Article 1 of Law N. 249/97.

[8] It is a bundle of media, publishing and advertising markets statutorily defined by Legislative Decree N. 177/2005.

[9] Article 43 of Legislative Decree N. 177/2005.

[10] Regulation (EU) 2015/2120 of the European Parliament and of the Council of 25 November 2015.

[11] Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019.

[12] AGCOM refers to: (i) the introduction by Article 1(44) of law N. 124/2017 of the obligation for operators that indirectly use national plan’s numbering resources, to subscribe with the Register of Operators of Communications (ROC) managed by the AGCOM; and (ii) the possibility introduced by the new EU Electronic Communications Code for Member States to impose interconnection obligations to OTTs in limited cases an under the principle of proportionality (Recitals 16, 18, 149 and Article 61.2.c of EU Directive N. 2018/1972 of 11 December 2018).

[13] Directive N. 2000/31/EC (see articles 12-15).

[14] As amended by Directive (EU) N. 1808/2018.

[15] See Recital 5 and new Article 28(b) in Directive (EU) N. 1808/2018.

[16] Regulation (EU) N. 2016/679 of 27 April 2016 (General Data Protection Regulation).

[17] Recital 71 of GDPR states that “the controller should use appropriate mathematical or statistical procedures for the profiling, implement technical and organizational measures appropriate to ensure, in particular, that factors which result in inaccuracies in personal data are corrected and the risk of errors is minimized, secure personal data in a manner that takes account of the potential risks involved for the interests and rights of the data subject and that prevents, inter alia, discriminatory effects on natural persons […]”

[18] As opposed to “static” ore on-off measures, “dynamic” measures entail a systematic assessment at different moments in time and scope of the processing of data.

[19] Recital 26 of GDPR sets the principle that GDPR should not apply to anonymized data.

[20] See Recital 28 and Articles 4(5), 6(3)(e), 25 and 32(1) (a) of GDPR.

[21] The ICA gives the example of an epidemiological study, which is more valuable the less the data are randomized or generalized.

[22] Accountability in the framework of the GDPR requires that data controllers must not only be responsible for, but also be able to demonstrate compliance with the principles set forth in the GDPR (Article 5.2).

[23] Article 35 GDPR.

[24] Referring to online search, the ICA also mentions the need to consider increasing and decreasing economics of scale generated by the ratio between quantity, variety and frequency of data collection, citing in this respect the European Commission’s Google Search (Shopping) case (AT.39740).

[25] The ICA makes the example of digital agriculture and refers to the European Commission’s merger decision in Bayern/Monsanto, Case M.8084 (2018).

[26] The ICA refers to the European Commission’s merger cases Facebook/WhatsApp (M.7217) and Apple/Shazam (M.8788), in addition to hearings of Big Tech companies.

[27] The European Commission’s top officials have reported to the press that they are about to launch a public consultation on reviewing the 1997 Market Definition Notice to respond to the new challenges of digitization and globalization.

[28] See in this regard Decision of the Garante N. 39 of 25 July 2007.

[29] Decision of the AGCM N. 27432 in Case PS/11112 (Facebook-sharing data with third parties) of 29 November 2018 (as partially upheld in appeal by the first instance administrative court TAR Lazio). A precedent in line is Decision N. 26597 in Case PS/10601 (Whatsapp – transfer of data to Facebook) of 11 May 2017.

[30] One of the basic texts cited on this topic is A. Ezrachi and Maurice E. Stucke, Virtual Competition – The Promise and Perils of the Algorithm-driven Economy, London 2016.

[31] In the 2012 Toshiba judgment (Case C-17/10), the ECJ set out three cumulative conditions for the ne bis in idem principle to apply in the competition field rather than the two applying to criminal matters: two prosecution cases have to cover the same facts, the same offender and the “same protected legal interest”. However, in its Opinion rendered in a subsequent case (C-617/17), AG Wahl took the view that including the identity of the legal interest as a requisite to apply the ne bis in idem principle to the competition field is no longer justified.

[32] A summary of the ICA’s preliminary concerns is available here and here.
