Legal Tech News Review, Week 30 November – 13 December 2020, by Eleni Kozari

EU Council publishes recommendations on encryption

On December 1, 2020, the EU Council published its recommendations on encryption, acknowledging its dual role: (a) encryption is an important tool that fosters trust in communications and digitalisation and enhances the protection of fundamental rights; (b) encrypted communications and encryption tools are increasingly used for criminal purposes in the digital environment.

On the basis of these considerations, the Council seeks to balance the protection that encryption affords to fundamental rights, privacy and the security of communications on the one hand, against important public interests on the other. The latter primarily concern the ability of law enforcement authorities (LEAs) to lawfully access encrypted communications and data in the areas of security and criminal justice. According to the Council, a fair balance should be struck so that LEAs can lawfully access encrypted data for legitimate and clearly defined purposes in the fight against serious and/or organised crime and terrorism, in both the physical and the digital environment.

With a view to upholding the principle of ‘security through encryption and security despite encryption’, the EU Council aims to establish a consistent EU regulatory framework that would enable competent authorities to carry out their operational tasks effectively and lawfully. To this end, all stakeholders involved, i.e. Member States, internet service providers, social media platforms, device manufacturers, etc., are called upon to engage in a joint European dialogue on encryption.

In addition, the Council underlines that, in the context of the EU single market, device manufacturers and service providers operating in the EU could be called upon to develop encryption technologies that address Member States’ needs while maintaining the benefits of encryption. Finally, the Council calls on Member States and EU institutions to engage in technical standardisation processes, e.g. at the ITU, ETSI and 3GPP, and, together with the Commission, to develop technical and operational solutions based on the principles of legality and proportionality.

See more here.

CNIL imposes fines on Google and Amazon for unlawful placement of cookies

On December 7, 2020, the French Data Protection Authority (the CNIL, Commission nationale de l’informatique et des libertés) issued two separate decisions for similar violations: (i) against Google LLC and Google Ireland, for unlawfully placing advertising cookies through the website ‘google.fr’ without users’ prior and informed consent; and (ii) against Amazon Europe, for unlawfully placing advertising cookies through the website ‘amazon.fr’ without users’ prior and informed consent.

The CNIL conducted its investigation against Google on 16 March 2020, while in the case of Amazon the Authority carried out several investigations, including online checks, between 12 May 2019 and 19 May 2020. In both cases, the Authority found that when a user visited the company’s website, cookies, including advertising cookies, were automatically installed on his/her computer without any action required on his/her part.

As regards both Google LLC/Google Ireland and Amazon Europe, this practice was in violation of Article 82 of the French Data Protection Act, which requires the user’s prior consent for the placement of cookies that are not essential to the service.
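By way of illustration only, the sketch below shows one way a site might gate non-essential cookies on prior consent; the helper names and the consent object are hypothetical and are not drawn from the CNIL decisions themselves.

```typescript
// Hypothetical, minimal browser-side sketch: advertising cookies are written
// only after the user has recorded prior, informed consent; essential cookies
// (e.g. a session cookie) may still be set without it.
type ConsentState = { advertising: boolean };

function setCookie(name: string, value: string, maxAgeSeconds: number): void {
  document.cookie =
    `${name}=${encodeURIComponent(value)}; max-age=${maxAgeSeconds}; path=/`;
}

function onConsentRecorded(consent: ConsentState): void {
  if (consent.advertising) {
    // Placed only once consent exists, never automatically on first page load.
    setCookie("ad_id", crypto.randomUUID(), 60 * 60 * 24 * 30);
  }
}
```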

In addition, the Authority found that neither company provided clear and sufficient information about the cookies installed and their purposes. Google’s information banner did not inform users about the cookies that had already been placed on their computers when they visited the website, nor did it explain the purposes of those cookies or the means of refusing them. Amazon’s information banner, for its part, provided only general and approximate information about cookies, with no reference to the users’ ability to refuse them or to the means of doing so.

In the case of Google, the Authority found a further breach of Article 82 of the French Data Protection Act: even when users deactivated the ad personalisation feature, one of the advertising cookies remained stored and active on their computers. The CNIL therefore considered the opposition mechanism provided by the company to be partially defective.

On the basis of these findings, the Authority imposed a financial penalty of 60 million euros on Google LLC and of 40 million euros on Google Ireland, and a fine of 35 million euros on Amazon Europe. The Authority considered all three fines justified in view of the seriousness of the breaches identified.

See the CNIL’s decision against Google here

See the CNIL’s decision against Amazon here

California Department of Justice joins federal lawsuit against Google over anticompetitive practices

On December 11, 2020, the California Department of Justice announced its intent to join the federal lawsuit, initiated by the U.S. Department of Justice, regarding Google’s alleged violations of federal antitrust laws.

According to the lawsuit, Google, in violation of the Sherman Antitrust Act, enters into exclusionary business agreements with device makers and carriers and pays them large sums in order to secure its position as their default internet search engine. In addition, some of the contracts include provisions that preclude similar arrangements with competing companies.

According to the allegations, these anticompetitive practices have secured Google’s monopoly over internet search and search-based advertising, hindering new market entrants from developing innovative solutions that could deliver value to consumers; both innovation and consumers are thereby harmed. It is estimated that almost 90% of all internet searches in the US are carried out via Google, leaving consumers effectively ‘captive’ to the company’s data protection and privacy practices.

See more here

IAB Europe publishes guidance on DPIAs for AdTech

IAB Europe, the European-level association for the digital marketing and advertising industry, has published guidance with practical steps on the Data Protection Impact Assessment (DPIA) process for data processing activities in the context of digital advertising and real-time bidding (RTB).

The guidance seeks to help interested companies introduce and implement the DPIA process within their normal course of product design and development, in line with GDPR requirements. Its purpose is to establish an industry-wide, specific and consistent framework for evaluating and managing the risks associated with these data processing activities.

The guidance explains the concept of a DPIA, when the requirement for one is triggered, the process to be followed and the stakeholders that need to be involved, as well as its purpose under the GDPR. In addition, it includes an illustrative presentation of the risks that commonly arise in the digital advertising industry and sets out potential controls and mitigation measures (Appendices C and B respectively).

Indicatively, identifiers typically used in the industry can facilitate the re-identification of pseudonymous data through ID matching. Salted hashes may be used as a control measure, and any identifiers and identified data should be retained for the shortest period possible. Likewise, greater granularity in the device information collected can raise the risk of identifying a device’s users, so the precision of the collected device data should be limited. Finally, the guidance underlines that special category data may, albeit unintentionally, be collected or deduced by combining other collected data, e.g. location and browsing history. To mitigate this risk, companies should control the sources of the data, filter the nature of the data collected and, where required, avoid storing it in an identifiable form.
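As a purely illustrative sketch of the salted-hash control mentioned above (the guidance does not prescribe a particular algorithm, and the function name, the choice of HMAC-SHA-256 and the salt handling are assumptions), an advertising identifier might be pseudonymised before storage along these lines:

```typescript
import { createHmac, randomBytes } from "node:crypto";

// Secret salt kept outside the stored dataset; in practice it would be loaded
// from a secret store rather than generated at startup.
const SALT = randomBytes(32);

// Return a salted (keyed) hash of an identifier such as a cookie or device ID,
// so that the raw value cannot be recovered or matched without the salt.
function pseudonymiseIdentifier(rawId: string, salt: Buffer = SALT): string {
  return createHmac("sha256", salt).update(rawId, "utf8").digest("hex");
}

// Hypothetical usage with an example cookie identifier.
console.log(pseudonymiseIdentifier("a1b2c3d4-example-cookie-id"));
```

Even so, the hashed values remain pseudonymous rather than anonymous data, so the retention-minimisation point made in the guidance still applies.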

See more here