Legal Tech News Review, Week 14 December - 31 December 2020, by Eleni Kozari

EDPB Statement on the end of the Brexit transition period

On 15 December 2020, the European Data Protection Board (EDPB) adopted a Statement on cross-border data transfers after the end of the Brexit transition period. From 1 January 2021, the GDPR will no longer apply in the UK, where a separate domestic legal framework on data protection and privacy will supersede it.

Upon the end of the Brexit transition period, the UK will constitute a third country and thus any transfer of personal data between EEA and UK entities will be subject to the provisions of Chapter V of the GDPR (i.e. personal data transfers to third countries). To this end, interested stakeholders should ensure that such data transfers take place with appropriate safeguards, in accordance with Article 46 of the GDPR, unless an adequacy decision is adopted pursuant to Article 45.

Accordingly, from 1 January 2021, the One-Stop-Shop (OSS) mechanism will cease to apply in the UK. Hence, the ICO will no longer act as the Lead Supervisory Authority (LSA) for data controllers/processors established in the UK. Those wishing to continue benefiting from the OSS mechanism may set up a new main establishment in the EEA.

See more here


European Commission’s proposed Digital Services Act

On 15 December 2020, the European Commission released its Proposal for a Regulation on a Single Market for Digital Services (Digital Services Act). The proposed Act aims to address the challenges and risks emerging from innovative information society digital services while retaining the key principles set out in the eCommerce Directive.

The objectives of the Act are to: i) ensure the best conditions for the provision of digital services within the internal market, ii) enhance online safety and the protection of fundamental rights and iii) establish a sound governance and supervision framework for the providers of intermediary services.

The Act comprises five Chapters. Chapter I circumscribes the subject matter and scope of the Act, along with definitions of its key terms. Chapter II clarifies the conditions for the exemption from liability of providers of intermediary services, i.e. mere conduit, caching and hosting. It also stipulates their obligations vis-à-vis orders of national authorities (judicial or administrative) to act against illegal content and to provide information.

Chapter III sets transparency-related requirements for all providers of intermediary services, including the establishment of a single point of contact for direct communication with national authorities. It also stipulates requirements for online platforms, including the obligations to provide an internal complaint-handling system, to engage with certified out-of-court dispute settlement bodies and to inform enforcement authorities of any information giving rise to suspicion of serious offences involving a threat to the life or safety of persons. In parallel, very large online platforms are required to conduct impact assessments of their systemic risks and to manage and mitigate those risks.

Chapter IV envisages the designation of Digital Services Coordinators, i.e. the national authorities designated by the Member States to ensure the consistent application of the Act, and stipulates their powers and tasks. The same chapter provides for the establishment of the European Board for Digital Services, an independent advisory group. In addition, in cases of non-compliance by very large online platforms, the European Commission can adopt non-compliance decisions and impose fines and periodic penalty payments.

Finally, Chapter V deletes certain provisions of the eCommerce Directive, including those on the liability of intermediary service providers (Articles 12-15).

See more here


EU institutions agree on the Cybersecurity Competence Centre and Network

The EU institutions have reached a political agreement on the establishment of the Cybersecurity Competence Centre and Network, to be based in Bucharest, aiming to enhance the EU's cybersecurity capacity and strengthen the safety of the online environment. Formal adoption by the European Parliament and the Council is expected in January 2021.

By establishing the Cybersecurity Centre, the EU aims to promote research, collaboration and knowledge-sharing with public and industry stakeholders. In addition, these stakeholders will be provided with access to significant capacities (e.g. testing facilities). The Centre will focus not only on addressing cybersecurity threats and challenges but also on developing and sharing innovative and effective solutions to them.

Finally, the Cybersecurity Centre and Network will engage in large-scale projects regarding, inter alia, Cyber Threat Intelligence, Cybersecurity hardware and security certification.

See more here


FTC issues orders to nine social media and streaming service providers

On 14 December 2020, the Federal Trade Commission (FTC) issued orders to nine social media and video streaming companies, requiring them to disclose their personal information and advertising practices, as well as the effects of those practices on minors.

The investigation is directed at, amongst others, Amazon, Facebook, Twitter, YouTube and WhatsApp. The FTC issued the orders pursuant to Section 6(b) of the FTC Act, which enables the Commission to conduct large-scale studies that do not pursue a specific law enforcement purpose.

According to the orders, the recipient companies must respond within 45 days on their data collection and management practices, including any tracking processes and how they infer personal information from demographic data. In addition, they should provide information on their advertising and content delivery practices, as well as on how they measure user engagement. Finally, the recipient companies are expected to disclose any algorithmic or data analytics processes they apply to personal information.

See more here


FRA publishes report on AI and fundamental rights

The European Union Agency for Fundamental Rights (FRA) has published a report on the implications that the use of Artificial Intelligence in specific use cases has for fundamental rights.

The FRA acknowledges that, although the benefits of AI have drawn mass attention, its effects on fundamental rights have been assessed only to a limited extent. These effects concern primarily the rights enshrined in the Charter of Fundamental Rights of the EU (Charter) and the European Convention on Human Rights. The EU data protection framework, along with EU non-discrimination legislation, provides additional safeguards in the field of AI.

To this end, the FRA puts forward six propositions aiming to promote the protection of fundamental rights throughout the development and deployment of AI. Acknowledging that the use of AI implicates a wide range of fundamental rights, including privacy, non-discrimination and access to justice, the FRA calls on EU and national legislators to consider sound evidence of AI's impact on fundamental rights before enacting new AI legislation. In addition, the Agency recommends a mandatory assessment of an AI system's impact on fundamental rights, beyond its technical aspects, before its deployment by public or private entities.

Accordingly, the Agency recommends assessing any potential discriminatory effects arising from the deployment of AI systems. Such initiatives could involve discrimination testing and advanced statistical analysis, while the active use of 'explainable AI' could help detect potential discrimination more effectively.
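To make the idea of discrimination testing concrete, below is a minimal Python sketch of one common statistical check, the disparate impact ratio, computed over hypothetical model decisions. The group labels, the data and the 0.8 threshold (the "four-fifths" rule of thumb) are all illustrative assumptions; the FRA report does not prescribe any particular method.

```python
# Minimal sketch of discrimination testing via a disparate impact ratio.
# All data, group labels and the 0.8 threshold ("four-fifths" rule of
# thumb) are illustrative assumptions, not prescribed by the FRA report.

def disparate_impact(decisions, groups, privileged, unprivileged):
    """Ratio of favourable-outcome rates between two demographic groups.

    decisions: 0/1 model outcomes (1 = favourable), aligned with groups.
    """
    def favourable_rate(label):
        outcomes = [d for d, g in zip(decisions, groups) if g == label]
        return sum(outcomes) / len(outcomes) if outcomes else 0.0

    privileged_rate = favourable_rate(privileged)
    unprivileged_rate = favourable_rate(unprivileged)
    return unprivileged_rate / privileged_rate if privileged_rate else float("inf")

# Hypothetical outcomes of a credit-scoring model for two groups.
decisions = [1, 0, 1, 1, 1, 1, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

ratio = disparate_impact(decisions, groups, privileged="A", unprivileged="B")
print(f"Disparate impact ratio: {ratio:.2f}")
if ratio < 0.8:
    print("Potential adverse impact: deeper statistical analysis warranted.")
```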

Finally, the FRA highlights that, in order to ensure effective access to justice, individuals need to know that AI is being deployed, how AI systems make decisions and where they can contest such decisions. To this end, the Agency recommends imposing a duty on both private and public sector entities using AI to provide information explaining the operation of their AI systems.
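As a hedged illustration of what explaining an AI system's operation could mean at the level of an individual decision, the sketch below reports each feature's contribution to the outcome of a hypothetical linear scoring model. The feature names and weights are invented for demonstration; real systems would typically rely on dedicated explainability techniques (such as SHAP or LIME) and on legal guidance about what must be disclosed.

```python
# Minimal sketch, assuming a hypothetical linear credit-scoring model:
# report each feature's contribution to a single decision so that the
# affected individual can understand and contest it. Feature names and
# weights are invented for illustration.

WEIGHTS = {"income": 0.4, "debt_ratio": -0.6, "account_age": 0.2}
BIAS = -0.1

def explain_decision(applicant):
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    score = BIAS + sum(contributions.values())
    decision = "approved" if score > 0 else "declined"
    print(f"Decision: {decision} (score = {score:+.2f})")
    # List features by absolute influence so the main drivers come first.
    for feature, value in sorted(contributions.items(), key=lambda kv: -abs(kv[1])):
        print(f"  {feature}: {value:+.2f}")

explain_decision({"income": 0.9, "debt_ratio": 0.8, "account_age": 0.5})
```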

See more here