Legal Tech News Review, Week of 31 August – 6 September 2020, by Eleni Kozari

UK’s Age Appropriate Design Code comes into force

 

On September 2nd, the Age Appropriate Design Code (or Children’s Code) came into force, mandating a higher privacy and data protection threshold for children. Organisations that provide online services and/or products likely to be accessed by minors under the age of 18 fall within the provisions of the Code and should therefore align their online practices accordingly.

Following a risk-based approach, organisations engaged in designing, developing and/or providing online services (e.g. apps), social media platforms, online games, and educational and streaming platforms that process, analyse and profile children’s data are those expected to take the most steps towards compliance. In particular, following a ‘by design’ approach, the Code articulates 15 design standards. To this end, digital service providers shall automatically provide a ‘by design’ baseline of protection for children’s data each time an app, game or website is visited and/or downloaded.

Organisations falling within the Code’s remit are required to comply with its provisions within a transition period of 12 months. The ICO has also invited organisations engaged in cutting-edge children’s personal data projects to apply for participation in its regulatory sandbox, and has created a tech hub to provide them with helpful guidance.

See more here

 

U.S. Appeals Court rules government surveillance program illegal

 

The U.S. Court of Appeals for the Ninth Circuit ruled that the government’s mass surveillance program concerning Americans’ telephone records was unlawful. In particular, the Court found that the warrantless surveillance that secretly collected Americans’ telephone records, including the parties to each communication and communications metadata such as time and location, violated the Foreign Intelligence Surveillance Act and may well have been unconstitutional.

The Court’s ruling comes seven years after Snowden’s revelations. Although Snowden still faces espionage charges, the ruling constitutes a ‘privacy victory’, according to privacy advocacy groups. Although government officials had initially denied allegations that the NSA knowingly collected such data, they later justified the program as necessary to combat domestic extremism.

The Court’s ruling, however, will not affect convictions based on evidence from the surveillance program, since the Court found that the evidence presented in the respective trials did not materially taint the convictions.

See more here

 

EU Commission on the possibility of a ban on facial recognition technology

 

After the EDPS called earlier this year for a temporary moratorium on the use of automated recognition technologies in public spaces, including fingerprint, DNA and other biometric capturing technologies, the head of the Commission’s DG Connect Technologies and Systems for Digitising Industry Unit (Killian Gross) clarified that the EU Commission does not exclude a potential ban on the use of facial recognition technology.

Underlining the concerns highlighted in the Commission’s public consultation on the White Paper on AI, in particular those concerning the deployment of remote biometric identification technology, he stated that the Commission will thoroughly assess the existing legislative framework without excluding any option. Although the GDPR covers the processing of biometric data, the Commission will examine whether it sufficiently regulates data processing in the case of facial recognition technologies.

Should a ban on facial recognition technologies in public spaces be adopted, the Commission will provide the necessary clarifications for its proper implementation.

See more here

 

 

California’s new Genetic Information Privacy Act

 

Bill SB-980, currently pending the California Governor’s signature, would establish the Genetic Information Privacy Act. Upon its enactment, direct-to-consumer genetic testing entities will be subject to a series of data protection and data security obligations.

Entities that sell, analyse or otherwise offer consumer-initiated genetic testing products or services directly to consumers fall within the scope of the Act. However, authorised providers engaged in diagnosis or medical treatment, as well as HIPAA-covered entities, remain exempt from the Act’s remit.

Companies subject to the Act must provide information on their policies and procedures for the collection and processing of genetic data. In addition, they must obtain the explicit consent of the affected individuals, and in particular, separate explicit consent for each distinct data activity.

On top of that, covered entities must provide consent withdrawal mechanisms and ensure that, upon revocation of consent, biological samples are destroyed within 30 days.

 

Furthermore, appropriate security measures and individuals’ rights (such as access to and deletion of their genetic data) are also stipulated. Finally, the Act provides for civil penalties in case of infringement of its provisions.

See more here

 

EDPS publishes guidelines on body temperature checks by EU institutions

 

In light of the COVID-19 pandemic, many EU institutions have implemented body temperature checks as a complementary health and safety measure. Acknowledging that systematic body temperature checks on employees and visitors accessing EU institutions’ premises can interfere with their rights to privacy and data protection, the European Data Protection Supervisor (EDPS) published guidance to facilitate institutions’ compliance with Regulation (EU) 2018/1725.

The EDPS clarified that basic body temperature checks performed manually, without recording or other registration, do not fall within the scope of the Regulation. By contrast, systems of temperature checks operated manually and followed by registration, as well as automated systems and temperature measurement devices, do fall within the Regulation’s remit. For the latter, the EDPS highlighted that Article 1e(2) of the Staff Regulations, together with an implementing decision of the EU entity, could serve as a legal basis for the lawfulness of the underlying data processing.

In addition, the Supervisor provided a non-exhaustive list of appropriate technical and organisational measures. Inter alia, EU institutions should take into particular consideration ‘privacy by design and by default’ measures to minimise the data collected, and should favour privacy-friendly technologies. On top of that, the EDPS recommended that systems deployed for body temperature checks should not be interconnected with other IT systems, such as CCTV; they should operate in real time, and no recording or storage of the health data should take place.

In any case, clear and transparent information should be provided to data subjects in a visible manner, both in terms of size and location, so that they can easily read it. Finally, EU institutions must regularly review and reassess such measures against the proportionality test and the evolution of the pandemic.

See more here