SALT (Mobai et al.), exit report: Securing Digital Identities

Legal basis and legal status

Artificial intelligence often requires the processing of large amounts of data, frequently personal information, which is compiled and analysed on a scale that is not possible by other means.

All processing of personal data requires a legal basis to be lawful. Article 6(1) (a-f) of the General Data Protection Regulation (GDPR) contains an exhaustive list of six legal bases for the lawful processing of personal data.

It is natural to split the question of legal basis in two, based on the two main phases of an AI project: the live service phase and the development phase. The development phase can occur before, during and after the service is active, and the two phases often use personal information in different ways.

In this sandbox project, we have taken no position on whether Mobai or the eID providers have a legal basis for processing personal data in the artificial intelligence tool that Mobai is offering (the SALT-solution). This applies both to the use of the SALT-solution for the primary purpose of remote identity verification and any use of personal data for secondary purposes as mentioned in this report.

The discussions in this sandbox project presume that the data controllers, whether Mobai itself or the eID providers, find a legal basis for processing personal data when using and further developing the SALT-solution.

Why discuss legal status

For Mobai, it is important to clarify the legal status of the key data that form part of the solution. This includes facial images, plaintext templates and protected templates.

By “legal status”, we refer to several questions, such as: Is the data processed considered to be personal information subject to the GDPR? At what point is the information considered to be “biometric”? And when does it fall within the special categories of personal data governed by Article 9? These questions are important to clarify – especially in regard to encryption techniques and machine learning – as the GDPR does not apply to data that has been completely anonymised.

Are protected templates personal information or anonymous?

The distinction between personal and anonymous data can be complex. The threshold for when data can be considered anonymous is very high.

It is easier to pseudonymise data, replacing directly identifiable parameters with pseudonyms that still constitute unique identifying indicators. Pseudonymisation can make it more difficult to link a specific data set to the data subject's identity and can therefore be a useful technique for strengthening privacy. However, unlike anonymous data, pseudonymised data is still considered personal information subject to the GDPR.
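To make the technique concrete, the following is a minimal sketch of keyed pseudonymisation in Python. The record fields, the key, and the key-management arrangement are illustrative assumptions, not part of the SALT solution described in this report.

```python
import hmac
import hashlib

# Hypothetical record; field names and values are illustrative only.
record = {"national_id": "01017012345", "service": "remote ID verification"}

# Assumed to be held separately by the controller under access control.
SECRET_KEY = b"held-separately-by-the-controller"

def pseudonymise(identifier: str, key: bytes) -> str:
    """Replace a direct identifier with a keyed pseudonym (HMAC-SHA256).

    The pseudonym is deterministic: anyone holding the key can recompute
    it and link records for the same person, which is why pseudonymised
    data remain personal data under the GDPR.
    """
    return hmac.new(key, identifier.encode(), hashlib.sha256).hexdigest()

record["national_id"] = pseudonymise("01017012345", SECRET_KEY)
```

Because the same key always yields the same pseudonym, linkage across data sets remains possible for the key holder; the technique reduces, but does not eliminate, identifiability.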

For more information about anonymisation and pseudonymisation, see our general guidance.

For protected templates, Mobai must assess whether the associated information is anonymous, pseudonymised or ordinary personal data.

While homomorphic encryption and protected templates make the information incomprehensible to other parties without the encryption key, this does not necessarily guarantee anonymity. As long as the key can be used to connect the template to a real person, it is reasonable to say that it falls within the definition of personal information in the GDPR.

Article 4 of the GDPR supports this view. It defines personal data as information relating to an identifiable person, either directly or indirectly. Even though it is not possible to directly identify someone based on the content in the protected template, the key itself enables indirect identification.

Are facial images biometric data?

Biometric data are defined in Article 4(14) GDPR as “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic [i.e., fingerprint] data”.

The meaning of “which allow or confirm the unique identification of that natural person” in Article 4(14) was discussed in the sandbox project. In particular, it was discussed whether this wording entails that only personal data used for the specific purpose of uniquely identifying an individual fall within the definition. According to the Norwegian DPA this is an overly restrictive reading of the definition; data that are suitable to enable such unique identification also fall within the definition in Article 4(14). Thus, the definition of biometric data in the GDPR is essentially equivalent to the definition of biometric data in the AI Act.

A facial image in itself does not always qualify as biometric data for GDPR purposes. This is made clear in Recital 51 of the GDPR, which states that photographs are covered by the definition of biometric data only when processed through a specific technical means allowing the unique identification or authentication of a natural person.

The term “specific technical processing” is neither defined in the GDPR nor interpreted in case law. However, in our view, the biometric templates generated by the template creation module meet the definition in Article 4(14) GDPR, and should be considered biometric data. We are also of the opinion that the processing operations performed on the facial image to generate the biometric template (i.e., the biometric feature extraction) should be considered processing of biometric data under the GDPR.

Although facial images in themselves are not systematically considered biometric data, the processing of facial images may be subject to the same level of security requirements as biometric data. For example, the creation of a database containing a large number of high-quality facial images belonging to persons with a verified ID is likely to entail a high risk to the rights and freedoms of the data subjects whose facial images are registered in the database. For further information on this issue, see the section below on central storage of biometric information.