How to explain the use of artificial intelligence
Processing personal data in a transparent way is a fundamental principle of the Personal Data Act. Transparency enables the data subject to exercise their rights and safeguard their interests.
In the sandbox project, we discussed the requirements for informing the data subject about how personal data is processed. We also discussed specific issues relating to the user interface.
What requirements are set for transparency?
The requirement to provide information to the data subject is set out in Articles 13 to 15 of the GDPR. Article 13 regulates what information must be provided when personal data is obtained from the data subject. Article 14 regulates what information must be provided, and when, if the personal data is not obtained from the data subject themselves. Article 15 regulates the data subject’s right of access to personal data about them that is being processed. Article 12 also imposes a general obligation to provide information in a concise, transparent, intelligible and easily accessible form, using clear and plain language.
Regardless of whether or not you use artificial intelligence, certain transparency requirements apply when processing personal data. In summary, these are:
- The data subjects must receive information on how the data is used; the exact content and timing of this information depend on whether the data is obtained from the data subjects themselves or from others.
- The information must be easily accessible, for example on a website, and must be written in clear and intelligible language.
- The data subject has the right to know whether data about them is being processed and to have access to their own data.
- It is a fundamental requirement that all processing of personal data must be done in a transparent manner. This means assessing which transparency measures are necessary for the data subjects to be able to safeguard their own rights.
In the first bullet point there is a requirement to provide information on how the data is used. This includes, inter alia, contact information for the data controller, the purpose of the processing and what categories of personal data will be processed. This is information that is typically provided in the privacy policy.
These duties are directed at the data controller. Where Secure Practice and the employer are joint controllers, they must determine their respective responsibilities for meeting the requirements of the GDPR. The obligation to allocate responsibility follows from Article 26 GDPR. This includes, inter alia, allocating responsibility for informing the data subjects about how they can exercise their rights and about what personal data about them will be processed in the tool.
Do the employees have a right to be informed about the logic of the algorithm?
For automated decisions which have a legal effect or significantly affect a person, specific requirements for providing information apply. Article 13(2)(f) states that in these cases the data controller must provide meaningful information about the underlying logic of the algorithm. The same applies, in accordance with Article 14(2)(g), when the personal data is not obtained directly from the data subject.
Guidelines from the Article 29 Working Party
“Articles 13(2)(f) and 14(2)(g) require controllers to provide specific, easily accessible information about automated decision-making, based solely on automated processing, including profiling, that produces legal or similarly significant effects.
If the controller is making automated decisions as described in Article 22 (1), they must:
- tell the data subject that they are engaging in this type of activity;
- provide meaningful information about the logic involved; and
- explain the significance and envisaged consequences of the processing”
(Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679, page 25.)
The tool in this sandbox project neither produces legal effects for the employees nor affects them significantly. The processing therefore falls outside the scope of Article 22 of the Regulation. The duty to explain the underlying logic in Articles 13(2)(f) and 14(2)(g) applies to processing covered by Article 22. Accordingly, no duty to explain how the algorithm functions follows directly from these provisions.
The project assessed whether the principle of transparency, read in light of Recital 60, could imply a legal duty to explain how the algorithm functions.
According to Article 5(1)(a) of the GDPR, the data controller must ensure that personal data is processed fairly and transparently. Recital 60 highlights that the principle of transparent processing requires that the data subject be informed of the existence of profiling and the consequences of such profiling. The recital refers to profiling in general, and it therefore appears to be somewhat broader in scope than Articles 13(2)(f) and 14(2)(g), which refer to automated decisions with legal or other significant consequences.
Recital 60 in the GDPR
“The principles of fair and transparent processing require that the data subject is informed of the existence of the processing operation and its purposes. The controller should provide the data subject with any further information necessary to ensure fair and transparent processing taking into account the specific circumstances and context in which the personal data are processed. Furthermore, the data subject should be informed of the existence of profiling and the consequences of such profiling. Where the personal data are collected from the data subject, the data subject should also be informed whether he or she is obliged to provide the personal data and of the consequences, where he or she does not provide such data. That information may be provided in combination with standardised icons in order to give in an easily visible, intelligible and clearly legible manner, a meaningful overview of the intended processing. Where the icons are presented electronically, they should be machine-readable.”
The Article 29 Working Party has also given its views on transparency in processing situations that fall outside Articles 13, 14 and 22. The guidelines on transparency emphasise the importance of explaining the consequences of processing personal data, and that the processing must not come as a surprise to those whose personal data is being processed. The guidance on profiling and automated decision-making also supports the view that the obligations to explain the underlying logic in Articles 13 and 14 go beyond the general principle of transparency as discussed in Recital 60.
In summary, it is difficult to see how a legal obligation to explain the underlying logic of the tool in this project, corresponding to the requirements of Articles 13 and 14, can be derived from the Regulation. In any case, the Article 29 Working Party states in the aforementioned guidelines that it is good practice to explain the underlying algorithm, even where the data controller has no duty to do so.
The sandbox also recommends providing information on how the tool from Secure Practice works, as this can help create trust in the AI tool. In the section below, we refer to an example from a focus group of employees, who highlighted the importance of clear and plain information as a prerequisite for providing correct personal data.
How and when is it best to provide information to users?
The GDPR does not regulate in detail how the user interface should be designed. However, as a continuation of the question about the duty to inform, the project also looked at how and when the solution should inform the user.
In the project we discussed, inter alia, specific issues linked to the design of the user interface. An important point was whether employees should receive an explanation of why the AI tool offers them a particular suggestion, for example when they are encouraged to complete a specific training module or take a specific quiz, and how such an explanation should be presented.
A specific example could be an employee receiving a suggestion to complete a certain type of training because they had been tricked by a phishing exercise. Such information reveals something about the underlying logic of the algorithm. It was specifically discussed whether this level of detail might make users feel they are being monitored, which in turn could lead to decreased trust. The arguments in favour of providing this type of information were that the data subjects need it to understand how the data is used, and that this understanding can build trust in the solution.
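To illustrate what such an explanation could look like in practice, the sketch below shows one possible way of attaching a plain-language reason to each training suggestion, so the employee sees why a module is recommended. It is only an illustration: the names, event labels and wording (Recommendation, explain_recommendation, "phishing_exercise_failed") are our own assumptions and do not describe how Secure Practice's tool is actually built.

```python
from dataclasses import dataclass


@dataclass
class Recommendation:
    """A hypothetical training suggestion shown to an employee."""
    module: str        # the suggested training module
    explanation: str   # plain-language reason shown alongside the suggestion


# Hypothetical mapping from the event that triggered the suggestion
# to a clear, non-technical explanation for the employee.
_EXPLANATIONS = {
    "phishing_exercise_failed": (
        "You clicked a link in a simulated phishing email, so we suggest a "
        "short refresher on recognising phishing."
    ),
    "quiz_low_score": (
        "Your last quiz result suggests this topic may be worth revisiting."
    ),
}


def explain_recommendation(trigger: str, module: str) -> Recommendation:
    """Attach a plain-language reason to a training suggestion."""
    reason = _EXPLANATIONS.get(
        trigger,
        "This suggestion is based on your recent activity in the training tool.",
    )
    return Recommendation(module=module, explanation=reason)


if __name__ == "__main__":
    rec = explain_recommendation("phishing_exercise_failed", "Spotting phishing emails")
    print(f"Suggested module: {rec.module}")
    print(f"Why you see this: {rec.explanation}")
```

The point of the sketch is simply that the explanation can be generated from the same event that triggered the suggestion, so transparency towards the user does not depend on a separately maintained notice.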
In the first focus group there was a high degree of willingness to use the solution and share data if it contributed constructively to the goal of better information security in the company. The participants emphasised the importance of clear and plain communication with the employees. It is important to clarify early in the process how the data will be stored and used in the company's work. Uncertainty about how the data will be used increases the risk that employees adapt their answers to what they believe is “correct”, or that they are unwilling to share data at all. This is an interesting finding, because the algorithm becomes less accurate if the data it is based on is inaccurate and does not represent the user's actual situation.
In the focus group with the trade union Negotia, there was a strong focus on transparency in general as a prerequisite for employees to be able to trust the solution. The points emphasised included what information the employer has access to, how the contract with the company is designed, and the importance of involving employees or union representatives at an early stage in the process. It was also pointed out that such a solution may be perceived differently by employees depending on the situation, for example whether they have a high or low degree of trust in the employer. The risk that answers could be traced back to individual employees was also highlighted. This focus group warned against designing questions in such a way that the answers could harm the employees if the employer became aware of them.