Data Protection Impact Assessment – DPIA
If a type of processing of personal data is likely to entail a high risk to people’s rights and freedoms, the data controller must assess the privacy consequences of the planned processing. This applies in particular when new technology is used.
Determining with certainty whether there is a high risk can be challenging. Where there is uncertainty on this issue, the Norwegian Data Protection Authority recommends carrying out a Data Protection Impact Assessment (DPIA). This can be a useful tool for ensuring that the other requirements in the GDPR are met.
The Norwegian Data Protection Authority has produced a list of processing activities that always require a DPIA to be carried out. (The list is based on guidelines from the Article 29 Working Party.) From this list the following elements are relevant to the tool Secure Practice is developing:
- Processing of personal data with AI which is innovative technology.
- Processing of personal data for systematic monitoring of employees.
- Processing of personal data where the purpose is to offer a service or develop products for commercial use which involves predicting job performance, finances, health, personal preferences or interests, reliability, behaviour, location or patterns of movement. (Special categories of personal data or highly personal data and evaluation/scoring).
The sandbox therefore concluded that the use of Secure Practice’s tool requires a DPIA. It is the responsibility of the controller to ensure that a DPIA is conducted. In practice, this means that companies that purchase the new service from Secure Practice must also carry out a DPIA.
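The screening logic above can be sketched in code. This is a purely illustrative aid, not an official NDPA tool: the criterion names and the helper function are hypothetical, and the only rule encoded is the one stated in the text, namely that a DPIA is required if any listed criterion applies.

```python
# Hypothetical DPIA screening sketch. Criterion keys and descriptions are
# illustrative labels for the three items from the NDPA list that are
# relevant to Secure Practice's tool; they are not an official taxonomy.

RELEVANT_CRITERIA = {
    "innovative_ai": "Processing of personal data with AI (innovative technology)",
    "employee_monitoring": "Systematic monitoring of employees",
    "profiling_for_commercial_use": (
        "Predicting e.g. job performance, reliability, behaviour or "
        "interests in order to offer a service or develop products"
    ),
}

def dpia_required(applicable: set[str]) -> bool:
    """A DPIA is required if any listed criterion applies."""
    unknown = applicable - RELEVANT_CRITERIA.keys()
    if unknown:
        raise ValueError(f"Unknown criteria: {unknown}")
    return bool(applicable)

# Secure Practice's tool matches all three relevant criteria:
print(dpia_required({"innovative_ai", "employee_monitoring",
                     "profiling_for_commercial_use"}))  # True
```

In practice a company would still need a human assessment of each criterion; the sketch only shows that matching a single item on the list is enough to trigger the DPIA obligation.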
For many small and mid-size companies it can be demanding to carry out a sufficient DPIA of a tool based on artificial intelligence. Such an assessment requires, inter alia, knowledge of data protection rules and other fundamental rights, of artificial intelligence and the system’s logic, and of the particular conditions in each workplace.
Challenging asymmetry
The asymmetrical relationship between the client and the service provider is often evident in our digital society. To exaggerate slightly, it may occasionally seem like the supplier sets requirements for the client instead of the other way around. An equivalent dynamic can be found between an AI service provider and its client. Here the provider also assumes the role of technical expert, able to highlight both the advantages and the disadvantages of the technology they sell.
The Norwegian Data Protection Authority and Secure Practice agreed that responsible use of AI requires a data controller with a solid information basis, which enables the data controller to carry out the correct assessments. Based on this the sandbox decided to include a DPIA in the project.
It is important to stress that it is not sufficient to assess the data protection impacts of the tool itself. The assessment must also take account of the context the tool will be used in. This context will often vary from client to client, and with where in the country (or the world) the client operates. This means that Secure Practice can undertake part of the investigative work for the client in advance, but the assessments of each specific circumstance must be performed by the client themselves.
It was important for the Norwegian Data Protection Authority to be able to provide effective guidance in the process for assessing data protection impacts, without the advice leaving so little room for manoeuvre that Secure Practice’s ownership of the process was challenged. This was particularly important as the development of the service was in an early phase when the Norwegian Data Protection Authority gave feedback. Secure Practice themselves arranged a workshop for assessing data protection impacts together with the Norwegian Data Protection Authority, and then documented the results.
Feedback from the NDPA
The Norwegian Data Protection Authority gave feedback on various issues linked to potential consequences for data protection and the design of the assessment itself:
- Publishing the DPIA online can be one of several measures to facilitate transparent processing of personal data. Publishing the assessment is not in itself enough to fulfil the duty to inform. The data subject must be informed in a concise, transparent, comprehensible and easily accessible manner.
- It is important to establish procedures ensuring that the privacy policy and the DPIA are updated in parallel. This means that changes and new solutions implemented in the production solution must, where relevant, be reflected and dealt with in both the DPIA and the privacy policy.
- It is important to avoid ambiguous and sweeping wording. The descriptions should be as precise as possible so that it is possible to see what has been assessed. The object of assessment must be clearly stated.
- Messages directed at the client (company) must avoid wording that may be confused with the legal basis for the processing of the employee’s personal data. This was particularly relevant where the client’s agreement with Secure Practice was mentioned, and this could be confused with the reference to Article 6(1)(b) GDPR (“processing is necessary for the performance of a contract to which the data subject is party […]”).
- Threshold values for consequence and probability must take particular account of the risk to each individual data subject; the DPIA does not exclusively concern how many people are affected, but also the consequences for each data subject.
- The Norwegian Data Protection Authority recommended that the risk to the data subject’s freedoms and rights be investigated in more detail with regard to the intended division of responsibility between Secure Practice and the company. This can clarify both the division of roles and responsibilities for the data subject.
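The point about threshold values can be illustrated with a simple probability-times-consequence risk matrix. This is a generic sketch, not an NDPA-prescribed method: the 1–4 rating scale, the threshold value and the function names are all assumptions chosen for illustration. What it shows is that because risk is scored per data subject, a severe consequence for a single person can exceed the threshold even when very few people are affected.

```python
# Illustrative risk-matrix sketch (not an official NDPA method). The 1-4
# scale and the acceptance threshold are hypothetical.

THRESHOLD = 6  # assumed acceptance threshold on a 1-4 x 1-4 scale

def risk_score(probability: int, consequence: int) -> int:
    """Score = probability x consequence, each rated 1 (low) to 4 (high)."""
    if not (1 <= probability <= 4 and 1 <= consequence <= 4):
        raise ValueError("ratings must be between 1 and 4")
    return probability * consequence

def needs_mitigation(probability: int, consequence: int) -> bool:
    """Risk above the threshold must be mitigated for each data subject,
    regardless of how many people are affected."""
    return risk_score(probability, consequence) > THRESHOLD

# A low-probability but severe consequence for one employee still counts:
print(needs_mitigation(probability=2, consequence=4))  # True (score 8 > 6)
```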
The feedback on the DPIA above focuses on the usage phase. Finally, we note that Secure Practice must also assess the data protection impact of the learning phase, for which they are themselves the data controller.