What rules apply?
Anyone who wants to collect, store or otherwise process personal data must comply with the requirements of the General Data Protection Regulation (GDPR). In the following, we examine more closely the fundamental requirement that there must be a legal basis (consent, contract, basis in law, etc.) for processing personal data.
The GDPR applies with equal force in 30 countries in Europe. Norway also has special rules on privacy in working life. These special rules are provided in regulations enacted pursuant to the Working Environment Act. The regulation relevant for Secure Practice is the Regulation of 2 July 2018 on Employer’s Access to Emails and Other Electronically Stored Material, commonly referred to as the “Email Monitoring Regulation”. The Email Monitoring Regulation concerns employers’ access to all electronic equipment employees use in a work context and is intended to protect employees against unnecessarily invasive monitoring or control. Section 2, second paragraph of the Email Monitoring Regulation prohibits the monitoring of employees’ use of electronic equipment. In the sandbox project we have assessed how this affects the tool Secure Practice is developing.
Usage phase
Before Secure Practice offers the tool on the market, it is important to find out whether future clients will be permitted to use it. In the first workshop in the sandbox project, we started by looking at whether an employer that purchases the service has a legal basis for processing the employees’ personal data in this tool. We also examined more closely how the service is affected by the prohibition against employers monitoring employees, as provided for in the Email Monitoring Regulation.
We note briefly that for the processing operations where Secure Practice and the employer are joint controllers, both must have a legal basis for the processing. In this chapter on the usage phase, we have chosen to focus on the employer as the data controller.
What personal data in the AI tool was relevant and assessed in the sandbox project?
Before we assessed the legal basis, Secure Practice listed all possible data that may be valuable for mapping employees’ interests and knowledge within information security. The purpose of the exercise was to assess more closely what data is relevant and to decide what is acceptable to use from a privacy perspective.
The data Secure Practice assessed to be both desirable and at the “right” end of the privacy scale included data on the completion of e-learning, phishing exercises and reporting from the MailRisk service, as well as answers to surveys and quizzes about knowledge and habits related to information security.
Next, data that might be borderline with regard to both privacy and usefulness was listed. This data was the most important to discuss: for example, demographic data such as the employees’ age and seniority, and data from other security systems such as web logs.
Finally came data that, from a privacy perspective, was never actually relevant to use, but which was nevertheless included in the original list due to its potential usefulness. Examples in this category include psychological tests and data collected from social network profiles.
Brief explanation of the relevant legal basis – legitimate interests
Secure Practice and the Norwegian Data Protection Authority assessed that “legitimate interests” in accordance with GDPR Article 6(1)(f) was the most relevant legal basis in this project.
The provision sets out three conditions, all of which must be met for the processing to be lawful:
- The purpose of the processing must be linked to a legitimate interest
- The processing must be necessary to achieve the purpose
- The employees’ interests, rights and freedoms must not override the employer’s interests. In short, we call this step the “balancing of interests”.
Condition No. 1 - legitimate interest
As noted above, the purposes of using the tool are twofold:
- To give employees individually adapted training in information security.
- To provide the companies with statistical reports that describe employees’ knowledge and interest levels in information security at a group level.
Both these purposes are linked to the company's interest in improving information security. We assessed better information security to be in the interests of the company itself, the employees, third parties such as clients and partners, and society as a whole. It was easy to conclude that improving information security constitutes a legitimate interest and that the first condition is therefore met. The discussions in the project therefore concerned the last two conditions.
Condition No. 2 - necessity
Useful questions when assessing which processing is necessary:
- Will the processing of this personal data actually help achieve the purposes?
- Can the purposes be achieved without processing this personal data, or by processing less of it?
The employees’ knowledge and interest in information security must be mapped to achieve both purposes – both individually adapted security training and statistical reporting for the companies. The discussions in the sandbox concerned how Secure Practice can minimise the use of personal data and ensure that the data used actually helps to achieve the purposes.
We formed focus groups of employees from the trade union Negotia and a large company in order to gain potential users’ perspective on the assessments. The focus groups provided interesting insight into how apparently appropriate data can prove to be the exact opposite.
One of the methods Secure Practice wants to use for the mapping is surveys. In these, employees must take a position on various statements, such as “I have deliberately broken information security regulations at work”. The focus group of representatives from the trade union commented that it is unclear what consequences answering such a question could have for an individual employee who had actually broken the rules. The second focus group emphasised the importance of anonymity towards the employer for being able to give an honest answer.
Regarding the purpose of individually adapted training in information security, this question will not help to achieve the purpose if employees who have broken security rules answer no because they are afraid of the consequences of answering yes.
As regards the purpose of creating statistical reports for the companies, the question viewed in isolation could contribute to achieving the purpose, as long as some of the employees answer yes. Such feedback could suggest to the company that the internal regulations on information security are badly designed or do not fit into the everyday working life of the employees. The example illustrates that the data controller must assess the legal basis separately for each purpose the processing is intended to serve.
As stated in point 4.4 of this report, Secure Practice was advised to reformulate some of the questions in order to elicit more honest answers.
Once Secure Practice has identified the personal data necessary for the tool to provide individual training and aggregate reporting to the employers, the next step is a balancing of interests.
Condition No. 3 - balancing of interests
Under the third condition, the employer cannot introduce measures if the employees’ interests, rights and freedoms carry more weight than the employer’s interests. In short, a balancing of interests is about finding a balance between the interests on both sides, so that any invasion of privacy is proportionate. In order to carry out this balancing of interests, we started by investigating how the employees are affected by the tool.
Employees are in an uneven power relationship with the employer. It is therefore especially important for employees that personal data about them is not misused for new purposes. An employee who is “tricked” during a phishing exercise expects the result only to be used for training purposes and not to assess what assignments or benefits he or she receives. But if the employer gains access to this information, there is a risk of such misuse.
A distinct feature of solutions that use artificial intelligence is that they often process large quantities of personal data. The tool we are discussing in this project is meant to map both knowledge and interest, and is suited to a detailed survey of the employees. This can be perceived as invasive. Artificial intelligence solutions can also produce unexpected results.
We assessed consideration for the employees to carry a great deal of weight, meaning greater demands are placed on the interests of the employer. On the other hand, the interest in better information security also carries a great deal of weight. As mentioned in the point on legitimate interest, better information security is an interest that benefits the individual employees, not solely the employer.
When balancing the interests linked to information security and consideration for the employees’ privacy, we assessed these points in particular:
- How the employees will perceive the mapping and what consequences it might have for them. Positive consequences may be that they receive personalised help and follow-up to strengthen their skills in information security, and that they avoid being affected by fraud and hacking with the consequences this may entail. Potential negative consequences may be that employees feel unsure about how data about them will be used, and whether there may be negative consequences if they reveal little knowledge or interest in information security.
- How the employers involve the employees before they introduce the tool.
- What information about each employee the employer has access to.
- How Secure Practice provides information to employees in the tool and in the privacy policy.
- What technical guarantees are built in to prevent outsiders from gaining access to personal data on the employees.
As regards the bullet points on possible consequences for the employees and what information about individual employees the employer has access to, Secure Practice worked on the assumption that the employer should not have access through the tool to measurements of each individual. The sandbox’s recommendation is to introduce both legal and technical guarantees to ensure that information about the employees does not go astray.
By legal guarantees we mean that Secure Practice should include provisions in the contract that employers do not have access to information on individual employees – not even upon special request. By technical guarantees we refer to the measures Secure Practice has already implemented to prevent the employer or others from gaining access to this information. The service uses both pseudonymisation and encryption to protect personal data. The name of the employee is replaced with a unique identifier, and the link between the name and the identifier is stored in a separate database. Secure Practice has also identified a need to store the user’s email address in proximity to the user’s other information. A personal email address establishes a clear link to an individual and will therefore challenge the original objective of a pseudonymised database.
In order to accommodate this challenge, Secure Practice has chosen to encrypt email addresses, names and other direct identifiers in the database itself, so that these are no longer available in the clear. The keys used to encrypt and decrypt such identifiers are stored separately from the database. In this way, anyone who gains access to the user database will not also gain access to the information necessary to link the personal data to individuals.
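To make this pattern concrete, the sketch below shows pseudonymisation combined with encryption of direct identifiers, with the key held outside the user database. It is a minimal illustration only: the library choice (Python’s cryptography package), the function names and the record layout are our assumptions, not Secure Practice’s actual implementation.

```python
import uuid
from cryptography.fernet import Fernet  # third-party "cryptography" package

# The key is generated once and held separately from the user database,
# e.g. in a dedicated key store (simplified to a local variable here).
key = Fernet.generate_key()
cipher = Fernet(key)

def pseudonymise(name: str, email: str) -> dict:
    """Replace direct identifiers with a random ID and encrypt the originals."""
    return {
        # unique identifier that replaces the name in the user database
        "user_id": str(uuid.uuid4()),
        # direct identifiers are stored encrypted, never in the clear
        "name_encrypted": cipher.encrypt(name.encode()),
        "email_encrypted": cipher.encrypt(email.encode()),
        # survey answers, phishing results etc. would be keyed to user_id only
    }

def reidentify(record: dict) -> tuple[str, str]:
    """Re-identification is only possible with the separately stored key."""
    return (
        cipher.decrypt(record["name_encrypted"]).decode(),
        cipher.decrypt(record["email_encrypted"]).decode(),
    )
```

The point of the design is that a compromise of the user database alone reveals neither names nor email addresses; linking the data back to individuals requires the separately stored key as well.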
When both legal and technical guarantees are in place, the sandbox assesses the risk of the employer or others being able to access and possibly misuse the personal data the tool collects to be low. This contributes to the balance of interests tipping in favour of the data controller. It will therefore be possible to use legitimate interest as a legal basis in the usage phase.
The right to object
When legitimate interest is used as a legal basis, employees have the right to object to the processing of personal data in accordance with Article 21 of the GDPR. If an employee objects to the use of their personal data in the tool, the employer must take into consideration the specific circumstances the employee has pointed to in the objection. The employer must still carry out a balancing of interests with respect to the person who has objected, but must show that there are “compelling legitimate grounds” for using the personal data in the tool.
Accordingly, the assessment must be made on an individual basis, taking account of the justification given by the objecting employee. A data controller assessing an objection must, to a greater extent, consider alternatives to individually adapted training and reporting at a statistical level. Secure Practice envisages that those who object will either have their objection processed by the employer, or have it granted through the tool without further manual processing, as it may be challenging for Secure Practice to assess the basis for the objection.
If multiple employees object and do not use the tool, a smaller proportion of the company will naturally be mapped by it. Data controllers must be aware that this may make it easier to identify individual employees in the reporting at a statistical level. One common safeguard, sketched below, is to suppress reporting for groups that fall under a minimum size.
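The following sketch illustrates such a minimum-group-size threshold for statistical reports. The threshold of five, the function name and the data layout are hypothetical choices for illustration, not Secure Practice’s confirmed design.

```python
MIN_GROUP_SIZE = 5  # hypothetical threshold; the appropriate value must be assessed

def group_report(scores_by_group: dict[str, list[float]]) -> dict[str, float | None]:
    """Average security score per group; groups that are too small are suppressed."""
    report = {}
    for group, scores in scores_by_group.items():
        if len(scores) < MIN_GROUP_SIZE:
            report[group] = None  # suppressed: too few employees to report safely
        else:
            report[group] = sum(scores) / len(scores)
    return report

# Example: the "Finance" group is too small to be reported without a
# real risk of identifying individual employees.
print(group_report({
    "Sales": [0.8, 0.6, 0.9, 0.7, 0.85, 0.75],
    "Finance": [0.4, 0.9],
}))
# {'Sales': 0.7666666666666667, 'Finance': None}
```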
Learning
Use of personal data for learning to improve the tool also requires a legal basis. Secure Practice is the sole data controller for this phase. In the same way as for the usage phase, the purposes of improving the service are linked to a legitimate interest. Learning is expected to gradually make the service more accurate as personal data from more employees is entered into the tool. This will likely increase information security in clients’ companies.
Unlike in the usage phase, however, the purposes here are also clearly linked to commercial interests, since an expected increase in quality may lead to increased sales. We would therefore briefly note that commercial interests are also legitimate interests, and accordingly the first condition is met.
As regards the second condition, Secure Practice must take an active position on what personal data is necessary to achieve each individual purpose. We assume that the same types of personal data as in the usage phase are relevant to making the service more accurate. If Secure Practice is to test the tool for possible discrimination, a further assessment of what information is necessary will be required. For this purpose, access to, for example, demographic data such as age and gender may be highly useful. However, we have not examined this assessment in this project.
When balancing consideration for Secure Practice’s interests and employees’ privacy, Secure Practice may emphasise that increased accuracy will benefit both the companies and the employees. If the solution is not updated through learning, the tool may become outdated and not function as intended. Possible privacy disadvantages for the employees may be greater during this phase, as personal data will be used to further develop the tool for new companies. With the legal and technical guarantees discussed under the usage phase in place, it will in our opinion also be possible to use legitimate interest as a legal basis in the learning phase.
In this phase too, employees can object to the processing.
The national prohibition against monitoring in the Email Monitoring Regulation
So far, we have focused on the GDPR. However, it is also relevant to assess the service in relation to the Email Monitoring Regulation and its prohibition against “monitoring employees’ use of electronic equipment, including use of the internet”. In contrast to the GDPR, it takes more than the mere processing of personal data for these special rules to apply.
Another important difference from the GDPR is when any monitoring is lawful. There are two lawful cases: either to “manage the company's computer network” or to “detect or resolve security breaches in the network”.
In other words, an employer can lawfully use the tool as long as the use does not entail monitoring of the employees, or if one of the two cases named above is met. But can the use of the AI tool be seen as monitoring of the employees’ use of electronic equipment?
What counts as “monitoring” is not defined in more detail in the Regulation. The legislative history of similar rules in the previous act highlights that the measure must have a certain duration or take place repeatedly. Monitoring stands in contrast to individual access, which is permitted in several situations. The legislative history also emphasises that it is not solely a question of whether the purpose is to monitor. The employer must also attach weight to whether the employees may perceive the measure as monitoring.
The AI tool Secure Practice is developing combines many methods for mapping the employees, ranging from surveys and quizzes to recording how employees react to simulated phishing exercises and their activity in the learning platform. Each of these methods, viewed in isolation, would not count as monitoring the use of electronic equipment. However, the question is whether the overall mapping is affected by the prohibition against monitoring.
Past decisions by the Norwegian Data Protection Authority are not conclusive as to whether the employer must actually see the data or metadata for it to count as monitoring. The concept of monitoring is broad, and it may be that collection and systematisation are also covered by the prohibition. The fact that the provision is targeted at the employer's monitoring indicates that the employer must at the very least be able to access the data on the employees in order to be subject to the prohibition.
After discussions in the sandbox, there was a consensus that the mapping in the tool would not be affected by the monitoring prohibition. We have specifically emphasised the technical and legal measures that have been implemented to ensure that the employer will not have access to the data that is collected on each employee.
The statistical reports to the employers concerning the level of information security among the company's employees may be more likely to be perceived as monitoring. The reporting must take place at group level. The number of employees in the company and the number of staff in each group will likely affect how the employees perceive the tool. We have assumed that the data is provided in a way that does not enable the employer to identify individual employees. What information the employees receive will also influence how they perceive the service.
The sandbox has concluded that the mapping, which is only communicated to the employees themselves, is not affected by the prohibition against monitoring. As regards measuring the security level of the employees at group level, the employer must consider the design in more detail. We recommend that the employer discuss how the reporting will take place with the employees beforehand.
Since the use of the tool does not count as monitoring in this case, we do not need to assess the rest of the provision.
Which (related) regulations have not been assessed in the sandbox project?
We have not assessed whether the employer’s use of the AI tool falls under the regulations on control measures in the Working Environment Act, Chapter IX. The Norwegian Labour Inspection Authority enforces these regulations, and you can read more on their website.
We have also not examined the rules that apply to access to information stored on a user's terminal equipment (computer, telephone, etc.), as regulated in the Electronic Communications Act.