Illustration: an office worker surprised by a genie in her laptop (made with help from AI).

NTNU, exit report: Copilot through the lens of data protection

Generative artificial intelligence (AI) is no longer just a fun tool on the side; it is now being integrated into the digital solutions we already use. Microsoft launched Copilot for its Microsoft 365 office suite in November 2023, offering the potential to significantly simplify working life. Some people have started using it to varying degrees, while others are still sitting on the fence. But what actually happens when you turn on Microsoft 365 Copilot?

Summary

The Norwegian Data Protection Authority and the Norwegian University of Science and Technology (NTNU) have looked at what data protection requirements apply and what assessments NTNU should carry out before using Microsoft’s AI assistant. NTNU conducted a parallel pilot project to examine whether it is ready to introduce M365 Copilot, as well as to propose a framework for management, operation, maintenance and development. NTNU has published its own findings report. It provides an overview of how M365 Copilot functions and a great deal of insight for others who are considering turning on Copilot.

Read NTNU’s own findings report (in Norwegian only)

The Norwegian Data Protection Authority supports NTNU’s report but recommends a more specific approach when conducting a data protection impact assessment (DPIA). Each organisation must carry out its own DPIAs based on what data it holds and what tasks it wishes to use M365 Copilot for.

M365 Copilot is an active component that retrieves and recreates information in new and unfamiliar ways. One challenge is that this new technology’s ability to formulate language well – including in Norwegian – can make it seem human, as though it were capable of making assessments and reasoning logically.

It is also important to emphasise that this is pioneering work. To the best of our knowledge, no other supervisory authority has looked at the use of M365 Copilot in relation to data protection regulations. This report should be seen as a first step in understanding and assessing whether such tools can be adopted in a cautious, step-by-step manner that is compliant with data protection regulations.

Main points

  1. M365 Copilot requires that the organisation’s data are already stored in Microsoft’s cloud solution.

    M365 Copilot sits on top of Microsoft’s M365 cloud solution. A prerequisite for introducing M365 Copilot is therefore that you have already conducted all necessary security and data protection assessments of the M365 platform itself. You also need the resources and expertise to manage the service provider and the cloud solution responsibly over time, particularly because the supplier makes frequent changes. Responsibility for the data used in Copilot rests with the organisations that use the tool.

    The Norwegian Agency for Public and Financial Management (DFØ) has created a guide for public enterprises on the procurement of cloud services (in Norwegian only), which may be of help. See also section 4 of NTNU’s findings report for further information and recommendations from NTNU on the management of the system.
  2. Get your own house in order.

    Copilot will have access to the same information as the user of the tool. That means that challenges and weaknesses in the ‘digital foundation’, such as poor access management and poor control of personal data, will be made visible and significantly amplified by M365 Copilot. It is important to emphasise that Microsoft, as a service provider, largely assumes that ‘everything is in order’ in the management of the underlying M365 platform if Copilot is to be used responsibly and lawfully. It is therefore important to get your own house in order first, and any project to introduce Copilot will probably require thorough (re)assessments of your own information management. This requires a certain amount of effort and resources but is a critical and necessary step when introducing new technology.

    The Norwegian Digitalisation Agency has prepared a guide to information management (in Norwegian only), and information about what a record of processing activities must contain can be found on the Norwegian Data Protection Authority’s website (in Norwegian only).

    NTNU has concluded that it is not yet ready to introduce M365 Copilot throughout the organisation, partly because it considers that its own house is not yet in order.

  3. Identify and limit what M365 Copilot will be used for.

    Consider which tasks and associated processing of personal data M365 Copilot should and should not be used for. Some tasks are poorly suited to generative AI, for example when it is important that the responses are correct and the user does not have the time or expertise to check what is generated. In addition, using M365 Copilot in areas such as HR and personnel management poses a particularly high risk to data protection. This is because access to personal data is difficult to manage and control, and because the consequences for individuals can be very severe. Tasks involving special category (sensitive) personal data should also be carefully assessed or avoided in connection with the use of M365 Copilot.

    Map and describe the processing operations that will occur if M365 Copilot is used for a specific purpose, i.e. from the moment a prompt is given to Copilot until it provides an answer. The record of processing activities is an appropriate place to start, as it lets you review and assess each processing operation per purpose. That will provide a good starting point for assessing which tasks you can and would like to use Copilot for (a simple, purely illustrative sketch of such a record follows the list of main points below).

    If M365 Copilot has access to information that contains (sensitive) personal data, the information must be classified, identified and labelled, at least at the document level. We emphasise that Microsoft acknowledges that this is necessary in order for M365 Copilot to be used responsibly.
  4. Assess the legal basis.

    When tasks and associated processing of personal data are considered “M365 Copilot candidates”, the legal basis for processing must be checked. For existing processing activities, you must assess whether using M365 Copilot would result in any changes to the processing, such as which or whose personal data are being processed. If there are changes, you must assess whether the existing legal basis for processing could still be used, including whether the processing remains ‘necessary’. If this is not the case, M365 Copilot cannot be used for this processing.

    Processing personal data for new purposes requires the identification of an appropriate legal basis for processing. In cases involving the re-use of personal data for new purposes, as will often be the case, you must assess whether the new processing is compatible with the original purpose.
  5. Assess the impact on data protection.

    As a general rule, a data protection impact assessment (DPIA) will be required when using generative AI that processes personal data. That is because the law emphasises ‘using new technologies’ as a particularly important factor, and because the understanding of risks associated with generative AI is still immature. A DPIA must be carried out for each processing operation or set of similar processing operations. Tasks that do not in themselves require the processing of personal data may nonetheless do so when using M365 Copilot, because Copilot uses all of the information accessible to the user and can thus link it to personal data.

    The DPIA process should identify technical and organisational measures that can reduce the risk to an acceptable level, and these must be in place before M365 Copilot is put into use. Testing can be used as a measure to minimise risk. If the risk is too high even after measures are implemented, it is probably best not to use M365 Copilot for the processing in question. Alternatively, contact the Norwegian Data Protection Authority for a prior consultation.
  6. Will using it violate the Norwegian e-mail regulation?

    M365 Copilot logs all user interactions. The log is stored in the user’s own digital area and, in NTNU’s case, is accessible to the M365 administrators. Overall, we consider it likely that this log of user interactions could fall under the prohibition against monitoring employees’ use of electronic equipment. However, we understand that the primary purpose of such a log is to ensure that the quality of the service is as it should be. That purpose may fall under the prohibition’s exemption that applies to managing the organisation’s computer network. Whether the other exemption, to ‘detect or resolve security breaches in the network’, may be applicable must be specifically assessed in relation to the purpose of such a log.
  7. The use of large language models requires expertise and awareness.

    Large language models provide a new user experience for many, with both opportunities and limitations that remain unclear. It can be difficult to understand what information forms the basis of the statements they generate. Expertise is required to formulate prompts that produce relevant and good answers. It is the responsibility of the organisation deploying M365 Copilot to ensure that users of the solution have sufficient knowledge, awareness and training in its use. This competence helps ensure not only that what is generated is of high quality, but also that the solution is used in a way that safeguards data protection.
  8. Consider alternative solutions.

    M365 Copilot can be used for a great many things, so it is no small task to ensure that the system is used in a responsible and lawful manner. Some of Copilot’s features may challenge the principles of purpose limitation and data minimisation. Measures that could in theory reduce risks and consequences may in practice be very difficult to implement. It is therefore important to consider whether other AI solutions that present lower risks to data protection are able to meet your specific needs. These might include solutions that transcribe audio recordings, customised chatbots or support tools tailored to specific purposes and limited to carefully selected and quality-assured internal information sources.
  9. Introduce Copilot in small and controlled steps.

    It is possible for Norwegian organisations to start using M365 Copilot, but not for everyone and not for everything. We strongly recommend that these types of solutions are introduced in small and controlled steps, starting with selected roles in the organisation that carry out suitable processing operations. Structured plans must also be created for post-implementation monitoring and for following up the quality of what the solution produces, through both organisational and technical measures.
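
To make main point 3 a little more concrete for technically minded readers, below is a minimal, purely illustrative sketch in Python of how entries in a record of processing activities could be extended with Copilot-specific assessment fields. The field names, the enumeration of legal bases and the screening rule are hypothetical examples for illustration only, not a template from NTNU, Microsoft or the Norwegian Data Protection Authority.

```python
from dataclasses import dataclass
from enum import Enum


class LegalBasis(Enum):
    """GDPR Article 6(1) legal bases (abbreviated labels)."""
    CONSENT = "consent"
    CONTRACT = "contract"
    LEGAL_OBLIGATION = "legal obligation"
    VITAL_INTERESTS = "vital interests"
    PUBLIC_TASK = "public task"
    LEGITIMATE_INTERESTS = "legitimate interests"


@dataclass
class ProcessingActivity:
    """One entry in a hypothetical record of processing activities,
    extended with fields for assessing whether M365 Copilot may be used."""
    purpose: str
    legal_basis: LegalBasis
    data_subject_categories: list[str]
    personal_data_categories: list[str]
    special_categories: bool         # does the activity involve sensitive data?
    dpia_completed: bool             # has a DPIA been carried out for this purpose?
    copilot_permitted: bool = False  # final outcome of the organisation's assessment

    def copilot_candidate(self) -> bool:
        """Crude screening rule: only activities without special category data
        and with a completed DPIA are even candidates for Copilot use."""
        return self.dpia_completed and not self.special_categories


# Example: screening two activities before deciding where Copilot may be used.
activities = [
    ProcessingActivity(
        purpose="Summarising minutes from internal project meetings",
        legal_basis=LegalBasis.LEGITIMATE_INTERESTS,
        data_subject_categories=["employees"],
        personal_data_categories=["names", "work email addresses"],
        special_categories=False,
        dpia_completed=True,
    ),
    ProcessingActivity(
        purpose="Drafting replies in sick-leave follow-up (HR)",
        legal_basis=LegalBasis.LEGAL_OBLIGATION,
        data_subject_categories=["employees"],
        personal_data_categories=["health-related information"],
        special_categories=True,
        dpia_completed=False,
    ),
]

for activity in activities:
    print(f"{activity.purpose}: Copilot candidate = {activity.copilot_candidate()}")
```

In practice such a record is usually maintained as a register or spreadsheet by the organisation’s privacy function, and the decision for each purpose must rest on the full assessments described in main points 3 to 5, not on a single screening rule.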

The road ahead

NTNU has done an impressive, socially beneficial and extensive job of acquiring knowledge and awareness of the use of large language models in general, and of integrated AI solutions such as M365 Copilot in particular. The university has chosen not to introduce M365 Copilot throughout the entire organisation, but instead to introduce the tool in small and controlled steps, limited initially to selected roles.

M365 Copilot is still in the early stages of development and does not provide control at a granular level, such as the ability to make local and flexible adaptations (e.g. disabling access to users’ mailboxes or setting specific deletion policies). Microsoft probably considers unlimited access to the user’s mailbox to be an important and central feature, although it is perhaps one of the features that create the most uncertainty for many organisations.

The Norwegian Data Protection Authority expects that the issues customers, organisations, authorities and wider society identify in the product will be taken seriously by the product supplier. At the same time, there are clear requirements for organisations that wish to benefit from using the tool. The prerequisite of having an extremely well-functioning information management system may make it difficult to succeed with such solutions, but meeting it obviously has a positive upside that goes far beyond the implementation of one specific solution.

Note:

An assessment of cloud services in general, the transfer of personal data to third countries and Microsoft’s role(s) under the GDPR was outside the scope of this project. We would however like to mention that the European Data Protection Supervisor (EDPS) recently made a decision that partly encompasses Microsoft’s role in the provision of cloud services to a number of EU bodies. The decision has been appealed by both Microsoft and the European Commission. The outcome of the case may have an impact on how the use of cloud solutions must be organised in the future to be in line with the GDPR.