Copilot (NTNU)
Microsoft's artificially intelligent assistant Copilot was rolled out in full force in the winter of 2024, and NTNU wants to test whether this is a tool that can be used by a large public-sector organization.
The project will examine NTNU's robustness, areas of use and data management. The aim is to find out what is required in terms of privacy for NTNU and other public organizations to be able to use tools such as Copilot.
- We have named the project "Piloting Copilot for Microsoft 365". It sounds really boring, but it's actually quite exciting, says Heine Skipenes, specialist group leader for architecture and consultancy in NTNU's IT department.
He points to an important challenge with Microsoft's approach: tools are often enabled by default and require active deactivation on the organization's part.
- It is important to have control over what is happening, says Skipenes, who wants to ensure that NTNU can navigate the AI landscape without compromising either privacy or integrity. "And if we don't have control over the things we can turn on and off, then we probably shouldn't turn this on at all."
See also NTNU's own page about the project.
Copilot for Microsoft 365 is an AI assistant integrated into Microsoft's flagship suite of productivity tools. In Word, Excel, PowerPoint and Teams, a small chatbot will, uninvited, make a number of suggestions and offers, based on full insight into what you do in the programs. For example, it might suggest minutes for a meeting you have just finished in Teams.
- One concern is that the tools will become so easily available that it gets a little too easy to let them do the work for you. We have had some horror examples, without going into detail, of AI tools being used to assess students' work. You look quickly through the answer, and then have the chatbot explain why it deserves the grade C. It can be useful to use, for example, ChatGPT to check whether your own reasoning holds. But if you use it as the sole basis for an assessment, you are on thin ice, says Skipenes.
And Copilot can therefore quickly feel like a standing invitation to put on skates and head out onto a frozen lake.
Copilot was launched for the corporate market in November and December. The early experiences Skipenes has heard about suggest that it requires a great deal of preparatory work.
- A lot of people say "Oh, you have to do an awful lot on the back end before you can let the users in."
What does it mean for this project to be explored in the sandbox?
- For our part, it's largely about getting an outside view of things, a broader perspective.
He points to what he sees as a huge challenge in the IT world: things move too fast. Assessments are made a little too quickly in order to get a tool in place. Ethical assessments, for example, then become a checklist to tick off, without necessarily having any real impact on the project.
- We see that the suppliers place extremely heavy demands on users. For example, agreements and guidelines you must accept, knowing there are 6,000 pages you should have read through. Just like that, you have been handed all the responsibility for making sure things are done properly. Here we have an intermediary role, says Skipenes, arguing that NTNU, or any employer, must take responsibility for ensuring that you, the user, understand what you are getting into when they offer you a tool.
- And let's put it this way: there is a certain spread in expertise among our users. After all, we have 50,000 people with us, ranging from one extreme to the other in terms of computer skills and how well organized their files are. So what I'm most curious about is what we find once we get inside Copilot. What does it look like in there? What are we actually dealing with?