Straying from the ethical path
Before we attempt to concretise the implications of the ethical frameworks for the PrevBOT project, we will return to the fundamental question: Is it actually right for the Norwegian Police University College (or other institutions associated with the police authorities) to conduct research on new technology when it is fairly certain even beforehand that it will have adverse effects, but the overall scope of such effects is difficult to estimate? Or could it possibly be the first step toward straying from the ethical path?
To assess this, the sandbox project did what we might call a first-step analysis, inspired by the ‘just war’ tradition (Bellaby, 2016; Diderichsen, 2011; Kleinig, 2009; Syse, 2003) – a line of thinking that is currently central to the ethics of intelligence work, and indeed to the use of force in general.
We did not aim to conduct a complete ethical analysis, but the first-step analysis allows us to shed light on key issues that will hopefully provide some guidelines for the PrevBOT research in charting the course ahead. We also hope the specific examples of ethical discussion in this report will be useful to others.
Slave after the first step
The philosopher Hans Jonas, known for his work on the ethical implications of modern technology and science, described how we are free to take the first step, but slaves to the steps that follow (Jonas, 1983). Although we have the freedom to initiate actions, the ensuing consequences of these actions bind us, limiting our future freedom. This underlines the importance of responsible decision-making, particularly in light of irreversible technological interventions in nature and human lives.
For PrevBOT, the distance from the idea and needs stage, via the prototype, design and development phases in the process outlined above, will be relatively short. This is because the functions the PHS wants to include in the bot have generally been demonstrated in other research. The PrevBOT project is therefore about getting the different parts to work together as a whole. To find out if it works, it must be tested, initially within a secure setting such as simulation. Another step has been added, however, as shown in the illustration above. Once the bot is developed and ready for testing in the intended environment, it can be difficult not to use it – in one way or another – if society or individual cases ‘demand’ it. If not by the Norwegian police, then by a commercial or other actor.
It is also easy to envisage another potential ‘demand’: a requirement that PrevBOT also store data about the individuals who are flagged, so that they can be prosecuted. If so, is it ethically right for the police not to investigate when they are handed potential evidence and abusers on a plate? Perhaps not. But a PrevBOT that can be used for investigative purposes is probably more intrusive than – and quite different from – a preventive bot. Flagging would then have greater consequences for the individuals, as the material would probably have to be stored for longer and shared with other parts of the police and prosecution authorities. It may therefore be wise to design the bot in such a way that it does not put the police in this ethical dilemma at a later date.
Research is research. Each step of the development process could present both known and unknown opportunities, and known and unknown consequences. The first step could thus lead us to stray off the path, wandering more or less consciously until we end up somewhere we did not initially want to be. That is not to say that one should never take the first step. But it is important to be aware – already at the idea and needs stage – of the potential consequences of a final product.
‘Just war’
In considering whether to take the first step in the PrevBOT research project, ethics professor Jens Erik Paulsen of the PHS drew inspiration from the ‘just war’ tradition and highlighted seven elements that are relevant to examine:
- Legitimate authority
- Just cause
- Right intention
- Proportionality
- Probability of success
- Last resort
- Consideration for innocent third parties