Portrait of Tine Molendijk

Even with AI-controlled weapons and unmanned drones, warfare remains human work

AI-controlled weapons, unmanned drones and other advanced military technologies make warfare cleaner and more precise, and reduce the risks to military personnel. But that is only one side of the story, emphasises Tine Molendijk, professor of multilateral military operations at Nijmegen School of Management (Radboud University).

Tine Molendijk (1987) studied cultural anthropology at the University of Amsterdam and became interested in the armed forces during that time, partly through friendships with military personnel. The armed forces are an organisation with a monopoly on the use of force. The decisions made by its personnel can have fatal consequences. What does it mean to be human in a military operation? As an associate professor at the Faculty of Military Sciences of the Netherlands Defence Academy (NLDA), she tries to find answers to that question. Since the end of 2025, she has also been professor by special appointment of Moral Dilemmas of Multilateral Military Operations at the Department of Political Science.

For many people, the armed forces are a black box, she explains. As a result, people quickly resort to caricatures: soldiers are heroes, villains or pitiful victims. Or to simple dichotomies: AI and technology make the work cleaner, risk-free and more precise, says one camp. Autonomous weapon systems are 'killer robots' and 'killer drones', according to the other. It's not that simple, according to Molendijk. 'I see it as one of my tasks to open that black box and bring nuance to the debate.'

Don't drones turn war into a kind of video game?

'That idea does indeed prevail. People have an image of so-called first-person-view (FPV) drones, where someone wearing VR goggles controls a drone at a small distance from the war zone. At first glance, that seems cold and distant. But at the same time, it is actually intimate and close. In an average firefight in Afghanistan, all you saw were flames in the mountains, not exactly who you were killing. But with these goggles, you can follow your target until you are practically standing next to them, kill them and then see the impact of your action up close. That is a completely different experience of warfare. That is why it is so important to investigate how the operators who work with this technology experience it.'

In her thesis (2019), Molendijk analysed how soldiers returning from missions often struggle with 'moral injuries' – feelings of guilt, shame and anger – due to morally difficult decisions made during their mission.


As a drone operator, are you more likely to become 'morally wounded'?

'We don't know yet. Initial exploratory studies show a mixed picture. If the drone hits the target very precisely and there are no innocent casualties, military personnel may find it easier to stand behind their actions. But that is certainly not always the case. We do see that the percentage of PTSD complaints among long-range drone operators is lower than among military personnel who have taken part in ground combat, because drone operators themselves are not in immediate danger. On the other hand, they suffer from trauma complaints at least as often as "regular" fighter pilots. In addition, all drones have a humanising effect: you can see very clearly that you are killing a human being. And the classic justification that military personnel use for themselves – "it was him or me" – no longer applies. This shows that the frame "technology makes modern warfare cleaner" is far too simplistic, just like the frame "technology makes modern warfare more inhumane".'

AI systems are also being used more and more, for example to identify targets.

'That is not without risks either. Take the American immigration service ICE, for example. They use the super app Elite, which creates a map of potential deportees based on "reliability scores". Israel uses "Lavender" to generate lists of potential Hamas militants.'

'But technology is man-made. In AI, the importance of human control – the so-called human in the loop – is often emphasised. But in Israel, after 7 October, there was enormous pressure to generate a large number of targets. According to military personnel, in practice they sometimes had only twenty seconds to check whether the target was the right one. In that case, there is no meaningful human control and the technology suddenly becomes extremely risky.'

Molendijk advocates an end to simplistic black-and-white discussions. The armed forces are an organisation of violence. War is never 'clean' and risk-free.

On Thursday, 7 May 2026, at precisely 4 p.m., Tine Molendijk will deliver her inaugural lecture, entitled: 'Mensenwerk in een geweldsorganisatie: Morele dilemma's van militaire operaties'. Watch the live stream or register for the lecture here.

Text: Inge Mutsaers
