A national Dutch newspaper recently published a story about drone experiments at the ‘Twente Safety Campus’, where units of the local fire brigade are using a testing area to ‘train’ their drones in controlled forest fire and traffic accident simulations.
Ultimately, the goal is to create a regional network of drone coverage for traffic observation and accident analysis, supporting emergency assistance services as well as law enforcement scenarios such as fugitive tracking and the detection of drug labs by ‘sniffing drones’ that scan the air for tell-tale substances.
Even within a purely Dutch context, the use of drones in emergency situations and crime-fighting scenarios is nothing new. On a national level, there is extensive experience with drones in the Dutch police force. Fire departments in the heavily wooded eastern provinces of the country have been using camera-equipped and infrared-enabled drones ever since 2015 for situational monitoring and early warning purposes.
Drones, after all, are true multi-talents, suitable for a wide variety of applications. But apart from their operational flexibility, these Dutch drones have one thing in common: they are all manufactured by Da Jiang Innovations (DJI), a Chinese tech company that has fallen into some disrepute following a recent study by the investigative journalism platform Investico. That study revealed the dubious quality of the cybersecurity of DJI’s drones and the presence of ‘back doors’ in their software, allowing data to leak away and end up stored on servers in China.
In this blog, we will look at the conditions for and cyber-risks of the deployment of drones by emergency assistance and law enforcement organisations.
The use of drones by law enforcement agencies is allowed on the condition of proportionality and due diligence. The big difference between deploying drones and assigning police officers is that drones can, in a very short period of time, collect significantly more data, including information that is irrelevant to the actual investigation. Suppose a camera-equipped drone is used to locate a suspect on the run: most likely, that camera will also record images of random individuals in city streets, or in more privacy-sensitive places such as their backyards. From a privacy-law, data minimisation perspective, images showing non-suspects may then not be stored unless the faces have been ‘blurred’.
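As an illustration of what such data minimisation could look like in practice, here is a minimal, hypothetical sketch: a function that pixelates rectangular regions of an image (say, faces of non-suspects flagged by a detector) before the image is stored. The image format, region coordinates and block size are assumptions chosen purely for demonstration; a real deployment would use a proper computer vision library for face detection and image handling.

```python
def pixelate_regions(image, regions, block=4):
    """Pixelate rectangular regions of a grayscale image (list of rows)
    so individuals are no longer identifiable before storage.

    regions: list of (top, left, height, width) tuples -- hypothetical
    bounding boxes, e.g. produced by a face detector.
    """
    out = [row[:] for row in image]  # work on a copy, leave the input intact
    for top, left, h, w in regions:
        for by in range(top, top + h, block):
            for bx in range(left, left + w, block):
                # average the pixel values inside this block...
                ys = range(by, min(by + block, top + h))
                xs = range(bx, min(bx + block, left + w))
                vals = [out[y][x] for y in ys for x in xs]
                avg = sum(vals) // len(vals)
                # ...and overwrite the whole block with that average
                for y in ys:
                    for x in xs:
                        out[y][x] = avg
    return out

# Example: pixelate the top-left 2x2 region of a 4x4 image
img = [[10, 20, 30, 40],
       [50, 60, 70, 80],
       [90, 100, 110, 120],
       [130, 140, 150, 160]]
blurred = pixelate_regions(img, [(0, 0, 2, 2)], block=2)
```

The point of the sketch is the order of operations: identifying information is destroyed in the marked regions before anything is written to storage, which is the step that data minimisation demands.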
Where heat-detecting drones are used, in the search for pot farms for instance, no formal instructions from the public prosecutor are needed. In a ruling of 20 January 2009, the Dutch Supreme Court (Hoge Raad) found Article 2 of the 1993 Police Act – since replaced by Article 3 of the 2012 version – to provide a sufficient legal basis for scanning rooftops with thermal imaging equipment, given the limited infringement on personal privacy. And since using odour sensors to detect drug labs is not significantly different in nature from heat detection, the former technique is most likely legitimate as well. Only when police forces move to the next step of systematic observation, by daily drone passes over specific city districts for example, does the legal definition of special investigative powers apply, along with the requirement of explicit authorisation.
As far as fire departments go, they have no powers of investigation and as such are only allowed to use drones as tools to support their assigned core activities, on condition of complying with GDPR requirements in terms, for instance, of the six principles of personal data processing: i) lawfulness, fairness and transparency, ii) purpose limitation, iii) data minimisation, iv) accuracy, v) limitation of storage and vi) integrity and confidentiality.
All in all, there are sufficient grounds for legitimate use of drones by emergency assistance and law enforcement services.
The aforementioned Investico study, however, does suggest that there are serious issues with the level of drone cybersecurity. Practically every single drone currently in use by Dutch public-sector organisations has been supplied by the Chinese tech firm DJI, which numerous international investigations have found very likely to equip its drones with control apps designed to channel large volumes of personal data to servers in China, while also giving hackers live access to drone camera images.
Meanwhile, DJI maintains that its drones are safe and that users are under no obligation to share data. Moreover, the company claims that DJI drones supplied to governments and public organisations come with extra security. For what it is worth: in many instances in the past, the words ‘Made in China’ have not exactly proven a guarantee of security and confidentiality, in spite of assurances to the contrary. More often than not, software and devices alike turned out to contain accommodating little back doors, giving Chinese authorities and cybercriminals easy access to sensitive information and personal data. One example that comes to mind is the Huawei system used by KPN for customer data processing.
In the case of the Dutch drone experiments, the regional fire department involved also claims total security, stating that camera images are not stored and that the drones in use have no internet connection. That would indeed, in my opinion, make the operation good to go in terms of security measures, as there is no danger of the camera feed being intercepted over the internet.
If one thing is certain, it is that crime prevention and emergency assistance services need all the help they can get, whether provided by additional operatives or by drones and technology in general. The end, in this context, seems to justify the means. But although the Dutch drone-assisted firefighters appear to have implemented an appropriate level of cybersecurity, it remains essential, in the long-term picture and from a human rights perspective, to consider the option of European self-reliance in software and technology development, in accordance with European privacy requirements and without the danger of built-in back doors leading to not particularly privacy-minded countries.