It’s an exciting time for the drone industry.
Several companies are pushing the boundaries of what is possible. Hardware is becoming ever more sophisticated, reducing weight, improving flight times and bringing down prices.
Last month DJI launched the Mavic Mini, a tiny 249-gram drone with a range of 4km that can shoot 2.7K video and fly for 30 minutes on a single battery. It's a feat of engineering and a measure of how far things have come in the last decade.
October also saw the launch of another industry benchmark: Skydio’s new drone, the Skydio 2. It’s lighter, cheaper and more sophisticated than the original R1 – which is saying something.
Both of these recent releases highlight the progress manufacturers are continuing to make.
They are being assisted by advances in other spheres, too. One example is the work of computer chip specialists like Nvidia, which recently released a new supercomputer designed for AI at the edge.
There have also been regulatory steps forward. Aviation authorities around the world are getting to grips with drone technology and are increasingly open to more advanced operations. A market is developing around autonomous and BVLOS applications, even if the rules aren’t quite there yet.
Difficult questions on the horizon
But there are challenges on the horizon. One recent product launch in the space of autonomous aviation might have slipped under your radar: Anduril’s Interceptor drone. The Interceptor is an autonomous counter-drone drone that can locate, track and bring down targets at the touch of a button.
Unlike DJI and Skydio’s computer vision technology – which aims to avoid collisions – Anduril’s drone attempts to crash into what it recognizes as a rogue drone. Similar technology, vastly different application.
As with Skydio and DJI’s latest releases, Anduril’s drone is a remarkable feat of engineering. But it’s also proof that the potential of drone technology has reached a point where we have to start asking ethical questions.
Is weaponizing drones a good idea? How will we stop this technology from getting into the wrong hands? How can we devise regulations to make sure drones reach their potential as life-saving tools without inadvertently enabling life-taking applications?
In an interview with the FT, representatives for defense start-up Anduril have insisted that the Interceptor is not the first step on the road to autonomous warfare.
“I can’t imagine a scenario when humans are taken out of the loop,” said Matt Grimm, one of Anduril’s co-founders and its chief operating officer, adding that he did not “accept the premise that the natural end is all out autonomous robot conflict.”
Currently, all of the Interceptor’s attacks are initiated and confirmed by a human operator.
Grimm also described autonomous weapons as unattractive for logical, financial and ethical reasons, and said none of Anduril’s customers have asked the company to develop them.
“Everything we’re doing is in coordination with the US government… [and] an extension of Department of Defense policy,” said Brian Schimpf, co-founder and chief executive. “Most of the work we’ve done has been very defensive,” he added. Devising weapons is “not something we’re rushing into.”
That may well be the case. But one man’s defensive tool is another’s offensive weapon. More broadly, the elements required to build autonomous drones capable of causing harm are already present across the industry.
As soon as you start bringing those elements together, the argument goes, the genie is out of the bottle and there’s no going back. The technology could be weaponized in a sophisticated way and deployed either against other aircraft, as with Anduril’s Interceptor, or against people.
All of which means that regulators and manufacturers have a role to play in shaping how this all unfolds.
Skydio: We’re not going to be in the business of building weapons
The aforementioned Skydio is one company whose technology fits into the category of ‘incredible, but you wouldn’t want it to fall into the wrong hands’. The startup’s AI is as sophisticated as anything we’ve seen in the industry, and it doesn’t require much imagination to take it from a super-smart, obstacle-avoiding, object-tracking consumer drone to something more sinister.
Speaking with DroneLife, Skydio CEO Adam Bry explains that there’s a middle ground to be found for companies like Skydio and Anduril.
“We definitely see clear applications and use cases for our technology in the military and defense world. It’s something we’re working on and we’ll have more to say about it in the coming year,” he says.
“Any time you’re doing product development, you need to think about the ethical implications. Even when you’re purely working with consumers, there are safety concerns and privacy concerns. That kind of thinking should always be present when you’re a product company.
“When you get into public safety and defense, the stakes are heightened because [what you’re building], it’s potentially being used in war,” he added.
We have to be realistic: government contracts are big business. No drone manufacturer would or should turn down a lucrative public safety or defense contract, at least not while drones are simply providing situational awareness – a capability already supporting life-saving operations for first responders and public safety teams around the world.
Bry says that ethical concerns down the line are “definitely something that we think about. I don’t think there are any easy answers.”
“The position we’ve taken is that we’re not going to be in the business of building weapons. We think that our products can be useful for providing information and keeping people out of harm’s way. But we’re not interested in weaponizing them in any form.
“There are concerning possibilities… A shorter-term thing for us is making sure that people don’t abuse the Skydio SDK. With a programmable flying robot in the hands of a bad actor, there are all kinds of concerning things that could happen.”
Bry explains that the company is keeping the SDK private for now, partly because there is more interest than it can support, and partly because doing so gives it more control over how the SDK is used.
Drone industry manufacturers will soon have decisions to make
Inevitably, the issues raised by the unveiling of Anduril’s Interceptor will confront the industry’s mainstream manufacturers, particularly as they compete for public safety and defense contracts.
It’s not just about developing physical weapons. AI can also be used as a tool of oppression, for example by assisting surveillance and facial recognition. And concerns over the development and application of AI aren’t exclusive to the drone industry.
In recent times, GitHub employees have quit over the company’s work with Immigration and Customs Enforcement (ICE).
Google has walked away from two lucrative AI contracts with the Pentagon following complaints from staff who felt the company should not be in the business of war.
And last year, 450 Amazon employees reportedly signed a letter urging CEO Jeff Bezos to stop the sale of facial recognition software to law enforcement agencies.
Ultimately, it’s down to regulators to have the foresight to put measures in place that ensure all technology under development is applied for good.
But given that regulators tend to be a step behind the drone industry as it is, a little self-reflection and self-policing might be the more realistic option.