Slaughterbots is a 2017 arms-control advocacy video presenting a dramatized near-future scenario in which swarms of inexpensive microdrones use artificial intelligence and facial recognition software to assassinate political opponents based on preprogrammed criteria. It was released by the Future of Life Institute and Stuart Russell, a professor of computer science at the University of California, Berkeley.[1] The video quickly went viral on YouTube, garnering over two million views,[2][3] and was screened at the United Nations Convention on Certain Conventional Weapons meeting in Geneva the same month.[4]
Slaughterbots | |
---|---|
Directed by | Stewart Sugg |
Written by | Matt Wood |
Produced by | Matt Nelson |
Narrated by | Stuart Russell |
Production company | Space Digital |
Release date | |
Running time | 8 minutes |
Language | English |
The film's implication that swarms of such "slaughterbots" — miniature, flying lethal autonomous weapons — could become real weapons of mass destruction in the near future proved controversial.[2][5][6]
A sequel, Slaughterbots – if human: kill() (2021), presented additional hypothetical scenarios of attacks on civilians, and again called on the UN to ban autonomous weapons that target people.[7]
The dramatization, seven minutes in length, is set in a Black Mirror-style near future.[8][9] Small, palm-sized autonomous drones using facial recognition and shaped explosives can be programmed to seek out and eliminate known individuals or classes of individuals (such as individuals wearing an enemy military uniform). A tech executive pitches that nuclear weapons are now "obsolete": a $25 million order of "unstoppable" drones can kill half a city. As the video unfolds, the technology is repurposed by unknown parties to assassinate political opponents, from sitting congressmen to student activists identified via their Facebook profiles. In one scene, the swarming drones coordinate with each other to gain entrance to a building: a larger drone blasts a hole in a wall to give access to smaller ones.[1][10][11]
The dramatization is followed by a forty-second entreaty by Russell: "This short film is more than just speculation; it shows the results of integrating and miniaturizing technologies that we already have ... AI's potential to benefit humanity is enormous, even in defense, but allowing machines to choose to kill humans will be devastating to our security and freedom."[10][12]
According to Russell, "What we were trying to show was the property of autonomous weapons to turn into weapons of mass destruction automatically because you can launch as many as you want... and so we thought a video would make it very clear." Russell also expressed a desire to displace the unrealistic and unhelpful Hollywood Terminator conception of autonomous weapons with something more realistic.[13] The video was produced by Space Digital at MediaCityUK and directed by Stewart Sugg, with location shots at the University of Hertfordshire[14] and in Edinburgh. Edinburgh was chosen because the filmmakers "needed streets that would be empty on a Sunday morning" for the shots of armed police patrolling deserted streets, and because the location is recognizable to international audiences.[15] All of the drones were added in post-production.[13][16]
In December 2017, The Economist assessed the feasibility of Slaughterbots in relation to the U.S. MAST and DCIST microdrone programs. At the time, MAST had a cyclocopter weighing less than 30 grams, though it had the downside of being easily disturbed by its own reflected turbulence when flying too close to a wall. Another candidate is something like Salto, a 98-gram hopping robot, which performs better than cyclocopters in confined spaces. The level of autonomous inter-drone coordination shown in Slaughterbots was not available as of 2017, but The Economist noted that this was beginning to change, with drone swarms already being used for aerial displays. Overall, The Economist agreed that "slaughterbots" may become feasible in the foreseeable future: "In 2008, a spy drone that you could hold in the palm of your hand was an idea from science fiction. Such drones are now commonplace ... When DCIST wraps up in 2022, the idea of Slaughterbots may seem a lot less fictional than it does now." The Economist was skeptical that arms control could prevent such a militarization of drone swarms: "As someone said of nuclear weapons after the first one was detonated, the only secret worth keeping is now out: the damn things work".[1]
In April 2018 the governmental Swiss Drones and Robotics Centre, referencing Slaughterbots, tested a 3-gram shaped charge on a head model and concluded that "injuries are so severe that the chances of survival are very small".[17][18]
As of 2020[update], DARPA was actively working on pre-operational prototypes that would make swarms of autonomous lethal drones available to the US military.[19]
In December 2017, Paul Scharre of the Center for a New American Security disputed the feasibility of the video's scenario, stating that "Every military technology has a countermeasure, and countermeasures against small drones aren't even hypothetical. The U.S. government is actively working on ways to shoot down, jam, fry, hack, ensnare, or otherwise defeat small drones. The microdrones in the video could be defeated by something as simple as chicken wire. The video shows heavier-payload drones blasting holes through walls so that other drones can get inside, but the solution is simply layered defenses." Scharre also stated that Russell's implied proposal, a legally binding treaty banning autonomous weapons, "won't solve the real problems humanity faces as autonomy advances in weapons. A ban won't stop terrorists from fashioning crude DIY robotic weapons ... In fact, it's not even clear whether a ban would prohibit the weapons shown in the video, which are actually fairly discriminate."[2]
In January 2018, Stuart Russell and three other authors responded to Scharre in detail. Their disagreement centered primarily on the question of whether "slaughterbots", as presented in the video, were "potentially scalable weapons of mass destruction (WMDs)". They concluded that "We, and many other experts, continue to find plausible the view that autonomous weapons can become scalable weapons of mass destruction. Scharre's claim that a ban will be ineffective or counterproductive is inconsistent with the historical record. Finally, the idea that human security will be enhanced by an unregulated arms race in autonomous weapons is, at best, wishful thinking."[5]
Matt McFarland of CNN opined that "Perhaps the most nightmarish, dystopian film of 2017 didn't come from Hollywood". McFarland also stated that the debate over banning killer robots had taken a "sensationalistic" turn: In 2015, "they relied on open letters and petitions with academic language", and used dry language like "armed quadcopters". Now, in 2017, "they are warning of 'slaughterbots'".[20]
Andrew Yang linked to Slaughterbots from a tweet during his 2020 U.S. Presidential primary candidacy.[21]
The sequel video, published 30 November 2021, had over two million views on YouTube by 8 December.[22]