Active and former military among those most opposed to autonomous weaponry
Newswise — AMHERST, Mass. – The results of a new survey by the University of Massachusetts Amherst show that a majority of Americans across the political spectrum oppose the outsourcing of lethal military and defense targeting decisions to machines. Opposition to autonomous weaponry is bipartisan and is strongest on the far left and far right, and among active and former members of the military.
A random sample of 1,000 Americans was asked how they felt about military technology that could take humans out of the loop altogether, dubbed “killer robots” by the Campaign to Stop Killer Robots, an international coalition of non-governmental organizations launched in April that is working to ban fully autonomous weapons. The survey was posted today at the website Duck of Minerva, an international affairs blog.
Overall, 55 percent of the survey’s respondents said that they oppose the development of autonomous weapons, including 39 percent who were “strongly opposed.” Of the remainder, nearly 20 percent were “not sure,” but the study found that people without a strong opinion tended to favor a precautionary approach to the emerging technology. The findings were consistent across all ages, regions, education and income levels, and both genders, but those with higher levels of education and those most likely to follow the news were more opposed.
The survey was overseen by Charli Carpenter, associate professor of political science at UMass Amherst and a specialist in human security and global advocacy movements. Carpenter, who has studied the ethical debate around autonomous weapons since 2007, says the survey’s findings support the claims of advocates for a pre-emptive ban.
“While much of the recent public debate has focused on remote-controlled military drones, there has been less research on what people think about fully-autonomous weapons,” Carpenter said. “This question matters in terms of the international law on new weapons, because an important treaty clause states that ‘the public conscience’ should serve to guide policy decisions in the absence of clear rules. These findings would suggest that people across the board do tend to feel very concerned about the development of these forms of weapons.”
Carpenter also collected open-ended answers on why people liked or disliked the idea of autonomous weapon systems. While she is continuing to analyze the data, the preliminary results show concerns over potential malfunctions, the absence of a moral conscience in machines, the ability of such systems to distinguish civilians from combatants, the loss of human control over machines with the power to kill, and the possibility that dictators could use them to violate human rights more efficiently.
The minority of respondents who favored the weapons generally cited a desire to “protect the troops.” However, active-duty military personnel showed the highest rate of strong disapproval of autonomous weapon systems among all groups polled, at 65 percent, and a 73 percent rate of overall disapproval. Military veterans and those with family members in the military also opposed the weapons: 50 percent of veterans were “strongly opposed” and 63 percent opposed them overall, while military family members registered a similar 62 percent rate of overall opposition.
Citing critics who have accused organizations like Human Rights Watch of using the term “killer robots” as a scare tactic to heighten public concern over autonomous systems, Carpenter randomized the survey’s wording between “killer robots” and “fully autonomous weapons.” She said the results showed that public opposition to autonomous weapon systems is constant regardless of the terms used.
“We found no significant difference in public sentiment depending on whether we used language like ‘stopping killer robots’ versus ‘banning fully autonomous weapons,’” Carpenter said. “People are scared by the idea of removing humans from the loop, not simply scared of the label.”
This online survey of 1,000 Americans aged 18 or older was conducted May 10-12, 2013 by YouGov America (www.YouGov.com) under the direction of the University of Massachusetts Amherst. The margin of error for the poll is 3.6 percent.
The report of the survey’s results can be found HERE. Toplines and crosstabs for the poll can be found HERE.