Newswise — A recent report by the Center for a New American Security warned of dangers inherent in the widespread use of autonomous weapons in real-world environments. Sarah Kreps, professor of government at Cornell University and an expert on drones, says we should not be lulled into thinking that technology can make decisions in combat easier.

Bio: http://government.arts.cornell.edu/faculty/kreps/

“The risks and limits of autonomy come in two forms. The first is that you can't put subjective decisions about who is a combatant or a civilian into an algorithm. This has implications for targeting decisions. A human, or rather many humans, should be in the loop to analyze individuals' behaviors and determine whether they are directly and actively involved in combat. Enemy status is often a subjective judgment and something that cannot easily be programmed into an autonomous weapon. We should not be lulled into thinking that technology can make these decisions easier.

“Second, autonomy implies full reliance on computerized systems. There are benign cases of interruption, like a computer bug, but also less benign cases, like hacking. Autonomy introduces big cybersecurity risks. If groups can hack into the Pentagon's system of security clearances, they can almost certainly hack into the system that controls autonomous weapons, in which case the potential for disaster is almost limitless.”

Cornell University has television, ISDN and dedicated Skype/Google+ Hangout studios available for media interviews.