The independent newspaper of the University of Iowa community since 1868

The Daily Iowan

Sonn: Rise of the machines

This was a landmark week for killer robots across the planet. Tuesday’s meeting at the United Nations in Geneva was the first time human government officials gathered to focus on an increasingly probable future in which nations use lethal autonomous robots for warfare and law enforcement. What are “lethal autonomous robots,” you ask?

Those types of robots would have the capability — in an ideal world — to use deliberate reasoning to assess possible targets and use appropriate force. Basically, it would be like EVE from WALL-E, or even Samantha from Her, if she were more violent and had several machine guns. According to a report in Foreign Affairs, these “security surveillance guard robots … can detect targets through infrared sensors” and “have an automatic feature that can detect body heat in the demilitarized zone and fire with an onboard machine gun without the need for human operators.” Terminators, basically. Unfortunately, the ideal world and the real world can be very different. Opponents of lethal autonomous robots have a very good reason to be unsettled by the notion of thousands of robots being trusted to take human lives in an “appropriate” way, whatever that means.

For me, the issue of programming sticks out like a robot from the future inexplicably speaking in a heavy Austrian accent. I’m not a rocket surgeon, and I’m sure the technology will somehow be viable in the distant future, but I have no idea how robots can be programmed to use deliberate reasoning reliably enough to differentiate enemy combatants from civilians. I also have no idea how a robot can be programmed to have intuition (again, like Samantha in Her). Movies can get away with highly functional artificial intelligence and robots, but in real life, I don’t know how some of the things our brains can do could be transferred into something completely synthetic. There’s still something very inexplicable, perhaps innate, about the way we work. Robots just wouldn’t be the same.

Humans and robots are both imperfect, but they make mistakes for different reasons. Humans get tired, for example, whereas robots generally are not affected by things such as sleep deprivation and mental stress. On the other hand, robots might be able to use such things as thermal vision, but humans have feelings and are not bound by some invisible set of parameters, unless you want to say something pompous about time and space being the limits of human perception. I mean, how does a robot distinguish between a crouching soldier and a large kid? How do you program feelings?

There’s a reason nobody has voiced concerns over a possible apocalypse started by robots — the robots would be too stupid to do it. I guess it would also be awkward for a country’s representative to lose her or his mind à la Michael Shannon in Take Shelter over something a little silly like that at a U.N. convention, but still. Nobody is saying robots have to be as human as humans are, but I think striving toward that achievement would lessen civilian collateral damage and also make warfare more efficient (which sort of sounds horrifying).

The last issue with lethal autonomous robots revolves around rules and laws. Arms-control rules need to be updated just like everything else, which means there are currently loopholes that would let a mischievous country do some unseemly things with robots (which would promptly backfire, I bet). Ultimately, the goal of the U.N.’s focus on robot warriors is to address those loopholes, and opponents of lethal autonomous robots will strive to get a treaty passed that makes a human order mandatory before a robot can fire on a target.

I’m in favor of such a treaty, even if it comes at the expense of future news headlines like “Robot pulverizes intoxicated college student after mistaking him for prone enemy combatant.”
