Thursday, June 7, 2012

How to Prevent Drone Pilot PTSD: Blame the 'Bot

Drone operators. Photo: Air Force

The human operators who control America's killer drones are susceptible to the same psychological stress that infantrymen sometimes experience after combat. But better drones and control systems could help reduce the controllers' stress levels by allowing the people to blame the robots for the awful human cost of remote air strikes.

But there's a downside. Sometimes you don't want drone operators avoiding feelings of guilt.

At least that's what Stanford University researcher Ryan Calo has concluded. Calo, one of the country's top experts on the legal and ethical aspects of robot technology, has written extensively on the subject and closely tracks the work of other researchers in his field. "It really matters how you design the controls," Calo tells Danger Room. "Design and interface design can change incentives and can change the psychological impact."

When a missile gets fired or a bomb dropped, something that's happened hundreds of times in America's fast-expanding robotic air war, someone or something is going to get blamed for any resulting deaths. The question is whether a human being absorbs all of that culpability, which can mean an enormous emotional burden.

For drone operators, many of whom live in the U.S. and steer their armed drones via satellite from air-conditioned trailers, combat stress can be accentuated by the contrast between their jobs and their otherwise peaceful surroundings. "You shoot a missile, you kill a handful of people," Missy Cummings, an MIT drone developer and former pilot, told Salon. "And then, this is what is strange, you go home. Your shift is over."

When you fight in a war without living in a combat zone, "it's harder to keep it in perspective," Cummings said.

The question is who shoulders the feelings of guilt and remorse that can result from even justified drone strikes that kill only enemy combatants, Calo says. "People feel more, or less, responsible for tasks they perform, depending on whether or not [the tasks] happened to them virtually or physically."

In other words, the more operators can offload their feelings onto the robots, the better they'll feel. The problem with today's Predator and Reaper drones and their standard ground control stations is that the stations don't give the human controllers a chance to shift feelings of responsibility onto the 'bots.

For one, today's armed drones and ground stations require human operators to constantly babysit the robots, monitoring their systems and keeping an eye on their sensor feeds. Spending hours at a time directly controlling a drone tends to make the operator feel like he, rather than the machine, is the real combatant.

So when it's time to fire the missile or drop the bomb, the human being feels like he's doing the shooting, rather than the drone doing it. No one disputes that human 'bot-operators make the decision to fire or not to fire. The issue isn't the actual cause and effect underpinning a drone strike. It's the psychological aspect that Calo is concerned about. "How much people feel in control of the drone matters to how they experience war," Calo says.

In reality, the human and the drone inflict the violence as a team. But the moral and emotional burden falls only on the mind of the human controller. Unmanned aircraft with additional autonomy, requiring less human monitoring, could potentially ease those burdens.

A more independent drone could summon its controller only when it has spotted a likely target. The operator would give a thumbs-up or thumbs-down for the robot to fire a weapon. With only that minimal involvement, the human being could avoid feeling fully responsible for the consequences of the strike. Drones are already becoming more autonomous by the day, opening the door to a different emotional dynamic between them and their operators.

Another guilt-avoidance tactic is to anthropomorphize the drone, Calo adds. It turns out that robots that look more human can inspire many of the same emotions that actual people do in each other. The Battlefield Extraction-Assist Robot, a sort of remote-controlled medic, was intentionally designed with a vaguely human shape in order to comfort wounded troops.

By the same token, people perceive human-seeming robots as more morally culpable than strictly machine-like 'bots, Calo says. "If you're building a search-and-rescue robot, you want to make the robot more anthropomorphic, so if the [human] rescuer tries pulling someone out of a burning building with a robot and fails, you want them to feel it's the robot's fault."

One way to anthropomorphize robotic airplanes is to give them the ability to speak with their operators, much as Apple's Siri app does. Not coincidentally, the Air Force Research Laboratory in Ohio is working on two-way voice controls for drones, hoping to introduce them in the next decade or so.

But there's risk in alleviating drone controllers' combat stress too much, Calo points out. Drone operators who are absolved of their sense of responsibility might become too cavalier in approving lethal attacks (though, of course, drone operators don't typically launch attacks without getting approval from their chain of command). Still, Calo says, "if what you want is for soldiers to internalize the cost of war and have incentive not to pull the trigger, maybe you want to design [robot controls] so that a soldier really feels like he's there."

Somewhere, there's a balance to be struck between keeping operators accountable and offloading enough responsibility onto the robots to protect the mental health of the men and women who control them. But the drone wars are expanding faster than anyone can find that stable footing.


