Drones may be at the center of the U.S. campaign to take out extremists around the globe. But there's a 'pervasive vulnerability' in the robotic aircraft, according to the Pentagon's premier science and technology division: a weakness the drones share with just about every car, medical device and power plant on the planet.
The control algorithms for these crucial machines are written in a fundamentally insecure manner, says Dr. Kathleen Fisher, a Tufts University computer scientist and a program manager at the Defense Advanced Research Projects Agency. There's simply no systematic way for programmers to check for vulnerabilities as they put together the software that runs our drones, our trucks or our pacemakers.
In our homes and our offices, this weakness is only a medium-sized deal: developers can release a patched version of Safari or Microsoft Word whenever they find a hole; anti-virus and intrusion-detection systems can handle many other threats. But updating the control software on a drone means practically re-certifying the entire aircraft. And those security programs often introduce all sorts of new vulnerabilities. 'The traditional approaches to security won't work,' Fisher tells Danger Room.
Fisher is spearheading a far-flung, $60 million, four-year effort to try to develop a new, secure way of coding, and then run that software on a series of drones and ground robots. It's called High-Assurance Cyber Military Systems, or HACMS.
Drones and other important systems were once considered relatively safe from hack attacks. (They weren't directly connected to the internet, after all.) But that was before viruses started infecting drone cockpits; before the robotic planes began leaking their classified video streams; before malware ordered nuclear centrifuges to self-destruct; before hackers figured out how to remotely access pacemakers and insulin pumps; and before academics figured out how to hijack a car without ever touching the vehicle.
'Many of these systems share a common structure: They have an insecure cyber perimeter, constructed from standard software components, surrounding control systems designed for safety but not for security,' Fisher told a group of researchers earlier this year.
It'd be great if someone could simply write some sort of universal software checker that sniffs out any program's potential flaws. One small problem: Such a checker can't exist. As the computer science pioneer Alan Turing showed in 1936, it's impossible to write a program that can tell if another will run forever, given a particular input. That's asking the checker to make a logical contradiction: Stop if you're supposed to run for eternity.
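The contradiction is easy to sketch. Suppose, for argument's sake, that a perfect checker existed; a program built to do the opposite of whatever the checker predicts about it traps the checker in exactly the bind Turing described. The function names below are illustrative only, not from any real tool:

```python
# A minimal sketch of Turing's argument. Assume a perfect oracle existed:
def halts(program, argument):
    """Hypothetical: returns True if program(argument) eventually stops."""
    raise NotImplementedError("Turing showed this cannot exist in general")

def contrary(program):
    # Do the opposite of whatever the oracle predicts about
    # running 'program' on its own source.
    if halts(program, program):
        while True:   # oracle says it stops, so loop forever
            pass
    else:
        return        # oracle says it loops forever, so stop immediately

# Asking whether contrary(contrary) halts forces the oracle to contradict
# itself, which is why no universal software checker can exist.
```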
Fisher became fascinated by this so-called 'Halting Problem' as soon as she heard about it, in an introduction to programming class at Stanford. 'The fact that you can prove something is impossible is such an amazing thing that I wanted to learn more about that domain. That's actually why I became a computer scientist,' she says. The instructor for the class was a guy named Steve Fisher. She was interested enough in him that she wound up marrying him after school, and taking his last name.
But while a universal checker is impossible, verifying that a particular program will always work as promised is merely an exceedingly-freakin'-difficult task. One group of researchers in Australia, for example, checked the core of their 'microkernel', the heart of an operating system. It took about 11 person-years to verify the 8,000 lines of code. Fisher is funding researchers at MIT and Yale who hope to speed that process up, as part of one of HACMS' five research pushes.
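To get a feel for what 'work as promised' means, here is a toy example, nothing like the theorem-proving machinery the Australian team used: a small function paired with the property it is supposed to guarantee. Real verification establishes that property for every possible input with a machine-checked proof; the exhaustive loop below only stands in for that idea, and all names are made up.

```python
# Toy illustration of "code plus its specification" (illustrative names only).

def saturate(value, low, high):
    """Clamp a sensor reading into the range [low, high]."""
    return max(low, min(high, value))

def spec_holds(value, low, high):
    # The property we'd want proven: output always lies in range, and
    # equals the input whenever the input was already in range.
    out = saturate(value, low, high)
    in_range = low <= out <= high
    unchanged = (out == value) if low <= value <= high else True
    return in_range and unchanged

# Checking a small domain stands in for a proof here; formal verification
# covers *all* inputs at once, which is what takes the person-years.
assert all(spec_holds(v, -10, 10) for v in range(-1000, 1001))
```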
Once the software is proven to work as advertised, it'll be loaded onto a number of vehicles: Rockwell Collins will supply the drones, namely small, robotic Arducopters; Boeing will provide a helicopter; Black-I-Robotics will supply a robotic ground vehicle; another firm will provide an SUV.
In another phase of the program, Fisher is bankrolling research into software that can write near-flawless code on its own. The idea is to give the software synthesizer a set of instructions about what a particular program is supposed to do, and then let it come up with the best code for that purpose. Software that writes more software may sound crazy, Fisher says. But Darpa actually has some history of doing it.
'There was a project led here at Darpa a few years ago [to write software for] synthetic aperture radar. They had a non-expert specify [what should go into a synthetic aperture] radar program,' Fisher adds. 'It took the system about 24 hours to produce an implementation, instead of three months [for the traditional version], and it ran twice as fast. So: better, faster and a lower level of expertise. We hope to see things like that.'
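A toy version of the idea, far cruder than anything Darpa funded and purely illustrative: give a synthesizer a specification in the form of input/output examples and let it search compositions of a few primitive operations until one satisfies every example.

```python
from itertools import product

# Tiny enumerative synthesizer (illustrative primitives, not a Darpa tool).
PRIMITIVES = {
    "double":    lambda x: x * 2,
    "increment": lambda x: x + 1,
    "negate":    lambda x: -x,
    "square":    lambda x: x * x,
}

def synthesize(examples, max_depth=3):
    """Return a pipeline of primitive names whose composition fits the spec."""
    for depth in range(1, max_depth + 1):
        for pipeline in product(PRIMITIVES, repeat=depth):
            def run(x, ops=pipeline):
                for name in ops:
                    x = PRIMITIVES[name](x)
                return x
            if all(run(x) == y for x, y in examples):
                return pipeline
    return None

# Specification: "a program that maps 1 -> 4, 2 -> 9, 3 -> 16"
print(synthesize([(1, 4), (2, 9), (3, 16)]))   # ('increment', 'square')
```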
You couldn't ask a program to write the equivalent of PowerPoint; it does too many different things. 'By the time you've finished the specifications, you might as well have written the implementation,' Fisher says. But the software that controls drones and the like? Ironically, that's far more straightforward. 'The control theory about how you do things with brakes and steering wheels, how you take sensor input and convert it to actions is described by very concise laws of mathematics.' So synthesized (and secure) software should be possible to produce.
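That conciseness is easy to see in a generic textbook controller, shown below; it is not drawn from the Arducopter or any HACMS vehicle, just an illustration of how little code a sensor-to-action law can take.

```python
# A generic proportional-derivative controller: a concise sensor-to-action
# law of the kind Fisher describes (gains and units are illustrative).

def make_pd_controller(kp, kd):
    """Return a controller mapping (error, previous_error, dt) to a command."""
    def control(error, previous_error, dt):
        derivative = (error - previous_error) / dt
        return kp * error + kd * derivative
    return control

# Example: hold a target altitude of 100 m with made-up gains.
altitude_hold = make_pd_controller(kp=0.8, kd=0.2)
command = altitude_hold(error=100.0 - 92.5, previous_error=100.0 - 91.0, dt=0.1)
print(round(command, 2))   # throttle adjustment, in arbitrary units
```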
The goal at the end of HACMS is to have the robotic Arducopter running only fully verified or synthesized software. (The other vehicles will have some, but not all, of their 'security-critical code' produced this way, Fisher promises.) And if the project works out as Fisher hopes, it could not only help secure today's largely remote-controlled drones. It could make tomorrow's drones fly on their own, without being hacked.
In the remaining component of HACMS, researchers from Galois, Inc. will work on a fully-verified, hack-proof software monitor that can watch a drone's autonomous systems. If those systems operate the robotic aircraft in a normal fashion, the monitor will sit back and do nothing. But if the drone suddenly starts flying itself in some weird way, the monitor will take over, perhaps passing control back to a flesh-and-blood operator.
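One way to picture that monitor, as a rough sketch under assumed names and limits rather than Galois's actual design: a small, separately verified check that compares each autopilot command against a fixed flight envelope and hands off to a human the moment a command falls outside it.

```python
# Rough sketch of a runtime monitor; the envelope values are invented.

SAFE_ENVELOPE = {
    "max_pitch_deg": 30.0,
    "max_roll_deg": 45.0,
    "max_descent_mps": 5.0,
}

def command_is_safe(cmd):
    return (abs(cmd["pitch_deg"]) <= SAFE_ENVELOPE["max_pitch_deg"]
            and abs(cmd["roll_deg"]) <= SAFE_ENVELOPE["max_roll_deg"]
            and cmd["descent_mps"] <= SAFE_ENVELOPE["max_descent_mps"])

def monitor(autopilot_command, hand_off_to_operator):
    """Pass normal commands through; escalate anything outside the envelope."""
    if command_is_safe(autopilot_command):
        return autopilot_command
    return hand_off_to_operator(autopilot_command)

# Example: a sudden 60-degree roll command gets flagged and handed off.
result = monitor({"pitch_deg": 5.0, "roll_deg": 60.0, "descent_mps": 2.0},
                 lambda cmd: {"action": "operator_override", "rejected": cmd})
print(result["action"])   # operator_override
```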
In other words, a drone won't just be protected from an outside attacker. It'll be protected from itself.