Will the cyber worm turn?
The tale of the Stuxnet worm is one of those seemingly good-news stories that grow more worrisome over time.
Security experts first became aware of the mysterious Stuxnet malware last summer, but it wasn’t until months later that they agreed on its likely target: Iran’s secretive nuclear weapons program. The worm hid itself benignly in personal computers, spreading (often through USB drives) until it could infect machines made by Siemens that control motors and other industrial equipment. The infected controllers intermittently sent the motors racing, all the while reporting that everything was normal.
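To see why the falsified readings were so insidious, consider a minimal sketch of the idea. The toy Python simulation below is purely illustrative; the class names, speeds and cycle logic are invented for this column and have nothing to do with Stuxnet's actual code or any Siemens product. It shows a compromised controller that intermittently drives a motor past its safe speed while always reporting a normal value to the operator's console.

# A toy simulation of falsified telemetry in an industrial control loop.
# Purely illustrative: the names, speeds and logic here are invented and
# bear no relation to Stuxnet's actual code or to any Siemens product.

NORMAL_RPM = 1000     # hypothetical safe operating speed
DAMAGING_RPM = 1400   # hypothetical speed that stresses the motor

class Motor:
    """A motor that runs at whatever speed it is commanded to."""
    def __init__(self):
        self.rpm = NORMAL_RPM

    def set_speed(self, rpm):
        self.rpm = rpm

class CompromisedController:
    """Intermittently commands damaging speeds, but always reports normal."""
    def __init__(self, motor):
        self.motor = motor

    def run_cycle(self, step):
        # Every tenth cycle, push the motor past its safe speed...
        if step % 10 == 0:
            self.motor.set_speed(DAMAGING_RPM)
        else:
            self.motor.set_speed(NORMAL_RPM)
        # ...while the value shown to the operator never changes.
        return NORMAL_RPM

if __name__ == "__main__":
    motor = Motor()
    controller = CompromisedController(motor)
    for step in range(1, 21):
        reported = controller.run_cycle(step)
        print(f"cycle {step:2d}: actual rpm = {motor.rpm}, reported rpm = {reported}")

Because operators see only what the controller chooses to report, damage of this kind can accumulate for months before anyone suspects the software itself.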
Analysts speculate that Stuxnet damaged a sizable percentage of the gas centrifuges at Iran’s well-guarded uranium enrichment facility in Natanz, which relies on Siemens controllers. Iran hasn’t talked in detail about the situation, but U.S. and Israeli officials (who won’t discuss Stuxnet publicly) are no longer projecting that Iran is poised to develop an atomic bomb. Instead, they’ve pushed back their estimates by several years, citing unspecified “technological problems.”
If Stuxnet was responsible for slowing Iran’s nuclear development, then it accomplished over a period of a few months what the United States and its allies have failed to do in years of talks, threats and sanctions. It also achieved that goal without a shot being fired, buying more time for negotiators to try to persuade Iran to stop its bomb-making efforts.
That’s the encouraging side of the story. The other side is that Stuxnet demonstrates heretofore unseen capabilities of cyber attackers, many of whom aren’t playing for our side.
The stakes are particularly high for the United States, where so many crucial pieces of infrastructure — such as the electrical grid, transit systems, sewage treatment plants and dams — rely on automated systems. But security experts are sharply divided over how imminent the threat is, with some saying the risk of cyber warfare has been overblown by self-interested public officials and contractors, and others expressing grave concern and calling on the government to shore up defenses on the double.
Governments have long engaged in the sort of sabotage that Stuxnet evidently was designed to do. In the 1980s, for example, the U.S. supplied unwitting Soviet agents with defective equipment, components and designs to disrupt a number of high-priority technology projects. What’s different about Stuxnet is the use of computer malware to accomplish that disruption remotely.
One of the signal achievements of the Stuxnet authors was their ability to infect machines that aren’t connected to the Internet, and possibly not to any kind of computer network. Manufacturers of industrial controllers and automation equipment had long assumed that they didn’t need elaborate security mechanisms because they weren’t online. That assumption gave way over the years to a more cautious approach, based on the theory that even isolated computers could be vulnerable. Stuxnet has proved those fears to be well founded.
Granted, creating Stuxnet required far more resources than ordinary hackers typically possess. Security analysts say the worm appears to have been produced by a team of people over a period of months, and it could not have been accomplished without extensive knowledge of the targeted controllers and software as well as the setup in Natanz. That’s why so many fingers have pointed at the U.S. and/or Israeli governments as the likely masterminds.
So Stuxnet doesn’t provide a blueprint for wreaking havoc on U.S. nuclear plants or financial institutions. Nevertheless, it’s hard to ignore the signs that a new kind of arms race has started, one that goes beyond the denial-of-service attacks and corporate espionage that hackers allegedly conducted, either at the direction of or in support of their governments, against Estonia in 2007, the former Soviet republic of Georgia in 2008 and Google in 2009.
The thought of such an arms race is troubling for at least two reasons. The first is that we don’t know how the existing international laws and treaties that govern conventional conflicts would apply to cyber war, if at all. For example, what constitutes an attack, how can anyone tell who’s responsible, and what kind of response is justified?
More important, the United States isn’t positioned well to defend against a weapon of Stuxnet’s caliber. It’s not for lack of trying; over the last year, the Obama administration has activated a “cyber command” at the Defense Department to raise the military’s defenses against intrusion and develop offensive capabilities, and it has improved coordination between the Pentagon’s efforts and the Department of Homeland Security’s initiative on the civilian response to cyber threats.
But the government’s reach is limited. More than 80% of crucial U.S. infrastructure is in private hands. And while we count on the military to protect that infrastructure against attacks from land, air or sea, it’s up to the owners to defend themselves against attacks from cyberspace.
That’s the right strategy, considering how much there is to defend and how much faster private industry can adapt to changes in technology than the government can. But the U.S. has done a poor job of making sure that operators of crucial infrastructure stay on top of the changing threats. Although regulatory agencies have laid out best practices for guarding against intrusion, compliance is voluntary and some industry leaders have fiercely resisted efforts to go further. Banks and power companies have their own business reasons to defend against cyber threats, but that’s no guarantee that they’ll take all the steps they should.
The ultimate vulnerability is the public’s lack of understanding about the role individuals play in creating cyber security risks. Despite years of warnings about computer viruses and identity theft, millions of computer users still routinely do risky things that help cyber criminals gain control of their PCs. That sort of behavior is what enables hackers to assemble “botnets” and launch the kind of attacks that crippled government websites in Georgia as Russian forces rolled across the border.
There’s a big difference between flooding websites with traffic and making machines go haywire, but it’s worth remembering that Stuxnet was spread by people doing something simple and common that was riskier than they realized: transferring files from one PC to another through a USB drive. In an increasingly interconnected world, it’s hard to tell where the cyber battlefield begins and ends. An effective defense starts with everyone understanding what the risks are and what they can do to minimize them.