
Stuxnet Raises 'Blowback' Risk In Cyberwar

Originally published on Fri November 4, 2011 4:53 pm

The Stuxnet computer worm, arguably the first and only cybersuperweapon ever deployed, continues to rattle security experts around the world, one year after its existence was made public.

Apparently meant to damage centrifuges at a uranium enrichment facility in Iran, Stuxnet now illustrates the potential complexities and dangers of cyberwar.

Secretly launched in 2009 and uncovered in 2010, it was designed to destroy its target much as a bomb would. Given the cyberworm's sophistication, the expert consensus is that a government created it.

"Nothing like this had occurred before," says Joseph Weiss, an expert on the industrial control systems widely used in power plants, refineries and nuclear facilities like the one in Iran. "Stuxnet was the first case where there was a nation-state activity to physically destroy infrastructure [via a cyberattack]."

Reactions to the use of Stuxnet in Iran generally fall into two categories. For those focused on the danger of Iran developing a nuclear weapon, Stuxnet was something to celebrate, because it set back Iran's nuclear program, perhaps by years.

But for people who worry about the security of critical U.S. facilities, Stuxnet represented a nightmare: a dangerous computer worm that in some modified form could be used to attack an electric or telecommunications grid, an oil refinery or a water treatment facility in the United States.

"It's just a matter of time," says Michael Assante, formerly the chief security officer for the North American Electric Reliability Corporation. "Stuxnet taught the world what's possible, and honestly it's a blueprint."

Further complicating the Stuxnet story is the widely held suspicion that the U.S. government, possibly in partnership with Israel, had a hand in the creation of this lethal cyberweapon, notwithstanding the likelihood that in some form it could now pose a threat to the U.S. homeland.

Training To Face A Catastrophe

The prospect of a cyberattack on U.S. infrastructure assets has prompted the Department of Homeland Security to arrange a new training program for the people who are supposed to protect the electric grid, manufacturing plants, refineries, water treatment centers and other critical facilities.

The top concern is the security of the industrial control systems (ICS) that oversee the operation of key equipment at those facilities, from the valves to the breaker switches.

By hacking into the computer networks behind the industrial control systems, an adversary could reprogram an ICS so that it commands the equipment to operate at unsafe speeds or the valves to open when they should remain closed. This is roughly the way Stuxnet was able to damage the centrifuges in Iran.
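
What follows is a deliberately simplified, hypothetical sketch in Python (not actual ICS, PLC or Stuxnet code) of how reprogrammed control logic might drive a pump past its safe speed while the operator's display continues to show a normal reading. The names and numbers are invented for illustration.

    # Hypothetical illustration only -- real industrial controllers run
    # vendor-specific logic and firmware, not Python like this.

    SAFE_MAX_RPM = 1200          # assumed safety limit for this example
    NORMAL_SETPOINT = 900        # assumed normal operating speed

    class PumpController:
        """Toy model of a pump controller and its operator display."""

        def __init__(self):
            self.actual_rpm = NORMAL_SETPOINT     # what the hardware does
            self.displayed_rpm = NORMAL_SETPOINT  # what the operator sees

        def legitimate_command(self, rpm):
            # Normal logic: clamp to the safe limit; display matches reality.
            self.actual_rpm = min(rpm, SAFE_MAX_RPM)
            self.displayed_rpm = self.actual_rpm

        def compromised_command(self, rpm):
            # Tampered logic: no safety clamp, and the display is frozen at a
            # believable value, so the control room sees nothing wrong.
            self.actual_rpm = rpm
            self.displayed_rpm = NORMAL_SETPOINT

    pump = PumpController()
    pump.compromised_command(1600)                           # well past the safe limit
    print("operator display:", pump.displayed_rpm, "rpm")    # 900
    print("actual pump speed:", pump.actual_rpm, "rpm")      # 1600

The point of the sketch is only that once an attacker can rewrite the control logic, both the equipment's physical behavior and what the operator is shown are in the attacker's hands.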

Participants in the training program, based at the Idaho National Laboratory in Idaho Falls, are taken step by step through a simulated cyber-intrusion, so they can experience firsthand how a Stuxnet-like attack on their facilities might unfold.

During an Idaho National Laboratory exercise that was staged for visiting reporters in late September, instructor Mark Fabro stations his "red" team on the second floor of the training center, with the mission of penetrating the computer network of an unsuspecting industrial company, set up on the floor below.

The trainees on the "blue" team downstairs sit in a mock control room, monitoring their computer screens for any sign of trouble.

At first, everything appears normal. The attackers have managed to take control of the computer network without the defenders even realizing it. But gradually, problems develop in the control room.

"It's running really slow," says one operator. "My network is down."

Sitting at their monitors upstairs, the attacking team is preparing to direct the computer system to issue commands to the industrial equipment.

"Take this one out," says Fabro, pointing to a configuration that identifies the power supply to the control room. "Trip it. It should be dark very soon."

Within 30 seconds, the mock control room downstairs is dark.

"This is not good," says Jeff Hahn, a cybersecurity trainer who this day is playing the role of the CEO of the industrial company under attack. The blue team is under his direction.

"Our screens are black and the lights are out. We're flying blind," Hahn says.

During the exercise, the critical industrial facility under attack is a pumping station, such as might be found in a chemical plant or water treatment center. As the operators sit helpless at their terminals, the pumps suddenly start running, commanded by some unseen hand. Before long, water is gushing into a catch basin.

"There's nothing we can do," one of the operators tells the CEO. "We can only sit here and watch it happen."

If this mock facility were an actual chemical plant, hazardous liquids could be spilling. If it were an electric utility, the turbines could be spinning out of control.

If it were a refinery, the tanks could be bursting or pipelines could be blowing up, all because the cyberattackers have been able to take over the computer network that controls the key operations.

The cyberattack scenario is all the more worrisome because it is not clear that such attacks can be effectively stopped.

"Some of these [systems] can't be protected," says Weiss, the industrial control systems security expert. "We're going to have to figure out how to recover from events that we simply can't protect these systems from."

A U.S. Role In Stuxnet?

The challenge of managing a Stuxnet-like attack is compounded by the possibility that the U.S. government itself had a role in creating the cyberweapon.

U.S. officials were certainly aware of the ICS vulnerabilities that the Stuxnet worm ultimately exploited. An Idaho National Laboratory experiment in 2007, dubbed "Project Aurora," first demonstrated how cybercommands alone could destroy industrial equipment. Idaho lab researchers, who at the time included Michael Assante, rewrote the ICS computer code controlling a large generator, directing the machine to destroy itself.

"When we started to conduct the test, that really robust machine couldn't take it," Assante recalls. "The coupling broke ... and you saw black smoke belching out of it."

In 2008, Idaho National Laboratory researchers gave a demonstration that expanded on the Aurora experiment and their further analysis of ICS vulnerabilities. The accompanying PowerPoint briefing was prepared specifically for Siemens, the company whose equipment the Stuxnet attack targeted. One year later, the worm was introduced into Siemens ICS equipment used at a uranium enrichment facility in Natanz, Iran.

Ralph Langner, a German cybersecurity researcher who was among the first to analyze the Stuxnet code, came away convinced that it was a U.S. creation.

"To us, it was pretty clear that the development of this particular malware required resources that we only see in the United States," Langner says.

Marty Edwards, director of the Department of Homeland Security's Industrial Control Systems Cyber Emergency Response Team, based at the Idaho lab, denies any Idaho National Laboratory role in the creation of Stuxnet and says the ICS weaknesses the worm exploited were relatively well known by the time it was created.

"I think it was only a matter of time before those common weaknesses or vulnerabilities were leveraged in an event such as Stuxnet," Edwards says. He would not comment on any role that other U.S. government agencies might have played in the development of the Stuxnet weapon.

That the United States has an offensive capability in the cyberwar domain is a matter of official record. Activities in that area are highly classified, but officials privately acknowledge that U.S. agencies have developed cyberweapons for offensive use.

It has also been reported that the United States has engaged previously in the sabotage of Iranian nuclear facilities. The use of Stuxnet would fit squarely within such a category.

Joel Brenner, the former inspector general at the National Security Agency, writes in his new book, America the Vulnerable, that the use of Stuxnet "would ... have been consistent with U.S. policy but not with previous U.S. methods, which avoided computer operations likely to damage others besides its intended targets."

Some observers have argued that the risk of a weapon like Stuxnet being turned against U.S. assets was so great that no U.S. government agency could logically have supported its development. But others aren't so sure.

Among them is Assante, who was among the first cybersecurity experts to warn that Stuxnet could provide a blueprint for attacks on U.S. infrastructure.

Now the president of the National Board of Information Security Examiners, Assante argues that concerns about Iran developing a nuclear weapon could have justified Stuxnet's creation.

"That is probably one of the largest national security challenges I can envision," Assante said in a recent meeting with reporters at the Idaho lab. "In that context, you can make a pretty strong argument that the benefit of using a cyberweapon to slow down or delay [a nuclear weapon program] or to achieve a specific objective might absolutely outweigh the risk."

Questions Of Information-Sharing

Given the secrecy around the U.S. offensive cyberwar capability, however, that cost-benefit analysis could only be carried out at the highest levels of the U.S. government. Moreover, it is unclear whether agencies responsible for defending the U.S. infrastructure would even be part of the deliberation.

"[The development of a cyberweapon] would probably be so highly classified that the people at DHS wouldn't even know about it," says one former intelligence official.

Such a strict compartmentalization of policymaking would raise the question of whether there is sufficient communication between the offensive and defensive teams in the cyberwar domain.

If Stuxnet was developed by U.S. cyberweapon specialists, the DHS personnel who spent a year analyzing the computer code were presumably engaged in a major duplication of effort.

But Greg Schaffer, assistant secretary of homeland security for cybersecurity and communications, says DHS officials have no complaint over coordination with U.S. agencies responsible for offensive cyber-activities.

"DHS is focused on network defense," Schaffer says. "We do get assistance from the organizations that work on the offensive mission. Whether they bring their work [to us] is something they have to decide. That is not something that we worry about."

A growing awareness of the cyberthreat to critical U.S. infrastructure assets, however, may well deepen concerns about the "blowback" risk to the U.S. homeland from the development of a potent cyberweapon designed to be used elsewhere.

The appropriate level of information-sharing between the offensive and defensive teams within the U.S. cybercommunity is likely to be the focus of intense interagency discussion.

"My sense is that there are lots of people talking about it," says Herbert Lin, chief scientist at the National Academy of Sciences and a co-editor of a book on policy, law and ethics in cyberwar. "But almost all of the discussion is going on behind closed doors."

Eventually, this could change. Whether and when the United States should use nuclear weapons or chemical weapons or land mines has been vigorously debated in public for years, and it may be only a matter of time until the use of cyberweapons gets similar attention.


Transcript

STEVE INSKEEP, HOST:

About two years ago, a sophisticated cyber-attack struck Iran's nuclear program. The computer worm used in that attack was called Stuxnet.

RENEE MONTAGNE, HOST:

Researchers described it as the world's first cyber-superweapon. Many suspect the United States was involved.

INSKEEP: Now comes fear of blowback. Some security experts worry that a similar cyber-weapon, or even the same one, could be used to attack the United States. NPR's Tom Gjelten reports.

TOM GJELTEN, BYLINE: The Iranian nuclear plant targeted by Stuxnet had something in common with power plants and oil refineries and water treatment facilities here in the United States: The equipment in all these places is run by computers. You get control of the computer, you control the equipment. You can even destroy it.

Cybersecurity experts have known this for years, but until Stuxnet came along, no one had launched a cyber-attack along this line. Now it's happened.

The Stuxnet attack in Iran physically destroyed centrifuges by working through the computers that controlled them. But now, the people who operate industrial plants in the U.S. need to be prepared for something like Stuxnet being used against them.

MARK FABRO: What happens is when the adversary was actually doing that...

GJELTEN: The Department of Homeland Security has a cybersecurity training program at the Idaho National Laboratory in Idaho Falls. Instructor Mark Fabro takes power plant or refinery security officers through a demonstration - essentially a game - where they get an idea how an adversary could penetrate their computer systems without being detected.

FABRO: Wouldn't it be great for the adversary to be able to manipulate the system and not let the operator see it? Then you get into some very, very interesting ideas.

GJELTEN: Fabro could be describing Stuxnet. One of the features of the worm was that it hid itself, sending messages that everything was normal when, in fact, Stuxnet was in control.
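
That hiding behavior can be illustrated with another minimal, hypothetical Python sketch (again, not the worm's actual code): an attacker records a stretch of normal sensor readings and replays them to the operator console while the real values drift far out of range.

    import itertools

    # Hypothetical sketch of the "record and replay" trick: feed the operator
    # console previously captured normal readings while the attack runs.

    recorded_normal = [998, 1001, 1000, 999, 1002]   # captured before the attack
    replay = itertools.cycle(recorded_normal)        # loop them indefinitely

    def reading_for_operator(real_value, attack_active):
        # Pass real data through until the attack starts; afterwards the
        # console gets stale-but-plausible values instead of the truth.
        return next(replay) if attack_active else real_value

    # The device drifts far from its nominal reading of about 1000, but the
    # console keeps showing values close to 1000.
    real_values = [1000, 1450, 1630, 1820]
    for step, real in enumerate(real_values):
        print(reading_for_operator(real, attack_active=(step > 0)))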

For these training sessions, the instructors have set up a mock control room - like what would be found in a power plant, for example. The trainees play like they're the plant operators, monitoring the computers that control the plant equipment. In the initial phase of the exercise, everything seems normal. Then all of a sudden, things start to go wrong.

UNIDENTIFIED MAN #1: It's running kind of slow. It's running really slow. So we got...

UNIDENTIFIED MAN #2: I can't get to my network scans.

UNIDENTIFIED MAN #1: We got something, we got something.

GJELTEN: The cyber attacking team is in a separate room upstairs. Unbeknownst to the defenders, the attackers have worked their way into the heart of the computer system that runs this facility, right down to the lights in the control room. And they're ready to pounce.

FABRO: So here, if this is live, we'll kill, we want to, yeah, take this one out. Trip it. And trip it. And our next feed going back to the camera, it should be - it should be dark very soon.

GJELTEN: Sure enough, about 30 seconds later, the power goes out in the control room. The plant operator and his team realize they're in big trouble.

UNIDENTIFIED MAN #3: This is not a good thing. Our screens are black, the lights are out. We're flying blind.

GJELTEN: In this case, the exercise is being staged, just for some visiting reporters. The attack is on a pumping station. The idea is that it's a manufacturing plant or water treatment center or another facility where pumps are used. And before long, it's the cyber attackers who are running the show, not the plant operators.

UNIDENTIFIED MAN #1: So right now we have no control. So we don't have control of the process. You know, it looks like it's running itself.

GJELTEN: Suddenly, the pumps in this facility turn on. No one in the control room has done anything. But before long, the pumps are pushing water into a catch basin.

UNIDENTIFIED MAN #3: It's pretty bad now. We don't have control of the control system, which is in this cabinet here. Water's falling into the basin here, and we're powerless, right now, to do anything. We can only just wait and watch the spill happen as it's happening.

GJELTEN: If this were an electric utility, the turbines could be spinning out of control right now; if it were a refinery, tanks could be bursting. Pipelines could be blowing up — all because the cyber-attacker has been able to take over the computer system that controls the operation.

In general terms, this is the way Stuxnet worked with the centrifuges at the uranium enrichment plant in Iran. About a thousand were disabled when Stuxnet ordered them to spin at the wrong rate. Now we have to worry someone will use a similar worm to attack critical facilities here in the U.S.

MIKE ASSANTE: It's a matter of time.

GJELTEN: Mike Assante is a former chief of security for the North American electric grid.

ASSANTE: Stuxnet taught, not just us, from a defender perspective, what's possible, but it taught the rest of the world what's possible, and honestly it's a blueprint.

GJELTEN: Assante says he worries about Stuxnet. So does Joe Weiss, a top U.S. expert on the industrial control systems used in power plants, refineries, dams, and other parts of the U.S. infrastructure.

JOE WEISS: Stuxnet was the first case where there was a nation-state activity to physically destroy infrastructure. Nothing like this had occurred before.

GJELTEN: Not with a cyber-weapon, anyway. Weiss has written a textbook on how to protect industrial control systems. He says the most dangerous aspect of cyber-weapons like Stuxnet is that there's no computer patch or easy fix that operators can use to defend their plants from this kind of attack.

WEISS: Some of these are not going to be able to be protected. And we're going to need to figure out, how do we recover from events like this that we simply can't protect these systems from?

GJELTEN: So there are two ways to think about Stuxnet. For people who've worried about Iran getting a bomb, Stuxnet was something to celebrate: It set back Iran's nuclear program. But for people who worry about the security of American facilities, Stuxnet represented their worst nightmare: a dangerous computer worm that in some form could disable a power plant, an oil refinery, or some other piece of the U.S. infrastructure. And one against which we may have no defense.

The next question: Did a U.S. government agency develop Stuxnet as a weapon to use against Iran? And if the United States is developing offensive cyber-weapons, how do we deal with the possibility that those weapons might also be used against us?

That story tomorrow. Tom Gjelten, NPR News. Transcript provided by NPR, Copyright NPR.