Cyber Security and Integrated Circuits
By Bob Gariano
In 2004 The New York Times published a column by William Safire reporting that a Russian pipeline explosion two decades earlier had been caused by a CIA-orchestrated cyber attack. In 1982 a three-kiloton explosion tore apart a natural gas pipeline that ran through remote regions of Siberia and supplied vital energy to central Russia. Without warning, the pipeline ruptured and failed catastrophically, detonating a fireball visible from observation points in outer space even in daylight.
In Safire’s article the claim was made that the CIA had intentionally placed faulty semiconductor chips and compromised embedded software into the supply chain of the companies that built the pipeline’s control mechanisms. According to the article, this faulty hardware had been installed in field equipment some months earlier, and the resulting malfunctions created the conditions that led to the gas pipeline explosion. The claim has never been confirmed or denied, though today we know that such a cyber attack is entirely possible.
Syrian Radar Systems
IEEE Spectrum, a well-regarded technical periodical that documents developments in electrical engineering, reported a similar cyber attack some years later, this one in the Middle East. Unsubstantiated reports attribute the success of the Israeli air strike on a Syrian nuclear facility in September 2007 to a carefully placed semiconductor component inside Syria’s sophisticated air defense system.
According to these reports, Israeli agents had arranged for specially designed integrated circuits containing so-called electronic kill switches to be assembled into Syria’s air defense radar systems. On the eve of the bombing attack, the kill switches were activated remotely, rendering Syrian defenses blind to the incoming Israeli aircraft.
Software Attacks
At this time, most of the publicity describing malicious cyber agents involves networks and software. Such malware has multiplied as public access to the internet has grown and as the source code and characteristics of well-known software packages have spread globally. Fortunately, with the exception of certain symbiotic malware that relies on weaknesses in its host software to survive, most malware and network infections are detectable and generally curable. The software industry has become adept at publishing patches and detection tools that defeat such nuisances.
Even so, software issues can disrupt key economic activities, if only temporarily. Three months ago, one such worm, named Stuxnet, made headlines. The new malware spread globally, transmitted at first from infected USB devices. Originally designed to steal data from information systems in the utility industry, Stuxnet is particularly hard to control. A common variant called CPLINK requires only that an infected USB device be plugged in; the infection spreads without the user having to click a suspicious icon or pop-up.
SophosLabs, a software security firm, reports that the CPLINK malware can attack most modern Windows operating systems. More ominous, because the Stuxnet worm has an affinity for the utility industry, specifically attacking supervisory control and data acquisition (SCADA) software, the malware strikes at a critical infrastructure component of the global economy.
Iranian Nuclear Shutdown
It was such a SCADA system, designed and installed by energy industry giant Siemens, that was attacked in September 2010, causing a shutdown of the new Iranian nuclear facilities. The full extent of the software’s operational failure may never be known to the rest of the world; what is clear is that the malware created an unstable operating condition in a new nuclear facility already running in a politically unstable region.
Even though Siemens seems to have solved this problem, additional variants of the Stuxnet worm are already loose, and they threaten utility systems around the world. Intimidating as they are, such software issues are only the tip of the iceberg in cyber security. Much more troubling are embedded hardware faults that can attack their host equipment from within. That brings our attention back to the faulty chips.
Complexity of Chips
The integrated circuits used in all modern electronic devices are etched into silicon wafers by automated manufacturing processes that produce hundreds of chips in every cycle. Each of these circuits, or chips, can contain the equivalent of billions of transistors, and even a few rogue transistors can dramatically alter the performance of an integrated solid-state circuit.
Checking each circuit is exceedingly difficult given the complexity and widespread use of such devices. The economics of integrated circuit production limit quality control to the most superficial level: it is cheaper, for instance, to replace an entire cell phone containing a malfunctioning chip than to test every component circuit exhaustively on every phone.
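A back-of-the-envelope calculation shows why exhaustive testing is hopeless. The numbers below are illustrative round figures, not data about any particular chip: even a modest circuit block with a 64-bit input has 2^64 possible input patterns, and at an optimistic one billion test vectors per second, checking them all would take centuries.

```python
# Illustrative arithmetic: exhaustively testing a 64-bit input space.
# The input width and test rate are assumed round numbers, not figures
# from any real chip or tester.
input_bits = 64
patterns = 2 ** input_bits            # every possible input combination
tests_per_second = 1_000_000_000      # an optimistic billion vectors/second

seconds = patterns / tests_per_second
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:,.0f} years")          # roughly 585 years
```

And 64 bits understates the problem for a real chip, whose internal state multiplies the space far beyond its input pins; production testing can only ever sample a tiny fraction of a chip’s possible behavior.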
Still, even in mission-critical devices, a seemingly well-controlled circuit can harbor numerous threats built from a proportionately tiny number of embedded hardware changes. An embedded Trojan horse or kill switch involving only a minute portion of the device would be almost impossible to detect in a chip that contains billions of transistors.
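A toy software model suggests why such a device evades detection. The sketch below is a hypothetical illustration, not an actual hardware design; the chip name, trigger value, and behavior are all invented. It simulates a part whose rogue logic is a single comparator watching for one specific 64-bit word: a million random functional tests have essentially no chance of tripping it, yet an attacker who knows the pattern can disable the part on demand.

```python
import random

TRIGGER = 0xDEADBEEFCAFEF00D  # hypothetical 64-bit activation pattern

class RadarChip:
    """Toy model of a chip carrying a dormant kill switch (illustration only)."""
    def __init__(self):
        self.alive = True

    def process(self, word: int) -> int:
        if word == TRIGGER:       # rogue logic: one 64-bit comparator
            self.alive = False    # permanently disable the chip
        return word ^ 0xFFFF if self.alive else 0  # normal work, or silence

chip = RadarChip()
rng = random.Random(42)
# One million random functional tests: the odds of hitting one specific
# 64-bit pattern are about 1e6 / 2**64, i.e. effectively zero.
for _ in range(1_000_000):
    chip.process(rng.getrandbits(64))
print(chip.alive)       # True — the trojan sleeps through testing
chip.process(TRIGGER)   # the attacker sends the activation pattern
print(chip.alive)       # False — the chip is now dead
```

The attacker’s comparator would occupy a few dozen gates out of billions, which is exactly why post-manufacture inspection is so unlikely to find it.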
Sleeper Cells
Just as important, the timeline of a hardware-based attack is more insidious than the real-time attacks of software-based viruses. Most security experts today recognize the immediate symptoms of network- or software-based malfeasance. In a global game of technical cat and mouse, software designers have been able to detect and remedy software attacks because the symptoms appear immediately and the fixes are nearly as fast to implement.
In contrast, hardware flaws intentionally embedded in chip designs become systemic intruders that are almost impossible to detect until activated. Compromised hardware is literally a ticking time bomb, implanted during chip design and manufacture and then left dormant until some point in the future. Such a detonation can be triggered remotely, and no software patch can remedy the fault. As former General Wesley Clark wrote in the December 2009 issue of Foreign Affairs, “Sabotaged circuits cannot be patched; they are the ultimate sleeper cells.”
Even as chip manufacturers become more security conscious and chip production facilities meet ever higher levels of quality assurance, it is now widely accepted that future significant cyber attacks will most likely involve hardware-based changes rather than software-based viruses. Wherever sophisticated electronic control systems are built on integrated circuits, there is an opportunity for mischief, and worse.