Emphasis has been placed on cyber security and resilience of computer systems: patching software, zero trust, and multi-factor authentication. We continue to make significant progress with add-on cyber security features protecting IT systems, many of which are now inextricably linked to both Internet of Things (IoT) and operational technology (OT) networks. However, the same cannot be said for control system field devices.
Informing and educating engineers and operators on vulnerabilities introduced by IT systems will not, by itself, ensure more secure and resilient operation of the cyber-enabled physical systems that provide energy, water, transportation, and pipeline services. Bringing engineers, equipment designers, and operators into cyber security discussions is essential to give decision makers the input they need. Additional focused research, education, and tools are needed to address cyber-induced physical damage (sometimes called “kinetic effects”) to critical infrastructures.
The Transportation Security Administration (TSA) Pipeline Cyber Security Requirements are one example where the gap between network specialists and engineers is impeding the development of cyber security requirements that can address physical damage. Pipeline cyber security has been a priority since the May 7, 2021 Colonial Pipeline ransomware attack. Understandably, the focus has been on ransomware and other network attacks that can shut a pipeline down. Incidents and intrusions continue to become more prevalent, complex, and disruptive. Consider the Colonial Pipeline attack, which disrupted fuel deliveries in the Eastern United States, and it’s easy to see how an attack on IT or business systems can have a significant operational impact, in this case on the supply chain.
There are other risks, however, that also deserve attention. There has been little focus on cyber incidents targeting control systems that use perfectly acceptable commands and instructions in such a way as to cause physical damage: pipe leaks, pipe ruptures, or pipeline compressor station damage. To cause kinetic damage, an intruder works backward, looking at cyber and physical systems in the context of the target facility and its physical processes. Engineers are familiar with inherent design and operational “weaknesses” in natural gas compressor stations, petrochemical distillation columns, wastewater digester systems, power plant boiler control systems, etc. In general, IT specialists, including cyber security experts trained to look for malware and anomalies in code and algorithms, are not trained to understand how software or firmware instructions can be sequenced to create physical failures of key operational components.
Why this blog
On April 5, 2022, Jeff Carr issued a report: “GURMO Hackers Go Kinetic Against Gazprom - Two Pipeline Fires So Far” (https://jeffreycarr.substack.com/p/gurmo-hackers-go-kinetic-against?s=w&_kx=1VbQ6aS7UUo96k7jvQYVzpE9cdWQ6Vzi0kxTTI5Tuea6iOPy2id1165HyRje7xZB.RHPy4M). According to Jeff, “cyber operators at the Main Directorate of Intelligence at the Ministry of Defense of Ukraine (GURMO) have been conducting computer network operations (CNO) against Gazprom. To date, two pipelines have experienced rupture events that were directly the result of a computer network attack.”
Relevance to TSA Pipeline and other infrastructure cyber security requirements
Jeff’s report is still subject to confirmation. However, the story raises the specter that pipelines (and all other physical infrastructures) can be at risk of physical damage from cyberattacks. In this case, the TSA Pipeline Cyber Security Requirements (and all other physical infrastructure cyber security requirements) don’t address cyberattacks that could cause kinetic damage to pipeline equipment (and other physical infrastructure equipment).
There have been more than 50 pipeline control system cyber-related incidents since the late 1990s (and similar or significantly greater numbers in other infrastructures, such as more than 125 control system cyber incidents in water/wastewater and more than 500 control system cyber incidents in electric power). These include five cyber-related pipeline ruptures. This blog will identify the deficiencies in the TSA Pipeline Cyber Security Requirements by comparing them to control system kinetic cyberattacks (notably the Stuxnet attacks and the Aurora demonstration) and to unintentional cyber-related pipeline ruptures. The same approach can be used to identify the control system cyber security requirement gaps in the other infrastructures, as there has been cyber-related kinetic damage including train crashes, plane crashes, refinery explosions, significant power plant damage, major environmental spills, medical device fatalities, food adulteration, data center and laboratory facility damage, water contamination, etc.
TSA Pipeline Cyber Security Requirements
The TSA Pipeline Cyber Security Requirements were issued in Security Directive Pipeline-2021-01, “Enhancing Pipeline Cybersecurity,” on May 28, 2021. On July 20, 2021, TSA announced its second pipeline cybersecurity directive, requiring critical pipeline owners and operators “to implement specific mitigation measures to protect against ransomware attacks and other known threats to information technology and operational technology systems, develop and implement a cybersecurity contingency and recovery plan, and conduct a cybersecurity architecture design review.” The TSA Administrator has stated that the NIST Cybersecurity Framework, which is referenced in the second directive, “would give an idea of some of the items that we require,” and that the directive also mandates cyber architecture design reviews and contingency planning.
There is no mention of control system-directed cyberattacks. There is also a question about the term “known” – known to whom? Well-developed digital forensics techniques and tools for control systems are limited. Consequently, disclosure requirements may not be triggered simply because it may not be clear that a system is under cyberattack. Consider the ambiguity surrounding not just the Aurora demonstration, but the Stuxnet, Triton, and other incidents.
Lack of Engineering input
According to a senior representative at TSA, there were no pipeline engineers involved in the development of the pipeline cyber security requirements. It’s not surprising that the requirements do not address potential pipe failures, because they don’t address the process sensors and actuators the engineers concern themselves with. As Honeywell’s Sinclair Koelemij stated on LinkedIn: “A very small number of people actually know the details of pipeline automation. Only core OT engineers that work(ed) for a mac, epc, or asset owner would know these details. Which is probably not more than a few thousand globally, of which probably less than 1 % are involved in cyber security.”
Examples of where the TSA requirements are network-based to the exclusion of sensors and actuators include:
- Discovery of malicious software on an Information or Operational Technology system;
- Activity resulting in a denial of service to any Information or Operational Technology system;
- A physical attack against the Owner/Operator's network infrastructure, such as deliberate damage to communication lines;
- Any relevant information observed or collected by the Owner/Operator, such as malicious IP addresses, malicious domains, malware, or the abuse of legitimate software or accounts;
- A description of the incident's impact or potential impact on Information or Operational Technology systems and operations. This information must also include an assessment of actual, imminent or potential service operations, operational delays, and/or data theft that have or are likely to be incurred….
Missing is any mention of pipeline leaks, ruptures, process sensors, actuators, compressor stations, chemical analyzers, etc.
Lack of reporting of control system cyber incidents
Consider the conclusions of an article we contributed to the January 2022 issue of IEEE Computer magazine – “Control System Cyber Incidents Are Real—and Current Prevention and Mitigation Strategies Are Not Working”. The article states:
“There are minimal to no control system cyber forensics below the Internet Protocol (IP) level and almost no training for the engineers to identify if an upset condition or sensor malfunction could possibly be cyber-related. As an example, the chemical plant in Saudi Arabia that was the victim of the Triton attack on the safety systems was restarted with malware still in the system as no one recognized the plant shutdown was caused by a malware attack. It was not until a second shutdown occurred that it was recognized as a cyberattack. The culture gap between engineering and networking can exacerbate this inability to detect a control system cyberattack.
Companies and organizations are usually reluctant to publicly acknowledge they have been the knowing or even unsuspecting victim of a cyberattack. Most cyberattacks on large public companies are evaluated to be below the materiality threshold required for financial reporting. Reporting requirements tend to apply to data breaches, not to equipment damage or injuries/deaths. Internet of Things (IoT) legislation focusing on data breaches will likely make this lack of control system cyber incident reporting even more of a challenge. This can be seen from the recent pipeline cyberattack disclosures from TSA (https://www.controlglobal.com/blogs/unfettered/tsa-cyber-security-requirements-are-still-not-addressing-control-system-unique-issues ) and compared to previous cyber-related pipeline ruptures.
There are minimal control system cyber forensics and logging for control system field devices and minimal training for Operational personnel to identify control system cyber incidents. Consequently, there are few publicly identified control system cyber incidents. There are common threads to many of the Industrial Control System (ICS) cyber incidents beyond the traditional IT breakdowns given in the ICS Computer Emergency Response Team (CERT) report. In the 2007-10 time-frame, Applied Control Solutions was under contract to MITRE supporting NIST to extend NIST 800-53 for control systems. As part of that effort, three real public cases were used to demonstrate how the extended NIST 800-53 standard would be useful to the non-federal government organizations:
- the Maroochy Shire wastewater SCADA attack,
- the Olympic Pipeline rupture, and
- the Browns Ferry 3 nuclear plant broadcast storm.”
The TSA cyber security requirements focus on IT network issues for reporting. As an example, critical pipeline operators reported more than 220 cybersecurity incidents in the several-month period after TSA implemented emergency measures in the wake of the Colonial Pipeline shutdown, according to TSA Administrator David Pekoske. However, I am not aware of any reports of pipeline ruptures or pipeline outages during that time period, meaning the 220 cyber security incidents were IT incidents that did not affect the operation of the pipelines (https://www.controlglobal.com/blogs/unfettered/us-critical-infrastructure-cyber-security-is-backwards-its-the-process-that-counts-not-the-data). Moreover, the 220 reported incidents are more than almost all other industries have reported over 10 years.
Control system issues can cause kinetic damage to pipeline infrastructure
Natural gas compressor stations use computers to regulate the flow and number of units that are needed to handle the scheduled system flow requirements. Every station has an emergency shutdown system connected to a control system that can detect abnormal conditions such as an unanticipated pressure drop or natural gas leakage. These emergency systems will automatically stop the compressor units and isolate and vent compressor station gas piping. However, control system cyber vulnerabilities can affect the ability to stop the compressors resulting in damage to natural gas compressor stations.
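The dependency described above can be sketched in a few lines. This is a hypothetical, highly simplified model (the trip thresholds and shutdown behavior are illustrative assumptions, not vendor logic) showing why a compromised controller that suppresses the emergency shutdown (ESD) command leaves an abnormal condition running:

```python
# Simplified sketch (hypothetical logic, not vendor code) of a compressor
# station emergency shutdown (ESD) check, and how a compromised controller
# that blocks the shutdown leaves an abnormal condition running.

LOW_PRESSURE_PSI = 550.0  # hypothetical trip threshold for suction pressure
HIGH_LEAK_PPM = 100.0     # hypothetical gas-detector trip threshold

def esd_should_trip(suction_psi: float, leak_ppm: float) -> bool:
    """Detect the abnormal conditions described above: an unanticipated
    pressure drop or natural gas leakage."""
    return suction_psi < LOW_PRESSURE_PSI or leak_ppm > HIGH_LEAK_PPM

def run_station(suction_psi: float, leak_ppm: float,
                controller_compromised: bool = False) -> str:
    """Return the station state after one scan cycle."""
    if esd_should_trip(suction_psi, leak_ppm) and not controller_compromised:
        return "tripped: compressors stopped, piping isolated and vented"
    return "running"

# Normal operation: the ESD stops the units on a pressure drop.
print(run_station(suction_psi=400.0, leak_ppm=0.0))
# Compromised controller: the same abnormal condition goes unanswered.
print(run_station(suction_psi=400.0, leak_ppm=0.0, controller_compromised=True))
```

The point of the sketch is not the thresholds but the single point of failure: the physical protection depends entirely on the integrity of the controller evaluating it.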
From Ralph Langner’s To Kill a Centrifuge,
“The Stuxnet cyberattack infects Siemens S7-417 controllers. Immediately after infection the payload takes over control completely. Legitimate control logic is executed only as long as malicious code permits it to do so; it gets completely de-coupled from electrical input and output signals. The attack code makes sure that when the attack is not activated, legitimate code has access to the signals; in fact, it is replicating a function of the controller’s operating system that would normally do this automatically but was disabled during infection. The input and output signals are passed from the electrical peripherals to the legitimate program logic and vice versa by attack code that has positioned itself “in the middle”. Things change after activation of the attack sequence, which is triggered by a combination of highly specific process conditions that are constantly monitored by the malicious code. Then, the manipulation of process values inside the controller occur. Process input signals (sensor values, in this case serial) are recorded. Those values are then replayed in a constant loop during the execution of the cyberattack and will ultimately show on SCADA screens in the control room, suggesting normal operation to operators and software-implemented alarm routines. During the attack sequence, legitimate code continues to execute but receives fake input values, and any output (actuator) manipulations of legitimate control logic no longer have any effect. When the actual malicious process manipulations begin, pressure rises continuously. The pressure sensors have a configurable setpoint that prompts for action when exceeded, namely to signal the valve to open until the measured process pressure falls below that threshold. The pressure sensors must have a data link to the Siemens S7-417 which enables the latter to manipulate the valves. Pressure (and all other process) sensors can drift requiring that their errors be corrected by calibration. 
The pressure controller can be told what the “real” pressure is for given analog signals and then automatically linearize the measurement to what would be the “real” pressure. If the linearization is overwritten by malicious code on the S7-417 controller, analog pressure readings will be “corrected” during the attack by the pressure controller, which then interprets all analog pressure readings as perfectly normal pressure no matter how high or low their analog values are. The pressure controller then acts accordingly. In the meantime, actual pressure keeps rising. Other sensors are compromised as well because they would have shown critical high and low pressure readings, automatically closing valves and triggering an alarm. The same tactic could be used for other valves and the additional pressure sensors as they use the same products and logic. For Natanz, the way to exploit the physical vulnerability is to overpressure the centrifuges or to manipulate rotor speeds, resulting in predictable damage. Since centrifuge operating pressure at Natanz is controlled by the Cascade Protection System and rotor speed by the Centrifuge Drive System, these two systems became prime candidates for compromise. Only then started the cyber part of the attack engineers’ work. If they are able to determine cyber manipulations which reliably exploit a physical vulnerability, they have arrived at what I call a plant-level vulnerability, for which Stuxnet gives the perfect example. Getting there requires looking at cyber and physical systems in the context of the plant and its physical processes; an approach waiting to be adopted in cyber defense”.
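The replay tactic Langner describes can be illustrated with a minimal sketch. This is an assumption-laden toy model, not Stuxnet code: attack code positioned between the real sensor and the legitimate logic/HMI records normal readings, then replays them in a loop while the actual process value diverges:

```python
# Minimal sketch (illustrative only, not Stuxnet code) of the recorded-value
# replay tactic: the man-in-the-middle records normal sensor readings, then,
# once activated, feeds the stale loop to operators and alarm routines while
# the real process value is discarded.

from itertools import cycle

class ReplayAttack:
    def __init__(self):
        self.recording = []
        self.playback = None

    def record(self, reading: float) -> float:
        """Pre-attack phase: pass real readings through while recording them."""
        self.recording.append(reading)
        return reading

    def activate(self) -> None:
        """Attack phase: switch the HMI feed to a constant loop of old data."""
        self.playback = cycle(self.recording)

    def hmi_value(self, real_reading: float) -> float:
        """What the operators and software alarm routines see."""
        if self.playback is None:
            return real_reading
        return next(self.playback)  # stale replayed data; real value discarded

attack = ReplayAttack()
for psi in (100.0, 101.0, 100.5):        # normal pressures, recorded
    attack.record(psi)
attack.activate()
for real_psi in (150.0, 200.0, 250.0):   # actual pressure keeps rising
    shown = attack.hmi_value(real_psi)   # control room sees only ~100 psi
```

This is why operator displays and alarms "suggesting normal operation" are not evidence of normal operation once the controller itself is compromised.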
Nuclear centrifuges may be unusual and exotic, but many engineers are familiar with “plant-level” vulnerabilities in natural gas compressor stations, petrochemical distillation columns, wastewater digester systems, power plant boiler control systems, etc. Relying on "security by obscurity" is not a good defense. The Stuxnet approach could be used against pipeline compressor stations and other critical infrastructures. Compressor stations and other critical infrastructures use the same, or similar, controllers and process sensors to accomplish tasks (e.g., monitoring and controlling pressure) similar to those manipulated by Stuxnet. In 2019, ethical hackers gained full control of the advanced Siemens Simatic S7 system: they analyzed and identified the code of the Siemens protocol, created a fake alternative engineering station, commanded the controller at will, turned the controller on and off, downloaded rogue command logic, and changed the operation and source code, all while the engineer operating the controller did not recognize their “hostile intervention.” Changing controller logic is not just a Siemens issue. Rockwell Automation’s controllers also had two Stuxnet-type threats, vulnerabilities CVE-2022-1161 and CVE-2022-1159. These vulnerabilities exposed Rockwell’s Logix Controllers and Logix Designer applications to attacks that can modify automation processes, allowing the attacker to damage systems without the user being aware.
As for the sensors, on December 29, 2021, Ankit Suthar published the article “Are your smart instruments secured?” https://www.linkedin.com/pulse/your-smart-instruments-secured-ankit-suthar/?trackingId=7r%2Bf25P7QXKo83zDDsPZkw%3D%3D. Ankit argues: “We have been doing the commissioning of more than 3,000 smart instruments which includes loop check, simulation, calibration, and datasheet verification, Asset Management System (AMS) configuration for each instrument. There were no passwords at all in most of the instruments, even by default. You simply plug in your HART communicator and change whatever you want.”
Additionally, presentations were made at the 2016 ICS Cyber Security Conference by the US Air Force Institute of Technology on hacking multiple vendors’ wired-HART transmitters, as well as by a Russian cyber security researcher from Moscow. Another US Air Force Institute of Technology presentation was made at the 2017 ICS Cyber Security Conference on hacking wireless-HART transmitters and digital valves.
And finally, the Aurora vulnerability can cause kinetic damage to Alternating Current (AC) rotating equipment, as demonstrated in the 2007 INL Aurora test that damaged a generator. Aurora can also kinetically damage gas compressor stations by attacking the AC induction motors. This was explicitly identified in one of the Aurora vulnerability slides declassified by DHS in 2015. Aurora kinetic damage to petrochemical facilities and water facilities was also identified in the declassified DHS slides.
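The physics behind Aurora, out-of-phase reclosing, can be sketched with a back-of-the-envelope calculation (the slip and timing numbers below are illustrative assumptions, not test data): while the breaker is open, the machine's electrical angle drifts away from the grid, and the transient stress on reclose grows roughly with the sine of half the phase mismatch, so even a fraction of a second out of synchronism can impose torques far beyond rated.

```python
# Back-of-the-envelope sketch (illustrative numbers only) of why
# out-of-phase reclosing -- the mechanism behind the Aurora vulnerability --
# stresses AC rotating machines.

import math

def angle_drift_deg(slip_hz: float, open_time_s: float) -> float:
    """Electrical-angle separation accumulated while disconnected, given a
    hypothetical frequency mismatch (slip) between machine and grid."""
    return (slip_hz * open_time_s * 360.0) % 360.0

def reclose_stress(delta_deg: float) -> float:
    """Relative transient stress on reclose, normalized so an in-phase
    reclose (delta = 0) is 0 and a worst-case 180-degree reclose is 1."""
    return abs(math.sin(math.radians(delta_deg) / 2.0))

# A 2 Hz slip for just 1/8 of a second already yields a 90-degree mismatch.
delta = angle_drift_deg(slip_hz=2.0, open_time_s=0.125)
print(round(delta), round(reclose_stress(delta), 2))  # prints: 90 0.71
```

The takeaway is that the attack needs no malware on the machine itself, only the ability to open and reclose a breaker faster than protective relays react.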
Yet, these types of kinetic events are not addressed by the TSA cyber security requirements. (These same devices are considered out-of-scope by the American Water Works Association, the American Petroleum Institute, and the NERC Critical Infrastructure Protection (CIP) cyber security requirements for electric utilities.)
Comparison of TSA requirements to actual cyber-related pipeline ruptures
The 1999 Olympic Pipeline gasoline pipeline rupture was very similar to the 2010 Pacific Gas & Electric (PG&E) San Bruno natural gas pipeline rupture in many ways. This demonstrates the lack of information sharing and the silos that still exist between infrastructure operators – natural gas vs. gasoline. Both incidents involved SCADA maloperation, both killed people, both led to the bankruptcies of their companies, and neither would have been identified under the TSA pipeline cyber security guidelines.
Comparison between the Olympic and San Bruno Pipeline ruptures
| Olympic Pipeline (gasoline) | PG&E San Bruno (natural gas) |
| --- | --- |
| Known previous SCADA problems | Known previous SCADA problems |
| SCADA and leak detection were on an Ethernet LAN | SCADA (leak detection status unclear) was on an Ethernet LAN |
| Previous construction (water line) impacted the structural integrity of the pipeline months prior to the accident | Previous construction (water line) impacted the structural integrity of the pipeline months prior to the accident |
| No SCADA cyber security training | No SCADA cyber security training |
| Numerous NIST SP 800-53 controls violations | Numerous NIST SP 800-53 controls violations |
| On the day of the incident, the SCADA system became inoperable (went from a 3-7 second scan rate to totally inoperable immediately prior to the pipeline failure) and was unable to remotely monitor or actuate control valves. Anomalies with sensing. | Just before the incident, PG&E was working on its uninterruptible power supply (UPS), resulting in a reduction in power to the SCADA system. Because of this anomaly, the electronic signal to the regulating valve for Line 132 (to San Bruno) was lost. The loss of the electrical signal resulted in the regulating valve moving from partially open to fully open, as designed. Anomalies with sensing. |
| Operator displays didn't indicate loss of SCADA functionality | Operator displays didn't indicate loss of SCADA functionality |
| Leak detection system did not function in a timely manner | Leak detection system did not function in a timely manner |
The following table lists the control system issues associated with these and other cyber-related pipeline ruptures that are not addressed by the TSA cyber security requirements.
| Actual Pipeline Rupture Issues | TSA Pipeline Cyber Security Requirements |
| --- | --- |
| Sensor compromise/malfunctions | Not included |
| Operator display deficiencies | Not included |
| Leak detection alarm issues | Not included |
| Addressing known previous SCADA problems | Not included |
| Lack of control system cyber forensic capabilities | Not included |
| Lack of SCADA simulator for training/forensics | Not included |
| Information sharing on control system details | Not included |
| Lack of alarms/alarm management | Not included |
| Electro-Magnetic Interference (EMI) protection | Not included |
Summary
Unintentional cyber-related accidents and malicious cyberattacks can both cause kinetic damage, yet there are no cyber forensics, training, or cyber security requirements for addressing these incidents. TSA (along with EPA, API, AWWA, FERC, NERC, and others) needs to require that domain experts be involved, that process sensor/equipment monitoring be included, and that appropriate training be provided.
Recommendations
The TSA Pipeline cyber security requirements (and corresponding requirements for other infrastructure sectors) need to be more control system-focused. That is, pipelines and critical pipeline control equipment such as compressors, process sensors, motors, actuators, and analyzers need to be explicitly included in the TSA cyber security requirements. Because many control system cyber incidents weren’t viewed as malicious cyberattacks, they have been largely ignored by the cyber security community, despite the latest TSA (and other infrastructure) requirements for reporting cyber incidents. There are also no requirements that engineers and technicians who are knowledgeable about pipeline (or other critical infrastructure) operation be included on the cyber security team. The same engineering-versus-networking gaps continue across all critical infrastructure sectors, even though it is the engineers who are familiar with “plant-level” vulnerabilities in compressor stations, distillation columns, digester systems, boiler control systems, etc. Because the same or similar equipment and control system devices are used across infrastructures, the Stuxnet approach can be used against pipeline compressor stations and other infrastructures. These staffing and training issues were not addressed at the April 4, 2022 CISA Cybersecurity Advisory Committee meeting (https://www.controlglobal.com/blogs/unfettered/comments-to-the-cisa-cybersecurity-advisory-committee-on-process-sensor-cyber-insecurity).
NATO offers a model for the US
Vytautas Butrimas, from the NATO Energy Security Centre of Excellence’s Research and Lessons Learned Division, prepared a guide at the request of the NATO Petroleum Committee (Guide for Protecting IACS Against Cyber Incidents in the NATO Pipeline System). This guide focuses on the control systems and has been distributed inside NATO. As a community, we should consider building upon Vytautas' formative guide.
Joe Weiss