Yesterday, Ernie Rakaczky, business development manager for control system security for Invensys Systems Canada Inc., began a roundtable discussion on control system security by saying, "We're all on a journey together. Control system security is not a product but a path. It is important that we all work together, on this journey, with the end user community playing an active role." The point of the journey, he went on, "is to provide a more secure cyber infrastructure."
On this journey, Rakaczky noted, "no effort is too little, and there is no one perfect way." Very often, he said, people are paralyzed by the huge job control system security has turned out to be, but "there is support," he said, "and no one is alone."
Echoing Invensys' cube-puzzle logo, Rakaczky posited a set of interlocking interests and functions for control system security, with the end user at the center of the puzzle, and academia, standards bodies and other non-governmental organizations, IT security suppliers, control system suppliers, and government agencies fitting together in a ring of protection around the end user.
As a control systems vendor, Rakaczky said, it is Invensys' job to try to maintain a balance between openness and functionality on the one hand, and security and protection on the other. Invensys is building network security into products, both for the installed base and for future development, as well as defining and practicing internal objectives within the company for security policies that apply not only to Invensys' own business practices but also to their security interactions with their customers.
Vendors, Rakaczky said, including Invensys, need to provide a set of security support services including assessment, design, implementation and management, and take a lifetime-focused view of how best to serve the end user community with security. But, he said, it is up to the end users to provide guidance on their needs and requirements, with strong interaction and collaboration with the vendor community.
Security technology leaders, Rakaczky said, need to understand what is unique about the controls environment and how best to provide evaluation and testing support for their products. Government and academia need to take a lifetime-focused position on their security awareness and contribution to the issue, and provide stronger collaboration in development cycles.
Larry Spoonemore, of the Southern Company, provided an end user viewpoint. One of his major issues is how best to apply systems management to legacy process control systems. "When we're disabling unused ports, services, and user accounts, how do we tell which ones to disable? How do we detect and prevent introduction of malicious code, and how do we recover?" he said. "After all, we can't always just re-boot when we do a backup and restore after an incident."
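Spoonemore's first question, which ports, services, and accounts are actually in use, can at least be informed by an inventory pass before anything is disabled. The minimal Python sketch below lists listening TCP ports and the processes behind them; the use of the third-party psutil library and the hand-review step are my own illustration, not anything Spoonemore described.

```python
# Inventory sketch: list listening TCP ports and the owning processes so that
# unused services can be reviewed before being disabled.
# Assumes the third-party psutil package; run with administrative rights so
# that process names resolve for system services.
import psutil

def listening_tcp_ports():
    """Yield (port, pid, process_name) for every listening TCP socket."""
    for conn in psutil.net_connections(kind="tcp"):
        if conn.status != psutil.CONN_LISTEN:
            continue
        name = "<unknown>"
        if conn.pid is not None:
            try:
                name = psutil.Process(conn.pid).name()
            except psutil.NoSuchProcess:
                name = "<exited>"
        yield conn.laddr.port, conn.pid, name

if __name__ == "__main__":
    # Print the inventory; nothing here disables anything by itself.
    for port, pid, name in sorted(listening_tcp_ports(), key=lambda t: t[0]):
        print(f"TCP {port:<6} pid={pid} process={name}")
```

The point of keeping the script read-only is Spoonemore's own: the decision about what to disable belongs to someone who knows the process, not to a tool.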
Spoonemore gave a further litany. "We need our control systems and networking vendors to test and approve antivirus protection and security patches, and then we need to do our own testing before we apply them, for safety's sake." He said that design change management, often ignored now, is a future must-do requirement. "New or updated systems need to be secured," he went on. Security requirements and policies have to be developed, maintained, and tested, and any changes detected and dealt with according to policy.
"While the technology is not the solution," Spoonemore said, "we've come to believe that technology tool sets will help us to the solution." We need to develop compliance tools, he went on, that enable us to find solutions for asset, document, change management, virus and patch, vulnerability and training issues. These tools, he believes, need to provide a compliance posture and analyis, track remediation, facilitate assessments, and reduce costs.
"Assessments need to be ongoing, not a snapshot in time," Spoonemore concluded.
Mark Townshend, of Enterasys, and Marty Edwards, of Idaho National Laboratory and the Department of Homeland Security, concurred and continued the discussion about security policies. "It's not about letting every packet on the wire," Townshend said, "it's about letting the right packet on the wire."
"We must audit for change control," Townshend said. "Change must be centralized, and a 'changes console' is required for management of configuration. Change control is part of the process," he went on.
"We need to provide for survivability in the event of problems," Townshend pointed out. "Networks must be intelligent and self healing. Failures may not be caused by a security event, but they can affect the security of the system anyway."
"We need to inventory in realtime," he noted, "the flow rates of data just like we record and manage fluid and gas flows in the process. A sudden flow rate change is a pretty good indicator that, even though you may not have any telltales, you have been penetrated by a virus, a worm or a trojan of some sort."
Enterasys and Invensys, Townshend said, are working as strategic partners for Mesh networking, to provide an always available network, policy-enabled networking, machine authentication, and secure network services. These include Invensys' Performance Services consulting and Enterasys' security response center for the modeling of new vulnerabilities and threats to Invensys users.
Marty Edwards continued, "We need to move from a culture of reliability to a culture of security; from improving awareness to increasing implementation; and from risk identification to risk reduction." Systems were designed for lifecycles in excess of 15 years, favoring reliability over security, Edwards said, and "the result is inherent vulnerabilities."
Speaking with his government-agency hat on (although Edwards was himself a control systems user until joining Idaho National Laboratory in September, and is in fact still the Chair of the Emerson Exchange User Group Board of Directors), Edwards talked about INL's control systems security center, a full-scale SCADA/DCS test center at least as capable as Eric Byres' old lab at BCIT, and about the fact that INL provides second-tier support to the (now Department of Homeland Security) CERT team at Carnegie-Mellon. "We HAVE a disclosure mechanism so that you can report events and the outcome, in a secure way."
Summing up, Edwards said, "Ultimately it is up to the end user. We can all assist, but control system security requires a cultural change and a willingness to be responsible."
Then Edwards gave me my first of two huge nightmares.
Process control networks that were once isolated are now connected not only to corporate networks, he said, but also to vendors and contractors. Products that have internal intelligence (like every HART and fieldbus transmitter) have overt or covert backdoor entry points to permit and facilitate legitimate maintenance and diagnostic access. These intrusion points are often largely unprotected. In addition, he said, "Generally we find that vendor default accounts and passwords are not removed, and the guest accounts provided for startups are still available. SCADA use of enterprise services such as DNS is widespread, and there is no security level agreement with peer sites or vendors."
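The leftover-default-account finding, at least, is easy to check for. Here is a minimal sketch, assuming a Unix-style node where accounts can be read from /etc/passwd; the watch list of account names is a generic illustration of my own, not a vetted vendor list.

```python
# Default-account check sketch: compare local account names against a list of
# well-known vendor default and guest account names.
from pathlib import Path

# Hypothetical watch list; a real one would come from vendor hardening guides.
SUSPECT_ACCOUNTS = {"guest", "admin", "administrator", "operator", "startup", "vendor"}

def local_accounts(passwd_file="/etc/passwd"):
    """Return the set of account names from a passwd-style file."""
    accounts = set()
    for line in Path(passwd_file).read_text().splitlines():
        if line and not line.startswith("#"):
            accounts.add(line.split(":", 1)[0].lower())
    return accounts

if __name__ == "__main__":
    leftovers = local_accounts() & SUSPECT_ACCOUNTS
    if leftovers:
        print("Review and disable or rename:", ", ".join(sorted(leftovers)))
    else:
        print("No well-known default or guest accounts found.")
```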
Edwards' litany produced the rather scary scenario that someone could hack into the vendor company at a remote location (say, somebody's office in China or Kazakhstan, for example) and, using the vendor's own network and VPNs, gain access through one or more maintenance back doors to a refinery or other process plant.
Clayton Coleman, solutions architect for Invensys' enterprise networks and security, pointed out that Invensys and most of the other major vendors have implemented "user call us" maintenance and diagnostics policies that mitigate or even eliminate the potential for dial-up or VPN penetration of users' systems during event-based outsourced work.
But what, I asked Invensys' Don Clark, who happened to be sitting next to me, does that do for your plans to sell all manner of remote online realtime services like asset management, optimization, alarm management, and stuff like that? Clayton's response was that nobody is doing that stuff yet, and he expected that by the time people actually get there, the security holes will be plugged.
If that nightmare wasn't bad enough, later in the day, I talked to Bob Adamski and Luis Duran of Triconex, who gave me my second heartburn nightmare.
Adamski told me that he'd just disabused the Department of Homeland Security of its feel-good notion that Safety Instrumented Systems aren't connected to the DCS in most plants. He estimates that over 80% of them are, not to mention the new integrated SIS systems that share a common backplane with the basic process control system and are, by design, ALL connected to the DCS.
So, his worst nightmare scenario is this: the blackhat first uses the control system vendors' access to disable the SIS, and then uses the DCS to blow up the plant, which the now-disabled SIS would have prevented.
I admire Clayton Coleman's fortitude and his optimism in believing as he does that we'll be able to institute secure enough policies that even realtime online remote communications won't permit this level of unauthorized intrusion, but I am a cynical and jaded automation professional, and I also know that the first rule of cybersecurity protection is, "what one coder can protect, another coder can break."
This is a real issue, and it isn't just or only an Invensys issue. Other automation suppliers may be even further down the path of providing outsourced realtime services to the process industries. The fact that Invensys and the other suppliers recognize this as an issue, and are willing to talk to their customers about it, is not as reassuring as perhaps it might be.
Welcome to my nightmare.