Just because you’ve started on a digitalized, virtualized, cloud-computing journey doesn’t mean you’re enjoying the ride so far.
Most of the participants in a roundtable this week at Honeywell Users Group 2023 reported that their companies have adopted some type of virtualized and/or cloud-based computing for their process-industry applications. The session, “Virtualization and the cloud—running control systems today,” was moderated by Paul Hodge, marketing manager for Experion PKS at Honeywell Process Solutions (HPS), with panelists Robert Cox, operations technology manager at Georgia-Pacific, and Rick Stopf, operations manager for Experion PKS at HPS.
“Some level of virtualization is already in the majority of process-industry deployments, and the day is coming when all projects will have a virtual layer,” said Hodge. “There’s a large amount of churn in today’s equipment, systems and process operations, and virtualization can provide some stability. For instance, our Experion Highly Integrated Virtual Environment (HIVE) decouples software from hardware to give users greater mobility, while our cloud offerings are mostly SCADA applications. Likewise, Experion IT HIVE is a private, data-centric cloud service, while its second release will be a more public cloud.”
Basic benefits and drawbacks
In general, virtualization means shifting monitoring or other non-critical tasks from hardware devices running dedicated software to virtual machines or other functions running entirely on servers, either onsite or elsewhere. The cloud, by contrast, typically means hiring third-party providers, such as Amazon Web Services (AWS) or Microsoft Azure, to handle data processing on their remote servers via the Internet.
Cloud computing, virtualization or a combination of the two can reportedly deliver significant savings compared to running and maintaining one’s own computers and databases onsite, as well as reduce total cost of ownership (TCO) and improve reliability and cybersecurity. However, many of the roundtable’s participants voiced reservations about linking their computing and communications too closely with offsite servers, citing the risk of interruptions in crucial data processing and the increased vulnerability to cyber-attacks that added network connections can bring.
Georgia-Pacific’s experience
Cox reported that Georgia-Pacific’s virtualization is presently all on-premises, but it’s looking at the cloud to handle some of its computing stack. “It’s a long journey, and different facilities can be at very different points along it,” explained Cox. “A couple of our facilities are very virtualized, and a couple are just dipping a toe in. The U.S. Dept. of Homeland Security’s (DHS) Cybersecurity and Infrastructure Security Agency (CISA) defines specific rules for the nation’s 16 critical infrastructure sectors, so none of our 50 facilities are planning on going with all-virtualized controls any time soon.”
Cox added that one of the most important aspects of adopting virtualization and cloud computing securely, effectively and safely is supporting these new technologies with well-trained personnel. “One of our mills is 100% virtualized in the OT space with virtual servers, except for one high-speed, data-collection application,” said Cox. “The other mills needed to build up their virtualization technologies and the required skills first.”
Combine and hybridize as needed
Hodge explained that virtualization and the cloud aren’t an either/or proposition, and that potential users can deploy as much or as little as they need to support their existing hardware and hardwired equipment and systems.
“We’re not pushing virtualization and the cloud only, especially for distributed control systems. Any control of this nature will be hybrid by definition, for example, taking into account sensors and devices that record alarms,” said Hodge. “Virtualization has three core characteristics: consolidation, mobility and abstraction. Beyond the benefit of consolidating 100 nodes down to 10, virtualization can make it much easier for users to refresh and update their technology 10 or 15 years in the future. This simplification should also help them with business case justification in the present.”
Cox concurred that, because virtualization increases flexibility by decoupling software from hardware, users no longer have to worry that a physical workstation could be a single point of failure. “Virtualization also gives maintenance people more flexibility in responding to failures because they no longer have to rely on one device,” said Cox. “And, even if an incident occurs, it’s much easier to restore virtual systems than hardware. Virtual applications can be restored in a day, while hardware can take a week or longer.”
How to get involved—gradually
Cox and Hodge added that interested end users can learn about, practice, get comfortable with, and begin to use virtualization and the cloud at their own speed. “Users don’t need to go all out at once,” explained Cox. “They can start with small experiments and pilots, such as implementing a virtual, industrial demilitarized zone (DMZ) in their network for Level 3 devices, such as historians, SCADA servers or Microsoft Windows domain controllers. Getting familiar with that initial deployment will teach them to manage and trust their virtual environment. Eventually, they can try to add some Level 2 devices, such as PLCs, DCS functions and other critical operations, to their virtual environment.”
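The staged approach Cox describes follows the familiar Purdue-model idea that an industrial DMZ sits between the control layers (Levels 0–3) and the enterprise network (Levels 4–5), and that traffic crossing that boundary should terminate in the DMZ rather than pass straight through. As a toy illustration only (not any vendor’s or Georgia-Pacific’s actual configuration; the host names and levels here are hypothetical), such a segmentation policy might be sketched as:

```python
# Hypothetical sketch of a Purdue-model segmentation policy.
# Hosts at Level 3 or below are on the OT side; Level 4 and above are
# enterprise IT. DMZ hosts (sometimes called "Level 3.5") may relay
# traffic to either side, but no flow may cross the OT/IT boundary
# directly. All names and level assignments are illustrative.

HOSTS = {
    "plc": 2,            # Level 2 device, per the article's example
    "historian": 3,      # Level 3: historians, SCADA, domain controllers
    "dmz_relay": "dmz",  # virtual industrial DMZ host
    "erp_server": 4,     # enterprise IT application
}

def flow_allowed(src: str, dst: str) -> bool:
    """Return True if direct traffic between src and dst is permitted."""
    a, b = HOSTS[src], HOSTS[dst]
    if a == "dmz" or b == "dmz":
        return True  # DMZ relays may talk to either side of the boundary
    # Otherwise both endpoints must sit on the same side of the boundary
    return (a <= 3) == (b <= 3)
```

Under this rule, a PLC may talk to a historian (both on the OT side) and the enterprise server may talk to the DMZ relay, but direct enterprise-to-historian traffic is refused and must be brokered through the DMZ.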