Don’t hesitate to jump into virtualization

System integrator Hargrove shows how to approach virtualized tools—and prepare for AI
Feb. 16, 2026
5 min read

Key Highlights

  • Historically, industries resist change due to concerns over reliability, effort and unfamiliarity, but new inventions eventually become standard.
  • The shift from hardware to software, cloud and virtualization is accelerating, with virtualization now extending to control system controllers.
  • AI and machine learning are primarily used internally for process optimization, with agentic AI showing promise for autonomous decision-making in the future.

Silly humans, we don’t like change, even when it’s beneficial and necessary. There’s a chance new inventions might not work as well as old practices, or some added effort might be needed to get them up and running. Mostly, though, we just don’t want to get off our rear ends.

When the spear showed up, there had to be a bunch of guys with clubs saying it wasn’t ruggedized enough. And, the same grunting chorus has greeted every wheel, loom, waterwheel, blast furnace, motor, pneumatic device, relay, transistor, semiconductor, microprocessor and software package when they arrived, and continued until they took over. Remember when wireless and Ethernet networking emerged, and many questioned whether they belonged on plant floors or in the field? And where are they now? Everywhere, of course, and the same is happening as many similar functions transition from hardware to software.

“Change comes slowly to the world of process control. We aren't yet seeing much of a shift in control technologies to the cloud, but we are seeing more process monitoring, historization and analysis slowly moving to the cloud,” says Heath Stephens, PE, automation solutions director at Hargrove Controls & Automation in Mobile, Ala., a division of Hargrove Engineers & Constructors, and a certified member of the Control System Integrators Association (CSIA). “And we’re seeing more software-based control technologies. In fact, we see much less resistance, or even questions, about whether a technology is software- or hardware-based these days. This is partly due to the improved robustness of Windows and Linux platforms and embedded PC hardware.”

Stephens reports that Hargrove has been virtualizing control system servers and operator stations for years, condensing server hardware, and using thin client workstations. “Now virtualization is coming to system controllers. Virtual controllers that used to be used for testing and development only are now rolling out to the production floor,” adds Stephens. “Emerson has recently unveiled its new virtual controller for its DeltaV DCS, and I think other vendors will be offering this type of product in the future.”

Go digitalize your AI self

End-users are routinely encouraged to explore and practice with new technologies in small, non-critical experiments before scaling up and deploying them more widely. This advice goes double for system virtualization, AI and other software-based solutions.

“We’re eager to deploy containerized applications for our clients, but have so far only done this internally. While offering several advantages, this technology leans more into an IT skillset than some of our operations clients are comfortable with at this time,” says Hargrove’s Stephens. “We’ve used JSON for web applications in process historian and MES systems. RESTful APIs have also been a very useful data interfacing technology. While there are many other useful technologies, we still use OPC, EtherNet/IP and Modbus TCP for most of our data transfer applications.”
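To make the JSON-plus-RESTful-API pattern Stephens describes concrete, here is a minimal sketch of parsing the kind of payload a historian's web interface might return. The tag name, fields and values are hypothetical illustrations, not any specific vendor's schema.

```python
import json

# Hypothetical JSON payload, shaped like what a process historian's
# RESTful interface might return over HTTP. The tag "FIC-101.PV" and
# the field names are illustrative only.
payload = """
{
  "tag": "FIC-101.PV",
  "units": "gpm",
  "samples": [
    {"timestamp": "2026-02-16T08:00:00Z", "value": 42.7, "quality": "good"},
    {"timestamp": "2026-02-16T08:01:00Z", "value": 43.1, "quality": "good"},
    {"timestamp": "2026-02-16T08:02:00Z", "value": null, "quality": "bad"}
  ]
}
"""

data = json.loads(payload)

# Filter out bad-quality samples before trending or averaging.
good = [s["value"] for s in data["samples"] if s["quality"] == "good"]
average = sum(good) / len(good)

print(f"{data['tag']}: {average:.1f} {data['units']} over {len(good)} good samples")
```

In a real deployment the payload would come from an HTTP GET against the historian's endpoint rather than a string literal, but the parsing and quality-filtering steps are the same.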

Likewise, Hargrove is using AI mostly to streamline its internal processes. It’s employed several AI/machine learning (ML) applications for improved process control and predictive reliability for clients, but most of these technologies are closer to simple ML tools, rather than advanced AI agents.

“The reasons for virtualizing systems are to reduce hardware investments and ongoing maintenance of multiple physical PCs. Virtualization has become the norm for any medium to large control system. The benefits are well proven through years of industry deployments,” adds Stephens. “Agentic AI is a much newer technology and extremely promising for fully autonomous AI decision-making and action. To date, few of these applications have advanced much past the pilot stage, and their real-world benefits are still being evaluated. I expect a lot more development in the next few years.”

Stephens reports a massive mental shift is required to move to an AI-powered and digitalized production world. Without it, new AI technologies will only gradually improve productivity and reliability. “A true shift will require re-envisioning our work processes and how we utilize our human workforce alongside new technology. We also have to accept that moving from a binary, pre-programmed world to one that’s self-organizing and intelligent comes with challenges and uncertainties. When we ask people to make predictions, even skilled and experienced people, we understand that sometimes they’ll be wrong. We will need to accept the same from our AI predictions. An AI prediction tool will be valuable if it makes profitable decisions on the whole, not based on perfection.”
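Stephens' closing point, that a prediction tool is valuable if it makes profitable decisions on the whole rather than being perfect, is simple expected-value arithmetic. The numbers below are made up purely to illustrate it.

```python
# Illustrative expected-value arithmetic for an imperfect AI prediction tool.
# All figures are hypothetical, chosen only to make the point concrete:
# the tool need not be right every time, just profitable on average.

hit_rate = 0.80          # fraction of predictions that turn out correct
gain_per_hit = 5000.0    # e.g., downtime avoided when a failure is caught ($)
cost_per_miss = 8000.0   # e.g., cost incurred when a prediction is wrong ($)

expected_value = hit_rate * gain_per_hit - (1 - hit_rate) * cost_per_miss
print(f"Expected value per prediction: ${expected_value:,.0f}")
# Positive on average, even though one prediction in five is wrong.
```

The same framing applies to a skilled human forecaster: what matters is the average payoff across many calls, not a perfect record.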

Edging up to virtualization’s deep end

“It can be a challenge for busy operations personnel to learn all the new tools and technologies popping up out there,” says Hargrove’s Stephens. “First, vendors are always eager to offer product demos and talks about their offerings. This can be a good place to learn some basics. However, it can also quickly become confusing with so many vendors offering different and competing products. Talking to personnel in sister facilities is also a great way to not only learn about technologies, but also learn about real-world experiences (good and bad). Finally, seek help from system integrators that have experience implementing these technologies in various industries. Tech that’s considered cutting-edge in one industry may be standard practice in another, and system integrators are often experienced with multiple vendors and industries.

“Make sure all affected parties are part of the project's scoping process. This includes not just users and beneficiaries, but all the people who feed information, help with system maintenance, or are somehow impacted by the new technology. Also, make sure the project's financial benefit is well defined, and put metrics in place to measure it. Many projects are about cost avoidance, which can be much harder to track than projects that add production capacity, etc.”

About the Author

Jim Montague

Executive Editor

Jim Montague is executive editor of Control. 
