
Intelligence delivers know-how

Sept. 19, 2024
Hargrove acquires data from advanced smart devices

Once users realize the amount and variety of data today’s applications can provide, it only fuels their appetite for more, even if they need to organize and prioritize it later.

“The biggest shift in optimization and reliability in the process industries is that instruments and other devices are pulling and pushing far more data than ever before. More is expected by users and clients from their suppliers and system integrators because devices from flowmeters to weigh scales are now Ethernet-ready with built-in microprocessors,” says Heath Stephens, PE, digitalization leader at Hargrove Controls & Automation in Mobile, Ala., a division of Hargrove Engineers & Constructors and a certified member of the Control System Integrators Association (CSIA). “Regardless of the variety of instruments and non-traditional sensors—a vibration or power meter, a stick-on temperature sensor, or some other clamp-on instrument—control engineers are expected to know how to pull information from them.

“We need to know more details about different types of devices, including how they’re networked and what communication protocols they use. For instance, one client uses electronic data recorders to monitor a handful of tags in a single-loop control setup. These recorders were previously panel-mounted with a couple of wires, but now they’ve got Ethernet ports and register/parameter tables, which let users pull data and amalgamate it with the rest of their information.”
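Register/parameter tables like the ones Stephens describes are typically read over an Ethernet protocol such as Modbus TCP. As a minimal sketch of what that looks like on the wire (the register addresses and unit ID below are illustrative assumptions, not values from any specific recorder), the request and response frames can be built and parsed with a few lines of Python:

```python
import struct

def build_read_request(transaction_id: int, unit_id: int,
                       start_register: int, count: int) -> bytes:
    """Build a Modbus TCP 'read holding registers' (function 3) request."""
    pdu = struct.pack(">BHH", 3, start_register, count)
    # MBAP header: transaction id, protocol id (0), remaining length, unit id
    mbap = struct.pack(">HHHB", transaction_id, 0, len(pdu) + 1, unit_id)
    return mbap + pdu

def parse_read_response(frame: bytes) -> list[int]:
    """Extract 16-bit register values from a function-3 response frame."""
    function, byte_count = struct.unpack(">BB", frame[7:9])
    if function != 3:
        raise ValueError(f"unexpected function code {function}")
    return list(struct.unpack(f">{byte_count // 2}H", frame[9:9 + byte_count]))
```

In practice these frames would be sent over a TCP socket to port 502 on the recorder, and the raw register values scaled per the device's parameter table.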

Smart tide rises

Stephens reports that this rising flood of data stems from recently added microprocessors and software, which make devices cheaper, smarter and more capable. “Suppliers put chips in devices for better onboard diagnostics, so they get free data and improved controls,” says Stephens. “Lots of flow data is available via HART and other protocols, but flowmeters also deliver temperature, density, mass flow and device health information.”

Stephens adds it’s also getting easier to acquire data from increasingly advanced smart devices. Previously, many types of I/O weren’t compatible with HART, so users could usually only get data via 4-20 mA, or they’d need to use a HART-compatible device or add a HART multiplexer to the junction box of their asset or system. Over the past 10-15 years, HART and similar fieldbus data became less isolated as more I/O cards, handhelds, and other devices were upgraded to be compatible. Wireless, Ethernet, and other communication methods also make rich data more available.

“In the past, these devices monitored control functions but didn’t necessarily perform control functions. Now, these devices communicate, perform field-level control, and act as data hubs,” explains Stephens. “For instance, a power-monitoring unit for a motor control center (MCC) sat on the panel door. Process engineers didn’t think of these devices as a means for data collection, and left them to maintenance personnel, especially in large plants. However, as variable-frequency drives (VFD) proliferated in recent years, they challenged what users thought of as traditional controls.”

Roll with the (network) changes

Stephens reports that digitalization’s shift from hardware to software is reflected in the designs, specifications and estimates for Hargrove’s many projects and how they unfold. “The I/O counts change as we move from classic, wired I/O via 4-20 mA or 120 V to software-centric I/O via Modbus TCP/IP, Profinet, OPC UA and other Ethernet-based protocols,” says Stephens. “In the past, we’d see 80% hard I/O and 20% soft I/O, and now we’re often seeing about 50% of each. Years ago, we’d bring soft I/O points to distributed control systems (DCS) via different serial or bus cards that could talk with Modbus, Profibus or Foundation Fieldbus. In the 1990s, these tasks were done by add-on or even third-party modules, which later started delivering data via regular Ethernet TCP/IP and versions like EtherNet/IP, Profinet and others.”

To acquire data in a more native way, Stephens adds that Emerson developed a built-in Ethernet I/O card (EIOC), which it launched in the mid-2010s as part of its S-Series platform. This was followed a few years later with the release of Emerson’s PK controller with onboard Ethernet connectivity for third-party data, such as skid units, IIoT devices or HVAC systems. Other systems vendors are increasingly supporting native Ethernet communications for third-party systems.

However, even though collecting information has become easier, Stephens adds that its usefulness still has to be ensured by using accurate sensors and instruments that reflect actual process conditions and environments, and by accounting for how sensor accuracy decays over time due to clogs, corrosion and other physical factors. “Data scientists sometimes forget that information comes from operations and equipment that are subject to change,” says Stephens. “They tend to think that all data types and sources stay the same, so they don’t suspect when conditions may have changed. Just because some results are repeatable doesn’t necessarily make them right in the first place. If your information says the sky is green, it’s time to question that data and the devices that produced it.”
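Stephens’ “green sky” test can be automated as simple plausibility checks that run before data reaches a historian or analytics layer. The sketch below, with hypothetical tag names and limits, flags readings outside physical bounds and signals that appear frozen (a common symptom of a clogged or failed sensor):

```python
# Plausibility limits per tag -- illustrative values, not from the article.
LIMITS = {
    "reactor_temp_degC": (10.0, 250.0),
    "feed_flow_kg_h": (0.0, 5000.0),
}

def flag_implausible(tag: str, value: float) -> bool:
    """Return True when a reading falls outside its physical limits."""
    low, high = LIMITS[tag]
    return not (low <= value <= high)

def looks_frozen(readings: list[float], window: int = 20) -> bool:
    """A signal repeating the exact same value may indicate a stuck sensor."""
    recent = readings[-window:]
    return len(recent) >= window and len(set(recent)) == 1
```

Checks like these don’t prove a sensor is right, but they catch the obviously wrong before repeatable-but-bad data hardens into “truth.”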

Though few are contemplating hybrid (onsite and offsite) cloud control loops due to reliability and inherent latency issues, Stephens reports that many end-users are getting more comfortable with using offsite cloud-computing services for data archiving, historizing and analysis. Their findings can be routed back through data hubs running parallel to the control system, and their recommendations and instructions can indirectly adjust the controls to help optimize process operations.

AI in on the act

To sort through all this information, Hargrove employs a unified namespace (UNS) that grants access to network participants, and lets it present content to operators, engineers, maintenance and managers on dashboards tailored for each group. Stephens adds that Hargrove is also using artificial intelligence (AI) tools to help clients improve quality and reliability through offline analytics and real-time monitoring and alerts.
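A unified namespace is commonly implemented as a hierarchical topic tree (often carried over MQTT), and role-tailored dashboards amount to each audience subscribing to its own slice of that tree. The sketch below uses hypothetical ISA-95-style paths and patterns, which are assumptions for illustration rather than Hargrove’s actual namespace:

```python
from fnmatch import fnmatch

# Hypothetical namespace paths: site/area/line/device/metric
TAGS = [
    "site/area1/line3/flowmeter7/mass_flow",
    "site/area1/line3/flowmeter7/device_health",
    "site/area1/line3/vfd2/motor_current",
    "site/area2/utilities/mcc1/power_kw",
]

# Each audience subscribes to the slice of the namespace it cares about.
DASHBOARDS = {
    "operators": "site/area1/*/*/mass_flow",
    "maintenance": "site/*/*/*/device_health",
}

def select_tags(role: str) -> list[str]:
    """Return namespace paths matching the role's subscription pattern."""
    pattern = DASHBOARDS[role]
    return [tag for tag in TAGS if fnmatch(tag, pattern)]
```

The same structure lets new consumers (a historian, an AI pipeline, a manager’s KPI view) tap the namespace without new point-to-point integrations.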

“We work with several types of AI, but it all boils down to information on a platform that gets analyzed by an algorithm. AI tends to take everything as gospel, since it often doesn’t have a valid way to tell if that data is legitimate or if it should alert staff that it isn’t, so cleaning and pre-processing data is important,” adds Stephens. “AI is still evolving, but it’s beginning to allow us to look at much greater volumes of data, and find the relationships, efficiencies and value in it. For example, AI can help with multi-variable analyses, and generate a better fingerprint of what’s going on in a process. In addition, as microprocessor costs decrease, AI capabilities will increase, so more intelligence will be coming, including devices with self-analytics. It will make data collection and analysis easier and more powerful.”
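The cleaning and pre-processing Stephens calls out often starts with two steps: dropping physically impossible samples, then smoothing transient spikes before any algorithm sees the data. A minimal sketch, with assumed limits and window size:

```python
from statistics import median

def clean_series(values: list[float], low: float, high: float,
                 window: int = 3) -> list[float]:
    """Drop out-of-range samples, then smooth spikes with a rolling median."""
    in_range = [v for v in values if low <= v <= high]
    return [median(in_range[max(0, i - window + 1): i + 1])
            for i in range(len(in_range))]
```

For example, a 999.0 glitch in a 0-100 °C temperature series is discarded outright, and the rolling median damps remaining single-sample spikes so the algorithm isn’t taking every raw point “as gospel.”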

About the Author

Jim Montague | Executive Editor

Jim Montague is executive editor of Control. 
