Balancing data access and security allows smooth IIoT sailing
June 28, 2024
Because IIoT has evolved beyond its early network and Internet connections to embrace a host of recently digitalized tools and software, it can provide flexible solutions to a much wider range of users and processes.
“IIoT deployments vary by industry. For instance, manufacturers with brownfield plants and processes often employ IIoT as a stopgap against obsolescence. They use it to get real-time data from old sensors and equipment, or from applications and utilities where process controls don’t exist,” says Bruce Slusser, digital transformation practice director at Actemium-Avanceon, a system integrator in Exton, Pa., that’s also a founding and certified member of the Control System Integrators Association (CSIA). “Likewise, some users may have a black-box item they can’t get on their network or integrated with their controls, and adding an IIoT sensor with Power over Ethernet (PoE) lets them tie into an edge gateway with MQTT publish-subscribe protocols. Suddenly, they know if that box is running, blocked or whatever. They’ve got the whole story of that asset, and they can add it to the overall equipment effectiveness (OEE) picture of their operations.”
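To make that publish-subscribe step concrete, here is a minimal sketch in Python using the open-source paho-mqtt client, in which a sensor or edge node publishes a run-status message to a broker. The broker address, topic and payload fields are hypothetical, not taken from Actemium-Avanceon's projects.

import json
import time
import paho.mqtt.client as mqtt

# Connect to a hypothetical MQTT broker running on the edge gateway
client = mqtt.Client()  # paho-mqtt 1.x-style constructor; 2.x also takes a CallbackAPIVersion
client.connect("edge-gateway.local", 1883)
client.loop_start()

# Publish the asset's run status every five seconds
while True:
    payload = {"asset": "filler-3", "state": "running", "ts": time.time()}
    client.publish("plant1/packaging/filler-3/status", json.dumps(payload), qos=1)
    time.sleep(5)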
Gaining access to devices and data that used to be unreachable is excellent news, but Slusser cautions that the added connections that are IIoT’s greatest strength also pave the way for the cyber-intrusions that are simultaneously its greatest weakness.
“Without the Internet, the outside can’t touch the inside. So, just as local Internet networks are typically segregated by virtual local area networks (VLANs), we can design a routing strategy to talk with edge gateways, secure them with active directories that govern who can sign in, and then pass production data to them,” explains Slusser. “This typically includes two networks that are physically separated, but use managed Ethernet switches to establish a demilitarized zone (DMZ) and communicate through it. Users can also employ port configurations on local switches that let them communicate on VLANs, while some VLANs need to remain isolated with no Internet access.”
Greasing the data wheels
Once access and security are balanced, IIoT can add value by streamlining formerly cumbersome tasks, such as centerlining, which is monitoring, optimizing and maintaining process setpoints. “Each shift used to centerline manually by walking to check that gauges were at the right setpoints, and that devices were properly configured for their process,” adds Slusser. “Now, IIoT can do more of these jobs by strapping an inline sensor onto a motor, and using its Ethernet port to publish data via MQTT for its subscribers.”
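The subscriber side of that centerlining check could look something like the illustrative Python sketch below, which uses paho-mqtt to flag readings that fall outside a setpoint band. The topic name and limits are hypothetical, not the integrator's actual code.

import json
import paho.mqtt.client as mqtt

SETPOINT_BAND = {"low": 118.0, "high": 122.0}  # hypothetical centerline limits

def on_message(client, userdata, msg):
    # Parse the published reading and compare it to the centerline band
    reading = json.loads(msg.payload)
    value = reading["value"]
    if not SETPOINT_BAND["low"] <= value <= SETPOINT_BAND["high"]:
        print(f"Off-centerline reading on {msg.topic}: {value}")

client = mqtt.Client()  # paho-mqtt 1.x-style constructor
client.on_message = on_message
client.connect("edge-gateway.local", 1883)
client.subscribe("plant1/mixing/motor-7/temperature")
client.loop_forever()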
Because there are so many legacy components and protocols still running in brownfield applications and facilities, Actemium-Avanceon also uses software agent programs to extract data from edge devices. These agents use application programming interface (API) requests to generate responses, and parse the resulting content payloads as JavaScript Object Notation (JSON), which uses readable text to get, put and/or push data blocks.
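As a rough sketch of what such an agent might do, the Python snippet below polls a legacy device's REST endpoint, parses the JSON payload, and pushes the data block to a pipeline entry point. The URLs, field names and source tag are assumptions for illustration only.

import requests

DEVICE_URL = "http://192.168.10.40/api/values"       # hypothetical legacy-device endpoint
PIPELINE_URL = "https://dataops.example.com/ingest"  # hypothetical DataOps entry point

def poll_and_forward():
    # Get the device's current readings as a JSON payload
    response = requests.get(DEVICE_URL, timeout=5)
    response.raise_for_status()
    readings = response.json()

    # Push the parsed data block to the pipeline entry point
    requests.post(PIPELINE_URL, json={"source": "line-2-extruder", "data": readings}, timeout=5)

if __name__ == "__main__":
    poll_and_forward()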
“Extractors let users create entry points and pipelines to a cloud-based data operations platform,” adds Slusser. “We still have the usual I/O, PLCs and DCSs, but they’re also migrating from traditional four-wire sensors to I/O input cards that can scale and range, and provide data to PLCs, HMIs, SCADA systems and historians. IIoT can bypass all of this with a single Ethernet path that doesn’t need ladder logic programming or scaling. It can send device heartbeat and health data directly from field devices to edge gateways, and use token authentication to interact with the cloud.”
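The heartbeat and token-authentication step Slusser mentions could look something like the following sketch, which posts a gateway health message to a cloud endpoint with a bearer token. The endpoint, token handling and payload are assumptions, not a specific vendor's API.

import time
import requests

CLOUD_URL = "https://dataops.example.com/api/heartbeat"  # hypothetical cloud endpoint
TOKEN = "REPLACE_WITH_GATEWAY_TOKEN"                     # hypothetical access token

headers = {"Authorization": f"Bearer {TOKEN}"}

# Send one heartbeat per minute with basic health data
while True:
    payload = {"device": "gateway-07", "status": "healthy", "ts": time.time()}
    requests.post(CLOUD_URL, json=payload, headers=headers, timeout=5)
    time.sleep(60)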
In the past, demand for this type of information would require an IT department to build a virtual server, configure it, download the required software, integrate cybersecurity, and verify all of these functions. Now, those manual efforts can go into a Docker software container with the necessary programming and configurations ready to go, according to Slusser.
“For example, a dashboarding tool like Grafana visualization software can already run onsite or in the cloud, and display information from a user’s database or central platform. However, an instance of a software tool can also be added to a Docker container, allowing it to run onsite or elsewhere as needed,” says Slusser. “Likewise, IIoT sensors can also publish data that a software tool in a Docker container can subscribe to and use.”
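As a simple illustration of that idea, the sketch below uses the Docker SDK for Python to start a Grafana container on whatever host runs the script, onsite or in the cloud. The container name and port mapping are illustrative assumptions.

import docker

# Connect to the local Docker daemon
client = docker.from_env()

# Start a Grafana container and expose its web interface on port 3000
container = client.containers.run(
    "grafana/grafana",       # official Grafana image
    name="site-dashboards",  # hypothetical container name
    ports={"3000/tcp": 3000},
    detach=True,
)
print(f"Started container {container.name}")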
Hub for context
Slusser reports that Actemium-Avanceon employs a uniform procedural architecture to establish data governance with its clients. Sensors, controllers, inspection devices and other OT hardware, along with MES, SCADA and historians, report to an Aveva Edge gateway. There, software extractors create conduits for different protocols, so real-time data from the OT devices can be published to the system integrator’s DataOps platform, which can be provided by Aveva’s cloud-based Data Connect Services or Rockwell Automation’s Data Mosaix. This is where data cleansing, digital threads, model factory, and descriptive, predictive and prescriptive functions are performed. It’s also where unified namespace (UNS) software separates content from computing functions, and provides centrally located storage for information and its context.

Actemium-Avanceon uses Python-based data science libraries to contextualize three types of information: time-series data from sensors, PLCs and other devices; structured data such as alarms and OEE results; and engineering data from diagrams, 3D models and simulations. From there, subscribers such as enterprise resource planning (ERP) systems, data lakes and warehouses, and reporting and visualization programs like Power BI and Grafana can consume what Data Hub publishes.
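To illustrate what contextualizing time-series and structured data can mean in practice, here is a simplified Python sketch using pandas, in which each sensor reading is joined to the most recent alarm that preceded it. The column names and values are made up; the article only notes that Actemium-Avanceon's libraries are Python-based, not which ones it uses.

import pandas as pd

# Time-series data: temperature readings from a sensor (illustrative values)
timeseries = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-06-01 08:00", "2024-06-01 08:05", "2024-06-01 08:10"]),
    "temperature": [71.2, 74.8, 79.5],
})

# Structured data: alarm events from the control system (illustrative values)
alarms = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-06-01 08:04", "2024-06-01 08:09"]),
    "alarm": ["HIGH_TEMP_WARN", "HIGH_TEMP_ALARM"],
})

# Attach each reading to the most recent preceding alarm, giving raw values operational context
contextualized = pd.merge_asof(timeseries, alarms, on="timestamp", direction="backward")
print(contextualized)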
“We’ve been pushing this DataOps model for years. Now, it’s becoming a best practice, and Actemium-Avanceon’s partners are catching up and participating in it,” adds Slusser. “Previously, data cleaning, visualization and other tasks had to be done onsite with limited resources, and manual contextualization could take hours or days. Now, we can offload these efforts to the cloud, and converge all our OT and IT data sources. UNS makes sense of information for third-party analytics programs and other users, and this contextualization makes it available, meaningful and useful. Plus, having time-series and structured data together in DataOps on the cloud makes it far easier to identify patterns and their impacts on each other across multiple variables, and to train new models and algorithms for process and asset optimization.”
Slusser adds that Actemium-Avanceon is no longer limited to single variables at single times and single-point solutions. “For instance, a batch process has variables like mix time, continuous temperature, moisture and power, and now we can bring them all together more easily into a process model. This lets us run different scenarios, see the impact of changes like less moisture or more power, and determine if more moisture will save energy but still keep quality within limits,” explains Slusser. “Similarly, we had a client that lost a $100,000 gearbox on an extruder, but hadn’t been able to look at all of its data. We added DataOps for anomaly detection and predictive maintenance to consume variables from two IIoT sensors on the gearbox measuring temperature and pressure. Finally, we developed and trained a model on the gearbox’s normal activity, so it could identify spikes in the data, tell the difference between nuisance alarms and truly decaying performance, and even produce a percentage-of-certainty figure for issues or failures over time.”
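The article does not name the model Actemium-Avanceon trained, but a minimal sketch of the general approach, using scikit-learn's IsolationForest on made-up temperature and pressure data, might look like this. The sensor values, contamination rate and scoring step are all assumptions for illustration.

import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical training data: readings captured while the gearbox ran normally
rng = np.random.default_rng(42)
normal = np.column_stack([
    rng.normal(70.0, 2.0, 1000),  # temperature, deg F
    rng.normal(35.0, 1.5, 1000),  # pressure, psi
])

# Train an anomaly model on the gearbox's normal behavior
model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# Score new readings: lower scores indicate more anomalous behavior
new_readings = np.array([[71.0, 34.8], [88.0, 42.0]])
for reading, score in zip(new_readings, model.decision_function(new_readings)):
    print(f"temp={reading[0]:.1f}, pressure={reading[1]:.1f}, anomaly score={score:.3f}")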