Once IIoT connects with formerly inaccessible processes, the next step is deciding how to handle all their information. Some savvy advice can make these issues easier to approach and solve.
“Data is king and end users are increasingly requesting quick and efficient access to larger pools of information to make smarter decisions,” says Jeff Sanders, senior systems integration manager at George T. Hall, a system integrator in Anaheim, Calif., and certified member of the Control System Integrators Association. “IIoT, cloud computing, virtualization and other forms of digitalization are driving the availability and demand for these insights by allowing users to improve their operations, reduce costs, and increase efficiency. Likewise, OPC UA and MQTT are increasingly utilized across the IIoT landscape, and having these interfaces in common helps our integration efforts by letting us deploy our solutions more quickly and efficiently. As these technologies continue to evolve, we expect to see even more innovation and new opportunities in the process application space.”
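To give a sense of how lightweight these interfaces can be, the sketch below publishes one hypothetical process reading over MQTT with the open-source paho-mqtt Python library. The broker address, topic and values are placeholders, not details from any George T. Hall project.

```python
import json
import time

import paho.mqtt.publish as publish  # pip install paho-mqtt

# Hypothetical reading from a packaged-equipment skid
payload = {
    "timestamp": time.time(),
    "flow_gpm": 42.7,
    "temp_degF": 181.3,
}

# Publish one telemetry message to an on-premises broker (address is a placeholder)
publish.single(
    topic="plant/line3/mixer/telemetry",
    payload=json.dumps(payload),
    qos=1,
    hostname="broker.plant.local",
    port=1883,
)
```

In practice, topic structure, security settings and quality of service would follow each site’s own data model, but the barrier to entry is low.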
To help clients determine what combination of IIoT and digitalization would best meet their needs, Sanders reports that George T. Hall collaborates with them to evaluate their present processes and technologies, identify their operational pain points, develop a roadmap, and implement solutions that deliver quantifiable business benefits. For example, George T. Hall is presently working with pharmaceutical and other process-industry clients, including a battery recycling company, that want all the data they can get from their sensors, tag counts, controls and other devices because it’s more accessible and less costly.
“Previously, barriers to obtaining detailed production information were often too high, but now that it’s easier to connect and pull, store and access more data from variable frequency drives (VFD) and other devices, more users are willing to invest in getting it,” says Sanders.
Coordination and security
Ironically, once big data is available, the next chores are prioritizing it, deciding where to perform analytics, and making sure it and the network are secure. “Cybersecurity can be scary, and users can’t house all their data offsite or in the cloud, so they need some onsite storage and analytics,” explains Sanders. “When we’re in the design phase with a client, we often talk about adding an on-premises server, so they can do a lot of computing at the edge, and run some analytics in the cloud. For example, we may have a greenfield project that wants to automate traditionally manual processes, such as crushers, grinders, shakers, conveyors and chemical processes. All of the package equipment for these tasks needs PLCs and I/O everywhere, but typical designs have them running as separate islands when they need to be working more in concert.”
To link these types of devices and processes, George T. Hall usually recommends that manufacturers add OPC UA networking and a control system that can push information to redundant servers onsite. This setup can house all the production data, send it where it’s needed, and serve as an on-premises cloud. This solution has the same structure as any cloud-computing service, but it’s simply located at the user’s facility, where they also have to maintain and secure it.
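As a rough illustration of that pattern, the sketch below polls a couple of values from a hypothetical OPC UA gateway using the python-opcua client library. The endpoint and node IDs are assumed names, and a real deployment would write the readings to the redundant on-premises servers rather than print them.

```python
from opcua import Client  # pip install opcua

# Placeholder endpoint for an on-site OPC UA gateway
client = Client("opc.tcp://plc-gateway.plant.local:4840")
client.connect()
try:
    # Hypothetical node IDs for a crusher drive on one packaged-equipment island
    speed = client.get_node("ns=2;s=Line3.Crusher.MotorSpeedHz").get_value()
    power = client.get_node("ns=2;s=Line3.Crusher.PowerKW").get_value()
    print({"speed_hz": speed, "power_kw": power})
    # In practice, these values would be logged to the on-premises
    # historian or database instead of printed to the console.
finally:
    client.disconnect()
```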
“Deciding how much computing to do on the edge versus how much to send to the cloud depends on several factors, including type of application, amount of data generated, and available computing resources. It’s often a delicate balancing act, and takes a highly collaborative effort with each end user to find the right solution for their budget,” adds Sanders. “Some users are also concerned about housing data in the cloud versus onsite. Security risks, reliance on third-party providers, bandwidth limitations, and perceived lack of control of the system are issues we’ve encountered. Overall, we see these risks as minimal once the systems are up and running, but they do influence how and what data we send to the cloud.
“IT departments often want an onsite server because the biggest beef is data security, and some feel that having their server in their building is more secure. We think it’s perfectly acceptable to house data with Amazon Web Services (AWS) or Microsoft Azure on servers running elsewhere. Even highly regulated pharmaceutical applications are allowing remote access lately, but it’s ultimately up to each end user and their IT staff to decide what works best for them.”
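One common pattern for the edge-versus-cloud balancing act Sanders describes is to keep high-rate raw samples on the on-premises server and send only periodic summaries to the cloud. The sketch below illustrates the idea with assumed tag values, not data from any actual site.

```python
import statistics
import time

def summarize(window):
    """Collapse a window of raw edge readings into a small, cloud-bound summary."""
    return {
        "window_end": time.time(),
        "sample_count": len(window),
        "mean": statistics.mean(window),
        "min": min(window),
        "max": max(window),
    }

# Hypothetical one-second VFD power readings (kW) that stay on the on-premises server;
# only the periodic summary would be forwarded to the cloud endpoint.
raw_samples = [42.1, 42.4, 41.9, 43.0, 42.6]
cloud_payload = summarize(raw_samples)
print(cloud_payload)
```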
Once its design for each client’s combination of automation upgrades, IIoT system and onsite server is complete, George T. Hall will tie its processes, controls, networking and data logging together on the servers. “Then we’ll figure out how much and which data points they want, and prioritize and configure them. Plus, the available drive information isn’t just speed, but now it can also include power requirements and lists of other data we’re developing,” concludes Sanders. “IIoT makes connectivity and everything else 100% easier because all the instruments and pieces of equipment can connect on the Internet. This is much simpler than trying to get them together using the PLCs and specialized devices we had before because the suppliers often didn’t want to allow it.”
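One simple way to capture that prioritization is a tag map like the hypothetical one below, which records which drive data points to log, how often, and at what priority. The tag names and intervals are illustrative only, not part of any George T. Hall configuration.

```python
# Hypothetical VFD tag map: data point, logging priority, and sample interval.
VFD_TAGS = [
    {"tag": "Line3.VFD1.SpeedHz",    "priority": 1, "log_interval_s": 1},
    {"tag": "Line3.VFD1.PowerKW",    "priority": 1, "log_interval_s": 5},
    {"tag": "Line3.VFD1.CurrentA",   "priority": 2, "log_interval_s": 5},
    {"tag": "Line3.VFD1.DriveTempC", "priority": 3, "log_interval_s": 60},
]

def tags_for_priority(max_priority):
    """Return only the tags a site has chosen to log at or above this priority level."""
    return [t for t in VFD_TAGS if t["priority"] <= max_priority]

# A budget-constrained site might historize only priority-1 tags; another might take everything.
print(tags_for_priority(1))
```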