The famous Marshall McLuhan quote is, "The medium is the message," and while the philosopher was talking about television and mass media, his observation applies equally well to computing, the Internet and other fast-emerging forms of digitalization, including data analytics. The proof: process applications and the initial information they generate haven't changed much in recent decades, but the volumes of data they produce, the connections and access to that data, and the ability to process and analyze it more easily and closer to real time are far beyond where they were just a few years ago.
To unlock its own siloed data from its specialty materials and intermediate chemistry processes, and combine it with 50 use cases for predictive maintenance (PdM), process optimization and energy management, Celanese Corp. recently decided to use KnowledgeNet (KNet) software, shortly after the software's 10-year-old parent company was acquired by Emerson Automation Solutions. Employing a PdM use case as a pilot project, Celanese tested KNet by using it to analyze lube-oil temperatures on a rotating component and its bearings.
“Its performance looked relatively flat, but the new system identified this equipment as an issue. Looking at the two-month trend, we were still below alarm levels, but zooming out to a two-year view showed a change in the temperature level," says Greg Aguilar, senior principal instrument engineer at Celanese. "Because the temperature rise occurred over a longer period of time, no one noticed it on a day-to-day basis. It went unnoticed for months before the system picked it up.”
Aguilar explains that the gradual temperature change in the rotating device's lube oil had been attributed to a vacuum dehydrator the Celanese facility installed nine months earlier, so it wasn't flagged as a degrading system. However, if the temperature increase had continued unnoticed, it could have shut down the unit. KNet proved it could identify hidden failures, and the pilot's success gave its users confidence in their analytics. "We're no longer pushing analytics from corporate down to the sites. They're asking for it," says Aguilar. "We're building institutional knowledge, and creating a data-driven culture for making decisions."
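The failure mode Aguilar describes, a drift too slow to stand out in a short trend window, is easy to illustrate. The sketch below is not KNet's actual algorithm; it uses synthetic lube-oil temperatures (invented baseline, drift rate and alarm limit) to show how a least-squares slope over the full two-year history reveals a rise that never trips a fixed alarm.

```python
import numpy as np

def drift_per_year(temps_c, samples_per_day):
    """Least-squares slope of a temperature series, in deg C per year."""
    days = np.arange(len(temps_c)) / samples_per_day
    slope_per_day, _ = np.polyfit(days, temps_c, 1)
    return slope_per_day * 365.0

# Synthetic two-year series: one reading per day, 60 C baseline,
# +3 C/year drift, +/-0.5 C noise -- always well below a 70 C alarm.
rng = np.random.default_rng(0)
days = np.arange(730)
temps = 60.0 + 3.0 * days / 365.0 + rng.normal(0.0, 0.5, len(days))

print(f"trend over last 60 days: {drift_per_year(temps[-60:], 1):+.2f} C/yr")
print(f"trend over full 2 years: {drift_per_year(temps, 1):+.2f} C/yr")
print("fixed 70 C alarm ever tripped:", bool((temps > 70.0).max()))
```

The short-window estimate is dominated by noise, while the two-year fit recovers the drift cleanly and the alarm never fires, which is exactly why the change "went unnoticed for months."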
To chart the origins of modern data analytics, Manasi Menon, product manager for analytics and machine learning, Plantweb digital ecosystem, Emerson, explains that traditional advanced process control (APC) and model predictive control (MPC) examine individual assets and processes, develop models, and produce recommendations and predictive alerts, but more recent analytics software isn't as siloed as its predecessors.
"Data analytics isn't focused on one asset or process. It looks system-wide, exploring how different datasets relate to each other, and seeking interactions and correlations," says Menon. "In a distillation column at a refinery, we'd traditionally monitor its efficiency, and look at model of what would happen if it flooded. Now, analytics can check all the upstream and downstream assets connected to the column such as pumps, reboiler, tanks and heat exchangers, and detail the impacts of each on it. Likewise, fit-for-purpose Plantweb Insight also understands the 10-15 functions connected to an asset and their correlations, and can combine prior industry knowledge with pattern recognition and machine learning, which lets users add predictive alerts, and allows more holistic models to scale up."
Implementing a data analytics project requires users to complete more steps than simply adopting analytics software. Manasi Menon, product manager for analytics and machine learning, Plantweb digital ecosystem, Emerson, reports these are the most useful tasks for successful data analytics:
- Identify a specific operations problem to solve, and develop a use case for solving it with data analytics.
- Recruit a group of stakeholders who understand the process application, and can help decide what analytics are needed and how they should be adjusted.
- Design a pilot project that solves the initial problem and addresses its requirements.
- Establish a strong data structure for handling, storing and maintaining incoming information, and making it accessible to users later.
- Cooperate with corporate IT on implementing the pilot and scaling it up later.
- Research, identify and deploy whichever analytics software will best achieve the project's goals.
- Update stakeholders on pilot progress and results, and establish a policy for periodic updates and scale-up.
- Once the data analytics model resulting from the pilot is up and running, retune it to changing conditions every six months to a year.
- Make the analytics project scalable, so it can be used in other production areas and facilities, and deployed by larger numbers of applications and users.
Brian Joe, global product manager, Digital Transformation business group at Emerson, adds: "This isn't just more data coming from more sources including wireless, but also the desire to analyze it continuously online. Where technicians used to take measurements and enter them in Excel once per week or per month, their companies are asking for continuous analytics that can immediately turn raw data into health recommendations for pumps, heat exchangers, cooling towers, steam traps, pressure-relief valves, corrosion devices and other equipment. This change also demands people because data analytics are all for naught if they're not used to assist resource-strapped customers. Many processes that used to need people to walk plants have been automated, but human senses are now needed in upskilled jobs serving data analytics, such as adjusting continuous models and interpreting their results."
To speed up data access and processing, Joe reports that Emerson's Plantweb Insight software continuously gathers information, and plugs it into existing architectures configured upfront for distributed control, historian and wireless gateway functions, as well as simple scale-up. "Previously, one-off analytics software would take months to develop for tasks like heat exchanger calculations, but now it can be done in an afternoon, with a typical application gathering data in 20 minutes and analyzing it in another 20."
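The "heat exchanger calculations" Joe mentions typically boil down to standard textbook relations. The sketch below is not Plantweb Insight's implementation; it computes duty and the overall UA value from four temperature measurements and an assumed hot-side flow, since a falling UA trend over time is a classic fouling indicator.

```python
from math import log

def lmtd(t_hot_in, t_hot_out, t_cold_in, t_cold_out):
    """Log-mean temperature difference for a counter-current exchanger."""
    dt1 = t_hot_in - t_cold_out
    dt2 = t_hot_out - t_cold_in
    return (dt1 - dt2) / log(dt1 / dt2) if dt1 != dt2 else dt1

def ua(m_dot, cp, t_hot_in, t_hot_out, t_cold_in, t_cold_out):
    """Overall heat-transfer coefficient x area (W/K), from hot-side duty.

    Trending this value from historian data is one simple way to
    watch an exchanger foul: UA drops as deposits build up.
    """
    duty = m_dot * cp * (t_hot_in - t_hot_out)  # W, hot-side energy balance
    return duty / lmtd(t_hot_in, t_hot_out, t_cold_in, t_cold_out)

# Example: hot oil cooled 150->110 C against water heated 30->70 C,
# at 2 kg/s with cp = 2100 J/(kg*K) -- all values illustrative.
print(f"UA = {ua(2.0, 2100.0, 150.0, 110.0, 30.0, 70.0):.0f} W/K")
```

Wiring the same four temperature tags into this calculation continuously, rather than once a month in a spreadsheet, is essentially the shift from manual rounds to online analytics that Joe describes.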
Joe adds that Plantweb Insight can help facilities improve steam trap reliability and save energy with immediate, wireless monitoring and analytics via its steam trap application. It uses pre-engineered algorithms to collect and analyze asset data, enabling staff to remotely monitor steam trap health in real time, identify existing issues, and get alerts to head off larger problems. In addition, the platform comprises applications for other key assets, such as the Plantweb Insight pump application, which provides a more complete picture of pump health. The application brings in continuous measurements; produces consistent, holistic, real-time data about pump health; and allows remote dashboard viewing of pump conditions that eliminates the need for manual rounds. Reports from the analytics software also flag any pumps in warning or critical condition, while its preventive strategies help avoid failures.
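To make the steam-trap idea concrete, here is a deliberately simplified screening rule, not Plantweb Insight's pre-engineered algorithm: the thresholds and tag values are invented. It applies the common field heuristic that a trap running far below line temperature is likely blocked or offline, while an outlet nearly as hot as the inlet suggests live steam blowing through.

```python
def trap_health(inlet_c, outlet_c, line_c=150.0):
    """Very simplified steam-trap screening from two skin temperatures.

    Illustrative thresholds only: a cold inlet suggests a blocked or
    offline trap; an outlet within 10 C of the inlet suggests live
    steam blowing straight through instead of condensate discharging.
    """
    if inlet_c < 0.6 * line_c:
        return "cold / possibly blocked"
    if outlet_c > inlet_c - 10.0:
        return "blow-through / leaking steam"
    return "ok"

# Hypothetical wireless temperature readings for three traps.
for name, t_in, t_out in [("trap-101", 148.0, 95.0),
                          ("trap-102", 60.0, 55.0),
                          ("trap-103", 149.0, 146.0)]:
    print(name, "->", trap_health(t_in, t_out))
```

Commercial applications add acoustic sensing, steam-table corrections and site-specific tuning on top of rules like this, but the example shows why two wireless temperature points per trap are enough to replace a manual survey round with continuous alerts.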
"Even though analytics are easier, we still say start small and scale up," adds Joe. "Potential users should identify a persistent problem first, such as a couple of misbehaving pumps or pressure-relief valves, implement analytics just for them, get a little quick payback, and roll up as they go forward."