Many of us clean up nicely. It's a polite way to say we're unrecognizable until we shower, shave and comb what's left of our hair. The same transformation is usually needed before information can be analyzed. Many engineers and managers seek data they can trust, but it usually has to be cleaned, contextualized, formatted, resolved or otherwise aligned before their analytics can produce better decisions and more productive, profitable operations.
In this series, Jim Montague explores how a variety of experts recommend cleaning, managing and analyzing process data. The articles that follow offer a detailed discussion.
Wash your stinkin' data
The good news is that cleaning data, preparing it and gaining intelligence from it can apparently be performed almost anywhere. "Data collection, preparation and analytics can be done where needed, whether it's on the edge, in the cloud, or in between. If you're doing vibration analysis or face recognition with cameras, they produce a lot of real-time data, so you'll want to do initial data processing at the device," says Marc Taccolini, founder and CEO of Tatsoft. Read more.
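To make that device-side step concrete, here's a minimal Python sketch that condenses one window of high-rate vibration samples into a couple of summary features before anything goes upstream. The 10 kHz sample rate and the chosen features are assumptions for illustration, not details of any vendor's product.

```python
# Minimal sketch of device-side preprocessing for vibration data:
# reduce a high-rate accelerometer stream to a few summary features
# (RMS level and dominant frequency) before transmitting anything.
# The sample rate and feature set are illustrative assumptions.
import numpy as np

SAMPLE_RATE_HZ = 10_000  # assumed accelerometer sampling rate


def summarize_vibration(samples: np.ndarray) -> dict:
    """Condense one window of raw samples into features worth transmitting."""
    rms = float(np.sqrt(np.mean(samples ** 2)))

    # FFT magnitude spectrum; skip the DC bin when finding the peak.
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE_HZ)
    peak_bin = int(np.argmax(spectrum[1:]) + 1)

    return {"rms_g": rms, "dominant_hz": float(freqs[peak_bin])}


if __name__ == "__main__":
    # Simulate one second of a 120 Hz vibration with added noise.
    t = np.arange(SAMPLE_RATE_HZ) / SAMPLE_RATE_HZ
    window = 0.5 * np.sin(2 * np.pi * 120 * t) + 0.05 * np.random.randn(t.size)
    print(summarize_vibration(window))  # dominant_hz should come out near 120
```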
The quest for consistency in data
To get from raw information to big savings, data analytics mainly involves turning widely different sources, formats, protocols and communications systems into equivalent forms that can be compared. "Users want to do advanced analytics, but first they need basic analytics functions for data aggregation and cleaning before it can be presented," says Interstates' Dan Riley. Read more.
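As a hedged sketch of that aggregation-and-cleaning step, the pandas example below normalizes two temperature feeds that arrive with different units and timestamps onto a shared one-minute grid so they can be compared. The tag names, units and grid interval are invented for the example.

```python
# Two feeds for the same process variable: different units, different
# timestamps. Normalize units, then resample onto one shared grid.
import pandas as pd

# Source A reports in degrees Celsius at irregular times.
temp_a = pd.Series(
    [78.2, 78.9, 79.4],
    index=pd.to_datetime(
        ["2021-03-01 08:00:12", "2021-03-01 08:00:55", "2021-03-01 08:02:03"]
    ),
)

# Source B reports the same variable in degrees Fahrenheit.
temp_b = pd.Series(
    [172.5, 174.0, 175.1],
    index=pd.to_datetime(
        ["2021-03-01 08:00:30", "2021-03-01 08:01:40", "2021-03-01 08:02:10"]
    ),
)

temp_b_c = (temp_b - 32.0) * 5.0 / 9.0  # convert to a common unit first

# Resample both onto the same one-minute grid so the values line up.
aligned = pd.DataFrame({
    "reactor_temp_a_degC": temp_a.resample("1min").mean(),
    "reactor_temp_b_degC": temp_b_c.resample("1min").mean(),
}).interpolate()  # bridge any empty buckets by interpolation

print(aligned)
```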
Middleware pipes to models
Evonik, a specialty chemical producer, explains how adopting Element Analytics' software at two of its plants has allowed it to produce cleaner, contextualized asset data models for digitalized operations. Read more.
Historians, culture enable data analytics, modeling
Experts at Northwest Analytics show how dedicated process data historians, a willingness to confront entrenched issues, and univariate analytics can be profitable and aid digital transformation. Read more.
Lather, rinse, repeat data conditioning
There's more than one way to clean, organize and contextualize information, so it can be analyzed more quickly and thoroughly, and lead to better insights and decisions. Here are some of the most frequent tasks. Read more.
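For flavor, here's an illustrative Python sketch of a few such tasks: dropping out-of-range readings, masking a likely stuck sensor, and filling only short gaps. The limits and values are invented for the example.

```python
import pandas as pd

raw = pd.Series(
    [50.1, 50.3, 999.0, 50.2, 50.2, 50.2, 50.2, None, 50.6],
    index=pd.date_range("2021-03-01 08:00", periods=9, freq="10s"),
)

clean = raw.where(raw.between(0.0, 200.0))  # drop physically impossible values
# Flag any reading preceded by three identical values (a likely stuck sensor).
stuck = clean.diff().eq(0).astype(int).rolling(3).sum().ge(3)
clean = clean.mask(stuck)            # treat stuck readings as missing
clean = clean.interpolate(limit=2)   # fill gaps of at most two samples

print(pd.DataFrame({"raw": raw, "clean": clean}))
```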
Settle differences to unlock models
Tony Paine of HighByte reports that implementing a DataOps layer can resolve differences between data sources, streamline analytics, and allow the publication of richer models. "We believe a new layer in the technology stack is needed to abstract away the collection, standardization, normalization and contextualization of data that’s necessary for preparing and generating reusable information by any system, application or user. This data operations (DataOps) layer bundles information close to its source, and publishes it as an enriched model for consumption to any platform." Read more.
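As a rough sketch of that idea, the Python example below merges raw tag readings with asset context near the source and emits one enriched, reusable JSON model. The tag names, metadata and payload shape are assumptions for illustration, not HighByte's actual schema.

```python
# Sketch of a DataOps-style step: bundle raw values with context close
# to the source and publish one enriched model instead of bare tags.
# All names and the JSON shape here are illustrative assumptions.
import json
from datetime import datetime, timezone

# Raw values as they might arrive from a PLC or historian driver.
raw_tags = {"TT101.PV": 78.4, "PT102.PV": 2.31}

# Context maintained near the source: what each tag means on which asset.
tag_context = {
    "TT101.PV": {"asset": "Reactor-1", "measure": "temperature", "unit": "degC"},
    "PT102.PV": {"asset": "Reactor-1", "measure": "pressure", "unit": "bar"},
}


def build_model(tags: dict, context: dict) -> dict:
    """Bundle raw values and context into one publishable asset model."""
    model = {
        "asset": "Reactor-1",
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "measurements": {},
    }
    for name, value in tags.items():
        ctx = context[name]
        model["measurements"][ctx["measure"]] = {
            "value": value,
            "unit": ctx["unit"],
            "source_tag": name,
        }
    return model


print(json.dumps(build_model(raw_tags, tag_context), indent=2))
```

A consumer then subscribes to one self-describing model rather than to dozens of bare tag values, which is the kind of reuse Paine describes.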
COVID-19 = clean in the cloud
Though the migration to cloud-based data processing and analytics has been underway for several years, the COVID-19 pandemic has turned a march to the cloud and digitalization into a stampede. "The pandemic caused users to throw out all their old rules about cloud adoption or avoidance. Remote workers everywhere need the cloud to connect to their data and analytics, so IT departments have been scrambling to deploy solutions on Microsoft Azure, Amazon Web Services (AWS) and other cloud platforms," says Seeq's Michael Risse. Read more.
Updated toolbox for analytics
The roster of data analytics software, services and support tools changes quickly and widely. Here are some of the latest players. Read more.