
Lather, rinse, repeat data conditioning

May 18, 2021
There's more than one way to clean, organize and contextualize information, so it can be analyzed more quickly and thoroughly, leading to better insights and decisions.

Gathering, storing and preparing data for transmission, access and analysis by other users and applications can involve many different steps. These can also vary with the performance profile of the initial process and according to the different needs of the analysis that users want to conduct. However, there are some common threads and requirements. Here are some of the most frequent tasks:

  • Define one or more production problems that could benefit from improved analysis to help define and direct the search for the most appropriate data analytics solution.

  • Enlist an internal team and external system integrators, expert suppliers or other partners to develop data analytics policies, procedures, requirements, specifications and a schedule for implementing them.

  • Identify existing and expected information sources and files that need to be analyzed, including their locations, data formats and specifications.

  • Assess how data is gathered from sources, entered, handled, communicated and stored locally, in historians, on servers, in the cloud or elsewhere.

  • If signals, parameters and scheduling data are coming from multiple sources, check if they need to be pre-analyzed or coordinated, or if gaps need to be filled in before they're sent for further analysis.  

  • If a process application has an existing historian, determine if planned analytics software can automatically interact and collect data from it, or if some added capability is needed.   

  • Decide where to perform analytics, either locally, in an on-premises server, or in a cloud-computing service. Balance benefits and costs of processing data-intensive applications locally and reporting by exception versus possibly not sending enough data to the cloud and perhaps missing crucial trends. 

  • Evaluate if any data conversion is needed, and if so, determine if a software driver or other middleware can be applied and run automatically, or if an external or manual function needs to be installed.

  • Determine which networking protocols are used to move information from where it's generated to where it will be analyzed, remove any communication snags, and test that applied fixes have resolved these former hurdles.

  • If legacy sensors, instruments, I/O, PLCs and other components aren't plugged into any historian or network, plan to get them plugged in, or devise another way to rescue their stranded data with minimal expenditure of time and labor. 

  • Design, pilot, test and periodically reexamine data analytics programs, software, components and networks to determine if existing needs are being met or if new capabilities need to be added.
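As a minimal illustration of the coordination and gap-filling tasks above, the sketch below aligns two sensor streams sampled at different, gappy intervals onto a shared one-minute grid, carrying the most recent reading forward into each gap. The stream names, timestamps and values are illustrative assumptions, not from any particular historian or protocol.

```python
# Sketch: align two sensor streams on a common timestamp grid and
# forward-fill gaps before the data is sent on for further analysis.
# All names and sample values below are hypothetical.

def fill_gaps(stream, grid):
    """Map each grid timestamp to the most recent reading at or before it."""
    aligned = {}
    last = None  # stays None until the first reading arrives
    readings = sorted(stream.items())
    i = 0
    for t in grid:
        # Advance through readings that occurred at or before this grid point
        while i < len(readings) and readings[i][0] <= t:
            last = readings[i][1]
            i += 1
        aligned[t] = last
    return aligned

# Two hypothetical sources with different, irregular sample times
temperature = {0: 71.2, 2: 71.5, 5: 72.0}   # minutes -> deg F
pressure    = {1: 14.6, 4: 14.8}            # minutes -> psi

grid = range(0, 6)  # shared one-minute grid
temp_aligned = fill_gaps(temperature, grid)
pres_aligned = fill_gaps(pressure, grid)

# Coordinated record, ready to hand off for analysis
coordinated = {t: (temp_aligned[t], pres_aligned[t]) for t in grid}
```

In practice a historian or middleware layer would handle this resampling; the point is that gaps and mismatched sample times should be resolved before the data reaches the analytics software, not after.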

About the author: Jim Montague

Jim Montague is executive editor of Control.
