Asset Performance Improved with Integrated Historian Data

By Peter Reynolds


Operational excellence and continuous improvement programs at energy companies often target cost management and worker engagement to improve business performance. However, without making associated improvements in technologies or processes, many of these companies have struggled to reduce operations and maintenance costs to any significant degree.

Maintenance activities on industrial assets tend to be expensive and mostly reactive in nature.  This is because many operations and maintenance workers have lacked the tools needed to proactively predict when maintenance or repair activities are needed prior to asset failure.  The advent of machine learning and the convergence of information technology (IT) with operational technology (OT) systems has provided operating companies in these sectors with a glimmer of hope.

In July 2015, process data infrastructure provider, OSIsoft, and ERP provider, SAP, announced a significant technology partnership.  The goal of the partnership is to greatly simplify integrating operational data from the OSIsoft PI data infrastructure with the SAP HANA in-memory computing platform.   The resulting SAP HANA IoT Connector, jointly developed by both companies, allows process industry companies – and particularly those that already use SAP and OSIsoft – to make their operations more proactive to improve uptime and help reduce operations and maintenance (O&M) costs.

ARC believes that this represents an excellent example of what can be accomplished through IT-OT convergence and moves SAP’s largely enterprise-level applications further onto the plant floor.  The value of the connector extends far beyond simple streaming of real-time or operational data into enterprise systems, since it consumes all data recorded from the OT layer in a more effective way.

The Issue with Reactive Operations and Maintenance

Certainly, operator effectiveness programs or minor maintenance initiatives can help address some reliability issues.  But asset maintenance is often the responsibility of internal or outsourced maintenance organizations with limited software tools capable of predicting when assets will fail. As a result, most owner-operators rely on operators to reactively initiate a maintenance work order when they identify a failed piece of equipment, instrument, or other production asset.


Costly reactive break-fix maintenance and often equally costly unnecessary preventive maintenance usually constitute over 80 percent of the actual work performed by maintenance and operating technicians. Studies also show that it is common practice to remove and overhaul many assets during a turnaround or shutdown, regardless of condition.

Challenges Dealing with Operational Technology Data

In the past, operating companies in the oil & gas and utilities industries viewed the separation of enterprise and plant floor systems as a challenging hurdle. Integrating ERP and real-time plant data has been expensive and difficult to maintain at best.  Somewhat less-than-elegant middleware, ETL (extract, transform, and load) processes, and other batch load mechanisms feeding data warehouse architectures have long been staples in the IT domain.  This is changing rapidly with the emergence of distributed computing, in-memory technology, and closer technology partnerships.
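To make the batch load pattern concrete, the sketch below shows a minimal ETL step of the kind described above: extract sensor rows from a CSV export, transform the units, and load them into a warehouse table.  The table name, column names, and the bar-to-kPa conversion are hypothetical illustrations, not part of any vendor's actual integration.

```python
import csv
import io
import sqlite3

def batch_load(csv_text, conn):
    """Minimal batch ETL sketch: extract rows from a CSV export of
    plant data, transform units (hypothetical rule: bar -> kPa), and
    load them into a warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings (tag TEXT, ts TEXT, kpa REAL)")
    rows = csv.DictReader(io.StringIO(csv_text))
    conn.executemany(
        "INSERT INTO readings VALUES (?, ?, ?)",
        [(r["tag"], r["ts"], float(r["bar"]) * 100.0) for r in rows])
    conn.commit()
```

In practice such jobs ran on a schedule, which is precisely the latency that in-memory platforms and direct connectors now remove.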

Although operations and maintenance groups generate vast quantities of data – both structured and unstructured – they can only leverage a small percentage of it to make better decisions. For decades, much of the process data collected from real-time operational systems were “locked up” in process historians. The majority of these data were seldom used, except by engineers and maintenance and operations staff, who tend to use either basic visualization tools or somewhat more sophisticated, but usually difficult-to-use, historian tools to investigate operational situations.  While some software vendors have made efforts to improve these tools, most are still very laborious to use and lack context with transactional and unconventional data scattered through the enterprise.

“Liberating” process data to enable enterprises to efficiently capitalize on the plant or shop floor information requires an overhaul of the data integration strategies. 

Process Historian Can Support Predictive Processes

Process historian infrastructures are designed to collect, store, normalize, and cleanse time-series sensor data from industrial equipment and processes. When made available to the appropriate O&M personnel, this information can empower real-time analysis of asset performance and help troubleshoot problems, so industrial businesses can improve operations and maintenance efficiency.  In the process industries, most process information is stored in the historian.
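The cleansing step mentioned above can be illustrated with a minimal sketch: drop implausible readings and linearly interpolate gaps onto a regular one-minute grid.  The range check and the one-minute sampling interval are illustrative assumptions, not the behavior of any particular historian product.

```python
from datetime import datetime, timedelta

def cleanse(samples, lo, hi):
    """Drop out-of-range readings and linearly interpolate gaps.

    samples: list of (timestamp, value) pairs, nominally one per minute;
    values outside [lo, hi] are treated as sensor errors (hypothetical rule).
    """
    # Keep only plausible readings
    good = [(t, v) for t, v in samples if lo <= v <= hi]
    cleaned = []
    for (t0, v0), (t1, v1) in zip(good, good[1:]):
        cleaned.append((t0, v0))
        steps = int((t1 - t0) / timedelta(minutes=1))
        for k in range(1, steps):  # fill missing minutes by interpolation
            frac = k / steps
            cleaned.append((t0 + k * timedelta(minutes=1),
                            v0 + frac * (v1 - v0)))
    cleaned.append(good[-1])
    return cleaned
```

Downstream analytics then see a regular, gap-free series rather than raw sensor output with spikes and dropouts.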

High-performing industrial organizations are becoming increasingly data-driven and companies in every industry and every geography rely on the process historian infrastructure as their primary operational data store to support real-time analysis and troubleshooting, enable historical and complex analytics, and reduce data management costs. Historians can help connect islands of production and maintenance information and remove barriers to decision-making by providing a single version of the truth with a common data foundation and consistent data cleansing.

However, in the past, plant data often did not move beyond the historian infrastructure.  But new technology approaches and technology convergence are changing this.  Convergence is the gateway to optimizing plant performance through cloud-based solutions, in-memory computing and powerful analytics, as well as the source of massive amounts of training data for machine learning and predictive maintenance solutions. This changes the way businesses and operations perform and compete by making data accessible and actionable. The ability of an industrial process manufacturer to leverage OT or operational data will hinge on the use and deployment of the process historian and the manner in which this data is consumed by the business and enterprise systems.

Why the Entire Process History Archive Is Valued over Streaming Time-Series Data Alone

Users typically interact with process historian data either through trend visualization packages, simple dashboard-style graphics, or by extracting event-related data to Microsoft Excel for deeper analysis. The user interface tools usually provide a mechanism for dealing with the current streaming data and the ability to display the historical values over a given range of time. Engineers use these tools to look at industrial processes and determine what happened or what may happen by expending the considerable time and effort required to contextualize the data.

Some important aspects of the emerging field of analytics will differentiate the available tools, with some tools able to provide a true prediction with minimal false positive diagnoses. In the field of analytics, false positives are results that incorrectly indicate the presence of a particular condition or attribute.
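The false positive rate mentioned above is easy to make precise: it is the fraction of genuinely healthy intervals that the tool flags as faulty.  A small sketch, assuming predictions and ground truth are available as parallel boolean lists:

```python
def false_positive_rate(predicted, actual):
    """Fraction of healthy intervals incorrectly flagged as faulty.

    predicted, actual: parallel lists of booleans, True = fault.
    """
    false_positives = sum(1 for p, a in zip(predicted, actual)
                          if p and not a)
    negatives = sum(1 for a in actual if not a)
    return false_positives / negatives if negatives else 0.0
```

A tool that cries wolf on one healthy interval in three, as in the usage below, would quickly lose the trust of operations staff.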

The emergence of Big Data and multivariate analytics techniques, coupled with the steep rise in computing capability, has led to the development of machine learning solutions. Machine learning algorithms are designed to predict operational and mechanical outcomes precisely with minimal human effort.

Applying Machine Learning to Process Data

Machine learning algorithms form a major component of predictive analytics. Machine learning applications are self-modifying and highly automated. That is, machine learning algorithms are designed to adapt continuously and improve their performance with minimal human intervention. Machine learning algorithms can also be embedded transparently into process workflows to help solve problems that are just too difficult or complicated for human programmers to code.

Machine learning takes away the human effort required to learn from past process events. It provides a platform to capture knowledge of process anomalies and events and automates the laborious task of data mining.  Since process historians are good at storing data from many target systems, they provide a good foundation for cleansing process (and other) data that are often far from pristine and often contain many errors and gaps.
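As a minimal illustration of the kind of pattern detection such systems automate, the sketch below flags points in a cleansed historian series that deviate sharply from recent behavior.  It uses a simple rolling z-score, an assumption for illustration only, not the algorithm of any vendor's product.

```python
from statistics import mean, stdev

def flag_anomalies(values, window=20, threshold=3.0):
    """Flag points deviating more than `threshold` standard deviations
    from the mean of the preceding `window` samples.  A simple stand-in
    for the learned pattern-recognition models described in the text."""
    flags = []
    for i, v in enumerate(values):
        history = values[max(0, i - window):i]
        if len(history) < 2:
            flags.append(False)      # not enough history to judge
            continue
        m, s = mean(history), stdev(history)
        flags.append(s > 0 and abs(v - m) > threshold * s)
    return flags
```

A true machine learning model would replace the fixed window and threshold with parameters learned from labeled historical events, which is exactly why the historian archive matters as training data.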

Single-variate, condition-based monitoring solutions have played a role.  But in the past, they have been susceptible to generating false positive diagnoses, leading to poor acceptance by operations. Standard historian-based desktop tools for maintenance engineers or process engineers have also been useful, but often require much time and effort for end users to create the context needed to predict outcomes.  Machine learning technology promises to be the “game changer” here.

Predicting Major Incidents with Machine Learning

Research conducted by the Abnormal Situation Management Consortium and the Control of Major Accident Hazards (COMAH) organizations indicates that major incidents are becoming less frequent, but often have greater consequences.  Often, the data needed to help predict these incidents is sitting idle in a process historian archive.  With this in mind, users can begin to use machine learning and feed it not only streaming data, but also the important events buried in the time-series data in the process historian archives.

To meet this need, SAP worked with OSIsoft to develop and commercialize the OSIsoft SAP HANA Connector.  The resulting product, which SAP added to its pricebook earlier this year, enables users to build machine learning solutions that employ sophisticated algorithms to recognize patterns in historical and streaming process data that otherwise might only have been visible to experienced engineers.  Several companies have already started applying it to help unlock data from their OSIsoft PI historians to support advanced machine learning solutions for predictive operations.

Operational and Predictive Analytics at EDF

Devang Shah, Senior Manager of Database and Business Intelligence at US-based EDF Renewable Energy (part of the EDF Group in France) filled ARC in on the company’s journey to transform operations and maintenance using predictive analytics. As one of the largest renewable energy producers in North America, EDF RE needed to optimize operations and maintenance at more than 100 wind and solar farms.

While already an SAP user, the company did not yet have an enterprise data platform.  After researching the market, it chose SAP HANA as a common data warehouse platform to easily integrate and get access to a variety of sensor, geospatial, and ERP data to support analytics.  The goal was to develop a delivery platform for analytics that business users and IT groups alike could leverage via self-serve.

EDF RE already had a good understanding of industrial analytics, including the benefits of consuming not only streaming data from the field but also the entire history of field sensor data.  However, much of the company’s operational data resided in OSIsoft PI servers, which initially represented an integration challenge.

The timing of the collaboration between SAP and OSIsoft allowed EDF Renewable Energy to capitalize on the technology partnership. Mr. Shah mentioned that he believes this partnership and creation of the OSIsoft-SAP Connector technology benefited both companies, as well as end users like EDF RE.  The connector made it easy to integrate real-time data collected from the field into its enterprise platform.  This includes time-series data from field sensors associated with the company’s wind turbine assets.  The company considered other methods, but determined that these were not native to both OSIsoft and SAP, and thus unsustainable.  The IoT connector tool enabled the continued use of the OSIsoft framework for data cleansing and aggregation, a critical aspect of any data analytics strategy.

EDF RE plans to leverage predictive analytics applications now that a platform is complete. The future will involve operational and predictive analytics, looking to data patterns and machine learning to predict the frequency of failures and identify information about different assets to predict production.  Significantly, users can develop and employ their own analytics, without necessarily having to go to engineering or IT for support. 

Independent North American O&G Company Increases Operational Insight

An independent oil & gas exploration and production company with a portfolio of global offshore and onshore assets sought a set of standard tools to help decrease downtime and operating costs and support future growth through increased visibility. 

Depressed oil prices meant a slowdown in field development that provided the company with an opportunity to reorganize its resources.  The company selected Accenture and the Accenture Upstream Direct template to deliver core SAP functionality to its global assets.  According to the company, Accenture’s templatized approach helped reduce implementation time, costs, and risk.

The project enabled the company to accelerate its digital transformation, streamline the organization, and increase its visibility into operations, supplier spend, and contracts.  It also improved its financial close to a significant degree through an integrated flow of data, tighter processes, and more efficient operations from wellhead to executive reporting.

As part of the project justification, the company recognized the importance of seamlessly connecting real-time data and capturing value from the Internet of Things.  In the future, a platform approach built on linking OSIsoft PI process data to SAP’s HANA enterprise platform will aim to increase insight into operations, identify opportunities to reduce costs, and optimize production volumes and value at the wellhead.  The company anticipates that operational analytics will serve many subject matter experts across ERP and operations.


Moving forward, it’s likely that the process historian infrastructure will remain a significant data source for predicting outcomes and provide a platform for data cleansing in the oil & gas and utilities industries.  At the same time, solutions such as the SAP HANA IoT Connector will enable operating companies to more easily integrate time-series operational data and a variety of other relevant data sources into their enterprise analytics platforms and build applications to increase competitiveness.

Engineers and operators who, in the past, have spent countless hours manually searching through process historian data may eventually migrate to machine learning and analytics on enterprise platforms.  One challenge is that it can take an extraordinary amount of historical data to “train” the machine learning system, since major equipment failures happen at intervals of years and major process upsets happen infrequently.

Properly crafted and converged IT-OT systems can help transform industrial organizations and change the game significantly for people using the tools.  Technologies can predict an accurate time-to-failure, indicating precisely when a known failure will occur, how the failure happens, and what to do. Prescriptive advice such as the exact failure code and actions may often be directly linked from a maintenance system. Knowing the precise lead time for a failure weeks in advance allows the end user to plan the appropriate preventive or mitigative action.


If you would like to buy this report or obtain information about how to become a client, please Contact Us

Keywords: SAP, OSIsoft, IT-OT Convergence, Predictive Maintenance, Predictive Analytics, Machine Learning, ARC Advisory Group.
