Operational Analytics and the Value of Iterative Improvement

By Michael Guilfoyle
Industry Trends

There are many types of operational analytics. Offerings range widely, from enhancements to traditional statistical methods, to model-based techniques that incorporate streaming data, to machine learning. Some techniques can also be applied for specific effect, such as natural language processing and semantic search. Finding a solution that fits a business need, and can then scale, can be challenging.

Operational analytics move beyond IT and data scientists

The technological terms associated with operational analytics no longer reside solely within the purview of data scientists and information technology (IT) groups. As operations personnel and business executives have been tasked with building out analytics business cases, they find it necessary to understand technology definitions and distinctions, many of which are nuanced.

The result is business cases, and buyers, heavily focused on sorting through technology questions. This focus typically covers four areas: techniques, users, tools, and delivery. There are clearly important distinctions across those four elements, particularly when weighing resource and data readiness for analytics. However, when organizations focus too heavily on these areas, business cases and analytics requirements often devolve into technology discussions, pulling them away from business improvement objectives.

A way to reorient analytics discussions firmly around value is to look at them through the lens of immediate impact and potential for iterative improvement. This view is helpful because not every operational area (and related business case) needs to strive for some transformational maturity. Often, by pursuing iterative improvement, organizations will reach a point where it becomes obvious that additional improvements aren't worth the cost or resource drain. In fact, striving for transformational change may be overkill for parts of operations, leading to open-ended pilots and overspending.


An example of how operational analytics can iteratively mature

A good example is consolidating asset data to improve work. Consider an organization burdened with a dozen or so siloed systems supporting work and assets, compounded by workers who rely on spreadsheets for data management and decision making.

Reducing manual spreadsheet analysis and pulling data from disparate systems into a real-time system of record for assets don’t imply a transformed operation. However, they set the foundation for iterative improvements across people, processes, and things.

Work requirements can be more fully understood and aligned to craft skills and availability, enabling higher resource utilization and eliminating costly rework. Analysis of work and the use of condition-based monitoring (CBM) can set the stage for process improvements that extend the lifecycle of assets. Asset data can flow into capital planning to inform investment strategies. In making these changes, the company clearly matures some operational components.

By doing so, a company can then determine whether it is worth embracing additional change, such as automating failure identification via machine learning for critical assets. This could set the stage for implementing predictive maintenance strategies. The type of data used, the resources involved, and the desired improvement might then inform the technique chosen, from real-time statistical control to predictive cognitive analytics. Additional technology requirements also become more apparent, as infrastructure and asset visibility and device connectivity would be fundamental to this improvement.
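To make the "real-time statistical-driven control" starting point concrete, here is a minimal sketch of statistical process control applied to asset condition data. The sensor readings, the vibration metric, and the three-sigma threshold are all hypothetical illustrations, not a prescribed implementation; a production CBM system would draw on live device connectivity rather than hard-coded lists.

```python
# A minimal sketch of statistical control limits for asset monitoring,
# one early, iterative step on the path toward predictive maintenance.
# All readings and thresholds below are hypothetical.

from statistics import mean, stdev

def control_limits(baseline, sigmas=3.0):
    """Compute lower/upper control limits from in-control baseline readings."""
    mu, sd = mean(baseline), stdev(baseline)
    return mu - sigmas * sd, mu + sigmas * sd

def flag_out_of_control(readings, lower, upper):
    """Return (index, value) pairs for readings outside the control limits."""
    return [(i, r) for i, r in enumerate(readings) if not (lower <= r <= upper)]

# Hypothetical vibration readings from a critical asset (mm/s RMS).
baseline = [2.1, 2.0, 2.2, 1.9, 2.1, 2.0, 2.2, 2.1]
new_readings = [2.0, 2.1, 2.3, 4.8, 2.2]  # 4.8 may indicate a developing fault

lower, upper = control_limits(baseline)
alerts = flag_out_of_control(new_readings, lower, upper)
```

A later iteration might replace the fixed control limits with a machine-learning model trained on labeled failure history, but the simple version already turns consolidated asset data into actionable alerts.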

The result is enabling technology seen through the lens of a maturing operation, with the organization understanding that not every scenario requires a transformational end state. In this view, there is no worst-to-best ranking that prompts ineffective technology comparison shopping. Instead, there is only the right fit for the desired outcomes.

