ARC Advisory Group discussed the impact of digital transformation on the power industry and adoption of Industrie 4.0 in the blogpost, "Digital Transformation for the New Power Generation Landscape". This was based on the panel discussion held during NTPC’s “Global Energy Technology Summit” (GETS) 2016 in Delhi, India. Continuing with the series, this post will cover other crucial areas discussed by the eminent panel.
To begin with, how do plants achieve a high degree of standardization across functional silos and software interfaces to seamlessly integrate systems and software from multiple sources?
As digitalization and IIoT technologies gain traction within plant manufacturing and operations, the potential for connecting devices and systems will rise drastically. However, seamless interoperability and integration between systems depends greatly on the standards adopted. In the current industrial ecosystem, with its multitude of technologies, products, and stakeholders, a common set of standards is necessary for information sharing, continued innovation, and industry development.
To maximize the potential value of IIoT innovations, it is essential that different systems and assets communicate with each other, share data, and respond to common monitoring and control systems. Moreover, as these connected devices interact with systems and plant personnel, getting the data in standardized, consistent formats enables faster decision making. But how can silos of information that use different proprietary communication protocols be connected? Bringing the data together is an important prerequisite for leveraging the benefits of digitalization, and of data centralization in particular.
This could be realized in several ways: for instance, by making products more intelligent, so that they understand more protocols (including native protocols) and give users the ability to program devices on the go, recommended Siemens. Another way to aggregate information from different systems is to place a layer of software between the data sources and the centralized processing level. For instance, to collate data from different vibration analysis systems, an intermediate software layer could provide users with early alarms indicating process deviations and/or declining asset health. These intermediate software layers are extremely useful because data analytics can be performed at a central level, rather than being limited to particular machines and units. The centralization layer brings several powerful advantages: data uniformity, sequential capture of data (extremely important in root cause analysis), and the ability to leverage Big Data and statistics across fleets of similar assets.
Once alarms are detected and validated, the primary system can be interrogated for in-depth analysis and diagnostics of the related problem. Moreover, making products “digitally fit” by building in all possible interfaces and data structures could help integrate software engines into complex industrial ecosystems.
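The intermediate aggregation layer described above can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's implementation: the adapter functions, field names, and alarm limit are all hypothetical, and the key ideas are the common record format (data uniformity), unit conversion at the adapter, time-ordering (sequential capture for root cause analysis), and a central early-alarm check.

```python
from dataclasses import dataclass

# Common record every source adapter must emit (hypothetical schema).
@dataclass
class Reading:
    asset_id: str
    metric: str       # e.g. "vibration_rms_mm_s"
    value: float
    timestamp: float  # epoch seconds; ordering matters for root cause analysis

# Adapters translate each proprietary data format into the common record.
def adapt_vendor_a(raw: dict) -> Reading:
    return Reading(raw["tag"], "vibration_rms_mm_s", raw["rms"], raw["ts"])

def adapt_vendor_b(raw: dict) -> Reading:
    # This vendor reports velocity in inches/s; convert to mm/s for uniformity.
    return Reading(raw["id"], "vibration_rms_mm_s", raw["vel_ips"] * 25.4, raw["time"])

def aggregate(raws_a: list, raws_b: list, limit_mm_s: float = 7.1) -> list:
    """Central layer: normalize, order by time, and flag early alarms
    (the limit is an illustrative placeholder, not a real alarm setting)."""
    readings = [adapt_vendor_a(r) for r in raws_a] + [adapt_vendor_b(r) for r in raws_b]
    readings.sort(key=lambda r: r.timestamp)  # sequential capture
    return [r for r in readings if r.value > limit_mm_s]
```

Once alarms are produced here, a real system would drill back into the originating vibration analyzer for the detailed spectrum, as the panel described.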
The IT industry has driven standards for information technology, and organizations such as FieldComm Group, PROFIBUS International, and the OPC Foundation (to name a few) are driving standards for instrumentation and control. In addition, complex process data coming from process automation and condition monitoring systems would require standard file formats. For example, vibration data would require a standard file format for the vibration spectrum and waveform. Industry organizations like the Vibration Institute and API could help lead the development and proliferation of such standards.
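To make the file-format point concrete, the sketch below shows what a standardized, self-describing vibration spectrum record might look like. The schema and field names are entirely hypothetical (no such standard is implied to exist): the point is that units, frequency span, and resolution travel with the amplitude data, so any consumer can interpret a spectrum without vendor-specific knowledge.

```python
import json
from dataclasses import dataclass, asdict, field
from typing import List

# Hypothetical interchange record for a vibration spectrum: a real standard
# would pin down the metadata so spectra from any vendor are self-describing.
@dataclass
class SpectrumRecord:
    asset_id: str
    measured_at: str           # ISO 8601 timestamp of the measurement
    units: str                 # e.g. "mm/s RMS"
    fmax_hz: float             # frequency span of the spectrum
    lines: int                 # lines of resolution
    amplitudes: List[float] = field(default_factory=list)

    def to_json(self) -> str:
        return json.dumps(asdict(self))

    @staticmethod
    def from_json(text: str) -> "SpectrumRecord":
        return SpectrumRecord(**json.loads(text))
```

A waveform record would follow the same pattern, with a sample rate in place of the frequency span.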
Moreover, it’s vital to adopt open standards to help ensure that artificial and arbitrary barriers to interconnectedness and interoperability do not get in the way of seamless data sharing. According to many in the industry, end user companies should specify standards to encourage their suppliers to avoid use of proprietary network protocols, application interfaces, file formats, and so forth.
The panel members also discussed the potential for digitalization of existing vs. new plants. What are the recommendations for existing plants, ongoing projects, and new projects? Is the payback quicker in an existing plant, where equipment degradation is an ongoing process?
Companies need to consider the relative value of digitalizing their old plants vs. their new plants. During the discussions, many agreed that digitalization could provide more value to existing plants, as the need to optimize processes, asset maintenance/performance, and operations is more acute in existing plants than in greenfield units. Moreover, a multi-pronged approach could be taken for digitalizing old plants.
First, companies must identify the best value that can be delivered using existing data. Making use of historical operational data gathered over the operating life of assets, units, and power plants will provide high-value insights that help companies make the right decisions, thereby improving the bottom line. In this regard, IIoT platforms with pre-designed analytics templates or digital twin models allow value to be demonstrated quickly. Even technologies such as virtual sensors (which analyse sensor readings using mathematical models) and the fusion of data from multiple sources will extract more value from existing data, suggested GE. Moreover, digital solutions for sensing, fleet monitoring and control, process control, and so forth will enhance the flexibility of units without any mechanical intervention.
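The two techniques mentioned here, virtual sensors and data fusion, can be illustrated with a minimal Python sketch. Both functions are hypothetical simplifications: the orifice-flow model and its calibration constant are placeholders for whatever physics model a real virtual sensor would use, and the fusion step is a textbook inverse-variance weighting of two noisy estimates of the same quantity.

```python
import math

def virtual_flow_sensor(dp_kpa: float, k: float = 12.5) -> float:
    """Virtual sensor: estimate an unmeasured flow (m^3/h) from a measured
    differential pressure (kPa), using the simple model flow = k * sqrt(dP).
    The coefficient k is a hypothetical calibration constant."""
    if dp_kpa < 0:
        raise ValueError("differential pressure must be non-negative")
    return k * math.sqrt(dp_kpa)

def fuse(est_a: float, var_a: float, est_b: float, var_b: float) -> float:
    """Data fusion: combine two noisy estimates of the same quantity,
    weighting each by the inverse of its variance, so the more reliable
    source dominates the fused value."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    return (w_a * est_a + w_b * est_b) / (w_a + w_b)
```

For example, a flow estimate from this virtual sensor could be fused with a reading from an aging physical flowmeter, with the noisier of the two given proportionally less weight.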
The second approach is to retrofit the plant with new sensors in areas that offer significant business benefit. Moreover, existing power plants can add a second layer of automation based on a plant-wide WirelessHART wireless sensor network. This would enable new sensors to be added at low cost and low risk for measuring parameters that were not monitored before. These sensors form the basis for condition and performance monitoring of assets, for instance blade health monitoring, which provides significant value to customers. This can be done either on-premise or as an outsourced connected service, and the ROI is predicted to be quick. Device or equipment retrofit is the third option, which should be evaluated while designing the overall solution.
For ongoing projects, WirelessHART/wireless sensor networks can be added at any stage of the project, even after commissioning, and the cost and risk of installing the network itself during construction is low. Once the wireless network is in place, it’s relatively easy to add wireless sensors as and when required on an ongoing basis, often on a maintenance budget rather than the project budget, mentioned Emerson. This also applies to new projects.
Understanding and capturing the operational aspects of production fleets provides great value in terms of assessing the performance and condition of assets, in both existing and new plants. This knowledge can be fully utilized in a new co-creation environment involving both end users and OEMs, as the operational behaviour can be fed back into the design phase of the products, making them better adapted to new conditions and requirements. Moreover, for new projects, it is best to have a digital strategy integral to the design package. With the “digital thread” approach tracking the product lifecycle from design to decommissioning, insights on operations, maintenance, and service can be used to make smart operational decisions based on the operational and maintenance knowledge from other similar units or projects. End users benefit as centralized fleet knowledge can be made available quickly to experts, and OEMs can fine-tune the design process to adapt to new technical challenges. Both the changes in machine design and the resulting behaviour observed in the real process will have to be continuously adapted to one another; this will be the central point of co-operation between end customers and OEMs.