Digitisation experts explain benefits for nuclear

05 July 2017

Advances in digital technologies are enabling energy companies, including nuclear power utilities, to increase efficiency, cut unplanned downtime and improve the reliability of equipment, software providers told a recent conference in Moscow. ASE Group, Fiatech, GE, IBM and Siemens explained the value to the industry of using 'big data', the Internet of Things, 'digital twins' and other computing innovations.

Executives from these companies spoke at the session titled 'Digital future: the next step in the development of nuclear technologies' at the Atomexpo conference and exhibition on 20 June. Delegates heard about the latest technologies and processes in 'digitisation' - the conversion of text, pictures, or sound into a digital form that can be processed by a computer. These included ASE Group's Multi-D, Fiatech's Productivity Advancement Targets (PATs), GE's Predix, IBM's Watson and Siemens' Product Lifecycle Management (PLM).

Richard Crisp, director of systems engineering at IBM Internet of Things, said: "We know nuclear power is advancing. There are many challenges with introducing new technology and there are many challenges with the constraints you have in this particular industry. But it's based on science and technology, so we have to provide the heavy-duty computing to allow you to get the insights from that data. As human beings, we can't process that data by ourselves. If you train the computer to help you, then you can do your job much more efficiently."

A report issued by the World Economic Forum in June last year, 'Electricity: uncovering value through digital transformation', estimates there is $1.3 trillion of value to be captured globally between 2016 and 2025. The report says that by "leveraging the building blocks of digitisation" - such as service platforms, smart devices, the 'cloud' and advanced analytics - utilities could extend the life cycle of infrastructure assets, optimise electricity network flows and innovate with customer-centric products. New pools of value could also be tapped 'beyond the electron' by harnessing big data across sectors, it adds.

Multi-D


ASE Group, the engineering subsidiary of Russian state nuclear corporation Rosatom, says its Multi-D system ensures the efficient management of all life cycle stages of a nuclear power plant. Rosatom has said it can also be used in other capital construction facilities.

The technology won the international CETI award last year in the mega-project category. Established in 2006 by Fiatech, the Celebration of Engineering & Technology Innovation awards recognise significant achievements in technology research, development, and implementation in the capital projects industry.

Stuart Young, managing director of Fiatech's Europe & Middle East business unit, said Rosatom could "become the driver" for Russia's economic development through its work on digital solutions such as Multi-D.

Multi-D includes more than 30 data-based tools, one of which is a "shared information space" that site designers, builders, suppliers, customers and regulators, among others, can access, Vyacheslav Alenkov, director of systems engineering at ASE Group, said. These tools are already providing "tangible conclusions" on the duration and cost of on-site construction work, he added.

Multi-D produced "real results" for the Rostov nuclear power plant construction project, he said, and the system is being "rolled out" across ASE Group's portfolio of orders.

"The ASE Group of Companies is oriented not only to the nuclear industry, but also works on data extension to be used by other industries and sectors of the economy," Alenkov told the conference. "Everything that can be digitised should be digitised," he said, "but it's really important that this platform brings added value rather than exists as just another IT-system."

PATs


Among Fiatech's own "leadership initiatives", Young said, are the 12 PAT opportunities it has developed, "along with identification of over 400 organisational behaviours and practices conducive to enabling success". PATs span all facets of the capital projects industry, industry segments and life cycles, he added.

"Organisations are constantly struggling to establish systems-thinking approaches to how work is defined and performed, in the hope of becoming more efficient and effective. The massive scale, complexity, short life cycle, and subtle stakeholder interdependencies has made sustainable solution finding a major barrier. With that in mind, Fiatech members have been rethinking why compelling improvements with the potential for significant industry benefits, emerge with enthusiastic interest, only to rapidly fade away.

"Generally, it is felt that most improvement initiatives focus on a specific aspect of an enterprise's operation and involve a few functions and a handful of stakeholders. As a consequence, these efforts consistently lack context, lack sufficient visibility and support, and do not have sufficient breadth of stakeholder engagement to ensure full acceptance and long-term viability."

Every PAT "represents a purpose, an objective, and an improvement destination expressed in economic terms", he said. PATs are led by multiple owner/operators and EPCs "working in concert" to provide the motivation, knowledge, expertise and consistency needed to ensure that strategies are complete, organised, and viable, he said.

To provide insight and enable organisations to conduct effective self-assessments, each PAT is expressed through a series of success enablers, which provide the springboard for identifying and developing viable solutions. For example, PAT #3 aims, through interactive project planning, to capture a 25% improvement in project cost and schedule predictability.

Digital twins and Predix


Sanjay Chopra, global leader for Energy Practice, Presales Solution Architects at GE Digital, described work with digital twins and GE Predix. Digital twins are computerised 'companions' of physical assets that can be used for various purposes. GE Predix is a cloud-based platform as a service (PaaS) that enables industrial-scale analytics for asset performance management (APM) and operations optimisation by providing a standard way to connect machines, data, and people.
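
The pattern is easier to see in miniature. The following Python sketch illustrates the digital-twin idea in its simplest form: a physics-style model of what an asset should be doing runs alongside live measurements, and the gap between the two flags degradation. The turbine model, numbers and alert threshold are illustrative assumptions, not GE Predix code.

```python
# A minimal digital-twin sketch: an expected-behaviour model runs
# alongside live sensor data, and the residual between the two flags
# degradation. All names and numbers are hypothetical.

from dataclasses import dataclass

@dataclass
class TurbineTwin:
    """Toy physics model: expected output for a given fuel flow."""
    efficiency: float   # design efficiency, dimensionless
    fuel_energy: float  # energy content of fuel, MJ/kg

    def expected_mw(self, fuel_kg_s: float) -> float:
        # MJ/kg * kg/s = MW
        return self.efficiency * self.fuel_energy * fuel_kg_s

    def residual(self, fuel_kg_s: float, measured_mw: float) -> float:
        """Positive residual = asset underperforming its twin."""
        return self.expected_mw(fuel_kg_s) - measured_mw

twin = TurbineTwin(efficiency=0.38, fuel_energy=50.0)
for fuel, measured in [(26.0, 490.0), (26.0, 486.0), (26.0, 471.0)]:
    gap = twin.residual(fuel, measured)
    if gap > 10.0:  # hypothetical alert threshold, MW
        print(f"twin alert: output {gap:.1f} MW below expectation")
```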

"GE counts itself as a 125-year-old 'start-up' because we are always transforming ourselves. We needed to re-engineer ourselves in the digital world and started this process about five years ago. We looked at how to get more productivity from our own industrial activities using big data analytics," Chopra said. "Then, about three years ago, we took the assets we'd developed in our engineering world to our customers. For example, in our Monitoring & Diagnostics Center, in Atlanta, where we monitor over 2000 turbines for over 500 customers, we started to use our replications to support our clients. And about two years ago we said that, if we can do this for our customers, then we can do it for the rest of the world."

He added: "We took our Predix platform, all the physics space models that we had, the digital twins that we'd built, and put them out into the market and said, we're going to make these available for everyone."

To enable the "democratisation" of its analytics, GE had to change its own "DNA and culture", he said. "We had to start looking at things, not just from the perspective of an engineer, but from the 'art of the possible'. To do that, we had to bring dreamers into the company, the kind of people who could explore the art of the possible. In addition to that we started to look at different business models, at how to monetise [digital processes]."

He added: "Today, when we talk about the energy value network, electrons are flowing bi-directionally; we've got distributed generation; we got variable electricity being generated through renewables. But it's not only electrons that are flowing bi-directionally: data is too. We've got a lot more sensors on the network and our customers are actually becoming sensors. Today the figure is about 3 trillion IP-enabled devices out there in the world. That means we've got the potential of 3 trillion sensors which we can take data from and use to run data analytics. Some of those are in the plants and some of those are with our customers."

Referring to the WEF report, Chopra highlighted the importance of APM.

"There's still roughly about $378 billion worth of opportunity for savings that are out there. How do I maintain my asset? How do I look to reduce my O&M costs? How do I extend the life of those assets? And then I need to look at how to operate these assets according to a different economic model.

"Today people are looking at a centralised baseload, but we're going into a more distributed energy market. How do I start to manage my assets in that world? I still need to make money out of them. It's still roughly about $2 trillion worth of societal benefit in terms of how do we make sure we have reliable and safe and secure access to electricity for our customers, how do we make sure that cheap electricity is also available in areas where there is no electricity today? Those are the big areas we need to focus on.

"We deliver those through a new model. Rather than the traditional models of localised compute power, we're looking at how to leverage data science; how to leverage the Cloud to do high-scale analytics; how to create models which we can interact with; how to talk to a digital model, to a digital twin of an asset, and understand how it's operating; and what we need to do to optimise it and to make money out of it. This is something we need to do across the electricity value network and not just on the generation side. It's a model that follows all the way from the turbine right down to the toaster."

A number of GE customers are reaping benefits from digitisation in their markets, he said. For example, GE's clients in the nuclear industry are now saving about $2000 per MWh in terms of O&M costs. "So even though customers are very mature in terms of how they maintain and operate their plants, there's still a lot of value that can be realised from these assets."

No unplanned downtime


GE's "vision" with digitisation is "focusing on 100% no unplanned outages and downtime for our assets and for our utilities", he said.

In terms of the thermal performance of plants, GE is using big data analytics and closed-loop analytics.

"For example, at steam plants we have optimised the combustion process by monitoring and changing 75 parameters every minute. No human could make those changes to get those kind of benefits," he said.

"In nuclear we've been focusing a lot on our digital twins and how to get advantage from assets where there's criticality; how to keep those up and running, and how to get early warnings around those. For example, we mined data from the nuclear power plant outage database in the US, as well as the data we have at our centre in Atlanta, and we came up with the major failure modes around the assets we see in our industry. We started to develop the digital twins and analytics to ensure that we get early warnings out of those," he said.

A digital twin can be used, for example, to provide early warning of a fault on a condenser.

"We worked with a client on pattern modelling and statistical analysis to reduce the early warning of when a failure with the assets was going to occur to about one day. We then built up a digital twin, which increased that one day to ten days before the failure would happen. Our research centre is now working to extend that to 30 days. The earlier you have a warning of something taking place, the earlier you can start planning for that outage and ensure that you don't have any unplanned downtime," he said.

Chopra urged delegates to be "action orientated" and not see digitisation as a "paper-based exercise".

"Take real-life examples in your business and say that's what I'm going to use for digitisation. And remember: there's no such thing as bad data because, once you start using it, you'll understand the level of quality you're going to need in your data. Never look for 100% quality because you'll never get started. Secondly, you need to have a way of understanding the quality level at the source and try to fix it at the source. Don't try and fix it as it's being fed into the data analytics platform because, by that time, it's too late. And always look at patterns of data."

IBM Watson


IBM Watson is a cognitive system enabling a new partnership between people and computers. In the industrial world, the IBM Watson IoT Platform can continually monitor incoming information in real time and, based on what it 'knows' and is continually learning, understand current conditions, distinguish what is normal from what is not, determine trends and suggest actions.

Crisp described his role at IBM as being to "look at the data and information you've been managing over the last 20-60 years and to find new ways of using heavy-duty computing, cognitive computing, machine learning and artificial intelligence to solve the problems that have not been solved yet". One of these problems, he said, is bad data.

"There's nothing wrong with bad data, but the human brain makes us differentiate between good and bad data. So why don't we train the computer to do that for us because that's what cognitive computing and artificial intelligence will bring to us tomorrow," he told delegates. "As the Millennials join your companies, they're going to want to use different tools and techniques than the ones you've been using. My hypothesis is they will need cognitive computing, that they will want to exploit artificial intelligence," he said.

IBM has "invested heavily", he said, "in a new division to exploit this computing capability and we call it the Internet of Things".

"The word is full of sensors - cars, aircraft, entire cities are full of them. But we humans cannot consume that data and we need to rely on computers. And we need to apply advanced analytics and cognitive computing to be able to manage that data," he said.

"In the first phase of any project design, we're all used to using computer aided design systems, we all have configuration and change control, but how do we apply best practice and lessons learned from the maintenance and operations phase back into design? In the future, with heavy duty computing providing new insights, you should be able to feed the root cause or the effects of some of the problems you've got in operations back into the design," he said.

"For the asset management part of the life cycle, you would look to feed data providing insights that would improve efficiency, cost and commercial profitability back to the design teams. We've talked about the digital twin and about integrating these different systems. So, we have the capability and the technology. So now you've got that data supposedly joined up, but what are you doing with the data? How are you getting the metrics and providing the insights into your design, maintenance and operations teams? This means teaching the computer to recognise good data and to ignore bad data. And then on the good data, provide the insights, the trends, the metrics, the insights that your operators, maintenance and design teams need.

"The idea is we can have a continuous improvement and take into account not just the design, but the maintenance, safety, liability and environmental constraints. With cognitive computing, we can help analyse and provide insights into those trends and data."

For example, cognitive computing, the Internet of Things and advanced analytics can be used to predict the degradation of wind turbine blades and thus reduce downtime, he said. Another example is IBM's work with Siemens on Industry 4.0 and the Smart Factory.

"Rotating machinery is a good example, where perhaps a bearing is wearing out and starting to vibrate and if you've got a sensor and you're able to capture that data, then you can predict when it's going to fail. It's taking the human out of the loop and letting the computer do the work to provide the insights. Then the human can intervene."

PLM


Danila Torop described Siemens PLM Software, a business unit of the Siemens Digital Factory Division, which "works collaboratively with its customers to provide industry software solutions that help companies everywhere achieve a sustainable competitive advantage by making real the innovations that matter".

Its PLM software provides the "innovation platform through which manufacturers incorporate extremely high data volumes from complex products at high speeds, analyse them and make them available to all participants", he said.

"PLM can replace multiple and disconnected design data management platforms with a global team centre," he said. "Such a centre would manage design data that is currently managed as personal files with the risk of losing information," he added.

Siemens implements a Collaborative Product Development environment when an existing SAP system is "overburdened" in supporting design, manufacturing and service operations, he said. It can also reduce design cycle time through automation whenever the product development period is "longer than desired", he added.

"PLM is a key differentiator for many companies in the energy industry and Siemens has seen huge advantages from it," he said. "Energy is a critical industry for PLM and continues to drive investment."

Researched and written by World Nuclear News