Next Generation Big Data Is Here | T&D world

There have been a number of studies, reports, and surveys lately on how to make the power grid more resilient, and most have one thing in common: they push for modernization of the electricity grid and recommend doing so by deploying more digital technology. Power delivery is already one of the most asset-intensive industries on the planet, and this approach will add still more assets to it. That’s not bad, but with digitized assets comes more big data. That’s not bad either, although it can be a problem.

Unfortunately, today’s power distribution system produces gigantic amounts of big data every day, and there seems to be no end in sight. Some authorities claim that all this big data has moved into a new category, calling it by a variety of names such as next-generation big data, industry 4.0 data, advanced big data, and others. Let’s use the term bigger data because it seems less confusing than all the others. After all, big data is just big data.

Bigger data is so much bigger that it requires more sophisticated tools; it cannot be handled efficiently by yesterday’s big data methods. Bigger data requires modern, next-generation management tools with capabilities beyond those developed for the previous generation of big data. These systems use techniques such as cloud computing, artificial intelligence (AI), machine learning, and advanced data analytics to produce usable intelligence. It’s almost like witchcraft, which makes it easy to forget the solid data science behind it.

Gigabytes, terabytes, zettabytes

Is the increase in data really that important? According to EarthWeb, around 79 zettabytes of data were generated in 2021, and it predicts 180 zettabytes will be generated in 2025. To put that in a more familiar context, consider purchasing data storage. Big box stores sell terabyte external hard drives as off-the-shelf items. To store a single zettabyte of data, you would need a billion of those terabyte hard drives.
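The arithmetic behind that comparison is easy to verify. A short sketch, using decimal SI units:

```python
# Unit-conversion check: how many 1 TB drives does a zettabyte fill?
# Decimal SI units: 1 TB = 10**12 bytes, 1 ZB = 10**21 bytes.
TERABYTE = 10**12
ZETTABYTE = 10**21

drives_per_zettabyte = ZETTABYTE // TERABYTE
print(f"{drives_per_zettabyte:,} drives per zettabyte")  # 1,000,000,000 drives per zettabyte

# Scaling to the ~180 ZB EarthWeb projects for 2025:
drives_2025 = 180 * drives_per_zettabyte
print(f"{drives_2025:,} drives for 2025's projected data")
```

One zettabyte works out to exactly one billion terabyte drives, which is why "bigger data" is not an exaggeration.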

This is bigger data, and experts expect it to keep growing as our connectivity grows, driven by IT/OT convergence. The research firm MarketsandMarkets released a report last year stating that the big data industry is being driven by a surge in the volume of data. It predicts that big data market spending will grow from US$162.6 billion in 2021 to US$273.4 billion in 2026.
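Those two figures imply a compound annual growth rate of roughly 11%. A back-of-envelope check:

```python
# Implied compound annual growth rate (CAGR) from the forecast figures:
# US$162.6B in 2021 growing to US$273.4B in 2026 (5 years).
start, end, years = 162.6, 273.4, 5
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 11% per year
```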

Going back a bit, IT/OT convergence combines operational technology (OT) and information technology (IT) under a single platform or system. It has found a home in asset-intensive businesses like the electric utility industry. IT/OT, together with the IIoT (Industrial Internet of Things), merges the physical world with the virtual world through a complete digital model, which is a detailed representation of the company.

The digital model is an essential element for deploying some remarkable digital applications, such as digital twins. Digital models have propelled asset management systems into their next generation just in time for bigger data. Modern asset management systems started out as complicated platforms, but once the word performance was added between asset and management, nothing was the same. Manufacturers such as ABB, Bentley, GE Digital, Hitachi Energy, IBM, Siemens Energy, SAP, Schneider Electric, and others have developed a wide range of APM platforms with a focus on a wide variety of capabilities and capacities.

Smart Grid Analysis

These asset performance management (APM) systems use AI and machine learning to sift through all historical and real-time data. But it doesn’t stop there: APM systems can review past loading data, maintenance documents, and weather records to develop asset health assessments. The APM system combines all of this information to determine the current condition of every business asset. In other words, it tracks the health of an organization’s physical assets, and this is where it gets interesting.
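To make the idea of an asset health assessment concrete, here is a deliberately simplified sketch of combining loading history, maintenance backlog, and weather exposure into one score. The inputs, weights, and thresholds are hypothetical illustrations, not any vendor's actual model:

```python
# Illustrative asset-health sketch (not a vendor algorithm): fold loading,
# maintenance, and weather exposure into a single 0-100 health index.
# All field names and weights below are hypothetical.

def health_index(avg_load_pct: float, overdue_maintenance: int,
                 severe_weather_events: int) -> float:
    """Return a 0-100 health score; higher is healthier."""
    score = 100.0
    score -= max(0.0, avg_load_pct - 80) * 1.5   # penalize sustained overload
    score -= overdue_maintenance * 10            # each overdue task costs 10 points
    score -= severe_weather_events * 5           # storm exposure degrades health
    return max(0.0, score)

# A hypothetical transformer: 92% average loading, one overdue
# maintenance task, exposed to two severe storms.
transformer = {"avg_load_pct": 92, "overdue_maintenance": 1,
               "severe_weather_events": 2}
print(health_index(**transformer))  # 62.0
```

A production APM platform would of course learn these weights from fleet data rather than hard-code them, but the principle of fusing multiple data sources into a single condition score is the same.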

Recently, this column had the opportunity to speak with two colleagues: Gary Rackliffe, vice president of market development and innovation, and Bart Gaskey, senior vice president of strategic marketing and business development. They are Hitachi Energy’s experts on key emerging innovations in the utility landscape, including APM and its developments. The discussion provided interesting insights into APM systems and how they fit into today’s power grid.

Gaskey started the discussion by saying, “For many years, utilities have deployed sensors and monitors on their systems and captured massive amounts of data. APM systems leverage smart grid analysis to make sense of the collected data. They also cleanse the data and organize it so that it can be used with existing data and with data from all parts of the organization.”

Gaskey continued, “Hitachi Energy software experts have developed analytics software that has accelerated the pace of predictive forecasting through predictive modeling that uses both online and offline data. The analytics have gone further, becoming prescriptive by proposing corrective actions instead of simply identifying deviations from normal conditions. To understand what’s going on with the network, you need to be able to understand the data, and it needs to be consistent across the enterprise.”

At this point, Rackliffe explained, “Organizations unwittingly built silos of digital data across their enterprises because no one thought data could be exchanged across the enterprise the way it is today. Very often, departments acted alone when it came to data and how it was used in their area. Each system was designed specifically with its own needs in mind, but APM systems are not limited in this way. They are designed with the whole organization in mind. They have access to both historical data and real-time data from assets in the field. The data even comes from videos and photographs taken during drone and helicopter overflights.”

Rackliffe continued, “These APM systems are taught to identify the forces that impact the health and condition of the equipment. Using supervised AI learning, APM systems synchronize the data exchanged across all the data available to the business. This is possible by leveraging the machine learning capabilities of AI. For known issues, the algorithm is trained to identify a problem’s signature, and then it can quickly find that signature whenever it appears in the big data it sifts through. Unsupervised AI learning enables the algorithm to recognize anomalies in asset performance data. It looks for something that differs from normal or healthy conditions. Detecting a data anomaly may indicate that there is a problem, but you don’t know what is causing it or what the impact on performance will be.”
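The two learning modes Rackliffe describes can be sketched in miniature. The fault signatures and the simple z-score test below are toy illustrations under stated assumptions, not Hitachi Energy's actual algorithms:

```python
# Toy illustration of the two learning modes described above: a learned
# fault-signature lookup (supervised) and a z-score outlier test on sensor
# readings (unsupervised). Signatures and readings are hypothetical.
import statistics

KNOWN_SIGNATURES = {            # hypothetical trained fault signatures
    (1, 0, 1): "bushing overheating",
    (0, 1, 1): "oil pressure loss",
}

def classify(features: tuple) -> str:
    """Supervised path: match a known, labeled fault signature."""
    return KNOWN_SIGNATURES.get(features, "unknown")

def is_anomaly(history: list, reading: float, threshold: float = 3.0) -> bool:
    """Unsupervised path: flag readings far from the learned 'normal'."""
    mean = statistics.fmean(history)
    std = statistics.stdev(history)
    return abs(reading - mean) > threshold * std

print(classify((1, 0, 1)))                     # known issue: bushing overheating
print(is_anomaly([70, 71, 69, 70, 72], 95.0))  # True: anomaly, cause unknown
```

Note the asymmetry Rackliffe points out: the supervised path names the problem, while the unsupervised path can only say the reading is abnormal.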

Gaskey pointed out, “The transmission grid is being pushed harder every day, and utilities need every advantage they can find. These APM systems are what we call ‘off-the-shelf’ (that is, they are available technologies, not prototypes). Utilities are using them to examine the health and operational status of their assets, and they are proving to be valuable.”

Gaskey explained, “Consider driving a truck to a remote facility when an alarm goes off. The operator first checks the live data to determine what is happening. Is it an intruder, an animal, an equipment failure? If it turns out to be an equipment failure, the repair crew knows what equipment to take with them and loads it on the truck. Before anyone hits the road, the crew knows exactly what they’re going to find and what is needed to correct the problem – no wasted windshield time.”

Rackliffe and Gaskey summarized, “Section 40107 of the landmark US Infrastructure Investment and Jobs Act (IIJA) encourages innovation across a variety of technologies. The IIJA makes grants available to utilities to modernize energy systems. This program offers utilities a chance to apply some of these technologies, like APM systems, with minimal risk. It’s like a boost for modernization. One of the main categories of Section 40107 relates to data analytics enabling grid functions, meaning a utility would be able to apply powerful technologies to solve specific problems it faces in operating its system.”

Growing trend

The art and science of asset management has come a long way with innovative APM systems, which have a reputation for improving the results of the companies that use them. According to GE, “APM is a proven approach to reducing unplanned downtime, lowering maintenance costs, and reducing EH&S (environmental, health and safety) risks.”

APM platforms have moved beyond legacy asset management systems that merely collected and filtered data. These state-of-the-art systems are real-time, end-to-end platforms applying predictive and prescriptive analytics across the entire enterprise database. They are able to identify problems, calculate risks, and recommend the least risky option for repairing or replacing the asset.
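The repair-or-replace recommendation described above is, at heart, a risk-weighted cost comparison. A minimal sketch, with purely illustrative probabilities, consequence costs, and action names:

```python
# Sketch of a prescriptive repair-vs-replace decision: rank candidate
# actions by expected cost (failure probability x failure consequence,
# plus the cost of the action itself). All numbers are illustrative.

def expected_cost(p_failure: float, consequence: float, action_cost: float) -> float:
    """Expected total cost of taking an action on an aging asset."""
    return p_failure * consequence + action_cost

actions = {
    "do nothing": expected_cost(0.30, 500_000, 0),        # high failure risk
    "repair":     expected_cost(0.05, 500_000, 40_000),   # risk reduced, modest cost
    "replace":    expected_cost(0.01, 500_000, 120_000),  # lowest risk, highest cost
}
best = min(actions, key=actions.get)
print(best, actions[best])  # repair 65000.0
```

With these illustrative inputs, repairing wins: it cuts the failure risk enough to justify its cost, while full replacement is not yet worth the premium. A real APM platform would derive the probabilities from the asset health models discussed earlier.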

It is interesting to note that some utilities are adopting a wait-and-see attitude, while others are committing their companies to APM. Both approaches are risky and uncomfortable. What’s in your future?

About Troy McMiller
