As part of an industry working group queried by the Department of Energy for a 2015 report*, an anonymous utility engineer made this confession about cleaning up utility data: “It took us one and a half years to go out and collect phase information on every transformer and build a connectivity model for every single customer,” the engineer said. “We identified 44,000 mapping errors in our GIS.”

This forthright engineer worked for a large, urban investor-owned utility that was one of the first in the nation to implement an advanced distribution management system (ADMS). To deploy the software, the utility built a network model, but the effort took more time than planned because data clean-up was a far bigger task than expected. Although utility managers gave the project a three-year timeline, they wound up doubling resources twice and still missed the target deadline.

“If you have an idea that you have a lot of model work to do and you’re at the beginning of the project, you should probably think you’ve got five to 10 times the amount of work that you think you have,” the engineer said. “That’s only a slight exaggeration. It’s an enormous undertaking to get real-time state estimation working on your whole network.”

As more utilities follow this pioneer down the ADMS path, and as they implement distributed energy resource management systems (DERMS) and advanced analytics, they are going to face similar issues. Reliable network models are crucial to implementing these systems, and the models must be built on accurate data. Creating that data is easier said than done; here are some high-level points of guidance.

Validate now and again … and again

What exactly is a network model? Soorya Kuloor, Practice Director of Distribution Operations at Landis+Gyr, defines a network model as a computer representation of the actual physical distribution network utilities operate. The model reflects all the assets, such as transformers, overhead lines, underground cables, switches, capacitor banks and more, as well as network behavior.

It’s akin to computerized representations used in flight simulators, Kuloor says. “A flight simulator is driven by the model of a plane, right? In the same way, any kind of simulation you want to do for an electric distribution network is driven by a network model.”

So are systems like an ADMS, which Gartner defines as “the software platform that supports the full suite of distribution management and optimization. An ADMS includes functions that automate outage restoration and optimize the performance of the distribution grid.”
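
To make that concrete for readers who think in code, here is one way such a model might be sketched in Python. The names here (Asset, NetworkModel, the specific fields) are illustrative assumptions, not the schema of any particular ADMS or GIS product:

    from dataclasses import dataclass, field

    @dataclass
    class Asset:
        """One network element: a transformer, line segment, switch, etc."""
        asset_id: str
        asset_type: str   # e.g., "transformer", "overhead_line", "switch"
        phases: str       # e.g., "ABC", "A", "BC"
        from_node: str    # upstream connection point
        to_node: str      # downstream connection point

    @dataclass
    class NetworkModel:
        """The assets plus the connectivity between them."""
        assets: dict[str, Asset] = field(default_factory=dict)

        def add(self, asset: Asset) -> None:
            self.assets[asset.asset_id] = asset

        def neighbors(self, node: str) -> list[Asset]:
            """Every asset electrically connected to a given node."""
            return [a for a in self.assets.values()
                    if node in (a.from_node, a.to_node)]

Real models carry far more detail (impedances, ratings, switch states) so the software can simulate network behavior, but connectivity and phasing are the backbone, and they are exactly the attributes a GIS most often gets wrong.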

Kuloor continues: “When you get into ADMS or other real-time systems, implications of inaccurate models become more severe. You could electrocute someone.”

But, here’s the problem: There are hundreds of thousands of assets to track. “For large utilities, it’s millions of elements, and some were deployed more than 30 years ago, when they tended not to keep any electronic records of these assets,” Kuloor says. “The typical error happens within the GIS system, which keeps track of these assets.”

How do you find those errors? Kuloor recommends starting with validation software that looks at existing data and identifies potential problems. Using a model validation tool, he says, a utility may be able to take data that’s only 50 percent accurate and bring that accuracy rate up to 95 percent.
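
What might a validation pass look like? Below is a simplified sketch that builds on the NetworkModel structure above; the two rules shown (missing phase designations, phase mismatches between connected assets) are hypothetical examples of the kinds of checks commercial validation tools automate:

    def validate(model: NetworkModel) -> list[str]:
        """Scan the model and report likely GIS errors."""
        problems = []
        for a in model.assets.values():
            # Rule 1: every asset should carry a phase designation.
            if not a.phases:
                problems.append(f"{a.asset_id}: missing phase designation")
                continue
            # Rule 2: an asset cannot carry phases its upstream feed lacks.
            for up in model.neighbors(a.from_node):
                if up.to_node == a.from_node and up.phases:
                    if not set(a.phases) <= set(up.phases):
                        problems.append(
                            f"{a.asset_id}: phases {a.phases} not supplied "
                            f"by upstream {up.asset_id} ({up.phases})")
        return problems

Each run yields a worklist of records to investigate and correct.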

“But the system changes all the time,” Kuloor adds. “People build new things and replace things, so you still need to continue to run your validation software repeatedly. It’s not a one-and-done process.”

At a minimum, Kuloor recommends running validation every two weeks.

Take it outside

Once the data is relatively clean, how do you keep it that way? “The data starts at some point with someone installing something,” says David Price, Chief Technology Officer for Management Consulting at Black & Veatch. Consequently, data accuracy “starts with the discipline with which you configure and deploy devices in the first place.”

In other words, simple quality controls will go far. “There’s a whole piece to this in terms of how good a job you do at defining and building the systems that people use in the construction phase of assets,” Price says.

Day-to-day maintenance must have quality checks built in, too. Often, that involves tasking field crews with verifying data and giving them tools to input and update data as needed. One utility preparing for an ADMS required each service center to check the phase on lines at 10 discrete locations each week. With those two reads per day from field crews, the utility achieved a 96 percent success rate on matching actual phase information with what was noted in the GIS.

Another utility had crews walk the lines for initial verification of phase on each conductor where there was a transition point, then sent two phase detection tools out to each service territory so crews could do regular spot checks on an ongoing basis.
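
One way to track whether such spot checks are paying off is to log each field reading against the GIS record and compute a running match rate, in the spirit of the 96 percent figure above. A hypothetical sketch, reusing the structures from earlier:

    def phase_match_rate(model: NetworkModel,
                         field_readings: dict[str, str]) -> float:
        """Fraction of crew-verified phase readings that agree with the GIS.

        field_readings maps asset_id to the phases observed in the field,
        e.g. {"XFMR-1042": "AB", "XFMR-1107": "C"}.
        """
        checked = matched = 0
        for asset_id, observed in field_readings.items():
            asset = model.assets.get(asset_id)
            if asset is None:
                continue  # an asset absent from the GIS is itself a finding
            checked += 1
            if set(observed) == set(asset.phases):
                matched += 1
        return matched / checked if checked else 0.0

A rate trending toward that 96 percent mark says the GIS and the field are converging; a falling rate says errors are creeping in faster than crews are catching them.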

Follow some rules

Kuloor says utilities should nurture a data culture in which everyone whose hands touch data understands its importance. Utility workers who provided input to the aforementioned DOE report said, “It’s important to identify who is responsible for keeping the data accurate and hold their management accountable.”

According to Price, data governance policies can help utilities create the necessary processes and build awareness. Part of that governance policy should address subcontractors, he says. “How do you make sure that people who do work for you follow your rules and data requirements in such a way that you can receive the information and readily apply it in an enterprise system?” It’s a matter of precisely defining requirements for installation, data capture, connectivity and inspection to verify that things are correct.
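
In practice, “precisely defining requirements” often translates into an intake check that rejects a contractor’s asset records before bad data reaches the enterprise system. The sketch below is a minimal illustration; the required-field list is a hypothetical example, not a standard:

    REQUIRED_FIELDS = {"asset_id", "asset_type", "phases",
                       "from_node", "to_node", "install_date", "inspected_by"}

    def check_submission(record: dict) -> list[str]:
        """Return the data-requirement violations for one submitted
        asset record; an empty list means the record is acceptable."""
        missing = REQUIRED_FIELDS - set(record)
        errors = [f"missing required field: {f}" for f in sorted(missing)]
        if not str(record.get("asset_id", "")).strip():
            errors.append("asset_id must be non-empty")
        return errors

Rejecting an incomplete record at submission time is far cheaper than hunting it down later with validation software or a field crew.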

Price also says governance must accommodate the distributed nature of data within the utility. “Each aspect of asset management generally considers its own systems the system of record,” he notes. That is, the folks working with the SCADA platform would consider themselves masters of the critical operational data that allows them to control the network, while a definitive record of work history related to assets would likely reside elsewhere.

“What matters for an individual company is to decide which systems govern which piece of data and then implement the integration that supports that,” Price says.

And here’s another thing that matters: setting aside the time and resources to get data up to snuff.

Utilities that have been through the process say data clean-up can consume as much as 25 percent of the deployment costs on a software implementation project like an ADMS. If there’s new advanced control and automation software in your future, there’s data clean-up ahead for you, too.