About Me

Tim Taylor is a Distribution Industry Solution Executive with Ventyx, an ABB Company. He helps distribution companies understand how advanced distribution management systems (DMS), including SCADA, outage management, mobile workforce management, and business intelligence, can improve their performance. Tim has worked for ABB in a number of R&D engineering, consulting, and business development roles. He has performed distribution planning studies for companies around the world, has developed and taught courses on distribution planning and engineering, and has assisted with due diligence evaluations of electric distribution companies. Tim also worked with GE Energy in a number of roles. He was a Technical Solution Director in the Smart Grid Commercial Group, focusing on distribution system management, automation, and operations. He also worked in T&D application engineering, where he focused on the application of protective relays, surge arresters, distribution transformers, and other equipment. Tim is a Senior Member of IEEE and holds an MS in Electrical Engineering from NC State University and an MBA from UNC-Chapel Hill.

Tuesday, October 4, 2011

Volt/VAR Control - Old Problem, New Solution

Volt/VAR Control.  Volt/VAR Optimization.  Real power and energy loss minimization.  Conservation voltage reduction.   Integrated volt/VAR control.  Model-based volt/VAR optimization.
Whatever you call it, one of the applications receiving the most attention in the smart grid world right now is volt/VAR optimization on distribution feeders.  Many folks, though, don't realize that the problem of optimizing VAR flow, minimizing real power losses, and maintaining a target voltage level for electric loads has been around for as long as electric power systems themselves, roughly 130 years.
While doing some reading about electric power industry pioneers the other day, I discovered that voltage drop and real power losses were a contributing factor to the invention of the incandescent light bulb.  It turns out that back in the late 1870s, the majority of the research on suitable materials for the light bulb filament focused on low-resistance materials.  Thomas Edison chose to go down a different path.  He focused on high-resistance materials for the filament, and his reasoning was this: an economical power system would require small-diameter conductors, because they were much less expensive than larger-diameter conductors.  (Imagine having to run 477 ACSR to every house!)  Edison knew that if a system was going to be built with small conductors, it would require low load currents so that excessive voltage drop and power losses would not be introduced.  And in order to get low currents, he knew that the light bulb, one of the "killer apps" of the electric world at that time, would have to have a high-resistance filament.
So evaluating the economics of the complete power system from the perspective of voltage drop and real power losses drove Edison to focus on high-resistance filaments instead of low-resistance filaments.  This led to his eventual discovery of the carbonized cotton-thread filament, and subsequently the carbonized cardboard filament, which evolved into the commercially viable incandescent light that changed the world.  In other words, the still-present adversaries of voltage drop and real power losses are what led Edison to the high-resistance filament used in the first commercially viable light bulbs.
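To see the reasoning in numbers, here is a minimal sketch. The resistances below are made-up illustrative values, not Edison's actual figures: the same supply voltage feeding a single lamp through a fixed line resistance, once with a low-resistance filament and once with a high-resistance one.

```python
# Illustrative series circuit: supply -> line resistance -> lamp filament.
# All values are assumptions for illustration, not historical data.

def line_performance(v_supply, r_line, r_filament):
    """Return (current, line loss in watts, power delivered to the lamp)."""
    i = v_supply / (r_line + r_filament)          # Ohm's law for the series loop
    return i, i**2 * r_line, i**2 * r_filament    # I^2*R split between line and lamp

for r_f in (2.0, 140.0):  # low-resistance vs. high-resistance filament (ohms)
    i, line_loss, delivered = line_performance(110.0, 0.5, r_f)
    print(f"R_filament={r_f:6.1f} ohm  I={i:6.2f} A  "
          f"line loss={line_loss:7.2f} W  delivered to lamp={delivered:7.1f} W")
```

The high-resistance filament draws a fraction of the current, so the line loss and voltage drop shrink dramatically, which is exactly what makes small conductors workable.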
Now, Edison was advocating a dc (direct current) system, not the ac (alternating current) system that eventually proved to be the winner.  So he wasn't even dealing with the reactive component of the current flows that typically produces the greatest amount of voltage drop and real power losses in distribution systems.   When the ac system of Tesla and Westinghouse eventually won "The War of the Currents," engineers had to deal with the reactive current flow creating voltage drop and real power losses as well.
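A quick back-of-the-envelope sketch shows why the reactive component matters. The feeder voltage, resistance, and load below are assumed values chosen only to illustrate the effect of power factor on I²R losses.

```python
import math

# Hypothetical feeder segment: 12.47 kV three-phase, 1 ohm of resistance per
# phase, serving 1 MW of real load. Only the power factor is varied.
V_LL = 12_470.0       # line-to-line voltage (V)
R_PHASE = 1.0         # conductor resistance per phase (ohms), illustrative
P_LOAD = 1_000_000.0  # real power delivered (W)

for pf in (1.0, 0.9, 0.8):
    i = P_LOAD / (math.sqrt(3) * V_LL * pf)  # line current grows as pf drops
    loss = 3 * i**2 * R_PHASE                # three-phase I^2R loss
    print(f"pf={pf:.2f}  I={i:6.1f} A  losses={loss/1000.0:5.2f} kW")
```

Dropping from unity power factor to 0.8 raises the current by 25 percent and the losses by more than 50 percent for the same real power delivered, which is exactly the flow that capacitors are installed to cancel.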
Over the years, many solutions have been developed to deal with the effects of voltage drop and real power losses on distribution systems.  Capacitors were developed to produce VARs as close as possible to the equipment requiring them and to minimize VAR flow on distribution lines.  Capacitors could be either fixed (connected to the system at all times) or switched on and off the system using control variables such as time, current, VAR, voltage, or temperature.  The load tap changing transformer was developed for changing the voltage at the distribution substation.  Free-standing voltage regulators, which can be installed either in the substation or along the distribution lines, were developed.  Line drop compensation on voltage regulation equipment was developed for improved voltage control under variable load conditions.
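As a rough illustration of line drop compensation, the sketch below uses assumed R and X settings and an assumed load current on a 120 V base: the regulator control estimates the voltage at a remote load center and regulates to that estimate rather than to its own terminals.

```python
import cmath
import math

def compensated_voltage(v_reg, i_load, r_set, x_set):
    """Estimated load-center voltage: regulator voltage minus the modeled line drop."""
    return v_reg - i_load * complex(r_set, x_set)

# All settings and measurements below are illustrative values on a 120 V base.
v_reg = 123.0 + 0j                                # voltage at the regulator terminals
i_load = 2.5 * cmath.exp(-1j * math.radians(25))  # load current, lagging by 25 degrees
v_load_est = compensated_voltage(v_reg, i_load, r_set=1.2, x_set=2.4)

# The control compares this estimate, not its own terminal voltage, to the
# voltage setpoint and raises or lowers taps accordingly.
print(f"estimated load-center voltage: {abs(v_load_est):.1f} V")
```

As the load current grows, the estimated load-center voltage falls faster than the terminal voltage, so the regulator boosts more under heavy load and less under light load.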
One of my friends in the industry said he had actually seen equipment, on the system of the utility he was working for, called "capaciformers."  The capaciformer was a distribution transformer with capacitors built into it.  The concept was that any time you needed to add a distribution transformer to serve new load on the system, you could add a capaciformer.  In this way, the capacitors and their VAR supply would be added inherently as the load grew, since it was known that the load was going to require a VAR supply anyway.  So instead of adding separate capacitors later, you could just add the capacitance when the transformer was installed.  Since today's distribution systems aren't blanketed with capaciformers, they obviously didn't work out, primarily because of their inflexibility in adapting to the large variability in VAR requirements and real power load on the system.  But they do illustrate the effort that has gone into volt/VAR control over the years.
A review of the industry literature also shows that volt/VAR control in distribution systems has always been a popular subject.  It seemed to hit a peak in the late 1970s and 1980s, after the energy crisis of the mid-1970s, but then lay relatively dormant in the 1990s and into the 2000s.  So why all the attention now, on a problem that has been around for 130 years?  There are several reasons.
One, electric power system operating objectives have changed.  Over the last couple of years, minimization of customer demand and minimization of losses have become more important in many locales, and improved volt/VAR control is typically a very cost-effective means to these goals.  Think about this: for a 5,000 MW distribution organization that has to pay an equivalent capitalized charge of $1,000/kW for peak demand, a 1% reduction in peak demand is worth $50M.  And despite the lack of load growth in some locations due to the slow economy, the green/efficiency movement continues, increasing the value of energy reduction measures even when demand reduction is not as critical.
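The arithmetic behind that figure is straightforward; the system size and capitalized cost are just the round numbers used above.

```python
# Value of a 1% peak demand reduction for the example above.
system_peak_kw = 5_000 * 1_000        # 5,000 MW expressed in kW
capitalized_cost_per_kw = 1_000       # $/kW equivalent capitalized charge
reduction_fraction = 0.01             # 1% reduction in peak demand

value = system_peak_kw * reduction_fraction * capitalized_cost_per_kw
print(f"${value:,.0f}")               # $50,000,000
```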
Two, the methods developed for volt/VAR control in the past don't work as well as the model-based volt/VAR methods that have been developed recently.   Figure 1 provides a brief history of volt/VAR control methods.  Older methods have a number of weaknesses, including the fact that they can't keep up with the continuous changes made on a distribution system, both the planning and design changes that happen year to year and the operating changes that occur day to day.  Loads and capacitor banks routinely get transferred between feeders, rendering the older volt/VAR control methods less effective.  The older centralized control methods were also based on heuristics rather than formal mathematical optimization, so results were almost always less than optimal.  The model-based volt/VAR system of today considers the as-operated state of the distribution system and can apply true mathematical optimization to achieve maximum reduction of real power losses and customer demand.


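As a toy illustration of what that optimization means, the sketch below exhaustively searches the discrete controls of a crude one-line feeder model, an LTC tap and two switchable capacitor banks, for the combination that minimizes a weighted loss-plus-demand objective while respecting voltage limits. The model, numbers, and weights are all assumptions for illustration, not any particular vendor's algorithm.

```python
from itertools import product

# Toy one-line feeder: the discrete controls are the LTC tap and two capacitor banks.
# All parameters and the simplified loss/voltage/CVR formulas are illustrative.
P_LOAD = 4.0            # feeder real load (MW)
Q_LOAD = 2.0            # feeder reactive load (MVAr)
R, X = 0.05, 0.10       # lumped feeder impedance (per unit, assumed)
CAP_SIZES = [0.6, 1.2]  # MVAr rating of each switchable bank
TAPS = [0.95 + 0.00625 * t for t in range(17)]  # tap ladder, 0.95-1.05 p.u. in 0.625% steps

best = None
for tap, caps in product(TAPS, product([0, 1], repeat=len(CAP_SIZES))):
    q_net = Q_LOAD - sum(state * size for state, size in zip(caps, CAP_SIZES))
    v_end = tap - (R * P_LOAD + X * q_net) / 10.0      # linearized end-of-feeder voltage
    losses = R * (P_LOAD**2 + q_net**2) / 10.0         # I^2R-style loss proxy
    demand = P_LOAD * v_end**0.8                       # crude CVR effect of lower voltage
    objective = losses + 0.1 * demand                  # arbitrary weighting of the two goals
    if 0.95 <= v_end <= 1.05 and (best is None or objective < best[0]):
        best = (objective, tap, caps, v_end, losses)

obj, tap, caps, v_end, losses = best
print(f"tap={tap:.5f}  caps={caps}  v_end={v_end:.3f} p.u.  losses={losses:.4f}")
```

A real model-based system replaces the toy formulas with an unbalanced power flow over the as-operated network model and a far larger control set, but the structure, an objective minimized over discrete controls subject to voltage constraints, is the same idea.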
Three, leading distribution organizations have been implementing a technology platform that volt/VAR control is able to leverage while sharing the expense of that platform with other distribution processes.  The technology platform includes GIS (geographic information system), which distribution organizations typically use as their record of distribution assets and system connectivity.  The GIS also provides the basis of the operating model for DMS applications such as model-based volt/VAR optimization.  The technology platform also includes DMS and SCADA systems that provide centralized control applications for efficient management of the distribution system, as well as the improved communications systems that distribution organizations have been installing for communicating with field devices and customer AMI meters.   Other processes in the utility are able to leverage these investments, including outage management, feeder monitoring, fault location and restoration switching, work management, and equipment condition monitoring.   When a utility considers the benefits of all these processes and the shared cost of the infrastructure among them, the business case for model-based volt/VAR optimization is very strong.
More information on model-based volt/VAR optimization is contained in a white paper that can be downloaded free of charge from www.ventyx.com.