Donmez B., National Grid |
Abur A.,Northeastern University
IEEE Transactions on Power Systems | Year: 2011
This paper investigates a practical and commonly encountered problem in state estimation: the placement of measurements to make an otherwise unobservable system observable. This situation arises frequently during daily operation as one or more measurements become unavailable for various unexpected reasons. Under such circumstances, the state estimator resorts to pseudo-measurements in order to recover observability and solve the state estimation problem. Pseudo-measurements that will make the system observable can be selected by applying one of the well-developed network observability analysis techniques repetitively, adding one candidate measurement at a time until the system becomes observable. As the system size increases, this approach may prove computationally expensive. A nonrepetitive solution has also been proposed; however, its computational requirements grow rapidly with the number of observable islands that may exist in large-scale systems. This paper presents an alternative approach, shown to be suitable for large-scale power systems, that provides a compromise between the two existing techniques. Examples are given to highlight the main benefits of the proposed method. © 2010 IEEE.
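The repetitive selection scheme the abstract describes can be sketched with a linear (DC) measurement model: the system is observable when the measurement Jacobian, with the reference-bus column removed, has rank n-1. Below is a minimal illustration of that loop, assuming unit branch reactances and flow-only measurements; the bus numbering, candidate list, and helper names are invented for this sketch and are not the paper's method.

```python
import numpy as np

def flow_row(i, j, n):
    """DC-model Jacobian row for a flow measurement on branch i-j
    (unit branch reactance assumed for simplicity)."""
    row = np.zeros(n)
    row[i], row[j] = 1.0, -1.0
    return row

def is_observable(H, n, ref=0):
    """Observable iff rank of H, with the reference-bus column removed,
    equals n - 1 (all remaining bus angles solvable)."""
    return np.linalg.matrix_rank(np.delete(H, ref, axis=1)) == n - 1

def restore_observability(H, candidates, n):
    """Repetitive approach: try one candidate pseudo-measurement at a
    time, keeping it only if it raises the rank, until observable."""
    added = []
    for cand in candidates:
        if is_observable(H, n):
            break
        trial = np.vstack([H, flow_row(*cand, n)])
        if np.linalg.matrix_rank(trial) > np.linalg.matrix_rank(H):
            H, added = trial, added + [cand]
    return H, added

# 4-bus example: only branch 0-1 is metered; three candidate pseudo flows.
n = 4
H0 = flow_row(0, 1, n).reshape(1, -1)
H, added = restore_observability(H0, [(1, 2), (2, 3), (0, 3)], n)
print(added, is_observable(H, n))   # [(1, 2), (2, 3)] True
```

The cost the abstract refers to is visible here: each candidate triggers a fresh rank computation over the growing Jacobian, which becomes expensive as system size grows.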
News Article | August 27, 2016
Batteries won big in the U.K.'s contest to provide power that can be dispatched quickly to help keep the electricity grid stable, a step that will help the nation expand the amount of renewable energy it uses. National Grid Plc said it will award contracts to eight providers that bid to deliver enhanced frequency response, helping the grid respond to fluctuations on the network when needed, according to a statement from the company on Friday.
Sorghum, first grown more than 6,000 years ago in northeastern Africa, is a drought-resistant, hardy crop with numerous food, feed and fuel applications. Farmers in the southern plains of the United States have been growing this hardy cereal since the 1800s. Researchers recently released 40 varieties of early-flowering sorghum bred for use in cooler, more temperate areas. These early-flowering varieties are critical for spreading the crop to new locations: typical sorghum crops struggle when planted in areas with long days and cold soils. "Sorghum originates in the tropical areas of Africa; it does not like cool temperatures or the long days in temperate climates," said Robert Klein, a researcher at the USDA-ARS and Texas A&M University. As seasons change, the length of the day varies much more in temperate areas than in tropical regions. Sorghum needs day lengths of less than 12 hours and 20 minutes to flower. However, by the time days become short enough in late summer for sorghum crops to flower, it is already too cold for them to survive in temperate climates. The genetic diversity of sorghum and other plants is often preserved in germplasm collections; researchers define germplasm as a living genetic resource such as seed or tissue. This genetic diversity is key. Diseases or pests can spread from one region to another and destroy entire crops. To prevent this, researchers can search germplasm collections and breed crop varieties with natural resistance.
Forty sources of late-maturing sorghum [Sorghum bicolor (L.) Moench] germplasm were converted to early-maturing, dwarf-height BC1F3 families and released by the National Sorghum Foundation, the United Sorghum Checkoff Program, the USDA-ARS, and NuSeed/MMR Genetics. The conversion was accomplished by crossing late-maturing tropical accessions to inbred BTx406 in a short-day nursery, with selection of early-maturing, short genotypes within F2 segregating populations in a long-day nursery.
What does it take to truly change a large utility? Not just cosmetic changes to branding -- but true structural changes around distributed energy deployment and customized offerings for customers. In this week's show, we’ll talk with an industry veteran who’s trying to usher in those changes. Ed White, vice president of New Energy Solutions at National Grid, joins the Gang to discuss the utility's new plan to integrate solar, efficiency, storage, electric cars and grid automation all into one area of the business. It's not an easy task. But we'll talk with White about how he hopes to pull it off. Later in the show, we'll discuss two major Supreme Court decisions on demand response and Obama's landmark climate rule. And we'll finish with a quick discussion of the positive outcome of California's net metering debate.
The global market for smart meters (also known as advanced metering infrastructure) isn’t just growing, it’s getting more complicated -- and while utilities don’t always keep up with the investments needed to manage this technology complexity, they can’t put them off forever. That means big spending ahead for the data management and analytics software needed to make the most of the modern advanced metering infrastructure (AMI) network, according to GTM Research’s latest report, Utility AMI Analytics at the Grid Edge: Strategies, Markets and Forecasts. The report projects that utilities around the world will spend $10.1 billion on AMI analytics solutions and integration services through 2021, a significant increase over spending so far this decade. This boom in spending will be driven by two key changes in the market since the first big wave of AMI deployments about a decade ago. First, the next wave of regulator-mandated smart meter rollouts will need to do a lot more than serve as digital replacements for meter readers. Second, the rise of distributed energy resources (DERs) on the edge of the grid is starting to open up revenue opportunities for utilities that can access the data to take advantage of them. These two trends are pushing utilities to confront the current limits of their AMI data collection and analytics, and to start enabling the advanced features they’re looking to provide. Here's a breakdown of why this is happening today, and how utilities are reacting with their investment dollars. - Nobody’s justifying AMI deployments on meter reading alone. “Tapping into low-hanging fruit such as increased operational efficiency through improved meter-to-cash processes no longer provides the overwhelming benefits that drove early AMI projects at U.S. electric utilities,” the report notes. That was OK for the earliest round of deployments, or those backed by American Recovery and Reinvestment Act stimulus funds.
But nowadays, AMI networks need new value streams for positive business cases, including conservation through voltage reduction, improved customer engagement, demand management, and improved revenue assurance practices. We’ve seen some shining examples of these next-generation requirements from this year’s mega-deals in the smart meter space, including Con Ed’s $1.3 billion AMI rollout plan and National Grid’s proposed 1.3 million-meter deployment in Massachusetts. - AMI is just the first IT platform that requires integration. The scope of investment GTM Research is predicting over the next five years won’t be limited to just the utility AMI system, or the meter data management (MDM) system, or any other single system -- but to all of the above. “Many of the highest-value closed-loop AMI applications require data streams from the integration of traditional utility IT systems as well as operational systems,” the report notes, adding another layer of complexity to the data management challenge. “Inconsistency of input data formats -- including structured, unstructured, time series or transactional -- requires standardization and the application of an extract, transform and load (ETL) engine prior to performing analytics.” - The evolution of IT architectures has opened the door to more nimble, interconnected AMI analytics. We’ve come a long way from the traditional utility IT infrastructure, in which different IT systems (GIS, CIS, OMS, EMS, MDM, etc.) were more or less siloed off from one another. The first big shift in this paradigm came with the rise of service-oriented architectures, which translate and route data between enterprise systems using a common data bus, reducing integration complexity and improving upgrade and interface management.
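The ETL standardization step the report describes can be sketched in a few lines: extract records from feeds with inconsistent formats, transform them into one canonical shape, then load them into a common store for analytics. This is a hypothetical illustration, not any vendor's pipeline; the two feed formats, the field names (`device`, `readAt`, `interval_usage_wh`), and the in-memory load target are all invented for the example.

```python
import json
from datetime import datetime

def extract_csv(line):
    """Extract a time-series CSV reading: 'meter,ISO-timestamp,kWh'."""
    meter_id, ts, kwh = line.strip().split(",")
    return {"meter_id": meter_id, "timestamp": ts, "kwh": float(kwh)}

def extract_json(event):
    """Extract a transactional JSON event with different field names
    and units (watt-hours instead of kilowatt-hours)."""
    d = json.loads(event)
    return {"meter_id": d["device"], "timestamp": d["readAt"],
            "kwh": d["interval_usage_wh"] / 1000.0}

def transform(record):
    """Standardize the timestamp string to an aware UTC datetime."""
    record["timestamp"] = datetime.fromisoformat(
        record["timestamp"].replace("Z", "+00:00"))
    return record

def load(store, record):
    """Load step: append (timestamp, kWh) to a store keyed by meter."""
    store.setdefault(record["meter_id"], []).append(
        (record["timestamp"], record["kwh"]))

store = {}
load(store, transform(extract_csv("MTR-001,2021-06-01T00:15:00Z,1.25")))
load(store, transform(extract_json(
    '{"device": "MTR-001", "readAt": "2021-06-01T00:30:00Z",'
    ' "interval_usage_wh": 1300}')))
print(store["MTR-001"])
```

The point of the sketch is the quote's claim: heterogeneous inputs must be normalized (names, units, timestamp conventions) before any cross-system analytics can run on them.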
The second has been the rise of cloud infrastructure, open-source data management software and virtualization-enabled scalable data storage and processing, which have opened opportunities for utilities and their vendors to put AMI data to use in ways that would have otherwise been prohibitively expensive or complex. - Utilities’ data analytics needs cover the range of IT architectures available. The problem for utilities in reaching this vision lies in their recent past, the report notes. Many have built themselves into a mix of standalone, point-to-point integrations: “System-wide solutions that support near-real-time aggregation and analysis of the volume and granularity of data provided by smart meters have yet to be successfully implemented in the utility industry. This has led most utilities to rely on existing database structures, delaying development of high-performance use cases.” That’s why GTM Research predicts continued demand for “turnkey solutions that can be rapidly integrated with existing back-end IT systems with limited customization,” along with analytics to improve the performance of already-deployed AMI networks, the report notes -- utilities want to make better use of what they already have. - DERs make smarter meters and analytics necessary. The growth in distributed energy resources (DERs) has become an increasingly important part of grid operations, customer relations and long-term economic planning, and the need for data analytics to manage this interplay “increases the need for utilities to receive timely, consistent net-load data from customer sites to improve situational awareness, enable utilities to offer customers specialized rates, and ensure accurate and efficient billing,” the report notes. 
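The "timely, consistent net-load data" the report calls for reduces, at its simplest, to pairing two interval channels per customer site. A toy illustration follows; the channel values and the 15-minute interval are assumptions made for the example, not data from the report.

```python
# Net load = metered consumption minus on-site DER generation, per interval.
# Negative values mean the site is exporting power to the grid -- exactly
# the signal a utility needs for specialized rates and accurate billing.
consumption_kw = [5.0, 4.5, 3.5, 4.0]   # metered demand, 15-min intervals
generation_kw  = [0.0, 2.5, 6.0, 1.0]   # rooftop PV output, same intervals

net_load_kw = [c - g for c, g in zip(consumption_kw, generation_kw)]
exporting = [n < 0 for n in net_load_kw]
print(net_load_kw)   # [5.0, 2.0, -2.5, 3.0]
print(exporting)     # [False, False, True, False]
```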
These requirements are driving limited deployments in countries and states where AMI has not been mandated, such as Germany, and spurring investment in states undergoing major regulatory reforms, like New York under its Reforming the Energy Vision (REV) initiative, or California with its Integration of Distributed Energy Resources (IDER) proceeding. The distributed energy data management challenge also presents utilities with a new set of actors -- DER-owning customers, or third-party operators and aggregators -- that need to be involved. - Stimulus and mandates are huge drivers for region-by-region investment. “The 2009 American Recovery and Reinvestment Act and California’s smart-meter mandate catalyzed the U.S. market, leading to the deployment of nearly 30 million smart meters in less than a decade,” the report notes. Meanwhile, the European Union, China, and Japan have all enacted legislation mandating some level of adoption of smart meters as part of broader clean energy and smart city initiatives. Each offers the potential for tens of millions -- or in China’s case, hundreds of millions -- of new smart meters per country. Of course, what governments give, they can take away, as we’ve seen from slowed and reduced AMI rollouts in Europe.