
News Article | January 5, 2016
Site: http://www.aweablog.org

Cities look for new ways to turn on the lights

There’s a new trend going on, and it’s one Americans are going to like a lot. Cities are increasingly turning to renewable energy to power their municipal utilities and local governments. Even better, they’re committing to wind power for much of this clean electricity. That’s good news, because polls show 87 percent of Americans think renewable energy is important to the country’s future.

The latest headliner is San Diego, which recently announced a goal to source 100 percent of its electricity from renewable sources by 2035. By creating such an ambitious target, America’s eighth largest city set the bar high for urban areas across the country.

Greensburg, KS, Aspen, CO, Burlington, VT, and Georgetown, TX were early trailblazers, committing to source 100 percent of their municipal utilities’ electricity from renewable energy. These utilities signed a mixture of power purchase agreements (PPAs) and long-term contracts to purchase wind energy from specific projects. PPAs allow cities to lock in a fixed price for electricity provided by wind power over periods of 10 to 20 years. The long-term contracts provide shelter against volatile fuel prices and help local governments reach their environmental targets.

Other cities are also making important progress, even if they’re not quite at the same levels. This summer, Washington, D.C. signed an agreement to get 35 percent of the city government’s energy from wind, while Austin, TX’s municipal utility Austin Energy contracted the supply of over 1,500 megawatts (MW) of wind power.

The motivations for these moves are varied. Greensburg suffered a devastating tornado in 2007 that destroyed 95 percent of the town’s structures. When it came time to rebuild, Greensburg’s residents decided they wanted to do so in the most sustainable way possible, leading them to choose wind power for 100 percent of their electricity needs.
While not every shift was as dramatic as Greensburg’s, city officials cited a number of reasons for switching to renewable energy. Citizens in Aspen expressed environmental concerns. In Georgetown, leaders said the decision to turn to wind and solar was primarily based on finances. Keith Hutchinson, a city spokesman, explained the new PPAs were cheaper than previous deals with traditional utilities while having the added benefit of protecting citizens from natural gas price fluctuations. “We don’t know what’s going to happen in the future for fossil fuel regulations,” said Hutchinson. “This really removes that element from our price costs going forward.”

The Burlington Electric Department represents the largest municipal utility to date to go 100 percent renewable, with over 40,000 customers.

Washington, D.C. hit all of these themes in its wind PPA announcement. Mark Chambers, Sustainability and Energy Management Director at the city’s Department of General Services, said, “Directly sourcing renewable power costs 30 percent less than fossil fuel-based sources, reduces greenhouse gas emissions by 100,000 tons, and protects our city from volatile energy price increases.”

This innovative thinking about meeting electricity demand is likely to continue in 2016. We’re excited to see which cities join the list next.
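The price-certainty argument for PPAs can be made concrete with a toy calculation. Everything below is hypothetical, a minimal sketch rather than actual contract or market figures: a buyer who locks in a fixed $/MWh rate pays the same bill every year, while a buyer exposed to a fluctuating market price does not.

```python
# Toy sketch of why a fixed-price PPA hedges fuel-price volatility.
# All prices and load figures here are invented for illustration.
import random

random.seed(1)
years = 20
ppa_price = 30.0  # $/MWh, fixed for the contract term (hypothetical)

# Hypothetical market price path: starts at $30/MWh with random yearly swings.
market = [30.0]
for _ in range(years - 1):
    market.append(max(10.0, market[-1] + random.uniform(-8, 8)))

annual_mwh = 100_000  # hypothetical city load

ppa_costs = [ppa_price * annual_mwh for _ in range(years)]
market_costs = [p * annual_mwh for p in market]

# The PPA buyer's annual bill never moves; the market buyer's bill swings.
print(max(ppa_costs) - min(ppa_costs))        # 0.0
print(max(market_costs) - min(market_costs))  # nonzero: varies year to year
```

The point is not the particular numbers but the shape of the risk: the fixed contract removes year-to-year variance entirely, which is what Hutchinson means by removing fuel-price uncertainty “from our price costs going forward.”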


News Article
Site: http://phys.org/technology-news/

The FBI said it was investigating the paralyzing attack on MedStar Health Inc., which forced records systems offline, prevented patients from booking appointments, and left staff unable to check email messages or even look up phone numbers. The incident was the latest against U.S. medical providers, coming weeks after a California hospital paid ransom to free its infected systems using the bitcoin currency.

A law enforcement official, who declined to be identified because the person was not authorized to discuss an ongoing investigation, said the FBI was assessing whether a similar situation occurred at MedStar.

“We can’t do anything at all. There’s only one system we use, and now it’s just paper,” said one MedStar employee who, like others, spoke on condition of anonymity because this person was not authorized to speak with reporters. There were few signs of the attack’s effects easing late Monday, with one employee at Georgetown University Hospital saying systems were still down, and saying some managers had to stay late and come in early because of the disruptions.

Company spokeswoman Ann Nickels said she couldn’t say whether it was a ransomware attack. She said patient care was not affected, and hospitals were using a paper backup system. But when asked whether hackers demanded payment, Nickels said, “I don’t have an answer to that,” and referred to the company’s statement.

MedStar operates 10 hospitals in Maryland and Washington, including the Georgetown hospital. It employs 30,000 staff and has 6,000 affiliated physicians.

Dr. Richard Alcorta, the medical director for Maryland’s emergency medical services network, said he suspects it was a ransomware attack based on multiple ransomware attempts on individual hospitals in the state. Alcorta said he was unaware of any ransoms paid by Maryland hospitals or health care systems.
“People view this, I think, as a form of terrorism and are attempting to extort money by attempting to infect them with this type of virus,” he said.

Alcorta said his agency first learned of MedStar’s problems about 10:30 a.m., when the company’s Good Samaritan Hospital in Baltimore called in a request to divert emergency medical services traffic from that facility. He said that was followed by a similar request from Union Memorial, another MedStar hospital in Baltimore. The diversions were lifted as the hospitals’ backup systems started operating, he said.

Some staff said they were made aware of the virus earlier, being ordered to shut off their computers entirely by late morning. One Twitter user posted a picture Monday he said showed blacked-out computer screens inside the emergency room of Washington Hospital Center, a trauma center in Northwest Washington.

Monday’s hacking at MedStar comes one month after a Los Angeles hospital paid hackers $17,000 to regain control of its computer system, which hackers had seized with ransomware using an infected email attachment. Hollywood Presbyterian Medical Center, which is owned by CHA Medical Center of South Korea, paid 40 bitcoins—or about $420 per coin of the digital currency—to restore normal operations and disclosed the attack publicly. That hack was first noticed Feb. 5, and operations didn’t fully recover until 10 days later.

Hospitals are considered critical infrastructure, but unless patient data is affected, there is no requirement to disclose such hackings even if operations are disrupted. Computer security of the hospital industry is generally regarded as poor, and the federal Health and Human Services Department regularly publishes a list of health care providers that have been hacked with patient information stolen. The agency said Monday it was aware of the MedStar incident.


News Article | March 21, 2016
Site: http://motherboard.vice.com/

There are lots of conversations about the lack of diversity in science and tech these days. But along with them, people constantly ask, “So what? Why does it matter?” There are lots of ways to answer that question, but perhaps the easiest way is this: because a homogenous team produces homogenous products for a very heterogeneous world. This column will explore the products, research programs, and conclusions that are made not because any designer or scientist or engineer sets out to discriminate, but because the “normal” user always looks exactly the same. The result is products and research that are biased by design.

Facial recognition systems are all over the place: Facebook, airports, shopping malls. And they’re poised to become nearly ubiquitous as everything from a security measure to a way to recognize frequent shoppers. For some people that will make certain interactions even more seamless. But because many facial recognition systems struggle with non-white faces, for others, facial recognition is a simple reminder: once again, this tech is not made for you.

There are plenty of anecdotes to start with here: We could talk about the time Google’s image tagging algorithm labeled a pair of black friends “gorillas,” or when Flickr’s system made the same mistake and tagged a black man with “animal” and “ape.” Or when Nikon’s cameras designed to detect whether someone blinked continually told at least one Asian user that her eyes were closed. Or when HP’s webcams easily tracked a white face, but couldn’t see a black one.

There are always technical explanations for these things. Computers are programmed to measure certain variables, and to trigger when enough of them are met. Algorithms are trained using a set of faces. If the computer has never seen anybody with thin eyes or darker skin, it doesn’t know to see them. It hasn’t been told how. More specifically: the people designing it haven’t told it how.
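That training-data point can be shown with a deliberately crude sketch. This is not any vendor’s system: the single “brightness” feature, the group distributions, and the 95/5 split are all invented for illustration. The only claim is the mechanism itself — a model calibrated almost entirely on one group can fail systematically on another.

```python
# Illustrative sketch only: a "detector" that learns an acceptable range
# from a training set dominated by one group will miss faces outside it.
# All numbers are synthetic and hypothetical.
import random

random.seed(0)

# Synthetic one-dimensional "face feature" for two groups.
group_a = [random.gauss(200, 10) for _ in range(950)]  # 95% of training data
group_b = [random.gauss(80, 10) for _ in range(50)]    # 5% of training data
train = group_a + group_b

# The "model" simply learns mean +/- 2 standard deviations of what it saw.
mean = sum(train) / len(train)
std = (sum((x - mean) ** 2 for x in train) / len(train)) ** 0.5

def detects(face):
    """Return True if the face falls inside the learned range."""
    return abs(face - mean) <= 2 * std

# Evaluate on balanced test sets of 100 faces per group.
test_a = [random.gauss(200, 10) for _ in range(100)]
test_b = [random.gauss(80, 10) for _ in range(100)]
rate_a = sum(detects(f) for f in test_a) / 100
rate_b = sum(detects(f) for f in test_b) / 100

print(rate_a)  # group A is detected almost always
print(rate_b)  # group B is almost never detected
```

Real systems learn thousands of features rather than one, but the failure mode is the same: the learned notion of “a face” is shaped by whichever faces dominated the training set.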
The fact that algorithms can contain latent biases is becoming clearer and clearer. And some people saw this coming. “It’s one of those things where if you understand the ways that systemic bias works and you understand the way that machine learning works and you ask yourself the question: ‘could these be making biased decisions?’, the answer is obviously yes,” said Sorelle Friedler, a professor of computer science at Haverford College. But when I asked her how many people actually do understand both systemic bias and the way algorithms are built, she said that the number was “unfortunately small.”

When you ask people who make facial recognition systems if they worry about these problems, they generally say no. Moshe Greenshpan, the founder and CEO of Face-Six, a company that develops facial recognition systems for churches and stores, told me that it’s unreasonable to expect these systems to be 100 percent accurate, and that he doesn’t worry about what he called “little issues,” like a system not being able to parse trans people. “I don’t think my engineers or other companies’ engineers have any hidden agenda to give more attention to one ethnicity,” said Greenshpan. “It’s just a matter of practical use cases.”

And he’s right, mostly. By and large, no one at these companies is intentionally programming their systems to ignore black people or tease Asians. And folks who work on algorithmic bias, like Suresh Venkatasubramanian, a professor of computer science at the University of Utah, say that’s generally what they’re seeing too. “I don’t think there’s a conscious desire to ignore these issues,” he said. “I think it’s just that they don’t think about it at all. No one really spends a lot of time thinking about privilege and status; if you are the defaults you just assume you just are.”

When companies think about error, they see it statistically. A system that works 95 percent of the time is probably good enough, they say.
But that misses a simple question about distribution: Is that 5 percent error spread randomly, or is it an entire group? If the system errs at random, if it fails evenly across all people just based on bad lighting or a weird glare, 5 percent might be a fine error rate. But if it fails in clumps, then that’s a different story. So an algorithm might be accurate 95 percent of the time and still totally miss all Asian people in the United States. Or it might be 99 percent accurate and wrongly classify every single trans person in America.

This becomes especially problematic when, for example, U.S. Customs and Border Protection switches over to biometrics. And at the border we’ve already seen how biometric failures can be extremely painful. Trans people traveling through TSA checkpoints have all sorts of humiliating stories of what happens when their scans don’t “match” their stated identity. Shadi Petosky live-tweeted her detention at the Orlando International Airport in Florida, where she said that “TSA agent Bramlet told me to get back in the machine as a man or it was going to be a problem.” Since then, several more stories of “traveling while trans” have emerged revealing what happens when a biometric scan doesn’t line up with what the TSA agent is expecting. Last year the TSA said they would stop using the word “anomaly” to describe the genitalia of trans passengers.

Facial recognition systems failing, tagging you and your friend as a gorilla or ape, or simply not seeing you because of your skin color, fall clearly into the suite of constant reminders that people of color face every day. Reminders that say: this technology, this system, was not built for you. It was not built with you in mind. Even if nobody did that on purpose, constantly being told you’re not the intended user gets demoralizing.

Facial recognition tech is the new frontier of security, as demoed by this Mastercard app at the 2016 Mobile World Congress.
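The distribution point is simple arithmetic, and a toy calculation makes it concrete. The group sizes and the error pattern below are made up for illustration: the same 95 percent overall accuracy can describe either randomly scattered errors or the total failure of one group.

```python
# Toy illustration: aggregate accuracy hides how errors are distributed.
# Hypothetical population: 950 majority-group members the system always
# classifies correctly, 50 minority-group members it always gets wrong.

population = [("majority", True)] * 950 + [("minority", False)] * 50
# Each tuple: (group, whether the system classified this person correctly).

overall = sum(correct for _, correct in population) / len(population)

minority = [correct for group, correct in population if group == "minority"]
minority_rate = sum(minority) / len(minority)

print(overall)        # 0.95 -- looks fine in aggregate
print(minority_rate)  # 0.0  -- the system fails everyone in the minority group
```

A benchmark that reports only the first number would rate this system “95 percent accurate” while it is 0 percent accurate for an entire group, which is exactly the clumped-failure case described above.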
But that just means another barrier for the many faces forgotten about in the design stage. Image: Bloomberg/Getty

These examples, according to the people I spoke with, are just the tip of the iceberg. Right now, failures in facial recognition aren’t catastrophic for most people. But as the technology becomes more ubiquitous, its particular prejudices will become more important. “I would guess that this is very much a ‘yet’ problem,” said Jonathan Frankle, a staff technologist at Georgetown Law. “These are starting to percolate into places where they really do have trouble.” What happens when banks start using these as security measures, or buildings and airports start using them to let people in and out?

So how do companies fix this? For one thing: they have to admit there’s a problem. And in a field where CEOs aren’t known for their deft handling of race or diversity, that might take a while. One clear thing they can do is hire better: adding more people to your team can only help you predict and curb the bias that might be endemic to your algorithm.

But there are technological solutions too. The most obvious is to feed your algorithm more diverse faces. That’s not easy. For researchers at universities, there are a few face databases available, composed mostly of undergraduate students who volunteered to have their pictures taken. But companies unaffiliated with an institution, like Face-Six, have to either assemble their own by scraping the web, or buy face databases to use for their systems.

Another thing companies can do is submit their algorithms for outside testing. Both Venkatasubramanian and Friedler work on designing tests for algorithms, to uncover hidden bias. The National Institute of Standards and Technology has a program that tests facial recognition systems for accuracy and consistency. But companies have to opt into these tests; they have to want to uncover possible bias. And right now there’s no incentive for them to do that.
The problem of bias in facial recognition systems is perhaps one of the easiest to understand and to solve: a computer only knows the kinds of faces it has been shown. Showing the computer more diverse faces will make it more accurate. There are, of course, artists and privacy activists who wonder whether we want these systems to be more accurate in the first place. But if they’re going to exist, they must at least be fair.


AWEA highlighted the industry’s sustained growth in releasing its U.S. Wind Industry Fourth Quarter 2015 Market Report. The strong market activity is expected to continue, with a bipartisan vote by Congress late last year for a multi-year extension of the Production Tax Credit (PTC), supplying the industry with much-needed policy certainty. As 2016 began, an additional 9,400 MW were under construction.

“The data released today show 2016 presents an extraordinary opportunity for American wind power,” said Tom Kiernan, CEO of the American Wind Energy Association (AWEA). “The time has never been better for states and utilities to lock in low-cost, stably-priced wind energy to achieve their Clean Power Plan carbon reductions. Wind energy is on track to supplying 20 percent of the country’s electricity by 2030.”

The emerging opportunities in American wind power will be on display at this year’s WINDPOWER Conference & Exhibition in New Orleans, May 23-26. Attendees can network with industry leaders at the largest wind power trade show in the Western Hemisphere. Registration and press accreditation are now open.

Wind installations during the fourth quarter of 2015 represent the second strongest quarter ever recorded, and significantly more than the 4,854 MW installed in all of 2014. The total installations across 2015 trail only 2009 and 2012. Altogether there is now 74,472 MW of installed wind capacity in the United States and more than 52,000 operating wind turbines.

“The U.S. wind industry hasn’t seen this kind of rapid growth for years,” said John Hensley, Manager, Industry Data & Analysis, for AWEA. “After surpassing the 70 gigawatt (GW) milestone late last year, American wind power is on the verge of reaching 75 GW of installed capacity in coming months.”

Despite policy uncertainty, American ingenuity and over 500 wind-related manufacturing facilities across 43 states have helped reduce wind power’s costs by 66 percent in just six years.
The extension of the PTC and the alternative Investment Tax Credit through 2019, as part of the budget deal Congress passed in December, keeps American wind power on the path to further advance wind turbine technology and lower its costs.

“Low-cost, stably-priced wind energy is a ‘no-regrets’ solution for states and utilities looking for the best way to meet the Clean Power Plan,” said Kiernan. “Texas ranchers and Iowa farmers know wind power costs one-third as much as it did six years ago. Analysis by the Energy Information Administration confirms that wind energy will make up the majority of states’ lowest-cost Clean Power Plan strategy.”

“Having policy certainty with the tax credits in place overcomes one challenge, but others still exist, including availability of the necessary power lines. New transmission can act as a renewable energy superhighway, delivering low-cost wind energy from rural areas to densely-populated U.S. cities where it’s needed the most,” said Kiernan.

Reports out this week from the Southwest Power Pool and researchers at the National Oceanic and Atmospheric Administration confirm that investments in new transmission pay big dividends to a region and save consumers money, as well as being critical to meeting the nation’s carbon reduction goals.

“ERCOT has experienced several new wind generation records in the past year,” said Dan Woodfin, ERCOT director of System Operations. “These records are the result of several factors, including growth in installed capacity, a significant investment in the transmission system in Texas, and improved forecasts and market tools that help us integrate these resources reliably.”

According to AWEA’s report today, Iowa now ranks second in the nation in total installed capacity with more than 6,000 MW. Oklahoma just surpassed 5,000 MW installed, and New Mexico has become the 17th state to enter the ‘gigawatt club,’ passing the 1,000 MW threshold.
Texas continues to lead the nation with over 17,700 MW of installed capacity, more than twice that of any other state.

“Oklahoma has done a tremendous job of positioning the state as a competitive and attractive market for the wind industry to invest, and the benefits have been significant,” said Stephen Pike, Vice President of Operations and Maintenance for Enel Green Power North America, which last year in Oklahoma brought online more than 420 MW of capacity for a total of nearly one GW in the state. “For many Oklahoman communities, especially those located in rural parts of the state, wind has been a key pillar of economic development and, as a global investor, Enel Green Power is proud to call Oklahoma home and to play a significant role in the state’s energy economy.”

Connecticut also saw its first utility-scale wind project completed during the fourth quarter, bringing the number of states with online wind projects to 40.

Market activity continues to be robust with the industry starting construction on over 1,850 MW of new wind power projects in the fourth quarter. The more than 9,400 MW of projects under construction at the conclusion of 2015 represents a slight decrease from recent records, due to the large number of projects completing construction and coming online in the fourth quarter. However, construction activity is expected to gain momentum in coming quarters with the policy certainty.

U.S. companies, American cities grow market for low-cost wind energy

Fortune 500 companies continued to invest in low-cost wind energy during the fourth quarter as a way to lower their carbon footprints and lock in low-cost, stable electricity prices. Non-utility purchasers accounted for approximately 75 percent of the total MW contracted under wind power purchase agreements signed in the fourth quarter of 2015.
This includes Fortune 500 companies like General Motors, Google Energy, and Procter & Gamble, which announced plans in October to use 100 percent wind power to make iconic brands Tide, Gain, Dawn, Downy, Febreze, and Mr. Clean.

“2015 saw unprecedented adoption of long-term renewable energy from corporate, industrial and institutional customers in the US,” said John Powers, who leads the strategic renewables advisory division at Renewable Choice Energy. “This growth was triggered by the compelling economics in certain grid regions alongside a growing understanding in the developer community of how to meet the unique needs of the corporate buyer. With the extension of the PTC and ITC, we anticipate that this sector will continue its rapid growth; it really is the new market for renewable energy.”

American cities have also increasingly turned to renewable energy to power their municipal utilities and local governments. Greensburg, Kan., Aspen, Colo., Burlington, Vt., and Georgetown, Texas are examples of cities choosing to meet a large share of their municipal electricity needs with renewable energy. These utilities and city governments signed a mixture of PPAs and long-term contracts to purchase wind energy from specific projects.

Despite increased interest from non-utility purchasers, utilities continue to invest strongly in wind power. During the fourth quarter, more than 2,300 MW of the installed capacity had PPA contracts with utilities. Out of the 4,000 MW of wind PPAs signed in 2015, 1,800 MW came during the final quarter of 2015.

PPAs allow purchasers to lock in a fixed price for electricity provided by zero-emission wind power over periods of 10 to 20 years. The long-term contracts protect against volatile fuel prices, helping purchasers reduce risk and saving consumers on the major power grids billions of dollars a year.
A full press kit related to today’s report release can be found at www.awea.org/4QMarket2015. For a library of up-to-date images of wind energy, use this link. Registration and press accreditation for WINDPOWER 2016 in New Orleans, May 23-26, are now open at www.windpowerexpo.org.


News Article | January 7, 2016
Site: http://motherboard.vice.com/

“Human activity is leaving a pervasive and persistent signature on Earth.” So begins one of the more depressing scientific papers I’ve ever read. What follows in “The Anthropocene is functionally and stratigraphically distinct from the Holocene,” a new study published in Science, is a laundry list of human sins that, in total, add up to what its authors say is irrefutable evidence that Earth has entered a human-driven geological epoch that began midway through the 20th century and continues today.

Whether we’re actually living in the Anthropocene (the era of humans, basically) rather than a subdivision of the Holocene, an era that started roughly 11,700 years ago, has been a subject of great debate in scientific circles for the last two decades. Some argue that the Anthropocene started when humans first began making fires and polluting; others have traced it back to around 1610, when European settlers began earnestly making their mark on the Earth as a whole. Still others suggest that humans aren’t capable of making a geologically significant impact on Earth. Or at least they’re not yet.

The paper, published Thursday by 24 well-respected scientists from the Anthropocene Working Group (whose members include scientists from the British Geological Survey, Cambridge University, Berkeley, the University of Nairobi, Harvard, Georgetown, Duke, the Australian National University, etc. etc. etc. and so on) argues that the Anthropocene started in the mid 20th century. If the study is officially recognized by the International Commission on Stratigraphy, the authors say that “not only would this represent the first instance of a new epoch having been witnessed firsthand by advanced societies, it would be one stemming from the consequences of their own doing.”

As you might suspect, the driving forces behind these changes are “accelerated technological development, rapid growth of the human population, and increased consumption of resources.” Let’s take a look at the evidence.
The authors note that “recent anthropogenic deposits, which are the products of mining, waste disposal (landfill), construction, and urbanization contain the greatest expansion of new minerals since the Great Oxygenation Event [2 billion years ago].” More than 98 percent of all elemental aluminum (the metal is not naturally occurring) has been produced since 1950, and the past 20 years account for more than 50 percent of all concrete ever created. The biomass of plastics we’ve manufactured now weighs at least as much as the combined weight of all the human beings on Earth, and “the decay resistance and chemistry of most plastics suggest that they will leave identifiable fossil and geochemical records.”

The remnants of Mir Mine in Russia. Image: Wikimedia

Dams, mining activities, and landfills have “modified sedimentary processes sufficiently to leave clear expressions in river, lake, windblown, and glacial deposits that are often far removed from direct point sources.” Meanwhile, agriculture and livestock farming have transformed countless biomes around the world, and deforestation in the tropics, along with the construction of mountain roads, “is resulting in substantial surface erosion and landslides.” Pollution, farming, and energy use (coal, gasoline, etc.) have resulted in nitrogen and phosphorus levels doubling in soils over the last 100 years.
“Human processes are argued to have had the largest impact on the nitrogen cycle for some 2.5 billion years.” Use of rare earth elements since World War II has resulted in “a global pattern of dispersion in the environment and novel stoichiometric ratios,” while “industrial metals such as cadmium, chromium, copper, mercury, nickel, lead, and zinc have been widely and rapidly dispersed since the mid-20th century.”

We of course haven’t even gotten into the fallout from nuclear bomb testing, which, according to the authors, is “potentially the most widespread and globally synchronous anthropogenic signal.” The scientists note that the fallout “will be identifiable in sediments and ice for the next 100,000 years.”

The researchers write that atmospheric carbon, which is now over 400 parts per million, “was emitted into the atmosphere from 1999 to 2010 ~100 times as fast as the most rapid emission during the last glacial termination.” Most frightening, perhaps, is that Earth should be cooling due to its current orbit cycle around the sun; however, “increased anthropogenic emissions of greenhouse gases have instead caused the planet to warm abnormally fast, overriding the orbitally induced climate cycle.”

The scientists note that we are likely in the beginning stages of a sixth mass-extinction event, but that “evolution and extinction rates are mostly too slow and diachronous to provide an obvious biological marker for the start of the Anthropocene.” The planet does, indeed, still host most of the species that we began the Holocene with. However, we can still use species distribution to mark human impact on the Earth: “Species assemblages and relative abundance have been altered worldwide,” they wrote.
“This is especially true in recent decades because of geologically unprecedented transglobal species invasions and biological assemblage changes associated with agriculture on land and fishing in the sea.” Taken together, the findings noted above are “either entirely novel with respect to those found in the Holocene and pre-existing epochs or quantitatively outside the range of variation of the proposed Holocene subdivisions.” In other words, welcome to the Anthropocene, fellow human.
