
News Article
Site: http://phys.org/technology-news/

The study, published in the journal Science Advances by researchers from Universidad Carlos III de Madrid (UC3M) together with scientists from NICTA (National Information and Communications Technology Australia) and the University of California, San Diego, concludes that it is possible to determine the damage caused by a natural disaster in just a few hours by using data from social networks. "Twitter, the social network which we have analyzed, is useful for the management, real-time monitoring and even prediction of the economic impact that disasters like Hurricane Sandy can have," says one of the researchers, Esteban Moro Egido, of UC3M's Grupo Interdisciplinar de Sistemas Complejos (Complex Systems Interdisciplinary Group, GISC).

The research was carried out by analyzing Twitter activity before, during and after Hurricane Sandy, which in 2012 became one of the costliest storms in US history, with an economic impact in the region of 50 billion dollars. Hundreds of millions of geo-located tweets referring to the storm were collected from fifty metropolitan areas in the USA. "Given that citizens were turning to these platforms for communication and information related to the disaster, we established a strong correlation between the route of the hurricane and activity on social networks," explains Esteban Moro.

The main conclusion of the study emerged when the social network activity data were examined alongside both the levels of aid granted by the Federal Emergency Management Agency (FEMA) and insurance claims: there is a correlation between per capita social network activity and per capita economic damage in the areas where that activity occurs. In other words, both real and perceived threats, along with the economic effects of physical disasters, are directly observable through the strength and composition of the flow of messages from Twitter.

Furthermore, the researchers verified the results obtained for Hurricane Sandy and demonstrated that the same dynamic also occurs for floods, storms and tornadoes, wherever there is sufficient activity on social media to extract such data. Communication on Twitter therefore allows the economic impact of a natural disaster on the affected areas to be monitored in real time, supplementing the information currently used to assess damage from these disasters. Moreover, the spatial distribution of event-related messages can help the authorities monitor and evaluate emergencies, improving responses to natural disasters.

The authors of the study note that the frequency and intensity of natural disasters are increasing as a consequence of climate change. "We believe that this is going to cause even more natural disasters and, therefore, the use of social networks will allow us to obtain useful supplementary information," points out Professor Esteban Moro, who is currently working on further research in this area. "We are trying to see if there is a relationship between activity on social networks and climate change which will affect us in the future."
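The relationship the researchers describe is, at its core, a per-capita correlation across metropolitan areas. Here is a minimal sketch of that comparison in Python; every figure below is made up for illustration, since the study's actual inputs were FEMA aid and insurance-claim totals for roughly fifty US metro areas:

```python
# Sketch of the study's central comparison: per-capita Twitter activity
# vs. per-capita economic damage across metropolitan areas.
# All numbers are hypothetical placeholders, not the paper's data.
from statistics import correlation  # Pearson's r; Python 3.10+

# Hypothetical (area, tweets per capita, damage in $ per capita) rows.
areas = [
    ("Area A", 0.042, 310.0),
    ("Area B", 0.018, 95.0),
    ("Area C", 0.006, 12.0),
    ("Area D", 0.031, 240.0),
    ("Area E", 0.011, 48.0),
]

tweets_pc = [row[1] for row in areas]
damage_pc = [row[2] for row in areas]

# The study reports a strong positive relationship of this kind.
r = correlation(tweets_pc, damage_pc)
print(f"Pearson r = {r:.3f}")
```

If the damage figures are heavy-tailed, a rank-based coefficient such as Spearman's would be an equally reasonable choice for this kind of comparison.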
More information: Y. Kryvasheyeu, H. Chen, N. Obradovich, E. Moro, P. Van Hentenryck, J. Fowler, M. Cebrian, "Rapid Assessment of Disaster Damage Using Social Media Activity," Sci. Adv. 2, e1500779 (2016). DOI: 10.1126/sciadv.1500779, http://advances.sciencemag.org/content/2/3/e1500779

News Article | March 12, 2016
Site: http://motherboard.vice.com/

In the weeks after Hurricane Katrina devastated New Orleans in 2005, the Federal Emergency Management Agency (FEMA) came under intense scrutiny for its bungled relief efforts, which left thousands of residents trapped in the city without access to the basic necessities FEMA had been tasked with delivering. The explanations and excuses for FEMA's inability to provide proper relief to the denizens of New Orleans were manifold, yet one can hardly help but wonder whether the relief effort might've been more effective in saving lives and mitigating discomfort had social media been available as a tool for emergency responders and victims.

In recent years, we've seen social media become a powerful tool for everything from revolutionaries in Egypt to first responders in New York in the aftermath of Hurricane Sandy. Users have put Facebook, Twitter, and other platforms to use for everything from tracking down loved ones to distributing relief supplies, but when Katrina struck, Facebook was just over a year old and Twitter wouldn't appear on the scene for another seven months. Would they have made a difference for the victims of Katrina? It's tough to say with absolute certainty, but according to a study published Friday in Science Advances, the answer is likely yes.

In the aftermath of a natural disaster, FEMA does complex modeling that considers everything from geography to infrastructure to the characteristics of the disaster itself in order to determine where the most severe damage is likely to have occurred. This allows the organization to distribute supplies in a timely manner to people in the regions most affected by the event, at least hypothetically. As the United States saw in the aftermath of Hurricane Sandy in 2012, and even more so in the wake of Katrina, inaccurate damage mapping can add weeks, if not months, to the time it takes relief supplies to reach those most in need. To save more lives, relief workers need better maps.

According to the team of researchers led by Yury Kryvasheyeu from Australia's National Information and Communications Technology Research Centre of Excellence, one of the most accurate ways to map damage in the aftermath of a disaster is to map the tweets about the event. In fact, using tweets to predict damage gave results slightly more accurate than the complex data modeling used by FEMA.

To arrive at this conclusion, the team examined all the tweets between October 15 and November 12, 2012 that referenced Hurricane Sandy by looking for keywords such as "hurricane," "Sandy," "Frankenstorm," and "flooding." Although some of these tweets already had associated map coordinates, others did not, so the team analyzed user accounts to determine the location of the tweets. When all was said and done, the team had a data set of nearly 10 million tweets from more than 2 million user accounts.

As the team found, those closest to the storm and most affected by its fallout were the most likely to be talking about the event. Yet in order to account for extraneous variables that might skew the data (media reports might stoke irrational or inflated fear in people who were not severely affected by the storm, for instance), the team compared their tweet maps with data about Hurricane Sandy damage that had been collected by FEMA and the state governments of New York and New Jersey.
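The collection step described above reduces to keyword matching over a tweet stream, preferring GPS coordinates and falling back to the account's self-reported location. A minimal sketch under those assumptions; the record layout and field names here are hypothetical, not Twitter's actual API schema:

```python
# Sketch of the filtering step: keep tweets mentioning Sandy-related
# keywords, then locate each one by geotag if present, otherwise by
# the user's profile location (as the researchers did).
KEYWORDS = {"hurricane", "sandy", "frankenstorm", "flooding"}

def is_sandy_related(text: str) -> bool:
    words = text.lower().split()
    return any(k in words for k in KEYWORDS)

def locate(tweet: dict) -> str | None:
    # Prefer precise coordinates; fall back to the profile location,
    # which may need geocoding downstream.
    if tweet.get("coordinates"):
        return tweet["coordinates"]
    return tweet.get("user_location")

stream = [
    {"text": "Frankenstorm is coming", "coordinates": None,
     "user_location": "Hoboken, NJ"},
    {"text": "lunch was great", "coordinates": "(40.7, -74.0)",
     "user_location": None},
]

sample = [(t["text"], locate(t)) for t in stream if is_sandy_related(t["text"])]
print(sample)  # [('Frankenstorm is coming', 'Hoboken, NJ')]
```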
When the team compared these two data sets, they found that Twitter was actually slightly better at predicting the location and severity of the damage than FEMA's own models. These results are encouraging for the future of social media as a tool for mitigating the fallout from natural and manmade disasters, but before FEMA and other disaster relief organizations abandon their own models in favor of social media modeling, more research needs to be done to account for variables that might skew the data, such as Twitter bots and people who aren't on social media. Moreover, if similar studies are done with Facebook, which has a larger user base, the results may become more precise and thus more useful in facilitating disaster relief.

"The correlation that we observed is not uniformly definitive in its strength for all events, and care should be taken in the attempt to devise practical applications," the team wrote. "However, we believe that the method can be fine-tuned and strengthened by combination with traditional approaches. Our results suggest that, during a disaster, officials should pay attention to normalized activity levels, rates of original content creation, and rates of content rebroadcast to identify the hardest hit areas in real time."
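The three signals named in that closing quote reduce to simple per-area ratios. A rough sketch, again with a hypothetical schema (the `area` and `is_retweet` fields are assumptions for illustration, not the paper's actual data model):

```python
# Sketch of the three per-area signals the authors mention: normalized
# (per-capita) activity, rate of original content, and rebroadcast rate.
from collections import defaultdict

def area_signals(tweets: list[dict], population: dict[str, int]) -> dict:
    totals = defaultdict(int)
    retweets = defaultdict(int)
    for t in tweets:
        totals[t["area"]] += 1
        retweets[t["area"]] += t["is_retweet"]  # bool counts as 0/1
    signals = {}
    for area, n in totals.items():
        signals[area] = {
            "activity_per_capita": n / population[area],
            "original_rate": (n - retweets[area]) / n,
            "rebroadcast_rate": retweets[area] / n,
        }
    return signals

tweets = [
    {"area": "NYC", "is_retweet": False},
    {"area": "NYC", "is_retweet": True},
    {"area": "NJ", "is_retweet": False},
]
print(area_signals(tweets, {"NYC": 8_400_000, "NJ": 8_900_000}))
```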

News Article | January 22, 2016
Site: http://www.theenergycollective.com/rss/all

Whenever I hear that environmental protection is a partisan issue, I'm reminded of New York City Mayor Fiorello LaGuardia's famous statement that there is no Democratic or Republican way to pick up the garbage. The provision of clean air, safe drinking water, solid waste management and flood control are all basic public services that people who pay taxes expect to receive. Too bad the folks running Flint, Michigan, and the state of Michigan didn't get that memo. It's also too bad that the federal Environmental Protection Agency sat on the sidelines and allowed Michigan to damage Flint's water supply.

In the spring of 2014, the city of Flint decided to stop using Detroit's water system and instead began pumping its water from the Flint River. This was a cost-cutting measure designed to be temporary until the city could connect to a regional water system, then under construction. In September of 2015, the Associated Press reported: "A group of doctors led by Dr. Mona Hanna-Attisha of Hurley Medical Center urges Flint to stop using the Flint River for water after finding high levels of lead in the blood of children. State regulators insist the water is safe."

While the city has now switched back to the Detroit water system, the water from the Flint River damaged the city's water pipes and released lead and other pollutants from the pipes into the water supply. Had the state required corrosion protection chemicals to be added to the Flint River's water, the lead pollution might have been avoided, but the state agency neglected to impose this requirement. To use the public water system now, in-home filters must be installed and changed frequently to ensure that the water is safe.

Last week, President Obama signed a declaration stating that Flint is under a state of emergency and requiring the Federal Emergency Management Agency to provide funds for filters and other remedial actions. Unfortunately, since this is a human-made disaster rather than a natural one, the funding available is capped at $5 million, although the cap could be raised by a specific, though unlikely, act of our dysfunctional Congress. According to Paul Egan and Todd Spangler of the Detroit Free Press, Michigan Governor Rick Snyder requested a disaster declaration but instead received an emergency declaration, probably because the law typically doesn't apply to human-made disasters. As Egan and Spangler reported, the cost of this cost-cutting measure will be at least $100 million, and that does not include the cost of health care resulting from lead poisoning or the productivity lost when people go hunting for clean water they were once able to obtain easily from their faucets.

Flint has had a tough time throughout the late 20th and early 21st centuries, as the auto industry and other manufacturers abandoned this once-thriving town. The water crisis is really kicking a good town when it's down. Flint's water crisis is not a natural disaster but a disaster of poor management based on an ideology of cost-cutting at all costs. As Ryan Felton reported in the Guardian, Michigan's state environmental agency kept insisting the water was safe, but anyone with a sense of sight, smell and taste knew that the state's bureaucrats were wrong. Finally, Dan Wyant, the head of Michigan's environmental agency, resigned, and Governor Rick Snyder apologized for the human-made disaster that took place on his watch.
But it is not simply the state agency or the governor who should accept blame; the federal EPA and President Obama share it as well. Hillary Clinton's and Bernie Sanders's efforts to make this a partisan issue are cynical presidential primary pandering. This is a bipartisan mess-up. The federal government sets the drinking water standards in America, even though monitoring and administration are delegated to the states. The federal EPA had the authority and responsibility to intervene. The failure in Flint belongs to all of us, and it should lead to some hard thinking about the causes of this completely avoidable environmental disaster.

It starts with a careless and poorly thought-through engineering decision. Before the water source was changed, there should have been an analysis of the possible impact of changing water sources. First, the water itself needed to be analyzed to see if the two sources were different. Second, the impact of the new water on the city's water tanks, pipes and pumps needed to be analyzed. Third, the changeover should have been preceded by a pilot test to ensure that the on-the-ground reality matched the theory of the design analysis. It is not clear that these steps were undertaken, and if they were, clearly the data or the risk assessment was inadequate.

The fundamental concept of sustainability management is that CEOs and COOs must know enough science to manage what I have been calling the physical dimensions of sustainability: water quality and quantity, toxicity, waste, energy efficiency, environmental impacts and the impact of toxics on ecosystems and human health. Just as a manager must be able to read a financial statement and understand an analysis of marketing focus groups, that manager must understand enough science to make decisions about an organization's use of and impact on natural systems.

Over and over again we see companies and governments making short-term decisions to save money, only to see these "pragmatic" decisions cost more money when they must be reversed: Fukushima's inadequate sea wall, VW's deceptive software, BP's reckless contracting in the Gulf of Mexico, GE's dumping of PCBs in the Hudson River. The list is long and getting longer.

We live on a more crowded planet, and to maintain and grow our economy we must learn to be more careful in our use of natural resources. This is not an impossible task. We simply must move past short-term expedience and the type of thinking that holds that "in order to make an omelet you've got to break some eggs." We need to use our analytic, information and communication resources to do a better job of managing human impact on the environment. While this may raise some costs in the short term, it will lower costs in the long term. As we get better at managing our activities, we will learn more about how to produce and protect simultaneously, and the price of protecting the environment will go down.

All over the world, from China to India and from West Virginia to the city of Flint, Michigan, poor management is harming the environment, public health, and everyone's pocketbook. There are no shortcuts, and the sooner the people running our governments and businesses figure that out, the sooner we can proceed with the real work of growing our economy without destroying our home planet.

News Article | January 12, 2016
Site: http://www.techtimes.com/rss/sections/science.xml

NASA has finally formalized a defense program that has kept a fairly low profile for quite some time, one meant to detect and defend our planet against alien bodies. No, they're not body-snatchers (Invasion of the Body-Snatchers) or Yeerks (Animorphs) or even Aliens with a capital "A": rather, they're celestial bodies, and the number one priority of NASA's newly formed Planetary Defense Coordination Office is to make sure the myriad near-Earth objects (or, as NASA calls them, NEOs) stay out of our atmosphere.

The Planetary Defense Coordination Office, or PDCO, is in charge of classifying asteroids, comets and other celestial bodies whose orbits bring them close to Earth. According to an official statement announcing the formation of the PDCO, the department's responsibilities also include "coordinating interagency and intergovernmental efforts in response to any potential impact threats," meaning that it can collaborate with other domestic departments like the Federal Emergency Management Agency (FEMA) and come up with contingency plans in the event of an impact.

"Asteroid detection, tracking and defense of our planet is something that NASA, its interagency partners, and the global community take very seriously," said NASA's John Grunsfeld, the associate administrator for the agency's Science Mission Directorate. "While there are no known impact threats at this time, the 2013 Chelyabinsk super-fireball and the recent 'Halloween Asteroid' close approach remind us of why we need to remain vigilant and keep our eyes to the sky."

Even if there is nothing pressingly imminent, there sure is a lot out there: according to NASA, about 13,500 NEOs have been discovered to date, with about 1,500 new ones found every year on average, and those stats have only been compiled since 1998, when NASA began surveying for them.

So how does NASA detect an NEO to begin with? With help from NEOWISE, a space-based infrared telescope; the resulting observations are catalogued by the International Astronomical Union's Minor Planet Center. Once an NEO is spotted and tracked, the data is sent to NASA's Center for NEO Studies, which is run out of the agency's Jet Propulsion Laboratory in Pasadena, Calif. But even though these departments are already in place, they only focus on keeping an eye on the NEOs they find, not necessarily coming up with what to do if one gets a little too close for comfort.

"The formal establishment of the Planetary Defense Coordination Office makes it evident that the agency is committed to perform a leadership role in national and international efforts for detection of these natural impact hazards, and to be engaged in planning if there is a need for planetary defense," said NEO program executive Lindley Johnson, who has now taken the lead at the PDCO under the title Planetary Defense Officer.

Part of what made the office's formation possible is a federal allocation of funds: NASA was recently given $50 million by the government specifically for NEO observation and for laying the groundwork for actualized planetary defense efforts. This is a major step up from as recently as 2012, when NASA was given $20.4 million to launch its Grand Asteroid Challenge, a program dedicated to seeking out asteroid threats (funding for this was doubled to $40 million in 2014). Before that, only $4 million was designated for NEO research annually.

News Article | January 10, 2016
Site: http://www.techtimes.com/rss/sections/science.xml

In case any potentially catastrophic asteroids are looming around threatening the Earth, NASA is prepared to defend the planet, albeit not in the way superheroes would. On Thursday, the space agency formally founded its new asteroid detection program, the Planetary Defense Coordination Office (PDCO). The PDCO is the organization assigned to coordinate all of NASA's efforts to catalog and classify Near-Earth Objects (NEOs) that may damage our planet. Even before the institution of the PDCO, NASA had long been engaged in global planning for planetary defense; the PDCO will improve and expand those efforts while working with the Federal Emergency Management Agency (FEMA) and other federal agencies and departments.

Lindley Johnson, who now holds the title Planetary Defense Officer in addition to being the executive of the NEO Observations program, said the establishment of the PDCO makes it apparent that NASA is truly committed to national and international efforts to detect natural impact hazards. The PDCO will issue notices of close passes and warnings if it detects any potential impacts. The office will also assist with coordination across U.S. government agencies; participate in planning possible responses to an actual impact threat; and work with FEMA, the U.S. Department of Defense and international counterparts.

What Happens If A NEO Such As A Comet Or An Asteroid Hits Earth?

If you were born before the late 1990s, then you probably know what the film "Deep Impact" is all about: a 7-mile-long comet threatens to collide with Earth, and a joint United States-Russia team is sent to destroy it. If they fail, humanity is doomed. A 7-mile-long comet is definitely huge and can cause massive damage, but scientists say it is not enough to wipe out the entire planet. According to a research team that published a paper in the journal Astronomy & Geophysics, a 31- to 61-mile-wide centaur, a term for a massive comet, is more likely to destroy Earth. Collisions with centaurs occur roughly once every 40,000 to 100,000 years, the team wrote.

Comets are different from asteroids, of course: comets are made up of ice, dust and rocky material, while asteroids are made up of metals and rocky matter. Even though the two are different, a 60-mile asteroid could also obliterate our planet. University of Colorado geoscientist Brian Toon said an asteroid half a mile in diameter can do a lot of damage and cause widespread earthquakes, with an energy equal to 100 billion tons of TNT (a rough back-of-envelope check of that figure appears below). Just as with comets, that kind of damage won't be completely catastrophic. Scientists believe that the asteroid that killed the dinosaurs was about 7 to 8 miles wide; it's not a coincidence that the "Deep Impact" writers chose a 7-mile comet. Luckily, planetary sciences professor Richard Binzel said that there are currently no known asteroids in orbit big enough to wipe out Earth.

Keep Our Eyes To The Sky

Although no catastrophic asteroid impact has occurred since the one blamed for the extinction of the dinosaurs, more than 13,500 NEOs have already been detected by astronomers, and about 1,500 new NEOs turn up every year, NASA said. On Halloween last year, an asteroid the size of five and a half football fields flew past Earth: NASA managed to capture radar images of asteroid 2015 TB145, which passed by the planet on the morning of Oct. 31.
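Toon's 100-billion-ton figure above can be sanity-checked with the standard kinetic-energy formula E = 1/2 mv^2. A back-of-envelope sketch, assuming a rocky density of about 3,000 kg/m^3 and an impact speed of 25 km/s (both typical textbook values, not numbers from the article):

```python
# Back-of-envelope impact energy for a half-mile-wide rocky asteroid.
# Density and velocity are assumed typical values, not from the article.
import math

DIAMETER_M = 0.5 * 1609.0        # half a mile, in meters
DENSITY = 3000.0                 # kg/m^3, rocky asteroid (assumption)
VELOCITY = 25_000.0              # m/s, typical impact speed (assumption)
TNT_JOULES_PER_TON = 4.184e9     # energy released by 1 ton of TNT

radius = DIAMETER_M / 2
mass = DENSITY * (4 / 3) * math.pi * radius**3   # sphere, ~8e11 kg
energy_j = 0.5 * mass * VELOCITY**2
tons_tnt = energy_j / TNT_JOULES_PER_TON

print(f"~{tons_tnt:.2e} tons of TNT")  # on the order of 10^10-10^11 tons
```

With those assumptions the estimate lands within a factor of two of Toon's figure, which is about as close as back-of-envelope numbers of this kind get.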
John Grunsfeld, the associate administrator of NASA's Science Mission Directorate, said there are currently no known impact threats, but the Chelyabinsk super-fireball and the close approach of the Halloween asteroid should remind scientists to remain vigilant and "keep our eyes to the sky." Scientists detect NEOs using ground-based telescopes located around the world, and NASA uses its NEOWISE mission, which identifies asteroids and comets in images collected by the Wide-field Infrared Survey Explorer (WISE) spacecraft.

Why The PDCO Was Created

Before the formation of the PDCO, NASA's NEO program had been searching for space rocks since 1998. Congress funded the program to find 90 percent of the potentially dangerous NEOs with diameters of over one kilometer (0.6 miles). To date, about 911 such large NEOs have been recorded, the agency said, accounting for an estimated 93 percent of those objects. NASA's next goal is to identify 90 percent of NEOs with diameters of 140 meters and larger by 2020. These asteroids are more difficult to detect because of their smaller size, and they are also far more abundant than large NEOs.

Apart from that, the NEO program had become entangled in a complicated financial and administrative mess: its rapidly expanded budget was spent without clear lines of supervision. "Even with a ten-fold increase in the NEO Program budget in the past five years - from $4 million in fiscal year (FY) 2009 to $40 million in FY 2014 - NASA estimates that it has identified only about 10 percent of all asteroids 140 meters and larger," said Paul Martin, NASA's inspector general, who audited the program in September 2014. This became the push NASA needed to reorganize the program and form the PDCO to get those efforts back on track.
