The Stuxnet computer worm, discovered in 2010, set alarm bells ringing in industry and public sector offices all over the world. This highly advanced worm had the ability to infect and disable industrial process control systems, and the scary thing was that it had crept its way into many of the most common ones. If a state or a hacker can spread malicious software so widely, what can we expect next? The Internet of Things, virtually connecting everything to everyone, is rapidly proliferating into businesses, public sector offices and our homes. How can we defend ourselves against a threat when we don't know what it looks like, or where it will strike next? Researchers at SINTEF are working to find a way of counteracting such threats. They are developing methods that will enable companies and public sector agencies to manage threats and attacks, including those that no-one has yet thought of. "Society is under pressure from new threat and vulnerability patterns", says Tor Olav Grøtan, a Senior Research Scientist at SINTEF. "Standard approaches involving defence systems based on clear control procedures and responsibilities are inadequate when the risk is moving around between a diversity of areas and sectors. There is an urgent need for innovative thought and new approaches", he says. Grøtan is heading the project "New Strains of Society", which aims to develop new scientific theories in the field of hidden, dynamic and, what researchers call, "emergent" vulnerabilities. SINTEF's research partners are the Norwegian University of Science and Technology (NTNU), the Norwegian Defence Research Establishment (FFI), and the University of Tulsa in the USA. Professor Sujeet Shenoi of the University of Tulsa is closely involved. He lectures his students on "ethical hacking", with the aim of raising expertise in the US public sector to the same level as that possessed by malicious experts and hackers. 
For the last twenty years, Professor Shenoi has been instructing almost 400 Master's and Doctoral (PhD) students in how to hack into public and private sector networks. The students need security clearance and must undertake to work in the American public sector after they have qualified. With the consent of the owners, the students have penetrated deep into computer systems controlling payment terminals, smart electricity meters, gas pipelines, coal mines and wind farms. They have succeeded every time. "Someone or other, not necessarily us, has the ability to break into any computer system", says Shenoi. "We have to live with this and manage it, and that is why the concept of resilience (the dynamic ability to resist and adapt) is so important", he says. Professor Shenoi sees Norway as an ideal location for the development of such resilience. "Norway is one of the most digital countries in the world", he says. "With a relatively small population of 5.2 million, it can become a whole-world laboratory. This is not easy to achieve in the USA, which is too big and too diverse", he says. SINTEF and its partners are looking into three so-called 'threat landscapes': oil industry activity in the high north, a global pandemic, and ICT systems embedded in critical infrastructure in the oil and electrical power sectors. A workshop was held recently with the aim of addressing vulnerabilities in the energy sector. It was attended by representatives from the Norwegian Ministry of Justice and Public Security, the Norwegian National Security Authority (NSM), the Norwegian Communications Authority (NKOM), the US National Security Agency (NSA), the Norwegian Water Resources and Energy Directorate (NVE), the Norwegian Petroleum Safety Authority, research scientists, consultants and businesses. "We were there to test a new method of exposing unknown threats and vulnerabilities, and to prepare a stress test", says Grøtan. 
"People from the oil and electrical power sectors, who aren't normally on the same wavelength, had the chance to work and reflect on issues together. We will apply this experience as the project progresses, as part of our work to develop a stress test method designed to investigate how well an organisation is equipped to handle an unexpected situation", he says. And the need is urgent. In 2014, Statnett and hundreds of other Norwegian energy sector companies were subjected to a large-scale hacker attack. They are not alone. All sectors of society are under attack, and the number of attacks increases every year. For example, Statoil intercepts 10 million spam e-mails every month. Opening an e-mail attachment is a very common way of allowing malicious software to enter a company's computer systems. Another is when careless employees give system access to subcontractors and other external parties.
News Article | August 18, 2016
The United Nations climate change conference held last year in Paris had the aim of tackling future climate change. After the deadlocks and weak measures that arose at previous meetings, such as Copenhagen in 2009, the Paris summit was different. The resulting Paris Agreement committed nations to holding global warming well below 2℃ and to pursuing efforts to limit it to 1.5℃. The agreement was widely met with cautious optimism. Certainly, some of the media were pleased with the outcome, while acknowledging the deal’s limitations. Many climate scientists were pleased to see a more ambitious target being pursued, but what many people fail to realise is that actually staying within a 1.5℃ global warming limit is nigh on impossible. There seems to be a strong disconnect between what the public and climate scientists think is achievable. The problem is not helped by the media’s apparent reluctance to treat it as a true crisis. In 2015, we saw global average temperatures a little over 1℃ above pre-industrial levels, and 2016 will very likely be even hotter. In February and March of this year, temperatures were 1.38℃ above pre-industrial averages. Admittedly, these are individual months and years with a strong El Niño influence (which makes global temperatures more likely to be warmer), but the point is we’re already well on track to reach 1.5℃ pretty soon. So when will we actually reach 1.5℃ of global warming?

[Figure: Timeline showing best current estimates of when global average temperatures will rise beyond 1.5℃ and 2℃ above pre-industrial levels. Boxes represent 90% confidence intervals; whiskers show the full range. Image via Andrew King.]

On our current emissions trajectory we will likely reach 1.5℃ within the next couple of decades (2024 is our best estimate). The less ambitious 2℃ target would be surpassed not much later. This means we probably have only about a decade before we break through the ambitious 1.5℃ global warming target agreed to by the world’s nations in Paris. 
A University of Melbourne research group recently published these spiral graphs showing just how close we are getting to 1.5℃ warming. Realistically, we have very little time left to limit warming to 2℃, let alone 1.5℃. This is especially true when you bear in mind that even if we stopped all greenhouse gas emissions right now, we would likely experience about another half-degree of warming as the oceans “catch up” with the atmosphere. The public seriously underestimates the level of consensus among climate scientists that human activities have caused the majority of global warming in recent history. Similarly, there appears to be a lack of public awareness about just how urgent the problem is. Many people think we have plenty of time to act on climate change and that we can avoid the worst impacts by slowly and steadily reducing greenhouse gas emissions over the next few decades. This is simply not the case. Rapid and drastic cuts to emissions are needed as soon as possible. In conjunction, we must also urgently find ways to remove greenhouse gases already in the atmosphere. At present, this is not yet viable on a large scale. The 1.5℃ and 2℃ targets are designed to avoid the worst impacts of climate change. It’s certainly true that the more we warm the planet, the worse the impacts are likely to be. However, we are already experiencing dangerous consequences of climate change, with clear impacts on society and the environment. For example, a recent study found that many of the excess deaths reported during the summer 2003 heatwave in Europe could be attributed to human-induced climate change. Also, research has shown that the warm seas associated with the bleaching of the Great Barrier Reef in March 2016 would have been almost impossible without climate change. Climate change is already increasing the frequency of extreme weather events, from heatwaves in Australia to heavy rainfall in Britain. These events are just a taste of the effects of climate change. 
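The arithmetic behind the "committed warming" point above is simple enough to check directly. A minimal sketch, using only the figures quoted in the article (the calculation itself is illustrative, not the researchers' method):

```python
# Back-of-the-envelope check of the "committed warming" argument: even if
# emissions stopped today, lagged ocean warming would carry us close to the
# 1.5 C Paris target. Both input figures come from the article above.

observed_warming_2015 = 1.0   # C above pre-industrial levels in 2015
ocean_catch_up = 0.5          # additional warming already "locked in"
paris_target = 1.5            # ambitious Paris Agreement limit, in C

committed = observed_warming_2015 + ocean_catch_up
print(f"Committed warming: ~{committed:.1f} C (Paris target: {paris_target} C)")
# -> Committed warming: ~1.5 C (Paris target: 1.5 C)
```

In other words, the observed warming plus the ocean lag already exhausts the 1.5℃ budget, which is why the article calls the target "nigh on impossible".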
Worse is almost certainly set to come as we continue to warm the planet. It’s highly unlikely we will achieve the targets set out in the Paris Agreement, but that doesn’t mean governments should give up. It is vital that we do as much as we can to limit global warming. The more we do now, the less severe the impacts will be, regardless of targets. The simple take-home message is that immediate, drastic climate action will mean far fewer deaths and less environmental damage in the future. By Andrew King, Climate Extremes Research Fellow, University of Melbourne and Benjamin J. Henley, Research Fellow in Climate and Water Resources, University of Melbourne. This article has been cross-posted from The Conversation.
Lead contamination is the most troubling in a series of water problems that have plagued Flint since the summer of 2014. All of them were caused by corrosion in the lead and iron pipes that distribute water to city residents. When the city began using the Flint River as its water source in April 2014, it didn’t adequately control the water’s ability to corrode those pipes. This led to high lead levels, rust-colored tap water, and possibly the growth of pathogenic microbes. Flint isn’t the only city susceptible to these problems. The pipes in its old distribution system had seen the same water for decades. Switching water supplies in 2014 changed the chemistry of the water flowing through those pipes. When a switch like this happens, the water system is going to move toward a new equilibrium, says Daniel Giammar, an environmental engineer at Washington University in St. Louis. “It could be catastrophic as it was in Flint, or it could be a small change.” Before 2014, Flint was getting its water from the Detroit Water & Sewerage Department, which would draw water from Lake Huron and then treat it before sending it to Flint. Looking to lower the city’s water costs, Flint officials decided in 2013 to instead take water from the Karegnondi Water Authority, which was building its own pipeline from the lake. Shortly after that, Detroit told Flint it would terminate their original long-term water agreement within a year and offered to negotiate a new, short-term agreement. Flint declined the offer. As an interim solution, while waiting for the new pipeline to be finished, Flint began taking water from the Flint River and treating it at the city’s own plant. Problems with the city’s tap water started the summer after the switch. First, residents noticed foul-tasting, reddish water coming out of their taps. In August and September, the city issued alerts about Escherichia coli contamination and told people to boil the water before using it. 
A General Motors plant stopped using the water in October because it was corroding steel parts. In December, the Michigan Department of Environmental Quality notified Flint that its water was in violation of national drinking water standards because it contained high levels of trihalomethanes, toxic by-products of chlorine disinfection. That same month a team led by Mona Hanna-Attisha, a pediatrician at Hurley Children’s Hospital in Flint, released data showing that the number of Flint children with elevated levels of lead in their blood had increased since the water change. The percentage of affected kids went from 2.4% to 4.9%, according to a paper they published recently (Am. J. Public Health 2016, DOI: 10.2105/ajph.2015.303003). In areas with the highest lead concentrations in the water, about 10% of the children had elevated blood lead levels. Lead is neurotoxic and can disrupt children’s development, leading to behavioral problems and decreased intelligence. Most important, the treated Flint River water lacked one chemical that the treated Detroit water had: phosphate. “They essentially lost something that was protecting them against high lead concentrations,” Giammar says. Cities such as Detroit add orthophosphate to their water as part of their corrosion control plans because the compound encourages the formation of lead phosphates, which are largely insoluble and can add to the pipes’ passivation layer. By press time, C&EN was unable to get a comment from Flint city officials about why a corrosion inhibitor wasn’t added to the river water. The entire Flint water crisis could have been avoided if the city had just added orthophosphate, says Marc Edwards, an environmental engineer at Virginia Tech. He bases his opinion, in part, on experiments his group ran on the treated Flint River water. The researchers joined copper pipes with lead solder and then placed the pieces in either treated Flint River water or treated Detroit water. 
After five weeks in the Flint water, the joined pipes leached 16 times as much lead as those in the Detroit water, demonstrating just how corrosive the treated Flint water was. But when the scientists added a phosphate corrosion inhibitor to the Flint water, the factor went down to four. The pH drop over time seems to indicate that plant operators in Flint didn’t even have a target pH as part of a corrosion plan, Edwards says. Water utilities usually find a pH that’s optimal for preventing corrosion in their system. For example, in Boston, another city with old lead pipes, the average water pH held steady around 9.6 in 2015, according to reports from the Massachusetts Water Resources Authority. By press time, C&EN wasn’t able to get a comment from Flint city officials about whether they had a target pH for the water. Disinfection by-products such as trihalomethanes can form through reactions between organic matter in water and the chlorine disinfectant added at treatment plants. The Flint plant had increased the amount of chlorine it used in the summer of 2014 to combat the E. coli contamination problem. To reduce levels of the trihalomethanes that formed, the plant removed organic matter from the water by adding ferric chloride, which coagulates organic matter, making it easier to filter out. Even though the treatment took care of the trihalomethanes problem, it increased the water’s chloride levels, further raising its corrosivity. Susan J. Masten, an environmental engineer at Michigan State University, points out that the Flint water distribution system has another issue that could have worsened both the corrosion and disinfection problems. Much of the distribution system was built when the city’s population was about 200,000 and Flint was a major manufacturing center. But the city now has less than half that population, and much of the industry, which used a lot of Flint’s water, has left town. As a result, water usage has dropped significantly while the system’s capacity has remained the same, so water sits in oversized pipes for longer, giving corrosion and by-product-forming reactions more time to occur. 
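To put the pipe-rig numbers in perspective, a quick calculation using only the 16× and 4× factors reported above shows how much of the excess lead release the phosphate inhibitor eliminated (the percentage is derived here for illustration; it is not a figure from the study):

```python
# Relative lead leaching from solder-joined copper pipes, normalized to
# treated Detroit water (= 1x). Both factors are taken from the article.
flint_untreated = 16.0   # treated Flint River water, no corrosion inhibitor
flint_inhibited = 4.0    # same water after adding a phosphate inhibitor

reduction = 1 - flint_inhibited / flint_untreated
print(f"Phosphate inhibitor cut lead leaching by {reduction:.0%}")
# -> Phosphate inhibitor cut lead leaching by 75%
```

Even with the inhibitor, leaching remained four times the Detroit level, consistent with the article's point that the river water was inherently more corrosive.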
Now that Flint has switched back to the Detroit water, it may take months to a year for pipes to regain their passivation layers, for corrosion to slow to normal levels, and for lead concentrations to drop back into an acceptable range, say the environmental engineers that C&EN contacted. The lesson from Flint, they say, is to continually monitor water chemistry, especially when switching between water supplies.
News Article | April 6, 2016
In an effort to fight drought in California, residents and businesses were asked to cut water use by 25 percent. After nine months of water conservation efforts, the drought-stricken state narrowly missed its water-saving goal. State regulators said urban residents cut their water usage by 23.9 percent from June 2015 to February 2016, just over 1 percentage point short of Democratic Governor Jerry Brown's 25 percent goal, set in his April 1, 2015 executive order. The savings amount to about 1.19 million acre-feet of water, about 96 percent of the 1.24 million acre-feet goal set for February. Conservation efforts saved about 368 billion gallons of water, enough to supply almost 6 million Californians for one year. "Twenty-four percent savings shows enormous effort and a recognition that everyone's efforts matter," said State Water Resources Control Board chairwoman Felicia Marcus. "Californians rose to the occasion, reducing irrigation, fixing leaks, taking shorter showers, and saving our precious water resources in all sorts of ways." However, this is not reason to celebrate, as the drought is far from over. "The drought is not over," said Max Gomberg, the water board's climate and conservation manager. "Conservation habits are still important heading into this summer." Even though overall conservation missed the target by only a small margin, non-compliant water suppliers could still face penalties. In February, residents and businesses cut usage by only 12 percent, the lowest monthly reduction since the mandate took effect in June. Statistics showed that Southern California residents cut usage by only 6.9 percent, dragging down the state's savings. A couple of water providers missed their monthly targets, forcing regulators to penalize them. Last fall, four suppliers were fined $61,000 for non-compliance. 
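The reported savings figures can be cross-checked with a short unit-conversion sketch. The per-person daily rate below is derived here for illustration and does not appear in the article:

```python
# Cross-check the California water-savings figures reported above.
GALLONS_PER_ACRE_FOOT = 325_851   # standard US conversion factor

saved_acre_feet = 1.19e6   # water saved June 2015 - Feb 2016 (article)
goal_acre_feet = 1.24e6    # cumulative goal through February (article)
saved_gallons = 368e9      # the article's figure in gallons
people_supplied = 6e6      # "almost 6 million Californians for one year"

share_of_goal = saved_acre_feet / goal_acre_feet
print(f"Share of goal met: {share_of_goal:.0%}")  # -> 96%

per_person_per_day = saved_gallons / (people_supplied * 365)
print(f"Implied use: {per_person_per_day:.0f} gallons/person/day")  # -> 168
```

As a side note, 1.19 million acre-feet converts to roughly 388 billion gallons, slightly above the 368 billion quoted, suggesting the two figures were rounded or drawn from slightly different reporting periods.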
Coachella Valley Water District and Indio Water Authority promised to create environmental projects to reduce water usage. With the success of the cutback, officials are likely to propose regional water conservation efforts, which may relax drought orders in certain areas, such as parts of Northern California where some reservoirs are already spilling. Even if water cutback regulations are relaxed, Marcus encourages Californians to continue conserving water, because California has yet to recover from the exceptional drought that has spanned four long years. State regulators will hold a public workshop later this month to discuss next steps for water conservation. Any new regulations would take effect in June. A study published in the journal Nature Climate Change states that the four-year drought is the driest period California has experienced in 500 years. Researchers also found that the Sierra snowpack, which the state relies on, could be at its lowest level in 3,000 years. © 2016 Tech Times, All rights reserved. Do not reproduce without permission.
Gilmar Lima, president of the Latin American Composite Materials Association (ALMACO), and Antonio Bonetti, Secretary of the Environment and Water Resources of the state of Paraná, Brazil, have agreed to create a project for the environmentally friendly re-use of composite parts. Created in late 2014 by ALMACO, the program covers the state capital Curitiba and 29 other cities, and initially comprises bus components such as ceilings, railings and bumpers. During the first year, the organization aims to re-use five tons of composites for co-processing in cement kilns, an alternative recognized as environmentally friendly. ‘If [manufacturers] do not join the program, they will be subject to expensive fines, similar to what happened in the segments of tires and oil filters,’ reported Paulo Camatta, executive manager of ALMACO. The plan drawn up by ALMACO has the support of the consultancy Masimon and twelve companies from the composites production chain: Ashland, CPIC, Jushi, Marcopolo, Mascarello, Morquímica, MVC, Neobus, Owens Corning, Reichhold, Royal Polymers and Tecnofibras. The project is also supported by the National Association of Bus Manufacturers (FABUS) and the Interstate Industry Association of Rail and Highway Material and Equipment (SIMEFRE). This story uses material from ALMACO, with editorial changes made by Materials Today. The views expressed in this article do not necessarily represent those of Elsevier.