News Article | October 26, 2016
The first sign that North Korea had carried out its fifth nuclear test last month came when international scientists detected what registered as a massive earthquake. As North Korea marks 10 years since its first test, geophysicist and disaster researcher Mika McKinnon explains how scientists have learned to identify these world-shaking events.

Every stage of North Korea's nuclear weapons development programme is under global observation, with scientists using data to work out the true state of its progress. A worldwide network of sensors is constantly collecting data, seeking the faintest rumble of a weapons test. When one takes place, the explosion slams into its surroundings. The vibrations propagate through water, air, and earth, at frequencies too low to be heard by human ears but picked up by sensors and relayed to control centres for analysis.

On 9 October 2006, seismometers around the world lit up, detecting a significant explosion in North Korea. The infrasound network stayed silent; the lack of an atmospheric blast indicated the test had happened underground, away from prying satellites. Further silence on underwater hydroacoustic stations confirmed it had been muffled by rock, not water. Scientists triangulated the direction and arrival times of the seismic waves to pinpoint the explosion's origin under mountains in North Korea.

In the following days, monitoring stations sniffed the air for the faintest trace of radioactive isotopes. If any isotopes had escaped from the test tunnels to drift on the wind, they would be unmistakable relics of radiation from a nuclear blast and would rule out any chance of a conventional weapon. Two weeks later, a radionuclide station in Yellowknife, Canada, downwind of the North Korean test site, picked up elevated levels of xenon-133, a radioactive product of nuclear fission.
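The triangulation step described above can be sketched numerically. The code below is a toy illustration, not the CTBTO's actual localisation method: it assumes a uniform seismic wave speed and invented station coordinates, then grid-searches for the epicentre and origin time that best fit the arrival times.

```python
import math

V = 6.0  # assumed uniform P-wave speed in km/s (a simplification)

# Hypothetical station positions (km) and a known "true" source for the demo
stations = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0), (100.0, 100.0)]
true_source = (40.0, 70.0)
t0 = 5.0  # origin time, s

# Synthetic arrival times: origin time plus travel time to each station
arrivals = [t0 + math.dist(s, true_source) / V for s in stations]

def misfit(x, y, t):
    """Sum of squared residuals between predicted and observed arrivals."""
    return sum((t + math.dist((x, y), s) / V - obs) ** 2
               for s, obs in zip(stations, arrivals))

# Brute-force grid search over candidate epicentres and origin times
best = min(
    ((x, y, t) for x in range(101) for y in range(101) for t in range(11)),
    key=lambda p: misfit(*p),
)
print(best)  # recovers the true source: (40, 70, 5)
```

Real pipelines solve the same least-squares problem with travel-time tables for layered Earth models rather than a uniform speed, but the principle is identical: find the origin that makes the predicted arrivals match the observed ones.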
The data was processed at the International Data Centre in Vienna, headquarters of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), confirming a nuclear weapons test. This disturbing pattern, an earthquake-like signal followed by North Korean confirmation of a nuclear test, was repeated in May 2009, February 2013, January 2016, and most recently on 9 September this year. Each time, scientists knew the test had happened before North Korea formally confirmed it. And each burst of energy from North Korea has been larger than the one before, with the Norwegian monitoring organisation NORSAR calculating that the most recent test indicated an explosive yield of approximately 20 kilotons of TNT.

The backbone of this monitoring programme is a network of seismometers: extremely sensitive microphones attuned to the Earth's quivers. Everything from a catastrophic earthquake to a passing truck has a distinct vibration. Earthquakes start with a quick, sharp pressure wave followed by the more destructive shear and surface waves that shake buildings from their foundations. Landslides are a drawn-out jumble of debris rolling down slopes, and volcanic eruptions can be spotted by the sudden blast of breaking rock propelled by lava and gas. Seismometers near beaches pick up a regular surge of seismic signals, a gentle, steady beat generated by waves crashing on sand every moment of every day. Explosions, too, have their own unique signature: a blast of high-intensity, chaotic seismic energy confined to a brief duration.

Scientists first realised this during the Cold War, when the United States funded research into seismology to remotely track the progress of the Soviet Union's weapons development programme. Before long, scientists worldwide were collaborating to create standards for measuring seismic energy and to share their observations, advancing our understanding of earthquakes alongside their more sombre monitoring task.
This information-sharing of the 1960s was the kernel that evolved into today's global network of science stations verifying that nations comply with the Comprehensive Nuclear-Test-Ban Treaty. Every nation has its own seismic network, but the CTBTO runs its own strategically distributed stations around the world for independent, consistent monitoring. Every day, seismic stations pick up blasts from mining operations, seismic surveys, or military exercises. These explosions are identified by their small size and location, and are ruled out as clandestine weapons tests unless the radionuclide stations find unexpected gases.

Many more signals are mundane in nature - the daily noises of people crawling all over this busy planet. Busy highways with roaring trucks and clattering trains on regular schedules are common sources of seismic noise. Even the heavy footfalls of hikers exploring the paths used to service seismic stations are picked up, recorded as small, regular hiccups in the seismograph. Once, a team of scientists monitoring a seismic feed were concerned by a sudden localised signal at a station on Vancouver Island in Canada. Fearful that someone was attempting to vandalise their equipment, the scientists checked on the seismometer, only to discover a pair of teenagers using the protective vault encasing it to get better acquainted. The discovery shed light on other signals detected by seismometers in scenic, secluded, yet accessible settings, the rhythmic peaks of waxing and waning energy finally tied to a specific activity. Once identified, the carefully serious scientists were able to recognise it at other locations, spotting unexpected lovers' lanes twisting through remote seismic stations.

Not every seismic detection tells a horrifying story, but just rarely one picks up something special, something big, intense, and scary: a nuclear test.

Mika McKinnon is a US-based geophysicist and disaster researcher. Follow her on Twitter: @mikamckinnon
News Article | February 2, 2016
Machine learning software is helping the Comprehensive Nuclear-Test-Ban Treaty Organization monitor the globe for evidence of nuclear tests.

When North Korea conducted its recent nuclear weapon test, the blast was detected by a global seismic sensing network operated by the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). The network, called the International Monitoring System, aims to “make sure that no nuclear explosion goes undetected.” Software designed in part by a Brown University computer scientist is helping to do just that.

The most recent North Korean test wasn’t terribly difficult to detect. It was a fairly large blast, it occurred in a place where a test wasn’t surprising, and the North Korean government made no effort to hide it. But clandestine tests of smaller devices, perhaps by terrorist organizations or other non-state actors, are a different story. It’s those difficult-to-detect events that VISA - a machine learning system that Brown’s Erik Sudderth helped to design - aims to find.

The International Monitoring System includes 149 certified seismic monitoring stations around the globe. Those stations send data to the CTBTO’s Vienna headquarters, where analysts compile all seismic events into a daily bulletin supplied to nations around the world. The vast majority of events detected by the system are natural - earthquakes and seismic tremors of various sorts. But occasionally, as recently in North Korea, an event is triggered by a large explosion. Analysts can easily pick out unnatural events from the characteristics of the seismic waveforms they create, but before they can determine whether an event is unnatural, they need to know that an event has occurred.

“You have hundreds of stations all over the world producing high-dimensional data that’s streaming in 24/7,” said Sudderth, assistant professor of computer science. “[People] can’t look at all the data all the time.
They need the help of automated tools.” Those automated tools keep a constant eye on every station and log potential local detections. They also combine data from multiple stations to hypothesize the time, location, and magnitude of plausible seismic events. Analysts then review those data to determine whether each detection came from a real seismic event or merely random noise. Once an event is confirmed to be real, analysts review it to determine whether it was natural or human-made.

The Vertically Integrated Seismic Analysis (VISA) project began in 2007, when the CTBTO was looking to upgrade its software system. The older software was making lots of mistakes, Sudderth said. It was wasting analysts’ time with false positives. But most critically, it was missing many smaller events and making errors in triangulating the exact position of events. “It was the job of the analysts to clean up the results of the automated system to some acceptable level of accuracy,” Sudderth said.

For him and his colleagues, the raw automated data, combined with data that had been cleaned up by experts, was a goldmine. “This is what we in machine learning think of as training data,” he said. “We can look at an event that we have reasonable confidence was real and look at its relationship to what the station actually measured. What do the errors look like? What are the biases? We use this historical data to get calibrated models of these things.” Out of that came a system that models the multiple layers of uncertainty in the processes that generate seismic events and in those that detect them. Seismic waves travel in different ways through different types of rock. That introduces uncertainty because there is no high-resolution map of rock types across the entire surface of the Earth.
On the detection side, each sensor works a little differently, and all of them are subject to many types of random noise - from wave activity in marine sensors to vehicle traffic on land. Combining these statistical analyses leads to a comprehensive generative model of the rates at which seismic events of various sizes occur in various locations and the ways that energy from these events propagates to seismic sensors. Sudderth and his team devised an efficient inference algorithm that can scan incoming data to find events that likely represent an actual seismic signal.

VISA has been up and running at the CTBTO’s headquarters in Vienna for the last four years or so. It currently runs as a kind of backstop for the organization’s original automated system, and the CTBTO has asked its member states to approve it officially as a replacement. “As it is now, the analysts have kind of a VISA button [on their computers],” Sudderth said. “They can press the VISA button and it says, ‘Here’s a bunch of stuff we think you missed.’” The analysts can then decide whether or not those events should be included in the daily seismic bulletin.

Sudderth and his colleagues have shown that VISA can reduce the number of missed events by 60 percent compared to the original system. It can also provide more accurate location information in many cases. For example, VISA did a better job than the older system in pinpointing the location of a prior North Korean nuclear test in 2013, Sudderth said. He and his colleagues at the University of California recently published a paper on their work in the Bulletin of the Seismological Society of America. The International Society for Bayesian Analysis awarded the paper the Mitchell Prize, which recognizes an outstanding paper that uses Bayesian analysis to solve an important applied problem.
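The automated detection step that VISA backstops can be illustrated with the classical short-term-average/long-term-average (STA/LTA) trigger, a standard baseline in seismic processing. This is not VISA's Bayesian algorithm, just a minimal sketch of how a burst of energy gets flagged in a continuous trace; the synthetic trace and threshold below are invented for the demo.

```python
def sta_lta(trace, n_sta, n_lta):
    """Return the STA/LTA ratio at each sample where both windows fit.

    A high ratio means the recent (short-window) energy far exceeds the
    background (long-window) energy - the signature of a sudden arrival.
    """
    ratios = []
    for i in range(n_lta, len(trace)):
        sta = sum(abs(v) for v in trace[i - n_sta:i]) / n_sta
        lta = sum(abs(v) for v in trace[i - n_lta:i]) / n_lta
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

# Synthetic trace: low background noise with a burst in the middle
trace = [0.1] * 100 + [2.0] * 10 + [0.1] * 100
ratios = sta_lta(trace, n_sta=5, n_lta=50)

# Typical trigger thresholds are around 3-5
triggered = max(ratios) > 4.0
print(triggered)  # True
```

Production systems (e.g. ObsPy's recursive STA/LTA) use filtered, calibrated data and per-station tuning, but the shape of the computation is the same.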
Sudderth was able to confirm that VISA did detect the recent North Korean test, but that blast was large enough that most traditional systems would have caught it as well. “If all you cared about was finding events in North Korea, there are simpler, more targeted things you could do,” Sudderth said. “Where this system would potentially be a lot more powerful would be catching someone trying to do a clandestine test of a smaller device somewhere that you don’t know is a test site.”

Sudderth hopes VISA’s ability to detect more events might eventually aid in gaining full ratification of the Comprehensive Nuclear-Test-Ban Treaty. Before the treaty can enter into force, all 44 designated nuclear-technology states must ratify it. Eight of those nations, including the United States, have yet to do so. President Clinton signed the treaty in 1996, but the Senate refused to ratify it. “One of the things that was raised as a concern about ratifying it was the difficulty in verification,” Sudderth said. “If the technical systems for validation aren’t good enough, then countries aren’t going to be willing to ratify. So, that’s one thing this work is trying to address and remove as an obstacle.”
Van Der Schaar M.,Polytechnic University of Catalonia |
Ainslie M.A.,TNO |
Robinson S.P.,National Physical Laboratory United Kingdom |
Prior M.K.,CTBTO |
Andre M.,Polytechnic University of Catalonia
Journal of Marine Systems | Year: 2014
The growing scientific and societal concerns about the effects of underwater sound on marine ecosystems have been recently recognised through the introduction of several international initiatives, like the International Quiet Ocean Experiment, aimed at measuring the environmental impact of ocean noise on large spatial and temporal scales. From a regulatory perspective, the European Marine Strategy Framework Directive includes noise (and other forms of energy) as one of eleven descriptors of good environmental status of Europe's seas. The directive requires member states to monitor trends in annually averaged sound. The Laboratory of Applied Bioacoustics has developed a software package that measures sound levels and monitors acoustic sources in real-time; this software was used for the LIDO project (www.listentothedeep.com), which originated from the European Seafloor Observatory Network of Excellence (ESONET-NoE; www.esonet-noe.org). The system is currently operating worldwide from several wired and radio-linked observatories. The CTBTO (Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization) has made available years of data from hydroacoustic stations to look for ambient sound trends and to detect cetacean presence. Here, we present the analysis of four CTBTO platforms (located in the Pacific, Atlantic and Indian oceans), covering 42 months of data, intended to detect annual and monthly changes or trends in the ambient sound levels. © 2013 Elsevier B.V.
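As a rough illustration of the averaged sound-level metric this kind of monitoring reports, the sketch below computes an RMS sound pressure level in dB re 1 µPa (the standard reference pressure for underwater acoustics) from raw pressure samples. The samples are invented; real monitoring uses calibrated hydrophone data with band filtering and long-term averaging.

```python
import math

P_REF = 1e-6  # reference pressure for underwater sound: 1 micropascal, in Pa

def spl_db(pressure_samples):
    """RMS sound pressure level in dB re 1 uPa."""
    rms = math.sqrt(sum(p * p for p in pressure_samples) / len(pressure_samples))
    return 20.0 * math.log10(rms / P_REF)

# A square-wave-like signal with 1 Pa RMS pressure is 120 dB re 1 uPa
samples = [1.0, -1.0] * 500
print(round(spl_db(samples)))  # 120
```

The factor of 20 (rather than 10) appears because sound level is defined on pressure amplitude, while decibels are defined on power, which scales as pressure squared.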
News Article | February 20, 2017
Nuclear scientists are struggling to determine the source of small amounts of nuclear radiation that bloomed over Europe throughout January. France's IRSN institute, the public body for radiological and nuclear risks, announced in a statement on February 13 that iodine-131, a radionuclide of human origin, had been detected in trace amounts in the ground-level atmosphere of continental Europe. First detected in the second week of January over northern Norway, iodine-131 was then detected over Finland, Poland, Germany, the Czech Republic, France, and Spain. However, the levels have since returned to normal, and scientists have yet to determine the source of the radiation.

Norway's Radiation Protection Authority (NRPA), which first detected the iodine-131 near its northern border with Russia, told Motherboard over the phone today that the levels present essentially no risk to human health. "I can assure you that the levels are low," said a press spokesperson. But with a half-life of just eight days, the detection of iodine-131 is proof of a recent release, said IRSN in its statement to the media.

Rumors are circulating, of course, that Russia has secretly tested a low-yield nuclear weapon in the Arctic, possibly in the Novaya Zemlya region - historically used for Russia's nuclear tests. Iodine-131, discovered by two University of California researchers in 1938, is a radioisotope synonymous with the atomic bomb tests carried out by the US and the Soviet Union throughout the 1950s, and was more recently released during the Chernobyl nuclear power plant disaster and the 2011 Fukushima nuclear accident. But iodine-131 is also produced for the medical industry, where it is commonly used to treat thyroid-related illnesses and cancers.
Astrid Liland, head of the section for emergency preparedness at the NRPA, told Motherboard in an email today, "Since only iodine-131 was measured, and no other radioactive substances, we think it originates from a pharmaceutical company producing radioactive drugs. Iodine-131 is used for treatment of cancer." Britain's Society for Radiological Protection (SRP) also told Motherboard that the exclusive presence of iodine-131 suggests the source is not a nuclear incident but rather a medical facility, such as a hospital or a supplier of radiopharmaceuticals. "The release was probably of recent origin. Further than this it is impossible to speculate," the SRP's Brian Gornall told Motherboard in an email.

Still, where exactly that pharmaceutical company could be located is unknown. "Due to rapidly changing winds, it is not possible to track exactly where it came from. It points to a release source somewhere in Eastern Europe," Liland told Motherboard.

The iodine cloud prompted the United States Air Force to send over a specialized particle-sniffing aircraft to investigate. According to reports on The Aviationist, a US Air Force WC-135 deployed to the Royal Air Force base at Mildenhall in the UK on February 17, equipped to test the atmosphere over Europe for radiation. The aircraft's last intercontinental expedition was to analyse the atmosphere over the Korean Peninsula following an alleged North Korean nuclear test. The deployment spurred rumors of a nuclear test by Russia, but a spokesperson for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), an international body that monitors nuclear weapon tests, told Motherboard in an email today, "Although some readings of I-131 above minimal detection level have been observed since the beginning of the year in Europe, nothing extraordinary has been measured."
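The "recent release" inference rests on simple decay arithmetic: with an eight-day half-life, the activity halves every eight days, so after a few months essentially nothing is left. A minimal sketch using the standard 8.02-day half-life of iodine-131 (the day counts below are illustrative, not measured values):

```python
HALF_LIFE_DAYS = 8.02  # iodine-131

def fraction_remaining(days):
    """Fraction of the original activity left after `days` of decay."""
    return 0.5 ** (days / HALF_LIFE_DAYS)

for days in (8, 30, 80):
    print(days, fraction_remaining(days))
# After ~80 days (roughly ten half-lives), under 0.1% of the original
# activity remains - which is why any detectable iodine-131 must have
# been released recently.
```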
The IRSN said in its statement that the data has now been shared between the members of the informal European network called Ring of Five, a group of organizations that research radiation levels in the atmosphere.
Matoza R.S.,CEA DAM Ile-de-France |
Matoza R.S.,University of California at San Diego |
Landes M.,CEA DAM Ile-de-France |
Le Pichon A.,CEA DAM Ile-de-France |
And 2 more authors.
Geophysical Research Letters | Year: 2013
The ability of the International Monitoring System (IMS) infrasound network to detect atmospheric nuclear explosions and other signals of interest is strongly dependent on station-specific ambient noise. This ambient noise includes both incoherent wind noise and real coherent infrasonic waves. Previous ambient infrasound noise models have not distinguished between incoherent and coherent components. We present a first attempt at statistically and systematically characterizing coherent infrasound recorded by the IMS. We perform broadband (0.01-5 Hz) array processing with the IMS continuous waveform archive (39 stations from 1 April 2005 to 31 December 2010) using an implementation of the Progressive Multi-Channel Correlation algorithm in log-frequency space. From these results, we estimate multi-year 5th, 50th, and 95th percentiles of the RMS pressure of coherent signals in 15 frequency bands for each station. We compare the resulting coherent infrasound models with raw power spectral density noise models, which inherently include both incoherent and coherent components. Our results indicate that IMS arrays consistently record coherent ambient infrasound across the broad frequency range from 0.01 to 5 Hz when wind noise levels permit. The multi-year averaging emphasizes continuous signals such as oceanic microbaroms, as well as persistent transient signals such as repetitive volcanic, surf, thunder, or anthropogenic activity. Systematic characterization of coherent infrasound detection is important for quantifying a station's recording environment, signal-to-noise ratio as a function of frequency and direction, and overall performance, which all influence the detection probability of specific signals of interest. © 2013. American Geophysical Union. All Rights Reserved.
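The percentile statistics described in the abstract can be sketched in miniature: compute RMS pressure in fixed windows of a trace, then take the 5th, 50th, and 95th percentiles of those values. This toy version uses a synthetic noise trace and omits the band filtering and array processing the authors actually perform.

```python
import random
import statistics

random.seed(1)
# Synthetic ambient pressure trace (Pa) standing in for real infrasound data
trace = [random.gauss(0.0, 0.05) for _ in range(10_000)]

win = 200  # samples per RMS window
rms_values = [
    (sum(v * v for v in trace[i:i + win]) / win) ** 0.5
    for i in range(0, len(trace) - win + 1, win)
]

# statistics.quantiles with n=100 returns the 99 percentile cut points
cuts = statistics.quantiles(rms_values, n=100)
p5, p50, p95 = cuts[4], cuts[49], cuts[94]
print(p5, p50, p95)
```

The spread between the 5th and 95th percentiles is what characterises how variable a station's coherent ambient field is; in the paper this is done per frequency band and per station over multiple years.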
Fee D.,University of Alaska Fairbanks |
Waxler R.,University of Mississippi |
Assink J.,University of Mississippi |
Gitterman Y.,Geophysical Institute of Israel |
And 8 more authors.
Journal of Geophysical Research: Atmospheres | Year: 2013
Three large-scale infrasound calibration experiments were conducted in 2009 and 2011 to test the International Monitoring System (IMS) infrasound network and provide ground truth data for infrasound propagation studies. Here we provide an overview of the deployment, detonation, atmospheric specifications, infrasound array observations, and propagation modeling for the experiments. The experiments at the Sayarim Military Range, Israel, had equivalent TNT yields of 96.0, 7.4, and 76.8 t of explosives on 26 August 2009, 24 January 2011, and 26 January 2011, respectively. Successful international collaboration resulted in the deployment of numerous portable infrasound arrays in the region to supplement the IMS network and increase station density. Infrasound from the detonations is detected out to ~3500 km to the northwest in 2009 and ~6300 km to the northeast in 2011, reflecting the highly anisotropic nature of long-range infrasound propagation. For 2009, the moderately strong stratospheric wind jet results in a well-predicted set of arrivals at numerous arrays to the west-northwest. A second set of arrivals is also apparent, with low celerities and high frequencies. These arrivals are not predicted by the propagation modeling and result from unresolved atmospheric features. Strong eastward tropospheric winds (up to ~70 m/s) in 2011 produce high-amplitude tropospheric arrivals recorded out to >1000 km to the east. Significant eastward stratospheric winds (up to ~80 m/s) in 2011 generate numerous stratospheric arrivals and permit the long-range detection (i.e., >1000 km). No detections are made in directions opposite the tropospheric and stratospheric wind jets for any of the explosions. Comparison of predicted transmission loss and observed infrasound arrivals gives qualitative agreement. 
Propagation modeling for the 2011 experiments predicts lower transmission loss in the direction of the downwind propagation compared to the 2009 experiment, consistent with the greater detection distance. Observations also suggest a more northerly component to the stratospheric winds for the 2009 experiment and less upper atmosphere attenuation. The Sayarim infrasound calibration experiments clearly demonstrate the complexity and variability of the atmosphere, and underscore the utility of large-scale calibration experiments with dense networks for better understanding infrasound propagation and detection. Additionally, they provide a rich data set for future scientific research. Key Points: three large ground-truth infrasound experiments were conducted in 2009 and 2011; strong wind jets permitted long-range detection; atmospheric specifications were sufficient for qualitative propagation modeling. © 2013. American Geophysical Union. All Rights Reserved.
Caudron C.,Nanyang Technological University |
Taisne B.,Nanyang Technological University |
Garces M.,University of Hawaii at Manoa |
Alexis L.P.,CEA DAM Ile-de-France |
Geophysical Research Letters | Year: 2015
The February 2014 eruption of Kelud volcano (Indonesia) destroyed most of the instruments near it. We use remote seismic and infrasound sensors to reconstruct the eruptive sequence. The first explosions were relatively weak seismic and infrasound events. A major stratospheric ash injection occurred a few minutes later and produced long-lasting atmospheric and ground-coupled acoustic waves that were detected as far as 11,000 km away by infrasound sensors and up to 2300 km away on seismometers. A seismic event followed ∼12 minutes later and was recorded 7000 km away by seismometers. We estimate a volcanic intensity of around 10.9, placing the 2014 Kelud eruption between the intensities of the 1980 Mount St. Helens and 1991 Pinatubo eruptions. We demonstrate how remote infrasound and seismic sensors are critical for the early detection of volcanic explosions, and how they can help to constrain and understand eruptive sequences. © 2015. The Authors.
Koohkan M.R.,ParisTech National School of Bridges and Roads |
Koohkan M.R.,French Institute for Research in Computer Science and Automation |
Bocquet M.,ParisTech National School of Bridges and Roads |
Bocquet M.,French Institute for Research in Computer Science and Automation |
And 3 more authors.
Atmospheric Environment | Year: 2012
The International Monitoring System (IMS) radionuclide network enforces the Comprehensive Nuclear-Test-Ban Treaty which bans nuclear explosions. We have evaluated the potential of the IMS radionuclide network for inverse modelling of the source, whereas it is usually assessed by its detection capability. To do so, we have chosen the degrees of freedom for the signal (DFS), a well established criterion in remote sensing, in order to assess the performance of an inverse modelling system. Using a recent multiscale data assimilation technique, we have computed optimal adaptive grids of the source parameter space by maximising the DFS. This optimisation takes into account the monitoring network, the meteorology over one year (2009) and the relationship between the source parameters and the observations derived from the FLEXPART Lagrangian transport model. Areas of the domain where the grid-cells of the optimal adaptive grid are large emphasise zones where the retrieval is more uncertain, whereas areas where the grid-cells are smaller and denser stress regions where more source variables can be resolved. The observability of the globe through inverse modelling is studied in strong, realistic and small model error cases. The strong error and realistic error cases yield heterogeneous adaptive grids, indicating that information does not propagate far from the monitoring stations, whereas in the small error case, the grid is much more homogeneous. In all cases, several specific continental regions remain poorly observed, such as Africa as well as the tropics, because of the trade winds. The northern hemisphere is better observed through inverse modelling (more than 60% of the total DFS), mostly because it contains more IMS stations. This imbalance leads to a better performance of inverse modelling in the northern hemisphere winter. The methodology is also applied to the subnetwork composed of the stations of the IMS network which measure noble gases. © 2012 Elsevier Ltd.
Applied Radiation and Isotopes | Year: 2010
A worldwide radionuclide network of 80 stations, part of the International Monitoring System, is being set up to monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The radioactivity sampled at these stations is primarily 220Rn progeny. Using knowledge of the diurnal change of the 220Rn progeny 212Pb, the sampled activity at the end of the sampling process can be minimised by choosing the right collection start time. It is shown that improvements of several percent in the minimum detectable concentration (MDC) for CTBT-relevant nuclides can be achieved. © 2009 Elsevier Ltd. All rights reserved.
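The collection-timing idea can be sketched with a toy model. The sinusoidal diurnal concentration below is invented for illustration (real diurnal 220Rn patterns are site-dependent), but the 212Pb half-life is standard: because 212Pb collected early in the sampling window has time to decay, the residual activity at the end of sampling is dominated by recently collected material, so starting the collection such that concentrations are low near the end of the window minimises the interfering background.

```python
import math

HALF_LIFE_H = 10.64  # 212Pb half-life, hours
LAMBDA = math.log(2) / HALF_LIFE_H

def conc(t):
    """Assumed diurnal 212Pb concentration (arbitrary units), peaking ~6 am."""
    return 1.0 + 0.5 * math.cos(2 * math.pi * (t - 6) / 24)

def end_activity(start_hour, duration=24):
    """212Pb activity on the filter at the end of sampling.

    Hourly collection increments are weighted by the decay remaining
    between collection time and the end of the sampling window.
    """
    total = 0.0
    for h in range(duration):
        t = (start_hour + h + 0.5) % 24          # mid-hour collection time
        remaining = duration - (h + 0.5)          # hours left to decay
        total += conc(t) * math.exp(-LAMBDA * remaining)
    return total

best_start = min(range(24), key=end_activity)
print(best_start)
```

Scanning all 24 candidate start hours and picking the minimum mirrors the optimisation described in the abstract, albeit with a made-up concentration curve.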
Journal of Environmental Radioactivity | Year: 2010
A worldwide radionuclide network of 80 stations, part of the International Monitoring System, is being set up to monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The radioactivity sampled at these stations is primarily 220Rn progeny, which affect the detection capability. A model linking the 220Rn emanation with the sampled 212Pb activity was developed and is presented here. The model and the performed measurements show that the variation of the sampled 212Pb activity can be fully explained by the variation of the local 220Rn activity concentration. © 2009 Elsevier Ltd. All rights reserved.