Deschutes Signal Processing LLC, Maupin, OR, United States

Gibbons S.J., NORSAR | Kvaerna T., NORSAR | Harris D.B., Deschutes Signal Processing LLC | Dodge D.A., Lawrence Livermore National Laboratory
Seismological Research Letters | Year: 2016

Aftershock sequences following very large earthquakes present enormous challenges to near-real-time generation of seismic bulletins. The increase in analyst resources needed to relocate an inflated number of events is compounded by failures of phase-association algorithms and a significant deterioration in the quality of underlying, fully automatic event bulletins. Current processing pipelines were designed a generation ago, and, due to computational limitations of the time, are usually limited to single passes over the raw data. With current processing capability, multiple passes over the data are feasible. Processing the raw data at each station currently generates parametric data streams that are then scanned by a phase-association algorithm to form event hypotheses. We consider the scenario in which a large earthquake has occurred and propose to define a region of likely aftershock activity in which events are detected and accurately located, using a separate specially targeted semiautomatic process. This effort may focus on so-called pattern detectors, but here we demonstrate a more general grid-search algorithm that may cover wider source regions without requiring waveform similarity. Given many well-located aftershocks within our source region, we may remove all associated phases from the original detection lists prior to a new iteration of the phase-association algorithm. We provide a proof-of-concept example for the 2015 Gorkha sequence, Nepal, recorded on seismic arrays of the International Monitoring System. Even with very conservative conditions for defining event hypotheses within the aftershock source region, we can automatically remove about half of the original detections that could have been generated by Nepal earthquakes and reduce the likelihood of false associations and spurious event hypotheses. Further reductions in the number of detections in the parametric data streams are likely, using correlation and subspace detectors and/or empirical matched field processing.
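
The phase-removal step described above reduces to simple bookkeeping on the parametric detection lists. The Python sketch below is not the authors' code: the `Detection` record, the `tolerance` window, and the list of predicted (station, arrival time) pairs from located aftershocks are illustrative assumptions about what such a pruning step could look like before the lists are handed back to the phase-association algorithm.

```python
# Minimal sketch (assumed data layout, not the authors' implementation):
# remove detections whose times match predicted arrivals from well-located
# aftershocks inside the designated source region.
from dataclasses import dataclass
from typing import Iterable, List, Tuple

@dataclass
class Detection:
    station: str   # array or station code (e.g. an IMS array)
    time: float    # detection time, epoch seconds

def prune_detections(detections: Iterable[Detection],
                     aftershock_arrivals: List[Tuple[str, float]],
                     tolerance: float = 4.0) -> List[Detection]:
    """Keep only detections lying more than `tolerance` seconds from every
    predicted (station, arrival_time) pair of the located aftershocks."""
    kept = []
    for det in detections:
        matched = any(det.station == sta and abs(det.time - t_pred) <= tolerance
                      for sta, t_pred in aftershock_arrivals)
        if not matched:
            kept.append(det)
    return kept
```

The remaining detections would then be passed to a fresh iteration of the phase-association algorithm, as described in the abstract.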


Harris D.B., Deschutes Signal Processing LLC | Gibbons S.J., NORSAR | Rodgers A.J., Lawrence Livermore National Laboratory | Pasyanos M.E., Lawrence Livermore National Laboratory
IEEE Signal Processing Magazine | Year: 2012

One branch of signal processing in geophysics has undergone significant long-term development due to the requirements of nuclear test ban monitoring. There was a burst of activity in the 1960s and 1970s due to the Limited Test Ban Treaty (1963), which banned tests in the atmosphere, underwater, and in outer space, and the Threshold Test Ban Treaty (1974), which placed a cap on explosive yield at 150 kt. These treaties drove testing underground and created requirements for detecting, locating, and identifying explosions, and estimating their yield through observations of seismic waves. © 1991-2012 IEEE.


Wang J., Lawrence Livermore National Laboratory | Templeton D.C., Lawrence Livermore National Laboratory | Harris D.B., Deschutes Signal Processing LLC
Transactions - Geothermal Resources Council | Year: 2011

We aim to detect and locate more microearthquakes using the empirical matched field processing (MFP) method than can be detected using only conventional earthquake detection techniques. We propose that empirical MFP can complement existing catalogs and techniques. In the Southern California Earthquake Data Center (SCEDC) earthquake catalog, 2972 events were identified in our study area between January 2008 and December 2010. We use this earthquake catalog to identify the best potential empirical master templates. We create 242 master templates, each with good-quality recordings at four or more stations. We test our method on continuous seismic data collected at the Salton Sea Geothermal Field during January 2010. The MFP method successfully identified 1115 events. Therefore, we believe that the empirical MFP method, combined with conventional methods, significantly improves network detection capabilities.
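
As a rough illustration of the empirical MFP idea, the NumPy sketch below (hypothetical code, not taken from the paper) builds per-frequency, channel-normalized template vectors from a master-event window and scores a continuous-data window by its average squared coherence with those vectors; the window lengths, frequency-bin selection, and normalization choices are assumptions.

```python
import numpy as np

def empirical_templates(master_event: np.ndarray, freq_idx: np.ndarray) -> np.ndarray:
    """Per-frequency template vectors from a master-event window
    (shape n_channels x n_samples): Fourier coefficients across the array
    channels, normalized to unit length at each selected frequency bin."""
    spec = np.fft.rfft(master_event, axis=1)[:, freq_idx]        # (n_chan, n_freq)
    return spec / np.linalg.norm(spec, axis=0, keepdims=True)

def mfp_statistic(window: np.ndarray, templates: np.ndarray,
                  freq_idx: np.ndarray) -> float:
    """Matched field statistic for one continuous-data window of the same shape:
    squared coherence with the template vectors, averaged over frequencies.
    Values approach 1 for wavefields resembling the master event."""
    spec = np.fft.rfft(window, axis=1)[:, freq_idx]
    spec = spec / np.linalg.norm(spec, axis=0, keepdims=True)
    coherence = np.abs(np.sum(np.conj(templates) * spec, axis=0)) ** 2
    return float(np.mean(coherence))
```

In practice each master event from the catalog would yield one such template set, and the statistic would be evaluated on sliding windows of the continuous data with detections declared above an empirically chosen threshold.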


Kuhn D., NORSAR | Oye V., NORSAR | Albaric J., NORSAR | Harris D., Deschutes Signal Processing LLC | And 3 more authors.
Energy Procedia | Year: 2014

Due to its remoteness, the CO2 Lab close to the town of Longyearbyen on Svalbard presents a unique opportunity to demonstrate the entire CO2 value chain based on its closed energy system. The formation considered as a potential CO2 storage unit consists of mixed sandstone and shale beds and presents itself as a fractured, low-permeability reservoir. A geophone network surrounding the injection well has been installed to locate microseismic events during injection tests and to estimate background seismicity. During the first water injection in 2010, a microseismic event (M ∼ 1) was recorded and located close to the injection well, followed by a series of aftershocks. Later injection tests did not generate any detectable microseismic events; nevertheless, pressure and flow rate showed a pattern characteristic of fracture opening, potentially indicating "aseismic" fracture propagation. Records of ambient seismic noise are analysed with a cross-correlation method in order to reconstruct the impulse responses between sensors. The daily cross-correlations are dominated by tube-wave signals originating from the bottom of the well, which show a sudden increase in activity. We also demonstrate a noise cancellation method with great potential for suppressing electromagnetic and cultural noise. Despite several difficulties encountered at the CO2 Lab, new knowledge and guidelines for best-practice containment monitoring with seismic methods in the Arctic could be developed. © 2014 The Authors. Published by Elsevier Ltd.
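
A minimal sketch of the ambient-noise cross-correlation step, assuming spectrally whitened segments from two geophones and a simple linear stack; the function names and segment handling are illustrative and not the authors' implementation.

```python
import numpy as np

def whiten(trace: np.ndarray) -> np.ndarray:
    """Spectral whitening: keep the phase spectrum, flatten the amplitude spectrum."""
    spec = np.fft.rfft(trace)
    return np.fft.irfft(spec / (np.abs(spec) + 1e-12), n=len(trace))

def daily_cross_correlation(trace_a: np.ndarray, trace_b: np.ndarray,
                            n_segments: int = 24) -> np.ndarray:
    """Stack cross-correlations of whitened noise segments recorded at two
    sensors; the stack approximates the inter-sensor impulse response."""
    seg_len = min(len(trace_a), len(trace_b)) // n_segments
    stack = np.zeros(2 * seg_len - 1)
    for k in range(n_segments):
        a = whiten(trace_a[k * seg_len:(k + 1) * seg_len])
        b = whiten(trace_b[k * seg_len:(k + 1) * seg_len])
        stack += np.correlate(a, b, mode="full")
    return stack / n_segments
```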


Harris D.B., Lawrence Livermore National Laboratory | Harris D.B., Deschutes Signal Processing LLC | Dodge D.A., Lawrence Livermore National Laboratory
Bulletin of the Seismological Society of America | Year: 2011

We describe a prototype detection framework that automatically clusters events in real time from a rapidly unfolding aftershock sequence. We use the fact that many aftershocks are repetitive, producing similar waveforms. By clustering events based on correlation measures of waveform similarity, the number of independent event instances that must be examined in detail by analysts may be reduced. Our system processes array data and acquires waveform templates with a short-term average (STA)/long-term average (LTA) detector operating on a beam directed at the P phases of the aftershock sequence. The templates are used to create correlation-type (subspace) detectors that sweep the subsequent data stream for occurrences of the same waveform pattern. Events are clustered by association with a particular detector. Hundreds of subspace detectors can run in this framework a hundred times faster than real time. Nonetheless, to check the growth in the number of detectors, the framework pauses periodically and reclusters detections to reduce the number of event groups. These groups define new subspace detectors that replace the older generation of detectors. Because low-magnitude occurrences of a particular signal template may be missed by the STA/LTA detector, we advocate restarting the framework from the beginning of the sequence periodically to reprocess the entire data stream with the existing detectors. We tested the framework on 10 days of data from the Nevada Seismic Array (NVAR) covering the 2003 San Simeon earthquake. One hundred eighty-four automatically generated detectors produced 676 detections, resulting in a potential reduction in analyst workload of up to 73%.
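
For orientation, a minimal NumPy sketch of a subspace-detection statistic of the kind described above: an orthonormal basis is estimated from aligned template waveforms of one event cluster, and the sliding statistic is the fraction of window energy captured by that basis. The basis dimension, alignment, and thresholding details are assumptions, not the specific choices made in the framework.

```python
import numpy as np

def subspace_basis(templates: list, dim: int) -> np.ndarray:
    """Orthonormal basis (n_samples x dim) spanning aligned, equal-length
    template waveforms from one event cluster, obtained via SVD."""
    X = np.column_stack([t / np.linalg.norm(t) for t in templates])
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :dim]

def subspace_statistic(stream: np.ndarray, U: np.ndarray) -> np.ndarray:
    """Sliding detection statistic: the fraction of window energy captured by
    the subspace.  Values near 1 flag waveforms similar to the cluster."""
    n = U.shape[0]
    stats = np.zeros(len(stream) - n + 1)
    for i in range(len(stats)):
        w = stream[i:i + n]
        energy = float(np.dot(w, w))
        if energy > 0.0:
            proj = U.T @ w
            stats[i] = float(np.dot(proj, proj)) / energy
    return stats
```

A detection would be declared where the statistic exceeds an empirically chosen threshold, and detections made by the same detector define one event cluster, consistent with the grouping described in the abstract.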
