PubMed | Japan Atomic Energy Agency, College Park, Pacific Northwest National Laboratory, Health Canada and 12 more.
Type: | Journal: Journal of environmental radioactivity | Year: 2016
The International Monitoring System (IMS) is part of the verification regime of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). At entry into force, half of the 80 radionuclide stations will be able to measure concentrations of several radioactive xenon isotopes produced in nuclear explosions; the full network may be populated with xenon monitoring afterward. An understanding of natural and man-made radionuclide backgrounds can be used in accordance with the provisions of the treaty (such as the event screening criteria in Annex 2 to the Protocol of the Treaty) for effective implementation of the verification regime. Fission-based production of (99)Mo for medical purposes also generates nuisance radioxenon isotopes that are usually vented to the atmosphere. One way to account for the effect that emissions from medical isotope production have on radionuclide samples from the IMS is to use stack monitoring data, where available, together with atmospheric transport modeling. Recently, individuals from seven nations participated in a challenge exercise that used atmospheric transport modeling to predict the time history of (133)Xe concentration measurements at the IMS radionuclide station in Germany using stack monitoring data from a medical isotope production facility in Belgium. Participants received only the stack monitoring data and used the atmospheric transport model and meteorological data of their choice. Some of the models predicted the highest measured concentrations quite well. A model comparison rank and ensemble analysis suggests that combining multiple models may provide more accurate predicted concentrations than any single model. None of the submissions based only on the stack monitoring data predicted the small measured concentrations very well.
Modeling of sources by other nuclear facilities with smaller releases than medical isotope production facilities may be important in understanding how to discriminate those releases from releases from a nuclear explosion.
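The exercise's finding that combining models can outperform any single member is easy to illustrate with an equal-weight ensemble mean. The sketch below is purely illustrative: the measured series and the two model predictions are invented, and the actual exercise used a rank-based comparison across real atmospheric transport model submissions.

```python
import numpy as np

# Hypothetical 12-sample time history of measured (133)Xe concentration
# (arbitrary units; invented data, not from the challenge exercise).
measured = np.sin(np.linspace(0, 3, 12)) ** 2 + 0.1

# Two hypothetical ATM predictions with opposite systematic biases
# (toy values; model_b may dip negative, which a real concentration cannot).
model_a = measured + 0.5          # over-predicts everywhere
model_b = measured - 0.5          # under-predicts everywhere

def rmse(pred, obs):
    """Root-mean-square error of a predicted time history against observations."""
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

# Equal-weight ensemble mean: the opposite biases cancel, so the
# ensemble scores a lower RMSE than either member alone.
ensemble = 0.5 * (model_a + model_b)
```

In practice the member weights would be estimated from past performance rather than fixed at 1/N, but the cancellation effect shown here is the core of why multi-model ensembles help.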
News Article | November 10, 2016
The data center industry is booming, and tech giants are increasingly receiving large incentive packages from state and local governments in exchange for building their data centers in the area. But whether these centers create jobs and benefit local economies remains in question, according to a recent report from national policy resource center Good Jobs First. Google, Microsoft, Facebook, Apple, and Amazon alone have been awarded more than $2 billion in subsidies combined, according to the report, "Money Lost to the Cloud: How Data Centers Benefit from State and Local Government Subsidies." While the number of construction jobs generated for each facility is comparable to building a factory or distribution center, an operating data center creates an average of just 30 to 50 permanent jobs, the report stated; for larger facilities, that number can rise to as many as 200. "The question is, how many long-term jobs are being created, and how many are accessible to people from the region? Do people from the region have skills to apply for them?" said Kasia Tarczynska, a research analyst at Good Jobs First and the author of the report. The US is home to 44% of the world's data centers, leading every other country, the report found. And 27 states have established incentive programs specifically meant to draw data centers in, the report found. "The incentive packages can be quite outlandish—far exceeding any reasonable economic justification," Todd L. Cherry, director of the Center for Economic Research and Policy Analysis at Appalachian State University, told TechRepublic for another story on data centers and job creation. "This is a form of what we call 'the winner's curse.' When governments engage in a competitive bidding process over an uncertain benefit, the one that wins is the one that overestimates the benefit."
The most important factors for companies considering building a data center are a stable climate and access to inexpensive energy, Tarczynska said. That means any area of the country prone to earthquakes, tornadoes, or hurricanes is out, and many rural areas are in. Electric power is the largest operating expense for data centers, reported at 70% to 80% of the operating budget, the report stated. Increasingly, companies including Facebook, under pressure from environmental groups, are looking to access renewable energy for their data centers, Tarczynska said. A large data center can cost up to $1 billion to build, the report found. That cost might include land acquisition, facility construction, infrastructure upgrades such as utility hookups, and servers and other equipment. Purchases of mechanical equipment, such as computers and cooling rooms, and power equipment, such as generators and transformers, are the biggest initial costs. Construction can average $1,000 to $2,000 per square foot, the report found, so the amount of property and other taxes the company pays could be very large. The problem is that many of those taxes are being abated, Tarczynska said. Companies might choose two locations in different states and have the two governments bid against each other. For example, in September, Facebook engaged in a bidding process between New Mexico and Utah for the opportunity to host its new data center. The New Mexico town of Los Lunas agreed to give up property taxes for 30 years in exchange for annual payments from Facebook starting at $50,000 and rising as high as $500,000. In June, the town approved an ordinance allowing for the issuance of up to $30 billion in industrial revenue bonds in an effort to draw the company's business. "These facilities receive a huge amount of tax breaks from state and local governments," Tarczynska said. "There are public costs to those facilities."
Only 15 states provide easily available online reporting of costs for data center-specific programs, the report found. Many states do not consider these to be economic development incentive programs, but rather something written into the tax code, which invokes various confidentiality laws, Tarczynska said. But that will soon change: a new Governmental Accounting Standards Board rule will require states, cities, counties, and school districts to release fiscal reports on tax incentives starting in 2017. New York City released its findings early: the city lost $3 billion in 2016 to corporate tax breaks for housing and economic development projects. "It's quite a big development, and will help the public and researchers understand what is going on when it comes to tax breaks," Tarczynska said. The report also examined 11 "megadeals," economic development incentives of more than $15 million that states and local governments offered to tech companies. The average cost of one job in those deals was $2 million in tax breaks. "This is a huge amount of money being paid for the creation of each individual job," Tarczynska said. "It's important to think about the opportunity cost—what else could be done with that money?" However, Mehdi Paryavi, chairman of the International Data Center Authority (IDCA), argues that it is misleading to compare the size of a data center to the number of jobs it creates. Data centers are similar to farms in that they take up a lot of square footage but don't require hundreds of employees. Tech companies building data centers in rural areas purchase land and develop it, and employ both contract and long-term employees, a benefit for those areas, Paryavi said. "When the Apples and Googles step into a troubled economy, the only thing they can do is help it," Paryavi said. "These guys are coming in and investing millions of dollars.
Trusting that your county and state has legislation that protects that investment, why not have their business? It's revenue you didn't have before." Though the number of people working at a data center is smaller than at a factory, the quality of the positions and the work is much higher, Paryavi said. A number of jobs are also created indirectly: for architects, for the companies that manufacture cooling supplies, and for those that maintain generators and replace batteries. "Those companies are all prospering because of new investments these companies are making with their data centers," Paryavi said. International competition is also a big reason why states should encourage data center development, Paryavi said. "We have already lost all of our manufacturing to abroad, because incentives outside our borders were better than incentives within the borders," Paryavi said. "Do we want to give away our data center business as well, or provide the right incentives so the Apples and Facebooks will be loyal and have long-term plans with us?" The average lifespan of a data center is 10 years, Paryavi said. One thing is certain: it's unlikely that we'll see data center construction stop any time soon. "It's an industry in a growing mode—data centers are necessary facilities for companies to operate," Tarczynska said. "There are going to be more and more of them as the need for providing backup on data grows."
News Article | February 16, 2017
This report studies sales (consumption) of Data Center Cooling Systems in the global market, especially in the USA, China, Europe, Japan, India, and Southeast Asia. It covers the top players in these regions, with sales, price, revenue, and market share for each, and splits the global market by region, by product type (In-Row Cooling, Overhead Cooling, In-Rack Cooling, Rear Door Cooling), and by application, reporting sales (consumption), revenue, market share, and growth rate from 2011 to 2021 (forecast).

Global Data Center Cooling Systems Sales Market Report 2016, contents (excerpt):

1 Data Center Cooling Systems Overview
1.1 Product Overview and Scope of Data Center Cooling Systems
1.2 Classification of Data Center Cooling Systems (1.2.1 In-Row Cooling; 1.2.2 Overhead Cooling; 1.2.3 In-Rack Cooling; 1.2.4 Rear Door Cooling)
1.3 Application of Data Center Cooling Systems (1.3.1 Application 1; 1.3.2 Application 2; 1.3.3 Application 3)
1.4 Data Center Cooling Systems Market by Regions: USA, China, Europe, Japan, India, and Southeast Asia Status and Prospect (2011-2021)
1.5 Global Market Size (Value and Volume) of Data Center Cooling Systems (2011-2021): Global Sales and Growth Rate; Global Revenue and Growth Rate
9 Global Data Center Cooling Systems Manufacturers Analysis
9.1 Emerson Network Power
9.2 APC
9.3 Rittal Corporation
9.4 Airedale International
9.5 Degree Controls Inc
9.6 Schneider Electric
9.7 Equinix
9.8 Cloud Dynamics Inc
9.9 KyotoCooling BV
9.10 Siemon
9.11 3M Corp
9.12 Siemens
9.13 Coolcentric
9.14 Latisys
9.15 AST Modular
9.16 Wakefield-Vette Inc
9.17 Mitsubishi Electric
9.18 Raritan Inc
9.19 General Air Products

For each of the profiled manufacturers (9.1-9.10), the report provides: Company Basic Information, Manufacturing Base and Competitors; Product Type, Application and Specification (Type I, Type II); Sales, Revenue, Price and Gross Margin (2011-2016); and a Main Business/Business Overview. For more information, please visit https://www.wiseguyreports.com/sample-request/682659-global-data-center-cooling-systems-sales-market-report-2016
Evers L.G.,Royal Netherlands Meteorological Institute |
Evers L.G.,Technical University of Delft |
Brown D.,International Data Center |
Heaney K.D.,Ocean Acoustical Services and Instrumentation Systems |
And 4 more authors.
Geophysical Research Letters | Year: 2014
Atmospheric low-frequency sound, i.e., infrasound, from underwater events has not been considered thus far because the high impedance contrast of the water-air interface makes it almost fully reflective. Here we report for the first time atmospheric infrasound from a large underwater earthquake (Mw 8.1) near the Macquarie Ridge, recorded at 1325 km from the epicenter. Seismic waves coupled to hydroacoustic waves at the ocean floor, after which the energy entered the Sound Fixing and Ranging channel and was detected on a hydrophone array. The energy was diffracted into the water column by a seamount and an oceanic ridge, which acted as a secondary source, followed by coupling into the atmosphere. The latter results from evanescent wave coupling and the attendant anomalous transparency of the sea surface for very low frequency acoustic waves. Key Points: evanescent wave coupling links the solid Earth, oceans, and atmosphere; acoustic waves exploit the anomalous transparency of the water-air interface; underwater geophysical processes and events can be heard in the atmosphere. ©2014. American Geophysical Union. All Rights Reserved.
Fee D.,University of Alaska Fairbanks |
Waxler R.,University of Mississippi |
Assink J.,University of Mississippi |
Gitterman Y.,Geophysical Institute of Israel |
And 8 more authors.
Journal of Geophysical Research: Atmospheres | Year: 2013
Three large-scale infrasound calibration experiments were conducted in 2009 and 2011 to test the International Monitoring System (IMS) infrasound network and provide ground truth data for infrasound propagation studies. Here we provide an overview of the deployment, detonation, atmospheric specifications, infrasound array observations, and propagation modeling for the experiments. The experiments at the Sayarim Military Range, Israel, had equivalent TNT yields of 96.0, 7.4, and 76.8 t of explosives on 26 August 2009, 24 January 2011, and 26 January 2011, respectively. Successful international collaboration resulted in the deployment of numerous portable infrasound arrays in the region to supplement the IMS network and increase station density. Infrasound from the detonations is detected out to ~3500 km to the northwest in 2009 and ~6300 km to the northeast in 2011, reflecting the highly anisotropic nature of long-range infrasound propagation. For 2009, the moderately strong stratospheric wind jet results in a well-predicted set of arrivals at numerous arrays to the west-northwest. A second set of arrivals is also apparent, with low celerities and high frequencies. These arrivals are not predicted by the propagation modeling and result from unresolved atmospheric features. Strong eastward tropospheric winds (up to ~70 m/s) in 2011 produce high-amplitude tropospheric arrivals recorded out to >1000 km to the east. Significant eastward stratospheric winds (up to ~80 m/s) in 2011 generate numerous stratospheric arrivals and permit the long-range detection (i.e., >1000 km). No detections are made in directions opposite the tropospheric and stratospheric wind jets for any of the explosions. Comparison of predicted transmission loss and observed infrasound arrivals gives qualitative agreement. 
Propagation modeling for the 2011 experiments predicts lower transmission loss in the direction of downwind propagation compared to the 2009 experiment, consistent with the greater detection distance. Observations also suggest a more northerly component to the stratospheric winds for the 2009 experiment and less upper-atmosphere attenuation. The Sayarim infrasound calibration experiments clearly demonstrate the complexity and variability of the atmosphere and underscore the utility of large-scale calibration experiments with dense networks for better understanding infrasound propagation and detection. Additionally, they provide a rich data set for future scientific research. Key Points: three large ground-truth infrasound experiments were conducted in 2009 and 2011; strong wind jets permitted long-range detection; atmospheric specifications were sufficient for qualitative propagation modeling. © 2013. American Geophysical Union. All Rights Reserved.
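Arrival types in studies like this are commonly distinguished by celerity, the great-circle range divided by the travel time from the detonation. A minimal sketch follows; the band edges are indicative values from general infrasound practice (roughly 0.33-0.34 km/s for tropospheric, 0.28-0.32 km/s for stratospheric, slower for thermospheric returns), not thresholds taken from this paper.

```python
def classify_arrival(range_km, travel_time_s):
    """Classify an infrasound arrival by celerity (km/s).

    Celerity = great-circle range / travel time. The band edges below are
    assumed round numbers from the general literature, for illustration only.
    """
    celerity = range_km / travel_time_s
    if celerity >= 0.33:
        return celerity, "tropospheric"
    elif celerity >= 0.28:
        return celerity, "stratospheric"
    else:
        return celerity, "thermospheric"

# e.g. a hypothetical arrival 1000 km away, 3400 s after the shot:
cel, phase = classify_arrival(1000.0, 3400.0)   # celerity ~0.294 km/s
```

High-amplitude tropospheric arrivals ducted by the ~70 m/s wind jet and the stratospheric arrivals described in the abstract would land in the first two bands, respectively.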
Drob D.P.,U.S. Navy |
Garces M.,University of Hawaii at Manoa |
Hedlin M.,University of California at San Diego |
Brachet N.,International Data Center
Pure and Applied Geophysics | Year: 2010
Expert knowledge suggests that the performance of automated infrasound event association and source location algorithms could be greatly improved by the ability to continually update station travel-time curves to properly account for the hourly, daily, and seasonal changes of the atmospheric state. With the goal of reducing false alarm rates and improving network detection capability, we endeavor to develop, validate, and integrate this capability into infrasound processing operations at the International Data Centre of the Comprehensive Nuclear-Test-Ban Treaty Organization. Numerous studies have demonstrated that incorporation of hybrid ground-to-space (G2S) environmental specifications in numerical calculations of infrasound signal travel time and azimuth deviation yields significantly improved results over climatological atmospheric specifications, specifically for tropospheric and stratospheric modes. A robust infrastructure currently exists to generate hybrid G2S vector spherical harmonic coefficients on a real-time basis (every 3 to 6 h), based on existing operational and empirical models (Drob et al., 2003). The next requirement in this endeavor is thus to refine numerical procedures to calculate infrasound propagation characteristics for robust automatic infrasound arrival identification and network detection, location, and characterization algorithms. We present results from a new code that integrates the local (range-independent) τ-p ray equations to provide travel time, range, turning point, and azimuth deviation for any location on the globe, given a G2S vector spherical harmonic coefficient set. The code employs an accurate numerical technique capable of handling square-root singularities.
We investigate the seasonal variability of propagation characteristics over a five-year time series for two different stations within the International Monitoring System with the aim of understanding the capabilities of current working knowledge of the atmosphere and infrasound propagation models. The statistical behaviors or occurrence frequency of various propagation configurations are discussed. Representative examples of some of these propagation configuration states are also shown. © 2010 US Government.
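The range-independent τ-p integrals behind such a code can be sketched directly. For an effective sound-speed profile c(z) and horizontal slowness p, the travel time and ground range to the turning point (where c = 1/p) and back are t(p) = 2∫dz/(c√(1−p²c²)) and x(p) = 2∫p·c·dz/√(1−p²c²). The sketch below is a naive numeric stand-in: a midpoint rule simply never evaluates the integrable square-root singularity at the turning point, whereas the paper's code uses an accurate technique for it. The profile and all parameter values are invented for illustration.

```python
import numpy as np

def ray_time_and_range(p, c, z_max=60e3, n=200_000):
    """Travel time (s) and ground range (m) for a ray of horizontal slowness
    p (s/m) in a range-independent sound-speed profile c(z), integrated from
    the ground up to the ray's turning point and back down.

    Naive midpoint quadrature: the 1/sqrt turning-point singularity is
    integrable, and midpoints never touch the singular endpoint itself.
    """
    z = np.linspace(0.0, z_max, n)
    below = c(z) < 1.0 / p                  # altitudes the ray still traverses
    if below.all():
        raise ValueError("ray does not turn below z_max")
    zt = z[below].max()                     # last grid altitude below turning point
    dz = zt / n
    zm = (np.arange(n) + 0.5) * dz          # midpoints strictly inside (0, zt)
    cm = c(zm)
    root = np.sqrt(1.0 - (p * cm) ** 2)
    t = 2.0 * np.sum(dz / (cm * root))      # down-going + up-going legs
    x = 2.0 * np.sum(p * cm * dz / root)
    return t, x

# Toy effective-sound-speed profile: 340 m/s at the ground, +3 m/s per km,
# mimicking a ducting stratospheric return (illustrative numbers only).
c = lambda z: 340.0 + 0.003 * z
t, x = ray_time_and_range(1.0 / 380.0, c)   # ray turns where c(z) = 380 m/s
```

The ground celerity x/t is bounded between p·c_min² and 1/p, which makes the result easy to sanity-check even without an analytic solution.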
Brown D.J.,International Data Center |
Szuberla C.A.L.,University of Alaska Fairbanks |
McCormack D.,Geological Survey of Canada |
Mialle P.,International Data Center
Pure and Applied Geophysics | Year: 2014
A spatial filter is often attached to a microphone or microbarometer in order to reduce the noise caused by atmospheric turbulence. This filtering technique is based on the assumption that the coherence length of turbulence is smaller than the spatial extent of the filter, and so contributions from turbulence recorded at widely separated ports will tend to cancel while those of the signal of interest, which will have coherence length larger than the spatial dimensions of the filter, will be reinforced. In this paper, the plane wave response for a spatial filter with an arbitrary arrangement of open ports is determined. It is found that propagation over different port-to-sensor distances causes out-of-phase sinusoids to be summed at the central manifold and can lead to significant amplitude decay and phase delays as a function of frequency. The determined spatial filter plane wave response is superimposed on an array response typical of infrasound arrays that constitute the International Monitoring System infrasound network used for nuclear monitoring purposes. It is found that signal detection capability in terms of the Fisher Statistic can be significantly degraded at certain frequencies. The least-squares estimate of signal slowness can change by up to 1.5° and up to 10 m/s if an asymmetric arrangement of low and high frequency spatial filters is used. However, if a symmetric arrangement of filters is used the least-squares estimate of signal slowness is found to be largely unaffected, except near the predicted null frequency. © 2012 The Author(s).
Brown D.,International Data Center |
Ceranna L.,Bundesanstalt fur Geowissenschaften und Rohstoffe BGR |
Prior M.,International Data Center |
Mialle P.,International Data Center |
Le Bras R.J.,International Data Center
Pure and Applied Geophysics | Year: 2014
The International Data Centre (IDC) in Vienna, Austria, is determining, as part of automatic processing, sensor noise levels for all seismic, hydroacoustic, and infrasound (SHI) stations in the International Monitoring System (IMS) operated by the Provisional Technical Secretariat of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Sensor noise is being determined several times per day as a power spectral density (PSD) using the Welch overlapping method. Based on accumulated PSD statistics a probability density function (PDF) is also determined, from which low and high noise curves for each sensor are extracted. Global low and high noise curves as a function of frequency for each of the SHI technologies are determined as the minimum and maximum of the individual station low and high noise curves, respectively, taken over the entire network of contributing stations. An attempt is made to ensure that only correctly calibrated station data contributes to the global noise models by additionally considering various automatic detection statistics. In this paper, global low and high noise curves for 2010 are presented for each of the SHI monitoring technologies. Except for a very slight deviation at the microseism peak, the seismic global low noise model reproduces the Peterson (1993) NLNM low-noise curve identically. The global infrasonic low noise model is found to agree with that of Bowman et al. (2005, 2007) but disagrees with the revised results presented in Bowman et al. (2009) by a factor of 2 in the calculation of the PSD. The global hydroacoustic low and high noise curves are found to be in quantitative agreement with Urick's oceanic ambient noise curves for light to heavy shipping. Whale noise is found to be a feature of the hydroacoustic high noise curves at around 15 and 25 Hz. © 2012 The Author(s).
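The processing chain the abstract describes, Welch PSDs computed several times per day, accumulated into a distribution, and reduced to low/high noise curves, can be sketched with a numpy-only Welch estimator. Segment length, overlap, percentile levels, and the white-noise stand-in data are all assumptions for illustration, not IDC operational parameters.

```python
import numpy as np

def welch_psd(x, fs, nperseg=256):
    """One-sided PSD by Welch's method: average modified periodograms of
    50%-overlapping, Hann-windowed segments (units: x^2 per Hz)."""
    win = np.hanning(nperseg)
    scale = fs * np.sum(win ** 2)              # density normalisation
    step = nperseg // 2                        # 50 % overlap
    specs = []
    for start in range(0, len(x) - nperseg + 1, step):
        seg = x[start:start + nperseg] * win
        s = np.abs(np.fft.rfft(seg)) ** 2 / scale
        s[1:-1] *= 2.0                         # fold negative frequencies
        specs.append(s)
    return np.fft.rfftfreq(nperseg, 1.0 / fs), np.mean(specs, axis=0)

# Accumulate PSDs over many processing intervals (white-noise stand-in data),
# then take percentiles of the PSD distribution as station noise curves.
rng = np.random.default_rng(0)
fs = 20.0                                      # Hz, e.g. an infrasound channel
psds = []
for _ in range(50):
    f, pxx = welch_psd(rng.standard_normal(4096), fs)
    psds.append(pxx)
psds = np.array(psds)
low_noise = np.percentile(psds, 5, axis=0)     # station low-noise curve
high_noise = np.percentile(psds, 95, axis=0)   # station high-noise curve
```

Global low/high noise models would then take the minimum and maximum of these per-station curves across the network, as the abstract describes. For unit-variance white noise sampled at fs, the one-sided density should sit near 2/fs, a handy check on the normalisation.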