International Data Center

Vienna, Austria


Kusmierczyk-Michulec J., International Data Center | Krysta M., International Data Center | Kalinowski M., International Data Center | Hoffmann E., Australian Nuclear Science and Technology Organisation | Bare J., International Data Center
Journal of Environmental Radioactivity | Year: 2017

To investigate the transport of xenon emissions, the Provisional Technical Secretariat (PTS) operates an Atmospheric Transport Modelling (ATM) system based on the Lagrangian particle dispersion model FLEXPART. The air mass trajectory ideally provides a "link" between a radionuclide release and a detection confirmed by radionuclide measurements. This paper investigates the long-range transport of Xe-133 emissions under convective and non-convective conditions, with special emphasis on evaluating the changes in simulated activity concentrations due to the inclusion of convective transport in the ATM simulations. For that purpose, a series of 14-day forward simulations, with and without convective transport, released daily between 1 January 2011 and 30 June 2013, was analysed. The release point was the ANSTO facility in Australia. The simulated activity concentrations for the period January 2011 to February 2012 were calculated using the daily emission values provided by the ANSTO facility; outside that period, the median daily emission value was used. The simulations used analysed meteorological input data provided by the European Centre for Medium-Range Weather Forecasts (ECMWF) at a spatial resolution of 0.5°. It was found that including convection in the ATM simulations led to a small decrease in the activity concentration of long-range Xe-133 transport, as compared to transport without convection. In special cases related to deep convection, the opposite effect was observed. The availability of both daily emission values and measured Xe-133 activity concentrations provided an opportunity to validate the simulations. Based on the paired t-test, a 95% confidence interval for the true mean difference between simulations without convective transport and measurements was constructed. The overall uncertainty was estimated to lie between 0.08 and 0.25 mBq/m3. The uncertainty for the simulations with convective transport included is slightly shifted towards lower values and lies between 0.06 and 0.20 mBq/m3. © 2017
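The paired-t-test confidence interval described above can be sketched as follows; the arrays here are synthetic stand-ins, not the PTS simulations or IMS measurements:

```python
import numpy as np
from scipy import stats

def paired_diff_ci(simulated, measured, confidence=0.95):
    """Confidence interval for the mean of paired differences
    (simulation - measurement), via the paired t-test construction."""
    d = np.asarray(simulated) - np.asarray(measured)
    n = d.size
    mean = d.mean()
    sem = d.std(ddof=1) / np.sqrt(n)          # standard error of the mean
    t_crit = stats.t.ppf(0.5 + confidence / 2, df=n - 1)
    return mean - t_crit * sem, mean + t_crit * sem

# Illustrative activity concentrations in mBq/m3 (made-up data)
rng = np.random.default_rng(0)
measured = rng.lognormal(mean=0.0, sigma=0.5, size=200)
simulated = measured + rng.normal(loc=0.15, scale=0.4, size=200)
lo, hi = paired_diff_ci(simulated, measured)
```

An interval that excludes zero indicates a systematic bias between simulation and measurement at the chosen confidence level.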


Heaney K.D., Ocean Acoustical Services and Instrumentation Systems | Prior M., International Data Center | Prior M., TNO | Campbell R.L., Ocean Acoustical Services and Instrumentation Systems
Journal of the Acoustical Society of America | Year: 2017

The ocean is nearly transparent to low-frequency sound, permitting the observation of distant events such as earthquakes or explosions at full ocean-basin scales. At very low frequencies the ocean acts as a shallow-water waveguide, and lateral variability in bathymetry can lead to out-of-plane effects. In this paper, data from the International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) are used to present two cases where robustly localized seismic events, in locations clearly within the two-dimensional (2-D) shadow of a continent or large island, generate T-phase signals that are received on a hydroacoustic station. A fully three-dimensional parabolic equation model is used to demonstrate that lateral variability of the bathymetry can lead to diffraction, explaining both observations. The implications are that the CTBTO network has greater coverage than predicted by 2-D models and that inclusion of diffraction in future processing can improve the automatic global association of hydroacoustic events. © 2017 Acoustical Society of America.


The data center industry is booming, and tech giants are increasingly receiving large incentive packages from state and local governments in exchange for building their data centers in the area. But whether these centers create jobs and benefit local economies remains in question, according to a recent report from national policy resource center Good Jobs First.

Google, Microsoft, Facebook, Apple, and Amazon alone have been awarded more than $2 billion in subsidies combined, according to the report "Money Lost to the Cloud: How Data Centers Benefit from State and Local Government Subsidies." While the number of construction jobs generated for each facility is comparable to building a factory or distribution center, an operating data center creates an average of just 30 to 50 permanent jobs, the report stated. For larger facilities, that number rises to up to 200 jobs.

"The question is, how many long-term jobs are being created, and how many are accessible to people from the region? Do people from the region have the skills to apply for them?" said Kasia Tarczynska, a research analyst at Good Jobs First and the author of the report.

SEE: Why data centers fail to bring new jobs to small towns

The US is home to 44% of the world's data centers, leading every other country, the report found. And 27 states have established incentive programs specifically meant to draw data centers in. "The incentive packages can be quite outlandish, far exceeding any reasonable economic justification," Todd L. Cherry, director of the Center for Economic Research and Policy Analysis at Appalachian State University, told TechRepublic for another story on data centers and job creation. "This is a form of what we call 'the winner's curse.' When governments engage in a competitive bidding process over an uncertain benefit, the one that wins is the one that overestimates the benefit."

The most important factors for companies considering building a data center are a stable climate and access to inexpensive energy, Tarczynska said. That means any area of the country prone to earthquakes, tornadoes, or hurricanes is out, and many rural areas are in. Electric power is the largest operating expense for data centers, at a reported 70% to 80% of the operating budget. Increasingly, companies including Facebook, under pressure from environmental groups, are looking to access renewable energy for their data centers, Tarczynska said.

A large data center can cost up to $1 billion to build, the report found. That cost might include land acquisition, facility construction, infrastructure upgrades such as utility hookups, and servers and other equipment. Purchases of mechanical equipment such as computers and cooling rooms, and power equipment such as generators and transformers, are the biggest initial costs. Construction can average $1,000 to $2,000 per square foot, the report found. The amount of property and other taxes the company pays could therefore be very large. The problem is that many of those taxes are being abated, Tarczynska said.

Companies might choose two locations in different states and have the two governments bid against each other. For example, in September, Facebook engaged in a bidding process between New Mexico and Utah for the opportunity to host its new data center. The New Mexico town of Los Lunas agreed to give up property taxes for 30 years in exchange for annual payments from Facebook starting at $50,000 and going as high as $500,000. In June, the town approved an ordinance allowing for the issuance of up to $30 billion in industrial revenue bonds in an effort to draw the company's business. "These facilities receive a huge amount of tax breaks from state and local governments," Tarczynska said. "There are public costs to those facilities."

Only 15 states provide easily available online reporting of costs for data center-specific programs, the report found. Many states do not consider these to be economic development incentive programs, but rather something written into the tax code, which invokes various confidentiality laws, Tarczynska said. But that will soon change: a new Governmental Accounting Standards Board rule will require states, cities, counties, and school districts to release fiscal reports on tax incentives, starting in 2017. New York City released its findings early: the city lost $3 billion in 2016 to corporate tax breaks for housing and economic development projects. "It's quite a big development, and will help the public and researchers understand what is going on when it comes to tax breaks," Tarczynska said.

The report also examined 11 "megadeals," or economic development incentives of more than $15 million, that states and local governments offered to tech companies. The average cost of one job in those deals was $2 million in tax breaks. "This is a huge amount of money being paid for the creation of each individual job," Tarczynska said. "It's important to think about the opportunity cost: what else could be done with that money?"

However, Mehdi Paryavi, chairman of the International Data Center Authority (IDCA), argues that it's incorrect to compare the size of a data center to the number of jobs created. Data centers are similar to farms, in that they take up a lot of square footage but don't require hundreds of employees. Tech companies building data centers in rural areas purchase land and develop it, and employ both contract and long-term employees, a benefit for those areas, Paryavi said. "When the Apples and Googles step into a troubled economy, the only thing they can do is help it," Paryavi said. "These guys are coming in and investing millions of dollars. Trusting that your county and state has legislation that protects that investment, why not have their business? It's revenue you didn't have before."

Though the number of people working at a data center is smaller than at a factory, the quality of the positions and work is much higher, Paryavi said. There are also a number of jobs created indirectly, such as those for architects, for the companies that manufacture cooling supplies, and for those that maintain generators and replace batteries. "Those companies are all prospering because of new investments these companies are making with their data centers," Paryavi said.

International competition is also a large reason why states should encourage data center development, Paryavi said. "We have already lost all of our manufacturing to abroad, because incentives outside our borders were better than incentives within the borders," Paryavi said. "Do we want to give away our data center business as well, or provide the right incentives so the Apples and Facebooks will be loyal and have long-term plans with us?" The average lifespan of a data center is 10 years, Paryavi said.

One thing is certain: it's unlikely that we'll see data center construction stop any time soon. "It's an industry in a growing mode; data centers are necessary facilities for companies to operate," Tarczynska said. "There are going to be more and more of them as the need for providing backup on data grows."


This report studies sales (consumption) of Data Center Cooling Systems in the global market, especially in the USA, China, Europe, Japan, India, and Southeast Asia. It focuses on the top players in these regions/countries, with sales, price, revenue, and market share for each player. The global market is split by region, by product type (In-Row Cooling, Overhead Cooling, In-Rack Cooling, Rear Door Cooling), and by application, with sales (consumption), revenue, market share, and growth rate for each segment from 2011 to 2021 (forecast).

Global Data Center Cooling Systems Sales Market Report 2016, abridged table of contents:

1 Data Center Cooling Systems Overview
1.1 Product Overview and Scope of Data Center Cooling Systems
1.2 Classification of Data Center Cooling Systems (1.2.1 In-Row Cooling; 1.2.2 Overhead Cooling; 1.2.3 In-Rack Cooling; 1.2.4 Rear Door Cooling)
1.3 Application of Data Center Cooling Systems (1.3.1 Application 1; 1.3.2 Application 2; 1.3.3 Application 3)
1.4 Data Center Cooling Systems Market by Regions: Status and Prospect (2011-2021) for the USA, China, Europe, Japan, India, and Southeast Asia
1.5 Global Market Size (Value and Volume) of Data Center Cooling Systems (2011-2021): 1.5.1 Global Sales and Growth Rate; 1.5.2 Global Revenue and Growth Rate
9 Global Data Center Cooling Systems Manufacturers Analysis
9.1-9.10 Emerson Network Power, APC, Rittal Corporation, Airedale International, Degree Controls Inc, Schneider Electric, Equinix, Cloud Dynamics Inc, KyotoCooling BV, and Siemon, covering for each: company basic information, manufacturing base and competitors; product type, application, and specification (Type I, Type II); sales, revenue, price, and gross margin (2011-2016); and main business overview
9.11-9.19 3M Corp, Siemens, Coolcentric, Latisys, AST Modular, Wakefield-Vette Inc, Mitsubishi Electric, Raritan Inc, General Air Products

For more information, please visit https://www.wiseguyreports.com/sample-request/682659-global-data-center-cooling-systems-sales-market-report-2016


Fee D., University of Alaska Fairbanks | Waxler R., University of Mississippi | Assink J., University of Mississippi | Gitterman Y., Geophysical Institute of Israel | And 8 more authors.
Journal of Geophysical Research: Atmospheres | Year: 2013

Three large-scale infrasound calibration experiments were conducted in 2009 and 2011 to test the International Monitoring System (IMS) infrasound network and provide ground truth data for infrasound propagation studies. Here we provide an overview of the deployment, detonation, atmospheric specifications, infrasound array observations, and propagation modeling for the experiments. The experiments at the Sayarim Military Range, Israel, had equivalent TNT yields of 96.0, 7.4, and 76.8 t of explosives on 26 August 2009, 24 January 2011, and 26 January 2011, respectively. Successful international collaboration resulted in the deployment of numerous portable infrasound arrays in the region to supplement the IMS network and increase station density. Infrasound from the detonations is detected out to ~3500 km to the northwest in 2009 and ~6300 km to the northeast in 2011, reflecting the highly anisotropic nature of long-range infrasound propagation. For 2009, the moderately strong stratospheric wind jet results in a well-predicted set of arrivals at numerous arrays to the west-northwest. A second set of arrivals is also apparent, with low celerities and high frequencies. These arrivals are not predicted by the propagation modeling and result from unresolved atmospheric features. Strong eastward tropospheric winds (up to ~70 m/s) in 2011 produce high-amplitude tropospheric arrivals recorded out to >1000 km to the east. Significant eastward stratospheric winds (up to ~80 m/s) in 2011 generate numerous stratospheric arrivals and permit the long-range detection (i.e., >1000 km). No detections are made in directions opposite the tropospheric and stratospheric wind jets for any of the explosions. Comparison of predicted transmission loss and observed infrasound arrivals gives qualitative agreement. 
Propagation modeling for the 2011 experiments predicts lower transmission loss in the downwind direction than for the 2009 experiment, consistent with the greater detection distance. Observations also suggest a more northerly component to the stratospheric winds for the 2009 experiment and less upper-atmosphere attenuation. The Sayarim infrasound calibration experiments clearly demonstrate the complexity and variability of the atmosphere, and underscore the utility of large-scale calibration experiments with dense networks for better understanding infrasound propagation and detection. Additionally, they provide a rich data set for future scientific research. Key Points: three large ground-truth infrasound experiments were conducted in 2009 and 2011; strong wind jets permitted long-range detection; atmospheric specifications were sufficient for qualitative propagation modeling. © 2013. American Geophysical Union. All Rights Reserved.
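Arrival types in studies like this are commonly distinguished by celerity, i.e. great-circle range divided by travel time. A minimal sketch, with approximate band edges taken from the general infrasound literature rather than from this paper:

```python
def classify_arrival(range_km, travel_time_s):
    """Rough infrasound phase classification by celerity (km/s).

    The band edges below are approximate values commonly quoted in the
    infrasound literature, not thresholds from the Sayarim experiments.
    """
    celerity = range_km / travel_time_s
    if celerity >= 0.32:        # near the surface sound speed: tropospheric duct
        phase = "tropospheric"
    elif celerity >= 0.26:      # stratospheric returns
        phase = "stratospheric"
    elif celerity >= 0.18:      # slow, attenuated thermospheric returns
        phase = "thermospheric"
    else:
        phase = "unidentified"
    return celerity, phase

cel, phase = classify_arrival(1000.0, 3400.0)   # a 1000 km path, ~57 min
```

High-frequency arrivals with celerities outside these bands, like the unpredicted 2009 set, are exactly the cases attributed above to unresolved atmospheric features.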


Drob D.P., U.S. Navy | Garces M., University of Hawaii at Manoa | Hedlin M., University of California at San Diego | Brachet N., International Data Center
Pure and Applied Geophysics | Year: 2010

Expert knowledge suggests that the performance of automated infrasound event association and source location algorithms could be greatly improved by the ability to continually update station travel-time curves to properly account for the hourly, daily, and seasonal changes of the atmospheric state. With the goal of reducing false alarm rates and improving network detection capability, we endeavor to develop, validate, and integrate this capability into infrasound processing operations at the International Data Centre of the Comprehensive Nuclear-Test-Ban Treaty Organization. Numerous studies have demonstrated that incorporation of hybrid ground-to-space (G2S) environmental specifications in numerical calculations of infrasound signal travel time and azimuth deviation yields significantly improved results over those of climatological atmospheric specifications, specifically for tropospheric and stratospheric modes. A robust infrastructure currently exists to generate hybrid G2S vector spherical harmonic coefficients on a real-time basis (every 3 to 6 hours), based on existing operational and empirical models (Drob et al., 2003). The next requirement in this endeavor is thus to refine numerical procedures to calculate infrasound propagation characteristics for robust automatic infrasound arrival identification and network detection, location, and characterization algorithms. We present results from a new code that integrates the local (range-independent) τ-p ray equations to provide travel time, range, turning point, and azimuth deviation for any location on the globe given a G2S vector spherical harmonic coefficient set. The code employs an accurate numerical technique capable of handling square-root singularities.
We investigate the seasonal variability of propagation characteristics over a five-year time series for two different stations within the International Monitoring System, with the aim of understanding the capabilities of current working knowledge of the atmosphere and infrasound propagation models. The statistical behavior, or occurrence frequency, of various propagation configurations is discussed. Representative examples of some of these propagation configuration states are also shown. © 2010 US Government.
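The square-root singularity mentioned in the abstract arises at the ray's turning point, where the integrand of the travel-time integral t(p) = ∫ dz / (c sqrt(1 - p²c²)) blows up. One standard way to handle it is the substitution z = z_t - s², which cancels the 1/sqrt factor before quadrature. A minimal sketch, with an illustrative linear sound-speed profile (not the G2S specification) chosen so an analytic answer exists for comparison:

```python
import numpy as np

def travel_time_tau_p(p, c, z_top, n=4000):
    """One-way travel time t(p) = integral of dz / (c * sqrt(1 - p^2 c^2))
    from z = 0 up to the turning height z_t where p * c(z_t) = 1.
    Assumes c(z) increases monotonically toward z_top."""
    # locate the turning height by bisection on p * c(z) = 1
    lo, hi = 0.0, z_top
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if p * c(mid) < 1.0:
            lo = mid
        else:
            hi = mid
    z_t = 0.5 * (lo + hi)

    # substitute z = z_t - s^2 so the square-root singularity cancels
    s = np.linspace(1e-3, np.sqrt(z_t), n)
    z = z_t - s**2
    u = p * c(z)
    integrand = 2.0 * s / (c(z) * np.sqrt(1.0 - u * u))
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(s)))

# linear profile c(z) = c0 + g*z has the closed form
# t = arctanh(sqrt(1 - (p*c0)^2)) / g, handy as a check
c0, g = 340.0, 0.004
c = lambda z: c0 + g * z
p = 1.0 / 360.0                      # ray parameter, s/m
t_num = travel_time_tau_p(p, c, z_top=10_000.0)
t_exact = np.arctanh(np.sqrt(1.0 - (p * c0) ** 2)) / g
```

For realistic G2S profiles the same substitution applies locally around the numerically located turning point.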


Brown D.J., International Data Center | Szuberla C.A.L., University of Alaska Fairbanks | McCormack D., Geological Survey of Canada | Mialle P., International Data Center
Pure and Applied Geophysics | Year: 2014

A spatial filter is often attached to a microphone or microbarometer in order to reduce the noise caused by atmospheric turbulence. This filtering technique is based on the assumption that the coherence length of turbulence is smaller than the spatial extent of the filter, so that contributions from turbulence recorded at widely separated ports tend to cancel, while those of the signal of interest, which has a coherence length larger than the spatial dimensions of the filter, are reinforced. In this paper, the plane wave response for a spatial filter with an arbitrary arrangement of open ports is determined. It is found that propagation over different port-to-sensor distances causes out-of-phase sinusoids to be summed at the central manifold, which can lead to significant amplitude decay and phase delays as a function of frequency. The determined spatial filter plane wave response is superimposed on an array response typical of the infrasound arrays that constitute the International Monitoring System infrasound network used for nuclear monitoring purposes. It is found that signal detection capability in terms of the Fisher statistic can be significantly degraded at certain frequencies. The least-squares estimate of signal slowness can change by up to 1.5° and up to 10 m/s if an asymmetric arrangement of low- and high-frequency spatial filters is used. However, if a symmetric arrangement of filters is used, the least-squares estimate of signal slowness is largely unaffected, except near the predicted null frequency. © 2012 The Author(s).
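The summation of out-of-phase sinusoids described above can be sketched directly: the sensor output for a unit-amplitude plane wave is the average of phasors, each delayed by its port-to-sensor travel time. The pipe lengths and sound speed below are made up for illustration, and the wavefront's arrival-time differences across the ports (which the full treatment includes) are ignored here:

```python
import numpy as np

def filter_response(freqs, port_delays):
    """Plane-wave response of a spatial (pipe) filter with N open ports:
    H(f) = (1/N) * sum_n exp(-2*pi*i*f*tau_n), where tau_n is the
    port-to-sensor travel time of port n."""
    freqs = np.asarray(freqs, dtype=float)[:, None]
    return np.exp(-2j * np.pi * freqs * port_delays[None, :]).mean(axis=1)

# illustrative 8-port filter: pipe lengths 5..12 m, in-pipe speed ~340 m/s
lengths = np.linspace(5.0, 12.0, 8)
delays = lengths / 340.0
f = np.linspace(0.0, 10.0, 501)
H = filter_response(f, delays)
amp = np.abs(H)        # amplitude decay vs frequency
phase = np.angle(H)    # phase delay vs frequency
```

At f = 0 all phasors align and |H| = 1; as frequency rises the spread of delays pulls the phasors apart, producing exactly the amplitude decay and phase delay discussed in the abstract.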


Brown D., International Data Center | Ceranna L., Bundesanstalt für Geowissenschaften und Rohstoffe (BGR) | Prior M., International Data Center | Mialle P., International Data Center | Le Bras R.J., International Data Center
Pure and Applied Geophysics | Year: 2014

The International Data Centre (IDC) in Vienna, Austria, determines, as part of automatic processing, sensor noise levels for all seismic, hydroacoustic, and infrasound (SHI) stations in the International Monitoring System (IMS) operated by the Provisional Technical Secretariat of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Sensor noise is determined several times per day as a power spectral density (PSD) using the Welch overlapping method. Based on accumulated PSD statistics, a probability density function (PDF) is also determined, from which low and high noise curves for each sensor are extracted. Global low and high noise curves as a function of frequency for each of the SHI technologies are determined as the minimum and maximum of the individual station low and high noise curves, respectively, taken over the entire network of contributing stations. An attempt is made to ensure that only correctly calibrated station data contribute to the global noise models by additionally considering various automatic detection statistics. In this paper, global low and high noise curves for 2010 are presented for each of the SHI monitoring technologies. Except for a very slight deviation at the microseism peak, the seismic global low noise model reproduces the Peterson (1993) NLNM low noise curve. The global infrasonic low noise model is found to agree with that of Bowman et al. (2005, 2007) but disagrees with the revised results presented in Bowman et al. (2009) by a factor of 2 in the calculation of the PSD. The global hydroacoustic low and high noise curves are found to be in quantitative agreement with Urick's oceanic ambient noise curves for light to heavy shipping. Whale noise is found to be a feature of the hydroacoustic high noise curves at around 15 and 25 Hz. © 2012 The Author(s).
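The Welch-PSD-plus-statistics procedure can be sketched in simplified form. The sampling rate, segment lengths, and percentile summary below are illustrative choices standing in for the IDC's PDF-based extraction, not its actual configuration:

```python
import numpy as np
from scipy import signal

def noise_curves(x, fs, low_pct=5, high_pct=95, nperseg=1024):
    """Welch PSDs of consecutive records, summarized as percentile-based
    low/high noise curves (a simplified stand-in for accumulating PSD
    statistics into a PDF and extracting its envelopes)."""
    rec_len = 8 * nperseg                  # several Welch segments per record
    nrec = x.size // rec_len
    psds = []
    for k in range(nrec):
        seg = x[k * rec_len:(k + 1) * rec_len]
        f, pxx = signal.welch(seg, fs=fs, nperseg=nperseg)
        psds.append(pxx)
    psds = np.array(psds)
    low = np.percentile(psds, low_pct, axis=0)
    high = np.percentile(psds, high_pct, axis=0)
    return f, low, high

rng = np.random.default_rng(1)
fs = 100.0                                 # Hz, illustrative
x = rng.normal(size=int(fs) * 3600)        # one hour of white noise
f, low, high = noise_curves(x, fs)
```

Taking the minimum of the per-station low curves (and the maximum of the high curves) over a network would then yield global noise models in the spirit of the ones presented here.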


Kushida N., International Data Center
PLoS ONE | Year: 2015

The present paper introduces a condition number estimation method for preconditioned matrices. The newly developed method provides reasonable results, while the conventional method, which is based on the Lanczos connection, gives meaningless results. The Lanczos connection based method provides the condition numbers of coefficient matrices of systems of linear equations using information obtained through the preconditioned conjugate gradient method. Estimating the condition number of preconditioned matrices is sometimes important when describing the effectiveness of new preconditioners or selecting adequate preconditioners. Explicitly operating a preconditioner on a coefficient matrix is the simplest method of estimation. However, this is not possible for large-scale computing, especially if computation is performed on distributed memory parallel computers, because the preconditioned matrices become dense even if the original matrices are sparse. Although the Lanczos connection method can be used to calculate the condition number of preconditioned matrices, it is not considered applicable to large-scale problems because of its weakness with respect to numerical errors. Therefore, we have developed a robust and parallelizable method based on Hager's method. Feasibility studies were carried out for the diagonal scaling preconditioner and the SSOR preconditioner with a diagonal matrix, a tridiagonal matrix, and Pei's matrix. The Lanczos connection method produced around 10% error even for a simple problem, whereas the new method produced negligible errors. In addition, the newly developed method returns reasonable solutions where the Lanczos connection method fails, namely for Pei's matrix and for matrices generated with the finite element method. © 2015 Noriyuki Kushida.
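A minimal sketch of a Hager-style 1-norm condition estimate for a preconditioned matrix that is never formed explicitly. The diagonal-scaling preconditioner and the tridiagonal test matrix echo the abstract's feasibility cases, but the sizes, stopping rules, and dense solves below are illustrative simplifications, not the paper's implementation:

```python
import numpy as np

def hager_norm1(apply_B, apply_BT, n, maxiter=10):
    """Hager's estimator for ||B||_1 using only products with B and B^T.
    Returns a lower bound on the norm that is usually sharp."""
    x = np.full(n, 1.0 / n)
    est = 0.0
    for _ in range(maxiter):
        y = apply_B(x)
        new_est = np.abs(y).sum()
        xi = np.sign(y)
        xi[xi == 0] = 1.0
        z = apply_BT(xi)
        j = int(np.argmax(np.abs(z)))
        if new_est <= est or np.abs(z[j]) <= z @ x:
            return max(est, new_est)
        est = new_est
        x = np.zeros(n)
        x[j] = 1.0                 # restart from the most promising column
    return est

# condition number of the diagonally preconditioned matrix M^-1 A,
# estimated without ever forming M^-1 A (illustrative tridiagonal A)
n = 200
A = (np.diag(np.full(n, 2.0))
     + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1))
d = np.diag(A)                                    # diagonal-scaling preconditioner M

apply_B = lambda v: (A @ v) / d                   # (M^-1 A) v
apply_BT = lambda v: A @ (v / d)                  # (M^-1 A)^T v, A symmetric
apply_Binv = lambda v: np.linalg.solve(A, d * v)  # (M^-1 A)^-1 v = A^-1 M v
apply_BinvT = lambda v: d * np.linalg.solve(A, v)

cond_est = hager_norm1(apply_B, apply_BT, n) * hager_norm1(apply_Binv, apply_BinvT, n)
```

Because only matrix-vector products and solves are required, the same estimator parallelizes naturally on distributed-memory machines, which is the point of the approach described above.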


Kusmierczyk-Michulec J., International Data Center | Gheddou A., International Data Center | Nikkinen M., International Data Center
Journal of Environmental Radioactivity | Year: 2015

Data collected by the International Monitoring System (IMS) during 2009-2012 were used to study the influence of precipitation and relative humidity on changes in 7Be concentrations in the atmosphere. A significant decrease in 7Be concentrations is demonstrated for measurements collected by stations located within the Intertropical Convergence Zone (ITCZ). This effect can be attributed to enhanced wet deposition within the ITCZ. To quantify this effect, data collected by IMS stations within the ITCZ were thoroughly analyzed. It was found that the atmospheric content of 7Be strongly decreases under rain conditions. Rain-mediated depletion of 7Be to half of its pre-rain value takes about 62 h in the case of light precipitation, while in the case of moderate precipitation about 38 h is needed. In addition, the evaluated impact of humidity showed that an increase in relative humidity by 20%, for example from 70%±5% to 90%±5%, roughly halves the beryllium concentration in surface air. © 2015 Elsevier Ltd.
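If the rain-mediated depletion is treated as a first-order (exponential) removal process, the quoted half-depletion times translate directly into removal constants λ = ln 2 / t_half. A small sketch: the 62 h and 38 h values come from the study, while the exponential model and the 24 h example are illustrative:

```python
import math

def remaining_fraction(hours, t_half):
    """Fraction of 7Be left after `hours` of rain, assuming first-order
    washout with half-depletion time `t_half` (both in hours)."""
    lam = math.log(2.0) / t_half      # removal constant, 1/h
    return math.exp(-lam * hours)

light, moderate = 62.0, 38.0          # half-depletion times from the study
frac_light_24h = remaining_fraction(24.0, light)
frac_moderate_24h = remaining_fraction(24.0, moderate)
```

As expected, a day of moderate precipitation removes more 7Be than a day of light precipitation under this simple model.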
