DWD
Offenbach, Germany
SEOUL, South Korea, April 24, 2017 /PRNewswire/ -- According to a survey conducted by the Korea Ransomware Computer Emergency Response Team, 70% of CIOs and 50% of IT-department staff at global companies said their top goal for 2017 is preventing ransomware. Against this backdrop, Korea IT Times interviewed SM Technology CTO Kim Seong-ki on April 12, 2017 about DocStory, which the company describes as the world's first solution to completely block ransomware. The interview follows.

Korea IT Times: What is ransomware?

Kim Seong-ki (Kim): Ransomware, a compound of "ransom" and "software," is malicious code that holds files hostage and demands payment. When a user's PC is infected through e-mail or a malicious link, the user can no longer access the system or the files, leaving only two options: pay the ransom or lose the affected files permanently. According to the Korea Ransomware Computer Emergency Response Team, 49% of all companies were targeted by ransomware attacks in 2016. The number of victims soared from 53,000 in 2015 to 150,000 in 2016, and the typical ransom demand rose sharply, from around $400 in bitcoin in early 2016 to over $600 at present. Damages in 2016 were estimated at 300 billion won, a sharp rise from 109 billion won in 2015.

Korea IT Times: Is there any way to cure a ransomware infection?

Kim: Unfortunately, there is no cure; prevention is the only effective approach. The best prevention is a ransom block that intelligently distinguishes normal from abnormal access, blocks the malicious access and notifies the user.

Korea IT Times: Is there a safe folder that completely blocks malicious access?

Kim: Valuable data on a PC can be protected with the ransom block. Still, it is important to also keep data in a safe folder for stronger protection.
Related to this, SM Technology has developed a safe folder that can be used as easily as an ordinary folder, blocks malicious access completely and reduces operating costs sharply. For ransomware to proceed from initial infiltration to actual encryption, a number of conditions must be satisfied, so there are many chances to detect it along the way: URL filtering at the web gateway, delivery of the malware to the system, connection to the command-and-control (C&C) server, download of the encryption payload, maintenance of the C&C connection and, finally, file encryption. Why, then, is ransomware so successful? The amounts extorted, for instance, have surpassed 1 trillion won.

Korea IT Times: What is the reason behind ransomware's high success rate?

Kim: In short, there are three reasons: any prevention system we build will eventually be infiltrated; attackers are always faster than defenders; and methods that chase attackers have inherent limits. That is why it is necessary to develop technology that makes the attackers chase the defenders instead.

Korea IT Times: Would you introduce the basis of this new paradigm that makes attackers chase defenders?

Kim: The DocStoryRB solution SM Technology has developed provides both ransomware blocking and data backup. It blocks ransomware that has infiltrated the endpoint from accessing data at all and reports the threat information to a control server in real time. It also backs up data to the server in real time as it is created and modified, so files can be restored to their state at any point in time.

Korea IT Times: Whitelist technology was introduced long ago, but it is widely known to limit user convenience. How does your technology differ from the existing approach?
Kim: Many security technologies have been applied at the endpoint: signature-based and behavior-based detection, preventive restoration, blacklists, pattern matching and cloud authentication. In the end, the hackers won; the options available to attackers are simply too varied, so a better way was needed to overcome that limit. The method we chose was the whitelist, but it was very inconvenient to use. After several years of trial and error we rethought the approach at the development stage and arrived at a solution: a whitelist-based one. If it operates on an equal footing with existing security products, I believe it will be the best model in terms of user convenience, and our organization has made that possible.

Korea IT Times: Would you explain the DocStoryRB solution's functions in detail?

Kim: DocStoryRB is a new-paradigm technology that controls access to the data ransomware targets. It can be adapted to an organization's characteristics through two controls: whitelist-style access control and approval or denial of user execution. By tracking data changes in real time and backing them up automatically, it minimizes the damage ransomware can do. Using the ransom block engine, it verifies trust by checking a document-editing application's path and digital signature; if an untrusted application or process attempts to change, delete or encrypt files, it stops the action and sends a log to the server. While documents and other work are being handled at the endpoint, DRB (Data Real-time Backup) backs up important files in real time as they are created and modified. If data is damaged by ransomware or other malicious code, it can be restored to the moment just before the damage, or to any desired point in time.
The solution also includes integrated management that can push policies to the DWD and DRB components running at the endpoint. It provides a dashboard for monitoring endpoint status in real time, and logs of abnormal process behavior at the endpoint are reported in real time, enabling a rapid response.

Korea IT Times: Would you explain the market competitiveness of the DocStory solution?

Kim: DocStory's unique competitive strength is that it delivers the best defensive effect at minimum cost. Existing prevention systems are expensive because they require additional equipment to strengthen their defenses; DocStory achieves the best defensive effect without additional equipment. Its main functions and implementation strengths are: automated real-time data protection; real-time synchronization and scheduled backups; automatic start of the ransom block engine at Windows boot; support for network-drive storage devices; filtering by data-file extension; restoration of mistakenly stored files from timed backups; synchronization of file and folder creation, duplication, modification, deletion and renaming; support for all Microsoft operating systems from Windows 7 onward; removal of duplicates within a folder during backup; and control of client actions according to the administrator's policy.

Korea IT Times: Would you comment on your intellectual property rights and certifications?

Kim: We hold patents on (1) a security method for removable storage and the corresponding removable storage, (2) a driver security system and method using a virtual call route, and (3) a whitelist-based ransomware blocking device and method. We have also obtained GS grade-1 certification for DocStory v1.0 from the TTA.
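The whitelist-style access control Kim describes (trust an editor by its path and signature, block everything else from touching protected files, and log blocked attempts for the control server) can be sketched roughly as follows. This is not SM Technology's implementation: the class name and log format are invented for illustration, and a SHA-256 digest of the program binary stands in for the digital-signature check the interview mentions.

```python
import hashlib

class RansomBlock:
    """Illustrative whitelist gate for file-modification requests."""

    def __init__(self):
        self.whitelist = {}   # program path -> expected binary digest
        self.log = []         # blocked attempts, to be reported upstream

    def trust(self, path, program_bytes):
        """Register a trusted application by path and binary digest."""
        self.whitelist[path] = hashlib.sha256(program_bytes).hexdigest()

    def may_modify(self, path, program_bytes, target):
        """Allow a modify/delete/encrypt request only from a trusted binary.

        Untrusted requests are blocked and logged, mirroring the
        'stop the act and transmit the log to the server' behavior
        described in the interview.
        """
        digest = hashlib.sha256(program_bytes).hexdigest()
        if self.whitelist.get(path) == digest:
            return True
        self.log.append((path, target))
        return False
```

A real implementation would hook the file system at driver level and verify code-signing certificates rather than hashes; the point here is only the whitelist decision flow.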
To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/korea-it-times-interview-sm-technology-develops-new-ransomware-blocking-docstory-solution-for-the-first-time-in-the-world-300443809.html




News Article | January 11, 2016
Site: www.scientificcomputing.com

The supercomputer at the Deutsches Klimarechenzentrum (DKRZ), the German Climate Computing Center, ranks among the largest systems employed for scientific computing. On October 5, 2015, Germany strengthened its leadership in climate research with the inauguration of Mistral, a state-of-the-art HPC system and one of the world's most efficient supercomputers. Mistral is 20 times faster than its predecessor and features a large storage system to house the large climate-simulation data archive managed by DKRZ. Using the Mistral system and high-performance computing (HPC) tools will keep DKRZ at the forefront of support for scientific and climate-modeling research. Scientists conduct premium climate research and can simulate anthropogenic influences on the climate system, including cloud research. Clouds and precipitation strongly influence atmospheric radiation and are critical for life on Earth. The scale of clouds spans from a micrometer, the size of a single cloud particle, to hundreds of kilometers, the dimension of a frontal system. Researchers have to resolve this entire range, which makes exact modeling of clouds and precipitation physically difficult and extremely resource-consuming in terms of computer time and storage space. Climate-modeling research requires supercomputers that combine the power of thousands of computers with HPC tools to simulate complex climate models and research problems. Mistral is used in the High Definition Clouds and Precipitation for Climate Prediction (HD(CP)2) project, which integrates cloud-formation and precipitation processes into atmospheric simulations to better understand and research clouds and cloud-related processes. The project uses cloud-resolving modeling to determine cloud formations in central Europe.
According to Professor Thomas Ludwig, Director of the German Climate Computing Center, "The unique characteristic of HD(CP)2 is to develop a cloud-resolving LES version (Large Eddy Simulation) of the ICON model (the icosahedral non-hydrostatic general circulation model, a joint development of the German Weather Service DWD and the Max Planck Institute for Meteorology, see e.g. www.mpimet.mpg.de/en/science/models/icon.html) in order to explicitly simulate cloud and precipitation processes. The model region is centered on Germany, using a grid with a resolution of 10,000 x 10,000 x 400 grid elements and a grid spacing of 100 m (www.hdcp2.eu). Such simulations are computationally very intensive, and the necessary computing power can be found only on massively parallel computing platforms. In order to achieve this, DKRZ performed a major refactoring of the ICON model." Figure 2 shows a visualization of the simulated cloud water content for one time step with about 3.5 billion cells per time step (22.5 million cells per slice on 150 levels). The data is resampled on the fly from the unstructured ICON grid onto a regular Cartesian grid with a downsampling factor of 1/10. The ICON simulation was performed using over 400 nodes of Mistral, while the visualization was done with the Vapor software on a single GPU node of the system, consuming over 200 GB of main memory. "The ability to conduct this level of cloud and atmospheric research requires the use of a state-of-the-art HPC system. Using Mistral and HPC tools allows DKRZ to run new processes and ensemble members as well as see clouds or local climate at a higher resolution. Per core, we see a performance improvement of our models between 1.8 and 2.6 using one Intel Xeon processor core, as compared to one 4.7 GHz IBM Power6 core. In times where scientists expect performance gains only through scaling, this is a welcome advancement," Ludwig states.
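The visualization figures quoted above can be cross-checked with a few lines of arithmetic. This is illustrative only: the 200 GB of main memory cited in the article covers multiple variables and working buffers, not the single float32 field computed here.

```python
# 22.5 million cells per horizontal slice, on 150 vertical levels.
cells_per_slice = 22_500_000
levels = 150

total_cells = cells_per_slice * levels       # "about 3.5 billion cells"
gib_per_float32_field = total_cells * 4 / 2**30  # one 4-byte value per cell

print(total_cells)                # 3,375,000,000
print(round(gib_per_float32_field, 1))
```

So one single-precision field for one time step is already on the order of 12-13 GiB, which makes clear why the visualization needed a dedicated large-memory GPU node.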
Mistral, the new High Performance Computer System for Earth System Research (HLRE-3), comes from the French company Bull, which was acquired by Atos in 2014. Mistral replaces the IBM Power6 system named Blizzard, which had been in operation at DKRZ since 2009. The Mistral supercomputer is being installed in two stages: Phase 1 began in June 2015, with the second stage of the expansion scheduled for summer 2016. In parallel with the Phase 1 installation, DKRZ users had access to a small test system with 432 Intel Xeon processor cores and a 300-TByte Lustre file system from Xyratex/Seagate for the purpose of preparing climate models for the new architecture. During the testing, DKRZ provided training classes on how to use the new system in the areas of debugging, machine usage and visualization tools. The Mistral system consists of compute components by Bull, a disk storage system by Xyratex/Seagate and high-performance network switches by Mellanox. These components are distributed over 41 racks, each weighing up to or even more than a metric ton, connected by bundles of optical fiber. The Phase 1 Mistral supercomputer has about 1,500 compute nodes based on the bullx 700 DLC system, each with two 12-core Intel Xeon E5-2680 v3 processors (36,000 cores in total); the system and racks employ direct hot-liquid cooling. The Intel processor-based system allows an inlet cooling-liquid temperature of 40 degrees Celsius; the liquid heats up to 50 degrees Celsius and is piped to the roof, where it is cooled by fans alone. "This means that all the racks that have the hot liquid cooling do not require additional expensive chillers, as the temperature on the roof in Hamburg almost never exceeds 40 degrees," states Ludwig. Mistral provides 24 high-end visualization nodes equipped with powerful graphics processors and 100 further nodes for pre- and post-processing and analysis of data.
All components are connected with each other via optical cables and can directly access the shared file system, which means that results computed on the supercomputer can be analyzed directly on the visualization nodes. DKRZ does not conduct climate research itself but supports climate modeling and related scientific research. Ludwig indicates, "We participate in various infrastructure and research projects with the aim to support the climate scientists in all aspects of their work in our HPC environment. DKRZ departments support scientists in model parallelization and optimization of the code, data management, storage, data compression, analysis and visualization, help with libraries, improving I/O as well as quality assurance and archiving of data." DKRZ uses the Allinea DDT debugger, Vampir and Intel VTune as performance-tuning tools, and Vapor for visualization. The DKRZ staff creates customized in-house tools to address issues such as data compression, scalability and visualization in the parallel climate simulations and research models. There is close cooperation with Ludwig's research group from the University of Hamburg; in fact, his chair for Scientific Computing has its offices in the DKRZ building. A group of 10 researchers focuses on file-system and storage issues and on energy efficiency for HPC. In the Mistral Phase 1 system, DKRZ uses a 20-PByte Lustre file system based on a Xyratex/Seagate CS9000 system with a bandwidth in excess of 150 GB/s, and its metadata performance outperforms competing systems. The Lustre file system will be expanded to 50 PBytes and 430 GB/s in 2016 as the Mistral system expands. According to Ludwig, "In addition to supporting our users to efficiently utilize the supercomputer, we engage in joint projects to enable new science on the current and future systems.
Since our users run a large diversity of different models on our system, DKRZ also develops universally usable libraries to facilitate scalable parallel models (YAXT) and to make better use of the available storage capacity through data compression (libAEC)." In addition to these services, DKRZ manages the world's largest climate-simulation data archive. The archive is used by researchers worldwide and currently contains more than 40 PBytes of data; it is projected to grow by 75 PBytes annually over the next five years. "There is a growing gap between the ability of HPC systems to generate large amounts of data and the cost of storing those data. DKRZ estimates we are currently spending 25 percent of our investment budget, as well as the electricity expenses, on storage, and we expect this gap to increase for the climate modeling data created in the future. This widening gap between compute capability and storage means we need to shift some focus to making the most of storage if we want to keep the ability to store all the data being generated." "Lustre as a file system gets constantly increasing support from major vendors and from the computer science community," Ludwig said. "We are confident that emerging requirements will be picked up quickly and solutions can be provided promptly." DKRZ also supports the Coupled Model Intercomparison Projects (CMIPs) and Intergovernmental Panel on Climate Change (IPCC) research. DKRZ performs simulations for the research community, such as the CMIPs, which build the foundation for the findings presented in the IPCC reports. Climate modelers in Germany worked on the IPCC project, performing calculations on the DKRZ computer with an Earth system model from the Max Planck Institute for Meteorology that also simulated the carbon cycle.
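The storage-side compression mentioned above (libAEC) implements CCSDS adaptive entropy coding; as a hedged illustration of why preconditioning smooth climate fields pays off, the sketch below byte-shuffles a float32 array before handing it to zlib, which is used here only as a stand-in for a real scientific compressor. The function names are invented for this example.

```python
import struct
import zlib

def shuffle(raw, width=4):
    """Byte-shuffle: group the i-th byte of every value together.

    For smooth fields the sign/exponent bytes are nearly constant,
    so shuffling produces long runs a general-purpose compressor
    can exploit.
    """
    return b"".join(raw[i::width] for i in range(width))

def unshuffle(buf, width=4):
    """Invert shuffle(): reassemble the original byte order."""
    n = len(buf) // width
    return bytes(buf[(i % width) * n + i // width] for i in range(len(buf)))

def pack_field(values):
    """Pack a list of floats as little-endian float32, shuffle, deflate."""
    raw = struct.pack(f"<{len(values)}f", *values)
    return zlib.compress(shuffle(raw), 9)
```

Round-tripping is `unshuffle(zlib.decompress(packed))`; on a smooth field this typically compresses noticeably better than deflating the raw bytes directly, which is the same motivation behind preconditioning in dedicated climate-data compressors.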
Ludwig indicates, "We stored approximately 2 PBytes of CMIP5 data on the Mistral machine from DKRZ and international centers. Planning is under way for how much data DKRZ will receive for the next project, CMIP6; 20 to 50 times more data is expected. DKRZ expects to begin computations for the next IPCC report in 2016. The German climate-model data contribution for publications in the next IPCC assessment report will start to be released in 2016 and will be computed exclusively on the expanded Mistral machine. DKRZ has extensive experience in computations and data dissemination for the IPCC reports, and with the new Mistral system we have a powerful computing and storage system to host at least all the computations that will be conducted on the German side, and probably more." Installation of the expanded Mistral system is expected to start in February 2016 and will also use the Bull direct hot-liquid cooling employed in the Phase 1 system. According to Helena Liebelt, Intel Business Development Manager, "The second phase of the Mistral HLRE-3 system is planned to be available in summer 2016. The expanded Mistral system will have more than 3,000 compute nodes and more than 68,000 cores. This extension will roughly double computing and disk storage capacity. With a peak performance of 3 PFlops and a 50-PByte parallel file system, scientists can improve the regional resolution, account for more processes in the Earth system models or reduce uncertainties in climate projections." Ludwig estimates the expanded Mistral system will be in the Top 100 of the June 2016 TOP500 list and in the top five in file-system capability, making Mistral one of the top HPC systems worldwide for storage.
While DKRZ uses Seagate Lustre in Mistral's production environment, Ludwig's research group became a member of the Intel Parallel Computing Center for Lustre (IPCC-L) and will conduct research on data-compression mechanisms. The group is also using Intel Xeon Phi coprocessors and graphics processing units (GPUs) in its test environment.

How HPC will aid climate modeling in the future

Professor Ludwig indicates that computer scientists face a number of challenges in climate modeling, including "the growing number of cores and the fact that parallelization is becoming more complicated due to multiple runs of climate simulations which are mathematically non-linear. Memory bandwidth is always a problem, because climate modeling applications are memory intensive. The ability to modify code to take advantage of HPC parallelization and optimization is a problem because of legacy code and not enough software engineers to adapt codes. Growing energy requirements may become a limitation in providing more computational power to future climate models. In addition, I/O bandwidth and storage capacity growth may be even harder to maintain. Science is looking to computer scientists to develop software that can handle the huge number of computing elements." As supercomputers such as Mistral and HPC tools advance, it will be possible to create finer grids with more grid cells, providing climate information at higher resolution. The German government has funded a project called PalMod that takes the opposite approach, using a coarse grid over a very long simulated period: it seeks to apply today's climate models to the 135,000 years reaching back to the last ice age. The hope is that researchers can recompute past climate data to see how well current climate models reproduce past climate changes, and thereby how reliably they predict future ones. DKRZ will be involved in supporting PalMod.
However, today's many-core and multi-core processor architectures will probably not be sufficient to achieve the desired performance: a challenge to be addressed jointly by DKRZ and industry. "DKRZ is the link between hardware vendors, solution providers and the climate research community. Its vision is to make the potential of accelerating technical progress reliably accessible to climate research. We closely follow technological trends and are in permanent contact with companies such as processor producers. At the same time, we participate in climate research projects to learn about the future resources that will be necessary for new insights. We translate between these communities and communicate scientific requirement specifications and technical product characteristics. An efficient usage of HPC adds optimal value to the science of climate researchers," states Ludwig. Linda Barney is the founder and owner of Barney and Associates, a technical/marketing writing, training and web design firm in Beaverton, OR.


Yano J.-I.,Meteo - France | MacHulskaya E.,DWD | Bechtold P.,ECMWF | Plant R.S.,University of Reading
Bulletin of the American Meteorological Society | Year: 2013

The fifth workshop in the annual series entitled 'Concepts for Convective Parameterizations in Large-Scale Models' was held in 2012. The purpose of the series has been to discuss fundamental theoretical issues of convection parameterization among a small number of European scientists; it was funded by the European Cooperation in Science and Technology (COST) Action ES0905. The theme of the 2012 workshop followed from a main conclusion of the previous workshop, which had focused on the problem of convective organization and sought a means of implementing such effects in convection parameterizations. Participants were informed that the inclusion of mid-troposphere humidity sensitivities in entrainment and detrainment formulations had contributed substantially to model improvements, in particular significantly improving prediction of the Madden-Julian oscillation (MJO).


Lock S.-J.,University of Leeds | Bitzer H.-W.,DWD | Coals A.,University of Leeds | Gadian A.,University of Leeds | Mobbs S.,University of Leeds
Monthly Weather Review | Year: 2012

Advances in computing are enabling atmospheric models to operate at increasingly fine resolution, giving rise to more variations in the underlying orography being captured by the model grid. Consequently, high-resolution models must overcome the problems associated with traditional terrain-following approaches of spurious winds and instabilities generated in the vicinity of steep and complex terrain. Cut-cell representations of orography present atmospheric models with an alternative to terrain-following vertical coordinates. This work explores the capabilities of a cut-cell representation of orography for idealized orographically forced flows. The orographic surface is represented within the model by continuous piecewise bilinear surfaces that intersect the regular Cartesian grid, creating cut cells. An approximate finite-volume method for use with advection-form governing equations is implemented to solve flows through the resulting irregularly shaped grid boxes. Comparison with a benchmark orographic test case for nonhydrostatic flow shows very good results. Further tests demonstrate the cut-cell method for flow around 3D isolated hills and stably resolving flows over very steep orography. © 2012 American Meteorological Society.
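The core idea of the cut-cell approach can be illustrated in a much-reduced form. The paper intersects piecewise bilinear terrain surfaces with a 3D Cartesian grid; the sketch below only computes, for a single vertical column with a locally flat terrain height, the fraction of each cell's volume that lies in the atmosphere, which is the kind of geometric weight a finite-volume scheme needs for each cut cell. The function names and the column setup are assumptions made for this example.

```python
def open_fraction(z_bottom, dz, terrain_height):
    """Fraction of a grid cell [z_bottom, z_bottom + dz] above the terrain."""
    top = z_bottom + dz
    if terrain_height <= z_bottom:
        return 1.0   # cell entirely in the atmosphere
    if terrain_height >= top:
        return 0.0   # cell entirely inside the orography
    return (top - terrain_height) / dz   # partially cut cell

def column_fractions(terrain_height, dz, nz):
    """Open-volume fractions for one vertical column of nz cells."""
    return [open_fraction(k * dz, dz, terrain_height) for k in range(nz)]
```

With 100 m cells and terrain at 250 m, the column fractions come out as 0, 0, 0.5, 1, ...: the third cell is cut in half, and the scheme in the paper solves the governing equations through such irregular volumes instead of deforming the coordinate surfaces.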


Paschke E.,DWD | Leinweber R.,DWD | Lehmann V.,DWD
Atmospheric Measurement Techniques | Year: 2015

We present the results of a 1-year quasi-operational test, from 2 October 2012 to 2 October 2013, of the 1.5 μm StreamLine Doppler lidar developed by Halo Photonics. The system was configured to continuously perform a velocity-azimuth display scan pattern using 24 azimuthal directions with a constant beam elevation angle of 75°. Radial wind estimates were selected using a rather conservative signal-to-noise-ratio-based threshold of -18.2 dB (0.015). A 30 min average profile of the wind vector was calculated, under the assumption of a horizontally homogeneous wind field, through a Moore-Penrose pseudoinverse of the overdetermined linear system. A strategy for quality control of the retrieved wind vector components is outlined to ensure consistency between the Doppler lidar wind products and the assumptions inherent in the wind vector retrieval. Quality-controlled lidar measurements were compared with independent reference data from a collocated operational 482 MHz radar wind profiler running in a four-beam Doppler beam swinging mode and with winds from operational radiosonde measurements. The intercomparison results reveal a particularly good agreement between the Doppler lidar and the radar wind profiler, with root mean square errors ranging between 0.5 and 0.7 m s-1 for wind speed and between 5 and 10° for wind direction. The median of the half-hourly averaged wind speed for the intercomparison data set is 8.2 m s-1, with a lower quartile of 5.4 m s-1 and an upper quartile of 11.6 m s-1. © Author(s) 2015.
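The homogeneous-wind retrieval described above reduces to an overdetermined linear system: each radial velocity at azimuth az and elevation el satisfies v_r = u sin(az) cos(el) + v cos(az) cos(el) + w sin(el), and with 24 azimuths the three unknowns (u, v, w) follow from the Moore-Penrose pseudoinverse. A minimal NumPy sketch with synthetic values (not DWD's operational processing chain):

```python
import numpy as np

def retrieve_wind(radial_v, azimuth_deg, elevation_deg=75.0):
    """Retrieve (u, v, w) from the radial velocities of a VAD scan via a
    Moore-Penrose pseudoinverse of the overdetermined linear system
    v_r = u*sin(az)*cos(el) + v*cos(az)*cos(el) + w*sin(el)."""
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    A = np.column_stack([np.sin(az) * np.cos(el),
                         np.cos(az) * np.cos(el),
                         np.full_like(az, np.sin(el))])
    # lstsq applies the pseudoinverse A+ = (A^T A)^-1 A^T for full-rank A
    uvw, *_ = np.linalg.lstsq(A, radial_v, rcond=None)
    return uvw

# Synthetic homogeneous wind: 8 m/s westerly (u = 8) with a 0.2 m/s updraft
az = np.arange(0.0, 360.0, 15.0)          # 24 azimuths, as in the scan pattern
truth = np.array([8.0, 0.0, 0.2])
el = np.radians(75.0)
vr = (truth[0] * np.sin(np.radians(az)) * np.cos(el)
      + truth[1] * np.cos(np.radians(az)) * np.cos(el)
      + truth[2] * np.sin(el))
u, v, w = retrieve_wind(vr, az)
```

With noiseless synthetic input the wind vector is recovered exactly; the quality control discussed in the paper is needed precisely because real radial estimates violate the homogeneity assumption.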


Frech M.,DWD | Steinert J.,DWD
Hydrology and Earth System Sciences | Year: 2015

An intense orographic precipitation event on 5 January 2013 is analyzed using a polarimetric C-band radar situated north of the Alps. The radar is operated at the meteorological observatory Hohenpeißenberg (MHP, 1006 m above sea level) of the German Meteorological Service (DWD). The event lasted about 1.5 days, and in total 44 mm of precipitation was measured at Hohenpeißenberg. Detailed high-resolution observations of the vertical structure of this event are obtained through a birdbath scan at 90° elevation, which is part of the operational scanning. This scan is acquired every 5 min and provides meteorological profiles at high spatial resolution which are often not available in other radar networks. In the course of this event, the melting layer (ML) descends until the transition from rain into snow is observed at ground level. This transition from rain into snow is well documented by local weather observers and a present-weather sensor. The orographic precipitation event reveals mesoscale variability above the melting layer which can be attributed to a warm front. This variability manifests itself through substantially increased hydrometeor fall velocities. Radiosounding data indicate a layered structure in the thermodynamic field, with increased moisture availability related to warm air advection. Rimed snowflakes and aggregation in a relatively warm environment lead to a signature in the radar data which is attributed to wet snow. The passage of the warm front leads to a substantial increase in rain rate at the surface. We use the newly implemented hydrometeor classification scheme "Hymec" to illustrate issues that arise when relating radar products to local observations. For this, we employ data from the radar near Memmingen (MEM, 65 km west of MHP, 600 m above sea level), which is part of DWD's operational radar network. The detection of the ML, in location and timing, agrees well with the Hohenpeißenberg radar data. 
Considering the size of the Memmingen radar sensing volume, the detected hydrometeor (HM) types are consistent for measurements at or in a ML, even though surface observations may indicate, for example, rain while the predominant HM is classified as wet snow. To better link the HM classification with the surface observations, either better thermodynamic input for Hymec or a statistical correction of the HM classification, similar to a model output statistics (MOS) approach, may be needed. © Author(s) 2015.


News Article | November 8, 2016
Site: www.eurekalert.org

Researchers are running structural studies using extensive numerical simulations on a supercomputer to study the motion of more than 500 atoms, in an effort to determine the forces on each atom and the total energy via density functional calculations.

WASHINGTON, D.C., Nov. 8, 2016 -- As Francis Crick, one of Britain's great scientists, once said: "If you want to understand function, study structure." Within the realm of chemical physics, a clear example of this is the two forms of carbon -- diamond and graphite. While they differ only in the atomic arrangement of atoms of a single element, their properties are quite different. Differences between the properties of seemingly similar elements of a "family" can be intriguing. Carbon, silicon, germanium, tin and lead are all part of a family that share the same structure of their outermost electrons, yet range from acting as insulators (carbon) to semiconductors (silicon and germanium) to metals (tin and lead). Is it possible to understand these and other trends within element families? In an article this week in The Journal of Chemical Physics, from AIP Publishing, a group of researchers from Peter Grünberg Institute (PGI) in Germany, and Tampere University of Technology and Aalto University in Finland, describe their work probing the relationship between the structure (arrangement of atoms) and function (physical properties) of a liquid metal form of the element bismuth. "There are relatively few -- less than 100 -- stable elements, which means that their trends are often easier to discern than for those of alloys and compounds of several elements," said Robert O. Jones, a scientist at PGI. The group's present work was motivated largely by the availability of high-quality experimental data -- inelastic x-ray scattering (IXS) and neutron diffraction -- and the opportunity to compare it with results for other liquids of the Group 15 nitrogen family (phosphorus, arsenic, antimony and bismuth). 
Phosphorus seems to have two liquid phases, and the amorphous form of antimony obtained by cooling the liquid crystallizes spontaneously and explosively. Their structural studies use extensive numerical simulations run on one of the world's most powerful supercomputers, JUQUEEN, in Jülich, Germany. "We're studying the motion of more than 500 atoms at specified temperatures to determine the forces on each atom and the total energy using density functional calculations," Jones explained. "This scheme, for which Walter Kohn was awarded the 1998 Nobel Prize in chemistry, doesn't involve adjustable parameters and has given valuable predictions in many contexts." While density functional theory is in principle exact, it is necessary to utilize an approximate functional. The positions and velocities of each atom, for example, are "stored at each step of a 'molecular dynamics' simulation, and we use this information to determine quantities that can be compared with experiment," he continued. "It's important to note that some quantities that are given directly by the simulation, such as the positions of the atoms, can only be inferred indirectly from the experiment, so that the two aspects are truly complementary." One of their most surprising and pleasing results was "the excellent agreement with recent IXS results," Jones said. "One of the experimentalists involved noted that the agreement of our results with the IXS 'is really quite beautiful,' so that even small differences could provide additional information. In our experience, it's unusual to find such detailed agreement." In terms of applications, the group's work "provides further confirmation that simulations and experiments complement each other and that the level of agreement can be remarkably good -- even for 'real' materials," Jones pointed out. "However, it also shows that extensive, expensive, and time-consuming simulations are essential if detailed agreement is to be achieved." 
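The "positions and velocities stored at each step" loop that Jones describes is the core of any molecular dynamics code, whatever supplies the forces. A minimal classical sketch, using a toy harmonic force in place of the study's density functional forces (all values hypothetical):

```python
import numpy as np

def velocity_verlet(x0, v0, force, mass, dt, n_steps):
    """Integrate Newton's equations with the velocity Verlet scheme,
    storing positions and velocities at every step, as an MD code
    would, so that observables can be computed from the trajectory."""
    xs, vs = [np.array(x0, float)], [np.array(v0, float)]
    f = force(xs[0])
    for _ in range(n_steps):
        x = xs[-1] + vs[-1] * dt + 0.5 * (f / mass) * dt**2
        f_new = force(x)                     # one force evaluation per step
        v = vs[-1] + 0.5 * (f + f_new) / mass * dt
        xs.append(x)
        vs.append(v)
        f = f_new
    return np.array(xs), np.array(vs)

# Toy system: a single harmonic "bond" (k = 1) instead of DFT forces
k, m = 1.0, 1.0
xs, vs = velocity_verlet([1.0], [0.0], lambda x: -k * x, m, dt=0.01, n_steps=1000)

# Total energy computed from the stored trajectory; it should stay
# nearly constant, which is the standard sanity check for an MD integrator
energy = 0.5 * m * vs**2 + 0.5 * k * xs**2
```

In a production code the `force` callback is where the expense lives; in the work described here it is a density functional calculation for over 500 atoms, which is why each simulation takes months on JUQUEEN.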
Jones and his colleagues have extended their approach to even longer simulations in liquid antimony at eight different temperatures, with the goal of understanding the "explosive" nature of crystallization in amorphous antimony (Sb). "We've also run simulations of the crystallization of amorphous phase change materials over the timescale -- up to 8 nanoseconds -- that is physically relevant for DVD-RW and other optical storage materials," he added, emphasizing that these types of simulations on computers today typically require many months. "They show, however, just how valuable they can be, and the prospects with coming generations of computers -- with even better optimized algorithms -- are very bright." The prospects of applications within other areas of materials science are extremely good, but the group is now turning its attention to memory materials of a different type -- for which the formation and disappearance of a conducting bridge (a metallic filament) in a solid electrolyte between two electrodes could be the basis of future storage materials. "Details of the mechanism of bridge formation are the subject of speculation, and we hope to provide insight into what really happens," Jones said. The article, "Collective excitations and viscosity in liquid Bi," is authored by Matti Ropo, Jaakko Akola and R.O. Jones. The article appeared in The Journal of Chemical Physics on November 8, 2016 (DOI: 10.1063/1.4965429) and it can be accessed at http://scitation. . The Journal of Chemical Physics publishes concise and definitive reports of significant research in the methods and applications of chemical physics. See http://jcp. .


