Nicosia, Cyprus

News Article | May 8, 2017
Site: www.eurekalert.org

Tropical Cyclone Donna continues to move through the South Pacific Ocean as a major hurricane. NASA's Aqua satellite passed over the storm and captured an image of a clear eye as the storm was located between the island nations of Vanuatu and New Caledonia, while the GPM satellite found that the powerful hurricane was generating very high amounts of rainfall. Over the weekend Tropical Cyclone Donna dropped very heavy rain over Vanuatu as it moved toward the west of the islands. Donna had intensified and had maximum sustained winds of 115 knots (132 mph) on Monday morning, May 8, making it the equivalent of a Category 4 storm on the Saffir-Simpson Hurricane Wind Scale. Vanuatu is a South Pacific Ocean nation made up of about 80 islands. New Caledonia is a French territory made up of dozens of islands that lie southwest of Vanuatu.

As Tropical Cyclone Donna was intensifying, the GPM core observatory satellite had two excellent views of the storm on succeeding days. When GPM flew over Donna on May 6, 2017 at 0146 UTC (May 5 at 9:46 p.m. EDT) the tropical cyclone was getting organized. The following day, on May 7 at 1411 UTC (10:11 a.m. EDT), GPM showed that Donna was very well organized and had a well-defined eye. GPM's Microwave Imager (GMI) and Dual-Frequency Precipitation Radar (DPR) data showed that Vanuatu was being drenched by bands of very intense rain to the east of Donna's center. DPR showed that precipitation was falling at a rate of over 189 mm (7.4 inches) per hour on the eastern side of Donna's eye wall.

At NASA's Goddard Space Flight Center in Greenbelt, Maryland, Tropical Cyclone Donna's rainfall structure was examined using data from GPM's radar (DPR Ku band). GPM's DPR data were made into a 3-D view of the tropical cyclone's radar reflectivity. GPM's data swath revealed a cross section of rainfall through the eastern side of the tropical cyclone. GPM showed that some storm top heights were reaching altitudes above 14.3 km (8.9 miles) in tall storms in the eastern eye wall. GPM is a joint mission between NASA and the Japanese space agency JAXA.

On May 8 at 02:50 UTC (May 7 at 10:50 p.m. EDT), NASA's Aqua satellite passed over Donna. The Moderate Resolution Imaging Spectroradiometer (MODIS) instrument aboard Aqua captured a visible image of Tropical Cyclone Donna that showed the eye of the storm between New Caledonia and Vanuatu. At 0900 UTC (5 a.m. EDT), Donna's maximum sustained winds were near 115 knots (132 mph/213 kph), making it a Category 4 hurricane on the Saffir-Simpson Hurricane Wind Scale. It was centered near 17.5 degrees south latitude and 165.1 degrees east longitude, about 183 nautical miles west of Port Vila, Vanuatu. Donna was moving to the south-southeast at 7 knots (8 mph/13 kph).

The Vanuatu National Disaster Management Office (NDMO) advised residents that a Red Alert is active for Sanma, Malampa and Shefa provinces and a Yellow Alert is now in effect for Tafea province. For updated forecasts for Vanuatu, visit: http://www. The New Caledonia Meteorological Service continues to issue warnings on Donna; updates can be found at: http://www.

The Joint Typhoon Warning Center (JTWC) predicts that Tropical Cyclone Donna has reached its peak intensity and will weaken as it heads toward the south-southeast over the next few days. Donna is expected to still be a powerful hurricane as it passes close to the east of New Caledonia. After May 9, the storm is forecast to move into an area with cooler sea surface temperatures and increased wind shear that are expected to weaken it quickly.
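The unit conversions and Saffir-Simpson classification quoted above (115 knots, about 132 mph/213 kph, Category 4) can be reproduced with a short script. The category thresholds are the standard Saffir-Simpson wind-speed bounds in mph; the function name is, of course, illustrative:

```python
# Sketch: convert sustained winds in knots to mph/kph and assign a
# Saffir-Simpson category, as done in the article (115 kt -> Category 4).

def classify_saffir_simpson(wind_kt):
    """Return (mph, kph, category) for a sustained wind speed in knots."""
    mph = wind_kt * 1.15078          # 1 knot = 1.15078 mph
    kph = wind_kt * 1.852            # 1 knot = 1.852 km/h
    if mph >= 157:
        cat = 5
    elif mph >= 130:
        cat = 4
    elif mph >= 111:
        cat = 3
    elif mph >= 96:
        cat = 2
    elif mph >= 74:
        cat = 1
    else:
        cat = 0                      # below hurricane strength
    return round(mph), round(kph), cat

print(classify_saffir_simpson(115))  # -> (132, 213, 4)
```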


Gascon E.,University of León | Sanchez J.L.,University of León | Charalambous D.,Meteorological Service | Fernandez-Gonzalez S.,Complutense University of Madrid | And 3 more authors.
Atmospheric Research | Year: 2015

On 4 March 2011, an exceptionally heavy snowfall event affected the Madrid region on the central Iberian Peninsula. At altitudes of 1200 m, snowfall reached a record of 34 cm in 24 h and produced considerable damage and disruption to electricity distribution and transport systems. Maximum precipitation intensity was identified between 1600 and 1800 UTC. Associated precipitation was particularly intense in the Guadarrama Mountains (at the center of the Peninsula, near Madrid). Analysis of Meteosat Second Generation (MSG) satellite images revealed a dark area, generated by a stratospheric intrusion originating in the Atlantic and reaching the Iberian Peninsula. We studied the synoptic conditions and mesoscale factors involved in the event using the Weather Research and Forecasting (WRF) model. This permitted analysis of the evolution of the dry intrusion caused by a tropopause fold, its movement, and frontogenesis-related mechanisms during its crossing of the Guadarrama Mountains. The blocking of a wet warm mass, at altitude owing to a descent of the tropopause but mainly at low levels because of orographic effects, helped concentrate moisture and generate potential instability (PI). This was subsequently released in deep convection, owing to the formation of frontogenesis. © 2014 Elsevier B.V.


News Article | March 4, 2016
Site: www.nature.com

Following a record winter in many ways, Arctic sea-ice cover seems poised to reach one of its smallest winter maxima ever. As of 28 February, ice covered 14.525 million square kilometres, or 938,000 square kilometres less than the 1981–2010 average. And researchers are using a new technique to capture crucial information about the thinning ice pack in near real time, to better forecast future changes. Short-term weather patterns and long-term climate trends have conspired to create an extraordinary couple of months, even by Arctic standards. “This winter will be the topic of research for many years to come,” says Jennifer Francis, a climate scientist at Rutgers University in New Brunswick, New Jersey. “There’s such an unusual cast of characters on the stage that have never played together before.” The characters include the El Niño weather pattern that is pumping heat and moisture across the globe, and the Arctic Oscillation, a large-scale climate pattern whose shifts in recent months have pushed warm air northward. Together, they are exacerbating the long-term decline of Arctic sea ice, which has shrunk by an average of 3% per decade in February since satellite records began in 1979. A persistent ridge of high-pressure air perched off the US West Coast has steered weather systems around drought-stricken California, funnelling warmth northward. As a consequence, sea ice is particularly scarce this year in the Bering Sea. “The ice would normally be extensive and cold, but we have open water instead,” says Francis. A storm last December compounded the situation by pushing warm air — more than 20 °C above average — to the North Pole. In January, an Arctic Oscillation-driven warm spell heated the air above most of the Arctic Ocean. By February, ice had begun to circulate clockwise around the Arctic basin and out through the Fram Strait, says Julienne Stroeve, a researcher at the US National Snow and Ice Data Center (NSIDC) in Boulder, Colorado.
Given the Arctic’s notoriously unpredictable weather, the low maximum doesn’t necessarily foretell record-low melting this summer, when sea ice will reach its annual minimum. (The biggest summer melt on record happened in 2012, a year without an El Niño.) But researchers have one new tool with which to track the changes as they happen this year — the first detailed, near-real-time estimates of ice thickness, from the European Space Agency’s CryoSat-2 satellite. Three research groups currently calculate Arctic ice thickness from satellite data, but with a lag time of at least a month. Faster estimates would allow shipping companies to better plot routes through the Arctic, and scientists to improve their longer-term forecasts of ice behaviour. “The quicker you have these estimates of sea-ice thickness, the quicker you can start assimilating them into models and make more timely predictions of what’s going to happen,” says Rachel Tilling, a sea-ice researcher at University College London. She and her colleagues have developed a faster way to get information on ice thickness from CryoSat-2 (see ‘Measuring stick’). The satellite measures thickness by comparing the time that it takes for radar signals to bounce off the ice, as opposed to open water. Normally, it takes several months for satellite operators to calculate CryoSat-2’s precise orbit (and therefore the exact location of the ice and water that it flew over). But Tilling’s group instead runs a quick-and-dirty analysis of orbital data, then combines it with near-real-time information on ice concentration from the NSIDC and ice type from the Norwegian Meteorological Service (R. L. Tilling et al. Cryosphere Discuss. http://doi.org/bcw5; 2016). The result is ice-thickness measurements that are ready in just 3 days, and accurate to within 1.5% of those produced months later. The current winter cycle is the first complete season for the near-real-time data.
(The measurements cannot be done in the summer, when melt ponds on the ice confuse the satellite.) Tilling has begun to speak to shipping companies, among others, that are interested in using the data as fast as they are produced. “It really is a new era for CryoSat-2,” she says. More-accurate ice-thickness data would improve climate models and give better forecasts for the possible impacts of thick or thin sea ice, says Nathan Kurtz, a cryosphere scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. Kurtz helps to lead NASA’s IceBridge project, which will begin flying aeroplanes north of Greenland later this month to measure ice thickness using lasers and an infrared camera that can detect heat from the underlying water. Thickness measurements are more crucial than ever, given the changing Arctic, says David Barber, a sea-ice specialist at the University of Manitoba in Winnipeg, Canada. He and his colleagues reported last year that there is increased open water all around the edge of the Arctic ice pack every month of the year (D. G. Barber et al. Prog. Oceanogr. 139, 122–150; 2015). “We’re getting more open water in the winter than we were expecting,” Barber says. “These changes are happening very quickly, and I don’t think people are fully aware of how dramatic they are.”
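Radar altimeters such as CryoSat-2 actually measure the ice freeboard (how far the ice surface stands above the water); converting freeboard to thickness relies on hydrostatic equilibrium. A minimal sketch, using assumed textbook densities rather than the values in Tilling’s actual processing chain:

```python
# Hedged sketch of the hydrostatic freeboard-to-thickness conversion that
# underlies CryoSat-2 ice-thickness retrievals. Densities are illustrative.

RHO_WATER = 1024.0  # kg/m^3, sea water
RHO_ICE   = 917.0   # kg/m^3, sea ice
RHO_SNOW  = 320.0   # kg/m^3, snow

def ice_thickness(freeboard_m, snow_depth_m=0.0):
    """Ice thickness (m) from radar freeboard via hydrostatic equilibrium."""
    dr = RHO_WATER - RHO_ICE
    return freeboard_m * RHO_WATER / dr + snow_depth_m * RHO_SNOW / dr

# A 30 cm freeboard with no snow implies roughly 2.9 m of ice:
print(round(ice_thickness(0.30), 2))  # -> 2.87
```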


News Article | February 15, 2017
Site: www.realclimate.org

When climate deniers are desperate because the measurements don’t fit their claims, some of them grasp at the last straw: they try to deny and discredit the data. The years 2014 and 2015 set new records in global temperature, and 2016 has done so again. Some don’t like this because it doesn’t fit their political message, so they try to spread doubt about the observational records of global surface temperatures. A favorite target is the adjustments that occur as these observational records are gradually vetted and improved by adding new data and eliminating artifacts that arise e.g. from changing measurement practices or the urban heat island effect. More about this is explained in this blog article by Victor Venema of Bonn University, a leading expert on the homogenization of climate data. And of course the new paper by Hausfather et al., which made quite a bit of news recently, documents how meticulously scientists work to eliminate bias in sea surface temperature data, in this case arising from a changing proportion of ship versus buoy observations. To illustrate the shenanigans of self-styled “climate skeptics”, take for example the following graph, which has been circulating for a while on climate denier websites. It beautifully illustrates two of the favorite tricks of climate deniers: cherry picking and misleading graphics. Fig. 1 Revision history of two individual monthly values, for January 1910 and January 2000, in the GISTEMP global temperature data from NASA (Source: WUWT) If you look at the black arrows, do you have the impression that the 0.71 °C temperature difference is mainly due to data adjustments? Because the arrow on the right is three times longer than that on the left? Far from it – can you spot the trick? In the vertical axis, 0.3 °C is missing in the middle! The adjustment is actually only 0.26 °C. Even that is quite a lot of course – and that’s because it is an extreme example.
The January 1910 shown is the month with the second largest downward correction, obviously cherry-picked from the 1,643 months of the data series. In the annual mean values, and particularly for the temperatures since the Second World War, the corrections are minimal, as the following graph shows: Fig. 2 Revision history of the global temperature data set from NASA. Here, too, one can see that the greatest correction occurred in 1910. (Source: Goddard Institute for Space Studies) This graph must be familiar to anyone who works with the NASA data, because it is in the notes to the data on the NASA site (even interactive). Incidentally, Gavin already debunked the misleading representation in Fig. 1 last March on Twitter. Anyone who shows you Fig. 1 without also explaining the big picture as shown in Fig. 2 is trying to fool you. A denier favorite is to suggest that NASA deliberately adjusts temperatures upward to exaggerate global warming. An absurd conspiracy theory, as demonstrated by the basic fact that the net effect of the data adjustments is to reduce global warming. The next figure shows this. If climate scientists were trying to exaggerate global warming they’d show you the unadjusted raw data! Fig. 3 The NASA data of global temperature compared to the uncorrected raw data (light blue) and two global temperature data sets from other institutes. (Source: Goddard Institute for Space Studies) It is not surprising that you find such conspiracy theories on Anthony Watts’ sectarian blog WUWT, a place where climate science amateurs present earth-shattering insights to their echo chamber, like: The CO2 increase is not due to burning fossil fuels, but to insects! Global warming is caused by the nuclear reactor in the Earth’s core! The warming of Antarctica comes from “waste heat from little pockets of humanity” there! The Greenland ice sheet can be at most 650 years old! (That’s obvious – how could the Vikings otherwise have grown crops there?)
The claim that the NASA data can’t be trusted has recently turned up at WUWT once again, because we had compared these data to a cooling forecast made by two German climate denialists, and the comparison wasn’t exactly looking good for that forecast. Fritz Vahrenholt and Sebastian Lüning, both former employees of Europe’s largest single CO2 emitter RWE, had made this forecast of imminent cooling in their 2012 book “Die kalte Sonne” (The Cold Sun), where the forecast is shown in relation to the HadCRUT surface temperature data as a running mean over 23 months. So if they don’t like the fact that we compared their forecast to NASA’s GISTEMP data, let’s just do it with the HadCRUT data. Fig. 4 The “Cold Sun” forecast of Vahrenholt and Lüning compared with global surface temperatures of the British Meteorological Service (HadCRUT data), moving average over 23 months to the end of October 2016. Graph: Prof. Stefan Rahmstorf, Creative Commons BY-SA 4.0. Hmm. Still not so convincing for the cold-sun forecast. In defense of their forecast at WUWT, Vahrenholt and Lüning have therefore applied three tricks which reduce the discrepancy between data and forecast. But even with these three changes compared to the original forecast graph in The Cold Sun, Vahrenholt and Lüning don’t succeed in preventing the observed temperature curve from rising out of their forecast interval. They try to belittle this with the argument that last month’s value just returns to the top edge of the forecast interval – which is irrelevant, however, because the forecast does not apply to individual months, which always scatter strongly. The second step alone – just switching from the surface to the satellite data – would not have helped them much, by the way, bringing only a reduction from 0.34 to 0.30 °C. One can guess that this is why Vahrenholt and Lüning have also extended the smoothing period from two to three years.
But let’s accept this longer averaging period as a legitimate choice, since the forecast applies to the medium-term climate evolution and not to short-term fluctuations, which smoothing can therefore filter out. Using a period of three years instead of just two also takes out El Niño better. With a 37-month smoothing period the comparison looks as follows: Fig. 5 The “Cold Sun” forecast of Vahrenholt and Lüning compared with global surface temperatures of the British Meteorological Service (HadCRUT data), running average over 37 months. Graph: Prof. Stefan Rahmstorf, Creative Commons BY-SA 4.0. This still clearly falsifies the cooling forecast of Vahrenholt and Lüning. I have discussed this example here in some detail because it exemplifies the methods of so-called “climate skeptics”. People like Vahrenholt and Lüning trust that a layperson won’t notice their various tricks. An outsider can ultimately hardly recognize these tricks without studying the available data and scientific literature intensively. However, applying some common-sense criteria can give a layperson a clear indication of the lack of credibility: the source is a “climate skeptics” website; there is no research institution and no professional climate scientist behind these claims; there is no peer-reviewed publication of the cooling forecast, which instead is directed exclusively at a lay audience; and finally, the authors have ties to the fossil energy business. As in professional journalism, there are several levels of quality assurance in professional science. A long period of study and training, which conveys methods and ethics (like the search for truth and the continuous questioning of one’s own assumptions). The standards of good scientific practice (non-compliance, such as manipulation of data, can cost a scientist their job and future prospects). The reputation of the scientist and their research institution, their greatest asset, which is rapidly lost by making false claims.
Peer review, i.e. the critical assessment of scientific publications (and even institutions) by independent third parties (mostly competitors). And last but not least, the culture of critical, open debate, which is very much alive e.g. at conferences and which quickly identifies most problems or mistakes. None of this is infallible, and professional scientists sometimes make mistakes. For this reason, one should not necessarily believe every individual statement by a scientist, not even every peer-reviewed publication. It is better to base one’s assessment on the bigger picture. There is good reason why, every few years, hundreds of climate scientists from around the world voluntarily and unpaid tackle the big task of sifting through the scientific literature, debating it and summarizing the state of knowledge in the reports of the IPCC. There has long been an overwhelming consensus on the basic facts of global warming. Anyone who found serious, defensible counter-evidence would quickly become famous – a place in the top journals Nature, Science or PNAS would be assured. The likelihood that you will find a scientific sensation on a shrill layperson website like WUWT is infinitely smaller than the likelihood that you are simply being fooled there.
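The effect of widening the running-mean window, as in the step from a 23- to a 37-month average, can be sketched with synthetic data. This is an illustration with made-up numbers, not the HadCRUT series:

```python
# Sketch: a centred running mean over monthly anomalies, showing why a wider
# window damps a short-lived spike (an El Niño stand-in) more strongly.
import numpy as np

def running_mean(x, window):
    """Centred moving average; returns the shorter 'valid' part only."""
    return np.convolve(x, np.ones(window) / window, mode="valid")

rng = np.random.default_rng(0)
months = np.arange(240)
trend = 0.015 / 12 * months                                  # slow warming
spike = np.where((months > 180) & (months < 192), 0.4, 0.0)  # 1-year spike
series = trend + spike + rng.normal(0, 0.1, months.size)     # + noise

# The wider window flattens the spike more strongly:
print(running_mean(series, 23).max() > running_mean(series, 37).max())
```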


Retalis A.,Institute for Environmental Research and Sustainable Development | Hadjimitsis D.G.,Cyprus University of Technology | Michaelides S.,Meteorological Service | Tymvios F.,Meteorological Service | And 3 more authors.
Natural Hazards and Earth System Science | Year: 2010

The monitoring of aerosol concentrations is a high environmental priority, particularly in urban areas. Remote sensing of atmospheric aerosol optical thickness (AOT) can be used to assess particulate matter levels at the ground; however, such measurements often need further validation. In this study, aerosol data retrieved from satellite and sun-photometer measurements are compared with visibility data recorded at various locations in Cyprus for the period from January to June 2009. The direct comparison between MODIS and handheld sun-photometer AOT data exhibited a significant correlation (r=0.83); these results are in agreement with those reported by the National Aeronautics and Space Administration (NASA). The correlation between sun-photometer AOT and AOT estimated from visibility measurements was also significant (r=0.76). A direct and significant relationship between MODIS AOT and AOT estimated from visibility values was also found for all the locations used (the correlation coefficient varied from 0.80 to 0.84). It is concluded that MODIS AOT data provide accurate information on the aerosol content over Cyprus, while in the absence of such data, visibility measurements can be used as a secondary source of aerosol load information, in terms of aerosol optical thickness, and provide useful information on a near-real-time basis, whenever data are available. © 2010 Author(s).
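One common way to estimate aerosol extinction from horizontal visibility is the Koschmieder relation; multiplying by an assumed aerosol scale height then gives a rough optical thickness. The sketch below is a generic illustration of that idea, with an assumed 1 km scale height, not the conversion actually used in the paper:

```python
# Sketch: a rough AOT estimate from horizontal visibility via the
# Koschmieder relation. The 1 km aerosol scale height is an assumption.

def aot_from_visibility(visibility_km, scale_height_km=1.0):
    """Approximate AOT: Koschmieder extinction times aerosol scale height."""
    extinction_per_km = 3.912 / visibility_km   # Koschmieder relation
    return extinction_per_km * scale_height_km

# Hazier air (lower visibility) implies a larger optical thickness:
print(round(aot_from_visibility(10.0), 3))   # 10 km visibility -> 0.391
print(round(aot_from_visibility(40.0), 3))   # 40 km visibility -> 0.098
```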


Themistocleous K.,Cyprus University of Technology | Hadjimitsis D.G.,Cyprus University of Technology | Retalis A.,Institute of Environmental Research and Sustainable Development | Chrysoulakis N.,Foundation for Research and Technology Hellas | Michaelides S.,Meteorological Service
Atmospheric Research | Year: 2013

One of the most well-established atmospheric correction methods for satellite imagery is the empirical line method using non-variant targets. Non-variant targets serve as pseudo-invariant targets since their reflectance values are stable over time. A recent adaptation of the empirical line method incorporates ground reflectance measurements of selected non-variant targets. Most users are not aware of the actual condition of the pseudo-invariant targets, i.e., whether they are dry or wet; omitting such effects may cause erroneous results, so remote sensing users must be aware of them. This study assessed the effects of precipitation on five types of commonly found surfaces, including asphalt, concrete and sand, intended as pseudo-invariant targets for atmospheric correction. Spectroradiometric measurements were taken in wet and dry conditions to obtain the spectral signatures of the targets, from January 2010 to May 2011 (46 campaigns). An atmospheric correction of eleven Landsat TM/ETM+ satellite images using the empirical line method was conducted. To identify the effects of precipitation, the atmospheric path radiance component was compared for wet and dry conditions. It was found that precipitation affected the reflectance values of the surfaces, especially sand. Therefore, precipitation conditions need to be considered when using non-variant targets in atmospheric correction methods. © 2012 Elsevier B.V.
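A minimal sketch of the empirical line method itself: a linear gain and offset are fitted, per band, between at-sensor values and the measured ground reflectance of the pseudo-invariant targets. The target values below are hypothetical, not the paper's measurements:

```python
# Sketch of the empirical line method: fit reflectance = gain * DN + offset
# from pseudo-invariant targets, then apply the line to image pixels.
import numpy as np

# At-sensor digital numbers and ground reflectance for, say, asphalt,
# concrete and sand targets in one band (hypothetical values):
dn          = np.array([40.0, 120.0, 180.0])
reflectance = np.array([0.08, 0.35, 0.55])

gain, offset = np.polyfit(dn, reflectance, 1)   # least-squares line

def surface_reflectance(dn_pixel):
    """Apply the empirical line to convert a pixel DN to reflectance."""
    return gain * dn_pixel + offset

print(round(surface_reflectance(100.0), 3))
```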


Gabella M.,Polytechnic University of Turin | Morin E.,Hebrew University of Jerusalem | Notarpietro R.,Polytechnic University of Turin | Michaelides S.,Meteorological Service
Atmospheric Research | Year: 2013

The spaceborne weather radar onboard the Tropical Rainfall Measuring Mission (TRMM) satellite can be used to adjust Ground-based Radar (GR) echoes as a function of the range from the GR site. The adjustment is based on the average linear radar reflectivity in circular rings around the GR site, for both the GR and the attenuation-corrected NearSurfZ TRMM Precipitation Radar (TPR) images. In previous studies, it was found that in winter, for the lowest elevation of the Cyprus C-band radar, the GR/TPR equivalent rain rate ratio decreased, on average, by approximately 8 dB per decade of range. In this paper, the same analysis has been applied to another C-band radar in the southeastern Mediterranean area. For the lowest elevation of the "Shacham" radar in Israel, the GR/TPR equivalent rain rate ratio is found to decrease by approximately 6 dB per decade of range. The average departure at the "reference", intermediate range is related to the calibration of the GR. The negative slope of the range dependence is considered to be mainly caused by an overshooting problem (the increasing sampling volume of the GR with range, combined with non-homogeneous beam filling and, on average, a decreasing vertical profile of radar reflectivity). To check this hypothesis, we compared the same NearSurfZ TPR images against GR data acquired using the second elevation. We expected these data to be more affected by overshooting, especially at distant ranges: the negative slope of the range dependence was indeed found to be more pronounced than in the case of the lowest GR elevation, for both the Cypriot and Israeli radars. © 2013 Elsevier B.V.
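The range-dependence statistic quoted above (a GR/TPR ratio falling by some dB per decade of range) can be sketched by averaging the ratio in circular range rings and fitting a line against the logarithm of range. The ring values below are invented for illustration, not the paper's data:

```python
# Sketch: fit the GR/TPR ratio (in dB) against log10(range) so the slope
# reads as "dB per decade of range". The ring-averaged values are made up.
import numpy as np

range_km = np.array([40.0, 60.0, 80.0, 100.0, 120.0])   # ring centres
ratio_db = np.array([1.2, 0.1, -0.7, -1.4, -1.9])       # GR/TPR in dB

slope, intercept = np.polyfit(np.log10(range_km), ratio_db, 1)

# Negative slope: the GR increasingly underestimates at far ranges.
print(round(slope, 1))   # -> -6.5 dB per decade of range
```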


Tymvios F.,Meteorological Service | Savvidou K.,Meteorological Service | Michaelides S.C.,Meteorological Service
Advances in Geosciences | Year: 2010

Dynamically induced rainfall is strongly connected with synoptic atmospheric circulation patterns at upper levels. This study investigates the relationship between days with high precipitation volume events in the eastern Mediterranean and the associated geopotential height patterns at 500 hPa. To reduce the number of different patterns and to simplify the statistical processing, the input days were classified into clusters of synoptic cases with similar characteristics, utilizing the Kohonen Self-Organizing Map (SOM) architecture. Using this architecture, synoptic patterns were grouped into 9, 18, 27 and 36 clusters, which were subsequently used in the analysis. The classification performance was tested by applying the method to extreme rainfall events in the eastern Mediterranean. The relationship between the synoptic upper-air patterns (500 hPa height) and surface features (heavy rainfall events) was established, and the 36-member classification proved to be the most efficient. © 2010 Author(s).
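A toy version of the SOM clustering may help: the sketch below trains a small one-dimensional Kohonen map on random stand-in "patterns" and then assigns each one to its best-matching node. The grid size and learning parameters are illustrative choices, not those of the paper:

```python
# Sketch of a 1-D Self-Organizing Map, the clustering idea behind the
# SOM classification of 500 hPa patterns. All inputs here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
patterns = rng.normal(size=(200, 16))   # stand-ins for gridded height fields
n_nodes = 9                             # e.g. a 9-cluster classification
weights = rng.normal(size=(n_nodes, 16))

for epoch in range(50):
    lr = 0.5 * (1 - epoch / 50)                    # decaying learning rate
    radius = max(1.0, 3.0 * (1 - epoch / 50))      # shrinking neighbourhood
    for x in patterns:
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
        dist = np.abs(np.arange(n_nodes) - bmu)            # grid distance
        h = np.exp(-(dist ** 2) / (2 * radius ** 2))       # neighbourhood
        weights += lr * h[:, None] * (x - weights)

# Each day is then assigned to the cluster of its best-matching node:
labels = np.array([np.argmin(((weights - x) ** 2).sum(axis=1)) for x in patterns])
print(np.bincount(labels, minlength=n_nodes))   # days per cluster
```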


Michaelides S.,Meteorological Service | Savvidou K.,Meteorological Service | Nicolaides K.,Meteorological Service
Advances in Geosciences | Year: 2010

The objective of this work is to study the relationship between the number of lightning flashes recorded by a network of lightning detectors and the amount of rainfall recorded by a network of automatic rain gauges during rainy events in Cyprus. This study aims at revealing possible temporal and spatial relationships between rainfall and lightning intensities. The data used are based on the available records of hourly rainfall and the associated lightning data, with respect to both time and space. The search for temporal and spatial relationships between lightning and rainfall is made by considering various time lags between lightning and rainfall, and by varying the area around the rain gauge to which the associated lightning data set refers. The methodology adopted in this paper is a statistical one, and rainy events registered under the European project "FLASH" are examined herein. © 2012 Author(s).
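The time-lag part of such an analysis can be sketched as follows: correlate hourly lightning counts with hourly rainfall at several lags and keep the lag with the strongest correlation. The synthetic series (rain trailing lightning by one hour) is invented for illustration:

```python
# Sketch: lagged correlation between hourly lightning counts and rainfall.
import numpy as np

rng = np.random.default_rng(2)
lightning = rng.poisson(5.0, 100).astype(float)               # flashes/hour
rain = np.roll(lightning, 1) * 0.3 + rng.normal(0, 0.2, 100)  # ~1 h later

def lag_correlation(x, y, lag):
    """Pearson r between x and y shifted by `lag` hours."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    return np.corrcoef(x, y)[0, 1]

best = max(range(4), key=lambda lag: lag_correlation(lightning, rain, lag))
print(best)   # -> 1 (rainfall follows lightning by about one hour)
```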


Michaelides S.,Meteorological Service | Tymvios F.,Meteorological Service | Charalambous D.,Meteorological Service | Charalambous D.,Lancaster University
Advances in Geosciences | Year: 2010

The present study is a comprehensive application of a methodology developed for the classification of synoptic situations using artificial neural networks. In this respect, the 500 hPa geopotential height patterns at 12:00 UTC (Coordinated Universal Time), determined from the reanalysis data (ERA-40 dataset) of the European Centre for Medium-Range Weather Forecasts (ECMWF) over Europe, were used. The dataset covers a period of 45 years (1957-2002), and the neural network methodology applied is the SOM (Self-Organizing Map) architecture. The classification of the synoptic scale systems was conducted by considering 9, 18, 27 and 36 synoptic patterns. The statistical analysis of the frequency distribution of the classification results for the 36 clusters over the entire period revealed significant tendencies in the frequency distribution of certain clusters, thus suggesting a possible climatic change. Subsequently, the database was split into two periods: the "reference" period, which includes the first 30 years, and the "test" period, comprising the remaining 14 years. © Author(s) 2010.
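The reference/test split described at the end can be sketched by comparing cluster frequency distributions between the two periods. The labels below are random stand-ins, not the ERA-40 classification:

```python
# Sketch: compare how often each cluster occurs in a "reference" versus a
# "test" period. All labels here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(3)
n_clusters = 36
ref_labels  = rng.integers(0, n_clusters, 30 * 365)   # ~30-year reference
test_labels = rng.integers(0, n_clusters, 14 * 365)   # ~14-year test period

ref_freq  = np.bincount(ref_labels,  minlength=n_clusters) / ref_labels.size
test_freq = np.bincount(test_labels, minlength=n_clusters) / test_labels.size

# The cluster whose relative frequency shifted most between the periods:
shift = test_freq - ref_freq
print(int(np.argmax(np.abs(shift))))
```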
