News Article
Site: http://phys.org/space-news/

This star is surrounded by a disc of gas and dust—such discs are called protoplanetary discs, as they represent the early stages in the creation of planetary systems. This particular disc is seen nearly edge-on, and its appearance in visible-light pictures has led to its being nicknamed the Flying Saucer.

The astronomers used the Atacama Large Millimeter/submillimeter Array (ALMA) to observe the glow coming from carbon monoxide molecules in the disc of 2MASS J16281370-2431391. They were able to create very sharp images and found something strange: in some cases they saw a negative signal. Normally a negative signal is physically impossible, but in this case there is an explanation, which leads to a surprising conclusion. Lead author Stephane Guilloteau takes up the story: "This disc is not observed against a black and empty night sky. Instead it's seen in silhouette in front of the glow of the Rho Ophiuchi Nebula. This diffuse glow is too extended to be detected by ALMA, but the disc absorbs it. The resulting negative signal means that parts of the disc are colder than the background. The Earth is quite literally in the shadow of the Flying Saucer!"

The team combined the ALMA measurements of the disc with observations of the background glow made with the IRAM 30-metre telescope in Spain. They derived a disc dust-grain temperature of only -266 degrees Celsius (7 degrees above absolute zero, or 7 Kelvin) at a distance of about 15 billion kilometres from the central star. This is the first direct measurement of the temperature of large grains (with sizes of about one millimetre) in such objects. This temperature is much lower than the -258 to -253 degrees Celsius (15 to 20 Kelvin) that most current models predict. To resolve the discrepancy, the large dust grains must have different properties than those currently assumed, allowing them to cool down to such low temperatures.
"To work out the impact of this discovery on disc structure, we have to find what plausible dust properties can result in such low temperatures. We have a few ideas—for example the temperature may depend on grain size, with the bigger grains cooler than the smaller ones. But it is too early to be sure," adds co-author Emmanuel di Folco (Laboratoire d'Astrophysique de Bordeaux). If these low dust temperatures are found to be a normal feature of protoplanetary discs this may have many consequences for understanding how they form and evolve. For example, different dust properties will affect what happens when these particles collide, and thus their role in providing the seeds for planet formation. Whether the required change in dust properties is significant or not in this respect cannot yet be assessed. Low dust temperatures can also have a major impact for the smaller dusty discs that are known to exist. If these discs are composed of mostly larger, but cooler, grains than is currently supposed, this would mean that these compact discs can be arbitrarily massive, so could still form giant planets comparatively close to the central star. Further observations are needed, but it seems that the cooler dust found by ALMA may have significant consequences for the understanding of protoplanetary discs. This research was presented in a paper entitled "The shadow of the Flying Saucer: A very low temperature for large dust grains", by S. Guilloteau et al., published in Astronomy & Astrophysics Letters.
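The sign logic behind the negative signal, and the Celsius-to-Kelvin conversion quoted above, can be sketched in a few lines (an illustrative toy, not the authors' analysis; the ~10 K background value is a hypothetical number chosen only to show the sign):

```python
# Illustrative sketch: in the Rayleigh-Jeans regime the measured contrast
# is proportional to the difference between the source and background
# brightness temperatures, so a disc colder than the nebula behind it
# produces a negative signal.

def contrast_sign(t_disc_k: float, t_background_k: float) -> str:
    """Sign of the observed signal for an absorbing disc seen in
    silhouette against a diffuse background glow."""
    delta = t_disc_k - t_background_k
    if delta < 0:
        return "negative"  # disc colder than background: absorption dip
    return "positive" if delta > 0 else "zero"

# Grain temperature from the paper: 7 K (-266 degrees Celsius). If the
# background glow corresponds to, say, ~10 K (hypothetical), the signal
# measured against it is negative.
print(contrast_sign(7.0, 10.0))  # → negative

# Sanity check on the quoted conversion: -266 °C is indeed about 7 K.
print(round(-266 + 273.15))      # → 7
```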


News Article | September 2, 2016
Site: http://phys.org/biology-news/

It is known that genes inherited from ancient retroviruses are essential to the placenta in mammals, a finding to which scientists in the Laboratoire Physiologie et Pathologie Moléculaires des Rétrovirus Endogènes et Infectieux (CNRS/Université Paris-Sud) contributed. Today, the same scientists reveal a new chapter in this story: these genes of viral origin may also be responsible for the greater muscle mass seen in males. Their findings are published on 2 September 2016 in PLOS Genetics.


News Article
Site: http://www.spie.org/x2358.xml

The efficiency of today's photovoltaic (PV) solar cells is constrained by a number of energy-loss mechanisms (e.g., limited incoupling of sunlight, weak absorption of long-wavelength photons, and charge-carrier energy losses by thermalization). To enable the development of novel PV devices with record efficiencies, all of these optical processes must be carefully controlled. Several groups have proposed theoretical limits on the absorption efficiency of PV devices. These limits, which depend primarily on the thickness of the absorbing layer (e.g., silicon), predict how much an efficient light-trapping strategy can increase the mean free path of sunlight in the absorbing medium of a solar cell compared with a flat (unpatterned) device. Increasing the mean free path causes trapped photons to explore the absorbing medium more extensively, thereby increasing overall absorption. Although the limits that are frequently used to make these predictions (i.e., the 4n² and Lambertian limits)1, 2 rely on strong assumptions (e.g., thick layers or weakly absorbing media), they nonetheless provide useful references for benchmarking novel absorbers for use in PV devices (e.g., those using nanopatterns). The introduction of micro- and nanopatterns in such devices could, however, approach these limits or—in very specific cases, over limited wavelength ranges—overcome them entirely. Last year, a decade after increasingly intense research began in this area, fully functional solar cells integrated with nanophotonic structures showed a net conversion-efficiency increase for the first time.3, 4 A number of key challenges and open questions remain, however.
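As a quick reference point, the classical 4n² limit mentioned above can be evaluated directly (a back-of-the-envelope sketch; the silicon index value is an approximation):

```python
# Back-of-the-envelope evaluation of the classical 4n^2 (Lambertian)
# light-trapping limit: an ideal randomizing texture lengthens the mean
# path of weakly absorbed light by a factor of 4 * n**2 relative to a
# single pass through the absorbing layer.

def path_enhancement(n: float) -> float:
    """Maximum mean-path enhancement for an ideal Lambertian scheme."""
    return 4.0 * n ** 2

n_si = 3.5  # approximate refractive index of crystalline silicon (near-IR)
print(path_enhancement(n_si))  # → 49.0
```

For crystalline silicon this factor of roughly 49 is why even a micrometre-thick layer can, with good light trapping, absorb like a much thicker film.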
For example, it is not yet clear how appropriate patterns can be integrated in a real solar cell made of standard PV materials, what the most appropriate active-layer thickness is, or what the most appropriate geometry for micro- and nanopatterns might be. In our attempt to provide answers to these questions, we have demonstrated that by integrating a periodic array of nanoholes or nanopyramids (see Figure 1) in a crystalline-silicon-based solar cell, a photocurrent exceeding 23 mA/cm² can be generated with an active layer of only 1 µm. This photocurrent, which we predicted using rigorous coupled-wave analysis and a finite-difference time-domain method, is twice the value expected from an unpatterned device of the same thickness. Recent experiments, including those performed in the European PhotoNVoltaics project, have shown that nanopattern-integrated PV devices with efficiently passivated surfaces can increase the conversion efficiency by 20% or more.5 Additionally, the optimization of technological processes and photonic pattern designs is likely to further enhance the conversion efficiency of such PV devices. Indeed, several authors have claimed that the generated photocurrent could be increased by introducing some disorder within a periodic light-trapping structure (such as a photonic crystal).6–8 However, the impact of such a perturbation has rarely been evaluated with respect to a perfectly optimized periodic structure. Additionally, clear design rules that enable selection of the relevant type of disorder for specific applications are still missing. Moreover, the full picture regarding the physical mechanisms behind light trapping in such complex structures is not yet clear. To determine the influence of disorder in such devices, we have proposed the implementation of complex patterns based on a periodic square array of air holes. Using this simple array as a base, we define a large supercell in which the position of each nanohole is randomly shifted.
The resultant structure is referred to as a pseudo-disordered structure (PDS): see Figure 2(a). We have demonstrated both theoretically and experimentally that the absorption in such a PDS can exceed that of a fully optimized solar-cell stack with a simple periodic nanopattern: see Figure 2(b). In particular, the long-wavelength reflection peaks are substantially decreased, leading to a predicted photocurrent increase of 2–3%. We have also demonstrated the need for appropriate metrics to determine the type of disorder that can lead to optimized conversion efficiency. Indeed, we have found that different types of PDS may lead to a broad dispersion of absorption efficiencies for the same nanohole mean shift. From such considerations, we have defined more specific parameters, referred to as clustering (relating to the minimal distance between nanoholes) and compactness (which quantifies how many nanoholes are closely packed within a supercell). We used these parameters to sort the randomly obtained PDS and their corresponding expected photocurrents.9 Most importantly, our results show that disorder does lead to a net absorption increase, provided that holes are not clustered together. The pattern should also simultaneously exhibit spatial frequencies with low amplitudes in the short-frequency range (to inhibit the outcoupling of trapped light) and high amplitudes in the long-frequency range (to promote light trapping). Using design rules based on these results, we have developed an optimized PDS pattern with an evenly distributed ensemble of air nanoholes: see Figure 2(c). In summary, we have demonstrated that the introduction of a PDS in the active layer of a PV device is likely to increase its absorption and, therefore, the conversion efficiency.
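The construction described above (a periodic array perturbed by random shifts, then screened with a clustering metric) can be sketched as follows; all parameter names and values are illustrative, not taken from the paper:

```python
# Hypothetical sketch of a pseudo-disordered structure (PDS): start from
# a periodic square array of nanoholes, shift each hole randomly within
# the supercell, and use the minimal centre-to-centre distance as a
# simple "clustering" metric for screening candidate patterns.
import itertools
import math
import random

def make_pds(n: int, pitch: float, max_shift: float, seed: int = 0):
    """Return nanohole centres of an n x n supercell with random shifts."""
    rng = random.Random(seed)
    holes = []
    for i in range(n):
        for j in range(n):
            dx = rng.uniform(-max_shift, max_shift)
            dy = rng.uniform(-max_shift, max_shift)
            holes.append((i * pitch + dx, j * pitch + dy))
    return holes

def min_hole_distance(holes):
    """Clustering metric: smallest centre-to-centre distance."""
    return min(math.dist(a, b) for a, b in itertools.combinations(holes, 2))

# Illustrative values in nanometres (not the paper's parameters).
holes = make_pds(n=4, pitch=600.0, max_shift=100.0)
print(f"{min_hole_distance(holes):.0f} nm")  # stays below the 600 nm pitch
```

Patterns whose minimal distance falls below a chosen threshold would be rejected as "clustered", in the spirit of the screening described above.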
Combining relevant designs inspired by photonic crystals with careful perturbation, optimized nanopatterning, and passivation processes could enable the high potential of these approaches for next-generation solar cells to be realized. Moreover, manufacturing methods such as these are well suited to producing efficient devices from a limited amount of material. In addition to this gain in sustainability, the approach may also enable the fabrication of flexible solar cells. Beyond thin-film solar cells based on silicon, this methodology could be used to optimize hybrid devices (e.g., those combining perovskites and silicon), and to control sunlight absorption in appropriate locations within a device (e.g., the top or bottom of a layer stack or, in the case of a tandem device, between two junctions). In our future work, we plan to fabricate and test fully functional single-junction solar cells that incorporate PDS. We also intend to develop these complex patterns for use in advanced devices, such as tandem solar cells, and to assist in up- or down-conversion processes in PV devices.

We acknowledge support from the European Commission Seventh Framework Programme project PhotoNVoltaics (grant agreement 309127) and the French Research Agency (ANR) project NATHISOL (grant agreement ANR-12-PRGE-0004-01). He Ding acknowledges support from the China Scholarship Council (CSC). Results were obtained thanks to close collaboration with Jia Liu, Regis Orobtchouk, Hai Son Nguyen, Alain Fave, Fabien Mandorlo, Céline Chevalier, and Pierre Cremillieu (Institut des Nanotechnologies de Lyon, INL), Radoslaw Mazurczyk and Valérie Depauw (IMEC), Olivier Deparis and Jérôme Muller (University of Namur), and Martin Foldyna and Pere Roca i Cabarrocas (Laboratoire de Physique des Interfaces et des Couches Minces).


News Article
Site: http://www.spie.org/x2406.xml

The first detection of a gravitational wave depended on large surfaces with excellent flatness, combined with low microroughness and the ability to mitigate environmental noise. Albert Einstein's general theory of relativity predicted that massive, accelerating bodies in deep space, such as supernovae or orbiting black holes, emit huge amounts of energy that radiates throughout the universe as gravitational waves. Although these "ripples in spacetime" may travel billions of light years, Einstein never thought the technology would exist to detect them on Earth. But a century later, the technology does exist at the Laser Interferometer Gravitational-Wave Observatory (LIGO). Measurements from two interferometers, 3000 km apart in Louisiana and Washington State, have provided the first direct evidence of Einstein's theory by recording gravitational-wave signal GW150914, determined to have been produced by two black holes coalescing 1.2 billion light years away. At the heart of the discovery lie fused-silica optics with figure quality and surface smoothness refined to enable measurement of these incredibly small perturbations. Their design is an important part of LIGO's story. The black hole coalescence was detected as an upward-sweeping 'chirp' from 35 to 300 Hz, which falls in the detectors' mid-frequency range, a range plagued by noise from the optics. [Figure: GW150914 chirp data from the Hanford and Livingston observatories. (Caltech/MIT/LIGO Laboratory)] "Most impressive are [the optics'] size combined with surface figure, coating uniformity, monolithic suspensions, and low absorption," says Daniel Sigg, a LIGO lead scientist at Caltech. LIGO's optics system amplifies and splits a laser beam down two 4 km-long orthogonal tubes. The two beams build power by resonating between reflective mirrors, or 'test masses', suspended at either end of each arm. This creates an emitted wavelength of unprecedented precision.
When the split beam recombines, any change in one arm's path length results in a fringe pattern at the photodetector. For GW150914, this change was just a few times 10⁻¹⁸ metres. Reducing noise sources at each frequency improves interferometer sensitivity. [Figure: noise spectra from the initial LIGO science run and from advanced LIGO's first observation run at Hanford, WA and Livingston, LA, during which GW150914 was detected; advanced LIGO's sensitivity goal is a tenfold noise reduction from initial LIGO. (Caltech/MIT/LIGO Laboratory)] But the entire instrument is subject to environmental noise that reduces sensitivity. A noise plot shows the actual strain on the instruments at all frequencies, which must be distinguished from gravitational-wave signals. The optics themselves contribute to the noise, which most basically includes thermal noise and the quality factor, or 'Q', of the substrate. "If you ping a wine glass, you want to hear 'ping' and not 'dink'. If it goes 'dink', the resonance line is broad and the entire noise increases. But if you contain all the energy in one frequency, you can filter it out," explains GariLynn Billingsley, LIGO optics manager at Caltech. That is the Q of the mirrors. Further, if the test-mass surfaces did not allow identical wavelengths to resonate in both arms, the result would be imperfect cancellation when the beam recombines. And if non-resonating light is lost, so is the ability to reduce laser noise. Perhaps most problematic, the optics' coatings contribute to noise due to stochastic particle motion. Stringent design standards ameliorate these problems. In 1996, a program invited manufacturers to demonstrate their ability to meet the specifications required by initial LIGO's optics. Australia's Commonwealth Scientific and Industrial Research Organisation (CSIRO) won the contract.
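The few-times-10⁻¹⁸-metre figure quoted above follows directly from the definition of strain; a quick order-of-magnitude sketch (the peak-strain value is the widely reported approximate figure, used here only for illustration):

```python
# A gravitational wave with strain h changes an interferometer arm of
# length L by dL = h * L. With LIGO's 4 km arms and GW150914's peak
# strain of roughly 1e-21, the arm-length change is about 4e-18 m.

arm_length_m = 4_000.0  # LIGO arm length
peak_strain = 1e-21     # approximate peak strain of GW150914
delta_l_m = peak_strain * arm_length_m
print(f"{delta_l_m:.0e} m")  # → 4e-18 m
```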
"It was a combination of our ability to generate large surfaces with excellent flatness, combined with very low microroughness," says Chris Walsh, now at the University of Sydney, who supervised the overall CSIRO project. "It requires enormous expertise to develop the polishing process to get the necessary microroughness (0.2–0.4 nm RMS) and surface shape simultaneously." Master optician Achim Leistner led the work, with Bob Oreb in charge of metrology. Leistner pioneered the use of a Teflon lap, which provides a very stable surface that matches the desired shape of the optic during polishing and allows for controlled changes. "We built the optics to a specification that was different to anything we'd ever seen before," adds Walsh. Even with high-precision optics and a thermal compensation system that balances the minuscule heating of the mirror's center, initial LIGO was not expected to detect gravitational waves. Advanced LIGO, whose upgrades began in 2010 and whose first observation run was under way when GW150914 was detected, offers a tenfold increase in design sensitivity thanks to upgrades that address the entire frequency range. "Very simply, we have better seismic isolation at low frequencies; better test masses and suspension at intermediate frequencies; and higher-powered lasers at high frequencies," says Michael Landry, a lead scientist at the LIGO-Hanford observatory. At low frequencies, mechanical resonances are well understood. At high frequencies, radiation pressure and laser 'shot' noise dominate. But at intermediate frequencies (60–100 Hz), scattered light and beam jitter are difficult to control. "Our bucket is lowest here. And there are other things we just don't know," adds Landry. "The primary thermal noise, which is the component at intermediate frequency that will ultimately limit us, is the Brownian noise of the coatings." To improve the signal-to-noise ratio at intermediate frequencies, advanced LIGO needed larger test masses (340 mm diameter).
California-based Zygo Extreme Precision Optics won the contract to polish them. "We were chosen based on our ability to achieve very tight surface figure, roughness, radius of curvature, and surface defect specifications simultaneously," says John Kincade, Zygo's Extreme Precision Optics managing director. The test masses required a 1.9 km radius of curvature, with figure requirements as stringent as 0.3 nm RMS. After super-polishing to extremely high spatial frequencies, ion-beam figuring fine-tunes the curvature by etching the surface several molecules at a time. This allows the shape to be controlled reliably without compromising the ability to achieve low microroughness over large surfaces. [Figure: advanced LIGO input test mass champion data. Zygo achieved figuring accuracy of 0.08 nm RMS over the critical 160 mm central clear aperture, and sub-nanometre accuracy on the full 300 mm clear aperture of many other samples. (Zygo Extreme Precision Optics)] Dielectric coatings deposited on the high-precision surfaces determine their optical performance. CSIRO and the University of Lyon's Laboratoire des Matériaux Avancés shared the contract to apply molecule-thin alternating layers of tantala and silica via ion-beam sputtering. Katie Green, project leader in CSIRO's optics group, says, "The thickness of the individual layers is monitored as they're deposited. Each coating consists of multiple layers of particular thicknesses, with the specific composition of the layers varying depending on how the optic needs to perform in the detector." Additionally, gold coatings around the edges provide thermal shielding and act as an electrostatic drive. LIGO's next observation run is scheduled to begin in September 2016. And after advanced LIGO reaches its design sensitivity by fine-tuning current systems, further upgrades await in the years 2018-2020 and beyond. "One question is how you reduce the thermal noise of the optics, in particular their coatings.
But coating technologies make it hard to get more than a factor of about three beyond advanced LIGO's noise level," says Landry. One possibility is operating at cryogenic temperatures. But "fused silica becomes noisy at cold temperatures, and you need a different wavelength laser to do this," according to Billingsley. Another way of increasing the sensitivity at room temperature is to use interferometers with 40 km-long arms. Other optics-related systems also reduce noise. Advanced LIGO's test masses are suspended on fused-silica fibers, creating a monolithic suspension that reduces thermal noise and raises the system's resonant frequency compared with initial LIGO. "The Q of that system is higher so an entire band shrinks. That means opening up more space at lower frequencies, where binary black holes are," says Landry. In the 17th century, Galileo pointed a telescope at the sky and pioneered a novel way of observing the universe. Now, LIGO's detection of GW150914 marks another new era of astronomy. As advances in glass lenses enabled Galileo's discoveries, so have state-of-the-art optics made LIGO's discoveries possible. And with astronomy's track record of developing new generations of optical devices, both the astrophysical and precision-optics communities are poised for an exciting future.


News Article | February 2, 2016
Site: http://www.techtimes.com/rss/sections/environment.xml

Because of spiking levels of human-induced greenhouse gas emissions, global warming will possibly unleash devastating and extreme flooding in the coming years. Scientists say it will be similar to the severe storms that struck a coastal plain in England in 2014. In a new report, a team of experts explained that climate change had "amplified" the violent storms that flooded the Somerset Levels during late 2013 and early 2014. Man-made greenhouse gas emissions have raised the chances of such extreme flooding by 43 percent, the scientists said, as increasingly warm air holds larger amounts of moisture, leading to heavier downpours. "What was once a 1 in 100-year event in a world without climate change is now a 1 in 70-year event," said Oxford University's Friederike Otto, co-author of the report. Their paper is the first research to look into the likely role of climate change in the winter flooding of the Somerset Levels. During December 2013 and January 2014, heavy rainfall poured down on the coastal plain and wetland area of central Somerset in South West England, affecting Somerset, Dorset, Devon, Cornwall and the Thames valley. The downpour led to extensive flooding, in which more than 5,000 houses and establishments, as well as 17,000 acres of agricultural land, were submerged. Losses amounted to more than £450 million ($647 million). No single extreme weather event can be attributed outright to climate change, but Otto says it is possible to estimate how much more likely an event was made by global warming. Aside from the Somerset Levels, Otto also analysed the severe flooding caused in Cumbria by Storm Desmond in December. Otto found that it was made 40 percent more likely by climate change, and that the record rainfall in the UK over the whole of that month was 50 to 75 percent more likely because of global warming. "We can definitely say with climate change that the issue of flooding isn't going to go away," said Otto.
"As a society we need to think hard about the question of our vulnerability and exposure to flooding." The study also drew on contributions from citizen scientists all over the world, who used spare processing time on their computers to run more than 130,000 simulations of what the weather would have been like with and without human interference in the climate. According to Dr. Pascal Yiou of the Laboratoire des Sciences du Climat et de l'Environnement (LSCE), the increase in the amount of rainfall was due to a rise in moisture. "The more extreme the weather, the stronger the effect of climate change over the UK," said Yiou. Meanwhile, Beate Werner, one of the authors of the report, said the recent flooding in the UK adds to evidence of worsening flood problems across Europe, which have also been aggravated by draining, barricading and building on the flood plains around major rivers. The Somerset Levels study, which is featured in the journal Nature Climate Change, was conducted by experts from LSCE and the Centre for Ecology and Hydrology.
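The "43 percent" and "1 in 70-year" figures quoted above are consistent with each other; a quick check of the arithmetic (a sketch only, not the study's attribution method):

```python
# A 1-in-N-year event has an annual occurrence probability of 1/N.
# Moving from a 1-in-100-year event to a 1-in-70-year event therefore
# raises the annual probability by 100/70 - 1, i.e. about 43 percent.

p_without = 1 / 100  # annual probability without climate change
p_with = 1 / 70      # annual probability with climate change
increase_pct = (p_with / p_without - 1) * 100
print(f"{increase_pct:.0f}%")  # → 43%
```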
