News Article | December 15, 2016
Flash Physics is our daily pick of the latest need-to-know developments from the global physics community, selected by Physics World's team of editors and reporters.

The Spanish astronomer Xavier Barcons will take over as director general (DG) of the European Southern Observatory (ESO) in September 2017, replacing the current DG Tim de Zeeuw, who completes his mandate. Barcons is a professor at the Spanish Council for Scientific Research in Madrid and an expert in the field of X-ray astronomy. He served as ESO council president in 2012–2014 and is currently chair of the organization's Observing Programmes Committee. Based in Garching, Germany, the ESO has three observing sites in Chile. “I look forward to seeing the European Extremely Large Telescope (E-ELT) come to fruition and overseeing the further development of the Very Large Telescope, Atacama Large Millimeter/submillimeter Array (ALMA) and many other projects at ESO,” said Barcons.

A molecular fountain has been created that allows molecules to be observed for very long times as they free fall. Created by Hendrick Bethlem and colleagues at Vrije University in the Netherlands, the technique involves cooling ammonia molecules to millikelvin temperatures and then launching them upwards at about 1.6 m/s. The molecules can then be studied in free fall for as long as 266 ms. This set-up is similar to atomic fountains, which allow very precise measurements to be made of atomic energy levels and form the basis of atomic clocks. A molecular fountain has proven much more difficult to create because molecules can vibrate and rotate, which makes them very difficult to cool and manipulate using conventional laser techniques. Bethlem and colleagues overcame this problem by using electric field gradients to exert forces on ammonia, which is a polar molecule.
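A quick ballistic sanity check (a back-of-the-envelope sketch, not from the paper itself) shows why the quoted numbers hang together: a molecule launched straight up at 1.6 m/s is in free fall for roughly 2v/g before returning to its launch height, so the 266 ms observation window quoted above fits comfortably inside the full flight time.

```python
# Back-of-the-envelope check of the fountain flight time described above.
# Assumptions: simple projectile motion, launch speed from the article.
g = 9.81          # gravitational acceleration, m/s^2
v_launch = 1.6    # launch speed quoted in the article, m/s

t_flight = 2 * v_launch / g          # total up-and-down flight time, s
h_apex = v_launch**2 / (2 * g)       # maximum height reached, m

print(f"total flight time = {t_flight * 1000:.0f} ms")   # ~326 ms
print(f"apex height       = {h_apex * 100:.1f} cm")      # ~13.0 cm
```

The 266 ms observation time reported by the team is, as expected, somewhat shorter than the full ~326 ms trajectory, since only part of the flight path is instrumented.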
The team says that its new molecular fountain could be used to look for tiny deviations from the Standard Model of particle physics, which could be revealed by tiny shifts in molecular energy levels. Tests of the equivalence principle of Einstein's general theory of relativity could also be done by measuring the acceleration due to gravity experienced by different types of molecule. The fountain is described in Physical Review Letters.

An X-ray imaging technique that could previously only be done at large synchrotron facilities has been adapted for widespread use by Sandro Olivo at University College London and colleagues. Called X-ray phase-contrast imaging (XPCI), the method involves measuring changes in the phase of an X-ray beam as it travels through a sample. This is unlike conventional X-ray imaging, which measures the attenuation of the X-ray beam. The technique is better able to distinguish structures in living tissue, making it ideal for medical imaging. XPCI is also better at finding tiny cracks and defects in materials and could be used to detect the presence of weapons and explosives in baggage. Until now, however, XPCI could only be done using the laser-like X-ray beams produced by synchrotrons, which are huge electron accelerators. Olivo and colleagues have developed a technique that allows XPCI to be performed using X-rays generated by conventional medical sources. It involves first passing the X-rays through a “mask” containing an array of apertures to create a number of beams. These then interact with the sample before passing through a second mask to a detector. This configuration converts differences in phase into differences in measured intensity. “We've now advanced this embryonic technology to make it viable for day-to-day use in medicine, security applications, industrial production lines, materials science, non-destructive testing, the archaeology and heritage sector, and a whole range of other fields,” says Olivo.
The technology has already been licensed to Nikon Metrology UK for use in a security scanner and UCL and Nikon are currently developing a medical scanner.
News Article | November 21, 2016
On top of that, there are even plenty of volunteers who are prepared to make a one-way journey to Mars, and people advocating that we turn it into a second home. All of these proposals have focused attention on the peculiar hazards that come with sending human beings to Mars. Aside from its cold, dry environment, lack of air, and huge sandstorms, there's also the matter of its radiation.

Mars has no protective magnetosphere, as Earth does. Scientists believe that at one time, Mars also experienced convection currents in its core, creating a dynamo effect that powered a planetary magnetic field. However, roughly 4.2 billion years ago – either due to a massive impact from a large object, or rapid cooling in its core – this dynamo effect ceased. As a result, over the course of the next 500 million years, Mars' atmosphere was slowly stripped away by the solar wind. Between the loss of its magnetic field and its atmosphere, the surface of Mars is exposed to much higher levels of radiation than Earth's. And in addition to regular exposure to cosmic rays and the solar wind, it receives occasional lethal blasts from strong solar flares.

NASA's 2001 Mars Odyssey spacecraft was equipped with a special instrument called the Martian Radiation Experiment (or MARIE), which was designed to measure the radiation environment around Mars. Since Mars has such a thin atmosphere, radiation detected by Mars Odyssey would be roughly the same as on the surface. Over the course of about 18 months, the Mars Odyssey probe detected ongoing radiation levels 2.5 times higher than what astronauts experience on the International Space Station – 22 millirads per day, which works out to about 8,000 millirads (8 rads) per year. The spacecraft also detected two solar proton events, where radiation levels peaked at about 2,000 millirads in a day, and a few other events that got up to about 100 millirads. For comparison, human beings in developed nations are exposed to (on average) 0.62 rads per year.
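The dose figures quoted above are easy to verify. Using only the article's own numbers (22 millirads per day on Mars, 0.62 rads per year in developed nations):

```python
# Sanity check on the dose figures quoted in the article.
mrad_per_day = 22.0
rad_per_year_mars = mrad_per_day * 365 / 1000   # millirads/day -> rads/year

rad_per_year_earth = 0.62                       # average, developed nations
ratio = rad_per_year_mars / rad_per_year_earth

print(f"Mars surface dose = {rad_per_year_mars:.1f} rads/year")   # ~8.0
print(f"about {ratio:.0f}x the average dose in developed nations")
```

So the "8 rads per year" in the text checks out, and it is roughly thirteen times the average annual exposure of someone in a developed nation.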
And while studies have shown that the human body can withstand a dose of up to 200 rads without permanent damage, prolonged exposure to the kinds of levels detected on Mars could lead to all kinds of health problems – like acute radiation sickness, increased risk of cancer, genetic damage, and even death. And given that exposure to any amount of radiation carries with it some degree of risk, NASA and other space agencies maintain a strict policy of ALARA (As Low As Reasonably Achievable) when planning missions.

Human explorers to Mars will definitely need to deal with the increased radiation levels on the surface. What's more, any attempts to colonize the Red Planet will also require measures to ensure that exposure to radiation is minimized. Already, several solutions – both short-term and long-term – have been proposed to address this problem. For example, NASA maintains multiple satellites that study the sun and the space environment throughout the solar system, and monitor for galactic cosmic rays (GCRs), in the hopes of gaining a better understanding of solar and cosmic radiation. It has also been looking for ways to develop better shielding for astronauts and electronics.

In 2014, NASA launched the Reducing Galactic Cosmic Rays Challenge, an incentive-based competition that awarded a total of $12,000 to ideas on how to reduce astronauts' exposure to galactic cosmic rays. After the initial challenge in April of 2014, a follow-up challenge took place in July that awarded a prize of $30,000 for ideas involving active and passive protection.

When it comes to long-term stays and colonization, several more ideas have been floated in the past. For instance, as Robert Zubrin and David Baker explained in their proposal for a low-cost "Mars Direct" mission, habitats built directly into the ground would be naturally shielded against radiation. Zubrin expanded on this in his 1996 book The Case for Mars: The Plan to Settle the Red Planet and Why We Must.
Proposals have also been made to build habitats above ground using inflatable modules encased in ceramics created from Martian soil. Similar to what has been proposed by both NASA and the ESA for a settlement on the Moon, this plan would rely heavily on robots using a 3-D printing technique known as "sintering", in which loose granular material is fused into a solid by heat without fully melting it.

Mars One, the non-profit organization dedicated to colonizing Mars in the coming decades, also has proposals for how to shield Martian settlers. Addressing the issue of radiation, the organization has proposed building shielding into the mission's spacecraft, transit vehicle, and habitation module. In the event of a solar flare, where this protection is insufficient, it advocates creating a dedicated radiation shelter (located in a hollow water tank) inside the Mars Transit Habitat.

But perhaps the most radical proposal for reducing Mars' exposure to harmful radiation involves jump-starting the planet's core to restore its magnetosphere. To do this, we would need to liquefy the planet's outer core so that it can convect around the inner core once again. The planet's own rotation would then create a dynamo effect, generating a magnetic field. According to Sam Factor, a graduate student with the Department of Astronomy at the University of Texas, there are two ways to do this. The first would be to detonate a series of thermonuclear warheads near the planet's core, while the second involves running an electric current through the planet; the resistance at the core would heat it up.

In addition, a 2008 study conducted by researchers from the National Institute for Fusion Science (NIFS) in Japan addressed the possibility of creating an artificial magnetic field around Earth.
After considering continuous measurements indicating that the intensity of Earth's magnetic field has dropped by 10% in the past 150 years, they proposed that a series of planet-encircling superconducting rings could compensate for future losses. With some adjustments, such a system could be adapted for Mars, creating an artificial magnetic field that could help shield the surface from some of the harmful radiation it regularly receives. In the event that terraformers attempt to create an atmosphere for Mars, this system could also ensure that it is protected from the solar wind.

Lastly, a 2007 study by researchers from the Institute for Mineralogy and Petrology in Switzerland and the Faculty of Earth and Life Sciences at Vrije University in Amsterdam managed to replicate the conditions inside Mars' core. Using a diamond anvil cell, the team reproduced the pressures on iron-sulfur and iron-nickel-sulfur systems that correspond to the center of Mars. What they found was that at the temperatures expected in the Martian core (~1500 K, or 1227 °C; 2240 °F), the inner core would be liquid, but some solidification would occur in the outer core. This is quite different from Earth's core, where the solidification of the inner core releases heat that keeps the outer core molten, thus creating the dynamo effect that powers our magnetic field.

The absence of a solid inner core on Mars would mean that the once-liquid outer core must have had a different energy source. That heat source has since failed, shutting down convection in the outer core and arresting any dynamo effect. However, their research also showed that planetary cooling could lead to core solidification in the future, either through iron-rich solids sinking towards the center or iron sulfides crystallizing in the core. In other words, an inner core might begin to crystallize on Mars someday, releasing heat that would keep the outer core molten and convecting.
Combined with the planet's own rotation, this would generate the dynamo effect that would once again fire up the planet's magnetic field. If this is true, then colonizing Mars and living there safely could be a simple matter of waiting for the core to crystallize.

There's no way around it. At present, the radiation on the surface of Mars is pretty hazardous! Therefore, any crewed missions to the planet in the future will need to take radiation shielding and counter-measures into account. And any long-term habitats there – at least for the foreseeable future – are going to have to be built into the ground, or hardened against solar and cosmic rays. But you know what they say about necessity being the mother of invention, right? And with such luminaries as Stephen Hawking saying that we need to start colonizing other worlds in order to survive as a species, and people like Elon Musk and Bas Lansdorp looking to make it happen, we're sure to see some very inventive solutions in the coming generations!
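To get a feel for the scale of the superconducting-ring idea mentioned above, the field at the centre of a single circular current loop is B = μ0·I/(2R). This is purely an order-of-magnitude illustration with assumed numbers (a single Mars-girdling loop and an Earth-like ~50 μT target field); the NIFS study considered multiple rings and a different geometry.

```python
import math

# Order-of-magnitude illustration (assumptions, not figures from the NIFS
# study): current needed in a single planet-encircling loop to produce an
# Earth-like field at its centre, from B = mu0 * I / (2 * R).
mu0 = 4 * math.pi * 1e-7       # vacuum permeability, T*m/A
R_mars = 3.39e6                # Mars equatorial radius, m
B_target = 50e-6               # assumed Earth-like target field, T

I = 2 * R_mars * B_target / mu0
print(f"required ring current = {I:.1e} A")   # ~2.7e+08 A
```

Hundreds of megaamperes in a loop thousands of kilometres across: the arithmetic makes clear why such proposals remain firmly in the realm of speculative megaengineering.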
News Article | September 12, 2016
The team will present their work during the Frontiers in Optics (FiO) / Laser Science (LS) conference in Rochester, New York, USA on 17-21 October 2016. "Our target is the best-tested theory there is: quantum electrodynamics," said Kjeld Eikema, a physicist at Vrije University, The Netherlands, who led the team that built the laser.

Quantum electrodynamics, or QED, was developed in the 1940s to make sense of small unexplained deviations in the measured structure of atomic hydrogen. The theory describes how light and matter interact, including the effect of ghostly 'virtual particles.' Its predictions have been rigorously tested and are remarkably accurate, but like extremely dedicated quality control officers, physicists keep ordering new tests, hoping to find new insights lurking in the experimentally hard-to-reach regions where the theory may yet break down.

A promising tool for the next generation of tests is the new high-intensity laser. It produces pulses of deep ultraviolet light with energies large enough to bump electrons in some of the simplest atoms and molecules into a higher energy level. "For increased precision, you have to do these QED tests in the most simple atoms and molecules," Eikema explained. The team has already tested the laser on molecular hydrogen. They measured the frequency of light required to excite a certain electron transition with a preliminary uncertainty of less than one part per 100 billion, more than 100 times better than previous measurements.

The Challenge of Ultra-Precise Measurements in the UV

The key challenge for the team wasn't really producing the deep UV light – a feat that has been accomplished before – but finding a way to keep the measurements precise. Short pulses, which are easier to produce for UV light, yield inherently uncertain frequency measurements, a consequence of the Heisenberg uncertainty principle.
One way around this is to use a technique called Ramsey interferometry, which requires two pulses of light separated by an incredibly precise period of time. What Eikema and his colleagues did, which had never been done before, was to extract the two pulses from a frequency comb laser, a device uniquely suited to creating precisely timed pulses. "People normally think that if you take just two pulses out of a frequency comb then you destroy the beauty of a frequency comb, but we do it in a special way," Eikema said.

Extracting and amplifying the pulses introduced uncertainties, but the team found that if they hit an atom or molecule with differently spaced pulse pairs and then analyzed the results simultaneously, the uncertainties in effect canceled out. Even better, the approach also canceled out an unwanted effect called the AC-Stark effect, which arises when the high-intensity light used for measurement actually changes the structure of an atom or molecule. "Using this method we actually restore all the properties of the frequency comb, and we also get exciting new properties," Eikema said. "This was our eureka moment."

The team's next goal is to use their laser to measure the first electron transition energy of a positively charged helium atom, called He+. He+ is one of the "holy grails" for testing QED, Eikema said, because the properties of the nucleus have been extensively studied, it can be trapped with electromagnetic fields and observed for a very long time, and the QED effects are larger in helium than in hydrogen. "If it's possible to measure this transition in He+, people will immediately do it, because it's a very nice, clean transition," he said.

A test of QED in He+ might also help resolve the proton radius problem, a new puzzle gripping the physics community after complementary tests turned up conflicting measurements of the proton's size.
The discrepancy could be due to a problem with QED theory, and so a better test would help scientists see whether or not the theory still holds at this unprecedented new level of precision. Going from molecular hydrogen to He+ is still an enormous jump, Eikema said, since the wavelength of light required is almost ten times shorter. If all goes according to plan, he estimates the team may have results to report in about two years. "I went to a conference about the proton size problem and explained how we want to measure this transition of He+. Everyone was asking 'When? When? When?' They really want to know," Eikema said.

Sandrine Galtier, a postdoctoral researcher at Vrije University who will present the team's findings at the FiO meeting, says it's exciting how well their new laser system can test the extreme limits of theoretical physics. "We don't need huge accelerators. With just a tabletop experiment, we can test the Standard Model of physics," she said.

More information: The presentation (FTu5C.6), "Testing QED with Ramsey-Comb spectroscopy in the deep-UV range," by Sandrine Galtier will take place from 04:00 - 06:00, Tuesday, 18 October 2016, at the Radisson Hotel, Grand Ballroom C, Rochester, New York, USA.
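The two-pulse Ramsey scheme at the heart of this work can be sketched with a textbook toy model (an idealised illustration, not the team's actual analysis): for a laser detuned from the transition by δ, the excitation probability after two short pulses separated by time T oscillates as P(T) = ½(1 + cos 2πδT). Scanning the pulse separation, as the comb-derived pulse pairs do, traces out fringes whose period reveals δ, and hence the transition frequency.

```python
import math

# Idealised two-pulse Ramsey fringe model (textbook form, illustrative
# detuning value; ignores pulse shape, amplification and Stark effects).
def ramsey_probability(delta_hz, t_s):
    """Excitation probability after two pulses separated by t_s seconds,
    for a laser detuned by delta_hz from the atomic transition."""
    return 0.5 * (1 + math.cos(2 * math.pi * delta_hz * t_s))

delta = 100.0  # assumed detuning of 100 Hz, purely for illustration
for t_ms in (0.0, 2.5, 5.0, 7.5, 10.0):
    p = ramsey_probability(delta, t_ms * 1e-3)
    print(f"T = {t_ms:4.1f} ms  ->  P = {p:.2f}")
# One full fringe per 1/delta = 10 ms of pulse separation.
```

The fringe period of 1/δ is what makes the method so sensitive: the longer and more precisely timed the pulse separation, the finer the frequency resolution.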
News Article | September 19, 2016
“The heat made people crazy. They woke from their damp bed sheets and went in search of a glass of water, surprised to find that when their vision cleared, they were holding instead the gun they kept hidden in the bookcase.”

This passage, from Summer Island, a romance novel by Kristin Hannah, is how researchers introduce a potentially important new study they believe could alter people's attitudes about the impact of unrelenting heat on violence, and why some parts of the world experience strikingly higher rates of violence than others. It's not what people think. The new research goes beyond existing ideas about how hot summer nights cause tempers to flare and prompt sporadic acts of violence. Their model explores long-term cultural changes resulting from persistently high temperatures and a lack of seasonal variability, among them a loss of self-control and future-oriented goals. This combination can lead to more aggression and violence, they say.

“People think about weather when they think about global warming, but don't realize that climate change can increase aggression and violence,” says Brad Bushman, professor of communication and psychology at The Ohio State University and one of the study's authors. “But climate change affects how we relate to other people.” Moreover, he predicts that unmitigated global warming could increase violence levels in the United States, something he believes deserves immediate attention.

Bushman, with colleagues Paul Van Lange, professor of psychology at Vrije University in Amsterdam (VU), and research assistant Maria Rinderu, also of VU, says their model, which they call CLASH (for CLimate Aggression and Self-Control in Humans) and which was recently published in the journal Behavioral and Brain Sciences, could explain why violence is greater in countries closer to the Equator and in the Southern regions of the United States, and less so in the American North and in areas farther from the Equator.
People living in such climates are more tuned into the present – the here and now – and are less likely to plan for the future, they theorize. They are less strict about time, less stringent about birth control, and have children earlier and more often, Bushman says.

“If you live farther away from the Equator, you have to exercise more self-control,” Bushman says. “You can't just eat all your crops, because you then won't have anything left to eat in the winter. But if you live closer to the Equator, those mango trees will grow mangoes year-round.” This scenario encourages a state of mind and lack of self-control that affects how people treat each other, according to Bushman. “Climate shapes how people live, and affects the culture in ways that we don't think about in our daily lives.” Such a faster life strategy “can lead people to react more quickly with aggression and sometimes violence,” Bushman adds.

Until recently only two models helped explain why violence and aggression are higher in hotter climates. The first, the General Aggression Model – which Bushman helped develop – holds that hot temperatures make people feel uncomfortable and irritated, causing them to become more aggressive. The second, known as the Routine Activity Theory, suggests that people go outside and interact with each other more when the weather is warm, thus providing more opportunities for conflict. But that doesn't explain why there is more violence when the temperature is 95 degrees F (35 degrees C) than when it is 75 degrees F (24 degrees C), even though people are likely to go outside under both conditions.

To be sure, “our ability to cope with irritation and frustration may be less strong on hot days,” says Van Lange, the study's lead author. “But this would be only part of the story. We thought it is not only average temperature that might matter, but also seasonal variation in temperature.
The latter is predictable and may lead cultures that face seasonal variation to develop stronger norms and habits, and to adopt longer-term planning and self-control – that is, to forgo immediate benefit for longer-term benefit.”

These two factors, average temperature and predictable seasonal variation, may help experts better understand aggression, as “the psychology literature has revealed that self-control is one of the strongest predictors of aggression and violence,” Van Lange adds.

It also may explain why crime is higher in the American South, Bushman says. “Violent crime rates have always been higher in the South,” he says. “You see different life strategies in the North and the South. People seem to plan more for the future in the North. But we predict that if climate change continues, with less seasonal variability in the North, you will see violent crime rates increase there, too.”

What about climate's influence on war? “War is usually less impulsive, less the result of lack of self-control, and more planned and premeditated,” Bushman says. “However, the model could be applicable to a leader inclined to respond impulsively,” he says.

The scientists have called for more research, and note that they are not suggesting people in hotter climates can't help themselves when it comes to violence. However, they stress that it is important to recognize that culture is strongly affected by climate. “Climate doesn't make a person, but it is one part of what influences each of us,” Van Lange says.
Nordin S., Linköping University |
Carlbring P., Umeå University |
Cuijpers P., Vrije University |
Andersson G., Karolinska Institutet
Behavior Therapy | Year: 2010
Cognitive behavioral bibliotherapy for panic disorder has been found to be less effective without therapist support. In this study, participants were randomized to either unassisted bibliotherapy (n=20) with a scheduled follow-up telephone interview or to a waiting list control group (n=19). Following a structured psychiatric interview, participants in the treatment group were sent a self-help book consisting of 10 chapters based on cognitive behavioral strategies for the treatment of panic disorder. No therapist contact of any kind was provided during the treatment phase, which lasted for 10 weeks. Results showed that the treatment group had, in comparison to the control group, improved on all outcome measures at posttreatment and at 3-month follow-up. The tentative conclusion drawn from these results is that pure bibliotherapy with a clear deadline can be effective for people suffering from panic disorder with or without agoraphobia. © 2010.
News Article | December 12, 2016
In his 49 years, Zablon Katende had never thought of leaving his hometown of Kipini in coastal Kenya. But now, looking at his dwindling mango trees, the farmer worries the harvest will not be enough to provide for his five children. “Every year there is less water,” he says, pointing at the murky Tana river which washes the shores of his village.

Despite being Kenya's longest river, the Tana is struggling to keep up with the country's ever-growing demand for water and electricity. It is the backbone of the country's economy, providing up to 80% of Nairobi's water and half the country's electricity through hydroelectric plants. Its water also irrigates thousands of hectares of cash crops such as tea, coffee and rice. However, erosion, pollution and excessive water capture are threatening the livelihoods of many who, like Katende, depend on the river. The government is currently planning to divert even more of the Tana's water for irrigation and power, but a study by Wetlands International and the Vrije University in Amsterdam warns this management model is not ecologically sustainable.

Despite concerns, Kenya's government wants to use more of the Tana river's resources to ensure economic prosperity for the country's fast-growing population. Known as Vision 2030, the plan includes 1m acres of monocultures, a 3km-long dam and a £28bn transportation corridor including a new port city in Lamu, near the Tana delta. Experts, however, warn the river's resources are not unlimited. “Ignoring nature has a price,” says Julie Mulonga, programme manager of Wetlands International in Kenya. According to Mulonga, the government's water management style focuses on the short-term benefit of industries around the capital, such as flower farms and breweries, and disregards the needs of people and animals downstream. The consequences are already being felt, especially in the Tana's delta, where most locals live off fishing, raising cattle and growing subsistence crops.
Without enough water, fish cannot breed, crops fail and animals are too emaciated to sell. “Without the river, nothing lives,” says Katende, who worries that the construction of another dam will mean even less water for his mango trees.

Tourism is suffering, too. The Tana's delta is a wildlife refuge for hundreds of species, from hippos to monkeys. But water scarcity increases deforestation and animal poaching. What's more, local authorities worry that competition over water will lead to violent clashes between pastoralist and farming tribes, which in 2012 resulted in 50 deaths and forced several hotels to close.

The Kenyan government rejects the suggestion that its plans are putting strain on the environment, communities and businesses that rely on the river. “There is no need to compete over water because all economic activities on the river are complementary,” says Robinson Gaita, director of irrigation and water storage at the Ministry of Water and Irrigation. Gaita is overseeing the development of a new 10,000-acre maize farm near the middle section of the Tana, which he says is already improving food security. The government recently donated 62,000 bags of maize from this plantation to communities suffering from drought in the river's delta. As for the colossal dam, Gaita says it will actually help downstream farmers like Katende because it will give the state the ability to prevent excessive flooding and increase the availability of water in case of drought – both of which are happening more frequently because of climate change.

Private businesses could have a big role to play in the Tana's conservation. Some of the country's largest companies, including Coca-Cola and East African Breweries, have joined the Nairobi Water Fund, a scheme which aims to raise £8m to help preserve the Tana's ecosystems by planting trees and teaching farmers better soil-management practices.
Nushin Ghassmi, communications manager for Frigoken, Kenya's largest vegetable processing company, says working with the fund is important because “preserving our natural resources is crucial for our business survival”. Coca-Cola estimates the annual water treatment and filtration costs for its Nairobi bottling plant at more than $1m.

Yet even with increased corporate responsibility, the Tana will continue to deteriorate if the government does not scale down its ambitious infrastructure projects, warns Pieter van Beukering, director of the Institute for Environmental Studies at Vrije University. If the economic benefits are not shared equally along the river, this could also increase upstream migration. “Money follows water. And people follow money,” says Beukering. Many of Katende's neighbours have already left Kipini looking for greener pastures for their cattle or cleaner waters for their nets. “But I'm a farmer,” says Katende, “I can't abandon my land.” Instead he has joined a local conservation group to help raise awareness about the importance of preserving the Tana. Despite this year's failing crop, he is hopeful. “We will find a way to give water to everybody,” he says. “We have to.”
Jager T., Vrije University |
Philosophical Transactions of the Royal Society B: Biological Sciences | Year: 2010
The interest of environmental management is in the long-term health of populations and ecosystems. However, toxicity is usually assessed in short-term experiments with individuals. Modelling based on dynamic energy budget (DEB) theory aids the extraction of mechanistic information from the data, which in turn supports educated extrapolation to the population level. To illustrate the use of DEB models in this extrapolation, we analyse a dataset for life cycle toxicity of copper in the earthworm Dendrobaena octaedra. We compare four approaches for the analysis of the toxicity data: No model, a simple DEB model without reserves and maturation (the Kooijman-Metz formulation), a more complex one with static reserves and simplified maturation (as used in the DEBtox software) and a full-scale DEB model (DEB3) with explicit calculation of reserves and maturation. For the population prediction, we compare two simple demographic approaches (discrete time matrix model and continuous time Euler-Lotka equation). In our case, the difference between DEB approaches and population models turned out to be small. However, differences between DEB models increased when extrapolating to more field-relevant conditions. The DEB3 model allows for a completely consistent assessment of toxic effects and therefore greater confidence in extrapolating, but poses greater demands on the available data. © 2010 The Royal Society.
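The two demographic approaches named in the abstract can be sketched with a toy example (the demographic numbers below are made up for illustration; they are not from the earthworm dataset). The discrete-time growth rate λ is the dominant eigenvalue of a Leslie matrix of age-class fecundities and survival probabilities; the continuous-time intrinsic rate of increase r solves the Euler-Lotka equation 1 = Σ l(a)·m(a)·e^(−ra). For matching schedules the two agree, with λ = e^r.

```python
import math

# Toy comparison of the two population-level approaches (illustrative
# demographic rates, not the copper-exposed earthworm data).
fecundity = [0.0, 1.5, 2.0]   # offspring per individual in age classes 1-3
survival = [0.6, 0.5]         # survival probability, class 1->2 and 2->3

def leslie_lambda(f, p, iters=200):
    """Dominant eigenvalue of the Leslie matrix, by power iteration."""
    n = [1.0] * len(f)                      # arbitrary starting age structure
    lam = 1.0
    for _ in range(iters):
        births = sum(fi * ni for fi, ni in zip(f, n))
        new = [births] + [p[i] * n[i] for i in range(len(p))]
        lam = sum(new) / sum(n)             # converges to the growth rate
        n = new
    return lam

def euler_lotka_r(f, p, lo=-2.0, hi=2.0):
    """Solve 1 = sum_a l(a) m(a) exp(-r a) for r by bisection.

    Reproduction by age class x is credited at age a = x + 1, matching
    the Leslie-matrix bookkeeping above.
    """
    l = [1.0]                               # survivorship to each age class
    for pi in p:
        l.append(l[-1] * pi)
    def g(r):
        return sum(l[x] * f[x] * math.exp(-r * (x + 1))
                   for x in range(len(f))) - 1.0
    for _ in range(100):                    # g(r) is decreasing in r
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if g(mid) > 0 else (lo, mid)
    return (lo + hi) / 2

lam = leslie_lambda(fecundity, survival)
r = euler_lotka_r(fecundity, survival)
print(f"lambda = {lam:.4f}, exp(r) = {math.exp(r):.4f}")  # the two agree
```

In a real application of the kind the paper describes, the fecundity and survival entries would themselves come from a DEB model fitted to the toxicity data, so that toxicant effects on growth and reproduction propagate through to λ or r.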
News Article | September 20, 2016
An experiment with fake frogs shows how certain bats adjust their hunting technique to compensate for unnatural noises. Humankind is loud, and research already suggests that birds alter their singing in urban noise. Now tests show that bats listening for the frogs they hunt switch from mostly quiet eavesdropping to active echolocation when artificial sounds mask the frog calls. That way, the bats can detect the motion of the frogs' vocal sac poofing out with each call, researchers report in the Sept. 16 Science.

That switch in sensory tactics could make bats the only animals besides people shown to react this way to interfering din in the classic cocktail party scenario, says study coauthor Wouter Halfwerk of Vrije University Amsterdam. People straining to hear each other over the cacophony of a party can get a boost in communication by paying attention to each other's mouth movements. He points out that watching someone's lips lets people tolerate about an extra 20 decibels of tipsy shrieking and shouting.

Conservation biologists worry about the effects of human racket on other residents of the planet. Other researchers, for instance, found that noise interfered with pallid bats' success in hunting insects on the wing. The fringe-lipped bats (Trachops cirrhosus) in the new study, however, specialize in frogs instead of insects. Hungry bats listen to frog choruses and swoop out of the darkness to carry off a male chirping his advertisements for a mate. “A talking pickle” is what Halfwerk calls the frog.

Researchers tested 12 wild-caught bats in outdoor flight cages in Panama. Bats perching (upside down, of course) in cages were perfectly willing to make a grab at robotic frogs deployed in the cages. The robofrogs, modeled by an artist on the túngara species the bats naturally hunt, sit motionless but can on command start inflating a specially constructed balloon in time with broadcast calls. In this setup, interfering noise changed normal hunting.
When researchers broadcast sounds that partially masked the main frequency of telltale frog calls, bats waited longer than normal to strike and also strongly preferred pouncing on a robofrog that was inflating his sac instead of an identical frog squatting nearby with a deflated sac. Recordings of bat noises from the perch showed that the hunters were pinging fast echolocation sounds instead of mostly listening for the pickle to betray its location.

Even with the strategy switch, the bats aren’t completely making up for the noise nuisance, Jinhong Luo at Johns Hopkins University points out. A sensory biologist, he has tested noise effects on other bats but was not involved in this project. Looking at the new data, he notes that frog-eating bats in echolocating mode are slower to leave their perches and swoop than bats in eavesdropping mode. He also cautions about generalizing to the other 1,300-plus bat species. Many of them are already using echolocation to hunt insects and may not have a backup prey-finder method when noise complicates their foraging.
News Article | November 28, 2016
Water is not just vital to life on Earth – it turns out that it may have been a crucial ingredient of the primordial body that split apart 4.5 billion years ago to become Earth and the moon. The latest evidence for this, from lab simulations of how minerals formed in the early moon, may settle a long-running debate about whether the early moon and Earth contained water from the outset, or whether it arrived later through collisions with water-bearing comets or asteroids.

“Our study shows that water was there at the time the moon formed, and because that happened soon after the formation of Earth, it shows water was present well before any later addition via comets or asteroids,” says Wim van Westrenen at Vrije University in Amsterdam, the Netherlands, who co-led the team. “We show that the moon, in its initial hot stage, contained a lot of water – at least as much as, and likely more than, the amount we have on Earth today.”

Water has been detected in samples from the moon before, but only in young rock from the surface, which does not tell us whether it was there from the beginning or brought by asteroids.

To investigate the role of water in the early moon’s formation, van Westrenen and his colleagues made small-scale lab mixtures weighing just 10 milligrams, but containing all the basic ingredients from which the moon originated. Specifically, this mimics the components that gave rise to the lunar magma ocean, the initial liquefied mass that gradually cooled and solidified to form the moon. “The main constituents are silicon and oxygen, with a sprinkling of magnesium, calcium, iron, titanium and aluminium,” says van Westrenen. The recipe reflects that revealed by seismic data collected from the moon’s surface by instruments left there by Apollo astronauts.
Next, van Westrenen’s team simulated the moon’s evolving geology by subjecting the mixture to temperatures and pressures that matched those on the early moon, taking advantage of laboratory apparatus also used to create synthetic diamonds. They did this both with and without water to see whether this affected the type and amount of rocks formed.

The team found that only when water was included in the mix, at levels of just 0.5 to 1 per cent by weight, did the types and amounts of rock formed match those that have been detected or measured on the moon. Most importantly, the water-based mixture generated a layer of plagioclase – the dominant component of the crust – that when extrapolated to the moon would be around 34 to 43 kilometres thick. This tallies with the average thickness reported in 2013 based on data from satellites orbiting the moon. When the mixture was dry, the plagioclase layer ended up twice as thick, at 68 kilometres. This suggests that the existing make-up of the moon’s geology could only have evolved if water was there at the outset.

The latest research adds weight to arguments that Earth and the moon had water from the outset. Others have argued that water arrived later on asteroids or comets that smashed into the primordial planet and moon. Measurements sent back in 2014 from the Rosetta space probe when it visited a comet dealt that theory a blow by showing that the water on the comet had a combination of isotopes that did not match Earth’s.

“This is yet another indication that the moon may have initially been water-rich, with important implications both for our models of lunar origin, and for the possibility there are still water-rich reservoirs on the moon today,” says Robin Canup, who studies the origins of planetary bodies at the Southwest Research Institute in Boulder, Colorado.
“This work is going to force us to think about how the material that formed the moon managed to take some of Earth’s water along with it,” says Steve Jacobsen of Northwestern University in Evanston, Illinois. This month, he reported evidence for the deepest water yet discovered on Earth, at 1000 kilometres down, or a third of the way to the edge of the core.
News Article | October 23, 2015
In your Lindau lecture this year you talked about genetically modified organisms (GMOs). Are people right to worry about them?

Frankly, they are not. We have been genetically modifying everything we eat for more than 5,000 years. We have been improving plants by 'natural' breeding since the origin of agriculture. When we breed plants, we make hybrids – and typically move hundreds of genes from one plant to another. You don't know what those genes are. You don't know where they go. And you don't know how these genes are influenced by moving them.

Genetic engineering is just a better way of doing what we have been doing for the past 5,000 years. The argument that inserting bacterial genes into plants is a break with the past is invalid because, to pick an example, there is very good evidence that the sweet potato genome contains bacterial genes. It doesn't make sense to think that new methods of altering plant genomes will be inherently dangerous. Genes are genes; it is what they do that matters. We need to test whether the products are safe, not worry about the process of creating them. This argument extends to the potential ecosystem effects of GMOs. I do worry about ecosystems, but there is no special risk to them from plants created using these new methods.

One of your main interests is microbes – indeed you gave a lecture about why we should love them at Lindau last year. Why did you feel this was necessary?

The vast majority of the microbes that live with us are good. But bacteria have a bad reputation because science has focused on the ones that cause disease. Biologists are finally starting to realize that by manipulating and controlling microorganisms, we can probably do more for human health than by any other means. The nice thing about this kind of medicine is that it would be cheap. We should explore all sorts of ways to make bacteria more beneficial, including genetic engineering.
If you can cure disease by manipulating the microbiome, that is going to save a lot of money and will probably also teach us how to live better. I love bacteria.

Has biotechnology focused too much on the health of the human host without considering its microbial colonizers?

I absolutely think we have gone overboard in studying humans as humans. We need to study good bacteria in the context of their human ecosystems. Until recently, microbiologists did almost no work on good bacteria, which means that these organisms are under-appreciated even though they are an incredibly important part of us. That is a big mistake. The average human contains two to five pounds of bacteria! They provide protection against pathogens and prime our immune systems. If I were to kill all the bacteria that live in or on you, you would probably die. It is as simple as that. We know this because bacteria-free individuals of other species die young.

Why are you so passionate in your support of GM food?

I feel that scientists need to provide more legitimacy to GMOs. A lot of people cannot grasp the nuances of the relevant science, but respect and listen when prominent scientists – particularly Nobel laureates – speak up. I want to make sure the general public receives the benefits of GM food, but also understands its limitations. The fabrications that the anti-GMO people have used to scare the population worry me very much. I would really like to convince green parties of the benefits of GMO. In general, I support green parties. I think they just made a mistake in opposing GM foods – and they did it not because they were against genetic modification per se, but because they were afraid that multinationals were going to take over the food supply.

New techniques are making gene technology available to much smaller organizations than ever before. If what the anti-GMO lobby really cares about is multinationals taking over, might these techniques increase acceptance of GMOs?
The way to think about this is to consider evolution as a very slow process. Plants might eventually adapt to global warming, but if they don't adapt fast enough we won't have enough to eat. Genetic modification is a fast way of doing things. If we do not interfere and 'help' evolution where we can, an awful lot of people are going to die unnecessarily, particularly in the developing world. There are opportunities to really get something done here, and there are strong moral arguments. And there is no reason why small companies or non-governmental organizations cannot make a big impact and significantly help the developing world.

As well as the new GMO initiative, you also signed the Mainau Declaration on climate change and campaigned in China for the release of Nobel peace laureate Liu Xiaobo. Do you consider it a responsibility to use your Nobel laureate status for the public good?

A Nobel prize is something rather special. Almost all of the laureates here in Lindau were awarded a Nobel prize because we were lucky. It is not that we are super smart or better than anybody else, but because we made a serendipitous discovery along the way. For whatever reason, when you win a Nobel prize people listen to you who never listened before. That means two things. The first is that you should use the opportunity to do good in the world, if you can. The second is that you should also be careful about what you say because you might not always be right.

There are plenty of issues in which Nobel laureates could have been helpful, but they were rarely politically organized in the past. We tried to get Aung San Suu Kyi released from house arrest in Myanmar. Even though that was not successful, it showed that we laureates can come together – 225 of us signed letters that were sent to the Chinese and Burmese governments.

What is the future of the Nobel prizes in the era of big collaborative science, in the light of projects such as ENCODE, the Encyclopedia of DNA Elements?
Many of the major steps forward in biology have been made by individuals or small groups of individuals. Our knowledge of biology is so limited that we are still at the starting point of understanding how organisms work, and there are still terrific roles for individuals. But, in general, I am not sure science prizes are a particularly good thing. They are wonderful for the people who win them, and can be terrible for those who don't. I think they end up causing rather a lot of heartbreak.

Gijsbert Werner is a PhD student at Vrije University Amsterdam, the Netherlands, where he studies the evolution of plant-microbe mutualisms.