Science Institute

Puebla de Zaragoza, Mexico


News Article | April 25, 2017
Site: www.sciencedaily.com

Neuroscientists have long noted that if certain brain cells are destroyed by, say, a stroke, new circuits may be laid down in another location to compensate, essentially rewiring the brain. Northeastern's William R. Hobbs, an expert in computational social science, wanted to know whether social networks respond similarly after the death of a close mutual friend. In new research published Monday in the journal Nature Human Behaviour, Hobbs found that they do, demonstrating a form of social network resilience.

Hobbs, who led the study, collaborated with Facebook data scientist Moira Burke. The researchers found that close friends of the deceased immediately increased their interactions with one another by 30 percent, the point at which interaction volume peaked. The interactions faded a bit in the following months and ultimately stabilized at the same volume as before the death, even two years after the loss. This insight into how social networks adapt to significant losses could lead to new ways to help people through the grieving process, ensuring that their networks recover rather than collapse during these difficult times.

"Most people don't have very many friends, so when we lose one, that leaves a hole in our networks as well as in our lives," says Hobbs, a postdoctoral research fellow in the lab of David Lazer, Distinguished Professor of Political Science and Computer and Information Science. He wondered: Would a social network unravel with a central member gone? If it recovered, how might it heal?

"We expected to see a spike in interactions among close friends immediately after the loss, corresponding with the acute grieving period," says Hobbs. "What surprised us was that the stronger ties continued for years. People made up for the loss of interacting with the friend who had died by increasing interactions with one another."

Hobbs came to the study from a crisis of his own. After college, he lived and worked in China studying local governments. But when he entered graduate school at the University of California, San Diego, his father was dying. "So I switched to American politics, then to studying chronic illnesses, and then to the effect of deaths on others," he says.

That switch led to this first large-scale investigation of recovery and resilience after a death in social networks. It has the potential to reveal a great deal about ourselves, says Lazer, who is also a core faculty member in the Network Science Institute at Northeastern. "Death is a tear in the fabric of the social network that binds us together," he says. "This research provides insight into how our networks heal from this tear over time, and points to the ways that our digital traces can offer important clues into how we help each other through the grieving process."

Using automated interaction counts and computational analysis, the researchers compared monthly interactions -- wall posts, comments, and photo tags -- of approximately 15,000 Facebook networks that had experienced the death of a friend with monthly interactions of approximately 30,000 similar Facebook networks that had not. The first group comprised more than 770,000 people; the second, more than 2 million. They learned about the deaths from California state vital records, and characterized "close friends" as those who had interacted with the person who died before the study began. To maintain the users' privacy, the data was aggregated and "de-identified" -- that is, all elements that associated the data with individuals were removed.

"The response was different from what other researchers have found regarding natural disasters or other kinds of trauma," says Hobbs. "There you see a spike in communications, but that disappears quickly afterward."

In particular, the researchers found that networks of young adults, ages 18 to 24, showed the strongest recovery. Not only were they more likely to recover than other networks; their interaction levels also stayed elevated -- higher than before the loss. Networks experiencing suicides, on the other hand, showed the least recovery. Further research is necessary to understand why, says Hobbs.

"We didn't study the subjective experience of loss, or how people feel," cautions Hobbs. "We looked at recovery only in terms of connectivity. We also can't say for certain whether the results translate into closer friendships offline." What the results do show is that online social networks appear to function as a safety net. "They do so quickly, and the effect persists," he says. "There are so few studies on the effect of the death of a friend on a network. This is a big step forward."
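The study's core comparison, networks that lost a member tracked month by month against matched control networks, can be sketched as below. The data, the function name, and the 30 percent spike that decays back to baseline are simulated stand-ins for illustration, not the study's actual measurements or code.

```python
import numpy as np

def relative_interaction_change(treated, control):
    """Percent change of the treated group's mean monthly interaction
    volume relative to the control baseline, month by month. Both
    arrays are shaped (n_networks, n_months), aligned on calendar
    months."""
    treated_mean = treated.mean(axis=0)
    control_mean = control.mean(axis=0)
    return 100.0 * (treated_mean - control_mean) / control_mean

# Toy data: 5 "treated" networks spike ~30% in month 3 (the loss),
# then decay back toward the control baseline over following months.
rng = np.random.default_rng(0)
months = 12
baseline = 100 + rng.normal(0, 1, size=(5, months))
spike = np.zeros(months)
spike[3:] = 30 * np.exp(-0.5 * np.arange(months - 3))
treated = baseline + spike
control = 100 + rng.normal(0, 1, size=(5, months))

change = relative_interaction_change(treated, control)
print(np.round(change[3], 1))   # roughly +30 right after the loss
print(np.round(change[-1], 1))  # near 0: back to pre-loss volume
```

The same kind of treated-versus-matched-control curve is what lets the researchers separate grief-driven changes in interaction from ordinary seasonal drift in Facebook activity.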


News Article | May 18, 2017
Site: phys.org

The combined power of three space observatories, including NASA's Hubble Space Telescope, has helped astronomers uncover a moon orbiting the third largest dwarf planet, catalogued as 2007 OR10. The pair resides in the frigid outskirts of our solar system called the Kuiper Belt, a realm of icy debris left over from our solar system's formation 4.6 billion years ago.

With this discovery, most of the known dwarf planets in the Kuiper Belt larger than 600 miles across have companions. These bodies provide insight into how moons formed in the young solar system. "The discovery of satellites around all of the known large dwarf planets - except for Sedna - means that at the time these bodies formed billions of years ago, collisions must have been more frequent, and that's a constraint on the formation models," said Csaba Kiss of the Konkoly Observatory in Budapest, Hungary. He is the lead author of the science paper announcing the moon's discovery. "If there were frequent collisions, then it was quite easy to form these satellites."

The objects most likely slammed into each other more often because they inhabited a crowded region. "There must have been a fairly high density of objects, and some of them were massive bodies that were perturbing the orbits of smaller bodies," said team member John Stansberry of the Space Telescope Science Institute in Baltimore, Maryland. "This gravitational stirring may have nudged the bodies out of their orbits and increased their relative velocities, which may have resulted in collisions."

But the speed of the colliding objects could not have been too fast or too slow, according to the astronomers. If the impact velocity was too fast, the smash-up would have created lots of debris that could have escaped from the system; too slow, and the collision would have produced only an impact crater. Collisions in the asteroid belt, for example, are destructive because objects are traveling fast when they smash together. The asteroid belt is a region of rocky debris between the orbits of Mars and the gas giant Jupiter. Jupiter's powerful gravity speeds up the orbits of asteroids, generating violent impacts.

The team uncovered the moon in archival images of 2007 OR10 taken by Hubble's Wide Field Camera 3. Observations of the dwarf planet by NASA's Kepler Space Telescope first tipped off the astronomers to the possibility of a moon circling it. Kepler revealed that 2007 OR10 has a slow rotation period of 45 hours. "Typical rotation periods for Kuiper Belt objects are under 24 hours," Kiss said. "We looked in the Hubble archive because the slower rotation period could have been caused by the gravitational tug of a moon. The initial investigator missed the moon in the Hubble images because it is very faint."

The astronomers spotted the moon in two separate Hubble observations spaced a year apart. The images show that the moon is gravitationally bound to 2007 OR10 because it moves with the dwarf planet, as seen against a background of stars. However, the two observations did not provide enough information for the astronomers to determine an orbit. "Ironically, because we don't know the orbit, the link between the satellite and the slow rotation rate is unclear," Stansberry said.

The astronomers calculated the diameters of both objects based on observations in far-infrared light by the Herschel Space Observatory, which measured the thermal emission of the distant worlds. The dwarf planet is about 950 miles across, and the moon is estimated to be 150 miles to 250 miles in diameter.

2007 OR10, like Pluto, follows an eccentric orbit, but it is currently three times farther from the sun than Pluto is. 2007 OR10 is a member of an exclusive club of nine dwarf planets. Of those bodies, only Pluto and Eris are larger than 2007 OR10. It was discovered in 2007 by astronomers Meg Schwamb, Mike Brown, and David Rabinowitz as part of a survey to search for distant solar system bodies using the Samuel Oschin Telescope at the Palomar Observatory in California.

The team's results appeared in The Astrophysical Journal Letters. The Hubble Space Telescope is a project of international cooperation between NASA and ESA (European Space Agency). NASA's Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy, Inc., in Washington, D.C.

More information: Csaba Kiss et al., "Discovery of a Satellite of the Large Trans-Neptunian Object (225088) 2007 OR10," The Astrophysical Journal Letters, March 20, 2017. iopscience.iop.org/article/10.3847/2041-8213/aa6484 , arxiv.org/abs/1703.01407
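The bound-companion test the article describes, checking that the candidate keeps a nearly fixed offset from the primary while both move together against the background stars, can be sketched as follows. The positions are made-up illustrative numbers, not the study's astrometry.

```python
import numpy as np

# Sky positions (RA/Dec offsets in arcsec, relative to background
# stars) of the dwarf planet and the candidate moon at two epochs
# roughly a year apart. Values are invented for illustration.
primary = np.array([[0.00, 0.00], [4.20, -1.10]])    # epoch 1, epoch 2
candidate = np.array([[0.08, 0.05], [4.27, -1.04]])

# A bound satellite keeps (nearly) the same offset from the primary
# while both drift together across the field.
offsets = candidate - primary
drift = np.linalg.norm(offsets[1] - offsets[0])
motion = np.linalg.norm(primary[1] - primary[0])
print(f"offset change between epochs: {drift:.3f} arcsec")
print(f"primary's own motion:         {motion:.3f} arcsec")
# An offset change far smaller than the primary's own motion is the
# signature of common motion, i.e. a gravitationally bound companion.
```

With only two epochs this establishes that the pair moves together, but, as Stansberry notes in the article, it is not enough to fit an orbit.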


News Article | May 10, 2017
Site: www.cnet.com

There are a lot of crazy-looking nebulae, but a new portrait launches the Crab Nebula into the upper echelons of space strangeness. The Space Telescope Science Institute's Hubble site says the image "captures the complexity of this tortured-looking supernova remnant."

The composite portrait exists thanks to a combination of images and data collected by five different observatories: the Hubble Space Telescope, the Spitzer Space Telescope, XMM-Newton, and the Chandra X-ray Observatory (all located in space), and the ground-based Very Large Array (VLA) in the US.

A neutron star known as the Crab Pulsar sits at the heart of the Crab Nebula. "The nebula's intricate shape is caused by a complex interplay of the pulsar, a fast-moving wind of particles coming from the pulsar, and material originally ejected by the supernova explosion and by the star itself before the explosion," notes the National Radio Astronomy Observatory.

Though it's called the Crab Nebula, the name "Octopus Nebula" might have been more apt, considering its dramatic tendril-like formations. The nebula is located about 6,500 light-years from our planet. This isn't the first portrait of the Crab Nebula, but it is the most vivid. Hubble researchers released a lovely look at the nebula in 2016, as well as a ghostly green view later that same year.
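A multi-observatory composite like this is built by aligning each telescope's intensity map and assigning wavelength bands to color channels. The sketch below uses random arrays as stand-ins for the real aligned images, and the long-wavelengths-to-red mapping is one common convention, not necessarily the one the image's creators chose.

```python
import numpy as np

# Stand-in intensity maps, one per observatory/waveband, already
# aligned on the same pixel grid (random data for illustration).
rng = np.random.default_rng(42)
bands = {name: rng.random((64, 64)) for name in
         ("vla_radio", "spitzer_ir", "hubble_visible",
          "xmm_newton", "chandra_xray")}

def normalize(img):
    """Rescale an intensity map into [0, 1]."""
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo)

# One common convention: long wavelengths -> red, short -> blue.
red = normalize(bands["vla_radio"] + bands["spitzer_ir"])
green = normalize(bands["hubble_visible"])
blue = normalize(bands["xmm_newton"] + bands["chandra_xray"])
composite = np.stack([red, green, blue], axis=-1)  # (H, W, 3) RGB
print(composite.shape)
```

The resulting false-color image encodes which physical processes (synchrotron radio emission, warm dust, X-ray wind) dominate in each region of the nebula.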


News Article | May 9, 2017
Site: phys.org

"This flu is horrendous. Can't breathe, can't sleep or eat. Muscles ache, fever 102. Should have gotten the shot. Time for a movie marathon."

The above tweet looks like 140 characters of misery. But in the hands of Northeastern's Alessandro Vespignani and his colleagues, it is so much more. An international team led by Vespignani has developed a unique computational model to project the spread of the seasonal flu in real time. It uses posts on Twitter in combination with key parameters of each season's epidemic, including the incubation period of the disease, the immunization rate, how many people an individual with the virus can infect, and the viral strains present.

Tested against official influenza surveillance systems, the model has been shown to accurately forecast the disease's evolution up to six weeks in advance—significantly earlier than other models. It will enable public health agencies to plan ahead in allocating medical resources and launching campaigns that encourage individuals to take preventive measures such as vaccination and increased hand washing.

"In the past, we had no knowledge of initial conditions for the flu," says Vespignani, who is also director of the Network Science Institute at Northeastern. The initial conditions—which show where and when an epidemic began, as well as the extent of infection—function as a launching pad for forecasting the spread of any disease. To ascertain those conditions, the researchers incorporated Twitter into their parameter-driven model.

"This kind of integration has never been done before," says Vespignani. "We were not looking for the number of people who were sick because Twitter will not tell you that. What we wanted to know was: Do we have more flu at this point in time in Texas or in New Jersey, in Seattle or in San Francisco? Twitter, which includes GPS locations, is a proxy for that. By looking at how many people were tweeting about their symptoms or how miserable they were because of the flu, we were able to get a relative weight in each of those areas of the U.S."

The paper on the novel model received a coveted Best Paper Honorable Mention award at the 2017 International World Wide Web Conference last month, following its presentation there. It was one of only four papers out of more than 400 presented to be selected for an award.

The researchers' work began when the Centers for Disease Control and Prevention announced the "Predict the Influenza Season Challenge" in November 2013, an invitation to external researchers to advance the science of forecasting infectious diseases. Vespignani and his team have been participating ever since, with the new paper covering their projections for the 2014-15 and 2015-16 flu seasons in the U.S., Italy, and Spain.

Over those time periods, they applied forecasting and other algorithms week by week to the key parameters informed by the Twitter data. "This gave us a large number of possible ways the disease might evolve," says Vespignani. They then matched the resulting simulations with the surveillance data generated by the CDC and with clinical and personal reports of influenza-like illnesses from the three countries. "The surveillance data tells us the ground truth for the past four weeks, but it is always delayed by about one week because you need to get the report from the doctor," he says. By analyzing the evolving dynamics revealed in the past data, they were able to select the model most likely to forecast the future.

The explicit modeling of the disease's parameters—information about the dynamics of the disease itself—set Vespignani's model apart from others in the challenge. For example, the team could identify the week when the epidemic would reach its peak, and the magnitude of that peak, with an accuracy of 70 to 90 percent six weeks in advance of the event.

"By capturing the key parameters, we could track how serious the flu was each year compared with every other year and see what was driving the spread," says first author Qian Zhang, PhD'14, associate research scientist at Northeastern. "That is what the public health agencies and the epidemiologists really care about. We are not just playing a game of numbers, which is what straightforward statistical models do."

While the paper reports results using Twitter data, the researchers note that the model can work with data from many other digital sources as well, including online surveys of individuals such as Influenzanet, which is very popular in Europe.

"Our model is a work in progress," emphasizes Vespignani. "We plan to add new parameters, for example, school and workplace structure. This is not a challenge in the sense that you want to win. This is a science challenge in which you want to learn—to see that there is not a single model but a portfolio of models that will tell us new things."
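The approach the article describes, mechanistic epidemic simulation seeded by region using relative Twitter activity, can be sketched with a toy discrete-time SIR model. Everything here is illustrative: the tweet counts, the transmissibility and recovery parameters, and the SIR simplification all stand in for the team's far richer calibrated model.

```python
import numpy as np

def simulate_sir(s0, i0, beta, gamma, weeks):
    """Minimal discrete-time SIR sketch. `beta` (transmissibility)
    and `gamma` (recovery rate) stand in for the seasonal parameters
    the real model recalibrates each week against surveillance data."""
    s, i, r = s0, i0, 0.0
    curve = []
    for _ in range(weeks):
        new_inf = beta * s * i
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        curve.append(i)
    return curve

# Twitter gives only *relative* flu chatter by region, so tweet
# volume serves as a proxy weight to split a national seed of
# infections across regions (counts are invented).
tweets = {"TX": 5200, "NJ": 1900, "WA": 2400, "CA": 8100}
total = sum(tweets.values())
national_seed = 0.001  # assumed initial infected fraction, illustrative
for region, n in tweets.items():
    i0 = national_seed * (n / total) * len(tweets)
    curve = simulate_sir(1.0 - i0, i0, beta=1.6, gamma=0.7, weeks=30)
    print(region, "projected peak week:", int(np.argmax(curve)))
```

Regions with more flu chatter get a larger initial infected fraction and therefore an earlier projected peak, which is the kind of week-of-peak forecast the article says the team delivered with 70 to 90 percent accuracy.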


News Article | May 11, 2017
Site: www.eurekalert.org

A study combining observations from NASA's Hubble and Spitzer space telescopes reveals that the distant planet HAT-P-26b has a primitive atmosphere composed almost entirely of hydrogen and helium. Located about 437 light-years away, HAT-P-26b orbits a star roughly twice as old as the sun. The analysis is one of the most detailed studies to date of a "warm Neptune," a planet that is Neptune-sized and close to its star.

The researchers determined that HAT-P-26b's atmosphere is relatively clear of clouds and has a strong water signature, although the planet is not a water world. This is the best measurement of water to date on an exoplanet of this size.

The discovery of an atmosphere with this composition on this exoplanet has implications for how scientists think about the birth and development of planetary systems. Compared to Neptune and Uranus, the planets in our solar system with about the same mass, HAT-P-26b likely formed either closer to its host star or later in the development of its planetary system, or both.

"Astronomers have just begun to investigate the atmospheres of these distant Neptune-mass planets, and almost right away, we found an example that goes against the trend in our solar system," said Hannah Wakeford, a postdoctoral researcher at NASA's Goddard Space Flight Center in Greenbelt, Maryland, and lead author of the study published in the May 12, 2017, issue of Science. "This kind of unexpected result is why I really love exploring the atmospheres of alien planets."

To study HAT-P-26b's atmosphere, the researchers used data from transits: occasions when the planet passed in front of its host star. During a transit, a fraction of the starlight gets filtered through the planet's atmosphere, which absorbs some wavelengths of light but not others. By looking at how the signatures of the starlight change as a result of this filtering, researchers can work backward to figure out the chemical composition of the atmosphere. In this case, the team pooled data from four transits measured by Hubble and two seen by Spitzer. Together, those observations covered a wide range of wavelengths, from yellow light through the near-infrared region.

"To have so much information about a warm Neptune is still rare, so analyzing these data sets simultaneously is an achievement in and of itself," said co-author Tiffany Kataria of the Jet Propulsion Laboratory in Pasadena, California.

Because the study provided a precise measurement of water, the researchers were able to use the water signature to estimate HAT-P-26b's metallicity. Astronomers calculate the metallicity, an indication of how rich the planet is in all elements heavier than hydrogen and helium, because it gives them clues about how a planet formed. To compare planets by their metallicities, scientists use the sun as a point of reference, almost like describing how much caffeine beverages have by comparing them to a cup of coffee. Jupiter has a metallicity about 2 to 5 times that of the sun; for Saturn, it's about 10 times. These relatively low values mean that the two gas giants are made almost entirely of hydrogen and helium.

The ice giants Neptune and Uranus are smaller than the gas giants but richer in the heavier elements, with metallicities of about 100 times that of the sun. So, for the four outer planets in our solar system, the trend is that metallicities are lower for the bigger planets. Scientists think this happened because, as the solar system was taking shape, Neptune and Uranus formed in a region toward the outskirts of the enormous disk of dust, gas, and debris that swirled around the immature sun. In a nutshell: Neptune and Uranus would have been bombarded with a lot of icy debris rich in heavier elements, while Jupiter and Saturn, which formed in a warmer part of the disk, would have encountered less of it.

Two planets beyond our solar system also fit this trend. One is the Neptune-mass planet HAT-P-11b. The other is WASP-43b, a gas giant twice as massive as Jupiter. But Wakeford and her colleagues found that HAT-P-26b bucks the trend. They determined its metallicity is only about 4.8 times that of the sun, much closer to the value for Jupiter than for Neptune.

"This analysis shows that there is a lot more diversity in the atmospheres of these exoplanets than we were expecting, which is providing insight into how planets can form and evolve differently than in our solar system," said David K. Sing of the University of Exeter, the second author of the paper. "I would say that has been a theme in the studies of exoplanets: Researchers keep finding surprising diversity."

The Hubble Space Telescope is a project of international cooperation between NASA and ESA (European Space Agency). NASA's Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy, Inc., in Washington, D.C. NASA's Jet Propulsion Laboratory in Pasadena, California, manages the Spitzer Space Telescope for NASA's Science Mission Directorate in Washington. Science operations are conducted at the Spitzer Science Center at Caltech in Pasadena. Spacecraft operations are based at Lockheed Martin Space Systems Company in Littleton, Colorado. Data are archived at the Infrared Science Archive, housed at the Infrared Processing and Analysis Center at Caltech. Caltech manages JPL for NASA.
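The transit-spectroscopy measurement the article describes comes down to a simple geometric fact: the transit depth is the fraction of starlight blocked, (Rp/Rs)^2, and in a wavelength band where the atmosphere absorbs (such as a water feature), the planet's effective radius grows by a few atmospheric scale heights, deepening the transit slightly. The numbers below are assumed, roughly HAT-P-26-like values chosen for illustration, not the paper's fitted parameters.

```python
R_SUN = 6.957e8      # m
R_EARTH = 6.371e6    # m

# Assumed illustrative values (not the study's fitted numbers):
r_star = 0.79 * R_SUN       # a K-dwarf-sized host star
r_planet = 6.3 * R_EARTH    # a Neptune-sized planet
scale_height = 6.0e5        # m; large, as expected for light H/He air

def transit_depth(r_p, r_s):
    """Fraction of starlight blocked during transit."""
    return (r_p / r_s) ** 2

continuum = transit_depth(r_planet, r_star)
# Inside a water absorption band the planet blocks light out to a
# few extra scale heights of atmosphere:
in_band = transit_depth(r_planet + 3 * scale_height, r_star)

print(f"continuum depth:  {continuum * 1e6:.0f} ppm")
print(f"water-band depth: {in_band * 1e6:.0f} ppm")
print(f"feature size:     {(in_band - continuum) * 1e6:.0f} ppm")
```

A hydrogen-helium atmosphere has a large scale height (it is light and puffy), so its spectral features are big and easy to detect; a high-metallicity atmosphere would be heavier and more compact, shrinking the feature. That sensitivity is what lets the water signature constrain HAT-P-26b's metallicity.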


News Article | May 9, 2017
Site: www.eurekalert.org

"This flu is horrendous. Can't breathe, can't sleep or eat. Muscles ache, fever 102. Should have gotten the shot. Time for a movie marathon." The above tweet looks like 140 characters of misery. But in the hands of Northeastern's Alessandro Vespignani and his colleagues, it is so much more. An international team led by Vespignani has developed a unique computational model to project the spread of the seasonal flu in real time. It uses posts on Twitter in combination with key parameters of each season's epidemic, including the incubation period of the disease, the immunization rate, how many people an individual with the virus can infect, and the viral strains present. Tested against official influenza surveillance systems, the model has been shown to accurately forecast the disease's evolution up to six weeks in advance--significantly earlier than other models. It will enable public health agencies to plan ahead in allocating medical resources and launching campaigns that encourage individuals to take preventative measures such as vaccination and increased hand washing. "In the past, we had no knowledge of initial conditions for the flu," says Vespignani, who is also director of the Network Science Institute at Northeastern. The initial conditions--which show where and when an epidemic began as well as the extent of infection--function as a launching pad for forecasting the spread of any disease. To ascertain those conditions, the researchers incorporated Twitter into their parameter-driven model. "This kind of integration has never been done before," says Vespignani. "We were not looking for the number of people who were sick because Twitter will not tell you that. What we wanted to know was: Do we have more flu at this point in time in Texas or in New Jersey, in Seattle or in San Francisco? Twitter, which includes GPS locations, is a proxy for that. 
By looking at how many people were tweeting about their symptoms or how miserable they were because of the flu, we were able to get a relative weight in each of those areas of the U.S." The paper on the novel model received a coveted Best Paper Honorable Mention award at the 2017 International World Wide Web Conference last month following its presentation. It was one of only four papers out of more than 400 presented to be selected for an award. The researchers' work began when the Centers for Disease Control and Prevention announced the "Predict the Influenza Season Challenge" in November 2013, an invitation to external researchers to advance the science of forecasting infectious diseases. Vespignani and his team have been participating ever since, with the new paper covering their projections for the 2014-15 and 2015-16 flu seasons in the U.S., Italy, and Spain. Over those time periods, they applied forecasting and other algorithms week by week to the key parameters informed by the Twitter data. "This gave us a large number of possible ways the disease might evolve," says Vespignani. They then matched the resulting simulations with the surveillance data generated by the CDC and clinical and personal reports of influenza-like illnesses from the three countries. "The surveillance data tells us the ground truth for the past four weeks, but it is always delayed by about one week because you need to get the report from the doctor," he says. By analyzing the evolving dynamics revealed in the past data, they were able to select the model that would most likely forecast the future. The explicit modeling of the disease's parameters--information about the dynamics of the disease itself--set Vespignani's model apart from others in the challenge. For example, they could identify the week when the epidemic would reach its peak and the magnitude of that peak with an accuracy of 70 to 90 percent six weeks in advance of the event. 
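The selection step, matching an ensemble of simulated trajectories against the last four weeks of delayed surveillance and carrying the best match forward, can be sketched roughly as follows. The candidate curves, the RMSE error metric, and the function names are illustrative assumptions, not the team's actual procedure.

```python
# Sketch of picking a forecast from an ensemble of simulated epidemic
# curves by matching recent surveillance data (illustrative only).
import math

def rmse(a, b):
    """Root-mean-square error between two equal-length series."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def select_forecast(ensemble, surveillance, window=4):
    """Choose the simulated weekly trajectory that best matches the last
    `window` weeks of surveillance; return its future portion."""
    n = len(surveillance)
    recent = surveillance[-window:]
    best = min(ensemble, key=lambda curve: rmse(curve[n - window:n], recent))
    return best[n:]  # the weeks beyond the surveillance horizon

# Hypothetical weekly incidence from three candidate model runs.
ensemble = [
    [10, 20, 40, 80, 150, 260, 400, 520],   # fast growth
    [10, 15, 22, 33, 50, 75, 110, 160],     # moderate growth
    [10, 12, 14, 17, 20, 24, 29, 35],       # slow growth
]
surveillance = [11, 14, 21, 34]             # four observed weeks (delayed)
forecast = select_forecast(ensemble, surveillance)
print(forecast)  # remaining weeks of the best-matching run
```

Here the moderate-growth run tracks the observed weeks most closely, so its remaining weeks become the forecast.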
"By capturing the key parameters, we could track how serious the flu was each year compared with every other year and see what was driving the spread," says first author Qian Zhang, PhD'14, associate research scientist at Northeastern. "That is what the public health agencies and the epidemiologists really care about. We are not just playing a game of numbers, which is what straightforward statistical models do." While the paper reports results using Twitter data, the researchers note that the model can also work with data from many other digital sources, as well as online surveys of individuals such as Influenzanet, which is very popular in Europe. "Our model is a work in progress," emphasizes Vespignani. "We plan to add new parameters, for example, school and workplace structure. This is not a challenge in the sense that you want to win. This is a science challenge in which you want to learn--to see that there is not a single model but a portfolio of models that will tell us new things."


News Article | May 11, 2017
Site: spaceref.com

Astronomers have produced a highly detailed image of the Crab Nebula. They did so by combining data from telescopes spanning nearly the entire breadth of the electromagnetic spectrum, from radio waves seen by the Karl G. Jansky Very Large Array (VLA) to the powerful X-ray glow seen by the orbiting Chandra X-ray Observatory. In between that range of wavelengths are the Hubble Space Telescope's crisp visible-light view and the infrared perspective of the Spitzer Space Telescope. The Crab Nebula, the result of a bright supernova explosion seen by Chinese and other astronomers in the year 1054, is 6,500 light-years from Earth. At its center is a super-dense neutron star rotating once every 33 milliseconds and sweeping out lighthouse-like beams of radio waves and light -- a pulsar (the bright dot at image center). The nebula's intricate shape is caused by a complex interplay of the pulsar, a fast-moving wind of particles coming from the pulsar, and material originally ejected by the supernova explosion and by the star itself before the explosion. This image combines data from five different telescopes: the VLA (radio) in red; Spitzer Space Telescope (infrared) in yellow; Hubble Space Telescope (visible) in green; XMM-Newton (ultraviolet) in blue; and Chandra X-ray Observatory (X-ray) in purple. The new VLA, Hubble, and Chandra observations all were made at nearly the same time in November of 2012. A team of scientists led by Gloria Dubner of the Institute of Astronomy and Physics (IAFE), the National Council of Scientific Research (CONICET), and the University of Buenos Aires in Argentina then made a thorough analysis of the newly revealed details in a quest to gain new insights into the complex physics of the object. They are reporting their findings in the Astrophysical Journal. "Comparing these new images, made at different wavelengths, is providing us with a wealth of new detail about the Crab Nebula.
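The band-to-color combination described above can be sketched as a simple false-color compositing step: normalize each telescope's intensity map, tint it with its assigned color, then sum and clip the layers. The color assignments follow the article; the arrays here are synthetic stand-ins, and the function name is an assumption for illustration.

```python
# Sketch of a multi-wavelength false-color composite: each band's
# intensity map is normalized, tinted, summed, and clipped to [0, 1].
import numpy as np

BAND_COLORS = {                    # RGB tint per wavelength band
    "radio":    (1.0, 0.0, 0.0),   # VLA -> red
    "infrared": (1.0, 1.0, 0.0),   # Spitzer -> yellow
    "visible":  (0.0, 1.0, 0.0),   # Hubble -> green
    "uv":       (0.0, 0.0, 1.0),   # XMM-Newton -> blue
    "xray":     (0.5, 0.0, 0.5),   # Chandra -> purple
}

def composite(layers):
    """Combine {band: 2-D intensity array} into one H x W x 3 RGB image."""
    shape = next(iter(layers.values())).shape
    rgb = np.zeros(shape + (3,))
    for band, data in layers.items():
        norm = (data - data.min()) / (np.ptp(data) or 1.0)  # scale to [0, 1]
        rgb += norm[..., None] * np.array(BAND_COLORS[band])
    return np.clip(rgb, 0.0, 1.0)

# Synthetic stand-in data: random intensity maps for each band.
rng = np.random.default_rng(0)
layers = {band: rng.random((64, 64)) for band in BAND_COLORS}
image = composite(layers)
print(image.shape)  # (64, 64, 3)
```

Real pipelines must first reproject all maps onto a common pixel grid and stretch intensities carefully; this sketch shows only the color-combination step.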
Though the Crab has been studied extensively for years, we still have much to learn about it," Dubner said. NASA's Marshall Space Flight Center in Huntsville, Alabama, manages the Chandra program for NASA's Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory in Cambridge, Massachusetts, controls Chandra's science and flight operations. The National Radio Astronomy Observatory is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc.


Straw is commonly used for animal feed, fuel, and baling. Known as one of the "Three Canton Treasures," straw can also serve as a raw material for producing biofuel. Ethanol, an alcohol, is a clean and renewable biofuel traditionally produced by fermentation of sucrose from sugarcane or glucose released from corn starch. With increasing demand for biofuel in recent years, cellulose from non-edible plant materials (e.g. sugarcane leaves, corn stalks, rice straw) has been used as a raw material for bioethanol production. However, because cellulose is crosslinked with lignin in plant cell walls, it is very difficult to release glucose from cellulose. A collaborative research effort by the University of Hong Kong (HKU) and Kyoto University (Kyoto U) has revealed a new strategy that allows the cellulose in rice straw to release its fermentable sugar more efficiently. The breakthrough was recently published in the plant science journal Plant Physiology. Lignin is a complex polymer that provides mechanical strength and structural integrity in plants. However, expensive and complicated procedures are required to loosen the lignin barrier in order to utilize cellulose more efficiently during bioethanol production. Rice and other cereals belong to the grass family (Poaceae), and the lignin in their stems and leaves contains a special component called tricin. HKU plant biochemists Dr Clive Lo Sze-chung and his student Dr Lydia Lam Pui-ying, together with Kyoto U lignin specialist Dr Yuki Tobimatsu, started a collaborative project two years ago. They discovered that when flavone synthase II (FNSII), a key enzyme involved in tricin synthesis, is knocked out, not only is tricin not produced, but the lignin content of rice straw is also reduced by approximately one-third. In addition, the yield of glucose from cellulose degradation increased by 37% without any chemical treatment.
Glucose released from cellulose can be used for bioethanol production. In other words, it is more efficient to produce ethanol from this kind of rice straw: the cost of lignin treatment can be reduced and the production of ethanol can be enhanced. "This is the first demonstration of the reduction of cell wall lignin content in rice straw by the disruption of tricin production," said Clive Lo. "Importantly, there are no negative impacts on rice growth and productivity." As plants in the grass family all contain tricin-bound lignin, this strategy can be applied to other cereals such as maize, wheat, and barley, as well as grass species (e.g. sorghum and switchgrass) cultivated around the world exclusively for ethanol production, so that they can be utilized more efficiently as raw materials for biofuel. Dr Lydia Lam was recently awarded the JSPS Postdoctoral Fellowship for Research in Japan by the Japan Society for the Promotion of Science and will start her postdoctoral research at Kyoto U this September. She said, "I feel very delighted and honored to conduct a research project that could benefit society. Also, as a Hongkonger, I am always trained to work quickly and efficiently. During the eight-month research experience at Kyoto U, I was particularly impressed by the students there. They performed experiments with extreme care and precision. When I am doing research today, I always ask myself to do better than perfect in addition to seeking speed and efficiency." Link to the article in Plant Physiology: "Disrupting Flavone Synthase II Alters Lignin and Improves Biomass Digestibility" Dr Clive Lo is an Associate Professor in the School of Biological Sciences, the University of Hong Kong. His laboratory has been elucidating biosynthesis pathways of flavonoids in cereal crops for applications in metabolic engineering. His research projects are supported by the Research Grants Council of Hong Kong.
Dr Lydia Lam joined the Summer Science Institute during her secondary school years, which inspired her determination to study biotechnology. She was admitted to HKU in 2008 and received her Bachelor of Science (First Class Honours) degree in Biotechnology. Afterwards, she was awarded the highly competitive Hong Kong PhD Fellowship and completed her PhD in December 2016. Previously, Lo, Lam, and other lab members published two papers in Plant Physiology on the tricin biosynthesis pathway in rice (DOIs: http://dx. and http://dx. ), providing an important theoretical basis for the above investigation. Dr Yuki Tobimatsu is an Associate Professor in the Laboratory of Metabolic Science of Forest Plants & Microorganisms, Research Institute for Sustainable Humanosphere, Kyoto University, Japan. His research areas include the structure and formation of plant cell walls, lignin chemistry and biochemistry, and molecular breeding of biofuel crops.
