News Article | April 25, 2017
Site: www.sciencedaily.com

Neuroscientists have long noted that if certain brain cells are destroyed by, say, a stroke, new circuits may be laid in another location to compensate, essentially rewiring the brain. Northeastern's William R. Hobbs, an expert in computational social science, wanted to know if social networks responded similarly after the death of a close mutual friend. In new research published Monday in the journal Nature Human Behaviour, Hobbs found that they did, demonstrating a form of social network resilience. Hobbs, who led the study, collaborated with Facebook data scientist Moira Burke. The researchers found that close friends of the deceased increased their interactions with one another by 30 percent immediately after the death, the point at which interaction volume peaked. The interactions faded a bit in the following months and ultimately stabilized at the same volume of interaction as before the death, even two years after the loss. This insight into how social networks adapt to significant losses could lead to new ways to help people with the grieving process, ensuring that their networks are able to recover rather than collapse during these difficult times. "Most people don't have very many friends, so when we lose one, that leaves a hole in our networks as well as in our lives," says Hobbs, a postdoctoral research fellow in the lab of David Lazer, Distinguished Professor of Political Science and Computer and Information Science. He wondered: Would a social network unravel with a central member gone? If it recovered, how might it heal? "We expected to see a spike in interactions among close friends immediately after the loss, corresponding with the acute grieving period," says Hobbs. "What surprised us was that the stronger ties continued for years. People made up for the loss of interacting with the friend who had died by increasing interactions with one another." Hobbs came to the study from a crisis of his own. After college, he lived and worked in China studying local governments. But when he entered graduate school at the University of California, San Diego, his father was dying. "So I switched to American politics, then to studying chronic illnesses, and then moving into the effect of deaths on others," he says. That switch led to this first large-scale investigation of recovery and resilience after a death in social networks. It has the potential to reveal a great deal about ourselves, says Lazer, who is also a core faculty member in the Network Science Institute at Northeastern. "Death is a tear in the fabric of the social network that binds us together," he says. "This research provides insight into how our networks heal from this tear over time, and points to the ways that our digital traces can offer important clues into how we help each other through the grieving process." Using sophisticated data counters and computer analysis, the researchers compared monthly interactions -- wall posts, comments, and photo tags -- of approximately 15,000 Facebook networks that had experienced the death of a friend with monthly interactions of approximately 30,000 similar Facebook networks that had not. The first group comprised more than 770,000 people, the latter more than 2 million. They learned about the deaths from California state vital records, and characterized "close friends" as those who had interacted with the person who died before the study began. To maintain the users' privacy, the data was aggregated and "de-identified" -- that is, all elements that associated the data with the individual were removed.
"The response was different from what other researchers have found regarding natural disasters or other kinds of trauma," says Hobbs. "There you see a spike in communications but that disappears quickly afterward." In particular, the researchers found that networks comprising young adults, ages 18 to 24, showed the strongest recovery. They were not only more likely to recover than others, their interaction levels also stayed elevated -- higher than before the loss. Networks experiencing suicides, on the other hand, showed the least amount of recovery. Further research is necessary to understand why, says Hobbs. "We didn't study the subjective experience of loss, or how people feel," cautions Hobbs. "We looked at recovery only in terms of connectivity. We also can't say for certain whether the results translate into closer friendships offline." What they do show is that online social networks appear to function as a safety net. "They do so quickly, and the effect persists," he says. "There are so few studies on the effect of the death of a friend on a network. This is a big step forward."


News Article | February 24, 2017
Site: www.eurekalert.org

Three decades ago, astronomers spotted one of the brightest exploding stars in more than 400 years. The titanic supernova, called Supernova 1987A (SN 1987A), blazed with the power of 100 million suns for several months following its discovery on Feb. 23, 1987. Since that first sighting, SN 1987A has continued to fascinate astronomers with its spectacular light show. Located in the nearby Large Magellanic Cloud, it is the nearest supernova explosion observed in hundreds of years and the best opportunity yet for astronomers to study the phases before, during, and after the death of a star. To commemorate the 30th anniversary of SN 1987A, new images, time-lapse movies, a data-based animation based on work led by Salvatore Orlando at INAF-Osservatorio Astronomico di Palermo, Italy, and a three-dimensional model are being released. By combining data from NASA's Hubble Space Telescope and Chandra X-ray Observatory, as well as the international Atacama Large Millimeter/submillimeter Array (ALMA), astronomers -- and the public -- can explore SN 1987A like never before. Hubble has repeatedly observed SN 1987A since 1990, accumulating hundreds of images, and Chandra began observing SN 1987A shortly after its deployment in 1999. ALMA, a powerful array of 66 antennas, has been gathering high-resolution millimeter and submillimeter data on SN 1987A since its inception. "The 30 years' worth of observations of SN 1987A are important because they provide insight into the last stages of stellar evolution," said Robert Kirshner of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts, and the Gordon and Betty Moore Foundation in Palo Alto, California. The latest data from these powerful telescopes indicate that SN 1987A has passed an important threshold. The supernova shock wave is moving beyond the dense ring of gas produced late in the life of the pre-supernova star when a fast outflow or wind from the star collided with a slower wind generated in an earlier red giant phase of the star's evolution. What lies beyond the ring is poorly known at present, and depends on the details of the evolution of the star when it was a red giant. "The details of this transition will give astronomers a better understanding of the life of the doomed star, and how it ended," said Kari Frank of Penn State University who led the latest Chandra study of SN 1987A. Supernovas such as SN 1987A can stir up the surrounding gas and trigger the formation of new stars and planets. The gas from which these stars and planets form will be enriched with elements such as carbon, nitrogen, oxygen and iron, which are the basic components of all known life. These elements are forged inside the pre-supernova star and during the supernova explosion itself, and then dispersed into their host galaxy by expanding supernova remnants. Continued studies of SN 1987A should give unique insight into the early stages of this dispersal. Some highlights from studies involving these telescopes include: Hubble studies have revealed that the dense ring of gas around the supernova is glowing in optical light, and has a diameter of about a light-year. The ring was there at least 20,000 years before the star exploded. A flash of ultraviolet light from the explosion energized the gas in the ring, making it glow for decades. The central structure visible inside the ring in the Hubble image has now grown to roughly half a light-year across. 
Most noticeable are two blobs of debris in the center of the supernova remnant racing away from each other at roughly 20 million miles an hour. From 1999 until 2013, Chandra data showed an expanding ring of X-ray emission that had been steadily getting brighter. The blast wave from the original explosion has been bursting through and heating the ring of gas surrounding the supernova, producing X-ray emission. In the past few years, the ring has stopped getting brighter in X-rays. From about February 2013 until the last Chandra observation analyzed in September 2015, the total amount of low-energy X-rays has remained constant. Also, the bottom left part of the ring has started to fade. These changes provide evidence that the explosion's blast wave has moved beyond the ring into a region with less dense gas. This represents the end of an era for SN 1987A. Beginning in 2012, astronomers used ALMA to observe the glowing remains of the supernova, studying how the remnant is actually forging vast amounts of new dust from the new elements created in the progenitor star. A portion of this dust will make its way into interstellar space and may become the building blocks of future stars and planets in another system. These observations also suggest that dust in the early universe likely formed from similar supernova explosions. Astronomers also are still looking for evidence of a black hole or a neutron star left behind by the blast. They observed a flash of neutrinos from the star just as it erupted. This detection makes astronomers quite certain a compact object formed as the center of the star collapsed -- either a neutron star or a black hole -- but no telescope has uncovered any evidence for one yet. These latest visuals were made possible by combining several sources of information including simulations by Salvatore Orlando and collaborators that appear in this paper: https:/ . The Chandra study by Frank et al. can be found online at http://lanl. . Recent ALMA results on SN 1987A are available at https:/ . The Chandra program is managed by NASA's Marshall Space Flight Center in Huntsville, Alabama, for NASA's Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory in Cambridge, Massachusetts, controls Chandra's science and flight operations. The Hubble Space Telescope is a project of international cooperation between NASA and ESA (European Space Agency). NASA's Goddard Space Flight Center in Greenbelt, Maryland, manages the telescope. The Space Telescope Science Institute (STScI) in Baltimore conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy, Inc., in Washington. ALMA is a partnership of ESO (representing its member states), NSF (USA) and NINS (Japan), together with NRC (Canada), NSC and ASIAA (Taiwan), and KASI (Republic of Korea), in cooperation with the Republic of Chile. The Joint ALMA Observatory is operated by ESO, AUI/NRAO and NAOJ.


News Article | February 16, 2017
Site: www.eurekalert.org

Study led by University of Oregon researcher finds that to prevent substance abuse problems in adolescents, interventions should specifically target neurobehavioral mechanisms underlying impulsive decision-making
EUGENE, Ore. -- Feb. 16, 2017 -- Drug use in adolescence is often linked to later substance-abuse problems, but a new study suggests that a key risk factor is a combination of weak working memory and difficulties with impulse control. These risk factors predispose adolescents to progressive drug use in younger years and subsequent dependence, report researchers at three institutions in a paper published online Feb. 16 in the journal Addiction. The study focused on alcohol, marijuana and tobacco use, the drugs most commonly used by adolescents. For young people with difficulties in impulse control, intervention programs that focus on simply stopping early drug use don't go far enough, said lead author Atika Khurana, assistant professor in the Department of Counseling Psychology and Human Services at the University of Oregon. "We found that there is some effect that was carried through the early progression in drug use. It is a risk factor," said Khurana, who also is a research scientist in the UO's Prevention Science Institute. "But we also found that the underlying weakness in working memory and impulse control continues to pose a risk for later substance use disorders." Working memory refers to the ability to concentrate on a task without being easily distracted. Youth with weak working memory tend to have problems controlling their impulses and thus appear to be at greater risk of continuing drug use. The findings emerged from a final assessment of 387 young people, ages 18-20, who were recruited as 10- to 12-year-olds in 2004 for a long-term study by the Annenberg Public Policy Center of the University of Pennsylvania in collaboration with the Children's Hospital of Philadelphia. In a paper published in 2015 in the journal Development and Psychopathology, Khurana's team documented how adolescents with stronger working memory were better equipped to escape progression into heavy use following initial experimentation. "Unanswered in our earlier work was whether it was specific forms of early use that predict later substance abuse," said Khurana, who was a postdoctoral fellow at the Annenberg Public Policy Center when the long-term study began. "People really hadn't focused on the heterogeneity of drug-use patterns. Some youth can start early and experiment but not progress while others experiment and progress into heavier drug use." Analyzing multiple waves of data from early to late adolescence, the researchers found that experimenting with drugs at an early age wasn't a key factor in predicting later substance use disorders. It was the progression in drug use along with weakness in working memory and impulse control difficulties that predicted substance use disorders at later ages. The researchers also reported that underlying weaknesses in working memory and impulse control continue to pose a risk for later substance use disorders, apart from early drug use progression. "Substance use disorders are a major public health concern in this country," Khurana said. "The onset of substance use happens during adolescence. There is a lot of research that links early onset of use to later substance use disorders. Our study advances the field by showing that just addressing early use is not going to solve the problem."
"Drug prevention strategy in the schools typically focuses on middle school when early drug use tends to take place and assumes that any drug use at all is a problem," said co-author Dan Romer, research director of the Annenberg Public Policy Center. "This study suggests that prevention needs to be more nuanced. The risk depends on whether drug use is likely to progress." Interventions that strengthen working memory and cognitive processing related to inhibiting impulsive responses need to be developed to help adolescents better navigate drug-related temptations, Khurana said. Co-authors with Khurana and Romer were Laura M. Betancourt and Hallam Hurt of the Children's Hospital of Philadelphia. The National Institutes of Health supported the research through two grants from the National Institute on Drug Abuse. Note: The UO is equipped with an on-campus television studio with a point-of-origin Vyvx connection, which provides broadcast-quality video to networks worldwide via fiber optic network. There also is video access to satellite uplink and audio access to an ISDN codec for broadcast-quality radio interviews.


News Article | March 2, 2017
Site: www.eurekalert.org

Humanity may soon generate more data than hard drives or magnetic tape can handle, a problem that has scientists turning to nature's age-old solution for information-storage--DNA. In a new study in Science, a pair of researchers at Columbia University and the New York Genome Center (NYGC) show that an algorithm designed for streaming video on a cellphone can unlock DNA's nearly full storage potential by squeezing more information into its four base nucleotides. They demonstrate that this technology is also extremely reliable. DNA is an ideal storage medium because it's ultra-compact and can last hundreds of thousands of years if kept in a cool, dry place, as demonstrated by the recent recovery of DNA from the bones of a 430,000-year-old human ancestor found in a cave in Spain. "DNA won't degrade over time like cassette tapes and CDs, and it won't become obsolete--if it does, we have bigger problems," said study coauthor Yaniv Erlich, a computer science professor at Columbia Engineering, a member of Columbia's Data Science Institute, and a core member of the NYGC. Erlich and his colleague Dina Zielinski, an associate scientist at NYGC, chose six files to encode, or write, into DNA: a full computer operating system, an 1895 French film, "Arrival of a train at La Ciotat," a $50 Amazon gift card, a computer virus, a Pioneer plaque and a 1948 study by information theorist Claude Shannon. They compressed the files into a master file, and then split the data into short strings of binary code made up of ones and zeros. Using an erasure-correcting algorithm called fountain codes, they randomly packaged the strings into so-called droplets, and mapped the ones and zeros in each droplet to the four nucleotide bases in DNA: A, G, C and T. The algorithm deleted letter combinations known to create errors, and added a barcode to each droplet to help reassemble the files later. In all, they generated a digital list of 72,000 DNA strands, each 200 bases long, and sent it in a text file to a San Francisco DNA-synthesis startup, Twist Bioscience, that specializes in turning digital data into biological data. Two weeks later, they received a vial holding a speck of DNA molecules. To retrieve their files, they used modern sequencing technology to read the DNA strands, followed by software to translate the genetic code back into binary. They recovered their files with zero errors, the study reports. (In this short demo, Erlich opens his archived operating system on a virtual machine and plays a game of Minesweeper to celebrate.) They also demonstrated that a virtually unlimited number of copies of the files could be created with their coding technique by multiplying their DNA sample through polymerase chain reaction (PCR), and that those copies, and even copies of their copies, and so on, could be recovered error-free. Finally, the researchers show that their coding strategy packs 215 petabytes of data on a single gram of DNA--100 times more than methods published by pioneering researchers George Church at Harvard, and Nick Goldman and Ewan Birney at the European Bioinformatics Institute. "We believe this is the highest-density data-storage device ever created," said Erlich. The capacity of DNA data-storage is theoretically limited to two binary digits for each nucleotide, but the biological constraints of DNA itself and the need to include redundant information to reassemble and read the fragments later reduces its capacity to 1.8 binary digits per nucleotide base. 
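The basic idea of packing binary data into the four bases can be sketched in a few lines. The snippet below shows only the naive two-bits-per-base mapping, as a hedged illustration; the actual DNA Fountain method additionally packages data into fountain-code droplets, screens out sequences that are hard to synthesize or sequence (such as long runs of a single base), and adds barcodes, which is part of why its effective density stays below the two-bit ceiling.

```python
# Naive 2-bits-per-nucleotide encoding: a simplified illustration, not the
# DNA Fountain algorithm itself.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(BASE_TO_BITS[b] for b in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

message = b"hi"
strand = encode(message)          # 'hi' -> 8 nucleotides: CGGACGGC
assert decode(strand) == message
print(strand)
```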
The team's insight was to apply fountain codes, a technique Erlich remembered from graduate school, to make the reading and writing process more efficient. With their DNA Fountain technique, Erlich and Zielinski pack an average of 1.6 bits into each base nucleotide. That's at least 60 percent more data than previously published methods, and close to the 1.8-bit limit. Cost still remains a barrier. The researchers spent $7,000 to synthesize the DNA they used to archive their 2 megabytes of data, and another $2,000 to read it. Though the price of DNA sequencing has fallen exponentially, there may not be the same demand for DNA synthesis, says Sri Kosuri, a biochemistry professor at UCLA who was not involved in the study. "Investors may not be willing to risk tons of money to bring costs down," he said. But the price of DNA synthesis can be vastly reduced if lower-quality molecules are produced, and coding strategies like DNA Fountain are used to fix molecular errors, says Erlich. "We can do more of the heavy lifting on the computer to take the burden off time-intensive molecular coding," he said. The Data Science Institute at Columbia University is training the next generation of data scientists and developing innovative technology to serve society. http://datascience. Columbia Engineering is one of the top engineering schools in the U.S. and one of the oldest in the nation. Based in New York City, the School offers programs to both undergraduate and graduate students who undertake a course of study leading to the bachelor's, master's, or doctoral degree in engineering and applied science. Columbia Engineering's nine departments offer 16 majors and more than 30 minors in engineering and the liberal arts, including an interdisciplinary minor in entrepreneurship with Columbia Business School. With facilities specifically designed and equipped to meet the laboratory and research needs of faculty and students, Columbia Engineering is home to a broad array of basic and advanced research installations, from the Columbia Nano Initiative and Data Science Institute to the Columbia Genome Center. These interdisciplinary centers in science and engineering, big data, nanoscience, and genomic research are leading the way in their respective fields while our engineers and scientists collaborate across the University to solve theoretical and practical problems in many other significant areas. The New York Genome Center is an independent, nonprofit academic research organization at the forefront of transforming biomedical research and clinical care with the mission of saving lives. A collaboration of renowned academic, medical and industry leaders across the globe, the New York Genome Center's goal is to translate genomic research into development of new treatments, therapies and therapeutics against human disease. Its member organizations and partners are united in this unprecedented collaboration of technology, science and medicine, designed to harness the power of innovation and discoveries to advance genomic services.


News Article | December 14, 2016
Site: www.eurekalert.org

Using crop models as a tool to assist nitrogen management decisions in corn is a win-win for the agronomy business and the environment
With an innovative modeling approach, researchers set out to examine corn and soybean yields and optimal nitrogen (N) fertilizer rates. In their study, recently published in Frontiers in Plant Science, they used a 16-year long-term dataset from central Iowa, USA, with a state-of-the-art simulator that modeled corn and soybean yields, improving predictions of optimal N fertilizer rates for corn. This has global relevance for food security and sustainable agricultural practices in light of future climate change scenarios. Corn, also known as maize, is one of the top three staple crops farmed globally with global production predicted to rise from 720.8 million tons in 2015 to 872.9 million tons by 2030, according to the Food and Agriculture Organization. Corn also requires large nutrient supplements in the form of fertilizer due to its fast-growing, nitrogen-hungry characteristics. And global demand is growing. "A huge challenge in agriculture is predicting the optimal N fertilizer rates which, if fine-tuned, can reduce N losses and increase profits," explains Laila Puntel, a graduate student and research assistant in Crop Production and Physiology at Iowa State University, USA, and lead author of the study. The ultimate goal is accurately predicting the economic optimum nitrogen rate (EONR), the amount of nitrogen fertilizer that will provide the maximum economic return to nitrogen added. This is notoriously complex to calculate due to factors including the soil-plant-atmosphere system, uncertainty in weather and fluctuations in crop and fertilizer prices. To solve this conundrum, many technologies and approaches have been developed to assess the state of agricultural land. These include real-time remote sensing, aerial imaging, soil mapping and nitrate testing, crop canopy sensing and measuring chlorophyll levels. Web applications have also been developed including digital soil and weather databases. However, no single technology can make predictions of yield or optimal N fertilizer rates with the required accuracy or precision. Puntel and her international co-authors tackled this problem head on, designing an inter-disciplinary approach using field and experimental data. These data were used to test the Agricultural Production Systems sIMulator (APSIM), an internationally recognized highly advanced simulator of agricultural systems. "We found that long-term experimental data incorporating agricultural, economic and environmental factors are valuable in testing and refining the APSIM model predictions, leading to more accurate predictions of EONR," says co-author Dr. Sotirios Archontoulis, Assistant Professor in the Department of Agronomy at Iowa State University, USA. Archontoulis continues, "The study results show that predictions of N fertilizer rates for corn are more accurate when inter-annual variability is taken into account. Site-specific datasets on variables such as landscape factors, weather and prices for fertilizers and crops are also key to achieving the best results." The study identifies five potential applications where the model could assist N management, ranging from simulation of N dynamics to climate change impact on optimal N requirement. It also found that optimum N rate was high for corn production alone, but could be reduced by rotating the corn with soybean. The study is timely as environmental concerns are very real and increasing.
Excess nutrients such as nitrogen and phosphorus enter the water cycle via surface run-off, leaching or denitrification. This contaminates water systems and can also promote algal growth, which can be toxic and damaging to fisheries. "The study shows that using a combination of methods including process-based modeling, existing N rates and field data really can fine-tune N rate guidance for corn. Ultimately, reducing the use of nitrogen fertilizer is a win-win for the agricultural business and the environment," concludes Puntel. This work was part of the Agriculture and Food Research Initiative Hatch project No. 1004346 and was also partially supported by the Plant Science Institute, and the Brown Graduate Fellowship program of Iowa State University. Citation: Puntel LA, Sawyer JE, Barker DW, Dietzel R, Poffenbarger H, Castellano MJ, Moore KJ, Thorburn P and Archontoulis SV (2016) Modeling Long-Term Corn Yield Response to Nitrogen Rate and Crop Rotation. Front. Plant Sci. 7:1630. doi: 10.3389/fpls.2016.01630
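The economic optimum nitrogen rate defined in the article has a simple closed form once a yield response curve is assumed. The sketch below works through a textbook-style quadratic response with made-up coefficients and prices; it illustrates the EONR concept only and is not the APSIM model, the study's data, or its method.

```python
# EONR from a quadratic yield response: yield = a + b*N + c*N^2 (c < 0).
# Profit = corn_price * yield - n_price * N; setting d(profit)/dN = 0 gives
# EONR = (n_price/corn_price - b) / (2*c). All numbers below are hypothetical.
a, b, c = 7.0, 0.040, -0.00010          # Mg/ha intercept, response, curvature
corn_price = 160.0                       # $ per Mg of grain
n_price = 0.90                           # $ per kg of N fertilizer

eonr = (n_price / corn_price - b) / (2 * c)
yield_at_eonr = a + b * eonr + c * eonr ** 2
print(f"EONR ~ {eonr:.0f} kg N/ha, yield ~ {yield_at_eonr:.1f} Mg/ha")
```

The reason site-specific, long-term data matter is that the response coefficients shift with weather, soil, and rotation, which is exactly the inter-annual variability the modeling approach described in the article tries to capture.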


News Article | December 21, 2016
Site: www.eurekalert.org

BLOOMINGTON, Ind. -- The Observatory on Social Media at Indiana University has launched a powerful new tool in the fight against fake news. The tool, called Hoaxy, visualizes how claims in the news -- and fact checks of those claims -- spread online through social networks. The tool is built upon earlier work at IU led by Filippo Menczer, a professor and director of the Center for Complex Networks and Systems Research in the IU School of Informatics and Computing. Hoaxy is online at http://hoaxy. . "In the past year, the influence of fake news in the U.S. has grown from a niche concern to a phenomenon with the power to sway public opinion," Menczer said. "We've now even seen examples of fake news inspiring real-life danger, such as the gunman who fired shots in a Washington, D.C., pizza parlor in response to false claims of child trafficking." Previous tools from the observatory at IU include BotOrNot, a system to assess whether the intelligence behind a Twitter account is more likely a person or a computer, and a suite of online tools that allows anyone to analyze the spread of hashtags across social networks. In response to the growth of fake news, several major web services are making changes to curtail the spread of false information on their platforms. Google and Facebook recently banned the use of their advertisement services on websites that post fake news, for example. Facebook also rolled out a system last week through which users can flag stories they suspect are false, which are then referred to third-party fact-checkers. Over the past several months, Menczer and colleagues were frequently cited as experts on how fake news and misinformation spread in outlets such as PBS Newshour, Scientific American, The Atlantic, Reuters, Australian Public Media, NPR and BuzzFeed. Giovanni Luca Ciampaglia, a research scientist at the IU Network Science Institute, coordinated the Hoaxy project with Menczer. Ciampaglia said a user can now enter a claim into the service's website and see results that show both incidents of the claim in the media and attempts to fact-check it by independent organizations such as snopes.com, politifact.com and factcheck.org. These results can then be selected to generate a visualization of how the articles are shared across social media. The site's search results display headlines that appeared on sites known to publish inaccurate, unverified or satirical claims based upon lists compiled and published by reputable news and fact-checking organizations. A search of the terms "cancer" and "cannabis," for example, turns up multiple claims that cannabis has been found to cure cancer, a statement whose origins have been roundly debunked by the reputable fact-checking website snopes.com. A search of social shares of articles that make the claim, however, shows a clear rise in people sharing the story, with under 10 claims in July rising to hundreds by December. Specifically, Ciampaglia said, Hoaxy's visualizations illustrate both temporal trends and diffusion networks as they relate to online claims and fact-checks. Temporal trends plot the cumulative number of Twitter shares over time. Diffusion networks show how claims spread from person to person. Twitter is currently the only social network tracked by Hoaxy, and only publicly posted tweets appear in the visualizations. "Importantly, we do not decide what is true or false," Menczer said. 
"Not all claims you can visualize on Hoaxy are false, nor are we saying that the fact-checkers are 100 percent correct all of the time. Hoaxy is a tool to observe how unverified stories and the fact-checking of those stories spread on public social media. It's up to users to evaluate the evidence about a claim and its rebuttal." Menczer's interest in fake news began over seven years ago. In an experiment reported in a paper titled "Social Spam Detection," he created a website of fake celebrity news clearly marked as false and promoted the articles on social bookmarking websites, which were popular at the time. After a month, Menczer was shocked to receive a check based on ad revenue from the site. "That early experiment demonstrated the power of the internet to monetize false information," he said. "I didn't expect at the time that the problem would reach the level of national debate." In the years since the experiment, however, the volume and influence of fake news have expanded across the web from sources as disparate as satirical websites, ideologically motived organizations and Macedonian teenagers working to rake in advertising dollars. "If we want to stop the growing influence of fake news in our society, first we need to understand the mechanisms behind how it spreads," Menczer said. "Tools like Hoaxy are an important step in the process." Menczer is also a member of the IU Network Science Institute, a project partner that contributed support to Hoaxy. Other researchers on the project were Chengcheng Shao, a visiting doctoral student, and graduate students Lei Wang and Gregory Maus, all of the IU School of Informatics and Computing. An academic paper on the project, "Hoaxy: A Platform for Tracking Online Misinformation," is available online from the Proceedings of the 25th International Conference Companion on World Wide Web. This research was supported in part by the National Science Foundation and the J.S. McDonnell Foundation.


News Article | December 21, 2016
Site: www.cnet.com

Now you can map the web of lies. A beta version of Hoaxy, a search engine designed to track fake news, was released Wednesday by Indiana University's Network Science Institute and its Center for Complex Networks and Systems Research. Hoaxy indexes stories from 132 sites known to produce fake news, such as WashingtonPost.com.co and MSNBC.website, and allows you to see how these sites' links spread across social media. Fake news has plagued the internet and social networks for a long time but has grown in prominence in the past year or so, forcing Facebook to introduce new features to flag false articles. The hoaxes have led to real-life consequences, with a fake news creator taking some credit for Donald Trump's White House win and a Washington DC shooting earlier this month related to "Pizzagate." Even Pope Francis has chimed in, comparing the spread of fake news to a literal shit show. Type any subject, and Hoaxy responds with a list of fake articles related to the search term. There are even fake news stories about fake news. A search for "Pizzagate," for example, generates 20 results on Hoaxy. Pizzagate itself is a conspiracy theory falsely claiming Hillary Clinton helped run a child sex ring out of a Washington DC pizza place. The lies culminated with a gunman, who claimed to be investigating Pizzagate, opening fire at the targeted pizzeria on December 4. Following the shooter's arrest, the website DC Clothesline published a fake-news article on December 6 titled "More Evidence Pizzagate Shooting is a PSYOP: The Shooter Has an IMDB Page, He's Literally An Actor." The article claims the shooting was propaganda and faked. The story claims the gunman, Edgar Maddison Welch, is an actor -- much like how Sandy Hook conspiracy theorists say the victims killed in the 2012 school shooting were also actors. After selecting an article on Hoaxy, you can visualize its influence in two charts, one showing its popularity over time and the other showing how it spread on Twitter. The DC Clothesline story has been shared on Twitter 137 times and 643 times on Facebook. On the chart, you can watch it move from the DC Clothesline's Twitter account through its network of followers and how the article branches off after that. The Indiana University research center decided to build Hoaxy and track the spread of fake news to figure out how to address the public's concerns about this issue, Filippo Menczer, the center's director, said in an interview. "Until we understand the phenomenon, we can't really develop countermeasures," Menczer said. Using Hoaxy, you can see how the posts are connected on social media. Hoaxy's developers generate the search results from 132 websites compiled via fake-news warning lists from watchdog sites like Snopes and Fake News Watch. Hoaxy also tracks the spread of fact-checking articles, which Menczer found don't go as viral as fake articles do. The search engine also appears to discover stories that may not be fake, per se, but that include a strong bias. A search for "Amazon" finds a December 1 story from Infowars.com titled "Amazon Pushes Islamic Propaganda in New 'Priest and Imam' Commercial." Infowars.com, which traffics in conspiracy theories, gained attention during the presidential campaign for spreading fake news about Democratic nominee Hillary Clinton. The article is based on a real event: Amazon released a commercial in mid-November featuring a priest and imam that tells a story of two friends ordering gifts for each other with Amazon Prime.
However, there is no "Islamic propaganda" in the commercial, just the use of heart-touching humanity to help market Amazon's member services. The priest-imam story has been shared on Twitter 1,202 times and 6,467 times on Facebook. More than 700 of those retweets came via the Infowars editor-at-large in this tweet, while another tweet from Infowars founder Alex Jones contributed more than 150 retweets.


News Article | February 27, 2017
Site: www.csmonitor.com

An artist drew the possible surface of TRAPPIST-1f, one of seven newly discovered exoplanets in the TRAPPIST-1 system. The star was originally observed with the TRAnsiting Planets and Planetesimals Small Telescope (TRAPPIST), an instrument at the La Silla Observatory in Chile that gave the star its nickname – the official scientific name is 2MASS J23062928-0502285.
If the sun never set, could life evolve? Last week, astronomers announced that they had found a miniature solar system with seven Earth-sized planets in tiny, fast orbits around the super-cool dwarf star TRAPPIST-1. Some, and perhaps all, of the seven planets are "tidally locked" to their dwarf star, say the researchers, which means the same side of each planet always faces the star. One side is always in daylight, and one side is always dark. That's not a deal-breaker for life, the team said during an "Ask Me Anything" question and answer session on Reddit. "We think as long as there is an atmosphere (even a thin atmosphere like that on Mars), heat will circulate around the planet." Of course, "we don't yet know what kind of atmospheres, if any, are present on those planets," cautions Angela Zalucha, a principal investigator at SETI in Boulder, Colo., who was not involved in the discovery. Any data about an atmosphere will probably require observations from the long-awaited James Webb Space Telescope, due to launch in October 2018. So far, scientists have no unambiguous evidence of an atmosphere on a rocky planet outside our solar system, despite tantalizing hints from lava planet 55 Cancri e and waterworld Gliese 1214 b. But astronomers do know how atmospheres work on large tidally locked planets, Dr. Zalucha explains in a phone interview with The Christian Science Monitor. "From studies of larger planets – Neptune- or Jupiter-sized planets – we've found that if there's a significant atmosphere, [it] can transport all that heat that the sunlit side is getting over to the dark side of the planet, so it's not just really, really hot on one side and really, really cold on the other side," she says. "There's a way to mitigate that huge temperature gradient." Just like Earth's atmosphere moves heat from the sun-drenched tropics all the way to the poles, she says, an atmosphere on a TRAPPIST-1 planet could moderate the temperatures between the day and night sides. The details depend entirely on how thick the atmosphere is and what it's made of. "Look at the conditions between Mars, Earth, and Venus, and Saturn's moon Titan," Zalucha says. "They all have completely different atmospheres and completely different conditions." As scientists begin to acquire data about atmospheres on any or all of these seven planets, "that's going to change what we think the temperature is at the surface, and whether or not there could be liquid water and, therefore, life," she says. At least three of the seven known TRAPPIST-1 planets are in the "Goldilocks zone," known to scientists as the circumstellar habitable zone, where conditions are neither too cold nor too hot for liquid water – that is, where oceans will neither boil off nor freeze solid, but stay warm and inviting places where life could, theoretically, arise. But everything we know about life comes from our warm planet, where we experience 365 days – and nights – every year. Without that, it's hard to imagine what life would look like.
"Crops and everything would develop differently without the diurnal cycle," says Jessie Christiansen, an astronomer at the NASA Exoplanet Science Institute at the California Institute of Technology. "From an anthropocentric point of view, we tend to imagine life near the terminators," she tells the Monitor. (The "terminator" is the boundary between day and night, which in these tidally locked planets would be an unmoving zone existing in a constant state of twilight.) But we may have parallels on Earth, Dr. Christiansen notes. "If you think about life in the deep ocean, it has evolved without a true diurnal cycle." Exobiologists, the scientists who theorize about life on other planets, have investigated Earth's sea floors, icy islands, deep caves, hot springs, and other extreme settings in hopes of broadening their understanding of what organisms need to live. "We continue to be surprised by life on Earth," says SETI's Zalucha, "bacteria that survives in 130-degree caves, or things at the bottom of the ocean where there's no light." She adds, "It's not a hopeless case as to where life could be on these planets."

