News Article | October 28, 2016
Leading online higher education resource provider AffordableCollegesOnline.org has released its rankings of the Best Online Colleges in Massachusetts for 2016-2017. Separate lists were created for two- and four-year schools, with the following receiving top honors: the University of Massachusetts' Lowell, Amherst and Dartmouth campuses, Westfield State University and Lesley University among four-year schools; Bunker Hill Community College, Holyoke Community College, Middlesex Community College, Massasoit Community College and Greenfield Community College among two-year schools. "The Massachusetts Department of Education has been steadily working on initiatives to ramp up college completion numbers by 2025," said Dan Schuessler, CEO and Founder of AffordableCollegesOnline.org. "These colleges are examples of how higher education in Massachusetts is becoming more flexible, offering affordable, top-quality online learning programs to help more students earn degrees." To earn a spot on the Best Online Colleges in Massachusetts list, schools are required to meet several stringent base qualifications. Each institution must be an accredited, public or private not-for-profit college or university. Schools must also fall within specific affordability guidelines, offering in-state tuition rates below $5,000 annually at two-year schools and below $25,000 annually at four-year schools. The complete two- and four-year lists are included below.
To learn more about where each school falls in the ranking, and for details about the data analysis and methodology used in scoring, visit the following link. The two-year schools honored as the Best Online Colleges in Massachusetts for 2016 are: Berkshire Community College, Bristol Community College, Bunker Hill Community College, Greenfield Community College, Holyoke Community College, Massachusetts Bay Community College, Massasoit Community College, Middlesex Community College, Northern Essex Community College and Roxbury Community College. The four-year schools honored as the Best Online Colleges in Massachusetts for 2016 are: Fitchburg State University, Framingham State University, Hebrew College, Lesley University, Massachusetts Maritime Academy, National Graduate School of Quality Management, Salem State University, University of Massachusetts - Amherst, University of Massachusetts - Boston, University of Massachusetts - Dartmouth, University of Massachusetts - Lowell and Westfield State University. AffordableCollegesOnline.org began in 2011 to provide quality data and information about pursuing an affordable higher education. Our free community resource materials and tools span topics such as financial aid and college savings, opportunities for veterans and people with disabilities, and online learning resources. We feature higher education institutions that have developed online learning environments that include highly trained faculty, new technology and resources, and online support services to help students achieve educational and career success. We have been featured by nearly 1,100 postsecondary institutions and nearly 120 government organizations.
News Article | February 15, 2017
People who eat a gluten-free diet may be at risk for increased exposure to arsenic and mercury - toxic metals that can lead to cardiovascular disease, cancer and neurological effects, according to a report in the journal Epidemiology. Gluten-free diets have become popular in the U.S., although less than 1 percent of Americans have been diagnosed with celiac disease - an out-of-control immune response to gluten, a protein found in wheat, rye and barley. A gluten-free diet is recommended for people with celiac disease, but others often say they prefer eating gluten-free because it reduces inflammation - a claim that has not been scientifically proven. In 2015, one-quarter of Americans reported eating gluten-free, a 67 percent increase from 2013. Gluten-free products often contain rice flour as a substitute for wheat. Rice is known to bioaccumulate certain toxic metals, including arsenic and mercury from fertilizers, soil or water, but little is known about the health effects of diets high in rice content. Maria Argos, assistant professor of epidemiology in the UIC School of Public Health, and her colleagues looked at data from the National Health and Nutrition Examination Survey, searching for a link between a gluten-free diet and biomarkers of toxic metals in blood and urine. They found 73 participants who reported eating a gluten-free diet among the 7,471 who completed the survey between 2009 and 2014. Participants ranged in age from 6 to 80 years old. People who reported eating gluten-free had higher concentrations of arsenic in their urine, and mercury in their blood, than those who did not. Arsenic levels were almost twice as high for people eating a gluten-free diet, and mercury levels were 70 percent higher. "These results indicate that there could be unintended consequences of eating a gluten-free diet," Argos said.
"But until we perform the studies to determine if there are corresponding health consequences that could be related to higher levels of exposure to arsenic and mercury by eating gluten-free, more research is needed before we can determine whether this diet poses a significant health risk." "In Europe, there are regulations for food-based arsenic exposure, and perhaps that is something we here in the United States need to consider," Argos said. "We regulate levels of arsenic in water, but if rice flour consumption increases the risk for exposure to arsenic, it would make sense to regulate the metal in foods as well." Catherine Bulka of UIC; Matthew Davis of the University of Michigan; Margaret Karagas of Dartmouth College; and Habibul Ahsan of the University of Chicago are co-authors on the paper. This research was supported by National Institutes of Health grants R01 ES024423, R21 ES024834, R01 CA107431, P42 ES010349 and T32 HL125294.
News Article | December 13, 2016
Arctic climate change went into overdrive in 2016, with record high temperatures eclipsing anything seen since at least 1900. Accompanying the record warmth were freak losses of sea ice, melting of the Greenland Ice Sheet and a host of cascading feedbacks that scientists have been warning about for years. The record warm year reflects both human-caused global warming, which is hitting the Arctic harder than almost any other region on Earth, and natural weather variability that pumped even more heat into the region this year. Scientists detailed the changes on Tuesday with the release of an annual peer-reviewed report that amounts to a physical exam of the Arctic environment, known as the "Arctic Report Card." Due to human-caused global warming and natural weather patterns that acted to enhance such warming, the region scored an extremely poor grade this year. Don Perovich, a scientist at Dartmouth College who helped prepare the report, said the Arctic is "shouting change" right now, whereas it used to "whisper change" when the first report card was released 11 years ago. At a press conference during the annual meeting of the American Geophysical Union in San Francisco, Perovich said that if he were to give a letter grade to the sea ice cover, it would be a D-plus. His colleagues agreed. “Rarely have we seen the Arctic show a clearer, stronger or more pronounced signal of persistent warming and its cascading effects on the environment than this year,” said Jeremy Mathis, director of NOAA’s Arctic Research Program, in a press release. Mathis said researchers are investigating how much of the Arctic warming had to do with the intense El Niño in 2015, but that untangling that mystery will take several more months. The changes occurring throughout the Arctic, as the region warms at twice the rate of the rest of the planet, are sweeping and vivid.
Average annual air temperatures over Arctic land areas were the highest in the observational record during the year ending in September 2016, a 3.5-degree Celsius, or 6.3-degree Fahrenheit, increase compared to 1900. Record monthly highs were set in January, February, October and November. Such records are especially noteworthy because these are dark months, with little if any sunshine reaching the surface of the Arctic. Seeing such a temperature spike during these months reflects both warm air being transported into the region by weather patterns and feedbacks between missing sea ice, open ocean waters and air temperatures. “Those are the periods when the Arctic should be really cold,” Mathis told Mashable in an interview. “It’s one thing for the summers to be warmer, but this is [a new] trend of the winter months setting record temperatures.” Winter air temperatures set a new record overall, with some areas seeing air temperature anomalies of 8 degrees Celsius, or 14.4 degrees Fahrenheit, above average, the report found. Spring snow cover extent in the North American Arctic reached the lowest level in the satellite record, which began in 1967. Greenland began melting at the second-earliest point on record in the spring season. The proportion of young, thin sea ice has continued to increase compared to older, thicker sea ice that is more resistant to melting. The Arctic sea ice minimum extent from mid-October 2016 to late November 2016 was the lowest since the satellite record began in 1979, and 28 percent less than the 1981-2010 average for October, the report card found. Arctic sea ice is thinning, with multi-year ice now comprising just 22 percent of the ice cover, compared to 78 percent for the more fragile first-year ice. By comparison, multi-year ice made up 45 percent of the ice cover in 1985.
Perovich said sea ice loss is already large enough that it is impacting human activities, with greater access to the Arctic for ships and drilling activities. A massive, 1,000-passenger cruise ship successfully sailed through the Northwest Passage during the summer, for example. This year's report card contains a supplementary report focusing on the extremes observed in the Arctic during the past few months, with record high temperatures across parts of the Arctic preventing sea ice from reforming. From mid-October to the present, the report found, sea ice extent has been the lowest on record since satellite data began in 1979. The report attributed the record warm temperatures to winds transporting mild air masses from the mid-latitudes, with complex interactions taking place between mid-latitude weather patterns and the Arctic. Astonishingly high air and ocean temperatures during November caused sea ice to trail far behind typical levels, ending the month at a record-low extent. Arctic sea ice extent averaged 3.51 million square miles for the month, which was 753,000 square miles below the 1981-2010 average for the period, according to data released Tuesday by the National Snow and Ice Data Center (NSIDC) in Boulder, Colorado. [Chart: Sea ice extent compared to average, with 2016 in blue.] The section of missing ice was about the same size as the entire country of Mexico. Or to put it in terms of U.S. states, the missing ice was greater than Texas, California, Montana and New Mexico combined. Research has shown that there may be connections between Arctic warming and weather in the mid-latitudes, potentially even causing more extreme weather events in the U.S. and Europe.
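The scale of that November deficit can be checked with a few lines of arithmetic on the NSIDC figures quoted above (a rough sketch; the variable names are ours, not NSIDC's):

```python
# Rough sketch of the November 2016 sea-ice anomaly arithmetic,
# using the NSIDC figures quoted in the article.
november_extent = 3.51e6   # square miles, November 2016 average extent
deficit = 753_000          # square miles below the 1981-2010 average
baseline = november_extent + deficit

percent_below = 100 * deficit / baseline
print(f"November extent was {percent_below:.1f}% below the 1981-2010 average")
```

That works out to roughly 17.7 percent below the baseline, consistent with the comparison to Mexico, whose area is on the order of 760,000 square miles.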
Ocean acidification, which is a global concern as more carbon is absorbed into ocean waters, is a greater threat to the Arctic due to the cooler water temperatures and sea ice formation there, scientists found. When carbon dioxide dissolves in the water, a series of chemical reactions break it down in a way that reduces the amount of calcium carbonate minerals available. This process throws off the pH balance in oceans and can make it more difficult for some creatures, such as corals, snails and oysters, to function in a healthy way. Data from the report card indicates that corrosive waters have been extending deeper into the Arctic Basin in recent years, damaging sea life, particularly shellfish. This is only likely to worsen in coming years. In addition, on land, increasing temperatures are causing permanently frozen soils, known as permafrost, to become not-so-permanently frozen soils. This is releasing greenhouse gases to the atmosphere, making parts of the Arctic a net source of global warming pollution. It is thought that the permafrost zone that rings the Arctic from Alaska to northern Canada and around Scandinavia to Siberia contains twice as much organic carbon as is currently in the atmosphere. If all the permafrost were to melt, it would be an utter disaster for the climate. "Overall, tundra appears to be releasing net carbon to the atmosphere," the report states. Outside observers say the 2016 report card is alarming and should spur action. “The 2016 Arctic Report Card further documents the unraveling of the Arctic and the crumbling of the pillars of the global climate system that the Arctic maintains,” said Rafe Pomerance, chair of Arctic 21 and a member of the Polar Research Board of the National Academy of Sciences, in a statement. 
“Governments must urgently work together to establish a future Arctic that minimizes ever greater warming from the loss of sea ice and snow cover and thawing permafrost, and massive sea level rise from the shrinking Greenland ice sheet and other Arctic glaciers.” Scientists involved with the report card said they hope to secure additional funding for observing networks that would provide more real-time data from the Far North. That may be a tough sell with the incoming Trump administration, given its strong climate denial bent.
News Article | December 22, 2015
Simple shell of plant virus sparks immune response against cancer: Mice tumor-free and protected from metastases after treatment
Abstract: The shells of a common plant virus, inhaled into a lung tumor or injected into ovarian, colon or breast tumors, not only triggered the immune system in mice to wipe out the tumors, but provided systemic protection against metastases, researchers from Case Western Reserve University and Dartmouth College report. The scientists tested a 100-year-old idea called in-situ vaccination. The idea is to put something inside a tumor and disrupt the environment that suppresses the immune system, thus allowing the natural defense system to attack the malignancy. That something--the hard coating of cowpea mosaic virus--caused no detectable side effects, which are a common problem with traditional therapies and some immunotherapies. The team's research is published in the journal Nature Nanotechnology. "The cowpea virus-based nanoparticles act like a switch that turns on the immune system to recognize and fight against the tumor - as well as to remember it," said Nicole Steinmetz, an assistant professor of biomedical engineering at Case Western Reserve, appointed by the Case Western Reserve School of Medicine. "The particles are shockingly potent," said Steven Fiering, professor of microbiology and immunology at Dartmouth's Geisel School of Medicine. "They're easy to make and don't need to carry antigens, drugs or other immunostimulatory agents on their surface or inside." The professors studied the nanoparticles with Dartmouth's Pat Lizotte, a molecular and cellular biology PhD student; Mee Rie Sheen, a postdoctoral fellow; Pakdee Rojanasopondist, an undergraduate student; and Case Western Reserve's Amy Wen, a biomedical engineering PhD student.
Taking another shot
The immune system's ability to detect and destroy abnormal cells is thought to prevent many cancers, according to the National Cancer Institute.
But when tumors start to develop, they can shut down the system, allowing tumors to grow and spread. To restart immune defenses, the scientists used the tumor itself as if it were the antigen in a vaccine--that is, the target for antibodies produced by the immune system. The cowpea virus shell, with its infectious components removed, acts as the adjuvant--a substance that triggers and may enhance or prolong antigen-specific immune responses.
The process and results
The researchers first switched on the immune system in mice to attack B16F10 melanoma in the lung or skin, leaving the mice tumor-free. When the treated mice were later injected with B16F10 skin melanoma (to re-challenge the cured mice), four out of five mice were soon cancer-free and one had a slow-growing tumor. The nanoparticles also proved effective against ovarian, breast and colon tumor models. Most of the tumors deteriorated from the center and collapsed. The systemic response prevented or attacked metastatic disease, which is the deadliest form of cancer. "You get benefits against disease you don't even know is there yet," Fiering said. "Because everything we do is local, the side effects are limited," despite the strength and extent of the immune response, Fiering said. No toxicity was found. Harsh side effects, such as fatigue, pain and flu-like symptoms, are common with chemo and radiation therapies and with some immunostimulation drugs. The researchers are now trying to understand how the virus shell stimulates the immune system. "It's not cytotoxic, there's no RNA involved or lipopolysaccharides that may be used as adjuvants, and it's not simply an irritant," Steinmetz said. "We see a specific immune response." Unlike most other adjuvants, Fiering said, the virus shells stimulate neutrophils, a type of white blood cell. What role that plays is not yet known.
The researchers are seeking grants to study whether the shell's physical traits or something virus-specific causes the immune response. They are also seeking grants to test the therapy in animal models with immune systems closer to those of humans. If the virus shell continues to prove effective, the researchers believe it could eventually be used in combination with other therapies tailored to individual patients.
News Article | October 28, 2016
The immortal story of Frankenstein was conceived 200 years ago by the author Mary Shelley, as a fireside tale during a strange, cold summer. Over the two ensuing centuries, the popular conception of Shelley’s sophisticated science-fiction and horror novel has come to be dominated by the image of Boris Karloff with bolts protruding from his neck. But now a new study evaluates a scientific underpinning of the original novel: a concept of extinction grasped by the 20-year-old Mary Shelley decades before Charles Darwin published his theory of evolution. “We conclude… that the central horror of Mary Shelley’s novel lies in its prescient command of foundational concepts in ecology and evolution,” the authors write. The paper, “Frankenstein and the Horrors of Competitive Exclusion,” was published in the journal BioScience today. At a crucial turning point in the novel, the hapless Victor Frankenstein is convinced by the Creature to make a female companion for his creation, who has lived a lonely existence in the wild parts of Europe for three years. The Creature, 8 feet tall and precocious in learning multiple languages and understanding human society, begs for the companion. The Creature assures his creator that the couple will move to the “vast wilds of South America,” to be left alone and eat a vegetarian diet. But just as the doctor is about to finish creating the female companion for his Creature, he destroys it, leading to the climax of the novel as the enraged Creature exacts his revenge. Nathaniel Dominy of Dartmouth College and Justin Yeakel of the University of California, Merced, contend in their new paper that Victor Frankenstein’s impulsive destruction of the uncompleted companion was actually an act of salvation – for humankind. The concept of the Creature and his descendants inheriting the Earth was far beyond the scope of most biologists in 1818, the year of the novel’s publication, Dominy and Yeakel argue.
“The principle of competitive exclusion was not formally defined until the 1930s,” Dominy said. “Given Shelley’s early command of this foundational concept, we used computational tools developed by ecologists to explore if, and how quickly, an expanding population of creatures would drive humans to extinction.” The pair worked out the mathematics of population growth if the Creature had been given a mate and allowed to reproduce in the wilds of South America. The Creature and his ilk would have several key advantages, including the fact that his resurrected flesh apparently has resistance to tissue death, giving him a correspondingly lower death rate. The worst-case scenario indicates the Creature and his heirs would take over the world within about 4,100 years, they concluded. “Our study adds to Mary Shelley’s legacy, by showing that her science fiction accurately anticipated fundamental concepts in ecology and evolution by many decades,” added Yeakel. “To date, most scholars have focused on (her) knowledge of then-prevailing views on alchemy, physiology and resurrection; however, the genius of Mary Shelley lies in how she combined and repackaged existing scientific debates to invent the genre of science fiction.” The weather of the summer in which Shelley’s tale was conceived has a complex scientific story of its own. The year 1816 is often referred to as “The Year Without A Summer,” after the eruption of the volcano Mount Tambora in April 1815 enveloped the world with millions of tons of ash and gas. The phenomenon cooled the entire globe and thrust the climate into a temporary period of chaos. The darkness and cold of the year 1816 inspired Mary Shelley and her husband, the Romantic poet Percy Bysshe Shelley, along with a coterie of writers including Lord Byron and the physician John Polidori, to huddle close to the fire at the Villa Diodati on Lake Geneva and tell each other ghost stories.
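The paper's actual model and parameters are not given in the article, but the competitive-exclusion dynamic Dominy and Yeakel describe can be illustrated with a toy two-species simulation in which both populations share one carrying capacity and the creatures' hardier flesh gives them a lower death rate (all rates and starting values below are invented for illustration, not taken from the paper):

```python
# Toy competitive-exclusion model: two populations draw on one shared
# carrying capacity; the creatures' lower death rate is their only
# advantage. All parameters are illustrative.
def step(humans, creatures, capacity=1.0, dt=0.01,
         h_birth=0.5, h_death=0.1, c_birth=0.5, c_death=0.05):
    crowding = 1 - (humans + creatures) / capacity  # shared resource limit
    humans += dt * humans * (h_birth * crowding - h_death)
    creatures += dt * creatures * (c_birth * crowding - c_death)
    return humans, creatures

humans, creatures = 0.5, 0.001  # creatures start as a tiny founding population
for _ in range(200_000):        # integrate far into the future
    humans, creatures = step(humans, creatures)
```

With identical birth rates, the species that can persist on scarcer resources (here, the one with the lower death rate) eventually excludes the other: run long enough, the human population collapses toward zero while the creatures settle near their own equilibrium, the qualitative outcome the paper's worst-case scenario quantifies.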
Polidori’s The Vampyre (1819) is another milestone genre work to come from the campfire storytelling sessions of that summer.
News Article | October 26, 2016
More than half of breast cancers newly diagnosed in the United States are likely cases of mistaken identity that subject women to needless anxiety, treatment and expense, researchers reported Wednesday in the New England Journal of Medicine. The study also found that the value of mammograms as a life-saving tool has been significantly overstated. Instead, the introduction of more effective treatments should get most of the credit for improving survival rates among women diagnosed with breast cancer, the researchers concluded. The findings cast fresh doubt on the value of universal breast cancer screening for women over 40 with no family history of the disease. They also underscore that breast cancer — the most common form of cancer among American women — is a far more complex disease than initially believed. The hope that early detection would consistently save women’s lives accorded with scientists’ limited understanding of cancer in the mid-1970s. Experts believed that a small breast lump was almost always a harbinger of a tumor that would, with time, grow and spread. Catch and treat it early, their reasoning went, and you will see one less woman coming in later with a large and aggressive cancer. But medical researchers have come to recognize that a tumor’s genetic make-up, as well as the interaction between tumor and host, are better predictors of its progression than the tumor’s size upon discovery. One woman’s tumor might reach 2 centimeters and then stop growing for many years. Another’s might progress from undetectable to a dangerous 5 centimeters in a matter of months. It was a new, more complex picture of breast cancer. And it undercut the idea that early detection and early treatment were essential to save lives. “The mantras, ‘All cancers are life-threatening’ and ‘When in doubt, cut it out,’ require revision,” Dr. Joann G. Elmore, a physician and epidemiologist at the University of Washington, wrote in an editorial that accompanies the study. 
The “well intentioned efforts” of doctors, she wrote, are exacting “collateral damage.” As breast imaging became widely available in the early 1980s, physicians told women that catching tumors early, before they could be felt by hand, was key to their survival. Advocates quickly began pushing for universal screening programs. By the mid-1980s, an American Cancer Society awareness campaign told women over 35, “If you haven’t had a mammogram, you need more than your breasts examined.” It’s now clear that physicians, activists and the media “quite simply have overstated the value” of mammography, said study leader Dr. H. Gilbert Welch, one of the first researchers to raise questions about overscreening. Whether motivated by true belief, commercial gain or fear of litigation, he said, those forces have been slow to accept that when all women get mammograms, some will respond to scary findings in ways that do more harm than good. In 2016, physicians in the United States are expected to diagnose 246,660 new cases of invasive breast cancer, along with 61,000 new cases of non-invasive breast cancer (sometimes referred to as ductal carcinoma in situ, or DCIS). The analysis of data from the National Cancer Institute suggests that the majority of abnormalities picked up by screening mammograms would likely never become deadly if left alone. Still, patients and their physicians routinely attack small lumps with biopsies, diagnostic work-ups and treatments that can be risky and debilitating. Welch, who teaches community and family medicine at Dartmouth College’s Geisel School of Medicine, and his team tallied the number of breast cancer findings and the size of the tumors found in women over 40 who were diagnosed with breast cancer between 1975 and 1979, before screening mammography became widely available. They compared those figures with breast cancer findings between 2000 and 2002, when screening was widespread.
For both groups, they tracked how women were treated and whether they were still alive 10 years after diagnosis. The team observed that as more women got routine mammograms, more breast cancers were diagnosed. The additional cancers tended to be smaller, or to be confined to spaces, such as milk ducts, where they had not invaded normal tissue. If catching tumors while they were still small were a way of nipping large, aggressive tumors in the bud, then widespread screening should have reduced the number of large tumors discovered on mammograms. But the rate at which large and aggressive tumors were found remained “essentially unchanged” between 1975 and 2010, the researchers found. “The introduction of screening mammography has produced a mixture of effects,” the authors explained. To a modest extent, mammography screening was having the desired effect of finding dangerous tumors before they had grown large. For those women — estimated to be about 20% of those whose small tumors were detected by screening mammography — early treatment was potentially life-saving. But the other 80% of women likely would not have died of breast cancer had their tumors never been detected in the first place. Ironically, as mammography became more widespread and technically better, screening was doing a better and better job of finding these harmless tumors: while lumps smaller than 2 centimeters represented 37% of mammogram-detected abnormalities in the early years of the study, they represented 67% of a much larger pool by 2010. By comparing changes in mortality rates over time for women diagnosed with tumors of various sizes, the researchers calculated that improvements in breast cancer therapy were responsible for at least two-thirds of the reduction in deaths, according to the study. Dr. Michael LeFevre, a University of Missouri physician who was not involved in the new study, said that while the findings offer only rough estimates of mammograms’ harms, they help counter a powerful narrative about routine breast cancer screening for all women.
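The 20/80 split described above implies a simple back-of-the-envelope tally (the cohort size here is a hypothetical round number; only the fractions come from the study as quoted in the article):

```python
# Back-of-the-envelope sketch of the overdiagnosis estimate described
# above. Only the 20% benefit fraction comes from the article; the
# cohort size is a hypothetical round number.
screen_detected_small_tumors = 1_000
benefit_fraction = 0.20  # early treatment potentially life-saving

potentially_life_saving = round(screen_detected_small_tumors * benefit_fraction)
likely_overdiagnosed = screen_detected_small_tumors - potentially_life_saving
print(potentially_life_saving, likely_overdiagnosed)  # prints: 200 800
```

On these illustrative numbers, for every 200 women whose screen-detected small tumor was worth treating early, roughly 800 were treated for tumors that likely never would have killed them.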
News Article | February 27, 2017
On a Thursday segment of Fox News' "The O'Reilly Factor," host Bill O'Reilly moderated a debate over crime and immigration in Sweden. On one side of the issue was Swedish newspaper reporter Anne-Sofie Naslund, who argued against the notion that immigration was making her country dangerous. On the other side was a man named Nils Bildt, who was identified onscreen and verbally as a "Swedish defense and national security advisor." Mr. Bildt argued that immigration had led to considerable social problems in Sweden, and said that the country's liberal leanings meant that people who shared his opinion on the subject were being "viewed as outsiders." But it turns out that Bildt may be even more of an outsider than the show seemed to indicate. Officials in the Swedish Defense Ministry and Foreign Office said that they had never heard of Bildt, and that he was not associated with any part of Sweden's government. "We have no spokesman by that name," Marie Pisäter of the Ministry of Defense told the Swedish newspaper Dagens Nyheter. The segment was the second time in under a week that a strange political happenstance in the United States left many Swedish officials scratching their heads. The first was a comment by President Trump during a rally in Florida which seemed to imply that a terrorist attack or immigrant-related incident had happened the previous day in Sweden, even though nothing of the sort had been reported. "We've got to keep our country safe. You look at what's happening in Germany, you look at what's happening last night in Sweden," Mr. Trump said at the time. "Sweden, who would believe this?" Bildt echoed many of Trump's worries about immigration policy during the Thursday segment, speaking about crime and "socially deviant activity" in Swedish cities related to the influx of migrants into the country.
Bildt's appearance comes at a time of increased scrutiny of the "fake news" phenomenon on both sides of the political aisle, in which inaccurate or outright false information is spread by media outlets with potentially negative consequences, particularly on the internet. While there is no evidence to suggest the Thursday misidentification of Bildt was an attempt to mislead viewers in any way, many experts have expressed concern over the tendency for people to accept falsehoods based on their preconceived biases. As the Christian Science Monitor's Story Hinckley reported in December: “This is exactly what psychology literature on the topic would say. We don’t want to feel uncomfortable, so we expose ourselves to select information so we feel good about ourselves,” says Claire Wardle, research director at First Draft, a nonprofit that advocates for truth in the digital sphere. “When people on Facebook write ‘My sources say…’ it proves they are not looking for objective truth in the middle. Sources that are ‘mine’ will give me information to reaffirm what I already think.” Critics blame the bait-and-click revenue system of today’s news for blurring the line between real and fake news. They also point to Facebook for allowing these stories to trend on news feeds around the world, misinforming voters at a crucial time, though both Facebook and Google now say they are taking steps to address the issue. “The problem is that we are too credulous of news that reinforces our predispositions and too critical of sites that contradict them,” says Brendan Nyhan, a political scientist at Dartmouth College in Hanover, N.H. According to Dagens Nyheter, Nils Bildt was born Nils Tolling; he emigrated to the US in 1994 and changed his name in the early 2000s. Tolling was allegedly convicted of assaulting a law enforcement official, public inebriation, and obstruction of justice while living in Virginia, and was sentenced to a year in jail in 2014.
"Had I spent a year in prison, I would think I would remember it," Bildt told The Washington Post. Responding to the controversy, Bildt said that he had made it clear that he was a US-based, independent analyst, and that Fox News had chosen the "Swedish defense and national security advisor" descriptor. "Sorry for any confusion caused, but needless to say I think that is not really the issue," Bildt said in a statement to the website Mediaite. "The issue is Swedish refusal to discuss their social problems and issues."
News Article | December 15, 2016
They argued about moon-plasma interactions, joked about polar bears, and waxed nostalgic for sturdy sea ice. But few of the 20,000 Earth and climate scientists meeting in San Francisco this week had much to say about the president-elect, Donald Trump – though his incoming administration loomed over much of the conference.

For some, the annual meeting of the American Geophysical Union (AGU) – the largest Earth and space science gathering in the world – was a call to action. California’s governor, Jerry Brown, addressed the scientists on Wednesday morning, telling them, “the time has never been more urgent or your work ever more important. The danger is definitely rising.”

Citing financial inequality, the risks of nuclear arms and the mounting effects of climate change, Brown said, “we’re facing far more than one or two or even thousands of politicians.

“We’re facing big oil, we’re facing big financial structures that are at odds with the survivability of our world. It’s up to you as truth tellers, truth seekers, to mobilize all your efforts to fight back.”

Brown compared the struggle to the campaigns waged by the tobacco industry, noting that health science and the law eventually curtailed its power. “Some people need a heart attack to stop smoking,” he said. “Maybe we just got our heart attack.” “This is not a battle of one day or one election,” he added, calling scientists “foot soldiers” for truth.

The governor promised to help lead the campaign, daring Trump to shut down climate science satellites and mocking Rick Perry, his pick for secretary of energy. “If Trump turns off the satellites, California will launch its own damn satellites,” Brown said. “Rick, I’ve got some news for you: California’s growing a hell of a lot faster than Texas. And we’ve got more sun than you’ve got oil.”

Not far away, a team of lawyers with the Climate Science Legal Defense Fund met with scientists to discuss the threats ahead.
The group helps defend scientists from harassment and suits over climate change research, most prominently a case brought by a climate change-denying organization to obtain the emails and research of scientist Michael Mann. At the AGU table, attorneys handed out guides to “handling political harassment and legal intimidation”.

Some scientists are taking action on their own, including Eric Holthaus, a meteorologist who has started one of several “guerrilla archiving” efforts to preserve public climate data on non-government servers. Holthaus and others fear that a Trump administration could take down the data, much as former Canadian prime minister Stephen Harper's government tried to silence scientists.

Sally Jewell, the outgoing secretary of the interior, tried to reassure scientists that the Trump administration could not quickly gut federal research. Science would be “foundational” to government, she told attendees: “We have a president-elect that likes to win, and we can’t win without science.” Jewell argued that scientists should stress the benefits of science to industry, saying they should start speaking “in the language of business, perhaps, to translate” for the Trump administration.

Many federal researchers had already begun to speak in those terms. Scott Hagen, presenting on the dangers of rising tides to Louisiana, said the state could face losses of up to $280m in agricultural lands. Jennifer Francis, a Rutgers professor, said the extreme heat in the Arctic would almost certainly contribute to extreme weather around the world next year. Marco Tedesco, presenting on the “Arctic report card 2016” – a year of record lows – noted that changes in snowfall patterns would affect hydropower and freshwater resources. “Snow melting sooner and faster is leaving a drier soil exposed to a drier summer,” he said.
“You might have more drought, might have more forest fires, ramifications for economy, population.”

Thomas Zurbuchen, the head of Nasa’s science mission directorate, stressed that scientists should “behave like scientists” and “focus on the data that we get and not amplify the noise”. He too drew a link between science and business: “What you’re carrying around in your pocket is a lot of space data that’s doing work for you.” Jewell urged scientists to “speak up for scientific integrity”, and tried to assure them that pro-science policies “are not going to be easy to undo”.

But uncertainty and anxiety reigned, for the future of research and the planet. “Trump’s leadership will have a chilling effect on environmental and science policy no matter what,” said one climate scientist, who asked for anonymity for fear of the work being politicized. “What worries me most is that this administration might launch a fundamental attack on the scientific process.” The best-case scenario, the scientist said, would resemble the Bush administration, in which “leadership doesn’t care much about what the climate scientists say, but they continue to support funding for research”. A worst-case scenario would be “an effort to undermine the scientific infrastructure of the country”. Agencies could radically restructure their staff, for instance, shuffling scientists to unappealing projects while Republicans in Congress slash budgets and Trump reneges on world climate talks. Charles Kennel, a former Nasa official, said that a US withdrawal for several years would be “serious but not fatal”.

Other scientists stressed that the world is already changing in dramatic, unpredictable ways. Donald Perovich, a professor at Dartmouth College, said that the Arctic in 2016 looked “as though part of the United States has melted”, a region comparable to all the states east of the Mississippi plus the midwest and North Dakota.
When the researchers began doing a yearly report card in the mid-2000s, Perovich said, “you kind of had to listen closely, because the Arctic was whispering change.” “Now it’s not whispering anymore,” he said. “It’s shouting.”
News Article | November 25, 2016
Glioblastomas are among the deadliest cancers. Diagnosis with the brain tumors means a median survival time of just 14 months – even with the full range of surgery, radiation and chemotherapy. But two disruptions in the tumor DNA show a bit of how the cancer hijacks healthy biology – and the clues could provide a blueprint for future treatments, according to a new Dartmouth College study in the journal Nature Communications today.

Brain tumor tissue from thirty patients yielded complete genomes. Two DNA modifications made by the cancer appear to be integral to its spread and its attack on healthy tissue. The marks, 5-methylcytosine (5mC) and 5-hydroxymethylcytosine (5hmC), were assessed using parallel processing of DNA with bisulfite and oxidative bisulfite treatments – and then analyzed with a statistical algorithm to understand their effects. The researchers found that patients with more 5hmC had better survival times. The link between decreasing 5hmC levels and increasing tumor aggressiveness may be explained by the tumor hijacking normal transcription, they find.

“Our results demonstrate that the glioblastoma genome exhibits a global loss of 5hmC compared with healthy prefrontal cortex tissues, but observed regions of conserved 5-hmC implying novel associations between 5-hmC and critical tumor transcriptional programs,” they write.

Understanding the epigenetic changes wrought by the tumor could improve prognosis and, eventually, treatments, according to Brock Christensen, lead author and an associate professor of epidemiology at Dartmouth’s Geisel School of Medicine. “Together, our work reveals more about the powerful influence of the epigenome in cancer and highlights the distinct functional role of 5hmC,” he said.
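The parallel bisulfite/oxidative-bisulfite design rests on simple arithmetic: standard bisulfite treatment reads 5mC and 5hmC together, while oxidative bisulfite reads 5mC alone, so subtracting the two estimates 5hmC at each site. The sketch below illustrates only that core subtraction, not the study's actual statistical algorithm; all names and values are illustrative.

```python
# Sketch: estimating 5hmC from paired bisulfite (BS) and oxidative
# bisulfite (oxBS) methylation proportions at individual CpG sites.
# BS signal ~ 5mC + 5hmC; oxBS signal ~ 5mC alone, so the
# difference approximates the 5hmC fraction.

def estimate_5hmc(beta_bs, beta_oxbs):
    """Per-site 5hmC estimates, clamped at zero because measurement
    noise can make the raw BS - oxBS difference slightly negative."""
    return [round(max(0.0, bs - oxbs), 3)
            for bs, oxbs in zip(beta_bs, beta_oxbs)]

# Illustrative values for three CpG sites (not real data):
bs_signal   = [0.80, 0.55, 0.30]   # 5mC + 5hmC
oxbs_signal = [0.70, 0.56, 0.10]   # 5mC only
print(estimate_5hmc(bs_signal, oxbs_signal))  # -> [0.1, 0.0, 0.2]
```

In practice, published oxBS analyses fit these paired signals with statistical models rather than taking raw differences, but the subtraction above is the quantity being estimated.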
News Article | April 9, 2016
Love it or hate it, there’s no question that street art has become a firmly entrenched aspect of urban cultures around the world. Styles range from hastily scrawled obscenities and tags to stencils and wildstyle, but as any veteran street artist is wont to tell you, the larger the piece, the more technical expertise is required. The reasons that requisite skill scales up with the size of a street mural are manifold: some are intensely practical (larger pieces are harder to hide from the law) while others are simply a matter of artistic vision (proportion, color and other aspects of painting become more difficult when you can only look at small parts of the piece at a time). Yet thanks to the work of a team of researchers from Dartmouth College, massive photorealistic spray paint murals can now be done by anyone outfitted with their new high-tech, motion-sensing spray paint can.

"Typically, computationally-assisted painting methods are restricted to the computer," said Wojciech Jarosz, an assistant professor of computer science at Dartmouth. "In this research, we show that by combining computer graphics and computer vision techniques, we can bring such assistance technology to the physical world even for this very traditional painting medium, creating a somewhat unconventional form of digital fabrication.”

As the team details in a study published this week in Computers & Graphics, their spray paint can allows a painter to reproduce a photo as a spray mural with staggering accuracy—with little to no skill required on the part of the painter. To begin, the researchers selected a photo they wanted writ large on a canvas or wall and uploaded it to a computer. Next, they took a normal spray paint can and outfitted its nozzle with a QR-coded cube to track the can’s motion and an actuation device that controls the amount of paint released from the can.
Using two webcams to track the can’s motion relative to the wall, an algorithm designed by the researchers ‘told’ the can how much paint to release as the painter waved the can around in front of the canvas. Although computer-aided painting dates back to Desmond Paul Henry’s Drawing Machine in 1962, prior to the Dartmouth team’s invention, none of the research in this field had allowed non-artists to reproduce images at this scale. Moreover, the researchers hope that as their design becomes more sophisticated, it can be used to recreate images on more complicated, curved surfaces.

Paramount to the researchers, however, was maintaining the integrity of the art form despite relinquishing much of the artistic control to a computer algorithm.

[Image: Counter-clockwise: input image, computer-simulated painting, and the actual painting done by the smart spray can. Credit: Wojciech Jarosz]

“Our assistive approach is like a modern take on 'paint by numbers' for spray painting,” said Jarosz. “Most importantly, we wanted to maintain the aesthetic aspects of physical spray painting and the tactile experience of holding and waving a physical spray can while enabling unskilled users to create a physical piece of art."
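The control loop described above can be sketched in a few lines: compare how dark the target image is at the can's current tracked position with how much paint has already been deposited there, and open the valve only when more is needed. This is a simplified illustration under stated assumptions, not the Dartmouth team's published algorithm; the function names, threshold, and deposit amount are all invented for the example, and the webcam tracking is abstracted away as a given (x, y) position.

```python
# Sketch of a feedback loop for a motion-tracked spray can, using a
# single grayscale channel where 1.0 means full paint coverage.
# The real system tracks the can with two webcams; here the tracked
# position is simply passed in. All names/values are illustrative.

def should_spray(target, painted, x, y, threshold=0.1):
    """Trigger the actuator when the target image calls for more
    paint at (x, y) than has been deposited so far."""
    needed = target[y][x] - painted[y][x]
    return needed > threshold

def step(target, painted, x, y, deposit=0.2):
    """One control iteration: spray if needed, update the simulated
    canvas, and report whether paint was released."""
    if should_spray(target, painted, x, y):
        painted[y][x] = min(1.0, painted[y][x] + deposit)
        return True
    return False

# Tiny 2x2 example: the left column wants paint, the right does not.
target  = [[1.0, 0.0], [0.5, 0.0]]
painted = [[0.0, 0.0], [0.0, 0.0]]
print(step(target, painted, 0, 0))  # sprays at the dark pixel
print(step(target, painted, 1, 0))  # no paint needed here
```

Because the painter, not the computer, moves the can, a loop like this preserves the gesture of spray painting while the actuator handles only the release decision, which matches the "paint by numbers" framing Jarosz describes.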