News Article | May 17, 2017
PRINCETON, N.J., May 17, 2017 /PRNewswire/ -- Tickets are now on sale for all 22 performances in the 13th season of The Princeton Festival (www.princetonfestival.org), which runs June 3-25. The offerings cover a broad range of genres and styles, from Baroque concerts played on period...
News Article | May 25, 2017
PRINCETON, N.J.--The U.S. Nuclear Regulatory Commission (NRC) relied on faulty analysis to justify its refusal to adopt a critical measure for protecting Americans from the occurrence of a catastrophic nuclear-waste fire at any one of dozens of reactor sites around the country, according to an article in the May 26 issue of Science magazine. Fallout from such a fire could be considerably larger than the radioactive emissions from the 2011 Fukushima accident in Japan. Published by researchers from Princeton University and the Union of Concerned Scientists, the article argues that NRC inaction leaves the public at high risk from fires in spent-nuclear-fuel cooling pools at reactor sites. The pools -- water-filled basins that store and cool used radioactive fuel rods -- are so densely packed with nuclear waste that a fire could release enough radioactive material to contaminate an area twice the size of New Jersey. On average, radioactivity from such an accident could force approximately 8 million people to relocate and result in $2 trillion in damages. These catastrophic consequences, which could be triggered by a large earthquake or a terrorist attack, could be largely avoided by regulatory measures that the NRC refuses to implement. Using a biased regulatory analysis, the agency excluded the possibility of an act of terrorism as well as the potential for damage from a fire beyond 50 miles of a plant. Failing to account for these and other factors led the NRC to significantly underestimate the destruction such a disaster could cause. "The NRC has been pressured by the nuclear industry, directly and through Congress, to low-ball the potential consequences of a fire because of concerns that increased costs could result in shutting down more nuclear power plants," said paper co-author Frank von Hippel, a senior research physicist at Princeton's Program on Science and Global Security (SGS), based at the Woodrow Wilson School of Public and International Affairs. 
"Unfortunately, if there is no public outcry about this dangerous situation, the NRC will continue to bend to the industry's wishes." Von Hippel's co-authors are Michael Schoeppner, a former postdoctoral researcher at Princeton's SGS, and Edwin Lyman, a senior scientist at the Union of Concerned Scientists. Spent-fuel pools were brought into the spotlight following the March 2011 nuclear disaster in Fukushima, Japan. A 9.0-magnitude earthquake caused a tsunami that struck the Fukushima Daiichi nuclear power plant, disabling the electrical systems necessary for cooling the reactor cores. This led to core meltdowns at three of the six reactors at the facility, hydrogen explosions, and a release of radioactive material. "The Fukushima accident could have been a hundred times worse had there been a loss of the water covering the spent fuel in pools associated with each reactor," von Hippel said. "That almost happened at Fukushima in Unit 4." In the aftermath of the Fukushima disaster, the NRC considered proposals for new safety requirements at U.S. plants. One was a measure prohibiting plant owners from densely packing spent-fuel pools, requiring them to expedite transfer of all spent fuel that has cooled in pools for at least five years to dry storage casks, which are inherently safer. Densely packed pools are highly vulnerable to catching fire and releasing huge amounts of radioactive material into the atmosphere. The NRC analysis found that a fire in a spent-fuel pool at an average nuclear reactor site would cause $125 billion in damages, while expedited transfer of spent fuel to dry casks could reduce radioactive releases from pool fires by 99 percent. However, the agency decided the possibility of such a fire is so unlikely that it could not justify requiring plant owners to pay the estimated cost of $50 million per pool. The NRC cost-benefit analysis assumed there would be no consequences from radioactive contamination beyond 50 miles from a fire. 
It also assumed that all contaminated areas could be effectively cleaned up within a year. Both of these assumptions are inconsistent with experience after the Chernobyl and Fukushima accidents. In two previous articles, von Hippel and Schoeppner released figures that correct for these and other errors and omissions. They found that millions of residents in surrounding communities would have to relocate for years, resulting in total damages of $2 trillion -- nearly 20 times the NRC's result. Because the nuclear industry is legally liable for only $13.6 billion under the Price-Anderson Act of 1957, U.S. taxpayers would have to cover the remaining costs. The authors point out that if the NRC does not take action to reduce this danger, Congress has the authority to fix the problem. Moreover, the authors suggest that states that provide subsidies to uneconomical nuclear reactors within their borders could also play a constructive role by making those subsidies available only for plants that agree to carry out expedited transfer of spent fuel. "In far too many instances, the NRC has used flawed analysis to justify inaction, leaving millions of Americans at risk of a radiological release that could contaminate their homes and destroy their livelihoods," said Lyman. "It is time for the NRC to employ sound science and common-sense policy judgments in its decision-making process." The paper, "Nuclear safety regulation in the post-Fukushima era," was published May 26 in Science. For more information, see von Hippel and Schoeppner's previous papers, "Reducing the Danger from Fires in Spent Fuel Pools" and "Economic Losses From a Fire in a Dense-Packed U.S. Spent Fuel Pool," which were published in Science & Global Security in 2016 and 2017, respectively. The Science article builds upon the findings of a congressionally mandated review by the National Academy of Sciences, on which von Hippel served.
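The dispute described above is, at bottom, a cost-benefit calculation: expected damages avoided versus the cost of mitigation. A minimal sketch of that arithmetic is below. The damage figures ($125 billion vs. $2 trillion), the 99 percent release reduction and the $50 million per-pool cost come from the article; the annual fire probability and remaining plant lifetime are purely illustrative assumptions, not figures from the study.

```python
# Hedged sketch of the spent-fuel-pool cost-benefit dispute.
# Damage figures and mitigation cost are from the article; the fire
# probability and horizon are illustrative assumptions only.

def expected_benefit(p_fire_per_year, damages, horizon_years, release_reduction=0.99):
    """Expected damages avoided by expedited dry-cask transfer."""
    return p_fire_per_year * horizon_years * damages * release_reduction

MITIGATION_COST = 50e6   # $50 million per pool (from the article)
P_FIRE = 5e-6            # fires per reactor-year -- illustrative assumption
HORIZON = 60             # years of remaining operation -- assumption

for label, damages in (("NRC estimate", 125e9), ("corrected estimate", 2e12)):
    benefit = expected_benefit(P_FIRE, damages, HORIZON)
    verdict = "justified" if benefit > MITIGATION_COST else "not justified"
    print(f"{label}: avoided damages ~= ${benefit / 1e6:.0f}M -> mitigation {verdict}")
```

Under these assumptions the NRC's damage figure makes the $50 million transfer cost look unjustified, while the corrected figure reverses the conclusion -- the same sensitivity the authors attribute to the excluded consequence terms.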
News Article | May 4, 2017
PRINCETON, N.J. -- Scientists and policymakers use measurements like global warming potential to compare how varying greenhouse gases, like carbon dioxide and methane, contribute to climate change. Yet, despite its widespread use, global warming potential fails to provide an accurate look at how greenhouse gases affect the environment in the short and long term, according to a team of researchers from Princeton University, the Environmental Defense Fund and Harvard University. The researchers argue in the May 5 issue of Science that because global warming potential is conventionally calculated over a 100-year window, it discounts the effects of any greenhouse gas that disappears from the atmosphere after a decade or two. This masks the trade-offs between short- and long-term policies at the heart of today's political and ethical debates. What is needed, the researchers conclude, is a standardized approach that treats both commonly utilized timescales -- 20 and 100 years -- as a pair. This two-valued approach would provide clarity to climate change policy analyses, which often result in misleading debates about policy trade-offs. "Different gases have widely different lifetimes in the atmosphere after emission and affect the climate in different ways over widely different timescales," said co-author Michael Oppenheimer, the Albert G. Milbank Professor of Geosciences and International Affairs, Woodrow Wilson School of Public and International Affairs and the Department of Geosciences at Princeton University. "The paired approach creates a more comprehensive picture of the nature of climate change and the effects of various policies to stem its consequences." While most reports reference only one of these metrics -- most measure the effects over 100 years -- a standardized approach including both should become the norm to avoid skewing results.
For example, recent studies show anti-shale gas advocacy groups base arguments around the 20-year time horizon, while the pro-shale gas community emphasizes the 100-year timescale, but both metrics are needed to truly understand the short- and long-term impacts shale gas has on the environment. The researchers liken the 20- and 100-year timescales to city-highway vehicle fuel efficiency data. Car dealerships boast about miles per gallon for both highway and city, providing buyers with an analysis relevant to different roadways. The dual-number system also enables buyers to calculate an average. Another example is how blood pressure is measured with two numbers, systolic and diastolic. The first number (systolic) measures the pressure in your blood vessels as the heart beats. The second number (diastolic) measures the pressure in your blood vessels when your heart rests between beats. Together, the numbers reveal whether a person has a normal blood pressure, like 120 over 80, or is at risk of pre-hypertension or high blood pressure. While the researchers advocate using both 20- and 100-year time scales (rather than one or the other), they do not advocate for a change in time horizons. Both the 20- and 100-year time scales are now the default in climate change policy, and shifting to new time horizons would likely be met with much resistance. "It is imperative that both the near- and long-term climate impacts of policies be transparent to a decisionmaker," said lead author Ilissa B. Ocko of the Environmental Defense Fund. "We are not saying that one timescale is more important than the other, just that the decisionmaker must be fully informed of climate impacts on all timescales." The widespread adoption of this improved combined measure would require communication and coordination between key scientific journals and scientific societies. Groups dealing with climate change -- the Intergovernmental Panel on Climate Change, the U.S.
Environmental Protection Agency, the U.N. Environment Programme and the United Nations Framework Convention on Climate Change -- would also need to adopt the new measure in their reports. The proposal was written by lead authors Ilissa B. Ocko, Steven P. Hamburg and Nathaniel O. Keohane from the Environmental Defense Fund. Co-authors include, from Princeton University, Michael Oppenheimer, the Albert G. Milbank Professor of Geosciences and International Affairs, Woodrow Wilson School of Public and International Affairs and the Department of Geosciences; and Stephen W. Pacala, the Frederick D. Petrie Professor in Ecology and Evolutionary Biology, Department of Ecology and Evolutionary Biology. Additional authors include David W. Keith, Joseph D. Roy-Mayhew and Daniel P. Schrag from Harvard University. The paper, "Unmask temporal trade-offs in climate policy debates," will be published May 5 in Science. The work was partially funded by the Robertson Foundation, the Kravis Scientific Research Fund and the High Meadows Foundation.
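The paired-metric proposal above is easy to illustrate with arithmetic. The sketch below reports CO2-equivalent emissions on both horizons for a hypothetical source; the GWP factors for methane are approximate values in the range reported by the IPCC, and the emission quantities are invented for illustration.

```python
# Sketch of the two-valued metric: report CO2-equivalents on BOTH the
# 20- and 100-year horizons. GWP factors are approximate (roughly the
# IPCC AR5 values); the emission scenario is invented.

GWP = {  # CO2-equivalence factor per kg of gas, by time horizon (years)
    "CO2": {20: 1, 100: 1},
    "CH4": {20: 84, 100: 28},   # methane: potent but short-lived
}

def co2_equivalent(emissions_kg, horizon):
    """Total CO2-equivalent emissions for one time horizon."""
    return sum(kg * GWP[gas][horizon] for gas, kg in emissions_kg.items())

emissions = {"CO2": 1000.0, "CH4": 10.0}   # hypothetical source, kg
for horizon in (20, 100):
    print(f"GWP{horizon}: {co2_equivalent(emissions, horizon):.0f} kg CO2e")
```

The same source looks markedly worse on the 20-year horizon than the 100-year one, which is exactly the trade-off the authors argue a single-number metric hides.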
News Article | March 1, 2017
PRINCETON, N.J. -- An influx of pollution from Asia in the western United States and more frequent heat waves in the eastern U.S. are responsible for the persistence of smog in these regions over the past quarter century despite laws curtailing the emission of smog-forming chemicals from tailpipes and factories. The study, led by researchers at Princeton University and the National Oceanic and Atmospheric Administration's Geophysical Fluid Dynamics Laboratory (GFDL), highlights the importance of maintaining domestic emission controls on motor vehicles, power plants and other industries at a time when pollution is increasingly global. Published March 1 in the journal Atmospheric Chemistry and Physics, the study looked at the sources of smog, also known as ground-level ozone, across a period ranging from the 1980s to today. Ground-level ozone, which is distinct from the ozone in the upper atmosphere that protects the planet from ultraviolet radiation, is harmful to human health, exacerbating asthma attacks and causing difficulty breathing. It also harms sensitive trees and crops. Despite a 50 percent cut in smog-forming chemicals such as nitrogen oxides, commonly known as "NOx", over the past 25 years, ozone levels measured in rural areas of the west have actually climbed. And while ozone in the eastern U.S. has decreased overall, the levels can spike during heat waves. The study traced the increase of ozone in the west to the influx of pollution from Asian countries, including China, North and South Korea, Japan, India, and other South Asian countries. Collectively, the region has tripled its emissions of NOx since 1990. In the eastern U.S., meanwhile, heat waves -- which have become more frequent in the past few decades -- trap polluted air in place, leading to temporary escalations in locally produced ozone. 
The study explains why springtime ozone levels measured in Yellowstone National Park and other western parks far from urban areas have climbed over the past quarter century. According to the study, springtime ozone levels in the national parks rose during that period by 5 to 10 parts per billion (ppb), which is significant given that the federal ozone standard is 70 ppb. The influx of pollution from Asia could make it difficult for these areas to comply with the federal ozone standards, according to the study's authors. "Increasing background ozone from rising Asian emissions leaves less room for local production of ozone before the federal standard is violated," said lead author Meiyun Lin, a research scholar in the Program in Atmospheric and Oceanic Sciences at Princeton University and a scientist at GFDL. Lin's co-authors were Larry Horowitz, also of GFDL; Richard Payton and Gail Tonnesen of the U.S. Environmental Protection Agency; and Arlene Fiore of the Lamont-Doherty Earth Observatory and Department of Earth and Environmental Sciences at Columbia University. Using ozone measurements combined with climate models developed at GFDL, the authors identified pollution from Asia as driving the climb in ozone in western U.S. national parks in the spring, when wind and weather patterns push Asian pollution across the Pacific Ocean. In the summer, when these weather patterns subside, ozone levels in national parks are still above what would be expected given U.S. reductions in ozone precursors. While it has been known for over a decade that Asian pollution contributes to ozone levels in the United States, this study is one of the first to quantify the extent to which rising Asian emissions contribute to U.S. ozone, according to Lin. In the eastern United States, where Asian pollution is a minor contributor to smog, NOx emission controls have been successful at reducing ozone levels.
However, periods of extreme heat and drought can trap pollution in the region, making bad ozone days worse. Regional NOx emission reductions alleviated the ozone buildup during the recent heat waves of 2011 and 2012, compared to earlier heat waves such as in 1988 and 1999. As heat waves appear to be on the rise due to global climate change, smog in the eastern U.S. is likely to worsen, according to the study. Climate models such as those developed at GFDL can help researchers predict future levels of smog, enabling cost-benefit analyses for costly pollution control measures. The researchers compared results from a model called GFDL-AM3 to ozone measurements from monitoring stations over the course of the last 35 years, from 1980 to 2014. Prior studies using global models poorly matched the ozone increases measured in western national parks. Lin and co-authors were able to match the measurements by narrowing their analysis to days when the airflow is predominantly from the Pacific Ocean. Modeling the sources of air pollution can help explain where the ozone measured in the national parks is coming from, explained Lin. "The model allows us to divide the observed air pollution into components driven by different sources," she said. The team also looked at other contributors to ground-level ozone, such as global methane from livestock and wildfires. Wildfire emissions contributed less than 10 percent and methane about 15 percent of the western U.S. ozone increase, whereas Asian air pollution contributed as much as 65 percent. These new findings suggest that a global perspective is necessary when designing a strategy to meet U.S. ozone air quality objectives, said Lin. 
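The source attribution quoted above comes down to simple arithmetic once the model has divided the observed pollution into components. The sketch below splits a hypothetical springtime ozone increase using the fractional contributions reported in the study (Asian emissions up to 65 percent, methane about 15 percent, wildfires under 10 percent); the 8 ppb total and the residual "other" category are illustrative, not study values.

```python
# Illustrative decomposition of the western-U.S. springtime ozone rise
# into source components, using the fractional contributions quoted in
# the article. The 8 ppb total is an assumed value within the reported
# 5-10 ppb range; the "other" remainder is just what is left over.

total_increase_ppb = 8.0   # assumed, within the 5-10 ppb range reported
fractions = {"Asian emissions": 0.65, "methane": 0.15, "wildfires": 0.10}
fractions["other"] = 1.0 - sum(fractions.values())

for source, frac in fractions.items():
    print(f"{source}: {frac * total_increase_ppb:.1f} ppb")
```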
The negative effect of imported pollution on the ability of the United States to achieve its air quality goals is not wholly unexpected, according to Owen Cooper, a senior research scientist at the University of Colorado and the NOAA Earth System Research Laboratory, who is familiar with the current study but was not directly involved. "Twenty years ago, scientists first speculated that rising Asian emissions would one day offset some of the United States' domestic ozone reductions," Cooper said. "This study takes advantage of more than 25 years of observations and detailed model hindcasts to comprehensively demonstrate that these early predictions were right."
Hiszpanski A.M., Princeton | Energy and Environmental Science | Year: 2014
The morphology of thin films of molecular and polymeric semiconductors, which is structurally complex and heterogeneous across multiple length scales, is known to significantly affect device performance. Yet controlling the film structure is challenging, typically requiring chemical modification of the organic semiconductors, substrates, or the conditions under which the films are formed. Post-deposition processing offers an opportunity to decouple film formation from structural development, providing greater control over molecular ordering in organic semiconductor thin films. This review highlights recent advances in post-deposition processing, focusing specifically on methods that control three important aspects of the film structure -- the in-plane and out-of-plane molecular orientations and molecular packing -- and correlating these structural changes with device performance in organic thin-film transistors and solar cells. © The Royal Society of Chemistry.
El-Asrag H.A., ANSYS Inc. | Combustion and Flame | Year: 2014
Direct numerical simulations (DNSs) of a stratified flow under HCCI engine-like conditions are performed to investigate the effects of exhaust gas recirculation (EGR) by NOx and of temperature/mixture stratification on autoignition of dimethyl ether (DME) in the negative temperature coefficient (NTC) region. Detailed chemistry for a DME/air mixture with NOx addition is employed and solved by a hybrid multi-time scale (HMTS) algorithm. Three ignition stages are observed. The results show that adding 1000 ppm of NO advances both low- and intermediate-temperature ignition through rapid OH radical pool formation (OH radical concentrations one to two orders of magnitude higher are observed). In addition, NO from EGR was found to change the heat release rates differently at each ignition stage: it mainly increases the low-temperature ignition heat release rate, with minimal effect on the ignition heat release rates at the second and third ignition stages. Sensitivity analysis is performed, and the important reaction pathways for low-temperature chemistry and for ignition enhancement by NO addition are identified. The DNSs of stratified turbulent ignition show that the scales introduced by the mixture and thermal stratifications have a stronger effect on the second- and third-stage ignitions. Compared to homogeneous ignition, stratified ignition shows a similar first autoignition delay time, but about a 19% reduction in the second and third ignition delay times. Stratification, however, results in a lower averaged low-temperature-chemistry (LTC) ignition heat release rate and a higher averaged hot-ignition heat release rate compared to homogeneous ignition. The results also show that molecular transport plays an important role in stratified low-temperature ignition, and that the scalar mixing time scale is strongly affected by local ignition.
Two ignition-kernel propagation modes are observed: a wave-like, low-speed, deflagrative mode (the D-mode) and a spontaneous, high-speed, kinetically driven ignition mode (the S-mode). Three criteria are introduced to distinguish the two modes by different characteristic time scales and Damköhler (Da) number, using a progress variable conditioned by a proper ignition kernel indicator (IKI). The results show that the spontaneous ignition S-mode is characterized by a low scalar dissipation rate, a high flame-front displacement speed, and a high mixing Damköhler number, while the D-mode is characterized by a high scalar dissipation rate, displacement speeds on the order of the laminar flame speed, and a Da number below unity. The proposed criteria are applied at the different ignition stages. © 2013 The Combustion Institute.
Fox J., Princeton | Pach J., City College of New York | Proceedings of the Annual ACM-SIAM Symposium on Discrete Algorithms | Year: 2011
Computing the maximum number of disjoint elements in a collection C of geometric objects is a classical problem in computational geometry with applications ranging from frequency assignment in cellular networks to map labeling in computational cartography. The problem is equivalent to finding the independence number, α(G_C), of the intersection graph G_C of C, obtained by connecting two elements of C with an edge if and only if their intersection is nonempty. This is known to be an NP-hard task even for systems of segments in the plane with at most two different slopes. The best known polynomial time approximation algorithm for systems of arbitrary segments is due to Agarwal and Mustafa, and returns in the worst case an n^(1/2+o(1))-approximation for α. Using extensions of the Lipton-Tarjan separator theorem, we improve this result and present, for every ε > 0, a polynomial time algorithm for computing α(G_C) with approximation ratio at most n^ε. In contrast, for general graphs, for any ε > 0 it is NP-hard to approximate the independence number within a factor of n^(1-ε). We also give a subexponential time exact algorithm for computing the independence number of intersection graphs of arcwise connected sets in the plane.
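A toy instance makes the objects in this abstract concrete. For intervals on a line -- the simplest geometric objects -- the independence number α(G_C) of the intersection graph can be computed exactly by the classic earliest-endpoint greedy, which the brute-force search below confirms on a small example. This sketch does not reproduce the paper's separator-based n^ε-approximation; it only illustrates the problem being approximated. The interval collection is invented.

```python
# Independence number of an interval intersection graph: greedy
# (exact for intervals) checked against exhaustive search.
from itertools import combinations

def alpha_greedy(intervals):
    """Max number of pairwise-disjoint closed intervals (exact for intervals)."""
    count, last_end = 0, float("-inf")
    for start, end in sorted(intervals, key=lambda iv: iv[1]):
        if start > last_end:           # disjoint from everything chosen so far
            count, last_end = count + 1, end
    return count

def alpha_bruteforce(intervals):
    """Exact independence number by exhaustive search -- small inputs only."""
    disjoint = lambda a, b: a[1] < b[0] or b[1] < a[0]
    for r in range(len(intervals), 0, -1):
        for subset in combinations(intervals, r):
            if all(disjoint(a, b) for a, b in combinations(subset, 2)):
                return r
    return 0

C = [(0, 3), (2, 5), (4, 7), (6, 9), (1, 8)]   # invented collection
print(alpha_greedy(C), alpha_bruteforce(C))
```

For general geometric objects no such greedy works, which is why the n^ε-approximation in the abstract is a substantial improvement.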
Krueger A.B., Princeton | Journal of Health Economics | Year: 2013
Most existing work on the demand for health insurance focuses on employees' decisions to enroll in employer-provided plans. Yet any attempt to achieve universal coverage must focus on the uninsured, the vast majority of whom are not offered employer-sponsored insurance. In the summer of 2008, we conducted a survey experiment to assess the willingness to pay for a health plan among a large sample of uninsured Americans. The experiment yields price elasticities of around one, substantially greater than those found in most previous studies. We use these results to estimate coverage expansion under the Affordable Care Act, with and without an individual mandate. We estimate that 35 million uninsured individuals would gain coverage and find limited evidence of adverse selection. © 2013.
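The elasticity estimate in this abstract can be illustrated with a back-of-envelope calculation: respondents see randomized premiums, and an arc (midpoint) elasticity is computed from take-up at two price points. The prices and take-up rates below are invented for illustration; they are not figures from the survey.

```python
# Hedged sketch of a price elasticity of demand estimated from two
# (price, take-up) survey points. All numbers are invented.

def arc_elasticity(p1, q1, p2, q2):
    """Midpoint (arc) price elasticity of demand between two points."""
    dq = (q2 - q1) / ((q1 + q2) / 2)
    dp = (p2 - p1) / ((p1 + p2) / 2)
    return dq / dp

# Hypothetical: at $2000/yr, 60% would enroll; at $3000/yr, 40% would.
e = arc_elasticity(2000, 0.60, 3000, 0.40)
print(f"arc elasticity: {e:.2f}")   # magnitude around one
```

An elasticity with magnitude near one, as the study reports, means a 10 percent premium subsidy would raise take-up among the uninsured by roughly 10 percent.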
Soft Matter | Year: 2011
Intracellular bodies consisting of dynamic aggregates of concentrated proteins and often RNA are a ubiquitous feature of the cytoplasm and nucleus of living cells. Dozens of different types of protein bodies are involved in diverse physiological processes including ribosome biogenesis, RNA splicing, and cell division. Unlike conventional organelles, they are not defined by an enclosing membrane. Instead, these bodies represent dynamic patterns of locally concentrated macromolecules which turn over on timescales of seconds. Here we discuss recent findings suggesting that intracellular protein bodies are active liquid-like drops that self-assemble within an intrinsically structured cytoplasm. © The Royal Society of Chemistry 2011.
News Article | February 16, 2017
PRINCETON, N.J.--Recent polls have shown that many white, working-class people in America feel pushed out by society, a reason why many voted for President Donald Trump. Many of these supporters latched onto misinformation spread online, especially stories that justified their own beliefs. New research may show why so many were willing to believe exaggerated and misleading reports. According to a Princeton University study published in the Journal of Experimental Social Psychology, social exclusion leads to conspiratorial thinking. The two-part analysis -- which did not specifically investigate Trump supporters, but rather two random samples of people -- found that the feelings of despair brought on by social exclusion can cause people to seek meaning in miraculous stories, which may not necessarily be true. Such conspiratorial thinking leads to a dangerous cycle, said co-lead author Alin Coman, assistant professor of psychology and public affairs at Princeton. When those with conspiratorial ideas share their beliefs, it can drive away family and friends, triggering even more exclusion. This may lead them to join conspiracy theory communities where they feel welcome, which in turn will further entrench their beliefs. "Attempting to disrupt this cycle might be the best bet for someone interested in counteracting conspiracy theories at a societal level," Coman said. "Otherwise, communities could become more prone to propagating inaccurate and conspiratorial beliefs." Coman published the study with Damaris Graeupner, a research assistant in Princeton's Department of Psychology. For the first part of the study, they recruited 119 participants through Amazon's Mechanical Turk, a crowdsourcing internet marketplace. Participants engaged in four phases. First, they were asked to write about a recent unpleasant event that involved a close friend.
Next, they were asked to rate the degree to which they felt 14 different emotions, including exclusion, which was the emotion being analyzed. They then were asked to complete a questionnaire that contained 10 statements and rank their agreement or disagreement using a seven-point scale from absolutely untrue to absolutely true. These statements included phrases like "I am seeking a purpose or mission for my life" and "I have discovered a satisfying life purpose." Finally, participants had to indicate the degree to which they endorsed three different conspiratorial beliefs ranging from one (not at all) to seven (extremely). These included the following statements: "Pharmaceutical companies withhold cures for financial reasons"; "Governments use messages below the level of awareness to influence people's decisions"; and "Events in the Bermuda Triangle constitute evidence of paranormal activity." "We chose these particular conspiracy theories for their widespread appeal in the population," Coman said. "These three are, indeed, endorsed by a significant portion of the American population." Analysis of the data confirmed the researchers' hypothesis: social exclusion does lead to superstitious beliefs, and, according to their statistical analyses, the link is likely driven by a search for meaning in everyday experiences. "Those who are excluded may begin to wonder why they're excluded in the first place, causing them to seek meaning in their lives. This may then lead them to endorse certain conspiracy beliefs," Coman said. "When you're included, it doesn't necessarily trigger the same response." In the second part of the study, the researchers wanted to causally determine whether the degree to which someone was socially excluded influenced their conspiratorial beliefs. They recruited 120 participants, all of whom were Princeton University students.
Participants were first asked to write two paragraphs describing themselves, one about "What it means to be me," and another about "The kind of person I want to be." They were told that these paragraphs would be given to two other participants in the room who would then rank whether they'd want to work with them. Each of the three participants was then randomly assigned to the inclusion group (selected for collaboration in a subsequent task), the exclusion group (not selected for collaboration) or the control group (no instructions about selection). This involved deception: the participants did not in fact evaluate one another's self-descriptions, but instead read descriptions created by the researchers. Finally, all participants went through the same four phases as the first study, which measured how social exclusion is linked to acceptance of conspiracy theories. The second study replicated the findings of the first, providing solid experimental evidence that if a person feels excluded, they are more likely to hold conspiratorial beliefs. In terms of policy, the findings highlight the need for inclusion, especially among populations at risk of exclusion. "When developing laws, regulations, policies and programs, policymakers should worry about whether people feel excluded by their enactment," Coman said. "Otherwise, we may create societies that are prone to spreading inaccurate and superstitious beliefs." The paper, "The dark side of meaning-making: How social exclusion leads to superstitious thinking," will be published in the March 2017 print edition of the Journal of Experimental Social Psychology. This research did not receive any specific grant from funding agencies in the public, commercial or not-for-profit sectors.
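The measurement logic of the first study -- exclusion ratings relating to meaning-seeking scores, which in turn relate to conspiracy endorsement -- can be sketched with simple correlations. Every rating below is invented for illustration (the real study used 119 Mechanical Turk participants and formal mediation analysis, not this toy computation).

```python
# Toy illustration of the correlational logic behind the first study:
# exclusion ~ meaning-seeking ~ conspiracy endorsement. All ratings
# (1-7 scales) are invented; this is not the study's data or method.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-participant scores on the three measures.
exclusion  = [2, 6, 4, 7, 1, 5, 3, 6]
meaning    = [2, 5, 4, 6, 1, 5, 3, 7]
conspiracy = [1, 5, 3, 6, 2, 4, 3, 6]

print(f"exclusion ~ meaning-seeking:  r = {pearson(exclusion, meaning):.2f}")
print(f"meaning-seeking ~ conspiracy: r = {pearson(meaning, conspiracy):.2f}")
```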