News Article | May 17, 2017
PRINCETON, N.J., May 17, 2017 /PRNewswire/ -- Tickets are now on sale for all 22 performances in the 13th season of The Princeton Festival (www.princetonfestival.org), which runs June 3-25. The offerings cover a broad range of genres and styles, from Baroque concerts played on period...
News Article | May 4, 2017
PRINCETON, N.J. -- Scientists and policymakers use measurements like global warming potential to compare how different greenhouse gases, such as carbon dioxide and methane, contribute to climate change. Yet, despite its widespread use, global warming potential fails to provide an accurate picture of how greenhouse gases affect the environment in the short and long term, according to a team of researchers from Princeton University, the Environmental Defense Fund and Harvard University.

The researchers argue in the May 5 issue of Science that because global warming potential calculates the warming effects of greenhouse gases over 100 years, it discounts the effects of any greenhouse gas that disappears from the atmosphere after a decade or two. This masks the trade-offs between short- and long-term policies at the heart of today's political and ethical debates.

What is needed, the researchers conclude, is a standardized approach that recognizes both commonly used timescales -- 20 and 100 years -- as a standard pair. This two-valued approach would bring clarity to climate change policy analyses, which often devolve into misleading debates about policy trade-offs.

"Different gases have widely different lifetimes in the atmosphere after emission and affect the climate in different ways over widely different timescales," said co-author Michael Oppenheimer, the Albert G. Milbank Professor of Geosciences and International Affairs, Woodrow Wilson School of Public and International Affairs and the Department of Geosciences at Princeton University. "The paired approach creates a more comprehensive picture of the nature of climate change and the effects of various policies to stem its consequences."

While most reports reference only one of these metrics -- most measure the effects over 100 years -- a standardized approach including both should become the norm to avoid skewing results.
For example, recent studies show that anti-shale gas advocacy groups base their arguments on the 20-year time horizon, while the pro-shale gas community emphasizes the 100-year timescale; both metrics are needed to truly understand the short- and long-term impacts of shale gas on the environment.

The researchers liken the 20- and 100-year timescales to city-highway vehicle fuel-efficiency ratings. Car dealerships advertise miles per gallon for both highway and city driving, providing buyers with figures relevant to different roadways. The dual-number system also enables buyers to calculate an average. Another example is blood pressure, which is measured with two numbers, systolic and diastolic. The first number (systolic) measures the pressure in the blood vessels as the heart beats. The second number (diastolic) measures the pressure in the blood vessels when the heart rests between beats. Together, the numbers reveal whether a person has normal blood pressure, like 120 over 80, or is at risk of pre-hypertension or high blood pressure.

While the researchers advocate using both the 20- and 100-year timescales (rather than one or the other), they do not advocate a change in time horizons. Both timescales are already the default in climate change policy, and shifting to new horizons would likely meet much resistance.

"It is imperative that both the near- and long-term climate impacts of policies be transparent to a decisionmaker," said lead author Ilissa B. Ocko of the Environmental Defense Fund. "We are not saying that one timescale is more important than the other, just that the decisionmaker must be fully informed of climate impacts on all timescales."

The widespread adoption of this combined measure would require communication and coordination between key scientific journals and scientific societies. Groups dealing with climate change -- the Intergovernmental Panel on Climate Change, the U.S.
Environmental Protection Agency, the U.N. Environment Programme and the United Nations Framework Convention on Climate Change -- would also need to adopt the new measure in their reports.

The proposal was written by lead authors Ilissa B. Ocko, Steven P. Hamburg and Nathaniel O. Keohane of the Environmental Defense Fund. Co-authors include, from Princeton University, Michael Oppenheimer, the Albert G. Milbank Professor of Geosciences and International Affairs, Woodrow Wilson School of Public and International Affairs and the Department of Geosciences; and Stephen W. Pacala, the Frederick D. Petrie Professor in Ecology and Evolutionary Biology, Department of Ecology and Evolutionary Biology. Additional authors include David W. Keith, Joseph D. Roy-Mayhew and Daniel P. Schrag of Harvard University.

The paper, "Unmask temporal trade-offs in climate policy debates," will be published May 5 in Science. The work was partially funded by the Robertson Foundation, the Kravis Scientific Research Fund and the High Meadows Foundation.
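The paired-horizon reporting the researchers propose comes down to simple arithmetic: the same emission is expressed in carbon dioxide equivalents twice, once per timescale. The sketch below is an illustration, not code from the paper; the methane values are rounded figures in the spirit of the IPCC's Fifth Assessment Report (roughly 84 over 20 years and 28 over 100 years), and the function names are invented for this example.

```python
# Illustrative GWP values (assumed, rounded AR5-style figures, not from the paper).
# CO2 is the reference gas, so its value is 1 at every horizon.
GWP = {
    "co2":     {20: 1, 100: 1},
    "methane": {20: 84, 100: 28},
}

def co2_equivalent(gas, tonnes, horizon):
    """CO2-equivalent tonnes for `tonnes` of `gas` over `horizon` years."""
    return tonnes * GWP[gas][horizon]

def paired_report(gas, tonnes):
    """Report both standard horizons side by side, as the authors propose."""
    return {horizon: co2_equivalent(gas, tonnes, horizon) for horizon in (20, 100)}

print(paired_report("methane", 10))  # {20: 840, 100: 280}
```

The point of the paired output is visible immediately: 10 tonnes of methane looks three times more consequential on the 20-year horizon than on the 100-year one, which is exactly the trade-off a single-number report hides.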
News Article | July 31, 2017
PRINCETON, N.J. -- When the Supreme Court issued its 2015 ruling in favor of same-sex marriage, Americans understood the decision as a signal of the nation's increasing support for same-sex marriage, according to a study published by Princeton University. The researchers found that, regardless of political ideology, non-LGBTQIA Americans perceived stronger and increasing public support for gay marriage in the wake of the Court's ruling than before the decision. This was in spite of the fact that personal attitudes and feelings toward gay marriage did not change in reaction to the decision.

Published in Psychological Science, the research, which features a time-series survey of norms and attitudes toward gay marriage, showed that, regardless of how someone feels about an issue, a Court decision can alter perceptions of the prevailing social norms -- opinions or behaviors accepted by a group of people -- around the issue. Trusted institutions like the Supreme Court are seen to represent societal collectives and, as such, their decisions may be perceived as a signal of where the public stands and where it is headed.

"What we observed was a shift in perceived norms, or the perception of public support for gay marriage," said study co-author Elizabeth Levy Paluck, professor of psychology and public affairs at Princeton's Woodrow Wilson School of Public and International Affairs. "That shift matters because we know from decades of research in psychology that people's behavior is often guided by their understanding of what others around them are doing and thinking."

In addition to Paluck, the study was conducted by lead author Margaret Tankard, associate behavioral and social scientist at the RAND Corporation. In the months before and after the Supreme Court decision, 1,063 participants were surveyed several times.
Just after the ruling, the researchers observed a significant jump in participants' belief that Americans support same-sex marriage, and in their belief that support would keep growing in the future. This uptick in perceptions of supportive social norms persisted weeks later.

The findings were also supported in an experimental study with 1,673 participants, who were told prior to the Supreme Court decision that experts predicted either a favorable or an unfavorable ruling on the legality of gay marriage. Here too, the researchers found that participants who were led to believe that the Supreme Court would rule in favor of gay marriage estimated higher public support for gay marriage, compared to participants who read the opposite.

The researchers' findings could be particularly timely given the recent announcement that the Court will hear Masterpiece Cakeshop v. Colorado Civil Rights Commission, the appeal of a state decision to uphold discrimination charges brought by a gay couple against a bakery that refused to make a cake for the couple's same-sex wedding.

The paper, "The effect of a Supreme Court decision regarding gay marriage on social norms and personal attitudes," was published online July 31 in Psychological Science. This research was made possible in part by funding from the Canadian Institute for Advanced Research and Princeton University.
News Article | July 24, 2017
PRINCETON, N.J. -- Being exposed to and actively remembering violent episodes -- even those that happened up to a decade before -- hinders short-term memory and cognitive control, according to a study published in the Proceedings of the National Academy of Sciences (PNAS). The study, which was co-authored by Princeton University's Pietro Ortoleva, examined more than 500 civilians in Colombia, a country that has experienced both urban violence and rural warfare within the past two decades. The findings demonstrate the long-lasting effects of violence on cognition and memory recall and highlight the need for policies that provide proper therapy for those coping with violence.

"Memory and cognitive control impact how people do in school, how they perform at work and if they can keep their jobs, and how they fare in life in general, which all have significant impacts on the economy as a whole," said Ortoleva, a professor of economics and public affairs at Princeton's Woodrow Wilson School of Public and International Affairs.

In addition to Ortoleva, who conducted the work as a faculty member at Columbia University, the research team included Francesco Bogliacino of the Universidad Nacional de Colombia, and Gianluca Grimalda and Patrick Ring, both of the Kiel Institute for the World Economy.

To study the effects of violence on both short-term memory and cognitive control, the research team conducted experiments on two groups: one from an urban setting and the other from a rural area. The urban group consisted of residents of Bogotá, where violence and crime are widespread. Those surveyed were between the ages of 18 and 24, and came from all but two of Bogotá's 19 districts. This age group was chosen because young people typically have not moved away from the neighborhood in which they grew up. The rural group consisted of civilians displaced by war, who experienced armed conflicts up to 10 or 20 years earlier.
Many of these people were forced to abandon their homes and move elsewhere under the threat of massacres that paramilitary groups had carried out in the same region just months earlier.

"We studied both short-term memory and cognitive control because they are important determinants for individual well-being and societal development," Ortoleva said. "Stronger short-term memory is positively associated with school attainment, job performance, and with lower probability of contracting Alzheimer's disease and post-traumatic stress disorder. Weaker cognitive control among children has been shown to lead to issues with physical health, higher mortality rates, lower personal wealth and criminal offenses 30 years later."

In the first experiment, the researchers asked a randomly chosen subset of the urban group to recall an event from the past year that caused anxiety or fear, explicitly hinting at violence as a possible cause of such an emotional state. These individuals reported different types of violence, including armed assault or witnessing murder. The remaining participants were asked to recall a joyful experience or a generic experience devoid of emotion.

Participants were then asked to recall a sequence of geometrical figures to test short-term memory -- their ability to store information after recalling such episodes. Those in the group who were exposed to serious violence and recalled such an event performed poorly on this test. For those who were not exposed to serious violence, or were not asked to recall a violent event, no effect was seen.

The researchers repeated a similar experiment with the rural group. These individuals also reported different types of extreme violence, including experiencing rape or witnessing murder. In addition to the short-term memory test, these participants were also given a cognitive control test to assess their ability to inhibit immediate, instinctive responses.
Participants were given numerical sequences in which a digit from one to four appeared one to four times and were asked to state the number of times the digit appeared.

"A person's first instinct is typically to say the digit that appears -- not the number of times a digit appears. If they exercise cognitive control, however, they can recognize the question is about the latter," Ortoleva said. "This was a challenge for the participants who were exposed to violence and asked to remember such traumatic episodes."

The rural group showed effects on the memory test similar to those of the urban group, and the same pattern held for cognitive control: those who were both exposed to a violent event and asked to recall it performed more poorly than those who were not.

"We used these distinct groups to compare the short- and long-term effects of violence and the impact of different types of violence: warfare in rural areas versus ordinary criminality in urban areas," Ortoleva said. "Regardless of place or type of violence, if subjects were exposed to high levels and were asked to recall it, poorer performance in our tests is seen."

"Our results demonstrate that exposure to violence can have effects on cognitive functions," said Ortoleva. "Besides the obvious negative effects on physical and psychological well-being, this may lead to vicious cycles: both poverty and violence hinder the ability of a person or a group to develop, which in turn may generate further poverty and violence. The study has broad implications, especially for the ongoing peace process in Colombia."

Susan Fiske, Eugene Higgins Professor at Princeton's Woodrow Wilson School, edited the paper for PNAS and emphasized its importance in relation to war, crime and violence. "This innovative research suggests a cognitive mechanism for how exposure to violence spills over into daily life later," Fiske said.
"Interventions should fight the downstream cognitive effects of experiencing war, terrorism, and neighborhood violence."

The paper, "Exposure to and recall of violence reduce short-term memory and cognitive control," will be published online July 24 in PNAS. This work was supported by Open Evidence Grant 008-Tierra-Colombia; Fundación Universitaria Konrad Lorenz Internal Grant 7INV3131; Universitat Jaume I Grant P1.1B2015-48; Spanish Ministry of Economics and Competitivity Grant ECO2015-68469-R; and Fondazione Franceschi.
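The cognitive-control task described above (a digit from one to four shown one to four times, with the correct answer being the count, not the digit) can be sketched in a few lines. This is a hypothetical reconstruction for illustration only; the stimulus format, function names and scoring are assumptions, not details taken from the paper.

```python
import random

def make_trial(rng):
    """Generate one trial: a digit 1-4 repeated 1-4 times.

    The correct response is how many times the digit appears; the
    prepotent (instinctive) response is the digit itself.
    """
    digit = rng.randint(1, 4)   # which digit is shown
    count = rng.randint(1, 4)   # how many times it is shown
    stimulus = " ".join(str(digit) for _ in range(count))
    return stimulus, count      # e.g. ("2 2 2", 3): instinct says 2, answer is 3

def score(responses, answers):
    """Fraction of trials answered correctly."""
    return sum(r == a for r, a in zip(responses, answers)) / len(answers)

rng = random.Random(7)
stimulus, answer = make_trial(rng)
print(stimulus, "-> correct answer:", answer)
```

Trials where the digit and the count differ are the interference trials: answering them correctly requires inhibiting the automatic reading response, which is exactly the capacity the study found impaired after recalling violence.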
News Article | July 18, 2017
PRINCETON, N.J. -- Certain occupations may significantly contribute to mobility problems as workers age, deepening income-based disparities in disability, a study co-authored by Princeton University's Noreen Goldman finds. The research, conducted by Goldman with Hiram Beltrán-Sánchez and Anne Pebley of the University of California, Los Angeles, shows this to be especially true in middle-income countries like Mexico, where occupational health and safety regulations are weak or were enacted comparatively recently.

Poor people are considerably more likely to suffer from limited mobility as they age. Previous research has demonstrated a relationship between disability among older adults and factors that poor people experience disproportionately, such as childhood adversity, lack of access to health care, harmful behaviors, unhealthy and unsafe neighborhoods, and stress both on and off the job, Goldman said. But disadvantaged people are also more likely to hold jobs that involve heavy physical labor, repetitive movement, strain on the body and safety hazards.

"This is particularly true in low- and middle-income countries where workers with little education and low income often face multiple health conditions that lead to higher disability over their lifetimes," Beltrán-Sánchez said.

The researchers sought to determine whether people's primary occupations contribute to the unequal distribution of disability among older adults. They focused on Mexico, Goldman said, because it is a middle-income country with a diverse economy that, like many other countries, has a rapidly growing older population. "With growing segments of the population experiencing functional limitations and disability, the country is facing a serious health policy challenge," said Goldman.
Moreover, in countries like Mexico that have weaker occupational safety and health regulations, physical job demands are typically harsher and a higher proportion of people work in the unregulated informal sector of the economy. Beltrán-Sánchez, Pebley and Goldman analyzed responses from the Mexican Health and Aging Study, which began surveying about 15,000 older Mexican citizens in 2001. After filtering out survey respondents who didn't meet their criteria, they developed a nationally representative sample of more than 12,000 people aged 50 or older in all 32 Mexican states, about 54 percent of whom are women and 46 percent men. Survey respondents were asked to name the main job that they'd held during their lives. The researchers organized the jobs into categories with similar physical demands, then looked for associations between those categories and mobility limitations among people over age 50. Two job categories, they found, were most strongly associated with limited mobility: domestic workers and food/beverage/tobacco workers, both of which had an average of more than three mobility limitations. Workers in a larger group of categories -- including agricultural laborers and repair and maintenance workers, among others -- averaged two to three physical limitations. Workers in less physically demanding jobs that are typically held by people higher on the socioeconomic spectrum, such as managerial positions, had fewer limitations on average. But because poor people are generally less healthy than wealthier people, the question is whether certain job categories are associated with disability among older adults simply because they are held by poor people, or whether the jobs themselves are contributing to the disability. In other words, if wealthier people worked in similarly physically demanding jobs, would they also have more functional limitations as they age? 
To find out, Beltrán-Sánchez, Pebley and Goldman controlled for socioeconomic status (measured by education and wealth) in their analyses. They found that in fact, with controls for socioeconomic status, job categories were still associated with mobility limitations as people grew older. Nevertheless, physically demanding jobs are overwhelmingly held by poorer people, compounding the problem of health disparities. Thus, Pebley said, "For policy makers in Mexico and elsewhere concerned about social inequality in health, our results suggest that a greater focus on improving conditions in the workplace could be a cost-effective investment for reducing mobility limitations among older adults." The study appears as Hiram Beltrán-Sánchez, Anne Pebley and Noreen Goldman, "Links Between Primary Occupation and Functional Limitations Among Older Adults in Mexico" in SSM -- Population Health, Vol. 3 (December 2017), pp. 382-392. It was published online in April 2017. The work was supported by grants from the National Institute on Aging (R01AG052030), a pilot grant from the USC-UCLA Center on Biodemography and Population Health (P30AG017265), and grants from the Eunice Kennedy Shriver National Institute of Child Health and Human Development to the California Center for Population Research at UCLA (P2CHD041022) and to the Office of Population Research at Princeton University (P2CHD047879).
News Article | June 28, 2017
PRINCETON, N.J. -- Despite the relatively large number of employees working in downtown Detroit, the city continues to be afflicted by urban blight, surrounded by a swath of vacant neighborhoods. Changing this pervasive phenomenon has been at the forefront for developers, city officials and groups like Detroit Future City, an initiative with a strategic vision for the city's future.

Debuting a new economic model, a team of researchers from Princeton University and the Federal Reserve Bank of Richmond has identified 22 neighborhoods that, if developed, could bring in millions of dollars in residential and business rents while attracting thousands of new residents to the city. These neighborhoods -- which include Rosa Parks, Lower Woodward and Middle East Central, among others -- differ significantly from those targeted by Detroit Future City, which mostly focuses on neighborhoods close to downtown. The results are published as a working paper by the National Bureau of Economic Research (NBER).

"Reviving Detroit requires coordination and buy-in from multiple developers, residents, and city governments. You can't think small scale on this," said co-author Esteban Rossi-Hansberg, Theodore A. Wells '29 Professor of Economics at Princeton University's Woodrow Wilson School of Public and International Affairs. "Our analysis shows there are mutual gains to be had by all parties. The gains would also be distributed across the city and beyond its boundaries, so coordination between counties is crucial."

In addition to Rossi-Hansberg, the model was designed and evaluated by Raymond Owens and Pierre-Daniel Sarte, both of the Federal Reserve Bank of Richmond. The economists focused on Detroit given its significant evolution from a world-famous city to a "hollow shell" over the past 100 years.
The researchers constructed their model around what they call "development guarantees" -- buy-in from the government or private institutions that guarantees a certain level of development in a particular area. The model accounts for businesses moving into the area, the location itself, and workers' willingness to commute to that area. The ideal development guarantee, the researchers argued, is one that ensures developers are willing to build in areas where people would be willing to live. The policy would help trigger the entry of private developers and, if successful, would require no actual outlay from the guarantor. On the flip side, if the guarantee isn't large enough, it could lead to undesirable outcomes -- like the city having to buy properties that developers aren't able to sell.

A similar but alternative proposal was advanced by Detroit Future City, whose strategic framework lays out a desired image of what the city should look like 10, 20 and 50 years into the future. Detroit Future City's most ambitious proposal involves 22 tracts, but the proposal was never quantified -- until now. The researchers quantify the gains and losses of alternative plans by Detroit Future City and others by modeling the employment decisions of firms, the location and commuting decisions of workers, and the decision of developers to enter particular neighborhoods.

The Princeton-Federal Reserve researchers tested their new model on the city of Detroit and all surrounding counties and identified 52 census tracts (neighborhoods) across the city that are currently mostly vacant and generally in bad shape. Between the researchers' model and Detroit Future City's plan, only 11 of the 22 neighborhoods are shared.
One of the biggest differences is that the Detroit Future City proposal focuses on developing the areas closest to the downtown core, while the researchers' plan -- which they call the "Best 22 Residential Plan" -- covers some of the same areas but also areas in a wider outer ring. Although both policies promise gains, the difference can amount to several tens of millions of dollars and many fewer new residents.

The results in the study are based on a particular model of the city of Detroit and, as with any policy evaluation, depend on a number of assumptions that are described in detail in the NBER working paper, the researchers noted. The study, "Rethinking Detroit," was published by NBER as a working paper and was not peer-reviewed or subject to the review by the NBER Board of Directors that accompanies official NBER publications.
News Article | May 25, 2017
PRINCETON, N.J.--The U.S. Nuclear Regulatory Commission (NRC) relied on faulty analysis to justify its refusal to adopt a critical measure for protecting Americans from the occurrence of a catastrophic nuclear-waste fire at any one of dozens of reactor sites around the country, according to an article in the May 26 issue of Science magazine. Fallout from such a fire could be considerably larger than the radioactive emissions from the 2011 Fukushima accident in Japan. Published by researchers from Princeton University and the Union of Concerned Scientists, the article argues that NRC inaction leaves the public at high risk from fires in spent-nuclear-fuel cooling pools at reactor sites. The pools -- water-filled basins that store and cool used radioactive fuel rods -- are so densely packed with nuclear waste that a fire could release enough radioactive material to contaminate an area twice the size of New Jersey. On average, radioactivity from such an accident could force approximately 8 million people to relocate and result in $2 trillion in damages. These catastrophic consequences, which could be triggered by a large earthquake or a terrorist attack, could be largely avoided by regulatory measures that the NRC refuses to implement. Using a biased regulatory analysis, the agency excluded the possibility of an act of terrorism as well as the potential for damage from a fire beyond 50 miles of a plant. Failing to account for these and other factors led the NRC to significantly underestimate the destruction such a disaster could cause. "The NRC has been pressured by the nuclear industry, directly and through Congress, to low-ball the potential consequences of a fire because of concerns that increased costs could result in shutting down more nuclear power plants," said paper co-author Frank von Hippel, a senior research physicist at Princeton's Program on Science and Global Security (SGS), based at the Woodrow Wilson School of Public and International Affairs. 
"Unfortunately, if there is no public outcry about this dangerous situation, the NRC will continue to bend to the industry's wishes." Von Hippel's co-authors are Michael Schoeppner, a former postdoctoral researcher at Princeton's SGS, and Edwin Lyman, a senior scientist at the Union of Concerned Scientists. Spent-fuel pools were brought into the spotlight following the March 2011 nuclear disaster in Fukushima, Japan. A 9.0-magnitude earthquake caused a tsunami that struck the Fukushima Daiichi nuclear power plant, disabling the electrical systems necessary for cooling the reactor cores. This led to core meltdowns at three of the six reactors at the facility, hydrogen explosions, and a release of radioactive material. "The Fukushima accident could have been a hundred times worse had there been a loss of the water covering the spent fuel in pools associated with each reactor," von Hippel said. "That almost happened at Fukushima in Unit 4." In the aftermath of the Fukushima disaster, the NRC considered proposals for new safety requirements at U.S. plants. One was a measure prohibiting plant owners from densely packing spent-fuel pools, requiring them to expedite transfer of all spent fuel that has cooled in pools for at least five years to dry storage casks, which are inherently safer. Densely packed pools are highly vulnerable to catching fire and releasing huge amounts of radioactive material into the atmosphere. The NRC analysis found that a fire in a spent-fuel pool at an average nuclear reactor site would cause $125 billion in damages, while expedited transfer of spent fuel to dry casks could reduce radioactive releases from pool fires by 99 percent. However, the agency decided the possibility of such a fire is so unlikely that it could not justify requiring plant owners to pay the estimated cost of $50 million per pool. The NRC cost-benefit analysis assumed there would be no consequences from radioactive contamination beyond 50 miles from a fire. 
It also assumed that all contaminated areas could be effectively cleaned up within a year. Both of these assumptions are inconsistent with experience after the Chernobyl and Fukushima accidents.

In two previous articles, von Hippel and Schoeppner released figures that correct for these and other errors and omissions. They found that millions of residents in surrounding communities would have to relocate for years, resulting in total damages of $2 trillion -- nearly 20 times the NRC's result. Considering the nuclear industry is legally liable for only $13.6 billion, thanks to the Price-Anderson Act of 1957, U.S. taxpayers would have to cover the remaining costs.

The authors point out that if the NRC does not take action to reduce this danger, Congress has the authority to fix the problem. Moreover, the authors suggest that states that provide subsidies to uneconomical nuclear reactors within their borders could also play a constructive role by making those subsidies available only to plants that agree to carry out expedited transfer of spent fuel.

"In far too many instances, the NRC has used flawed analysis to justify inaction, leaving millions of Americans at risk of a radiological release that could contaminate their homes and destroy their livelihoods," said Lyman. "It is time for the NRC to employ sound science and common-sense policy judgments in its decision-making process."

The paper, "Nuclear safety regulation in the post-Fukushima era," was published May 26 in Science. For more information, see von Hippel and Schoeppner's previous papers, "Reducing the Danger from Fires in Spent Fuel Pools" and "Economic Losses From a Fire in a Dense-Packed U.S. Spent Fuel Pool," published in Science & Global Security in 2016 and 2017, respectively. The Science article builds upon the findings of a congressionally mandated review by the National Academy of Sciences, on which von Hippel served.
News Article | June 28, 2017
PRINCETON, N.J. -- Exposure to lead in the preschool years significantly increases the chance that children will be suspended or incarcerated during their school careers, according to research at Princeton University and Brown University. Conversely, a drop in exposure leads to less antisocial behavior and thus may well be a significant factor behind the drop in crime over the past few decades. Given that children who are suspended or incarcerated are more likely to be involved in crime as adults, the finding supports the hypothesis that falling crime rates over the past few decades were caused largely by a sharp decline in childhood lead exposure. Lead was banned from house paint in 1978, and leaded gasoline was phased out between 1979 and 1986.

People exposed to lead as young children (from 0 to 6 years old) are more likely to exhibit poor thinking skills and impulse control, to have trouble paying attention, and to behave aggressively. These traits can lead to antisocial or criminal behavior in adults. Studies seeking links between adult crime and early childhood lead exposure have suggested that the drop in lead exposure could explain up to 90 percent of the sharp downward trend in U.S. crime that started in the mid-1990s.

But other explanations have also been proposed. For example, said Princeton's Janet Currie -- the Henry Putnam Professor of Economics and Public Affairs -- falling crime rates have been tied to increased availability of abortions, improved policing, the growth of the prison population, and the waning of the crack-cocaine epidemic. Because these phenomena all occurred around the same time, it can be hard to distinguish their effects from one another.

The researchers sought to identify lead exposure's effect on school disciplinary problems and juvenile incarceration, which could shed light on whether the decrease in lead exposure was in fact a contributing factor to the decline in the crime rate.
Currie and Anna Aizer, a professor of economics and public policy at Brown who did postdoctoral work at Princeton's Center for Health and Wellbeing (CHW), based their study on data covering about 120,000 children born in Rhode Island. The study appeared as a working paper on the National Bureau of Economic Research website. "Rhode Island is an ideal place to study the effects of lead because of the state's aggressive lead screening program," Currie said. Nearly three-quarters of Rhode Island children have been screened at least once by the time they reach 18 months, far above the national average; by age 6, children in the study had been screened an average of three times. The state's expansive screening program conferred two advantages for the study, noted Currie, who is also chair of Princeton's Department of Economics and co-director of the CHW. First, because so many children in Rhode Island were screened, including many who showed no obvious signs of lead exposure, the sample included a large percentage of children with low blood lead levels for comparison. Second, because so many children received multiple screenings, the researchers could partially compensate for an inherent limitation of blood tests for lead: lead doesn't stay in children's bloodstreams for long before it is deposited in organs such as the brain, so multiple screenings increase the chances of detecting exposure. The researchers examined children born between 1990, shortly after leaded gasoline was phased out, and 2014. They accessed Rhode Island Department of Health blood lead tests for preschool children conducted from 1994 to 2014, and linked those records to school suspension records beginning in the 2007-08 school year and to juvenile detention records beginning in 2004. Beyond the blood tests, Currie and Aizer were also able to estimate lead exposure by linking their data to records of the children's addresses.
Because it is heavy, lead from vehicle exhaust settled in the soil within 25 to 50 meters of roads, and the busier the road, the more lead accumulated in the surrounding soil. Children living nearby absorbed lead from the soil mostly by inhaling it. In 1990, soil lead levels near busy roads were still high; by 2014, they had fallen to the lower levels found elsewhere. Geographic information allowed the researchers to construct a measure of "average traffic," and thus potential lead exposure, near each child's home as families moved from place to place over time. With such a large sample and multiple types of linked data, Currie and Aizer were able to control for a number of factors that could have led to under- or overestimating lead's influence on school suspension and juvenile incarceration. In the end, they found that lead exposure had a powerful effect. A one-unit increase in blood lead levels -- measured in micrograms per deciliter, or millionths of a gram per tenth of a liter of blood -- raised the probability that a child would be suspended from school by 6.4 to 9.3 percent. Among boys, a one-unit increase in blood lead levels raised the probability of incarceration by 27 to 74 percent. Because few juveniles, and almost no girls, are ever incarcerated, estimates of lead's effect on incarceration were less precise. "Children who have been suspended are ten times more likely to be involved in criminal activity as adults," Currie said. Moreover, young people who are incarcerated for even a short period are less likely to graduate from high school and more likely to commit crimes as adults. "Our results support the hypothesis that reductions in blood lead levels may have been responsible for a significant part of the observed decrease in antisocial behavior among youths and young adults in recent decades," Currie concluded.
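As a back-of-the-envelope illustration only (not from the study's own analysis), the reported per-unit effect sizes can be applied to a hypothetical baseline suspension rate. The 10 percent baseline and the assumption that the effect compounds multiplicatively with each one-unit increase are illustrative assumptions, not figures from the paper:

```python
# Illustrative sketch: apply the reported 6.4-9.3 percent per-unit effect
# on suspension probability to an assumed (hypothetical) baseline rate.

def suspension_probability(baseline, lead_increase, effect_per_unit):
    """Scale a baseline suspension probability by a per-unit effect.

    baseline: assumed probability of suspension at a reference lead level
    lead_increase: increase in blood lead, in micrograms per deciliter
    effect_per_unit: proportional increase per one-unit rise (0.064-0.093)
    """
    return baseline * (1 + effect_per_unit) ** lead_increase

# Hypothetical 10% baseline and a 2 ug/dL increase, at both ends of the range:
low = suspension_probability(0.10, 2, 0.064)
high = suspension_probability(0.10, 2, 0.093)
print(f"{low:.3f} to {high:.3f}")  # roughly 0.113 to 0.119
```

Whether the reported percentages are relative increases (as assumed here) or percentage-point changes depends on the working paper's specification; the sketch shows only how the arithmetic plays out under the relative reading.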
The study, "Lead and Juvenile Delinquency: New Evidence from Linked Birth, School and Juvenile Detention Records," first appeared on NBER's website in May 2017. As a working paper, it was not peer-reviewed or subject to the review by the NBER Board of Directors that accompanies official NBER publications.
Hiszpanski A.M., Princeton |
Energy and Environmental Science | Year: 2014
The morphology of thin films of molecular and polymeric semiconductors, which is structurally complex and heterogeneous across multiple length scales, is known to significantly affect device performance. Yet controlling the film structure is challenging, typically requiring chemical modification of the organic semiconductors, the substrates, or the conditions under which the films are formed. Post-deposition processing offers an opportunity to decouple film formation from structural development, providing greater control over molecular ordering in organic semiconductor thin films. This review highlights recent advances in post-deposition processing, focusing specifically on methods that control three important aspects of film structure -- the in-plane and out-of-plane molecular orientations and molecular packing -- and correlating these structural changes with device performance in organic thin-film transistors and solar cells. © The Royal Society of Chemistry.
Soft Matter | Year: 2011
Intracellular bodies consisting of dynamic aggregates of concentrated proteins and often RNA are a ubiquitous feature of the cytoplasm and nucleus of living cells. Dozens of different types of protein bodies are involved in diverse physiological processes including ribosome biogenesis, RNA splicing, and cell division. Unlike conventional organelles, they are not defined by an enclosing membrane. Instead, these bodies represent dynamic patterns of locally concentrated macromolecules which turn over on timescales of seconds. Here we discuss recent findings suggesting that intracellular protein bodies are active liquid-like drops that self-assemble within an intrinsically structured cytoplasm. © The Royal Society of Chemistry 2011.