New York City, NY, United States

Columbia University in the City of New York, or simply Columbia University, is an American private Ivy League research university located in the Morningside Heights neighborhood of Upper Manhattan in New York City. It is the oldest institution of higher learning in the State of New York, the fifth oldest in the United States, and one of the country's nine Colonial Colleges founded before the American Revolution. Today the university operates Columbia Global Centers overseas in Amman, Beijing, Istanbul, Paris, Mumbai, Rio de Janeiro, Santiago, and Nairobi.

The university was founded in 1754 as King's College by royal charter of George II of Great Britain. After the American Revolutionary War, King's College briefly became a state entity and was renamed Columbia College in 1784. The university now operates under a 1787 charter that places the institution under a private board of trustees, and in 1896 it was further renamed Columbia University. That same year, the university's campus was moved from Madison Avenue to its current location in Morningside Heights, where it occupies more than six city blocks, or 32 acres. The university encompasses twenty schools and is affiliated with numerous institutions, including Teachers College, Barnard College, and the Union Theological Seminary, with joint undergraduate programs available through the Jewish Theological Seminary of America as well as the Juilliard School.

Columbia annually administers the Pulitzer Prize. A total of 101 Nobel Prize laureates have been affiliated with the university as students, faculty, or staff, the second most of any institution in the world. Columbia is one of the fourteen founding members of the Association of American Universities and was the first school in the United States to grant the M.D. degree. Notable alumni and former students of the university and its predecessor, King's College, include five Founding Fathers of the United States; nine Justices of the United States Supreme Court; 43 Nobel Prize laureates; 20 living billionaires; 28 Academy Award winners; and 29 heads of state, including three United States Presidents. (Wikipedia)


Patent
Columbia University | Date: 2015-04-24

Active wearable body brace embodiments provide selectable conformation control of the interface between the trunk of a human subject and the brace; this conformation may be varied over time in response to feedback and an updateable prescription. Further active wearable body brace embodiments provide selectable movement control of the interface between the trunk of a human subject and the brace, with up to six degrees of freedom between elements, to allow a controller to implement operations such as muscle challenges, active support, and passive-like support in which actuators mimic springs. Further embodiments may also provide detection functions such as quantifying muscle weakness. The features of these embodiments may be combined in various ways to provide still further embodiments.


Patent
Columbia University | Date: 2015-04-22

A quantitative gait training and/or analysis system includes a pair of footwear modules that may include a shank module and an independent processing module. Each footwear module may have a sole portion, a heel portion, a speaker, a vibrotactile transducer, and a wireless communication module. Sensors may permit the extraction of gait kinematics in real time and enable feedback based on them. Embodiments may store data for later reduction and analysis. Embodiments employing calibration-based estimation of kinematic gait parameters are described.


Patent
New York University and Columbia University | Date: 2015-03-05

The present invention provides a method of determining whether a subject is afflicted with a depressive disorder comprising:


Patent
Columbia University | Date: 2016-10-14

The disclosed subject matter includes optical tomographic systems for acquiring and displaying dynamic data representing changes in a target tissue sample in response to external provocation. For example, the disclosed devices, methods and systems may be used for quantifying dynamic vascular changes caused by imposed blood pressure changes for diagnosing peripheral artery disease.


The present invention provides, inter alia, methods for treating or ameliorating the effects of a disease associated with altered serotonin transporter molecule (SERT) activity in a subject in need thereof. The methods include administering to the subject an effective amount of a 5-HT4 agonist or a pharmaceutically acceptable salt thereof. Pharmaceutical compositions and kits for the same are also provided. The present invention additionally provides methods for treating or ameliorating the effects of elevated serotonin transporter molecule activity in the gastrointestinal tract of a subject in need thereof, and of a gastrointestinal abnormality associated with autism spectrum disorder in a subject in need thereof.


Patent
Columbia University | Date: 2015-03-11

A customized allograft that is bendable and suitable for allografting of an articular joint, including the thumb.


Systems for machine-based rehabilitation of movement disorders, including gait therapy applications, can apply controlled forces to the pelvis and/or other body parts including the knee and ankle joints. Cable-driven systems for gait therapy applications can apply controlled forces, in respective embodiments, to the pelvis alone or to the pelvis, knee, and ankle joints. In further embodiments, systems for gait therapy can be treadmill-based or walker-based. In embodiments, a controlled downforce is applied to the hip with augmentation including supportive forces. In further embodiments, the technology is actuated through cables that provide support and limb-flexing moments with low inertia and friction resistance. In further embodiments, assistance is configured for gait therapy in children. In still further embodiments, methods of rehabilitation and assist-as-needed (AAN) control of the gait therapy systems facilitate a patient's ability to coordinate movement, control balance, build strength, and achieve other beneficial outcomes.


Patent
Columbia University | Date: 2016-09-28

The present invention relates to a skin or surface disinfectant composition with broad spectrum antimicrobial activity comprising one or more essential oils (and/or one or more components thereof) and one or more fruit acids. The compositions of the invention may be used as non-toxic alternatives to conventional disinfectants or may be added to other antimicrobial agents to enhance their activity. The invention provides effective alternatives to harsher products and may be particularly useful in personal care and household products and where exposure of children and/or pets may be a concern.


Patent
Columbia University and Sloan Kettering Institute For Cancer Research | Date: 2016-11-18

This invention provides a compound having the structure:


Patent
Columbia University | Date: 2016-08-26

Techniques to profile a disease or a disorder (e.g., a tumor) based on a protein activity signature are disclosed herein. An example method can include quantitatively measuring the protein activity of a plurality of master regulator proteins in a sample from a disease or disorder, and profiling the disease or disorder from the quantitative protein activity of the master regulator proteins. Also disclosed are methods of identifying a compound or compounds that treat diseases or disorders (e.g., inhibit tumor cell growth).


Methods are described for producing enteroendocrine cells that make and secrete insulin in a mammal by blocking the expression or biological activity of one or more Foxo proteins or biologically active fragments or variants thereof.


This invention provides a nucleotide analogue comprising (i) a base selected from the group consisting of adenine, guanine, cytosine, thymine and uracil, (ii) a deoxyribose, (iii) an allyl moiety bound to the 3'-oxygen of the deoxyribose and (iv) a fluorophore bound to the base via an allyl linker, and methods of nucleic acid sequencing employing the nucleotide analogue.


Methods, pharmaceutical formulations and medicaments for treating prostate cancer, or preventing the progression of a nonaggressive form of prostate cancer to an aggressive form, in a mammal include a therapeutically effective amount of one or more active agents that reduce the expression or biological activity of both Forkhead box protein M1 (FOXM1) and Centromere protein F (CENPF), or biologically active fragments thereof, and that are selected from the group consisting of an isolated shRNA, siRNA, antisense RNA, antisense DNA, chimeric antisense DNA/RNA, microRNA, and ribozymes sufficiently complementary to either a gene or an mRNA encoding either the FOXM1 or CENPF protein. A method is also presented for discovering synergistic master regulators of other phenotype transitions, wherein the master regulators are conserved among different species.


Patent
Columbia University | Date: 2016-09-19

A microdevice for isolating and amplifying aptamers includes a selection microchamber and an amplification microchamber. The selection microchamber can include a plurality of cultured cells immobilized therein. A first microchannel connecting the selection microchamber to the amplification microchamber can be configured to hydrodynamically transfer oligomers from the selection microchamber to the amplification chamber. A second microchannel connecting the selection microchamber to the amplification microchamber can be configured to hydrodynamically transfer oligomers from the amplification chamber to the selection chamber.


Patent
Columbia University | Date: 2016-12-15

This invention provides methods for attaching a nucleic acid to a solid surface and for sequencing nucleic acid by detecting the identity of each nucleotide analogue after the nucleotide analogue is incorporated into a growing strand of DNA in a polymerase reaction. The invention also provides nucleotide analogues which comprise unique labels attached to the nucleotide analogue through a cleavable linker, and a cleavable chemical group to cap the OH group at the 3'-position of the deoxyribose.


Patent
Columbia University | Date: 2016-12-15

This invention provides methods for attaching a nucleic acid to a solid surface and for sequencing nucleic acid by detecting the identity of each nucleotide analogue after the nucleotide analogue is incorporated into a growing strand of DNA in a polymerase reaction. The invention also provides nucleotide analogues which comprise unique labels attached to the nucleotide analogue through a cleavable linker, and a cleavable chemical group to cap the OH group at the 3'-position of the deoxyribose.


Patent
Columbia University | Date: 2016-05-09

In accordance with some embodiments of the present invention, systems and methods that protect an application from attacks are provided. In some embodiments of the present invention, input from an input source, such as traffic from a communication network, can be routed through a filtering proxy that includes one or more filters, classifiers, and/or detectors. In response to the input passing through the filtering proxy to the application, a supervision framework monitors the input for attacks (e.g., code injection attacks). The supervision framework can provide feedback to tune the components of the filtering proxy.
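
The architecture described in this abstract (a filtering proxy placed in front of the application, with a supervision framework that monitors inputs and feeds results back to tune the proxy) can be sketched in a few lines. The sketch below is illustrative only: the class names, the byte-signature filter, and the NOP-sled-style heuristic are assumptions, not the patented implementation.

# Minimal sketch of a filtering proxy with feedback from a supervision
# framework. All names and the toy "detector" are illustrative only.

class Filter:
    """Drops inputs that match any known-bad byte pattern."""
    def __init__(self, signatures):
        self.signatures = set(signatures)

    def allows(self, payload: bytes) -> bool:
        return not any(sig in payload for sig in self.signatures)


class SupervisionFramework:
    """Monitors inputs that reached the application and reports attacks."""
    def detect_attack(self, payload: bytes) -> bool:
        # Stand-in for a real code-injection detector (e.g., a NOP-sled heuristic).
        return b"\x90\x90\x90" in payload

    def feedback(self, payload: bytes) -> bytes:
        # Return a signature the proxy should start filtering.
        return payload[:8]


class FilteringProxy:
    def __init__(self, app, filters, supervisor):
        self.app, self.filters, self.supervisor = app, filters, supervisor

    def handle(self, payload: bytes):
        if not all(f.allows(payload) for f in self.filters):
            return "dropped by proxy"
        if self.supervisor.detect_attack(payload):
            # Feedback: tune the proxy so similar inputs are blocked next time.
            self.filters[0].signatures.add(self.supervisor.feedback(payload))
            return "blocked by supervision framework"
        return self.app(payload)


if __name__ == "__main__":
    proxy = FilteringProxy(lambda p: f"app processed {len(p)} bytes",
                           [Filter({b"DROP TABLE"})], SupervisionFramework())
    print(proxy.handle(b"GET /index.html"))
    print(proxy.handle(b"\x90\x90\x90\x90 shellcode"))
    print(proxy.handle(b"\x90\x90\x90\x90 shellcode"))  # now filtered by the proxy itself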


Methods, media, and systems for securing communications between a first node and a second node are provided. In some embodiments, methods for securing communication between a first node and a second node are provided, the methods comprising: receiving at least one model of behavior of the second node at the first node; and authorizing the first node to receive traffic from the second node based on the difference between the at least one model of behavior of the second node and at least one model of behavior of the first node.


A system and methods for detecting intrusions in the operation of a computer system comprise a sensor configured to gather information regarding the operation of the computer system, to format the information in a data record having a predetermined format, and to transmit the data in the predetermined data format. A data warehouse is configured to receive the data record from the sensor in the predetermined data format and to store the data in a SQL database. A detection model generator is configured to request data records from the data warehouse in the predetermined data format, to generate an intrusion detection model based on said data records, and to transmit the intrusion detection model to the data warehouse according to the predetermined data format. A detector is configured to receive a data record in the predetermined data format from the sensor and to classify the data record in real time as either normal operation or an attack based on said intrusion detection model. A data analysis engine is configured to request data records from the data warehouse according to the predetermined data format and to perform a data processing function on the data records.
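
A minimal sketch of that sensor-to-detector pipeline is shown below, using Python's built-in sqlite3 module as a stand-in for the SQL data warehouse and a trivial count threshold as a stand-in for a real intrusion detection model; the record format and table schema are assumptions for illustration.

# Sketch of the sensor -> warehouse -> model generator -> detector pipeline.
import sqlite3
import json

def sensor(event):
    """Format an observation into the shared record format (here: JSON)."""
    return json.dumps({"syscall": event["syscall"], "count": event["count"]})

class DataWarehouse:
    """Stores records in a SQL database (in-memory sqlite3 for this sketch)."""
    def __init__(self):
        self.db = sqlite3.connect(":memory:")
        self.db.execute("CREATE TABLE records (record TEXT)")
    def store(self, record):
        self.db.execute("INSERT INTO records VALUES (?)", (record,))
    def fetch(self):
        return [row[0] for row in self.db.execute("SELECT record FROM records")]

def generate_model(records):
    """Toy detection model: the largest call count seen during normal operation."""
    counts = [json.loads(r)["count"] for r in records]
    return {"max_normal_count": max(counts)}

def detector(record, model):
    """Classify a record in real time as normal operation or an attack."""
    return "attack" if json.loads(record)["count"] > model["max_normal_count"] else "normal"

warehouse = DataWarehouse()
for event in [{"syscall": "open", "count": 3}, {"syscall": "read", "count": 7}]:
    warehouse.store(sensor(event))
model = generate_model(warehouse.fetch())
print(detector(sensor({"syscall": "open", "count": 250}), model))  # -> attack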


Patent
Columbia University | Date: 2016-09-26

A three-dimensional cancer culture model suitable for chemotherapeutic testing and the study of metastasis mechanisms is provided.


Patent
Columbia University | Date: 2015-02-12

This invention provides methods of using labeled primers or probes for nucleic acid target detection and for detecting the identity or presence of a nucleotide at certain positions in nucleic acid sequences with single-molecule sensitivity using nanopore detection, as well as sets of oligonucleotide primers for use in such methods and methods of quantitative PCR coupled with nanopore detection.


Patent
Columbia University | Date: 2015-02-19

Techniques for diagnosis of aggressive prostate cancer include determining a level of expression of each of the genes encoding Forkhead box protein M1 (FOXM1) and Centromere protein F (CENPF) in a test sample. If the level of expression of each of the FOXM1 and CENPF genes in the test sample is at least 35% higher than the corresponding level in a control sample, then it is determined that the subject has an aggressive form of prostate cancer or has a high risk of the prostate cancer progressing to an aggressive form. Alternatively, if at least 50% of prostate cancer cells in the sample express both FOXM1 protein and CENPF protein at a composite score of at least 100 for each, then the above diagnosis is made. The composite score is calculated by multiplying a percent staining value by a staining intensity value.
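
The thresholds in this abstract are concrete enough to express directly. Below is a small illustrative function that applies them; the input values in the example are made up, while the 35%, 50%, and 100 cutoffs and the composite-score definition come from the text.

# Decision rule from the abstract, expressed as a small function.
# Example input values are hypothetical; the thresholds are from the text.

def composite_score(percent_staining, staining_intensity):
    """Composite score = percent staining value x staining intensity value."""
    return percent_staining * staining_intensity

def is_aggressive(test_expr, control_expr, cell_scores):
    """
    test_expr / control_expr: dicts with FOXM1 and CENPF expression levels.
    cell_scores: per-cell (FOXM1 composite score, CENPF composite score) pairs.
    """
    # Criterion 1: both genes expressed at least 35% above the control sample.
    overexpressed = all(
        test_expr[g] >= 1.35 * control_expr[g] for g in ("FOXM1", "CENPF")
    )
    # Criterion 2: >=50% of cells express both proteins at composite score >=100.
    positive = sum(1 for f, c in cell_scores if f >= 100 and c >= 100)
    co_expressing = bool(cell_scores) and positive / len(cell_scores) >= 0.5
    return overexpressed or co_expressing

# Example with made-up numbers:
print(is_aggressive({"FOXM1": 14.0, "CENPF": 9.0},
                    {"FOXM1": 10.0, "CENPF": 6.0},
                    [(composite_score(80, 2), composite_score(70, 2)),
                     (composite_score(40, 1), composite_score(90, 3))]))  # -> True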


Methods are provided for inhibiting or enhancing down-regulation of an antigen-activated HLA-E+ T cell by an HLA-E-restricted CD8+ T cell, comprising contacting the HLA-E+ T cell and CD8+ T cell with an agent which inhibits or enhances, respectively, binding between (i) the T cell receptor (TCR) on the surface of the CD8+ T cell and (ii) a self peptide presented by HLA-E on the surface of the HLA-E+ T cell, thereby inhibiting or enhancing, respectively, down-regulation of the antigen-activated HLA-E+ T cell. Compositions comprising agents which inhibit or enhance/activate, respectively, binding between (i) the T cell receptor (TCR) on the surface of a CD8+ T cell and (ii) a self peptide presented by HLA-E on the surface of an HLA-E+ T cell, and assays for identifying such agents, are also provided.


Patent
Columbia University | Date: 2015-04-10

The disclosure provides for compositions, systems, and methods of cell expansion, stimulation and/or differentiation. The disclosure further provides for a mesh substrate and associated methods capable of stimulating cell expansion, for example, T cell or stem cell expansion. In another aspect, the disclosure provides for an electrospun mesh substrate, and methods of use thereof, comprising a silicone rubber composition, for example, polydimethylsiloxane, PLC, or combinations thereof.


Patent
Columbia University | Date: 2015-05-08

Evaporation-driven engines are disclosed herein. An example engine can include a water source having a high humidity zone proximate the surface of the water source, a supporting structure, and a hygroscopic material disposed on the supporting structure and configured to generate mechanical force in response to a changing relative humidity. The hygroscopic material can be repeatedly exposed to the high humidity zone and removed from the high humidity zone thereby causing the hygroscopic material to generate mechanical force.


Patent
Columbia University | Date: 2016-12-15

This invention provides methods for attaching a nucleic acid to a solid surface and for sequencing nucleic acid by detecting the identity of each nucleotide analogue after the nucleotide analogue is incorporated into a growing strand of DNA in a polymerase reaction. The invention also provides nucleotide analogues which comprise unique labels attached to the nucleotide analogue through a cleavable linker, and a cleavable chemical group to cap the OH group at the 3'-position of the deoxyribose.


Patent
Columbia University | Date: 2016-07-27

A method for determining one or more viewer affects evoked by visual content uses visual sentiment analysis with a correlation model that includes a plurality of publisher affect concepts correlated with a plurality of viewer affect concepts. The method includes detecting one or more of the plurality of publisher affect concepts present in selected visual content, and determining, using the correlation model, one or more of the plurality of viewer affect concepts corresponding to the one or more detected publisher affect concepts. A method for determining one or more items of visual content to evoke one or more viewer affects using visual sentiment analysis is also provided.
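
A toy sketch of the correlation-model lookup described above is given below; the affect concepts, correlation weights, and concept detector are invented placeholders rather than anything from the patent.

# Toy sketch: map detected publisher affect concepts to likely viewer affects
# through a correlation model. All concepts and weights below are invented.

CORRELATION_MODEL = {
    "crying child": {"sadness": 0.8, "sympathy": 0.6},
    "sunny beach":  {"joy": 0.7, "relaxation": 0.9},
}

def detect_publisher_concepts(visual_content_tags):
    """Stand-in for a visual concept detector: intersect tags with the model."""
    return [c for c in visual_content_tags if c in CORRELATION_MODEL]

def predict_viewer_affects(visual_content_tags, threshold=0.5):
    """Return viewer affects whose correlation weight clears the threshold."""
    affects = {}
    for concept in detect_publisher_concepts(visual_content_tags):
        for affect, weight in CORRELATION_MODEL[concept].items():
            affects[affect] = max(affects.get(affect, 0.0), weight)
    return {a: w for a, w in affects.items() if w >= threshold}

print(predict_viewer_affects(["sunny beach", "umbrella"]))
# -> {'joy': 0.7, 'relaxation': 0.9}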


Patent
Columbia University | Date: 2016-08-26

Methods for determining regulon enrichment in gene expression signatures are disclosed herein. An example method can include obtaining a set of transcriptional targets of a regulon. The method can include obtaining a gene expression signature by comparing a gene expression profile of a test sample to gene expression profiles of a plurality of samples representing control phenotypes. The method can include calculating a regulon enrichment score for each regulon in the gene expression signature. The method can include determining whether the number of control samples in the control phenotypes is above a predetermined threshold to support evaluation of statistical significance using permutation analysis. The method can include, in response to determining that the number of control samples is above the predetermined threshold, calculating a significance value by comparing each regulon enrichment score to a null model.
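
A simplified sketch of this workflow is shown below. The z-score signature, the mean-of-targets enrichment score, and a gene-label permutation null are stand-ins for the method's actual statistics (which the abstract does not specify), so treat this only as an illustration of the data flow.

# Simplified regulon-enrichment workflow: signature from test vs. controls,
# per-regulon score, and a permutation-based significance estimate.
import numpy as np

def gene_signature(test_profile, control_profiles):
    """Z-score each gene of the test sample against the control distribution."""
    mu = control_profiles.mean(axis=0)
    sd = control_profiles.std(axis=0) + 1e-9
    return (test_profile - mu) / sd

def regulon_score(signature, genes, targets):
    """Toy enrichment score: mean signature value over the regulon's targets."""
    idx = [genes.index(t) for t in targets if t in genes]
    return float(signature[idx].mean())

def permutation_p_value(signature, genes, targets, n_perm=1000, seed=0):
    """Null model from shuffled gene labels (a simple stand-in for the patent's null)."""
    rng = np.random.default_rng(seed)
    observed = regulon_score(signature, genes, targets)
    null = [regulon_score(rng.permutation(signature), genes, targets)
            for _ in range(n_perm)]
    return (1 + sum(abs(s) >= abs(observed) for s in null)) / (1 + n_perm)

genes = ["G1", "G2", "G3", "G4"]
controls = np.array([[5.0, 2.0, 1.0, 3.0],
                     [6.0, 2.2, 0.8, 3.1],
                     [5.5, 1.9, 1.2, 2.9]])
test = np.array([9.0, 2.1, 1.0, 3.0])
sig = gene_signature(test, controls)
print(regulon_score(sig, genes, ["G1", "G3"]),
      permutation_p_value(sig, genes, ["G1", "G3"]))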


Patent
Columbia University and New York University | Date: 2016-09-02

The present disclosure is directed to iterative regularized reconstruction methods. In certain embodiments, the methods incorporate locally weighted total variation denoising to suppress artifacts induced by PSF modeling. In certain embodiments, the methods are useful for suppressing ringing artifacts while contrast recovery is maintained. In certain embodiments, the weighting scheme can be extended to noisy measurements by introducing a noise-independent weighting scheme. The present disclosure is also directed to a method for quantifying radioligand binding in a subject without collecting arterial blood. In certain embodiments, the methods incorporate using imaging data and electronic health records to predict one or more anchors, which are used to generate an arterial input function (AIF) for the radioligand.


Patent
Columbia University | Date: 2015-04-17

A technology that enables identifying, via a computer, a vessel in a third image is provided. The third image is obtained from a subtraction of a second image from a first image. The second image and the first image are aligned in an imaging space. The first image is post-contrast. The second image is pre-contrast. The technology enables determining, via the computer, a voxel intensity mean value of a segment of the vessel in the third image. The technology enables obtaining, via the computer, a fourth image from a division of the third image by the voxel intensity mean value. The technology enables applying, via the computer, a filter onto the fourth image. The technology enables generating, via the computer, a filter mask based on the fourth image.
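
The image arithmetic described above maps naturally onto array operations. The NumPy/SciPy sketch below follows the same steps on a synthetic volume; the Gaussian filter choice, the mask threshold, and the array shapes are assumptions for illustration.

# Sketch of the pipeline: subtract pre-contrast from post-contrast, normalize
# by the mean intensity of a vessel segment, filter, and threshold into a mask.
import numpy as np
from scipy.ndimage import gaussian_filter

def vessel_mask(post_contrast, pre_contrast, vessel_segment_voxels,
                sigma=1.0, threshold=0.1):
    # Third image: subtraction of the pre-contrast from the post-contrast image
    # (both assumed already aligned in the same imaging space).
    third = post_contrast - pre_contrast
    # Voxel intensity mean value of a known vessel segment in the third image.
    mean_val = third[vessel_segment_voxels].mean()
    # Fourth image: division of the third image by that mean value.
    fourth = third / mean_val
    # Apply a filter (a Gaussian here) and derive a binary filter mask.
    filtered = gaussian_filter(fourth, sigma=sigma)
    return filtered > threshold

# Tiny synthetic example: a bright "vessel" along one row of an 8x8x8 volume.
pre = np.zeros((8, 8, 8))
post = pre.copy()
post[4, 4, :] += 10.0
segment = (np.full(8, 4), np.full(8, 4), np.arange(8))  # voxels on the vessel
print(vessel_mask(post, pre, segment).sum())  # number of voxels flagged as vessel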


Patent
Columbia University | Date: 2016-08-01

As information to be processed at an object-based video or audio-visual (AV) terminal, an object-oriented bitstream includes objects, composition information, and scene demarcation information. Such bitstream structure allows on-line editing, e.g. cut and paste, insertion/deletion, grouping, and special effects. In the interest of ease of editing, AV objects and their composition information are transmitted or accessed on separate logical channels (LCs). Objects which have a lifetime in the decoder beyond their initial presentation time are cached for reuse until a selected expiration time.


Systems and methods are presented for content extraction from markup language text. The content extraction process may parse markup language text into a hierarchical data model and then apply one or more filters. Output filters may be used to make the process more versatile. The operation of the content extraction process and the one or more filters may be controlled by one or more settings set by a user, or automatically by a classifier. The classifier may automatically enter settings by classifying markup language text and entering settings based on this classification. Automatic classification may be performed by clustering unclassified markup language texts with previously classified markup language texts.
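
A minimal illustration of this flow, using only Python's standard-library HTML parser, is sketched below. The link-density heuristic stands in for "one or more filters," and its threshold stands in for a setting chosen by a user or by an automatic classifier; none of this is the actual patented filter set.

# Parse markup into a hierarchical model, then apply a filter that drops
# link-heavy subtrees (e.g., navigation menus) and keeps article text.
from html.parser import HTMLParser

class TreeBuilder(HTMLParser):
    """Parse markup language text into a simple tree of nested dicts."""
    def __init__(self):
        super().__init__()
        self.root = {"tag": "root", "text": "", "children": []}
        self.stack = [self.root]
    def handle_starttag(self, tag, attrs):
        node = {"tag": tag, "text": "", "children": []}
        self.stack[-1]["children"].append(node)
        self.stack.append(node)
    def handle_endtag(self, tag):
        if len(self.stack) > 1:
            self.stack.pop()
    def handle_data(self, data):
        self.stack[-1]["text"] += data

def all_text(node):
    return node["text"] + "".join(all_text(c) for c in node["children"])

def link_text(node, inside_a=False):
    inside = inside_a or node["tag"] == "a"
    own = node["text"] if inside else ""
    return own + "".join(link_text(c, inside) for c in node["children"])

def extract_content(node, max_link_density=0.5):
    """Filter: drop subtrees whose text is dominated by link text."""
    total = all_text(node)
    if not total.strip():
        return ""
    if len(link_text(node)) / len(total) > max_link_density:
        return ""  # likely navigation, menus, or link farms
    return node["text"] + "".join(extract_content(c, max_link_density)
                                  for c in node["children"])

builder = TreeBuilder()
builder.feed("<div><p>Actual article text goes here.</p>"
             "<ul><li><a href='/x'>Nav</a></li><li><a href='/y'>Menu</a></li></ul></div>")
print(extract_content(builder.root).strip())  # -> Actual article text goes here.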


Systems, methods, and media for recording an image of a scene are provided. In accordance with some embodiments, systems for recording an image of a scene are provided, comprising: a diffuser that diffuses light representing the scene and that has a scattering function that is independent of aperture coordinates; a sensor that receives diffused light representing the scene and generates data representing an image; and a hardware processor that uses a point spread function to deblur the image.


Patent
Sony Corporation and Columbia University | Date: 2015-04-15

Systems, methods, and media for extracting information and a display image from two captured images are provided. In some embodiments, systems for extracting information and a display image from two captured images are provided, the systems comprising: a rolling shutter sensor; and a hardware processor coupled to the rolling shutter sensor that is configured to: cause the rolling shutter sensor to capture two captured images; receive the two captured images; and extract the information and the display image from the two captured images, wherein the information is represented in the captured images as a flicker pattern.


Chia G., Columbia University
Nature Cell Biology | Year: 2017

Somatic cells can be reprogrammed to a pluripotent state by nuclear transfer into oocytes, yet developmental arrest often occurs. While incomplete transcriptional reprogramming is known to cause developmental failure, reprogramming also involves concurrent changes in cell cycle progression and nuclear structure. Here we study cellular reprogramming events in human and mouse nuclear transfer embryos prior to embryonic genome activation. We show that genetic instability marked by frequent chromosome segregation errors and DNA damage arise prior to, and independent of, transcriptional activity. These errors occur following transition through DNA replication and are repaired by BRCA1. In the absence of mitotic nuclear remodelling, DNA replication is delayed and errors are exacerbated in subsequent mitosis. These results demonstrate that independent of gene expression, cell-type-specific features of cell cycle progression constitute a barrier sufficient to prevent the transition from one cell type to another during reprogramming. © 2017 Nature Publishing Group


Yuste R., Columbia University | Bargmann C., Rockefeller University
Cell | Year: 2017

Neuroscience is entering a collaborative "big science" era in which powerful new technologies, generated by large scientific projects in many countries, will have a dramatic impact on science, medicine, and society. Coordinating these international initiatives and ensuring broad distribution of novel technologies and open accessibility of the generated data will multiply their value, while tapping creativity and expertise from every source. © 2017 Elsevier Inc.


Li M., Columbia University
Nature Structural and Molecular Biology | Year: 2017

The activities of organellar ion channels are often regulated by Ca2+ and H+, which are present in high concentrations in many organelles. Here we report a structural element critical for dual Ca2+/pH regulation of TRPML1, a Ca2+-release channel crucial for endolysosomal function. TRPML1 mutations cause mucolipidosis type IV (MLIV), a severe lysosomal storage disorder characterized by neurodegeneration, mental retardation and blindness. We obtained crystal structures of the 213-residue luminal domain of human TRPML1 containing three missense MLIV-causing mutations. This domain forms a tetramer with a highly electronegative central pore formed by a novel luminal pore loop. Cysteine cross-linking and cryo-EM analyses confirmed that this architecture occurs in the full-length channel. Structure–function studies demonstrated that Ca2+ and H+ interact with the luminal pore and exert physiologically important regulation. The MLIV-causing mutations disrupt the luminal-domain structure and cause TRPML1 mislocalization. Our study reveals the structural underpinnings of TRPML1's regulation, assembly and pathogenesis. © 2017 Nature Publishing Group, a division of Macmillan Publishers Limited. All Rights Reserved.


Rhinn H., Columbia University | Abeliovich A., Columbia University
Cell Systems | Year: 2017

Human age-associated traits, such as cognitive decline, can be highly variable across the population, with some individuals exhibiting traits that are not expected at a given chronological age. Here we present differential aging (δ-aging), an unbiased method that quantifies individual variability in age-associated phenotypes within a tissue of interest, and apply this approach to the analysis of existing transcriptome-wide cerebral cortex gene expression data from several cohorts totaling 1,904 autopsied human brain samples. We subsequently performed a genome-wide association study and identified the TMEM106B and GRN gene loci, previously associated with frontotemporal dementia, as determinants of δ-aging in the cerebral cortex with genome-wide significance. TMEM106B risk variants are associated with inflammation, neuronal loss, and cognitive deficits, even in the absence of known brain disease, and their impact is highly selective for the frontal cerebral cortex of older individuals (>65 years). The methodological framework we describe can be broadly applied to the analysis of quantitative traits associated with aging or with other parameters. Rhinn et al. describe an integrative genomics approach to quantify aging rate in a tissue of interest. A subsequent GWAS analysis identifies the TMEM106B-progranulin genetic pathway as a key determinant of age-associated manifestations in the human cerebral cortex. © 2017 Elsevier Inc.


Tabas I., Columbia University
Arteriosclerosis, Thrombosis, and Vascular Biology | Year: 2017

Atherosclerosis is initiated by the subendothelial accumulation of apoB-lipoproteins, which initiates a sterile inflammatory response dominated by monocyte-macrophages but including all classes of innate and adaptive immune cells. These inflammatory cells, together with proliferating smooth muscle cells and extracellular matrix, promote the formation of subendothelial lesions or plaques. In the vast majority of cases, these lesions do not cause serious clinical symptoms, which is due in part to a resolution-repair response that limits tissue damage. However, a deadly minority of lesions progress to the point where they can trigger acute lumenal thrombosis, which may then cause unstable angina, myocardial infarction, sudden cardiac death, or stroke. Many of these clinically dangerous lesions have hallmarks of defective inflammation resolution, including defective clearance of dead cells (efferocytosis), necrosis, a defective scar response, and decreased levels of lipid mediators of the resolution response. Efferocytosis is both an effector arm of the resolution response and an inducer of resolution mediators, and thus its defect in advanced atherosclerosis amplifies plaque progression. Preclinical causation/treatment studies have demonstrated that replacement therapy with exogenously administered resolving mediators can improve lesional efferocytosis and prevent plaque progression. Work in this area has the potential to potentiate the cardiovascular benefits of apoB-lipoprotein-lowering therapy. © 2016 American Heart Association, Inc.


Frank J., Columbia University
Nature Protocols | Year: 2017

In single-particle cryo-electron microscopy (cryo-EM), molecules suspended in a thin aqueous layer are rapidly frozen and imaged at cryogenic temperature in the transmission electron microscope. From the random projection views, a three-dimensional image is reconstructed, enabling the structure of the molecule to be obtained. In this article I discuss technological progress over the past decade, which has, in my own field of study, culminated in the determination of ribosome structure at 2.5-Å resolution. I also discuss likely future improvements in methodology.


News Article | April 25, 2017
Site: www.accesswire.com

NEW YORK, NY / ACCESSWIRE / April 25, 2017 / Avid philanthropist, established property investor, and business adviser, Jacob Frydman proudly talks about his continuous contributions to National Committee for the Furtherance of Jewish Education (NCFJE) and its numerous charity projects. The generous donor is deeply involved with the foundation, working closely with its Orphan, Poor and Sick Fund, Released Time Program, and Toys for Hospitalized Children initiatives among many others. Rabbi Yosef Yitzchok Schneerson founded NCFJE in the midst of WWII with the principal mission of providing Jewish public school students with a free Jewish education. Shortly after its inception, the institution noticed that many of the children lived in households experiencing a variety of social and economic hardships, and implemented a multitude of educational, community outreach, and humanitarian services that still provide imperative aid to New York's citizens today. Rabbi Hannoch Hecht of the Rhinebeck Jewish Center introduced Jacob Frydman to the committee, and the businessman was immediately enthralled by their generosity: "I saw from their past work that the NCFJE has made countless positive lasting effects on individual families and the entire community." Created in 1941, the Released Time Program educates Jewish youth about the history, customs and prayers of Judaism, and has inspired more than a quarter million boys and girls in the greater New York area to be proud of their faith. Each Wednesday students are dismissed an hour early from school and transported to a nearby synagogue, where dedicated instructors create a welcoming religious atmosphere and teach the children about their heritage. The classes are free of charge, and are now available in over 125 public schools. Another longstanding NCFJE charity, Toys for Hospitalized Children, distributes over 10,000 toys and gifts to hospitals, special needs facilities, and destitute children each year. In an effort to share joy with the city's elderly as well, the 50-year project has recently expanded to servicing senior residences on an as-needed basis. The Orphan, Poor and Sick Fund aids underprivileged families in accessing necessary resources through grocery and clothing vouchers, rent and utility assistance, school and camp scholarships, and weekly food disbursements. Rabbi Hecht considers Frydman's constant assistance with these initiatives a sign of greater understanding: "He knows that the foundation of the Jewish community is the Jewish family, and he believes that by helping needy families we can all look forward to a stronger Jewish community as a whole." Jacob Frydman is a native New Yorker, real estate investor, and private equities expert. Over his 30-year career, he has structured, financed, and executed highly complex real estate transactions. He often discusses business, law, and ethics at Columbia University and in the Master's Lecturer series at New York Law School. A passionate and vocal member of the Jewish faith, Frydman has been an active supporter of the NCFJE for many years, and assists other charitable committees including The Chabad of Dutchess County and Washington, DC-based The Brem Foundation.



News Article | April 17, 2017
Site: www.eurekalert.org

Escaping cycles of poverty may depend on how much a person feels he or she can rely on their local communities, according to research led by Princeton University. Published in the Proceedings of the National Academy of Sciences, the study finds that low-income individuals who trust their communities make better long-term financial decisions. This is likely because citizens rely on friends and neighbors for financial support, rather than quick fixes, like payday loans, which further indebt them. The findings show the importance of building strong communities, especially for low-income individuals. The researchers suggest moving away from a focus on low-income individuals, instead focusing on low-income communities through targeted policies. "Instead of cutting funding to community development programs, policymakers should implement changes that give individuals in low-income communities more opportunities to develop community trust," said study co-author Elke Weber, the Gerhard R. Andlinger Professor in Energy and the Environment and professor of psychology and public affairs at Princeton University's Woodrow Wilson School. In addition to Weber, the study was conducted by lead author Jon Jachimowicz, Columbia University; Salah Chafik, Columbia University; Sabeth Munrat, BRAC (an international development organization in Bangladesh); and Jaideep Prabhu, University of Cambridge. To determine why low-income individuals tend to make more myopic (or short-term) financial decisions, the researchers conducted a series of studies, focusing on both the United States and Bangladesh. In the first study, the researchers invited 647 participants from the United States to make several choices between "smaller, sooner" and "larger, later" options, taking into account participants' incomes and how much they trusted their local communities. They found that richer participants were generally less likely to make harmful short-term decisions than those with lower incomes, but that this only applied to low-income individuals who did not trust their communities. In contrast, those low-income individuals who trust their communities more made financial decisions that were very similar to those made by richer participants. "Current financial dilemmas are stressful and leave people with no option but to choose immediate solutions. Our results indicate that lower-income people are less likely to invest in the long-term because of their immediate financial needs," said Weber. "This is in line with work by Princeton's Eldar Shafir and others: that scarcity leads to harmful long-term decision-making." In the second study, the researchers evaluated "payday loans" in the United States, which carry high interest rates and exacerbate cycles of poverty among the poor. After reviewing the Federal Reserve Board's Survey of Household Economics and Decisionmaking, the researchers found that fewer payday loans were taken out in communities where levels of trust were higher. This is because individuals can rely on their communities to help with financial needs (taking out a loan from a friend, for example), instead of resorting to high-interest emergency loans, the researchers said. In the final part of the study, the researchers turned their attention to Bangladesh, where they conducted a two-year field study. Together with BRAC and The Hunger Project, a global nonprofit organization, the researchers worked with 121 of Bangladesh's smallest local government units, known as council unions. 
They trained community volunteers to act as intermediaries between local government and community residents. Volunteers met with members of their community and helped provide them with access to public services. Volunteers also provided guidance to government units directly. When comparing the unions with community volunteers to those without, the researchers found the two groups differed widely in their levels of community trust. Residents with community volunteers had higher levels of community trust, which also influenced their decision-making. These individuals were more likely to forgo smaller payoffs in exchange for more-profitable, delayed options. Taken together, the findings highlight the importance of building trust in low-income communities. The findings also point to the benefits of programs currently targeted for budget cuts by the Trump administration, the researchers said. "The Trump administration's preliminary federal budget for 2018 recommends eliminating the $3 billion Community Development Block Grant, a program established in 1974 to help communities address a wide range of their development needs," said Jachimowicz. "The budget blueprint reasons that the program is 'not well-targeted to the poorest populations and has not demonstrated results.' The evidence presented in our paper contradicts this claim, and suggests eliminating this line item could lead to devastating consequences, particularly for those on low incomes." The paper, "Community trust reduces myopic decisions of low-income individuals," was published online in PNAS on April 11. This research was made possible in part by a Cambridge Judge Business School small grant, the research facilities provided by the Center for Decision Sciences at Columbia University and the support of the German National Academic Foundation.


News Article | April 23, 2017
Site: motherboard.vice.com

"The liquid soluble that made up the chemistry," GZA raps on Liquid Swords. "A gaseous element, that burned down your ministry." For Dr. Christopher Emdin, rap and science are two elements that together make an elegant compound. An associate professor at Columbia University's Teachers College and New York Times bestselling author, Emdin founded Science Genius, a program for 14-18 year olds in NYC's public schools to compete in rap battles with scientific themes. It started five years ago, when Neil deGrasse Tyson introduced Emdin to Wu-Tang Clan rapper GZA. They bonded over their shared experiences of growing up geeky and loving hip-hop culture, in a society that forces many young men to choose one or the other. Emdin grew up an inquisitive kid in Brooklyn and the Bronx, but once he got to high school, the curiosity about the world his mom would praise as "like a scientist" was suddenly deemed "disruptive" or "distracted" by teachers. He abandoned STEM studies until he reached undergrad, where that curiosity and freedom of thought returned. "I was being affirmed for asking really good questions instead of having answers," he said. Then, when he started teaching, everything clicked. He saw in his students the same anti-authoritarianism and skepticism about the world that he felt as a child, and started incorporating their own culture into lessons, allowing them to learn as themselves. "We can be hood and scientific." Leaving biases behind, he believes, is the biggest challenge facing education today. "The kid who isn't reading on grade level, that doesn't mean that kid can't be a scientist," he said. "That kid who has his pants saggin' doesn't mean that kid is not deeply interested quasars." Although the gap has begun to narrow in the last six years, black and hispanic grade-school students perform behind white students in science topics, according to the 2015 National Assessment of Educational Progress. How can this country let go of the idea that science is only for people who sound, act, and look a certain way? With hip-hop and Science Genius, kids are respected among their peers for being brainy and for being able to spit bars. If they can learn to express themselves creatively with metaphor and rhymes in scientific terms, maybe they'll be more likely go on to pursue careers in STEM as adults. "We can be rachet and academic," Emdin told me. "We can be hood and scientific. To be able to push back against where people have positioned you to be, and be yourself... It's a political act, it's a necessary endeavor." The next Science Genius rap battle will be held on May 26, in the birthplace of hip-hop, the Bronx, "right in the middle of the hood," he said. "There's something so beautiful and magical about bringing science to the hood, that speaks to me." At his core, Emdin said, he's still very much that kid from Brooklyn, feeling confident in hip-hop culture but also in his passion for learning. "I've been privileged and lucky enough to see the beauty in the wonders of science, and I just want to be able to introduce that to as many people as possible. That, in a nutshell, is me." Subscribe to Science Solved It, Motherboard's new show about the greatest mysteries that were solved by science.


News Article | April 17, 2017
Site: www.wired.com

Autonomous drones, lasers, and computer rendering play increasingly vital roles in filmmaking, but what makes Liam Young’s moody, futuristic films so unusual is that these technologies are not tools, but stars. The Australian architect-turned-filmmaker considers his films In the Robot Skies, Where the City Can’t See, and Renderlands Trojan horses bringing these technologies into mainstream consciousness in a positive, even creative, way. If people give, say, LIDAR, any thought, it’s probably within the context of autonomous vehicles or that lawsuit against Uber. But Young sees a compelling story. “We tell stories about what these technologies might mean,” he says. “We’re prototyping their possibilities.” The films, appearing in the exhibit “New Romance” at Columbia University’s Arthur Ross Architecture Gallery, paint a promising, or at least benign, picture of where such technology may lead humanity. But they also offer a warning about how technology can constrain, even control, society. Such themes are not of course limited to drones, the focus of In the Robot Skies. In that film, autonomous drones follow characters through bleak '70s-era public housing over a soundtrack of demonic, industrial music. The film, shown through the drone’s point of view, tells the story of a young woman and her boyfriend who send surreptitious and illegal messages with the drones. In addition to shooting and starring in the film, the drones directed it, too. Their flight algorithms, navigation system, and facial recognition software allowed them to decide where to go and what to film. Young edited the footage into the final story, following the whims and foibles of the drones. “The technology has its own tendencies and personality,” he says. “We’re trying to see the world through their eyes.” LIDAR, the laser-scanning technology that allows autonomous vehicles to “see,” is the star of Where the City Can’t See. Young uses LIDAR to simulate the gliding view of an autonomous car as it navigates Detroit. The ghostly scenes, set to haunting, atonal digital music, show abandoned fields and urban ruins as heavily pixelated, colorful point clouds. The actors, who play teenagers trying to escape constant surveillance, wear clothing that absorbs, or reflects, the lasers’ light, creating rippling, distorted shapes and patterns as they move through the city. In Renderlands, Young creates a dark collage of live-action shots of computer renderers—their faces glowing in the light of their monitors—working in India and the CGI visions of Hollywood they’re creating. It’s a misty, neon-lit place replete with long piers, hilltop vistas, and other Southern California clichés that the workers have never seen and only imagine. The process of rendering imagined worlds becomes an important part of the film’s look, structure, and plot. Young, who once worked for renowned architect Zaha Hadid, sees his work as one possible roadmap for architects, who he believes have seen their roles as builders of the physical world usurped by developers and engineers. He sees a future for architects guiding society into the digital world, helping people understand their physical surroundings and the technology that will dictate them. “These things are going to rule a large part of our lives,” he says. “Helping people develop relationships with them is critical.” To that end, he helped found the Masters of Fiction and Entertainment program at the Southern California Institute of Architecture to advance tech-inspired filmmaking.
It already has 15 students, each of them, like Young, pondering how technology might tell, and star in, stories.


A win-win approach to curbing climate change could be capturing carbon dioxide from the atmosphere and converting it to valuable products. A study presented at the American Chemical Society national meeting in San Francisco on Monday may help advance that effort by revealing mechanistic details of a catalytic process that converts CO2 to a commodity chemical—methanol. On industrial scales, catalysts composed of copper and zinc oxide supported on alumina hydrogenate carbon monoxide and CO2 to methanol. But these catalysts have shortcomings, according to Ping Liu, a chemist at Brookhaven National Laboratory. Speaking at a symposium sponsored by the Division of Energy and Fuels, Liu pointed out that the Cu-ZnO catalysts are not very efficient or selective in producing methanol. These reactions also require high temperatures and high pressures of the reactant gases. What’s more, she said, chemical details of the active catalytic site remain elusive. That information could be the key to designing catalysts with improved energy and chemical efficiency, Liu says. In an ongoing debate regarding the catalyst’s active site, various researchers have argued that highly active Zn-Cu alloy species are the key catalytic players. In contrast, Liu’s new work suggests that the action occurs at the atomic interface between ZnO and Cu (Science 2017, DOI: 10.1126/science.aal3573). To reach that conclusion, Liu, Brookhaven colleagues José A. Rodriguez and Shyam Kattel, and Columbia University’s Jingguang Chen prepared several types of Cu and ZnO reference catalysts, including one made of zinc nanoparticles deposited on copper, and another with ZnO nanoparticles on copper. They analyzed and directly compared the CO2-to-methanol chemistry of all the catalysts using synchrotron-based photoelectron spectroscopy and computational methods. The computations predicted that Cu-ZnO surface species should be the most reactive form of the catalyst. They also predicted that the Zn-Cu species shouldn’t remain stable under reaction conditions. Instead, it should react with oxygen and form copper zinc oxide. And that’s exactly what Liu and coworkers found in the lab. Now the group aims to use that information to optimize the interface between ZnO and Cu to improve the catalysts. “This is a highly important study with excellent quality data and supporting theoretical calculations,” said Charles T. Campbell, a catalysis specialist at the University of Washington, Seattle. CO2 hydrogenation to methanol is one of the most likely pathways for converting the greenhouse gas to a valuable product, Campbell asserted. He added that this study should help improve that catalytic process.


News Article | April 17, 2017
Site: www.scientificamerican.com

Pres. Donald Trump issued a major executive order last week that, if successful, could undercut the nation’s fight against global warming. In particular, the order kicks off an attempt to dismantle the Clean Power Plan, which regulates carbon emissions from the power sector. While Trump’s move represents a big blow to U.S. climate efforts, the renowned scientist James Hansen sees a different—and, he argues, better—way forward on global warming. “The problem is the Clean Power Plan is really not that effective,” says Hansen, former director of NASA Goddard Institute for Space Studies and adjunct professor at Columbia University’s Earth Institute, who brought climate change to the U.S. public’s attention in his famed 1988 congressional testimony. “It’s a tragedy that [the Obama administration] continued to pursue a regulatory approach.” The solution Hansen believes will work best is one recently advocated by a group of Republican statesmen: a “carbon fee and dividend.” Although it is not a tax, the approach would put a price on carbon—a step Hansen thinks is absolutely essential for cutting back greenhouse gas emissions. Hansen, who has been called the father of climate change awareness, recently spoke about the issue along with Earth Institute director Jeffrey Sachs, a leading expert on economic development, at the New York Society for Ethical Culture. Scientific American followed up with Hansen, also director of the Climate Science, Awareness and Solutions program at Columbia, to discuss this strategy and how he thinks it will help the U.S. turn the tide on global warming. [An edited transcript of the interview follows.] What’s the United States’ best hope for solving climate change at this point? The only effective way of addressing climate change is to make the price of fossil fuels include their cost to society. That could be done in a simple way by collecting a fee from the fossil fuel companies that would gradually rise over time—a carbon fee and dividend. Studies show this would benefit the economy and this is a conservative approach, where you let the market move you toward a better situation. I call it a carbon fee because you would give all of the money to the public, a dividend to each legal resident. [A group of Republicans] have adopted [this approach] almost precisely as I proposed it in 2008. The starting level of the fee varies from one proposition to another—I believe that they start at $40 per ton of carbon. [I] suggest $55 per ton—[that price] yields a dividend of $1,000 per legal resident and $3,000 for a family with two or more children, with one half-share for each child [and] a maximum of two half-shares per family. This way it actually stimulates the economy. If it’s a tax taken by the government, it makes the government bigger and it depresses the economy. That’s why I object to the Democrats as much as to the Republicans. The only way the public will allow a carbon fee is if you give the money to them—people don’t want to see the price of gasoline at the pump going up. That’s what’s frustrating about this problem—the fact that there’s a solution, which is not difficult and not economically harmful. It would be remarkable if the Trump administration would actually understand this and realize that it would be popular. It would work, unlike some of the things that Trump is advocating. What is the number-one action the U.S. could take to reduce its emissions, without the federal government? Unless you get a fee on carbon, you cannot solve the problem. 
As long as fossil fuels appear to be cheap energy, they’re going to keep being burned by somebody. So ultimately the solution has got to involve the government. You view nuclear energy as an integral part of addressing climate change—why? Nuclear energy—even in its current sad state—is doing a lot to reduce carbon emissions and deaths and illnesses from pollution. There’s no way countries like China and India are going to phase out their coal use without the help of advanced nuclear power. The safety record of nuclear power is actually very impressive. We should have developed the technology of advanced nuclear power but the bias against nuclear has been so strong that the industry has not developed. It’s still not too late because there are a lot of innovative start-up companies out there—but these need to be encouraged. You’ve been focusing your energy on helping people understand the urgency of global warming. Are you hopeful that the public will demand major action from the government soon? Climate change is not going to register on the public’s list of priorities, so we need the help of an intelligent government system. Even though the fossil fuel industry money has been able to distort the climate science in Congress, the judicial branch can come into play. That’s why I’m a plaintiff along with 21 young people in a lawsuit against the federal government [suing it for having taken—and continuing to take—actions that support fossil fuel production and create greenhouse gas emissions].* We now have a really bulletproof case, which I think will win even with a conservative Supreme Court. It’s going to be a combination of using the judiciary branch of the government and then using the democratic process to shape the policy that’s accepted. Between those two, I’m optimistic we could get on a path that would then influence the world. So then is communicating with the public even useful? This is somewhat analogous to civil rights—the courts did not force the government to carry out policies to end segregation until the public began to make an issue of it. Courts don’t often move in front of public opinion, so it is important to try to get public pressure. How should climate scientists—both federal government researchers and outside scientists—react to the Trump presidency? We have to use the scientific method and facts to make it clear that we’re being objective, and that there’s nothing political about the science. Scientists should stick to trying to explain the science as clearly as possible. Given the president's stance on global warming, are you concerned about climate scientists’ ability to communicate with the public? I’m very concerned about their inability to communicate with the public, but that’s nothing new with Trump. That problem has come about over the last decade or two, because of the political preference of those politicians who support the fossil fuel industry—they’ve found that an extremely effective technique is simply to deny the science or politicize it, or make it appear that scientists have an agenda. It’s made it difficult for science to provide effective advice to the government. Why is it important that climate scientists be able to openly communicate with the public about climate change? We have to make this situation clear to the public. The public still does not treat this as a high-priority issue, while in fact it should be near the top of the list. 
It’s a difficult story to communicate to the public because you just don’t see that much happening—the fact that the climate system has a delayed response is what makes this whole thing so dangerous. You might think the great inertia of the ocean and the ice sheets is our friend because we’ve seen a relatively slow response so far. But it’s very clear in the science that we’re building in bigger changes in the future, so there’s a danger of handing young people a system that’s out of their control. We’re setting up a situation that’s extremely dangerous. That’s just crystal clear in the science.

*Editor's Note (4/10/17): This sentence has been updated with additional information since its original posting.
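For readers who want to check the dividend arithmetic Hansen describes, here is a minimal sketch in Python. The full share of $1,000 per adult legal resident, the half-share per child, and the cap of two half-shares per family are taken from his figures above; the function itself and its name are only illustrative and are not part of any actual proposal text.

# Minimal sketch of the fee-and-dividend arithmetic described in the interview above.
# Assumptions from the interview: a full share of $1,000 per adult legal resident,
# one half-share per child, and at most two half-shares per family. Everything else
# here is illustrative only.
def household_dividend(adults: int, children: int, full_share: float = 1000.0) -> float:
    """Annual dividend for one household under the scheme Hansen outlines."""
    half_shares = min(children, 2)  # no more than two half-shares per family
    return adults * full_share + half_shares * (full_share / 2)

print(household_dividend(adults=2, children=2))  # 3000.0, matching the family figure in the interview
print(household_dividend(adults=1, children=3))  # 2000.0, the third child adds nothing under the cap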


News Article | March 14, 2017
Site: www.techtimes.com

Hollywood and science fiction have repeatedly tackled the idea of worlds below the surface of our planet. If an expanding Eastern Siberian crater is taken as a reference point, then perhaps a portal to a subterranean world may have been found. A doorway to hell — as others have labeled it. But while there are those like the local folk who believe that the crater may be an entrance to the underworld, to climate scientists, it is an opportunity to view more than 200,000 years of climate change in Siberia. In a study published in the journal Quaternary Research in February, scientists reported the Batagaika Crater to be a kilometer (0.62 mile) long and 86 meters (about 282 feet) deep. Batagaika is located about 6 miles southeast of the town of Batagay, in the Verkhoyansk region of northern Yakutia. The Batagaika Crater, which first formed in the 1960s after a large tract of forest was cleared, is the result of thawing permafrost, soil that stays frozen for two or more consecutive years. The jagged terrain left behind by such thawing is known as thermokarst. Seen from the air, the "megaslump" closely resembles a one-celled organism with a tail structure that could propel it through an aquatic environment. In a way, that impression of movement has some basis: satellite imagery indicates that the depression grows by up to 15 meters annually. Scientists who seek to understand how climate change affects permafrost are optimistic about how much data the partially manmade phenomenon could yield, and about the insights into the climate it could provide. An earlier expedition gathered samples of plants and soil, and tried to identify the age of the layers of soil that had been frozen in the permafrost. According to Professor Julian Murton of the University of Sussex, the project will allow scientists to compare these data with those from similar sites in other parts of the planet, such as Greenland, China, and Antarctica. Data on ancient soils and vegetation taken from collected soil sediments will help in reconstructing the history of the planet. Rapid deforestation and greenhouse gas emissions are both recognized contributors to global warming. In Batagaika, there is a phenomenon called "drunken trees," tilting trees that provide less shade, so the permafrost thaws faster and erosion sets in sooner. Scientists warn that thermokarst may also amplify warming in the future. According to a 2016 study in the journal Nature Communications, greenhouse gases released from permafrost contributed to a substantial rise in temperatures during the most recent Ice Age, an event that could be repeated in the future. After the study's release, co-author Francesco Muschitiello told Columbia University's science blog that "the Arctic carbon reservoir locked in the Siberian permafrost has the potential to lead to massive emissions of the greenhouse gases carbon dioxide and methane to the atmosphere." The Batagaika area is one of the coldest inhabited places on Earth, and the remains of mummified ancient bison, horses, elk, mammoths, and reindeer — some estimated to be more than 4,400 years old — have been found there.


News Article | March 15, 2017
Site: www.techtimes.com

A study by scientists at the Mailman School of Public Health at Columbia University suggests that B-vitamin supplements may help reduce the harmful effects of air pollution. B vitamins are water-soluble vitamins that play an essential role in cell metabolism. The study examines how B vitamins can lessen the harmful effects of air pollution on the epigenome, and how those epigenetic effects in turn harm health. It also points to preventive measures individuals can take to safeguard against air pollution's adverse effects. "Our study launches a line of research for developing preventive interventions to minimize the adverse effects of air pollution on potential mechanistic markers. Because of the central role of epigenetic modifications in mediating environmental effects, our findings could very possibly be extended to other toxicants and environmental diseases," said Andrea Baccarelli, chair of Environmental Health Sciences at the Mailman School. During the study, the investigators administered one B-vitamin supplement or a placebo daily to the participants. The volunteers were healthy non-smokers between 18 and 60 years of age who were not taking any medication before the research was conducted. Measurements taken before and after supplementation showed that median plasma levels of folic acid, vitamin B12 and vitamin B6 increased among the people who took the B vitamins, while the people who received the placebo over the four-week period showed almost no change. All measurements were taken at the same time of day. The ambient pollution used in the exposures was drawn from an area next to a busy street in Toronto where at least 1,000 automobiles passed every hour; it was delivered to participants through an oxygen mask, and blood samples were collected and analyzed using an Infinium Human Methylation 450K BeadChip. Baccarelli notes that emission control regulation is the most essential part of prevention; unfortunately, rules are not as strict in other major cities of the world. Further research is needed to document the effects of B vitamins on the adverse conditions arising from air pollution. Such studies may eventually support the use of B-vitamin supplements in limiting air pollution's harmful effects. The study has been published in the journal PNAS.


News Article | April 25, 2017
Site: www.eurekalert.org

Working with human brain tissue samples and genetically engineered mice, Johns Hopkins Medicine researchers together with colleagues at the National Institutes of Health, the University of California San Diego Shiley-Marcos Alzheimer's Disease Research Center, Columbia University, and the Institute for Basic Research in Staten Island say that consequences of low levels of the protein NPTX2 in the brains of people with Alzheimer's disease (AD) may change the pattern of neural activity in ways that lead to the learning and memory loss that are hallmarks of the disease. This discovery, described online in the April 25 edition of eLife, will lead to important research and may one day help experts develop new and better therapies for Alzheimer's and other forms of cognitive decline. AD currently affects more than five million Americans. Clumps of proteins called amyloid plaques, long seen in the brains of people with AD, are often blamed for the mental decline associated with the disease. But autopsies and brain imaging studies reveal that people can have high levels of amyloid without displaying symptoms of AD, calling into question a direct link between amyloid and dementia. This new study shows that when the protein NPTX2 is "turned down" at the same time that amyloid is accumulating in the brain, circuit adaptations that are essential for neurons to "speak in unison" are disrupted, resulting in a failure of memory. "These findings represent something extraordinarily interesting about how cognition fails in human Alzheimer's disease," says Paul Worley, M.D., a neuroscientist at the Johns Hopkins University School of Medicine and the paper's senior author. "The key point here is that it's the combination of amyloid and low NPTX2 that leads to cognitive failure." Since the 1990s, Worley's group has been studying a set of genes known as "immediate early genes," so called because they're activated almost instantly in brain cells when people and other animals have an experience that results in a new memory. The gene NPTX2 is one of these immediate early genes that gets activated and makes a protein that neurons use to strengthen "circuits" in the brain. "Those connections are essential for the brain to establish synchronized groups of 'circuits' in response to experiences," says Worley. "Without them, neuronal activation cannot be effectively synchronized and the brain cannot process information." Worley says he was intrigued by previous studies indicating altered patterns of activity in brains of individuals with Alzheimer's. Worley's group wondered whether altered activity was linked to changes in immediate early gene function. To get answers, the researchers first turned to a library of 144 archived human brain tissue samples to measure levels of the protein encoded by the NPTX2 gene. NPTX2 protein levels, they discovered, were reduced by as much as 90 percent in brain samples from people with AD compared with age-matched brain samples without AD. By contrast, people with amyloid plaques who had never shown signs of AD had normal levels of NPTX2. This was an initial suggestion of a link between NPTX2 and cognition. Prior studies had shown NPTX2 to play an essential role for developmental brain wiring and for resistance to experimental epilepsy. To study how lower-than-normal levels of NPTX2 might be related to the cognitive dysfunction of AD, Worley and his collaborators examined mice bred without the rodent equivalent of the NPTX2 gene. 
Tests showed that a lack of NPTX2 alone wasn't enough to affect cell function as tested in brain slices. But then the researchers added to mice a gene that increases amyloid generation in their brain. In brain slices from mice with both amyloid and no NPTX2, fast-spiking interneurons could not control brain "rhythms" important for making new memories. Moreover, a glutamate receptor that is normally expressed in interneurons and essential for interneuron function was down-regulated as a consequence of amyloid and NPTX2 deletion in mouse and similarly reduced in human AD brain. Worley says that results suggest that the increased activity seen in the brains of AD patients is due to low NPTX2, combined with amyloid plaques, with consequent disruption of interneuron function. And if the effect of NPTX2 and amyloid is synergistic -- one depending on the other for the effect -- it would explain why not all people with high levels of brain amyloid show signs of AD. The team then examined NPTX2 protein in the cerebrospinal fluid (CSF) of 60 living AD patients and 72 people without AD. Lower scores of memory and cognition on standard AD tests, they found, were associated with lower levels of NPTX2 in the CSF. Moreover, NPTX2 correlated with measures of the size of the hippocampus, a brain region essential for memory that shrinks in AD. In this patient population, NPTX2 levels were more closely correlated with cognitive performance than current best biomarkers -- including tau, a biomarker of neurodegenerative diseases, and a biomarker known as A-beta-42, which has long been associated with AD. Overall, NPTX2 levels in the CSF of AD patients were 36 to 70 percent lower than in people without AD. "Perhaps the most important aspect of the discovery is that NPTX2 reduction appears to be independent of the mechanism that generates amyloid plaques. This means that NPTX2 represents a new mechanism, which is strongly founded in basic science research, and that has not previously been studied in animal models or in the context of human disease. This creates many new opportunities," says Worley. "One immediate application may be to determine whether measures of NPTX2 can be helpful as a way of sorting patients and identifying a subset that are most responsive to emerging therapies." Worley says. For instance, drugs that disrupt amyloid may be more effective in patients with relatively high NPTX2. His group is now providing reagents to companies to assess development of a commercial test that measures NPTX2 levels. More work is needed, Worley adds, to understand why NPTX2 levels become low in AD and how that process could be prevented or slowed. In addition to Paul Worley, the study's authors are Meifang Xiao, Desheng Xu, Chun-Che Chien, Yang Shi, Juhong Zhang, Olga Pletnikova, Alena Savonenko, Roger Reeves, and Juan Troncoso of Johns Hopkins University School of Medicine; Michael Craig of University of Exeter; Kenneth Pelkey and Chris McBain of the National Institute of Child Health and Human Development; Susan Resnick of the National Institute on Aging's Intramural Research Program; David Salmon, James Brewer, Steven Edland, and Douglas Galasko of the Shiley-Marcos Alzheimer's Disease Research Center at the University of California San Diego Medical Center; Jerzy Wegiel of the Institute for Basic Research in Staten Island; and Benjamin Tycko of Columbia University Medical Center. 
Funding for the studies described in the eLife article was provided by the National Institutes of Health under grant numbers MH100024, R35 NS-097966, P50 AG005146, and AG05131, Alzheimer's Disease Discovery Foundation and Lumind.


News Article | April 26, 2017
Site: www.eurekalert.org

Dr. Abbas Ardehali, a professor of surgery and medicine in the division of cardiothoracic surgery at the David Geffen School of Medicine at UCLA, has been selected as a 2017 recipient of the Ellis Island Medal of Honor by the National Ethnic Coalition of Organizations. Ardehali will receive the award at a ceremony on May 13 at historic Ellis Island in New York City. The medals are awarded annually to a group of distinguished U.S. citizens who exemplify a life dedicated to community service. These are individuals who preserve and celebrate the history, traditions and values of their ancestry while exemplifying the values of the American way of life, and who are dedicated to creating a better world. Since it was established in 1986, the Ellis Island Medal has been officially recognized by both Houses of Congress as one of the nation's most prestigious awards. Past recipients have included six U.S. presidents, former Secretary of State Hillary Clinton, as well as such notables as Frank Sinatra, Lee Iacocca, Quincy Jones, Muhammad Ali, Nobel laureate Elie Wiesel, Louis Zamperini and Rosa Parks. "I was surprised and honored to be informed that I was selected as a 2017 Ellis Island Medal of Honor awardee," said Ardehali. "I really think this recognition is a reflection of the accomplishments that our UCLA heart and lung transplant teams have achieved together." Ardehali serves as director of the UCLA Heart and Lung Transplant program, which was ranked as the largest combined heart and lung transplant program in the United States in 2016 by the United Network for Organ Sharing. Ardehali and his colleagues have been leaders in implementing new technologies to advance the field of heart and lung transplantation and have the depth of experience to take many of the more complex cases that other transplant centers are unable to accept. His professional accomplishments include a role in developing an innovative technology for transporting human heart and lungs in a beating or breathing state. The technology could help to improve clinical outcomes and expand the donor pool of organs to help patients. He also developed and patented a new technology that will improve the care of patients with end-stage lung disease. Ardehali has served as a volunteer on several committees for the United Network for Organ Sharing and several scientific organizations, with leadership positions in the International Society of Heart and Lung Transplantation, American Society of Transplant Surgeons, American Association of Thoracic Surgery and Society of Thoracic Surgeons. Among his many honors and awards, Ardehali received a Resolution of Commendation from the California State Assembly and the Breath of Life Award from the Cystic Fibrosis Foundation. Ardehali has been a faculty member at UCLA since 1997. He also served as chief of cardiothoracic surgery at West Los Angeles Veterans Hospital from 1998 to 2012. He co-authored a textbook, "Khonsari's Cardiac Surgery: Safeguards and Pitfalls in Operative Technique," published in 2016. He has authored numerous book chapters and more than 100 peer-reviewed manuscripts and abstracts. Ardehali has been interviewed by ABC News, the Associated Press, CNN, Fox News, NBC News, CBS News, "The Doctor's Show," and Al Jazeera America. He completed his fellowship in cardiothoracic surgery at UCLA and his internal medicine residency at UC San Francisco. 
He earned a master's degree in public health at UC Berkeley and a medical degree at Emory University School of Medicine, as well as a master's degree in chemical and biochemical engineering and an undergraduate degree in biology and biochemistry, both from Rutgers University. Born in Tehran, Iran, Ardehali moved to the United States when he was in high school. He and his wife, Mitra, who is a practicing dentist, have two daughters, Leila and Sara, currently attending Barnard College and Columbia University.


News Article | April 17, 2017
Site: www.newscientist.com

The firing of every neuron in an animal’s body has been recorded, live. The breakthrough in imaging the nervous system of a hydra – a tiny, transparent creature related to jellyfish – as it twitches and moves has provided insights into how such simple animals control their behaviour. Similar techniques might one day help us get a deeper understanding of how our own brains work. “This could be important not just for the human brain but for neuroscience in general,” says Rafael Yuste at Columbia University in New York City. Instead of a brain, hydra have the most basic nervous system in nature, a nerve net in which neurons spread throughout its body. Even so, researchers still know almost nothing about how the hydra’s few thousand neurons interact to create behaviour. To find out, Yuste and colleague Christophe Dupre genetically modified hydra so that their neurons glowed in the presence of calcium. Since calcium ions rise in concentration when neurons are active and fire a signal, Yuste and Dupre were able to relate behaviour to  activity in glowing circuits of neurons. For example, a circuit that seems to be involved in digestion in the hydra’s stomach-like cavity became active whenever the animal opened its mouth to feed. This circuit may be an ancestor of our gut nervous system, the pair suggest. A second circuit fires when the hydra contracts its body into a ball to hide from predators. A third seems to sense light and may help let the animal know when to eat – despite being blind, hydra need light to hunt and they do more of this in the morning. The team found that no neuron was a member of more than one circuit. This suggests the animal has evolved distinct networks for each reflex – a primitive arrangement, much less complex than our own interconnected nervous systems. Nevertheless, the hydra is the first step towards breaking the neural code – the way that neural activity determines behaviour, says Yuste. “Hydra have the simplest ‘brain’ in the history of the earth, so we might have a shot at understanding those first and then applying those lessons to more complicated brains,” he says. Yuste hopes that seeing how the circuits work in real time might lead to new insights into the human brain and tell us more about mental illnesses such as schizophrenia, for example. “We cannot cure patients until we know how the system works,” he says. Yuste was one of several neuroscientists, including George Church at Harvard University, who launched the Brain Activity Map Project in 2012. It was a rallying cry to neuroscientists, calling on them to record the activity of every neuron in the human brain. The project forms the central plank of the billion-dollar BRAIN Initiative launched by President Obama’s administration in 2013. The hydra is now the first animal to have one of these maps created for the whole body, although the activity of the whole brains of zebrafish have also been mapped in a similar way. The work is an “awesome milestone worth celebrating”, says Church. But scaling this up to rodents or primates will be very challenging, he says. Dale Purves, a neuroscientist at the Duke Institute for Brain Sciences, North Carolina, doubts if the animal will prove useful for understanding ourselves. “You have to ask: is this an animal that’s going to join the fruit fly, worm and mouse as a model organism to look at in the quest to better understand the nervous system?” he says. 
“My answer would unfortunately be no.” But Yuste is now collaborating with seven other teams to decipher the hydra’s neural code. They want to get such a complete understanding of the way its neurons fire that they can use a computational model to predict its behaviour just from its neural activity. “One of our dreams is to get to the point in neuroscience that genetics got to when they figured out the DNA double helix,” says Yuste. While some have suggested that the brain is too complicated for that, Yuste is optimistic. “I hope it will happen in our lifetime and it will be an aha moment when the jigsaw puzzle comes together,” he says.


News Article | April 19, 2017
Site: news.yahoo.com

Cutaway of Antarctica with data on the glaciers and ice shelf (AFP Photo/Sophie RAMIS, Thomas SAINT-CRICQ) Paris (AFP) - Antarctic meltwater lakes are far more common than once thought and could destabilise glaciers, potentially lifting sea levels by metres as global warming sets in, scientists said Wednesday. Most vulnerable are the massive, floating ice shelves that ring the Antarctic continent and help prevent inland glaciers from sliding toward the sea, they reported in the journal Nature. Antarctica holds enough frozen water to push up global oceans by tens of metres. Meltwater pooling on the surface of ice shelves can suddenly drain below the surface, fracturing the ice with heat and pressure, studies have shown. "This is widespread now, and has been going on for decades," said lead author Jonathan Kingslake, a glaciologist at Columbia University's Lamont-Doherty Earth Observatory. "Most polar scientists have considered water moving across the surface of Antarctica to be extremely rare -- but we found a lot of it over very large areas," he said in a statement. To piece together a "big picture", Kingslake and his team combed through thousands of photos taken from military aircraft starting in 1947, along with satellite images dating back to 1973. They catalogued nearly 700 distinct networks of interconnected ponds, channels and streams criss-crossing the continent. A few reached to within 600 kilometres (375 miles) of the South Pole at altitudes topping 1,300 metres (4,300 feet), where liquid water was assumed to be rare or nonexistent. Rising temperatures are eroding ice shelves -- which can be hundreds of metres thick and extend hundreds of kilometres over ocean water -- on two fronts, scientists say. From above, warmer air and shifting winds remove snow cover, exposing the bedrock ice underneath. Because ice has a darker, blueish tint, it absorbs more of the Sun's radiation rather than reflecting it back into space. But the main damage to ice shelves comes from ocean water eroding their underbellies. Normally, that erosion is compensated by the accumulation of fresh snow and ice from above. But oceans in recent decades have absorbed much of the excess heat generated by global warming, which has lifted average global air temperatures by one degree Celsius (1.8 degrees Fahrenheit) since the mid-19th century. Temperatures in Earth's polar regions have risen twice as fast during the same period. On the Antarctic Peninsula -- which juts north toward South America -- they have shot up by 3.5 C (6.3 F) in just the last 50 years. Indeed, in a dress rehearsal of what might happen elsewhere, large chunks of the peninsula's Larsen Ice Shelf fell dramatically into the ocean within days in 1995 and 2002 -- due in large part to the impact of pooling waters, scientists now believe. Another huge piece of the same ice shelf, half the size of Jamaica, is hanging by a thread and could break off at any moment, scientists monitoring the future iceberg have said. "This study tells us that there is already a lot more melting going on than we thought," said Robin Bell, a polar scientists at the same institute and lead author of a second study, also published in Nature, on Antarctic meltwater. Bell and colleagues looked at the movement of water on the surface of Nansen Ice Shelf, also part of the Antarctica peninsula, and found that its drainage system may in fact help relieve pressure. 
The elaborate, river-like system on the 50-kilometre (30-mile) long shelf was first observed more than a century ago, but recent aerial images and remote sensing show that it has remained remarkably stable, the study found. During the southern hemisphere summer, the meltwater is efficiently drained through sinkholes and a "roaring 400-foot-wide waterfall into the ocean," Bell said. Taken together, the two studies outline diverging scenarios of how the icy continent might respond to global warming and an increase in meltwater, the authors said.


News Article | April 25, 2017
Site: www.eurekalert.org

A new study indicates that the number of plant and animal species at risk of extinction may be considerably higher than previously thought. A team of researchers, however, believe they've come up with a formula that will help paint a more accurate picture. The study appears in the journal Biological Conservation. The maps describing species' geographic ranges, which are used by the International Union for Conservation of Nature (IUCN) to determine threat status, appear to systematically overestimate the size of the habitat in which species can thrive, said Don Melnick, senior investigator on the study and the Thomas Hunt Morgan Professor of Conservation Biology in the Department of Ecology, Evolution and Environmental Biology (E3B) at Columbia University. "Concerned about this issue, we aimed to determine how far off those maps were. In doing so, we found there is an enormous amount of freely available data on many species around the world that can be employed to get a better picture of exactly how many species are truly under extreme threat. This picture, grim as it may be, is necessary if we are going to accurately plan the steps needed to stem those threats, locally and globally." Currently, IUCN makes use of species sightings reported by experts to draw boundaries reflecting the geographic range of a given species. From these maps, the IUCN develops its Red List, which assigns a threat status to wild species: Vulnerable, Endangered, or Critically Endangered. Though the accuracy of threat risk assigned to a species relies heavily on these maps, Melnick and his colleagues believe they almost always overestimate the actual distribution of a species by incorporating areas of unsuitable habitat. This overestimation of range size, in turn, leads to a significant overestimation of population size and therefore an underestimation of extinction risk. In an effort to determine how exaggerated the IUCN range maps might be, the team analyzed the maps established for 18 endemic bird species with varying IUCN-assigned extinction threat levels inhabiting the Western Ghats mountain chain of southwest India. Melnick's student, Vijay Ramesh, and two other researchers from India studying in the United States, pored over data from the world's largest citizen science database (eBird), and also gathered freely available and geo-referenced data on the climate, vegetation, ecology, and geo-physical attributes of the Western Ghats. The team then used local experts to sift through those data and verify their accuracy. By bringing together carefully curated citizen science data on the sightings of each species with the other data types, they were able to build a profile of where each species is likely to be found - at what elevation, at what temperature range, in what types of vegetation, etc. This allowed them to estimate new geographic ranges for each species that they believe are much more accurate than the IUCN range maps. The new range estimates from the Columbia study revealed that the IUCN maps for 17 of the 18 bird species contained large areas of unsuitable habitat and vastly overestimated their ranges. By extension, the threat levels which are correlated to species range size are probably underestimated, Melnick said, and the study suggests that IUCN threat status for at least 10 of the 18 species should be elevated. "We were extremely surprised by how much the IUCN ranges overestimated what we deem the true ranges to be," he added. 
"In a number of cases the ranges were overestimated by an order of magnitude. The drastic reduction in range size and the increased habitat fragmentation that our study indicates leads us to infer that there is a much greater threat to these endemic birds than was ever imagined." The study points to a new way of estimating species ranges for conservation purposes, Melnick said, adding that the use of freely available, digitized, and geo-referenced citizen science data, along with biological and geophysical data, and sophisticated statistical modeling can and should be applied to plant and animal species around the globe so that IUCN can more accurately assess the threat to species worldwide. "IUCN's criteria for establishing threat levels for species are excellent; however, the data to which those criteria are being applied need to be updated using an approach like the one we have developed for the Western Ghats," Melnick said. "By using citizen science data in a careful way, we may find there is an urgent need to start protecting species we thought were flourishing but are actually in danger of spiraling toward extinction."


News Article | May 2, 2017
Site: news.yahoo.com

Donald Trump and Russian President Vladimir Putin have agreed to work together to make diplomatic progress on the threat posed by North Korea, following weeks of escalating tensions between the state and the US. The White House said in a short statement on the call the two leaders spoke about the best way to resolve the "very dangerous" situation in North Korea. The Kremlin said that Mr Putin and Mr Trump have agreed during a phone call to try to schedule a face-to-face meeting in Germany in July - around the G20 summit in Hamburg. Merkel raises concerns with Putin over Chechnya 'torture' of gay men Matthew Wallin, Senior Fellow at American Security Project said: "Russia’s influence will be helpful [on North Korea], but not nearly as much as China’s cooperation." He said North Korean leader Kim Jong-Un needs to see that having functioning nuclear weapons is a bigger threat to the country's existence "than the threat of an American or South Korean invasion without them." "Like China, Russia sees North Korea as a useful check on American power in the region", but Mr Wallin said it was "unclear" if either Mr Putin or Chinese President Xi Jinping are willing to put pressure on Mr Kim. Moscow described the call as "business-like and constructive" with the two presidents also discussing the crisis in Syria. The RIA news agency, citing the Kremlin, said that Mr Trump and Mr Putin would look to step up contact between US Secretary of State Rex Tillerson and Russian Foreign Minister Sergei Lavrov with the aim of intensifying diplomatic efforts over Syria. Both statements are fairly "generic" on the hot-button issue according to Dr. Steve Sestanovich, who is a professor at Columbia University and a Senior Fellow at the Council on Foreign Relations. Mr Sestanovich, also a former ambassador-at-large at the State Department, told The Independent that "the entire chemical weapons controversy is ignored [in the statements] - no UN Security Council resolution, no international investigation, no warnings against further use, no reference to any of the parties in the Syrian civil war." The White House said that the conversation included an agreement that "all parties must do all they can to end the violence" in Syria, which is in the sixth year of a civil war that has also helped fuel the rise of terror group Isis. However, Mr Wallin said it is "very unlikely we’ll see significant action on ending the suffering in that country" as a result of this call. The White House noted the Syria conversation was "very good," where the two leaders discussed the creation of safe zones in the country and agreed that the suffering in Syria “has gone on for far too long.” Mr Wallin explained that "the creation of safe zones without a plan to resolve the core issues of the conflict also risks perpetuating those safe zones." He said it would create "permanent refugee camps with no sustainable economy, and likely full of hunger, suffering, and recruiting opportunities for terrorists." This is the first known discussion between the leaders since the US missile strikes against a Syrian government air base, however Mr Sestanovich said there was nothing in either side's statements that "suggests progress."


News Article | April 18, 2017
Site: www.fastcompany.com

Around seven years ago, Brian Kateman was eating a hamburger on a plane as he flew to a conference where he was presenting research on tree ring data and climate change that he had conducted for a college class. “I was always the guy on campus who identified as an environmentalist, telling people to take shorter showers and carry around reusable water bottles,” Kateman tells Fast Company. But until his friend looked over, saw Kateman chowing down on ground beef while poring over notes on the declining state of our planet, and tossed him The Ethics of What We Eat —Peter Singer’s seminal book that explores the impact our food choices have on animals, ourselves, and the environment—Kateman never made the connection between meat consumption and climate change . That moment, Kateman says, began a real shift for him. Learning that large-scale meat production accounts for around 14.5% of global greenhouse gas emissions, Kateman became a vegetarian shortly thereafter. But the strictures of a completely meat-free life chafed at him. One piece of turkey at Thanksgiving, Kateman reasoned, would not dig a deep enough carbon footprint to negate the benefits of every other meat-free meal he consumed. With the idea that any variety of meat reduction—whether it be veganism, vegetarianism, or just deciding to cut out meat one day a week—still benefits the planet, Kateman founded the Reducetarian Foundation in 2014 while studying conservation biology at Columbia University. In The Reducetarian Solution, a new anthology edited by Kateman, thought leaders from Singer to economist Jeff Sachs to environmentalist Bill McKibben sound off on the reasons why less meat is a good thing for humans and the planet we inhabit—and why it’s more important to focus on gradual cutbacks and their benefits than forcing yourself into a category like vegan or “flexitarian,” where the focus might drift more toward obeying a set of rules than focusing on a specific global outcome. The reasons, according to The Reducetarian Solution, are legion. The anthology contains no less than 72 short essays, organized into three overarching categories of mind, body, and planet that traverse every possible argument for meat reduction, from the moral-ethical (large-scale livestock operations expose animals to inhumane conditions), to the health and productivity focused (red meat is linked to sluggishness, heart disease, and cancer), to the environmental (meat production pollutes the air and is an inefficient use of resources). “I love the idea of all these different thought leaders coming together and being united on a single front,” Kateman says. “We don’t have to agree on everything. We don’t have to agree on what the ideal reduction is; we don’t have to agree on the most important cause areas. But we all agree that reducing societal consumption of animal products absolutely has to happen. 
And we’ll reach that common goal much faster if we work together than continuing to work in silos.” Reducetarianism, Kateman says, differs crucially from categories like “flexitarian” and “semi-vegetarian” because while the latter describe people who primarily consume plant-based diets and occasionally “cheat” on their commitments (a concept for which Kateman has little patience—punishing yourself for taking the occasional bite of burger distracts from the fact that eating less meat overall is still a net positive), reducetarianism aims for inclusivity, and an acknowledgement, as Kateman writes in the anthology, “that people are at different stages of willingness and commitment to eating less meat.” The most strident vegans and vegetarians, Kateman says, advocate for a complete end to global meat consumption. Kateman recognizes that at least for the time being, that is an impossible request for various reasons, not the least of them being the cultural inertia of meat consumption: An especially fascinating essay by author Anastacia Marx de Salcedo navigates the influence of the military on linking carnivorousness with masculinity. And for some people, it’s a matter of habit. For an example of this, Kateman looks to his parents, whom he estimates eat around 200 pounds each of meat per year, shy of the 270 pounds the average American puts away each year, but still 100 pounds over the global average. “If you think about it, getting people like my parents to cut back 20%—for say, a reduction of 40 pounds per year—that’s actually a greater win for the planet than getting someone who eats maybe five pounds per year to go completely vegetarian,” Kateman says. “I think we have to think about this in terms of societal consumption of animal products rather than pinning it on one particular person.” While the majority of the essays in the book are written through the lens of the U.S., the philosophies have a global application. China now consumes over a quarter of the world’s meat, despite new governmental dietary restrictions aiming to cut the country’s meat consumption in half by 2030 (and reduce associated greenhouse gas emissions by 1 billion tons). One could much more easily see that goal being accomplished if everyone rolls back their individual consumption than if half of China spontaneously goes vegetarian. Even for those well-versed in the arguments in favor of the consumption of meat reduction, like Kateman, The Reducetarian Solution will likely illuminate a previously unconsidered angle. For Kateman, that happened when he read the essay by Dawn Moncrief, the founding director of A Well-Fed World, an organization aiming to tackle world hunger by cutting back on animal consumption. Moncrief’s essay slices into the livestock industry’s crop-use inefficiencies, describing how 25 calories in the form of feed are needed to produce one calorie of edible beef. “Just imagine if you were at a restaurant and saw someone present you with 25 plates of food, then they threw out 24 of those plates and you were left with a single plate,” Kateman says. “The restaurant would be in an uproar, but that’s what happens every day on factory farms in terms of having to feed these animals.” It’s a difficult concept to grasp, Kateman says, because we are so removed from that process, but reading Moncrief’s essay brought the inefficiencies into sharp relief in the context of global hunger. 
The Reducetarian Solution anthology, Kateman says, is neither a prescription nor a fix for livestock-industry produced climate change. But the variety of approaches and perspectives contained in the book testify to how open the avenues are for bringing about this kind of change. In late May, a month after the release of the anthology, the Reducetarian Foundation will host its first summit in New York City to tackle the question of, “How do we as individuals, organizations, communities, and societies work to systematically decrease meat consumption?”
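The two numbers quoted in the piece, the 25-to-1 feed-to-beef calorie ratio from Moncrief's essay and Kateman's point that a 20 percent cut by a 200-pound-a-year eater outweighs a 5-pound-a-year eater going fully vegetarian, work out as in the short Python sketch below. The figures come from the article; the code itself is only an illustration.

# Worked arithmetic for the figures quoted above; the numbers come from the article,
# the functions are illustrative only.
def feed_calories_for_beef(beef_calories: float, ratio: float = 25.0) -> float:
    """Feed calories needed per edible beef calorie, using the 25:1 ratio cited."""
    return beef_calories * ratio

def annual_reduction_lbs(pounds_per_year: float, cut_fraction: float) -> float:
    """Pounds of meat removed per year by a given fractional cut."""
    return pounds_per_year * cut_fraction

print(feed_calories_for_beef(1.0))        # 25.0 feed calories per edible beef calorie
print(annual_reduction_lbs(200.0, 0.20))  # 40.0 lb/yr: a 20% cut by a heavy eater
print(annual_reduction_lbs(5.0, 1.00))    # 5.0 lb/yr: a light eater going fully vegetarian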


News Article | April 4, 2017
Site: www.medicalnewstoday.com

Chronic fatigue syndrome (CFS), also sometimes referred to as myalgic encephalomyelitis (ME), affects more than 1 million people in the United States. The disease is usually most prevalent in women in their 40s and 50s, with CFS being four times more frequent in women than in men. Symptoms include joint pain, painful lymph nodes, having trouble sleeping, and headaches, as well as difficulty concentrating and remembering things. Medical professionals do not yet know what causes the disease. CFS is difficult to identify as there is no test for it, and because it shares some of its symptoms with other illnesses. However, new research investigates the biological basis for the illness and identifies two subgroups of CFS that go on to develop differently: the so-called classical CFS and an "atypical" variant. The study was carried out by researchers at the Center for Infection and Immunity (CII) at Columbia University's Mailman School of Public Health in New York, and it was led by Dr. Mady Hornig, director of translational research at CII and associate professor of epidemiology at the university. The results were published in the journal Translational Psychiatry. Hornig and team performed immunoassays to measure 51 immune biomarkers in the cerebrospinal fluid of 32 people with classical CFS, and another 27 with atypical CFS. The tests showed lower levels of immune molecules in those with atypical CFS than in those with the classical variant. The analyses revealed drastically lower levels of interleukin 7 (a protein that plays a key role in the adaptive immune response to infections), interleukin 17A, and chemokine ligand 9 (molecules with a key role in the adaptive immunity to neurological illnesses). Additionally, these biological features were accompanied by different trajectories of disease history and comorbidities. Those with atypical CFS tended to have a history of viral encephalitis and tended to fall ill after traveling abroad or receiving a blood transfusion. Furthermore, people with atypical CFS went on to develop simultaneous conditions such as seizure disorders, several types of cancers, or demyelinating disorders - that is, multiple sclerosis-like diseases that damage myelin, the protective sheath around the nerve cells in our brains and spinal cords. The senior author of the study and director of CII, Dr. Ian Lipkin, also explains the contribution of the findings: "Multiple biological pathways are likely involved in the pathogenesis of ME/CFS, with a range of clinical subtypes relating to variability in the types of environmental triggers, genetic and epigenetic vulnerability, as well as comorbidity patterns," he says. "Shedding light on these pathways may help us to identify the various agents that precipitate disease as well as to design more precise, targeted treatments." Overall, both atypical and classical CFS patients were revealed to have an abnormal immune system when compared with the general population. However, only people with classical CFS displayed the previously discovered 3-year mark of CFS - namely, after 3 years of having an "overzealous" immune system, CFS patients show signs of immune "exhaustion," with dramatic drops in their levels of immune molecules. In this new study, only those with classical CFS had this drop in immune molecules after 3 years, whereas those with atypical CFS displayed steady or increased levels of cytokines and chemokines - proteins that control the development and activation of immune cells. Study co-author Dr. Daniel L. 
Peterson, principal clinician at Sierra Internal Medicine in Incline Village, NV, comments on the significance of these findings: "Early identification of patients who meet the usual clinical criteria when first diagnosed but then go on to develop atypical features would help clinicians like myself identify and treat these complex cases and even prevent fatal outcomes." Hornig speculates on the mechanism that might be responsible for the differences between the two subgroups. She suggests that atypical patients may go through a "smoldering inflammatory process," in which their body's immune system is trying to recover, but she notes that further research is needed in order to test this hypothesis. She also suggests that genetic predispositions may cause the immune system to respond differently in atypical individuals. Researchers at CII continue to investigate other subgroups of CFS patients, such as patients with allergies, cognitive impairment, and gastrointestinal problems. Learn how altered gut bacteria could cause CFS.


News Article | April 28, 2017
Site: www.sciencenews.org

Researchers have pinpointed a gene that keeps important brain cells in mice from crossing their wires, providing a possible link between brain wiring and mood disorders like depression. Without the gene, called Pcdhαc2, mice acted more depressed, researchers report April 28 in Science. Nerve cells, or neurons, that produce the chemical messenger molecule serotonin extend long projections called axons to various parts of the brain. Serotonin released from the tips of the axons signal other neurons in these target areas to influence mood and other aspects of behavior. For efficient signaling, the axon tips must be properly spaced. In the new work, scientists from New York City, St. Louis and China found that such spacing is disrupted in mice lacking the Pcdhαc2 gene. As a result, serotonin-signaling circuits are not properly assembled and the mice exhibited behaviors indicating depression. Pcdhαc2 is found in a cluster of genes that contain the blueprints for proteins that protrude from the surface of cells. These proteins work like ID cards, says study coauthor Joseph Dougherty, a neurogeneticist at Washington University School of Medicine in St. Louis. As serotonin neuron axons branch out through the brain, they can recognize other axons carrying identical IDs and spread out to keep out of each other’s paths. This process, called tiling, evenly spaces the axons in their target areas within the brain. But for mice in which the whole gene cluster was deleted, serotonin axons don’t keep their distance from each other. They trip each other up, which prevents the axons from fully extending through the brain and delivering the usual doses of serotonin, says Tom Maniatis, a study coauthor and molecular neuroscientist at Columbia University. Uneven serotonin distribution affected the mice’s behavior. One test forced mice to swim for an extended period of time; mice with the cluster deleted were more likely to give up on swimming for survival. Other tests found no problem with the mutant mice’s muscles and movement, suggesting that mental state caused the surrendering behavior. Maniatis and colleagues found that deleting one group of genes in the cluster didn’t cause the axon tangling, leaving two possible suspects. Of those two, only the protein coded by the Pcdhαc2 gene was found in the serotonin neurons. The exact mechanism for how the protein keeps the axons in line still isn’t known, says Maniatis. But researchers think something within the protein repulses other proteins with the same ID card, keeping the axons far enough apart that they don’t tangle. What’s more clear, says Dougherty, is how disabling this gene can cause mice to behave in a way that is reminiscent of depression. Sean Millard, a neuroscientist at the University of Queensland School of Biomedical Sciences in Australia, researches fly genes that influence neuron spacing. This genetic finding in mice is consistent with what’s been found in flies, Millard says. “It’s really nice to see that similar mechanisms are working in higher organisms.” Further research would look at whether mutations of the same gene in humans could contribute to depression. The implications of neuron wiring in this study may also point researchers in a new direction for research into psychiatric disorders connected to serotonin. Many previous studies have focused on how serotonin is transmitted and synthesized, as well as the genes that control levels of the chemical. 
Instead, “maybe we should be looking at genes that regulate serotonin wiring,” Dougherty says.


News Article | April 24, 2017
Site: www.eurekalert.org

(Columbia University School of Engineering and Applied Science) Columbia Engineering Professor Yuan Yang has developed a new method that could lead to lithium batteries that are safer, have longer battery life, and are bendable, providing new possibilities such as flexible smartphones. His new technique uses ice-templating to control the structure of the solid electrolyte for lithium batteries that are used in portable electronics, electric vehicles, and grid-level energy storage. The study is published online April 24 in Nano Letters.


News Article | April 17, 2017
Site: www.prweb.com

LearnHowToBecome.org, a leading resource provider for higher education and career information, has analyzed more than a dozen metrics to rank Missouri’s best universities and colleges for 2017. Of the 40 four-year schools on the list, Washington University in St. Louis, Saint Louis University, Maryville University of Saint Louis, William Jewell College and Rockhurst University were the top five. 14 two-year schools also made the list, and State Fair Community College, Crowder College, Jefferson College, East Central College and State Technical College of Missouri were ranked as the best five. A full list of the winning schools is included below. “The schools on our list have created high-quality learning experiences for students in Missouri, with career outcomes in mind,” said Wes Ricketts, senior vice president of LearnHowToBecome.Org. “They’ve shown this through the certificates and degrees that they offer, paired with excellent employment services and a record of strong post-college earnings for grads.” To be included on the “Best Colleges in Missouri” list, schools must be regionally accredited, not-for-profit institutions. Each college is also appraised on additional data that includes annual alumni salaries 10 years after entering college, employment services, student/teacher ratio, graduation rate and the availability of financial aid. Complete details on each college, their individual scores and the data and methodology used to determine the LearnHowToBecome.org “Best Colleges in Missouri” list, visit: The Best Four-Year Colleges in Missouri for 2017 include: Avila University Baptist Bible College Calvary Bible College and Theological Seminary Central Methodist University-College of Liberal Arts and Sciences College of the Ozarks Columbia College Culver-Stockton College Drury University Evangel University Fontbonne University Hannibal-LaGrange University Harris-Stowe State University Kansas City Art Institute Lincoln University Lindenwood University Maryville University of Saint Louis Midwestern Baptist Theological Seminary Missouri Baptist University Missouri Southern State University Missouri State University-Springfield Missouri University of Science and Technology Missouri Valley College Missouri Western State University Northwest Missouri State University Park University Rockhurst University Saint Louis University Southeast Missouri State University Southwest Baptist University Stephens College Truman State University University of Central Missouri University of Missouri-Columbia University of Missouri-Kansas City University of Missouri-St Louis Washington University in St Louis Webster University Westminster College William Jewell College William Woods University Missouri’s Best Two-Year Colleges for 2017 include: Crowder College East Central College Jefferson College Lake Career and Technical Center Mineral Area College Missouri State University - West Plains Moberly Area Community College North Central Missouri College Ozarks Technical Community College St. Charles Community College State Fair Community College State Technical College of Missouri Texas County Technical College Three Rivers Community College About Us: LearnHowtoBecome.org was founded in 2013 to provide data and expert driven information about employment opportunities and the education needed to land the perfect career. Our materials cover a wide range of professions, industries and degree programs, and are designed for people who want to choose, change or advance their careers. 
We also provide helpful resources and guides that address social issues, financial aid and other special interests in higher education. Information from LearnHowtoBecome.org has proudly been featured by more than 700 educational institutions.


News Article | April 21, 2017
Site: www.prweb.com

The staff at Palm Beach Face is proud to announce that their practice founder, Michael Schwartz, MD, FACS, will be taking part in the 2017 London Marathon. Set to take place on April 23rd, the London Marathon has a long tradition of raising funds for a variety of charitable organizations. This year, Dr. Schwartz will run as part of team EMPOWER, raising money for the international charity Smile Train. Started in 1981 by former Olympic champion Chris Brasher and athlete John Disley, the London Marathon annually hosts more than thirty thousand runners, who travel a 26-mile course around the River Thames. Sponsored by Virgin Money, the long-distance event is part of the World Marathon Majors. As the largest charity race in the world, the London Marathon draws many competitors who run to raise funds for local and international causes. Dr. Schwartz ran his first marathon in 2008, after his father passed away from leukemia. Running for the Leukemia & Lymphoma Society, Dr. Schwartz found raising money through his marathon racing a gratifying way to honor his loved ones while giving back to the community. Since that time, Dr. Schwartz has run in six full and more than 20 half marathons across the United States. The London Marathon will be his first international race. This year, Dr. Schwartz will run for Smile Train, an international organization dedicated to helping children born with a cleft lip or palate. Since 1999, Smile Train has been committed to providing a sustainable approach to cleft correction. While surgery takes less than an hour and costs about $250 per child, the procedure is often out of reach for families in developing countries. Smile Train has provided doctors in more than 85 countries with the training and funds needed to perform cleft repair surgery in their own communities. These surgeons will then train other physicians to do the same. As part of the Smile Train 2017 Virgin Money London Marathon team, Dr. Schwartz has already raised $20,000. With donations from more than 90 individuals, Dr. Schwartz is the top fundraiser on the Smile Train team. To date, Dr. Schwartz has raised enough money to provide cleft lip and palate repair surgery for 80 children. A long-time Palm Beach facial plastic surgeon and ENT specialist, Dr. Schwartz is double board certified by the American Board of Facial Plastic and Reconstructive Surgery and the American Board of Otolaryngology. A graduate of Baylor College of Medicine, Dr. Schwartz completed his General Surgery Residency at New York’s Beth Israel Medical Center and his ENT Residency at Columbia University’s Presbyterian Medical Center. Dr. Schwartz is a Diplomate of the American Board of Otolaryngology. An esteemed author, speaker and educator, Dr. Schwartz has over 24 years of experience as a surgeon. He has extensive training in the latest advancements in facial plastic surgery. Over his decades of practice, Dr. Schwartz has developed his own innovative techniques for providing some of the most sought-after procedures in the United States, including rhinoplasty, blepharoplasty, face lift, otoplasty and neck lift. Dr. Schwartz’s practice, Palm Beach Face, is also one of Florida’s top providers of cutting-edge, noninvasive techniques, such as Botox and soft tissue fillers. To get additional details on Dr. Schwartz’s surgical and noninvasive techniques for facial rejuvenation, contact his West Palm Beach office at 561.228.5888. Appointments can be made at the second location of Palm Beach Face, in Boynton Beach, as well. 
Speak with a patient coordinator about scheduling an in-person or Skype consultation with Dr. Schwartz. Read more about April’s Virgin Money London Marathon, the international charity Smile Train, and Dr. Schwartz’s marathon fundraising campaign.


WASHINGTON, DC / ACCESSWIRE / April 22, 2017 / Distinguished leader in digital marketing services, CEO of DC-based RedPeg Marketing, Brad Nierenberg says that a well-defined and positive corporate culture is critical for employee retention as well as an organization's overall productivity and greater gains from its business operations. As the founder of the experiential ad agency, Nierenberg has realized that the individual experiences of employees have a much more significant impact on morale than the typical business-offered "morale boosters". The creative marketing expert recently shed light on the three most important aspects of creating a meaningful, productive, and thought provoking work environment. In 2012, Columbia University students conducted a study on the effects a company’s atmosphere has on its turnover rates. They came to the conclusion that employees were 45% more likely to leave a job with poor culture than one that provided positive experiences. The results also showed that employers who cultivate a rich work setting often enjoy retention rates as high as 77%. The first step, Brad Nierenberg states, is to welcome creativity. At RedPeg, Nierenberg takes the open door policy a step further. To his managers he asks, "Is your door really open? Are you truly encouraging fresh thinking?" If ideas are immediately rejected or disregarded by superiors, employees feel their professional growth is being stunted, and the business in turn suffers when they refrain from contributing again. In order to achieve company-wide growth, the creative process demands that new concepts and solutions be embraced and encouraged, not discouraged. Industry powerhouses like Google and 3M are a direct result of innovation from within. Google actually encourages staff to dedicate one day a week to creating new concepts. 3M's most successful product, the Sticky Note, was completely designed by employees. Managers must be able to accurately identify a person's strengths and weaknesses, and create tasks that allow them to maximize the impact of their talents. A recent poll by Gallup showed that putting these techniques into practice will result in employees being 7.8% more productive and six times more likely to be engaged in their job. When a team is compiled to successfully take advantage of its strengths, its work output is increased by 12.5%. In other words, more is being accomplished and those completing the jobs are also receiving immense satisfaction doing the work. Finally, and most importantly says Nierenberg, officemates must feel like they are surrounded by a second family. To accomplish this, he curates experiences that build deeper bonds between his employees. Each week he sends three employees from different departments out to lunch. Called the Three Amigos, this helps team members get to know each other in ways that may not happen in the office. Nierenberg has also rented a beach house during the summer and sent different sets of employees there. The resulting bonds are stronger than what could be achieved in the day to day work back at the office. Inevitably, conflicts will occur, but those moments of uncertainty should be used as a chance to work towards something greater, together. High levels of respect, trust, and comfort will make the countless hours of working closely together a rewarding experience. Treating employees as valued contributors helps employers to continually improve and adapt the workplace to best optimize culture and productivity. 
Brad Nierenberg is an entrepreneur and the President & CEO of RedPeg Marketing. With over 20 years of industry experience, Nierenberg has won several awards for his contributions to the field of Experiential Marketing. Besides being a nationally sought-after speaker, his written works have been featured in The Wall Street Journal, The Washington Post, and Inc. Magazine, along with many other prestigious business publications.


News Article | April 20, 2017
Site: news.yahoo.com

Antarctica's giant ice shelves, which ring the entire continent, act as buttresses and help hold back the land-bound glaciers behind them — glaciers that, if they flow into the ocean, could cause a catastrophic rise in sea levels. Studies have already shown that in response to the inexorable rise in global temperatures — fueled primarily by the greenhouse gases humans are pumping in the atmosphere — many glaciers in Antarctica are melting much faster than previously believed. There is one factor whose impact on Antarctica's glaciers and ice shelves is still not widely understood — the seasonal streams of meltwater that crisscross the continent, freezing over in winters. Two new studies published Wednesday in the journal Nature have now revealed that these seasonal flows of meltwater are much more extensive than previously thought, triggering concerns over their impact on the continent's already unstable ice shelves. "This is not in the future—this is widespread now, and has been for decades,” Jonathan Kingslake, a glaciologist at Columbia University’s Lamont-Doherty Earth Observatory, and lead author of one of the studies, said in a statement. “I think most polar scientists have considered water moving across the surface of Antarctica to be extremely rare. But we found a lot of it, over very large areas." The studies, based on a continent-wide survey, reveal the presence of nearly 700 seasonal systems of interconnected ponds, channels and streams — some of which run as far as 75 miles. Many of these streams start as close as 375 miles from the South Pole, and at 4,300 feet above sea level — at temperatures scientists hitherto believed liquid water was unlikely to exist. Although none of the mapped drainages are actually new, their very presence suggests that Antarctica may be much more vulnerable to melting than it was previously believed. This is because running water, even if it's confined primarily to the surface, could eventually fracture the continent's floating ice shelves. "Antarctica is already losing ice, but the direct effects of meltwater, which generally refreezes in winter, are probably negligible for now," Columbia University's Earth Institute explained in the statement. "The concern among glaciologists is that this could change in the future. Most loss right now is taking place near the edges, where giant, floating shelves of ice attached to the land are being eroded from underneath by warming ocean currents." However, the other study suggests that the drainage system on West Antarctica’s 695-square-mile Nansen Ice Shelf may actually be helping keep the shelf together over the past 100 years by draining excess meltwater during the summer and dumping it into the ocean. But, as the authors of the studies clarify, it is still not entirely clear how the presence of these streams is relevant to sea-level rise predictions, and whether there are even more melt zones lurking beneath the surface. Over the past 100 years, global average sea levels have risen nearly 7 inches. Some estimates suggest that if Antarctica’s ice sheet melts completely, it would raise sea levels by over 200 feet — enough to flood the planet's land masses. Although this is not something that is likely to happen anytime soon, several recent studies have pointed toward a warming trend on the continent. "Looking forward, it will be really important to work out how these systems will change in response to warming, and how this will affect the ice sheets," Kingslake said in the statement. 


News Article | April 17, 2017
Site: www.prweb.com

Ultimate Medical Academy’s community of learning will come together at 11 a.m. on March 25, 2017 at the USF Sun Dome, 4202 E. Fowler Ave. in Tampa, and online to celebrate the graduation (#UMAgrad) of the nonprofit higher education institution’s spring class of 2017. More than 600 UMA graduates from 38 states, including some from as far away as California and the Virgin Islands, will walk on Saturday, out of a total of more than 4,000 who will join them virtually in the Pomp and Circumstance. Thousands more will celebrate their success, including graduates from 47 states and Guam as well as audience members such as employer partners, families and friends, and educational leaders attending UMA’s inaugural K-20 Education Summit. The ceremony will open with the presentation of colors by the University of Tampa ROTC and the singing of the National Anthem led by retired United States Air Force Sergeant Sonya Bryson, a regular performer for the Tampa Bay Lightning at Amalie Arena. In addition, UMA President Derek Apanovitch, J.D., M.B.A., and UMA Career Services Advisor Nelson Sostre will deliver remarks. Sostre will discuss how he helps UMA students secure jobs to start their healthcare careers. Representing UMA’s three locations (Tampa, Clearwater and online), three student graduates will share their stories of what they have overcome to reach educational and career success: Markeeta Jones from Tampa (originally from Milwaukee, Wisconsin), who is receiving a Patient Care Technician diploma from the Tampa campus; Tamika Holland from Clearwater, who is receiving a Dental Assistant with Expanded Functions diploma from the Clearwater campus; and Shay Blanks from Charlotte, North Carolina, who is receiving an Associate of Science in Healthcare Management degree from UMA’s online program. In addition, the ceremony will include faculty speakers and an inspiring keynote address from scholar, educator, activist and community organizer Dr. Jamila Lyiscott, who presented at the UMA K-20 Summit (#UMAK20) this week in Tampa. Lyiscott is a postdoctoral fellow at the Institute for Urban and Minority Education at Teachers College, Columbia University in New York City. The UMA Spring Commencement is being sponsored by Pearson, Bookmasters, McGraw-Hill, Elsevier and Hobsons. More information can be found at ultimatemedical.edu/students/commencement/ ABOUT ULTIMATE MEDICAL ACADEMY: Ultimate Medical Academy is a nonprofit healthcare educational institution with a national presence. Headquartered in Tampa, Florida and founded in 1994, UMA offers content-rich, interactive online courses as well as hands-on training at our campuses. UMA students have access to academic advising, one-on-one or group tutoring, résumé and interview coaching, job search assistance, technical support and more. The institution is accredited by the Accrediting Bureau of Health Education Schools (ABHES). Learn more by visiting UltimateMedical.edu.


News Article | April 25, 2017
Site: www.fastcompany.com

Right now, someone with depression has only two clinical options: antidepressants (which often don’t work particularly well) and therapy. But there soon may be a third possibility: a vaccine that could prevent depression rather than attempting to treat it after the disease occurs. Neuroscientist Rebecca Brachman is working on the development of a drug that increases resilience to stress, and because exposure to stress can trigger depression, the drug could help prevent the disease. Before someone enters a high-stress situation, they could take a dose of the drug. “Imagine a scenario where we know someone is predictively at high risk for exposure to extreme stress,” Brachman, cofounder of the startup Paravax, said at TED 2017 (Brachman is a TED Fellow). “Say, a Red Cross volunteer going into an earthquake zone. In addition to the typhoid vaccine, we could give her an injection of a resilience enhancer before she leaves, so when she is held at gunpoint by looters or worse, she will be protected against developing depression or PTSD. It won’t prevent her from experiencing the stress, but it allows her to recover from it. That’s what’s revolutionary here. By increasing resiliency, we can dramatically reduce her susceptibility to depression and PTSD.” While in a doctoral program at Columbia University, working with neurobiologist Christine Denny, Brachman studied the effects of giving mice an injection of ketamine, the drug known on the street as Special K. When the mice were later put through a series of stressful situations, they were less depressed, less afraid, and more social than a control group. That effect lasted at least a month, long after the drug had left a mouse’s system. Through her startup, Brachman is working on developing a related drug that could be used as a “resilience enhancer” to protect against depression. “It’s important because we don’t have any cures,” Brachman tells Fast Company. Antidepressants aren’t fully effective (and for some people aren’t effective at all, or can stop working over time), and can cause unpleasant side effects. The preventative drug would also have the advantage of potentially needing only one dose. From testing, the researchers know that the preventative effects of ketamine last at least a month, and may last longer. “Preventative interventions, especially if they give a long lasting protection, have a much higher likelihood of making it to underserved communities,” she says. “That’s why when people go into Africa they bring vaccines. It’s easier to get governments to invest, and it’s easier to administer if it only needs to be done once.” It’s possible that the drug, or a variation of it, could also potentially be used to prevent addiction, OCD, bipolar disorder, or a variety of other mental illnesses. “It’s a whole new field–preventative psychopharmacology,” Brachman says.


News Article | April 26, 2017
Site: www.chromatographytechniques.com

A new study indicates that the number of plant and animal species at risk of extinction may be considerably higher than previously thought. A team of researchers, however, believe they've come up with a formula that will help paint a more accurate picture. The study appears in the journal Biological Conservation. The maps describing species' geographic ranges, which are used by the International Union for Conservation of Nature (IUCN) to determine threat status, appear to systematically overestimate the size of the habitat in which species can thrive, said Don Melnick, senior investigator on the study and the Thomas Hunt Morgan Professor of Conservation Biology in the Department of Ecology, Evolution and Environmental Biology (E3B) at Columbia University. "Concerned about this issue, we aimed to determine how far off those maps were. In doing so, we found there is an enormous amount of freely available data on many species around the world that can be employed to get a better picture of exactly how many species are truly under extreme threat. This picture, grim as it may be, is necessary if we are going to accurately plan the steps needed to stem those threats, locally and globally." Currently, IUCN makes use of species sightings reported by experts to draw boundaries reflecting the geographic range of a given species. From these maps, the IUCN develops its Red List, which assigns a threat status to wild species: Vulnerable, Endangered, or Critically Endangered. Though the accuracy of threat risk assigned to a species relies heavily on these maps, Melnick and his colleagues believe they almost always overestimate the actual distribution of a species by incorporating areas of unsuitable habitat. This overestimation of range size, in turn, leads to a significant overestimation of population size and therefore an underestimation of extinction risk. In an effort to determine how exaggerated the IUCN range maps might be, the team analyzed the maps established for 18 endemic bird species with varying IUCN-assigned extinction threat levels inhabiting the Western Ghats mountain chain of southwest India. Melnick's student, Vijay Ramesh, and two other researchers from India studying in the United States, pored over data from the world's largest citizen science database (eBird), and also gathered freely available and geo-referenced data on the climate, vegetation, ecology, and geo-physical attributes of the Western Ghats. The team then used local experts to sift through those data and verify their accuracy. By bringing together carefully curated citizen science data on the sightings of each species with the other data types, they were able to build a profile of where each species is likely to be found - at what elevation, at what temperature range, in what types of vegetation, etc. This allowed them to estimate new geographic ranges for each species that they believe are much more accurate than the IUCN range maps. The new range estimates from the Columbia study revealed that the IUCN maps for 17 of the 18 bird species contained large areas of unsuitable habitat and vastly overestimated their ranges. By extension, the threat levels which are correlated to species range size are probably underestimated, Melnick said, and the study suggests that IUCN threat status for at least 10 of the 18 species should be elevated. "We were extremely surprised by how much the IUCN ranges overestimated what we deem the true ranges to be," he added. 
"In a number of cases the ranges were overestimated by an order of magnitude. The drastic reduction in range size and the increased habitat fragmentation that our study indicates leads us to infer that there is a much greater threat to these endemic birds than was ever imagined." The study points to a new way of estimating species ranges for conservation purposes, Melnick said, adding that the use of freely available, digitized, and geo-referenced citizen science data, along with biological and geophysical data, and sophisticated statistical modeling can and should be applied to plant and animal species around the globe so that IUCN can more accurately assess the threat to species worldwide. "IUCN's criteria for establishing threat levels for species are excellent; however, the data to which those criteria are being applied need to be updated using an approach like the one we have developed for the Western Ghats," Melnick said. "By using citizen science data in a careful way, we may find there is an urgent need to start protecting species we thought were flourishing but are actually in danger of spiraling toward extinction."


News Article | April 27, 2017
Site: www.theguardian.com

Harvard University is “pausing” investments in some fossil fuel interests following a five-year campaign by some students and environment groups to pressure the university to divest itself from coal, oil and gas. The elite university has come under fire for investing its $36bn endowment in a portfolio that contains fossil fuel companies and has until now resisted a concerted divestment campaign that has also targeted other US universities. However, Colin Butterfield, head of natural resources at the Harvard Management Company, said that climate change is a “huge problem” and that “for now, we are pausing minerals and oil and gas.” Butterfield said that Harvard indirectly invests in fossil fuels through outside funds, although the management company has previously signalled that it is moving away from coal due to a lack of profitability. “What I can tell you is, from my area, I could honestly say that I doubt – I can’t say never, because never say never – but I doubt that we would ever make a direct investment with fossil fuels,” he said. While Harvard hasn’t declared a full moratorium on fossil fuels, campaigners have hailed the pause as a breakthrough moment in the lengthy fight to get the university to divest. A group called Divest Harvard is demanding that the university freeze new investments in fossil fuels, divest from direct holdings in the top 200 publicly listed fossil fuel firms and rid itself of all indirect ties within five years. Protests escalated in March when students blocked the entrances to Harvard’s University Hall. In a letter to Drew Faust, Harvard’s president, Divest Harvard demanded the university act “morally and with a conscience,” but an official response stated that while climate change was “one of the world’s most urgent and serious issues,” it disagreed with the divestment approach. Bill McKibben, co-founder of climate campaign group 350.org, said: “Harvard is divesting through the back door – testimony to the great pressure applied by students, faculty, and alumni, but also to its establishment unwillingness to simply say forthrightly: the fossil fuel age must end. “Still, the significance is enormous: the richest and most famous educational institution on our planet is now siding with the future, not the past.” The protests at Harvard were mirrored by a sit-in at the University of Pennsylvania, where students also demanded the endowment divest from fossil fuels. Columbia University, under similar pressure, recently announced that it would divest from companies getting more than 35% of their income from thermal coal production. Last year, Yale announced it had removed around $10m in fossil fuel investments from its $25bn endowment and Dartmouth announced a review of its holdings. Meanwhile, the board of trustees of Cornell University approved a standard that would trigger divestment from a “morally reprehensible” company, without specifying a particular approach to fossil fuels.


News Article | April 25, 2017
Site: phys.org

The study appears in the journal Biological Conservation. The maps describing species' geographic ranges, which are used by the International Union for Conservation of Nature (IUCN) to determine threat status, appear to systematically overestimate the size of the habitat in which species can thrive, said Don Melnick, senior investigator on the study and the Thomas Hunt Morgan Professor of Conservation Biology in the Department of Ecology, Evolution and Environmental Biology (E3B) at Columbia University. "Concerned about this issue, we aimed to determine how far off those maps were. In doing so, we found there is an enormous amount of freely available data on many species around the world that can be employed to get a better picture of exactly how many species are truly under extreme threat. This picture, grim as it may be, is necessary if we are going to accurately plan the steps needed to stem those threats, locally and globally." Currently, IUCN makes use of species sightings reported by experts to draw boundaries reflecting the geographic range of a given species. From these maps, the IUCN develops its Red List, which assigns a threat status to wild species: Vulnerable, Endangered, or Critically Endangered. Though the accuracy of threat risk assigned to a species relies heavily on these maps, Melnick and his colleagues believe they almost always overestimate the actual distribution of a species by incorporating areas of unsuitable habitat. This overestimation of range size, in turn, leads to a significant overestimation of population size and therefore an underestimation of extinction risk. In an effort to determine how exaggerated the IUCN range maps might be, the team analyzed the maps established for 18 endemic bird species with varying IUCN-assigned extinction threat levels inhabiting the Western Ghats mountain chain of southwest India. Melnick's student, Vijay Ramesh, and two other researchers from India studying in the United States, pored over data from the world's largest citizen science database (eBird), and also gathered freely available and geo-referenced data on the climate, vegetation, ecology, and geo-physical attributes of the Western Ghats. The team then used local experts to sift through those data and verify their accuracy. By bringing together carefully curated citizen science data on the sightings of each species with the other data types, they were able to build a profile of where each species is likely to be found - at what elevation, at what temperature range, in what types of vegetation, etc. This allowed them to estimate new geographic ranges for each species that they believe are much more accurate than the IUCN range maps. The new range estimates from the Columbia study revealed that the IUCN maps for 17 of the 18 bird species contained large areas of unsuitable habitat and vastly overestimated their ranges. By extension, the threat levels which are correlated to species range size are probably underestimated, Melnick said, and the study suggests that IUCN threat status for at least 10 of the 18 species should be elevated. "We were extremely surprised by how much the IUCN ranges overestimated what we deem the true ranges to be," he added. "In a number of cases the ranges were overestimated by an order of magnitude. The drastic reduction in range size and the increased habitat fragmentation that our study indicates leads us to infer that there is a much greater threat to these endemic birds than was ever imagined." 
The study points to a new way of estimating species ranges for conservation purposes, Melnick said, adding that the use of freely available, digitized, and geo-referenced citizen science data, along with biological and geophysical data, and sophisticated statistical modeling can and should be applied to plant and animal species around the globe so that IUCN can more accurately assess the threat to species worldwide. "IUCN's criteria for establishing threat levels for species are excellent; however, the data to which those criteria are being applied need to be updated using an approach like the one we have developed for the Western Ghats," Melnick said. "By using citizen science data in a careful way, we may find there is an urgent need to start protecting species we thought were flourishing but are actually in danger of spiraling toward extinction."


News Article | April 28, 2017
Site: news.yahoo.com

In recent years, illegal marijuana use has risen faster in states that have legalized medical marijuana than in states without such laws, a new study finds. In addition, the percentage of people with "marijuana use disorders" — people who use the drug in unhealthy ways, or abuse it — has also increased at a higher rate in these states, according to the study. Although medical marijuana laws may benefit some people (marijuana may, for example, help cancer patients with pain and nausea), changes to state laws also may have negative consequences for public health, the researchers, led by Deborah Hasin, a professor of epidemiology at Columbia University in New York City, wrote in the study. The researchers looked at data from three time periods: 1991 to 1992, when no states allowed marijuana use for medical reasons; 2001 to 2002, when six states had medical marijuana laws; and 2012 to 2013, when 15 states had medical marijuana laws. As of November 2016, a total of 28 states have passed medical marijuana laws, according to the study, published today (April 26) in the journal JAMA Psychiatry. Data on people's marijuana use and rates of marijuana use disorders came from national surveys from the three time periods included, the study said. Nearly 120,000 people in 39 states were included in these surveys, according to the study. Over the course of the study period, the rates of illegal marijuana use increased in all 39 states included in the study. In the states that never passed medical marijuana laws, the rates of illegal use of the drug rose from 4.5 percent to 6.7 percent — an increase of 2.2 percentage points. In states that did pass medical marijuana laws, the rates of illegal use rose from 5.6 percent to 9.2 percent — an increase of 3.6 percentage points. In other words, the rates of illegal marijuana use increased more quickly in states with medical marijuana laws, the researchers said. Similarly, the rates of marijuana use disorders also increased more quickly over the study period in states that had passed medical marijuana laws than in states that had not. In states without medical marijuana laws, the rates of marijuana use disorders rose from 1.3 percent to 2.3 percent — an increase of 1 percentage point. In comparison, the rates of marijuana use disorders in states with medical marijuana laws rose from 1.5 percent to 3.1 percent — an increase of 1.6 percentage points. The researchers noted that two states, California and Colorado, stood out for their higher rates of illegal marijuana use. For example, between 2001-2002 and 2012-2013, the rates of illegal marijuana use increased by 5.3 percentage points in California and 7 percentage points in Colorado, the researchers found. Marijuana has since been legalized for recreational use in both Colorado and California. They also noted that the rates of illegal marijuana use increased more sharply in states that passed medical marijuana laws after 2002-2003 than in states that passed the laws before those years, with the exception of California. However, the study had several limitations, the researchers noted. For example, the survey participants reported their own marijuana use, which could have led to inaccuracies. In addition, as marijuana use became more common, people might have been more forthcoming about reporting their use of the drug, which, in turn, might have contributed to the sharper increase between 2002-2003 and 2012-2013. 
In other words, in the earlier survey, people might not have admitted to using marijuana. More studies are needed to help explain why medical marijuana laws may increase the illegal use of the drug, the researchers said. Some possible explanations include an increased perception that marijuana is safe when a medical marijuana law is passed, as well as increased availability of the drug, the researchers wrote.
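
As a purely illustrative aside, the comparison the researchers describe can be re-derived from the summary figures quoted above. The short Python sketch below uses only those published rates (not the underlying survey data) and simply recomputes the percentage-point changes for each group of states; the grouping labels are ours, not the study's variable names.

```python
# Recompute the percentage-point changes quoted in the article (illustration only;
# these are the article's summary figures, not the underlying survey data).
rates = {
    # group: (rate at first survey wave, rate at 2012-2013 wave), in percent
    "states without medical marijuana laws": (4.5, 6.7),
    "states with medical marijuana laws": (5.6, 9.2),
}

changes = {}
for group, (before, after) in rates.items():
    changes[group] = after - before
    print(f"{group}: {changes[group]:.1f} percentage-point rise in illegal use")

# Gap between the two groups' increases (3.6 - 2.2 = 1.4 percentage points).
gap = (changes["states with medical marijuana laws"]
       - changes["states without medical marijuana laws"])
print(f"difference between the two groups' increases: {gap:.1f} percentage points")
```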


News Article | April 20, 2017
Site: www.eurekalert.org

Scientists at the Center for Infection and Immunity (CII) at Columbia University's Mailman School of Public Health report elevated levels of a pathogen responsible for the tick-borne disease babesiosis in Suffolk County, New York, where rates are the highest in the state. Results are published in the journal mSphere. Researchers developed and employed a method to simultaneously test for five common pathogens carried by deer ticks: Babesia microti, the pathogen behind babesiosis; Borrelia burgdorferi, the cause of Lyme disease; as well as Anaplasma phagocytophilum, Borrelia miyamotoi, and Powassan virus -- pathogens responsible for other tick-borne infections. The team collected and tested 318 adult and nymph ticks at five sites in Suffolk County (Southampton, Manorville, Southold, Islip, Huntington) and three sites in Connecticut (Mansfield, Stamford, Greenwich). Nymphal ticks are about the size of a poppy seed, emerge in warmer months, and are responsible for the majority of tick-borne disease. The new test uses a DNA amplification technique called polymerase chain reaction, or PCR, to test for tick-borne pathogens. Most existing tests use this method to test ticks for each agent individually. Even the tests that have the ability to test for more than one agent typically only test for up to three, not five, agents, and never for Powassan virus, the rarest but most pathogenic of the five. The scientists say the technique has several advantages: it lowers costs, facilitates testing for agents (B. miyamotoi, and especially Powassan virus) that are rarely tested for, and provides risk assessments for co-infections, which may adversely affect the course of disease. Tests found B. microti present in a higher proportion of ticks in Suffolk County than Connecticut, including 17 vs. 7 percent of nymphal ticks. In both locations, B. burgdorferi, the causative agent of Lyme disease, was the most frequently detected agent in the ticks tested, while A. phagocytophilum, B. miyamotoi and Powassan virus were more rare. One-quarter of B. burgdorferi-positive nymphs were also positive for B. microti, suggesting a risk of co-infection with both agents from a single tick bite. "Gathering data on co-infections is particularly important in light of the fact that antibiotics used for Lyme disease may be ineffective for babesiosis," says first author Rafal Tokarz, a research scientist at CII. The number of counties in the Northeast with high rates of Lyme disease has more than tripled since the 1990s -- a sign that ticks that spread disease have expanded their range. Rates of tick-borne illness may be much higher than reported: one study in Minnesota found 79 percent of cases were not reported to health authorities. Symptoms include fever and headaches, and, more rarely, neurological complications like encephalitis. "This new test can strengthen surveillance for tick-borne illnesses, which are underreported and growing rapidly," says W. Ian Lipkin, director of CII and John Snow Professor of Epidemiology at the Mailman School. Co-authors include Teresa Tagliafierro and Stephen Sameroff at CII and D. Moses Cucura and Ilia Rochlin at the Suffolk County Department of Public Works. This study was funded through grants from the Steven and Alexandra Cohen Foundation and the National Institute of Allergy and Infectious Diseases (U19 AI109761).
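
To illustrate the kind of co-infection risk assessment a multiplex result set enables, the sketch below tallies per-tick results the way the quoted 17-percent and one-quarter figures would be computed. The per-tick records here are invented stand-ins, not the CII counts, and the code is not the team's analysis pipeline.

```python
# Hypothetical per-tick multiplex results (invented data, not the CII counts):
# each record lists the agents a single tick tested positive for.
ticks = [
    {"B. burgdorferi", "B. microti"},
    {"B. burgdorferi"},
    set(),
    {"B. burgdorferi"},
    {"B. microti"},
    {"B. burgdorferi", "A. phagocytophilum"},
]

agents = ["B. burgdorferi", "B. microti", "A. phagocytophilum",
          "B. miyamotoi", "Powassan virus"]

# Single-agent prevalence across all ticks tested.
for agent in agents:
    positives = sum(agent in t for t in ticks)
    print(f"{agent}: {positives}/{len(ticks)} ticks positive")

# Share of Lyme-positive ticks that also carry the babesiosis agent,
# i.e. the co-infection risk from a single bite by an infected tick.
lyme_positive = [t for t in ticks if "B. burgdorferi" in t]
coinfected = sum("B. microti" in t for t in lyme_positive)
print(f"co-infection: {coinfected}/{len(lyme_positive)} "
      f"B. burgdorferi-positive ticks also carry B. microti")
```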


President Trump said recently that the tradition of rating a new president’s first 100 days is “ridiculous.” The White House then created a web page devoted to rating his first 100 days. It’s further proof, if anyone needed it, that the defining feature of this president’s first 100 days is noise. Every day brings some piercing new alarm, making it hard to separate the momentarily disturbing from the truly damaging. But separating them is essential – especially for the environment. While the president has flip-flopped on some signature issues, he’s been totally consistent about dismantling protections for public health, clean air and clean water. So let’s take a closer look at what he’s done so far, and what it will mean for our health and our world. Here are the four worst actions Trump took in his first 100 days – and one that’s very good. Trump’s choice to lead the U.S. Environmental Protection Agency, Scott Pruitt, built his career by attacking the agency and its clean air and water rules. Pruitt is beginning to staff the EPA with Beltway insiders who have made their living lobbying for weaker pollution rules on behalf of industry. For example, it has been widely reported that Andrew Wheeler may be named as Pruitt’s top deputy. Wheeler is now a lobbyist for Murray Energy, a coal mining conglomerate that is demanding an end to the rule that limits mercury pollution. In fact, a recent analysis by Columbia University Law School showed that more than one quarter of the administration’s appointees so far to environmental, energy and natural resource agencies have close ties to the fossil fuel industry. The likely result: Thousands of decisions over the next four years made by those more interested in protecting polluters than public health. That will leave a toxic legacy of more disease and premature death. Last year, a bipartisan Congress overwhelmingly passed the Lautenberg Act, a new chemical safety law that, after four decades of a broken system that flooded our stores and homes with dangerous or untested chemicals, finally constructed a strong chemical safety net. But now the EPA has to finish writing the rules to implement it. For that, Pruitt has chosen Nancy Beck, an insider straight from the main chemical industry trade association who even within the last few weeks lobbied the agency on these very rules. If those new rules give industry everything it wants, we’ll have blown a historic chance to restore public trust and market confidence in the products consumers buy for household use. Our health would continue to be at risk – and undoing the damage would take years. The administration’s budget proposal would cut the EPA by almost a third – more than any other agency, even though its budget is tiny to begin with. Out of every $10 the federal government spends, only 2 cents go to the EPA. These cuts aren’t being done to save money. They’re part of an ideological crusade the public doesn’t support. If the EPA budget is cut this way, the loss of experts and institutional knowledge will reverberate for years. Detailed plans obtained by the Washington Post show that Trump and Pruitt want to cut a quarter of the workforce and abolish 56 programs with impacts from the Chesapeake Bay to Puget Sound. Together, these cuts will lead to more asthma attacks, more health problems for the elderly and a more dangerous future. Pruitt is now trying to gut many of the same rules and safeguards he sued to stop as Oklahoma’s attorney general. 
They limit the amount of arsenic and acid gases power plants can emit, reduce smog that causes respiratory problems and cut carbon pollution that causes climate change. He has signaled hostility to the Mercury and Air Toxics Standards, despite the fact that virtually all power plants are already in compliance. The EPA chief and Trump have also taken aim at the Clean Power Plan, America’s first limits on carbon pollution from power plants, without any strategy to replace it. The energy market is moving toward cleaner energy, but slowing that process means losing clean tech jobs to other countries and a bigger cleanup for our children’s generation. This is the positive legacy of the Trump administration: Americans who used to take clean air and water for granted are waking up to the danger. Membership in environmental groups is skyrocketing – the biggest question we get these days is, “What can I do?” – as women and men from all walks of life are reclaiming environmentalism as a mainstream American value. On Saturday, thousands will take to the streets in Washington and other cities for the People’s Climate March. Just as a blossoming environmental awareness in the early 1970s led to some of the bedrock laws we rely on today, I believe the great awakening of 2017 will echo for years to come. If we work together and make our voices heard, we can limit the worst of the damage Trump intends to inflict.


News Article | May 3, 2017
Site: www.scientificamerican.com

If two black holes merge and no one is around to hear them, do they still make a sound? Careful—this is a trick question. Despite their reputation as the most fearsome objects in the universe, black holes by themselves aren’t actually very noisy. Any sounds emitted inside the event horizon, the boundary beyond which light itself cannot escape a black hole’s gravitational pull, would never reach the outside universe. So two bare black holes meeting and merging in the cosmic dark would be expected to make essentially no sound at all. What little noise black holes might emit would come from their more mundane meals—the sonic protests of materials being ripped apart and frictionally heated as they funnel into a black hole’s insatiable maw. But these death rattles would be stifled by the near vacuum of space—a place, it turns out, where indeed no one can hear you scream, even if you are being devoured by a black hole. There are, however, soundless ways to listen to merging black holes, as Janna Levin will explain in a special presentation broadcast live on this page at 7 P.M. Eastern time from the Perimeter Institute for Theoretical Physics in Ontario. Levin is the Tow Professor of physics and astronomy at Barnard College of Columbia University, and an award-winning author of, most recently, Black Hole Blues and Other Songs from Outer Space. Two colliding black holes come together with such violence that they distort the fabric of reality itself, creating ripples in spacetime—gravitational waves—that emanate outward from the merger at the speed of light. Although predicted more than a century ago by Albert Einstein, gravitational waves were not directly observed until 2015, when the Advanced Laser Interferometer Gravitational-Wave Observatory (LIGO) detected some originating from a black hole merger more than a billion light-years away. Converted into audible sounds, the recorded waveforms of the signal register as a distinct chirp—the “sound” of two black holes colliding, heard for the very first time. Since then LIGO has tuned in to several more chirps from other merging black holes, providing a soundtrack to what has previously been a silent-movie view of the universe. Join Levin this evening as she presents the astounding science of gravitational waves and the remarkable century-spanning story of their discovery. Her talk is part of Perimeter’s public lecture series presented by BMO Financial. Online viewers can join the conversation by tweeting to @Perimeter using the #piLIVE hashtag.


News Article | April 25, 2017
Site: www.futurity.org

Low levels of a brain protein may combine with another long-suspected culprit to trigger the learning and memory losses in Alzheimer’s disease, a study shows. The discovery should open up important new research areas, scientists say—and may one day lead to better therapies for the disease and other forms of cognitive decline. “These findings represent something extraordinarily interesting about how cognition fails in human Alzheimer’s disease,” says Paul Worley, a neuroscientist at Johns Hopkins University School of Medicine and the senior scientist in the study. Alzheimer’s is estimated to affect more than 5 million Americans. Clumps of proteins called amyloid plaques, long seen in the brains of Alzheimer’s patients, have often taken the blame for the mental decline associated with the disease. But autopsies and imaging studies reveal that people can have high levels of amyloid in the brain without displaying Alzheimer’s symptoms, which calls into question a direct link between amyloid and dementia. The new study, published in eLife, shows that when the NPTX2 gene produces less of its protein at the same time that amyloid is accumulating in the brain, circuit adaptations that are essential for neurons to work together are disrupted. That results in a failure of memory. “The key point here is that it’s the combination of amyloid and low NPTX2 that leads to cognitive failure,” Worley says. Worley’s lab group studies “immediate early genes,” so called because they’re activated almost instantly in brain cells when people and other animals have an experience that results in a new memory. NPTX2 is one of these immediate early genes; it makes a protein that neurons use to strengthen “circuits” in the brain. “Those connections are essential for the brain to establish synchronized groups of ‘circuits’ in response to experiences,” Worley says. “Without them, neuronal activation cannot be effectively synchronized and the brain cannot process information.” Worley says he was intrigued by studies indicating altered patterns of activity in brains of people with Alzheimer’s and wondered whether altered activity was linked to changes in immediate early gene function. To get answers, researchers first turned to archived human brain tissue samples. They discovered that NPTX2 protein levels were reduced by as much as 90 percent in brain samples from Alzheimer’s patients. Samples with amyloid plaques from people who had never shown signs of AD had normal levels of NPTX2. This was an initial suggestion of a link between NPTX2 and cognition. The scientists then examined mice bred without the rodent equivalent of the NPTX2 gene and discovered that a lack of NPTX2 alone wasn’t enough to affect cell function. But then they added a gene that increases amyloid generation to the mouse brains. In brain slices from mice with both amyloid and no NPTX2, fast-spiking interneurons could not control brain “rhythms” important for making new memories. Examination of cerebrospinal fluid from 60 living Alzheimer’s patients and 72 people without the disease provided further evidence. “Perhaps the most important aspect of the discovery is that NPTX2 reduction appears to be independent of the mechanism that generates amyloid plaques,” Worley says. “This means that NPTX2 represents a new mechanism.” One immediate application, he says, may be figuring out whether NPTX2 levels can help identify patients who can best be helped by new drugs. 
For instance, drugs that disrupt amyloid may be more effective in patients with relatively high NPTX2. Worley’s group is helping companies try to develop a commercial test that measures NPTX2 levels. More work is needed, Worley adds, to understand why NPTX2 levels become low in Alzheimer’s and how to prevent or slow that process. Others involved in the study work at Johns Hopkins; the National Institutes of Health; the University of California, San Diego; Shiley-Marcos Alzheimer’s Disease Research Center; Columbia University; the Institute for Basic Research; and the University of Exeter. The National Institutes of Health, the Alzheimer’s Disease Discovery Foundation, and LuMind Research Down Syndrome Foundation funded the work.


News Article | April 24, 2017
Site: news.yahoo.com

Huge swaths of Antarctica are awash in draining meltwater during the summer months, the first-ever continent-wide survey of meltwater shows. Although past studies revealed that portions of Antarctica's Western Peninsula were melting at an alarming rate, most scientists believed the rest of the continent did not face extensive melting during Antarctica's ephemeral summer months. "This is not in the future — this is widespread now, and has been for decades," lead author Jonathan Kingslake, a glaciologist at Columbia University's Lamont-Doherty Earth Observatory, said in a statement.


News Article | April 25, 2017
Site: www.sciencedaily.com

Working with human brain tissue samples and genetically engineered mice, Johns Hopkins Medicine researchers together with colleagues at the National Institutes of Health, the University of California San Diego Shiley-Marcos Alzheimer's Disease Research Center, Columbia University, and the Institute for Basic Research in Staten Island say that consequences of low levels of the protein NPTX2 in the brains of people with Alzheimer's disease (AD) may change the pattern of neural activity in ways that lead to the learning and memory loss that are hallmarks of the disease. This discovery, described online in the April 25 edition of eLife, will lead to important research and may one day help experts develop new and better therapies for Alzheimer's and other forms of cognitive decline. AD currently affects more than five million Americans. Clumps of proteins called amyloid plaques, long seen in the brains of people with AD, are often blamed for the mental decline associated with the disease. But autopsies and brain imaging studies reveal that people can have high levels of amyloid without displaying symptoms of AD, calling into question a direct link between amyloid and dementia. This new study shows that when the protein NPTX2 is "turned down" at the same time that amyloid is accumulating in the brain, circuit adaptations that are essential for neurons to "speak in unison" are disrupted, resulting in a failure of memory. "These findings represent something extraordinarily interesting about how cognition fails in human Alzheimer's disease," says Paul Worley, M.D., a neuroscientist at the Johns Hopkins University School of Medicine and the paper's senior author. "The key point here is that it's the combination of amyloid and low NPTX2 that leads to cognitive failure." Since the 1990s, Worley's group has been studying a set of genes known as "immediate early genes," so called because they're activated almost instantly in brain cells when people and other animals have an experience that results in a new memory. The gene NPTX2 is one of these immediate early genes that gets activated and makes a protein that neurons use to strengthen "circuits" in the brain. "Those connections are essential for the brain to establish synchronized groups of 'circuits' in response to experiences," says Worley. "Without them, neuronal activation cannot be effectively synchronized and the brain cannot process information." Worley says he was intrigued by previous studies indicating altered patterns of activity in brains of individuals with Alzheimer's. Worley's group wondered whether altered activity was linked to changes in immediate early gene function. To get answers, the researchers first turned to a library of 144 archived human brain tissue samples to measure levels of the protein encoded by the NPTX2 gene. NPTX2 protein levels, they discovered, were reduced by as much as 90 percent in brain samples from people with AD compared with age-matched brain samples without AD. By contrast, people with amyloid plaques who had never shown signs of AD had normal levels of NPTX2. This was an initial suggestion of a link between NPTX2 and cognition. Prior studies had shown NPTX2 to play an essential role for developmental brain wiring and for resistance to experimental epilepsy. To study how lower-than-normal levels of NPTX2 might be related to the cognitive dysfunction of AD, Worley and his collaborators examined mice bred without the rodent equivalent of the NPTX2 gene. 
Tests showed that a lack of NPTX2 alone wasn't enough to affect cell function as tested in brain slices. But then the researchers introduced a gene that increases amyloid generation into the mice's brains. In brain slices from mice with both amyloid and no NPTX2, fast-spiking interneurons could not control brain "rhythms" important for making new memories. Moreover, a glutamate receptor that is normally expressed in interneurons and essential for interneuron function was down-regulated as a consequence of amyloid and NPTX2 deletion in mice, and was similarly reduced in human AD brains. Worley says the results suggest that the increased activity seen in the brains of AD patients is due to low NPTX2, combined with amyloid plaques, with consequent disruption of interneuron function. And if the effect of NPTX2 and amyloid is synergistic -- one depending on the other for the effect -- it would explain why not all people with high levels of brain amyloid show signs of AD. The team then examined NPTX2 protein in the cerebrospinal fluid (CSF) of 60 living AD patients and 72 people without AD. Lower scores of memory and cognition on standard AD tests, they found, were associated with lower levels of NPTX2 in the CSF. Moreover, NPTX2 correlated with measures of the size of the hippocampus, a brain region essential for memory that shrinks in AD. In this patient population, NPTX2 levels were more closely correlated with cognitive performance than current best biomarkers -- including tau, a biomarker of neurodegenerative diseases, and a biomarker known as A-beta-42, which has long been associated with AD. Overall, NPTX2 levels in the CSF of AD patients were 36 to 70 percent lower than in people without AD. "Perhaps the most important aspect of the discovery is that NPTX2 reduction appears to be independent of the mechanism that generates amyloid plaques. This means that NPTX2 represents a new mechanism, which is strongly founded in basic science research, and that has not previously been studied in animal models or in the context of human disease. This creates many new opportunities," says Worley. "One immediate application may be to determine whether measures of NPTX2 can be helpful as a way of sorting patients and identifying a subset that is most responsive to emerging therapies." For instance, drugs that disrupt amyloid may be more effective in patients with relatively high NPTX2. His group is now providing reagents to companies to assess development of a commercial test that measures NPTX2 levels. More work is needed, Worley adds, to understand why NPTX2 levels become low in AD and how that process could be prevented or slowed.
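The biomarker comparison described above amounts to asking which CSF measure tracks cognitive scores most closely. The snippet below is a minimal, hypothetical illustration of that kind of check using simple correlations; the file name and column names are assumptions for illustration, not the study's actual analysis code.

```python
# Hypothetical sketch: compare how strongly each CSF biomarker tracks cognition.
# The CSV file and column names are assumptions for illustration only.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("csf_biomarkers.csv")  # assumed columns: cognitive_score, nptx2, tau, abeta42

for marker in ["nptx2", "tau", "abeta42"]:
    r, p = pearsonr(df[marker], df["cognitive_score"])
    print(f"{marker:8s} vs. cognition: r = {r:+.2f} (p = {p:.3g})")
```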


News Article | April 26, 2017
Site: motherboard.vice.com

As states slowly start to legalize medical marijuana, a question that has remained unanswered is whether more access means more recreational use, too. A new study out today in JAMA Psychiatry says it does. There are 29 states along with Washington, DC that now have medical marijuana laws on the books. And while a few studies have tried to determine how these laws affect illegal marijuana use, this new study is the first to use data collected prior to any medical marijuana laws to track changes in illegal use rates. The researchers, led by Deborah Hasin, a professor of epidemiology at Columbia University, looked at data collected in three national surveys on alcohol and drug use from 1991 to 2013. Between the first and third surveys, the researchers found that illegal marijuana use increased more in the states that passed the laws compared to those that didn't. In states with medical marijuana laws, illegal marijuana use increased by 3.6 percent compared to a 2.2 percent increase in states without medical marijuana. Cannabis use disorders, as defined by the DSM IV, the American Psychiatric Association's mental disorder manual, increased by 1.6 percent and 1.0 percent in states with and without medical marijuana laws, respectively. The authors believe these findings point to a potential health hazard. "Medical marijuana laws may benefit some with medical problems. However, changing state laws—medical or recreational—may also have adverse public health consequences, including cannabis use disorders," Hasin said in a statement. But the authors of the study also note several limitations. The surveys were self-reported, and they admit that more people might have been willing to report their own drug use as it became more socially acceptable. It should also be noted that requirements for a cannabis use disorder—a condition far less dangerous and prevalent than other kinds of drug and alcohol abuse—have changed since the three surveys were conducted. In the DSM IV, cannabis abuse and dependence were separate disorders and neither withdrawal nor craving was a criterion for a dependence diagnosis. In the latest version, abuse and dependence have been merged and withdrawal and craving added to the criteria list, changing who might fit the bill for having a disorder. The DSM IV also required one of four criteria to be met for a cannabis abuse disorder diagnosis, one of which was substance-related legal problems, a criterion that has been removed in the latest version. Those surveyed who were diagnosed with a cannabis use disorder may have received a different diagnosis under the new standards. "Future studies are needed to investigate mechanisms by which increased cannabis use is associated with medical marijuana laws, including increased perceived safety, availability, and generally permissive attitudes," said Hasin.
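The headline comparison above is essentially a before-and-after difference between the two groups of states. A minimal sketch of that arithmetic, using only the figures reported in this article, follows; it is illustrative and is not the study's statistical model, which works from individual-level survey responses and adjusts for covariates.

```python
# Back-of-the-envelope comparison using the increases reported above.
# Illustrative only; the published analysis models individual survey responses.
use_increase_mml = 3.6     # reported increase in illegal use, states with medical marijuana laws
use_increase_none = 2.2    # reported increase in illegal use, states without such laws
cud_increase_mml = 1.6     # reported increase in cannabis use disorders, law states
cud_increase_none = 1.0    # reported increase in cannabis use disorders, non-law states

print(f"Excess increase in illegal use:  {use_increase_mml - use_increase_none:.1f} points")
print(f"Excess increase in use disorders: {cud_increase_mml - cud_increase_none:.1f} points")
```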


News Article | April 20, 2017
Site: www.eurekalert.org

A team of neuroscientists at the University of Pennsylvania has shown for the first time that electrical stimulation delivered when memory is predicted to fail can improve memory function in the human brain. That same stimulation generally becomes disruptive when electrical pulses arrive during periods of effective memory function. The research team included Michael Kahana, professor of psychology and principal investigator of the Defense Advanced Research Projects Agency's Restoring Active Memory program; Youssef Ezzyat, a senior data scientist in Kahana's lab; and Daniel Rizzuto, director of cognitive neuromodulation at Penn. They published their findings in the journal Current Biology. This work is an important step toward the long-term goal of Restoring Active Memory, a four-year Department of Defense project aimed at developing next-generation technologies that improve memory function in people who suffer from memory loss. It illustrates an important link between appropriately timed deep-brain stimulation and its potential therapeutic benefits. To get to this point, the Penn team first had to understand and decode signaling patterns that correspond to highs and lows of memory function. "By applying machine-learning methods to electrical signals measured at widespread locations throughout the human brain," said Ezzyat, lead paper author, "we are able to identify neural activity that indicates when a given patient will have lapses of memory encoding." Using this model, Kahana's team examined how the effects of stimulation differ during poor versus effective memory function. The study involved neurosurgical patients receiving treatment for epilepsy at the Hospital of the University of Pennsylvania, the Thomas Jefferson University Hospital, the Dartmouth-Hitchcock Medical Center, the Emory University Hospital, the University of Texas Southwestern, the Mayo Clinic, Columbia University, the National Institutes of Health Clinical Center and the University of Washington. Participants were asked to study and recall lists of common words while receiving safe levels of brain stimulation. During this process, the Penn team recorded electrical activity from electrodes implanted in the patients' brains as part of routine clinical care. These recordings identified the biomarkers of successful memory function, activity patterns that occur when the brain effectively creates new memories. "We found that, when electrical stimulation arrives during periods of effective memory, memory worsens," Kahana said. "But when the electrical stimulation arrives at times of poor function, memory is significantly improved." Kahana likens it to traffic patterns in the brain: stimulating the brain during a backup restores the normal flow of traffic. Gaining insight into this process could improve the lives of many types of patients, particularly those with traumatic brain injury or neurological diseases, such as Alzheimer's. "Technology based on this type of stimulation," Rizzuto said, "could produce meaningful gains in memory performance, but more work is needed to move from proof-of-concept to an actual therapeutic platform." This past November, the RAM team publicly released an extensive intracranial brain recording and stimulation dataset that included more than 1,000 hours of data from 150 patients performing memory tasks.
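The closed-loop idea described here has two pieces: a classifier trained on brain recordings to flag moments when memory encoding is likely to fail, and a rule that delivers stimulation only at those moments. The sketch below is a simplified, hypothetical version of such a pipeline, not the RAM project's actual code; the sampling rate, frequency bands, threshold, and function names are assumptions.

```python
# Hypothetical closed-loop sketch: predict poor memory encoding from intracranial
# recordings and deliver stimulation only when encoding is predicted to fail.
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

FS = 1000  # sampling rate in Hz (assumed)

def spectral_features(epoch):
    """Log band power per electrode for a (channels x samples) epoch."""
    freqs, psd = welch(epoch, fs=FS, nperseg=256, axis=-1)
    bands = [(3, 8), (8, 12), (12, 30), (30, 100)]  # theta, alpha, beta, gamma
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1) for lo, hi in bands]
    return np.log(np.concatenate(feats))

def train_encoding_classifier(epochs, recalled):
    """Fit a classifier on study-period epochs labeled by whether the word was later recalled."""
    X = np.array([spectral_features(e) for e in epochs])
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, recalled)          # 1 = word later recalled, 0 = forgotten
    return clf

def should_stimulate(clf, epoch, threshold=0.3):
    """Stimulate only during predicted lapses of encoding (low recall probability)."""
    p_recall = clf.predict_proba(spectral_features(epoch)[None, :])[0, 1]
    return p_recall < threshold
```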


News Article | April 17, 2017
Site: www.prweb.com

The Community for Accredited Online Schools, a leading resource provider for higher education information, has compiled its list of Missouri’s best online colleges and universities for 2017. Of the 31 four-year schools that made the list, Webster University, University of Missouri Columbia, University of Central Missouri, Missouri State University Springfield and Lindenwood University scored the highest. Of the 8 two-year colleges that also made the list, Crowder College, Jefferson College, Mineral Area College and State Fair Community College were top-scoring schools. “Online certificates and degrees are an increasingly popular option, especially for students who are unable to complete their education in a traditional classroom environment because of scheduling or location,” said Doug Jones, CEO and founder of AccreditedSchoolsOnline.org. “These Missouri schools have distinguished themselves by offering the best online college programs in the state, maintaining high quality and accreditation standards and providing flexibility with a variety of degrees online.” To earn a spot on the “Best Online Schools in Missouri” list, colleges and universities must be accredited, public or private not-for-profit institutions. Each college is also judged based on additional data points such as the availability of financial aid opportunities, school counseling services, student/teacher ratios and graduation rates. For more details on where each school falls in the rankings and the data and methodology used to determine the lists, visit: Missouri’s Best Online Four-Year Schools for 2017 include the following: Avila University, Baptist Bible College, Calvary Bible College and Theological Seminary, Columbia College, Cox College, Culver-Stockton College, Drury University, Evangel University, Fontbonne University, Hannibal-LaGrange University, Lincoln University, Lindenwood University, Maryville University of Saint Louis, Midwestern Baptist Theological Seminary, Missouri Baptist University, Missouri Southern State University, Missouri State University-Springfield, Missouri University of Science and Technology, Missouri Valley College, Missouri Western State University, Northwest Missouri State University, Park University, Saint Louis University, Southeast Missouri State University, Southwest Baptist University, Stephens College, University of Central Missouri, University of Missouri-Columbia, University of Missouri-Kansas City, Webster University and William Woods University. Missouri’s Best Online Two-Year Schools for 2017 include the following: Crowder College, Jefferson College, Mineral Area College, Moberly Area Community College, North Central Missouri College, Ozarks Technical Community College, State Fair Community College and Three Rivers Community College. ### About Us: AccreditedSchoolsOnline.org was founded in 2011 to provide students and parents with quality data and information about pursuing an affordable, quality education that has been certified by an accrediting agency. Our community resource materials and tools span topics such as college accreditation, financial aid, opportunities available to veterans and people with disabilities, as well as online learning resources. We feature higher education institutions that have developed online learning programs that include highly trained faculty, new technology and resources, and online support services to help students achieve educational success.


News Article | May 2, 2017
Site: news.yahoo.com

FILE - In this March 31, 2017 file photo, a portrait of former President Andrew Jackson hangs on the wall behind President Donald Trump, accompanied by Vice President Mike Pence, in the Oval Office at the White House in Washington. President Donald Trump made puzzling claims about Andrew Jackson and the Civil War in an interview, suggesting that he was uncertain about the origin of the conflict while claiming that Jackson was upset about the war that started more than a decade after his death. (AP Photo/Andrew Harnik, File) ALBANY, N.Y. (AP) — President Donald Trump suggested in an interview that he is unclear about the origins of the Civil War, that President Andrew Jackson (who died 16 years before the war) could have prevented the conflict and that it was possible to have settled it without bloodshed. "Could that one not have been worked out?" Trump asked in the interview with The Washington Examiner. AP talked to some of the most distinguished experts on what was really behind the war that tore the nation asunder. WHY DID THE CIVIL WAR START? The issues leading up to the Civil War were complex, and many people in the North and South in 1861 viewed the conflict as inevitable. In the South, slave labor was the foundation of an economy based on the cotton produced by plantations and farms. The free labor also was key to profiting from the production of such cash crops as tobacco, corn and other staples of the South. In the North, farms were generally smaller because of the soil and climate. With their more industrialized economy, the Northern states didn't require large numbers of slaves. By the 1850s, the North vs. South divide was widening as free states and slave states debated over allowing slavery in new territories as the nation expanded westward. Southerners viewed the North's opposition to slavery's expansion as a threat to the economies — and thus the political power and rights — of slave-holding states. Abraham Lincoln, opposed to slavery's expansion, was elected president in 1860 and the path to the South's seceding from the Union was set. "Slavery was the root cause of the Civil War," said Eric Foner, professor of history at Columbia University. "It was not the only cause, but it was the underlying cause. There was a fundamental difference between the North and the South as the South feared for the future of slavery." COULD IT HAVE BEEN AVOIDED? Probably not, according to James Roark, an author and retired history professor at Emory University in Atlanta. "As it got tangled with American politics and regional interests, nobody could figure out a way to save both the Union and preserve slavery in the South," he said. "It wasn't for a lack of talking. There was plenty of talking." WHAT WOULD ANDREW JACKSON DO? (OR HAVE DONE, IF HE LIVED THAT LONG?) Probably not much. "Even Andrew Jackson, were he alive, could not have threatened the use of force that perhaps Trump thinks would have solved the problem," Foner said. Jackson, who died in 1845, was a slave-holding plantation owner. "The Civil War was caused by slavery; it wasn't caused by the absence of Andrew Jackson to help the American government," said Harold Holzer, a New York-based scholar who is an expert on the Civil War and Abraham Lincoln. HOW WAS THE WAR RESOLVED? After four years and more than 600,000 soldiers dead, Confederate Gen. Robert E. Lee surrendered on April 9, 1865, at the village of Appomattox Court House in Virginia. 
Associated Press reporters Russ Bynum in Savannah, Georgia, and Jonathan Lemire in New York contributed to this report. Historians of the American Civil War point to complex issues when reflecting on President Donald Trump's remark that the conflict might have been settled without bloodshed. Trump asked in an interview with The Washington Examiner: "Could that one not have been worked out?" A professor of history at Columbia University, Eric Foner, notes that slavery was a root cause of the war and that the South feared for the future of slavery. A retired history professor at Emory University in Atlanta, James Roark, says war probably couldn't have been avoided in 1861. Roark says, "Nobody could figure out a way to save both the Union and preserve slavery in the South." More than 600,000 soldiers had died by the time the war ended in 1865.


News Article | April 22, 2017
Site: news.yahoo.com

Whether it's clean water gushing from a faucet, a weather forecast or a new smartphone game, kids see the accomplishments of science all around them, and tomorrow's "March for Science" provides an opportunity for parents and other adults to talk to kids about the importance of science, experts say. Parents can tell their kids that the march is being held to show that "science is for everyone – it's really that simple," said David Evans, executive director of the National Science Teachers Association, which is one of the more than 300 organizations working in partnership to organize the event. The march will start from the National Mall in Washington, D.C., on Saturday (April 22), and satellite marches will be held in more than 500 other cities worldwide. If a child asks why people are marching for science, an adult can explain that organizations that support science are concerned about the public's understanding of science, Evans told Live Science. "There are many important issues facing society right now where science comes to bear on people's decisions," he said. A key idea to explain to kids is that science unfolds in many steps, he said. Adults can explain to kids that this means starting with an observation about the world, then asking questions about that observation and testing one's understanding of it to see if it holds up, he said. "Everyone needs to understand that science is a process so they can participate" in society's decisions, he said. Emily Graslie, the "chief curiosity correspondent" for The Field Museum in Chicago, said even young children can learn that science is a way of learning about the world. "The march is an opportunity for scientists and science enthusiasts to show up and make our presence known as citizens of our communities," said Graslie, who will be giving the keynote address at the march in Chicago on Saturday. Scientists have not always done a good job of communicating about their work, and the public may mistakenly think that scientists work in isolation, in labs, she said. The march will change that. The march will also show kids that scientists are a diverse group, Graslie said. "Kids might think of a scientist as a kooky guy with crazy hair, but science is a diverse field, a collaborative field, and there are people from all walks of life who contribute." Some in the public might feel that they are distant from the world of science, but science just means expressing curiosity or an interest in a question, she said. For anyone who's just looking to develop a better understanding of science, participating in a march is a good first step, she said. Mike Carapezza, a biomedical engineer who works as a research associate at Columbia University in New York City and as a partner with a children's science education program called Hypothekids, said adults can explain to kids that the march is happening because "people who understand the value of science are trying to make the statement that we can't ignore scientific facts." Taking kids to the march is a good idea because it would let kids see how many people think science is important, Carapezza said. Kids can benefit from "seeing how many people really trust and value science, and trust in the 'good faith' of science – that scientists are trying to find answers, not push an ideology," he said. Adults can explain to kids that science is "a systematic way of understanding the world," he said. 
"It's a method of asking questions and answering them as well as you can," but still acknowledging that those answers may not be completely correct, he told Live Science. "The uncertainty is built in," he said. And anyone who feels that they don't know a lot about science can rest assured that they will feel at home at the march. "Admitting that you don't know something -- that's actually exactly what scientists do," he said. Scientists look for questions that they don't have answers to, and try to learn. With kids, "it's never too early to foster an interest in science," Graslie said. "In times like these, it's important to keep dialogues open and encourage that curiosity."


News Article | April 19, 2017
Site: www.eurekalert.org

Researchers at Columbia University have made a significant step toward breaking the so-called "color barrier" of light microscopy for biological systems, allowing for much more comprehensive, system-wide labeling and imaging of a greater number of biomolecules in living cells and tissues than is currently attainable. The advancement has the potential for many future applications, including helping to guide the development of therapies to treat and cure disease. In a study published online April 19 in Nature, the team, led by Associate Professor of Chemistry Wei Min, reports the development of a new optical microscopy platform with drastically enhanced detection sensitivity. Additionally, the study details the creation of new molecules that, when paired with the new instrumentation, allow for the simultaneous labeling and imaging of up to 24 specific biomolecules, nearly five times the number of biomolecules that can be imaged at the same time with existing technologies. "In the era of systems biology, how to simultaneously image a large number of molecular species inside cells with high sensitivity and specificity remains a grand challenge of optical microscopy," Min said. "What makes our work new and unique is that there are two synergistic pieces - instrumentation and molecules - working together to combat this long-standing obstacle. Our platform has the capacity to transform understanding of complex biological systems: the vast human cell map, metabolic pathways, the functions of various structures within the brain, the internal environment of tumors, and macromolecule assembly, to name just a few." All existing methods of observing a variety of structures in living cells and tissues have their own strengths, but all are also hindered by fundamental limitations, not the least of which is the existence of a "color barrier." Fluorescence microscopy, for example, is extremely sensitive and, as such, is the most prevalent technique used in biology labs. The microscope allows scientists to monitor cellular processes in living systems by using proteins that are broadly referred to as "fluorescent proteins" with usually up to five colors. Each of the fluorescent proteins has a target structure that it applies a "tag," or color to. The five fluorescent proteins, or colors, typically used to tag these structures are BFP (Blue Fluorescent Protein), ECFP (Cyan Fluorescent Protein), GFP (Green Fluorescent Protein), mVenus (Yellow Fluorescent Protein), and DsRed (Red Fluorescent Protein). Despite its strengths, fluorescence microscopy is impeded by the "color barrier," which limits researchers to seeing a maximum of only five structures at a time because the fluorescent proteins used emit a range of indistinguishable shades that, as a result, fall into five broad color categories. If a researcher is trying to observe all of the hundreds of structures and different cell types in a live brain tumor tissue sample, for example, she would be restricted to seeing only up to five structures at a time on a single tissue sample. If she wanted to see more than those five, she would have to clean the tissue of the fluorescent labels she used to identify and tag the last five structures in order to use those same fluorescent labels to identify another set of up to five structures. She would have to repeat this process for every set of up to five structures she wants to see. 
Not only is observing a maximum of five structures at a time labor intensive, but in cleaning the tissue, vital components of that tissue could be lost or damaged. "We want to see them all at the same time to see how they're operating on their own and also how they're interacting with each other," said Lu Wei, lead author on the study and a postdoctoral researcher in the Min lab. "There are lots of components in a biological environment and we need to be able to see everything simultaneously to truly understand the processes." In addition to fluorescence microscopy, there are currently a variety of Raman microscopy techniques in use for observing living cell and tissue structures that work by making visible the vibrations stemming from characteristic chemical bonds in structures. Traditional Raman microscopy produces the highly-defined colors lacking in fluorescence microscopy, but is missing the sensitivity. As such, it requires a strong, concentrated vibrational signal that can only be achieved through the presence of millions of structures with the same chemical bond. If the signal from the chemical bonds is not strong enough, visualizing the associated structure is near impossible. To address this challenge, Min and his team, including Profs. Virginia Cornish in chemistry and Rafael Yuste in neuroscience, pursued a novel hybrid of existing microscopy techniques. They developed a new platform called electronic pre-resonance stimulated Raman scattering (epr-SRS) microscopy that combines the best of both worlds, bringing together a high level of sensitivity and selectivity. The innovative technique identifies, with extreme specificity, structures with significantly lower concentration - instead of millions of the same structure needed to identify the presence of that structure in traditional Raman microscopy, the new instrument requires only 30 for identification. The technique also utilizes a novel set of tagging molecules designed by the team to work synergistically with the ultramodern technology. The amplified "color palette" of molecules broadens tagging capabilities, allowing for the imaging of up to 24 structures at a time instead of being limited by only five fluorescent colors. The researchers believe there's potential for even further expansion in the future. The team has successfully tested the epr-SRS platform in brain tissue. "We were able to see the different cells working together," Wei said. "That's the power of a larger color palette. We can now light up all these different structures in brain tissue simultaneously. In the future we hope to watch them function in real time." Brain tissue is not the only thing the researchers envision this technique being used for, she added. "Different cell types have different functions, and scientists usually study only one cell type at a time. With more colors, we can now start to study multiple cells simultaneously to observe how they interact and function both on their own and together in healthy conditions versus in disease states." The new platform has many potential applications, Min said, adding that it is possible the technique could one day be used in the treatment of tumors that are hard to kill with available drugs. "If we can see how structures are interacting in cancer cells, we can identify ways to target specific structures more precisely," he said. "This platform could be game-changing in the pursuit of understanding anything that has a lot of components." 
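One way to see why narrow vibrational peaks allow many more simultaneous labels than broad fluorescence spectra is linear spectral unmixing: a measured spectrum is modeled as a weighted sum of each label's reference spectrum, and the weights (effectively, label concentrations) are recovered by least squares. Narrow, well-separated peaks keep that system well conditioned even with two dozen labels, while broad overlapping spectra do not. The toy below illustrates this with synthetic Gaussian peaks; it is a conceptual sketch only, not the epr-SRS instrument's actual signal processing, and all numbers are assumptions.

```python
# Toy linear unmixing: recover per-label weights from a mixed spectrum.
# Synthetic Gaussian "reference spectra"; narrower peaks give better-conditioned unmixing.
import numpy as np

wavenumbers = np.linspace(0, 1, 500)

def reference_spectra(n_labels, width):
    centers = np.linspace(0.1, 0.9, n_labels)
    return np.array([np.exp(-((wavenumbers - c) / width) ** 2) for c in centers])

def max_unmixing_error(true_weights, width, noise=0.01, seed=0):
    rng = np.random.default_rng(seed)
    A = reference_spectra(len(true_weights), width)           # (labels x points)
    mixed = true_weights @ A + noise * rng.standard_normal(len(wavenumbers))
    est, *_ = np.linalg.lstsq(A.T, mixed, rcond=None)          # least-squares weights
    return np.abs(est - true_weights).max()

weights = np.random.default_rng(1).uniform(0.5, 1.5, 24)       # 24 hypothetical labels
print("max weight error, narrow peaks:", max_unmixing_error(weights, width=0.01))
print("max weight error, broad peaks: ", max_unmixing_error(weights, width=0.15))
```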
Funding: NIH Director's New Innovator Award (1DP2EB016573), R01 (EB020892), the US Army Research Office (W911NF-12-1-0594), the Alfred P. Sloan Foundation and the Camille and Henry Dreyfus Foundation. R.Y. is supported by the NEI (EY024503, EY011787) and NIMH (MH101218, MH100561)


News Article | April 19, 2017
Site: www.fastcompany.com

When Sheena Wright stepped into the role of president of the United Way of New York City (UWNYC) nearly five years ago, she was the first woman to hold the position in the organization’s 80-year history. She currently presides over a staff of 98 employees and oversees programming and revenues in excess of $57 million . But Wright is no stranger to trailblazing. “I learned early on, I’m almost always underestimated,” she says. Raised in the South Bronx, Wright was one of a handful of African-American students at the boarding school she attended and entered Columbia University at the age of 16. Personal experience has informed Wright’s leadership as she’s sought to grow UWNYC by building on its existing strengths and assets. It’s also why she’s passionate about the work. Wright admits she was fortunate as a child to be exposed to programming that offered her different educational opportunities beyond public school. Some of her peers were not so lucky, she says. Wright says she’s had to be very clear about her strengths and capabilities, because there will always be people who see being a black woman from a poor neighborhood as a liability. “I have to be constantly aware [of my strengths], because there’s always going to be an assumption that they’re not there,” she explains, “and I’m going to have to demonstrate that.” Wright was immediately tasked with demonstrating her leadership skills and vision when she applied for the job at UWNYC by creating a new mission statement. “The organization was really at a turning point,” she says, being best known as a corporate social responsibility partner for companies and having a broad mission statement of advancing the common good. Wright says she thought hard about the fact that many companies didn’t need a middleman like United Way when they could easily reach nonprofits directly to donate resources. “My goal and vision was for UWNYC to reinvent itself,” she says, “and play more of an activist role.” After she landed the job, says Wright, she started putting this vision into motion. Of the activist role, she admits, “That was a little bit controversial. We had a board of corporate guys,” she says, 80% of whom were white men and their average age was 62. “I love them,” she enthuses, “they are wonderful people, but this was new language.” But it wasn’t just the board who were challenged by Wright’s bold language and vision. “The people who had the most trouble were internal,” she says. That mostly came from a place of doubt about putting a goal out there and not achieving it. Wright made them see that there was no alternative if they were going to grow. “We owe it to them,” says Wright of her predecessors at the organization. To get past the initial resistance, Wright says she couched the mind-set as an evolution and tried to engage staff and board members in conversations and decisions. Wright also says, “I’m always asking questions,” especially around the data collected from programming. “I think one of the challenges is making sure we have the right data and are asking the right questions,” she says. It’s also about changing established mind-sets.


News Article | April 25, 2017
Site: www.chromatographytechniques.com

Most people on Earth have already felt extreme and record heat, drought or downpours goosed by man-made global warming, new research finds. In a first-of-its-kind study, scientists analyzed weather stations worldwide and calculated that in 85 percent of the cases, the record for hottest day of the year had the fingerprints of climate change. Heat-trapping gases from the burning of coal, oil and natural gas made those records more likely or more intense. "The world is not quite at the point where every hot temperature record has a human fingerprint, but it's getting close to that," said lead author and Stanford University climate scientist Noah Diffenbaugh. Climate change's influence was spotted 57 percent of the time in records for lowest rainfall in a year and 41 percent of the time in records for most rain in a 5-day period, according to the study in Monday's Proceedings of the National Academy of Sciences. Over the last several years, researchers have come up with a generally accepted scientific technique to determine whether an individual weather extreme event was made more likely or stronger because of climate change. It usually involves past weather data and extensive computer models that simulate how often an event would happen with no warming from greenhouse gases and compare that to how often it does happen. Outside scientists said what makes Diffenbaugh's study different and useful is that he doesn't look at an individual event such as California's five-year drought. Instead, he applies the technique to weather stations as a whole across the world, said Columbia University climate scientist Adam Sobel, who wasn't part of the new work. "This is a step forward in that it allows general statements about what fraction of events of the given types selected have a statistically significant" human influence, Sobel said in an email.
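The attribution technique summarized above compares how often a given extreme occurs in simulations of the climate as it is against simulations without human greenhouse gas emissions. Two standard summary quantities for that comparison are the probability ratio p1/p0 and the fraction of attributable risk, 1 - p0/p1. A minimal sketch under assumed inputs follows; the threshold and the simulated series are placeholders, not data or code from the study.

```python
# Sketch of event attribution: compare exceedance probabilities in a "factual"
# (with human emissions) vs a "counterfactual" (without) model ensemble.
# The arrays below are synthetic placeholders for illustration.
import numpy as np

def attribution_stats(factual, counterfactual, threshold):
    p1 = np.mean(np.asarray(factual) >= threshold)          # probability with warming
    p0 = np.mean(np.asarray(counterfactual) >= threshold)   # probability without warming
    probability_ratio = p1 / p0
    fraction_attributable_risk = 1.0 - p0 / p1
    return probability_ratio, fraction_attributable_risk

rng = np.random.default_rng(0)
factual = rng.normal(38.0, 1.5, 10_000)         # hypothetical annual-max temperatures, warmed climate
counterfactual = rng.normal(37.0, 1.5, 10_000)  # hypothetical pre-industrial-style ensemble
pr, far = attribution_stats(factual, counterfactual, threshold=40.0)
print(f"Probability ratio: {pr:.2f}, fraction of attributable risk: {far:.2f}")
```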


News Article | April 27, 2017
Site: www.eurekalert.org

Bloomington, IN - April 27, 2017 - The Midwest Political Science Association announced fourteen award recipients at its annual MPSA Business Meeting earlier this month at the Palmer House Hilton in Chicago. Awards committees select the winners from among nominations made by chairs, discussants and section heads at the previous year's conference.
Best Paper by an Emerging Scholar - Honoring the best paper, regardless of field or topic, by a scholar or scholars who has or have received the terminal degree(s) within six years of the year in which the paper was presented. Javier Osorio, City University of New York; Livia I. Schubiger, London School of Economics; Michael Weintraub, Binghamton University, SUNY.
Best Paper in International Relations - Honoring the best paper on the topic of international relations. Award Committee: Vesna Danilovic, University at Buffalo (Chair); David Cunningham, University of Maryland; Karl Kalenthaler, University of Akron.
Best Paper Presented by a Graduate Student - Honoring the best paper delivered by a graduate student. "Can New Procedures Improve the Quality of Policing? The Case of 'Stop, Question and Frisk' in New York City."
Best Paper Presented in a Poster Format - Honoring the best paper presented in a poster format.
Best Undergraduate Paper Presented in a Poster Format - Honoring the best paper presented in a poster format by an undergraduate. "The Effect of Nationality on Grass-root Volunteer and Donors Support for Nongovernmental Organizations."
Kellogg/Notre Dame Award - Honoring the best paper in comparative politics. "Anti-Identities in Latin America: Chavismo, Fujimorismo, and Uribismo in Comparative Perspective." Award Committee: Sarah Brooks, The Ohio State University (Chair); Alex Tan, University of Canterbury; Zeynep Somer-Topcu, The University of Texas at Austin.
Kenneth J. Meier Award - Honoring the best paper in bureaucratic politics, public administration, or public policy. "Slow-Rolling, Fast-Tracking, and the Pace of Bureaucratic Decisions in Rulemaking."
Lucius Barker Award - Honoring the best paper on a topic investigating race or ethnicity and politics and honoring the spirit and work of Professor Barker. "Saved from a Second Slavery: Black Voter Registration in Louisiana from Reconstruction to the Voting Rights Act."
Review of Politics Award (co-winners) - Honoring the best paper in normative political theory. "Reparative Justice and the Moral Limits of Discretionary Philanthropy," Chiara Cordelli, University of Chicago; "Edmund Burke and the Deliberative Sublime," Rob Goodman, Columbia University.
Robert H. Durr Award - Honoring the best paper applying quantitative methods to a substantive problem. "Of Rents and Rumors: Government Competence and Media Freedom in Authoritarian Countries."
Sophonisba Breckinridge Award - Honoring the best paper on the topic of women and politics. "Making Space for Women: Explaining Citizen Support for Legislative Gender Quotas in Latin America."
Pi Sigma Alpha Award - Honoring the best paper presented at the MPSA Annual National Conference. Sponsored by Pi Sigma Alpha, the national political science honor society.
AJPS Best Article Award - Honoring the best article appearing in the volume of the American Journal of Political Science published in the year preceding the conference.
The Midwest Political Science Association (MPSA) is an international organization with a membership of approximately 7,000 political science faculty, researchers and graduate students representing more than 100 countries.
Founded in 1939, the MPSA is dedicated to the advancement of scholarship in all areas of political science. MPSA publishes the American Journal of Political Science, the top research journal in the discipline.


News Article | April 20, 2017
Site: www.gizmag.com

The new microscopy technique provides a more comprehensive picture of what's happening both inside and between our cells (Credit: Nicoletta Barolini, Columbia University)
By some estimates, the human eye can distinguish between a million colors, yet scientists looking through a microscope have been limited to seeing only five. Researchers at Columbia University have blasted through this "color barrier" with a new technique that ups the color vision of microscopy to a total of 24 different shades. Coloring structures inside our cells can help scientists track the effects of drugs, observe how cellular biomechanics play out, trace complex metabolic pathways and more. Until now, there have been two main ways for researchers to use color to observe cellular materials: fluorescence microscopy and Raman spectroscopy. In fluorescence microscopy, cellular structures are either stained or tagged with a fluorescent chemical known as a fluorophore (or fluorochrome) which, when struck by light, re-emits that light at a particular wavelength, illuminating the structure. However, the system only allows five different colors to be used at once, which means only five different structures can be watched at one time per tissue sample. If researchers want to see other processes in the same tissue, they need to clean it and start again, and the cleaning process can damage the tissue. With Raman spectroscopy, a tissue sample is beamed with a laser. A part of that light is scattered based on the vibrations it encounters from molecules in the tissue. A Raman microscope translates these different wavelengths into images and provides information about the chemical structure of the tissue. A drawback to this method is that, for it to be effective, millions of copies of any one structure must be present to create a strong enough vibrational signal. Any less and imaging becomes practically impossible. To solve the issues with these systems, the Columbia researchers combined them both. They call their new technique electronic pre-resonance stimulated Raman scattering (epr-SRS) microscopy. Instead of a million, the new microscope only needs 30 structures to perform Raman spectroscopy, making it leagues more sensitive than its precursors. The researchers also harnessed different tagging molecules and stains that allow 24 different cellular structures to be seen instead of just five. Lu Wei, lead author on the study, told New Atlas that 14 of those colors come from tagging molecules created by her team, while six are already commercially available. The other four come from fluorescent stains. The epr-SRS system was tested out successfully using brain tissue. "We were able to see the different cells working together," said Wei. "That's the power of a larger color palette. We can now light up all these different structures in brain tissue simultaneously. In the future we hope to watch them function in real time." Not only does the new system allow researchers to get a more comprehensive picture of what's happening inside cells, the enhanced color system lets them understand what's happening between cells as well. "Different cell types have different functions, and scientists usually study only one cell type at a time," said Wei. "With more colors, we can now start to study multiple cells simultaneously to observe how they interact and function both on their own and together in healthy conditions versus in disease states." 
Wei Min, in whose lab the research was conducted, added that the new imaging technique could open the door to better cancer treatments. If researchers can see how structures interact in individual cancer cells, he said, then drugs could be developed to target those structures with increased precision. "In the era of systems biology, how to simultaneously image a large number of molecular species inside cells with high sensitivity and specificity remains a grand challenge of optical microscopy," Min said. "What makes our work new and unique is that there are two synergistic pieces – instrumentation and molecules – working together to combat this long-standing obstacle. Our platform has the capacity to transform understanding of complex biological systems: the vast human cell map, metabolic pathways, the functions of various structures within the brain, the internal environment of tumors, and macromolecule assembly, to name just a few." The work of the researchers has been published in the journal Nature.


Dr. Anthony L. D'Ambrosio, Board Certified Neurosurgeon & Director, Neurosurgeons of New Jersey, has joined The Expert Network©, an invitation-only service for distinguished professionals. Dr. D'Ambrosio has been chosen as a Distinguished Doctor™ based on peer reviews and ratings, numerous recognitions, and accomplishments achieved throughout his career. Dr. D'Ambrosio outshines others in his field due to his extensive educational background, numerous awards and recognitions, and career longevity. After graduating from Vanderbilt University School of Medicine in 1999, Dr. D’Ambrosio went on to pursue his internship and Neurological Surgery residency at the Neurological Institute of New York at Columbia University, serving as chief resident in his final year. He then completed a Skull Base and Cerebrovascular Surgery Fellowship in the Department of Neurological Surgery at the University of South Florida in Tampa. He is the recipient of many awards and honors, such as New Jersey Monthly’s Best Doctor Award and (201) Health’s Bergen’s Top Doctors, among many others. With 10 years dedicated to medicine, Dr. D'Ambrosio brings a wealth of knowledge to his industry, and in particular, to his area of expertise, neurosurgery. When asked why he decided to pursue a career in medicine, Dr. D'Ambrosio said: "When I was little, my eldest grandfather, whom I never met, passed away of heart disease. Then in 1999, my mom's dad died from the same disease. I knew then that I wanted to go into medicine to help people, even if it was for something other than cardiology." Dr. D’Ambrosio’s vast knowledge, training, and expertise have made him one of the go to neurosurgeons in the Tri-State area. Today, he devotes his practice to the treatment of brain tumors, skull base tumors, meningiomas, acoustic neuromas, pituitary tumors, Chiari malformation, and microvascular decompression (MVD) for trigeminal neuralgia and hemifacial spasm. Using state-of-the-art technology, Dr. D'Ambrosio is able to perform complex neurological procedures with much lower complication percentages than traditional approaches. As Co-Director of Gamma Knife Perfexion Program at Valley Hospital in Ridgewood, N.J., Dr. D’Ambrosio uses revolutionary, non-invasive radiation beams that offer extremely precise, concentrated radiation to shrink and stop the growth of tumors and other cranial abnormalities, including pain and movement disorders, that would otherwise require surgery. As a thought-leader in his specialty, Dr. D’Ambrosio is widely regarded for his groundbreaking research in neurosurgery. He has published numerous peer-reviewed journal articles on the subjects of neuro-oncology, surgical management of brain metastases, and surgical strategies for giant pituitary adenomas, among many others. He is a big proponent of making medical decisions based on proven scientific evidence to offer the best possible service for his patients. Dr. D’Ambrosio is proud to pass on his knowledge to those that want to follow in his footsteps as an Assistant Professor of Neurological Surgery at Columbia University. This prominent position in his specialty gives Dr. D’Ambrosio a unique vantage point from which to keep a close eye on prevailing trends in neurosurgery. In particular, he notes a promising advancement in getting patients access to healthcare through technology and social media: "I think what we're starting to see is a connection between social media and telemedicine. 
The barriers to health care are dropping, and the world is becoming flat with respect to people getting their medical care and information from all around the world. I'm very interested in seeing trends in social media connecting patients to the right doctors along with apps and telemedicine getting the right people to talk. So all those barriers, whether they be physical barriers or transportation barriers, are starting to drop which is really great for neurosurgery because some of the things we do are so refined and siloed. I think we're going to see a major uptaking in collaboration between doctors that will enable patients to find the right specialists." For more information, visit Dr. D’Ambrosio's profile on the Expert Network here: https://expertnetwork.co/members/anthony-l-d'ambrosio-md-faans/24e2c5f808fdb608 The Expert Network© has written this news release with approval and/or contributions from Dr. Anthony L. D'Ambrosio. The Expert Network© is an invitation-only reputation management service that is dedicated to helping professionals stand out, network, and gain a competitive edge. The Expert Network selects a limited number of professionals based on their individual recognitions and history of personal excellence.


News Article | April 29, 2017
Site: www.prweb.com

LearnHowToBecome.org, a leading resource provider for higher education and career information, has determined which online colleges and universities in the U.S. have the most military-friendly programs and services. Of the 50 four-year schools that earned honors, Drexel University, University of Southern California, Duquesne University, Regis University and Harvard University were the top five. In addition, 50 two-year schools were recognized; Laramie County Community College, Western Wyoming Community College, Dakota College at Bottineau, Mesa Community College and Kansas City Kansas Community College ranked as the top five. A complete list of top schools is included below. “Veterans and active duty members of the military often face unique challenges when it comes to transitioning into college, from navigating the GI Bill to getting used to civilian life,” said Wes Ricketts, senior vice president of LearnHowToBecome.org. “These online schools not only offer military-friendly resources, they also offer an online format, allowing even the busiest members of our armed forces to earn a degree or certificate.” To be included on the “Most Military-Friendly Online Colleges” list, schools must be regionally accredited, not-for-profit institutions. Each college is also evaluated on additional data points such as the number and variety of degree programs offered, military tuition rates, employment services, post-college earnings of alumni and military-related academic resources. For complete details on each college, including individual scores and the data and methodology used to determine the LearnHowToBecome.org “Most Military-Friendly Online Colleges” list, visit: The Most Military-Friendly Online Four-Year Colleges in the U.S. for 2017 include: Arizona State University-Tempe, Auburn University, Azusa Pacific University, Baker University, Boston University, Canisius College, Carnegie Mellon University, Columbia University in the City of New York, Creighton University, Dallas Baptist University, Drexel University, Duquesne University, George Mason University, Hampton University, Harvard University, Illinois Institute of Technology, Iowa State University, La Salle University, Lawrence Technological University, Lewis University, Loyola University Chicago, Miami University-Oxford, Michigan Technological University, Missouri University of Science and Technology, North Carolina State University at Raleigh, Norwich University, Oklahoma State University-Main Campus, Pennsylvania State University-Main Campus, Purdue University-Main Campus, Regis University, Rochester Institute of Technology, Saint Leo University, Southern Methodist University, Syracuse University, Texas A & M University-College Station, University of Arizona, University of Denver, University of Florida, University of Idaho, University of Illinois at Urbana-Champaign, University of Michigan-Ann Arbor, University of Minnesota-Twin Cities, University of Mississippi, University of Missouri-Columbia, University of North Carolina at Chapel Hill, University of Oklahoma-Norman Campus, University of Southern California, University of the Incarnate Word, Washington State University and Webster University. The Most Military-Friendly Online Two-Year Colleges in the U.S. 
for 2017 include: Aims Community College, Allen County Community College, Amarillo College, Barton County Community College, Bunker Hill Community College, Casper College, Central Texas College, Chandler-Gilbert Community College, Cincinnati State Technical and Community College, Cochise College, Columbus State Community College, Cowley County Community College, Craven Community College, Dakota College at Bottineau, East Mississippi Community College, Eastern New Mexico University - Roswell Campus, Edmonds Community College, Fox Valley Technical College, GateWay Community College, Grayson College, Hutchinson Community College, Kansas City Kansas Community College, Lake Region State College, Laramie County Community College, Lone Star College, Mesa Community College, Metropolitan Community College, Mitchell Technical Institute, Mount Wachusett Community College, Navarro College, Northeast Community College, Norwalk Community College, Ozarka College, Phoenix College, Prince George's Community College, Quinsigamond Community College, Rio Salado College, Rose State College, Sheridan College, Shoreline Community College, Sinclair College, Southeast Community College, Southwestern Oregon Community College, State Fair Community College, Truckee Meadows Community College, Western Nebraska Community College, Western Oklahoma State College, Western Texas College, Western Wyoming Community College and Yavapai College. ### About Us: LearnHowToBecome.org was founded in 2013 to provide data- and expert-driven information about employment opportunities and the education needed to land the perfect career. Our materials cover a wide range of professions, industries and degree programs, and are designed for people who want to choose, change or advance their careers. We also provide helpful resources and guides that address social issues, financial aid and other special interests in higher education. Information from LearnHowToBecome.org has proudly been featured by more than 700 educational institutions.


News Article | April 26, 2017
Site: www.scientificamerican.com

Pres. Donald Trump’s administration has announced plans to dismantle an array of federal efforts to fight global warming, including a program to reduce carbon emissions from coal-fired power plants, a rule limiting methane gas leaks and a mandate that aggressively boosts auto emissions standards. But Trump officials face a major roadblock in their efforts, legal scholars say. It is the U.S. Environmental Protection Agency’s 2009 formal “endangerment finding,” which states carbon dioxide and five other greenhouse gases emitted from smokestacks and other man-made sources “threaten the public health and welfare of current and future generations.” This agency rule, supported by two Supreme Court decisions, legally compels the government to do exactly what its new leaders want to avoid: regulate greenhouse gases. Although EPA Administrator Scott Pruitt publicly doubts a connection between human-produced carbon emissions and global warming, any attempt to undo this rule “would be walking into a legal buzz saw,” says Michael Gerrard, faculty director of the Sabin Center for Climate Change Law at Columbia University. Endangerment is “the linchpin for everything—all of the carbon regulation under the Clean Air Act,” says Patrick Parenteau, a professor of environmental law at Vermont Law School. The rule’s fundamental power is exactly why Pruitt has to remove it, says Myron Ebell, who oversaw the Trump transition team at the EPA. “You can’t just take out the flowers—you have to take out the roots—starting with the endangerment finding,” says Ebell, a senior fellow at the conservative think tank the Competitive Enterprise Institute. “You can undo the Obama climate agenda on the surface by reopening the Clean Power Plan rule, the methane rule, rescinding the [auto emissions] standards and so on. But the underlying foundation remains.” The conservative Web site Breitbart, read widely among Trump’s supporters and still tied to its former publisher, White House adviser Steve Bannon, has attacked Pruitt as a political careerist for reportedly resisting pressure to revoke the finding. The rule rests on a 2007 Supreme Court decision in the case Massachusetts v. EPA, which determined the agency has the authority under the Clean Air Act to regulate greenhouse gases. When the finding itself was later challenged, the Court upheld it. The endangerment finding prevents Pruitt from ignoring climate change or eliminating greenhouse gas regulations outright. The EPA can attempt to water down these standards and regulations, perhaps substantially. But Pruitt “would have to come up with a scientific basis for saying that greenhouse gas emissions do not in fact pose a threat to public health and welfare,” Gerrard says. “That would be a very difficult finding, considering every court that has addressed the issue of the science of climate change has found there to be a solid factual, scientific basis for it.” To begin to remove the endangerment rule, the EPA would have to go through a formal rule-making process. That means inviting public comments, reviewing available evidence and scientifically justifying every point. Formulating and then defending such a document in court would be a big challenge, given it cuts against the legal and scientific consensus linking carbon to climate change. Even Ebell concedes this is a formidable obstacle. “That’s why a lot of people on our side say it’s not worth the trouble,” he says. 
“The people who disagree with me are not nuts—they are making substantial arguments for why we should not do it.” The endangerment finding has its roots in the waning days of the Clinton administration, when then–EPA General Counsel Jonathan Cannon drafted a legal memo stating the agency had the authority to regulate carbon emissions. At the time this was a novel and counterintuitive idea. CO2 is a ubiquitous, naturally occurring gas, essential to photosynthesis and other basic processes of life on Earth. It’s not poisonous like smog and other dangerous pollutants targeted by the Clean Air Act. “CO2 is a different sort of pollutant than many that the EPA regulates,” says Cannon, now a professor at the University of Virginia School of Law. “Its effects are felt over time through the climate system, not as immediate effects on one’s lungs or physical systems.” But the Clean Air Act “has a very broad definition of what a pollutant can be and what harm a pollutant causes,” says George Kimbrell, legal director of the International Center for Technology Assessment and the Center for Food Safety, two related groups among a coalition of environmental organizations that formally petitioned the EPA to regulate carbon in 1999. The law defines “air pollutant” as “any air pollution agent or combination of such agents, including any physical, chemical, biological, radioactive...substance or matter, which is emitted into or otherwise enters the ambient air.” According to Kimbrell, “The breadth of that language suggested greenhouse gas emissions would qualify under the statute.” The language prompted states and small environmental groups to sue the EPA during the George W. Bush administration to force it to regulate carbon. The result was the Supreme Court’s 5–4 Massachusetts decision in 2007. Following that ruling, the endangerment finding then spelled out the legal rationale and the scientific basis for regulation. What can the Trump administration do to get out of this regulatory box? It could push Congress to amend the Clean Air Act to explicitly exclude carbon dioxide and other greenhouse gases from the list of air pollutants. But even if it passed the Republican-dominated House, Parenteau notes, such a bill could be effectively opposed by Democrats in the Senate, who have enough votes to hold up or change legislation. The EPA could also target climate rules not based on the endangerment finding, such as procedures for monitoring and reporting greenhouse gases, according to Gerrard. The most likely outcome, legal scholars say, is a series of incremental battles in which the administration and Congress try to weaken individual climate rules and enforcement—while those efforts are repeatedly challenged in court by states and environmental groups hoping to run out the clock on the Trump administration. “One reason the endangerment finding is important,” Cannon says, “is that, should administrations change, it provides the basis for further climate initiatives.”


News Article | April 27, 2017
Site: www.rdmag.com

Working with human brain tissue samples and genetically engineered mice, Johns Hopkins Medicine researchers, together with colleagues at the National Institutes of Health, the University of California San Diego Shiley-Marcos Alzheimer's Disease Research Center, Columbia University, and the Institute for Basic Research in Staten Island, say that low levels of the protein NPTX2 in the brains of people with Alzheimer’s disease (AD) may change the pattern of neural activity in ways that lead to the learning and memory loss that are hallmarks of the disease. This discovery, described online in the April 25 edition of eLife, points to new avenues of research and may one day help experts develop new and better therapies for Alzheimer’s and other forms of cognitive decline. AD currently affects more than five million Americans. Clumps of proteins called amyloid plaques, long seen in the brains of people with AD, are often blamed for the mental decline associated with the disease. But autopsies and brain imaging studies reveal that people can have high levels of amyloid without displaying symptoms of AD, calling into question a direct link between amyloid and dementia. This new study shows that when the protein NPTX2 is “turned down” at the same time that amyloid is accumulating in the brain, circuit adaptations that are essential for neurons to “speak in unison” are disrupted, resulting in a failure of memory. “These findings represent something extraordinarily interesting about how cognition fails in human Alzheimer’s disease,” says Paul Worley, M.D., a neuroscientist at the Johns Hopkins University School of Medicine and the paper’s senior author. “The key point here is that it’s the combination of amyloid and low NPTX2 that leads to cognitive failure.” Since the 1990s, Worley’s group has been studying a set of genes known as “immediate early genes,” so called because they’re activated almost instantly in brain cells when people and other animals have an experience that results in a new memory. The gene NPTX2 is one of these immediate early genes that gets activated and makes a protein that neurons use to strengthen “circuits” in the brain. “Those connections are essential for the brain to establish synchronized groups of ‘circuits’ in response to experiences,” says Worley. “Without them, neuronal activation cannot be effectively synchronized and the brain cannot process information.” Worley says he was intrigued by previous studies indicating altered patterns of activity in brains of individuals with Alzheimer’s. Worley’s group wondered whether altered activity was linked to changes in immediate early gene function. To get answers, the researchers first turned to a library of 144 archived human brain tissue samples to measure levels of the protein encoded by the NPTX2 gene. NPTX2 protein levels, they discovered, were reduced by as much as 90 percent in brain samples from people with AD compared with age-matched brain samples without AD. By contrast, people with amyloid plaques who had never shown signs of AD had normal levels of NPTX2. This was an initial suggestion of a link between NPTX2 and cognition. Prior studies had shown NPTX2 to play an essential role in developmental brain wiring and in resistance to experimental epilepsy. To study how lower-than-normal levels of NPTX2 might be related to the cognitive dysfunction of AD, Worley and his collaborators examined mice bred without the rodent equivalent of the NPTX2 gene. 
Tests showed that a lack of NPTX2 alone wasn’t enough to affect cell function as tested in brain slices. The researchers then introduced into the mice a gene that increases amyloid generation in the brain. In brain slices from mice with both amyloid and no NPTX2, fast-spiking interneurons could not control brain “rhythms” important for making new memories. Moreover, a glutamate receptor that is normally expressed in interneurons and essential for interneuron function was down-regulated as a consequence of amyloid and NPTX2 deletion in mice and similarly reduced in human AD brains. Worley says the results suggest that the increased activity seen in the brains of AD patients is due to low NPTX2, combined with amyloid plaques, with consequent disruption of interneuron function. And if the effect of NPTX2 and amyloid is synergistic — one depending on the other for the effect — it would explain why not all people with high levels of brain amyloid show signs of AD. The team then examined NPTX2 protein in the cerebrospinal fluid (CSF) of 60 living AD patients and 72 people without AD. Lower scores of memory and cognition on standard AD tests, they found, were associated with lower levels of NPTX2 in the CSF. Moreover, NPTX2 correlated with measures of the size of the hippocampus, a brain region essential for memory that shrinks in AD. In this patient population, NPTX2 levels were more closely correlated with cognitive performance than current best biomarkers — including tau, a biomarker of neurodegenerative diseases, and a biomarker known as A-beta-42, which has long been associated with AD. Overall, NPTX2 levels in the CSF of AD patients were 36 to 70 percent lower than in people without AD. “Perhaps the most important aspect of the discovery is that NPTX2 reduction appears to be independent of the mechanism that generates amyloid plaques. This means that NPTX2 represents a new mechanism, which is strongly founded in basic science research, and that has not previously been studied in animal models or in the context of human disease. This creates many new opportunities,” says Worley. “One immediate application may be to determine whether measures of NPTX2 can be helpful as a way of sorting patients and identifying a subset that are most responsive to emerging therapies.” For instance, drugs that disrupt amyloid may be more effective in patients with relatively high NPTX2. His group is now providing reagents to companies to assess development of a commercial test that measures NPTX2 levels. More work is needed, Worley adds, to understand why NPTX2 levels become low in AD and how that process could be prevented or slowed. In addition to Paul Worley, the study’s authors are Meifang Xiao, Desheng Xu, Chun-Che Chien, Yang Shi, Juhong Zhang, Olga Pletnikova, Alena Savonenko, Roger Reeves, and Juan Troncoso of Johns Hopkins University School of Medicine; Michael Craig of the University of Exeter; Kenneth Pelkey and Chris McBain of the National Institute of Child Health and Human Development; Susan Resnick of the National Institute on Aging’s Intramural Research Program; David Salmon, James Brewer, Steven Edland, and Douglas Galasko of the Shiley-Marcos Alzheimer's Disease Research Center at the University of California San Diego Medical Center; Jerzy Wegiel of the Institute for Basic Research in Staten Island; and Benjamin Tycko of Columbia University Medical Center. 
Funding for the studies described in the eLife article was provided by the National Institutes of Health under grant numbers MH100024, R35 NS-097966, P50 AG005146, and AG05131, as well as by the Alzheimer’s Disease Discovery Foundation and Lumind.


News Article | May 2, 2017
Site: www.scientificamerican.com

A lot of human connectome images (neural connection brain maps) pass across my desk as reference material for Scientific American graphics. But I’ve never seen one like this: Twenty-four feet tall, rendered with an LED system and seven 72-inch monitors that crawl up and down the wall. The connectome rotates and transitions into a view of the external surface of the brain. The monitors slide up, focusing the viewer’s attention on a strip of the background image, revealing that portion in greater resolution. Then the monitors cascade back down to human level and beckon people to step forward and interact. I’m at the Jerome L. Greene Science Center on Columbia University’s Manhattanville campus, checking out the new Brain Index exhibit. The project aims to capture the imagination of visitors with larger-than-life imagery and interactive stories about research conducted in the labs upstairs. Commissioned by the Mortimer B. Zuckerman Mind Brain Behavior Institute and orchestrated by creative directors Mark Hansen (Professor of Journalism, Director of the Brown Institute for Media Innovation) and Laura Kurgan (Associate Professor of Architecture, Director of the Center for Spatial Research), it’s now a permanent feature of the ground floor, and open to the public. Eight scientists are featured when the touch-screen monitors dock at ground level, allowing visitors to swipe their way through an overview of each scientist's research, woven together with illustrations by Fernando Togni. A panel on blood and the brain catches my attention, highlighting Elizabeth Hillman’s work. (See her article in Scientific American MIND on the neuron-blood vessel connection, and how things can go wrong with it). Although the blurbs are interesting, after perusing a few stories, I find myself wanting the screens to crawl back up the wall. I spend too much time with my nose close to a screen as it is. There’s something refreshing and thought-provoking about stepping back, looking up, and watching the brain dance.


Antarctica isn’t a huge, static block of ice where very little goes on. For the first time, scientists are getting a sense of just how active the continent’s extensive network of lakes, rivers, and streams is. These bodies of water have existed for decades on Antarctica, and their meltwater affects the stability of the ice shelf underneath. That, in turn, has important implications for sea level rise. Antarctica’s landmass is surrounded by hundreds of floating ice shelves that play a key role in preventing sea levels from engulfing our coastal cities. In fact, these ice shelves keep the ground-based ice from flowing into the sea, which would raise sea levels by several feet. Scientists have long known that, in the summer, some surface ice and snow on these ice shelves melts, pooling in lakes and streams. But until now, the phenomenon was thought to be pretty rare, according to Alison Banwell at the University of Cambridge’s Scott Polar Research Institute, who wrote a comment on the new research. Today’s study, published in Nature, shows that the network of lakes and streams is actually widespread on top of many ice shelves, transporting water for up to 75 miles. Some ponds were found to be up to 50 miles long. “The fact that there are these huge rivers moving water for hundreds of kilometers, that’s even quite an exciting discovery,” lead study author Jonathan Kingslake, a glaciologist at Columbia University’s Lamont-Doherty Earth Observatory, tells The Verge. “They’re very common across the ice sheet, but we are a long way from being able to understand how they behave and how they will impact the ice sheet in the long term.” In fact, there’s a lot we don’t know about the way this meltwater interacts with the ice sheet. Lakes and ponds that form on top of the ice are thought to be dangerous. That’s because the weight of the liquid water can crack the ice; when water drains through the crevasses, it may freeze and expand, widening the cracks and fracturing the ice. This process is believed to have caused the break-up of the Larsen B Ice Shelf in 2002. But the meltwater doesn’t only collect in puddles. Today’s study shows that it also flows downhill in rivers — for miles across the continent. And another study published today, also in Nature, shows that the meltwater doesn’t necessarily weaken the ice shelf beneath. This second paper analyzed a particular region called the Nansen Ice Shelf, located in East Antarctica. Here, large and complex river networks allow huge amounts of meltwater to flow off the shelf into the ocean, with a 400-foot-wide waterfall. The drainage system may be protecting the ice shelf by getting the water off the ice quickly, before its weight cracks the ice. That means the meltwater isn’t necessarily dangerous. “The meltwater acts as a jackhammer on an ice shelf is what we’ve always thought,” the lead author of the second Nature paper, Robin Bell, a polar scientist at Columbia University’s Lamont-Doherty Earth Observatory, tells The Verge. “This study suggests that we can’t just assume that if we turn up the temperature, every ice shelf will collapse.” Rather, the study suggests the process will be more complex. Scientists expect that as temperatures warm up, we’re going to see even more meltwater in Antarctica. So understanding how this water behaves and what effects it has is key for predicting what’s going to happen in this part of the world, and whether or not it’s going to affect sea level rise. 
In the first study, researchers analyzed satellite images from 1973 onward, as well as aerial photos taken from 1947 onward. They found that a widespread and complicated drainage system made of lakes and rivers has existed across Antarctica for decades. Some of these streams and ponds are present as close as 375 miles from the South Pole, and at 4,300 feet above sea level. Those are areas that were thought to be clear of liquid water. Whether the amount of meltwater has actually increased in the past 70 years is impossible to tell, says Kingslake. That’s because in the past, photos of the continent were not taken as frequently as today. So you might have a photo taken in 1973, and then another one taken in 1980, with no images in between, Kingslake says. That seven-year gap doesn’t allow researchers to understand long-term trends and calculate whether we’re seeing more meltwater. “At the moment, the initial indication is that things haven’t changed significantly,” Kingslake says. But as the planet warms up, scientists expect more surface ice and snow to melt and puddle up in lakes or flow in rivers. What effects this liquid water will have on the stability of the ice shelf, however, remains to be seen. “It is complicated and there are a whole bunch of processes that are really interesting and we don’t really understand,” Kingslake says. That’s why the Nature studies published today are important: they add a piece to the puzzle of figuring out how one of the largest reserves of ice on Earth works. As temperatures climb and waters warm, all this information will be key to understanding how sea levels will rise. As for how Kingslake got interested in studying Antarctica’s drainage systems, it’s all thanks to Google Earth. In 2010, he used to spend lots of time surfing the site, he says. At one point, he noticed lots of ponds on Antarctica’s ice surface. That inspired him to study more detailed satellite images and look into the widespread system of lakes and rivers dotting the continent. Today, Kingslake tells his students to never feel bad if they’re procrastinating by looking at Google Earth images. You never know what you’re going to find. “It’s not a waste of time!” he says.


News Article | April 18, 2017
Site: www.futurity.org

A new camera system doesn’t need a long lens to take a detailed micron-resolution image of a faraway object. The prototype reads a spot illuminated by a laser and captures the “speckle” pattern with a camera sensor. Raw data from dozens of camera positions feeds into a computer program that interprets it and constructs a high-resolution image. The system known as SAVI—for “Synthetic Apertures for long-range, subdiffraction-limited Visible Imaging”—only works with coherent illumination sources such as lasers, but it’s a step toward a SAVI camera array for use in visible light, says Ashok Veeraraghavan, assistant professor of electrical and computer engineering at Rice University. “Today, the technology can be applied only to coherent (laser) light,” he says. “That means you cannot apply these techniques to take pictures outdoors and improve resolution for sunlit images—as yet. Our hope is that one day, maybe a decade from now, we will have that ability.” The technology is the subject of a paper in Science Advances. Labs led by Veeraraghavan at Rice and Oliver Cossairt at Northwestern University’s McCormick School of Engineering built and tested the device that compares interference patterns between multiple speckled images. Like the technique used to achieve The Matrix’s “bullet time” special effect, the images are taken from slightly different angles, but with one camera that moves between shots instead of many fired in sequence. Veeraraghavan explains that the speckles serve as reference beams and essentially replace one of the two beams used to create holograms. When a laser illuminates a rough surface, the viewer sees grain-like speckles in the dot. That’s because some of the returning light scattered from points on the surface has farther to go and throws the collective wave out of phase. The texture of a piece of paper—or even a fingerprint—is enough to cause the effect. The researchers use these phase irregularities to their advantage. “The problem we’re solving is that no matter what wavelength of light you use, the resolution of the image—the smallest feature you can resolve in a scene—depends upon this fundamental quantity called the diffraction limit, which scales linearly with the size of your aperture,” Veeraraghavan says. “With a traditional camera, the larger the physical size of the aperture, the better the resolution,” he says. “If you want an aperture that’s half a foot, you may need 30 glass surfaces to remove aberrations and create a focused spot. This makes your lens very big and bulky.” SAVI’s “synthetic aperture” sidesteps the problem by replacing a long lens with a computer program that resolves the speckle data into an image. “You can capture interference patterns from a fair distance,” Veeraraghavan says. “How far depends on how strong the laser is and how far away you can illuminate.” “By moving aberration estimation and correction out to computation, we can create a compact device that gives us the same surface area as the lens we want without the size, weight, volume, and cost,” says Cossairt, an assistant professor of electrical engineering and computer science at Northwestern. Lead author Jason Holloway, a Rice alumnus who is now a postdoctoral researcher at Columbia University, suggests an array of inexpensive sensors and plastic lenses that cost a few dollars each may someday replace traditional telephoto lenses that cost more than $100,000. “We should be able to capture that exact same performance but at orders-of-magnitude lower cost,” he says. 
Such an array would eliminate the need for a moving camera and capture all the data at once, “or as close to that as possible,” Cossairt says. “We want to push this to where we can do things dynamically. That’s what is really unique: There’s an avenue toward real-time, high-resolution capture using this synthetic aperture approach.” Veeraraghavan says SAVI leans on work from Caltech and the University of California, Berkeley, which developed the Fourier ptychography technique that allows microscopes to resolve images beyond the physical limitations of their optics. The SAVI team’s breakthrough was the discovery that it could put the light source on the same side as the camera rather than behind the target, as in transmission microscopy, Cossairt says. “We started by making a larger version of their microscope, but SAVI has additional technical challenges. Solving those is what this paper is about,” Veeraraghavan says. The National Science Foundation, the Office of Naval Research, and a Northwestern University McCormick Catalyst grant supported the research.
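The scaling Veeraraghavan describes can be made concrete with a quick back-of-the-envelope calculation. The sketch below is not code from the SAVI project; the wavelength, standoff distance, and aperture sizes are illustrative assumptions. It uses the standard Rayleigh criterion to estimate the smallest feature a conventional, diffraction-limited aperture can resolve at a given distance.

```python
# Illustrative sketch only: a Rayleigh-criterion estimate of the smallest
# feature a conventional, diffraction-limited aperture can resolve at a
# given distance.  None of these numbers are SAVI specifications; they are
# assumptions chosen to show how resolution scales with aperture size.

def smallest_resolvable_feature(wavelength_m, distance_m, aperture_m):
    """Approximate smallest resolvable feature (in meters) at a given
    distance, using the Rayleigh criterion:
    1.22 * wavelength * distance / aperture diameter."""
    return 1.22 * wavelength_m * distance_m / aperture_m

wavelength = 532e-9   # assumed green laser illumination, 532 nm
distance = 1.0        # assumed 1 m standoff to the illuminated spot

for aperture in (0.005, 0.05, 0.15):   # 5 mm, 5 cm, and 15 cm apertures
    feature = smallest_resolvable_feature(wavelength, distance, aperture)
    print(f"aperture {aperture * 100:5.1f} cm -> ~{feature * 1e6:6.1f} microns")
```

Halving the aperture roughly doubles the smallest resolvable feature, which is why reaching micron-scale detail at a distance normally requires impractically large optics, and why SAVI instead synthesizes a large effective aperture computationally.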


News Article | April 19, 2017
Site: cen.acs.org

A new method expands the number of molecules that can be imaged simultaneously in biological samples. Wei Min and coworkers at Columbia University demonstrate that they could resolve 24 dye-labeled molecules at a time—20 with a new version of stimulated Raman scattering (SRS) microscopy called electronic preresonance SRS and four with fluorescence microscopy (Nature 2017, DOI: 10.1038/nature22051).


WASHINGTON, DC / ACCESSWIRE / April 22, 2017 / Distinguished leader in digital marketing services, CEO of DC-based RedPeg Marketing, Brad Nierenberg says that a well-defined and positive corporate culture is critical for employee retention as well as an organization's overall productivity and greater gains from its business operations. As the founder of the experiential ad agency, Nierenberg has realized that the individual experiences of employees have a much more significant impact on morale than the typical business-offered "morale boosters". The creative marketing expert recently shed light on the three most important aspects of creating a meaningful, productive, and thought-provoking work environment. In 2012, Columbia University students conducted a study on the effects a company’s atmosphere has on its turnover rates. They came to the conclusion that employees were 45% more likely to leave a job with poor culture than one that provided positive experiences. The results also showed that employers who cultivate a rich work setting often enjoy retention rates as high as 77%. The first step, Brad Nierenberg states, is to welcome creativity. At RedPeg, Nierenberg takes the open-door policy a step further. To his managers he asks, "Is your door really open? Are you truly encouraging fresh thinking?" If ideas are immediately rejected or disregarded by superiors, employees feel their professional growth is being stunted, and the business in turn suffers when they refrain from contributing again. In order to achieve company-wide growth, the creative process demands that new concepts and solutions be embraced and encouraged, not discouraged. Industry powerhouses like Google and 3M are a direct result of innovation from within. Google actually encourages staff to dedicate one day a week to creating new concepts. 3M's most successful product, the Post-it Note, was completely designed by employees. Managers must be able to accurately identify a person's strengths and weaknesses, and create tasks that allow them to maximize the impact of their talents. A recent poll by Gallup showed that putting these techniques into practice will result in employees being 7.8% more productive and six times more likely to be engaged in their job. When a team is assembled to take full advantage of its strengths, its work output is increased by 12.5%. In other words, more is being accomplished and those completing the jobs are also receiving immense satisfaction doing the work. Finally, and most importantly, says Nierenberg, officemates must feel like they are surrounded by a second family. To accomplish this, he curates experiences that build deeper bonds between his employees. Each week he sends three employees from different departments out to lunch. Called the Three Amigos, the practice helps team members get to know each other in ways that may not happen in the office. Nierenberg has also rented a beach house during the summer and sent different sets of employees there. The resulting bonds are stronger than what could be achieved in the day-to-day work back at the office. Inevitably, conflicts will occur, but those moments of uncertainty should be used as a chance to work towards something greater, together. High levels of respect, trust, and comfort will make the countless hours of working closely together a rewarding experience. Treating employees as valued contributors helps employers to continually improve and adapt the workplace to best optimize culture and productivity. 
Brad Nierenberg is an entrepreneur and the President & CEO of RedPeg Marketing. With over 20 years of industry experience, Nierenberg has won several awards for his contributions to the field of experiential marketing. A nationally sought-after speaker, he has also had his written work featured in The Wall Street Journal, The Washington Post, and Inc. Magazine, along with many other prestigious business publications.


News Article | May 5, 2017
Site: www.eurekalert.org

New York, NY, May 5, 2017 - An international group of experts has concluded that, for patients with schizophrenia and related psychotic disorders, antipsychotic medications do not have negative long-term effects on patients' outcomes or the brain. In addition, the benefits of these medications are much greater than their potential side effects. These findings, by Jeffrey Lieberman, MD, Lawrence C. Kolb Professor and Chairman of Psychiatry at Columbia University College of Physicians and Surgeons and Director of the New York State Psychiatric Institute, and colleagues from institutions in the United States, Germany, The Netherlands, Austria, Japan, and China, were published today in the American Journal of Psychiatry. Nearly seven million Americans take antipsychotic medications for the treatment of schizophrenia and related conditions. The medications are prescribed to alleviate the symptoms of psychosis and, longer term, to prevent relapse. In recent years, however, concerns have been raised that these medications could have toxic effects and negatively impact long-term outcomes. This view, if not justified by data, has the potential to mislead some patients (and their families) to refuse or discontinue antipsychotic treatment. For this reason, the researchers undertook a comprehensive examination of clinical and basic research studies that examined the effects of antipsychotic drug treatment on the clinical outcomes of patients and changes in brain structure. "The evidence from randomized clinical trials and neuroimaging studies overwhelmingly suggests that the majority of patients with schizophrenia benefit from antipsychotic treatment, both in the initial presentation of the disease and for longer-term maintenance to prevent relapse," said Dr. Lieberman. Moreover, whatever side effects these medications might cause are greatly outweighed by their therapeutic benefits. "Anyone who doubts this conclusion should talk with people whose symptoms have been relieved by treatment and literally given back their lives," Lieberman added. The studies also revealed that delaying or withholding treatment has been associated with poorer long-term outcomes. "While a minority of patients who recover from an initial psychotic episode may maintain their remission without antipsychotic treatment, there is currently no clinical biomarker to identify them, and it is a very small number of patients who may fall into this subgroup," said Dr. Lieberman. "Consequently, withholding treatment could be detrimental for most patients with schizophrenia." And while preclinical studies in rodents suggested that antipsychotic medications can sensitize dopamine receptors, there is no evidence that antipsychotic treatment increases the risk of relapse. While antipsychotic medications can increase the risk for metabolic syndrome, which is linked to heart disease, diabetes, and stroke, the study did not include a risk-benefit analysis. "While more research is needed to address these questions, the strong evidence supporting the benefits of antipsychotic medications should be made clear to patients and their families, while at the same time they should be used judiciously," said Dr. Lieberman. The paper is entitled, "The Long-Term Effects of Antipsychotic Medication on Clinical Course in Schizophrenia." 
The authors are Donald Goff, MD (New York University School of Medicine, New York, NY), Peter Falkai, MD, PhD (Ludwig-Maximilians-University Munich, Germany), Wolfgang Fleischhacker, MD (Medical University of Innsbruck, Austria), Ragy Girgis, MD (Columbia University Medical Center), Rene M. Kahn, MD, PhD (University Medical Center Utrecht, The Netherlands), Hiroyuki Uchida, MD, PhD (Keio University, Tokyo, Japan), Jingping Zhao, MD, PhD (Central South University, Changsha, China), and Jeffrey Lieberman, MD (Columbia University Medical Center and New York State Psychiatric Institute). Dr. Goff has received research support from Avanir Pharmaceuticals, the National Institute of Mental Health, and the Stanley Medical Research Institute. Dr. Fleischhacker has received research support from Boehringer-Ingelheim, Janssen, Lundbeck, and Otsuka; he has received honoraria for serving as a consultant to and/or on advisory boards for Allergan, Dainippon-Sumitomo, Gedeon Richter, Janssen, Lundbeck, Otsuka, Takeda, and Teva; and he has received speaker's fees and travel support from AOP Orphan, Dainippon Sumitomo, Gedeon Richter, Janssen, Lundbeck, Pfizer, Otsuka, and Teva. Dr. Girgis receives research support from Allergan, BioAdvantex, Genentech, and Otsuka. Dr. Kahn has received consulting fees from Alkermes, Forrest, Forum, Gedeon Richter, Janssen-Cilag, Minerva Neurosciences, and Sunovion and speaker's fees from Janssen-Cilag and Lilly. Dr. Uchida has received grants from Astellas Pharmaceutical, Dainippon Sumitomo Pharma, Eisai, Eli Lilly, Meiji-Seika Pharmaceutical, Mochida Pharmaceutical, Novartis, Otsuka Pharmaceutical, and Shionogi; speaker's honoraria from Dainippon-Sumitomo Pharma, Eli Lilly, Janssen Pharmaceutical, Meiji-Seika Pharma, MSD, Otsuka Pharmaceutical, Pfizer, Shionogi, and Yoshitomi Yakuhin; and advisory panel payments from Dainippon-Sumitomo Pharma. All other authors report no financial relationships with commercial interests. New York State Psychiatric Institute and Columbia University Department of Psychiatry (NYSPI/Columbia Psychiatry). New York State Psychiatric Institute (founded in 1896) and the Columbia University Department of Psychiatry have been closely affiliated since 1925. Their co-location in a New York State facility on the New York-Presbyterian/Columbia University Medical Center campus provides the setting for a rich and productive collaborative relationship among scientists and physicians in a variety of disciplines. NYSPI/Columbia Psychiatry is ranked among the best departments and psychiatric research facilities in the nation and has contributed greatly to the understanding of and current treatment for psychiatric disorders. The Department and Institute are home to distinguished clinicians and researchers noted for their clinical and research advances in the diagnosis and treatment of depression, suicide, schizophrenia, bipolar and anxiety disorders and childhood psychiatric disorders. Their combined expertise provides state-of-the-art clinical care for patients, and training for the next generation of psychiatrists and psychiatric researchers. Columbia University Medical Center provides international leadership in basic, preclinical, and clinical research; medical and health sciences education; and patient care. 
The medical center trains future leaders and includes the dedicated work of many physicians, scientists, public health professionals, dentists, and nurses at the College of Physicians and Surgeons, the Mailman School of Public Health, the College of Dental Medicine, the School of Nursing, the biomedical departments of the Graduate School of Arts and Sciences, and allied research centers and institutions. Columbia University Medical Center is home to the largest medical research enterprise in New York City and State and one of the largest faculty medical practices in the Northeast. The campus that Columbia University Medical Center shares with its hospital partner, NewYork-Presbyterian, is now called the Columbia University Irving Medical Center. For more information, visit cumc.columbia.edu or columbiadoctors.org.


News Article | May 4, 2017
Site: www.greentechmedia.com

MIT Technology Review: Here’s Why Trump’s Plan to Save the Coal Industry Is Doomed Donald Trump’s policy efforts to rejuvenate the coal industry are unlikely to succeed. We’ve argued that for some time, of course. But a new report from Columbia University, which shows that regulations have played only a small part in the decline of the coal industry to date, lends extra weight to the thesis. If you’ve not been paying attention, coal has been taking a tumble recently. In America, its use fell by 30 percent between 2011 and 2016. The new analysis of what accounts for that slump makes for interesting reading: The Columbia team attributes around half of coal’s decline to the affordability of natural gas, 26 percent to reduced electricity demand, and 18 percent to surging renewables. The Trump administration has strenuously argued that President Obama introduced rules that placed unnecessary burdens on the burning of coal. The study does indeed identify 10 regulations introduced under the Obama administration -- from the notorious Clean Power Plan to more obscure Effluent Guidelines -- that have dampened the sector. But it also finds that they would account for just a 3.5 percent decline in coal. The Post & Courier: S.C. Congressional Delegation Loses Fight to Get Nuclear Tax Credit in Government Spending Bill The budget agreement worked out in Congress has disappointed every member of the South Carolina delegation after a highly desired nuclear power plant tax credit was left out. Excluded from the plan that's supposed to keep the government running through September is a provision extending the deadline for nuclear power plants to take advantage of the tax bonus, threatening to undermine a major economic driver in the state. At issue is a credit Congress created in 2005 to incentivize nuclear power production. But it gave plants a 2020 deadline to complete their work in order to qualify. The Star: Technology Guru Bill Joy Is Betting on a Bulletproof Battery Bill Joy, the Silicon Valley guru and Sun Microsystems Inc. co-founder, sees the future of energy in a battery that can take a bullet. The venture capitalist formerly with Kleiner Perkins Caufield & Byers LLC is now dedicating most of his time to Ionic Materials Inc., a Woburn, Massachusetts-based startup developing lithium batteries that won’t burst into flames. They’re strong enough to withstand being pierced by nails and even getting shot, as the company demonstrates in a promotional video. The effort is part of a global race to devise better storage systems for handheld devices, cars, trucks and electrical grids. The problem is that conventional lithium-ion batteries contain liquid electrolytes that wear out quickly and have a nasty habit of spontaneously combusting, sometimes aboard jetliners. Ionic Materials says it’s solved those problems by crafting batteries from a solid plastic-like material. “If you can make the battery out of a solid, these problems essentially disappear,” Joy said in a phone interview, speaking from a boat in the South Pacific. “It’s really a breakthrough in cost, safety and performance.” Environmentalists sued the Trump administration on Wednesday over an executive order aimed at increasing access to offshore oil and gas production that could include areas where former President Barack Obama banned drilling. Ten environmental organizations filed the lawsuit with the U.S. 
District Court for the District of Alaska, arguing President Donald Trump does not have the legal authority to undo Obama’s indefinite ban on offshore drilling in much of the Arctic Ocean and portions of the Atlantic Ocean. Trump signed an executive order on April 28 calling on the Department of the Interior to review the Obama administration’s plan for offshore drilling leases from 2017 to 2022, which the groups anticipate will lead to leases in areas restricted by Obama’s ban. HuffPost: Only 2 Countries Aren’t Part of the Paris Agreement. Will the U.S. Be the Third? Just two countries refused to partake in the Paris Agreement, the historic climate deal to cut greenhouse gas emissions that was signed by nearly every nation. One of them, Syria, is in ruins after six years of ongoing civil war. The other, Nicaragua, boycotted the accord to protest its unambitious initial goals and its failure to legally bind countries to their emissions targets. A third holdout, Uzbekistan, finally signed on to the agreement last month. But now President Donald Trump is poised to withdraw the United States -- which was largely responsible, under the Obama administration, for orchestrating the deal -- from the Paris Agreement. In 2016, Trump campaigned on a promise to “cancel” the deal, which he said put an unfair burden on the U.S. and gave poorer countries a pass. The U.S., second only to China in its amount of carbon pollution, committed to slashing emissions by 26 to 28 percent below 2005 levels by 2025.


News Article | May 8, 2017
Site: www.scientificamerican.com

Eons ago Earth experienced a wild transformation: it turned into a giant snowball. These massive glaciation events, where ice encased the planet from pole to pole, are fittingly named “snowball Earth.” There were at least two occurrences: one around 717 million and another some 645 million years ago. Although geologists have good evidence Earth experienced these snowball events, they still cannot figure out how they happened. Scientists have debated for decades over what set off the most profound climatic changes in the planet’s geologic record. Now researchers at Harvard University have a new idea that may finally provide an answer: They say volcanic regions, located in the right place at the right time, may have triggered at least one of these giant glaciation events. If you traveled back in time to Earth about 700 million years ago, you would have found ice hundreds of meters thick covering the oceans and continents, although the land masses may have also had some bare, dry areas dotted with ice-covered hypersaline lakes. The average global temperature fell to around minus 37 degrees Fahrenheit. The snowball-like Earth was largely uninhabitable. Thankfully, these apocalyptic glacial periods happen rarely—but that fact also makes it hard for scientists to determine how such an extreme climate formed. “The further we go back in time, the more Earth resembles a world very different from the one we live on today,” explains Linda Sohl, a paleoclimatologist at Columbia University's Center for Climate Systems Research and NASA's Goddard Institute for Space Studies. “So we can’t readily interpret the past based on our knowledge of the present.” Researchers have proposed a host of ideas about what sparked snowball Earths. The cause—whatever it was—had to cool the planet so that enough ice formed to reflect much of the sun’s incoming energy, creating a runaway cooling effect. One hypothesis suggests a large meteorite hit the planet and threw up so much dust and ash into the air it reduced the incoming solar radiation for a couple of years and chilled the planet. Other ideas involve similar types of brief but catastrophic events, such as a gigantic volcanic eruption. Yet another hypothesis proposes some kind of organism evolved that could remove a large amount of carbon from the surface of the ocean, burying it in deep sediments when those organisms died and settled on the ocean floor; that mechanism would theoretically have kept enough carbon out of the atmosphere to cause runaway cooling. None of these ideas have much—if any—physical evidence to back them up, however. One of the most popular ideas focuses on weathering, a natural process that captures and stores carbon via the chemical breakdown of rocks. When the supercontinent Rodinia broke up around 750 million years ago, the new, smaller continents scattered to locations around the equator where it was warm and wet—prime conditions for weathering. In addition, large volcanic regions, which would have been extremely vulnerable to weathering, would have emerged as the giant landmass fragmented. The problem: weathering works incredibly slowly—the process is constantly happening but it affects the global climate on a million-year time scale. Earth’s climate system usually self-corrects in that amount of time. Plus, the greater volcanic activity would have released carbon dioxide, making it even harder to push Earth into a snowball state. 
This supercontinent breakup scenario could have caused a runaway cooling effect only if weathering outpaced other feedbacks in the climate system, explains Francis Macdonald, an associate professor of geology at Harvard. Because none of the ideas is completely satisfactory, Macdonald and colleague Robin Wordsworth, an assistant professor of environmental science and engineering, set out to find another explanation. In 2010 Macdonald published a paper that, for the first time, pinned down the precise date when the Sturtian glaciation—the first of the two snowball Earths—began. “We could suddenly say within a few hundred thousand years when this event actually occurred,” Macdonald explains. “Before, it had only been known within tens of millions of years.” He discovered the Sturtian glaciation started around 717 million years ago. Around the same time, Macdonald dated a volcanic region, called the Franklin Large Igneous Province (LIP). He discovered the Franklin LIP became active close to when the first snowball Earth event began. “I started thinking: How could these be so coincident? How might they be related?” he says. Armed with this new information, Macdonald and Wordsworth used a combination of geologic evidence and modeling to test whether the Franklin LIP could be the culprit. In a new study, published in February in Geophysical Research Letters, they show the Franklin LIP’s volcanic activity could have caused extreme climate cooling. That is because of a unique combination of factors: First, the Franklin LIP formed in an area rich in sulfur; as it erupted, large plumes of hot gas and dust would have lofted sulfur particles kilometers into the air. Sulfur particles block the incoming sun and also keep heat from escaping Earth, which can create either a warming or cooling effect, depending on the location. That’s why the next piece of physical evidence is key—geologic records show the Franklin LIP sat at the equator where Earth receives more solar energy than the amount of heat it radiates back out to space. According to the researchers’ model, if enough sulfur particles reached high enough into the atmosphere at this equatorial location, it would block enough of the sun’s incoming energy to trigger runaway cooling. The sulfur aerosols would have spread over the planet as well via mixing that occurs in the stratosphere, but the equatorial region would have the greatest density of sulfur particles, severely blocking the sun. The eruptions would have needed to blast sulfur into the atmosphere for about five years to push Earth into a snowball state. Such a scenario would also require a relatively cool Earth ahead of time. Macdonald says that is because sulfur particles need to reach the altitude of the stratosphere to have maximum cooling effect. In a colder climate the stratosphere settles a little closer to Earth’s surface, making it possible for the sulfur-rich plumes of hot air to reach it. Although scientists have not determined exactly what the climate was like prior to snowball Earth, this new hypothesis is appealing, Macdonald says. “It provides a positive feedback mechanism. As you start cooling, then it gets easier and easier to put more sulfur aerosols up there, then Earth cools more, and so on,” he explains. This process would potentially happen so fast that it would overwhelm other climate feedbacks that might make the planet warmer. Other experts find Macdonald and Wordsworth's idea compelling. 
“I would say it’s probably the best idea we have, because it’s actually based on observations,” says Joseph Kirschvink, a geobiologist at the California Institute of Technology, who coined the term “snowball Earth.” Paul Hoffman, an emeritus professor of geology at Harvard, says the timing between the sulfur-rich Franklin volcanism and snowball Earth makes it an attractive explanation. But “it could just be a coincidence with no relation,” he explains. Linda Sohl says the pair have come up with an intriguing hypothesis, although she also says, “Does it explain all snowball events in Earth’s history? Almost certainly not.” Hoffman also points out the researchers’ idea does not explain the second snowball event that came soon after the first, called the Marinoan glaciation. “I think that’s the weakest point in the idea,” he says. “So far as we know, there’s no large [volcanic regions] associated with the onset of the second.” Macdonald says there could have been one but that geologic evidence becomes patchy that far back in time. Macdonald himself is not convinced his and Wordsworth’s version of events is what actually occurred 717 million years ago. “We’re not saying this had to happen, just that it’s feasible and it’s a pretty impressive coincidence,” he explains. Along with this new idea, Macdonald expresses a note of caution to people who have proposed geoengineering projects using sulfur aerosols to combat global warming. “It’s a little frightening if we want to play with these particles, to know they may have caused major climate change in the past,” he says. “On the other hand, we’re already geoengineering with carbon dioxide. The cat’s already out of the bag.”
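The runaway ice-albedo feedback at the heart of this scenario can be illustrated with a textbook-style toy model. The sketch below is a standard zero-dimensional energy-balance calculation, not the model Macdonald and Wordsworth used, and every parameter value in it is an assumption chosen only to show the qualitative behavior: dimming the incoming sunlight for a while (a stand-in for sulfur aerosols) tips the planet into a frozen state that persists even after full sunlight returns.

```python
# Toy zero-dimensional energy-balance model (Budyko/Sellers style) used only
# to illustrate the runaway ice-albedo feedback described in the article.
# This is NOT the Macdonald/Wordsworth model; all parameter values are
# assumptions chosen to make the qualitative behavior visible.
SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1360.0            # assumed solar constant, W m^-2
EMISSIVITY = 0.57      # crude stand-in for the greenhouse effect (assumed)

def albedo(temp_k):
    """Planetary albedo rises as the planet ices over (assumed linear ramp)."""
    if temp_k >= 290.0:
        return 0.30                      # largely ice-free planet
    if temp_k <= 260.0:
        return 0.60                      # snowball: ice reflects most sunlight
    return 0.30 + 0.30 * (290.0 - temp_k) / 30.0

def equilibrium_temp(solar, start_temp, steps=20000, dt=3.0e5):
    """Step absorbed-minus-emitted energy forward in time until it balances."""
    heat_capacity = 4.0e8                # J m^-2 K^-1, assumed mixed-layer ocean
    temp = start_temp
    for _ in range(steps):
        absorbed = solar * (1.0 - albedo(temp)) / 4.0
        emitted = EMISSIVITY * SIGMA * temp ** 4
        temp += dt * (absorbed - emitted) / heat_capacity
    return temp

# Dimming the sun by 10 percent (a stand-in for equatorial sulfur aerosols)
# triggers the runaway: once ice spreads, the higher albedo keeps the planet
# frozen even when full sunlight returns.
print("dimmed sunlight, warm start:", round(equilibrium_temp(0.9 * S0, 288.0), 1), "K")
print("full sunlight, warm start:  ", round(equilibrium_temp(S0, 288.0), 1), "K")
print("full sunlight, cold start:  ", round(equilibrium_temp(S0, 230.0), 1), "K")
```

The cold-start case illustrates the hysteresis that makes a snowball state self-sustaining once ice covers the planet, consistent with the idea that a relatively brief trigger could have pushed Earth into a glaciation lasting millions of years.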


April 24, 2017 -- In newly updated clinical guidelines from the Society for Integrative Oncology (SIO), researchers at Columbia University's Mailman School of Public Health and the Herbert Irving Comprehensive Cancer Center, together with an interdisciplinary team of colleagues at MD Anderson Cancer Center, University of Michigan, Memorial Sloan Kettering Cancer Center, and other institutions in the U.S. and Canada, analyzed which integrative treatments are most effective and safe for patients with breast cancer. This systematic review adds to the growing literature on integrative therapies for patients with breast cancer and other cancer populations. The latest results are published online and in print in CA: A Cancer Journal for Clinicians, a publication of the American Cancer Society. The researchers evaluated more than 80 different therapies and developed grades of evidence. Based on those findings, the Society for Integrative Oncology issued a series of graded recommendations. "Studies show that up to 80 percent of people with a history of cancer use one or more complementary and integrative therapies, but until recently, evidence supporting the use of many of these therapies had been limited," said Heather Greenlee, ND, PhD, assistant professor of Epidemiology at Columbia University's Mailman School of Public Health, and past president of SIO. "Our goal is to provide clinicians and patients with practical information and tools to make informed decisions on whether and how to use a specific integrative therapy for a specific clinical application during and after breast cancer treatment," Greenlee continues. In their systematic evaluation of peer-reviewed randomized clinical trials, the researchers assigned letter grades to therapies based on the strength of evidence. A letter grade of "A" indicates that a specific therapy is recommended for a particular clinical indication, and there is high certainty of substantial benefit for the patient. Meditation had the strongest evidence supporting its use, and is recommended for reducing anxiety, treating symptoms of depression, and improving quality of life, based on results from five trials. Music therapy, yoga, and massage received a B grade for the same symptoms. Yoga also received a B grade for improving quality of life based on two recent trials, and yoga and hypnosis received a C for fatigue. "The routine use of yoga, meditation, relaxation techniques, and passive music therapy to address common mental health concerns among patients with breast cancer is supported by high levels of evidence," said Debu Tripathy, MD, chair of Breast Oncology at The University of Texas MD Anderson Cancer Center, and a past president of SIO. "Given the indication of benefit coupled with the relatively low level of risk, these therapies can be offered as a routine part of patient care, especially when symptoms are not well controlled." Acupressure and acupuncture received a B grade as an addition to drugs used for reducing chemotherapy-induced nausea and vomiting. In general, there was a lack of strong evidence supporting the use of ingested dietary supplements and botanical natural products as part of supportive cancer care and to manage treatment-related side effects.
"Clinicians and patients need to be cautious about using therapies that received a grade of C or D and fully understand the potential risks of not using a conventional therapy that may effectively treat cancer or help manage side effects associated with cancer treatment," warned Lynda Balneaves, RN, PhD, associate professor, College of Nursing, Rady Faculty of Health Sciences, Winnipeg, Canada, and president-elect of SIO. "Patients are using many forms of integrative therapies with little or no supporting evidence and that remain understudied," noted Dr. Greenlee. "This paper serves as a call for further research to support patients and healthcare providers in making more informed decisions that achieve meaningful clinical results and avoid harm." Additional co-authors: Melissa J. DuPont-Reyes, Department of Epidemiology, Mailman School of Public Health, Columbia University; Linda Carlson, Department of Oncology, University of Calgary, Calgary, AB, Canada; Misha Cohen, American College of Traditional Chinese Medicine at California Institute of Integral Studies, and Chicken Soup Chinese Medicine, San Francisco; Gary Deng, Integrative Oncology, Memorial Sloan Kettering Cancer Center, New York City; Jillian A. Johnson, Department of Biobehavioral Health, The Pennsylvania State University, University Park; Matthew Mumber, Department of Radiation Oncology,Harbin Clinic, Rome, GA; Dugald Seely, Ottawa Integrative Cancer Center, Ottawa, ON, and Canadian College of Naturopathic Medicine, Toronto, ON; Suzanna M. Zick, Department of Family Medicine, University of Michigan Health System, and Department of Nutritional Sciences, School of Public Health, University of Michigan, Ann Arbor; and Lindsay M. Boyce, Memorial Sloan Kettering Library, Memorial Sloan Kettering Cancer Center, New York City. Founded in 1922, Columbia University's Mailman School of Public Health pursues an agenda of research, education, and service to address the critical and complex public health issues affecting New Yorkers, the nation and the world. The Mailman School is the third largest recipient of NIH grants among schools of public health. Its over 450 multi-disciplinary faculty members work in more than 100 countries around the world, addressing such issues as preventing infectious and chronic diseases, environmental health, maternal and child health, health policy, climate change & health, and public health preparedness. It is a leader in public health education with over 1,300 graduate students from more than 40 nations pursuing a variety of master's and doctoral degree programs. The Mailman School is also home to numerous world-renowned research centers including ICAP (formerly the International Center for AIDS Care and Treatment Programs) and the Center for Infection and Immunity. For more information, please visit http://www. .


News Article | May 4, 2017
Site: www.eurekalert.org

From prenatal genetic screening to the genetic testing of women with family histories of breast cancer, genomics is rapidly becoming a fixture in our lives. The National Human Genome Research Institute (NHGRI) has, since its founding, sponsored research into the ethical, legal and social implications (ELSI) of genomics to understand the profound societal and personal effects of technological advances in genomics. Genomics and Society: Expanding the ELSI Universe, a three-day conference on the myriad issues that spring from the ethical, legal and social implications of genomic research, will be hosted on June 5 - 7, 2017 by The Jackson Laboratory for Genomic Medicine and UConn Health in Farmington, Connecticut. The conference is funded by NHGRI through a grant to Columbia University Medical Center (CUMC). The latest research on ELSI topics will be presented by physicians, geneticists, genetic counselors, social scientists and lawyers in academia, government and industry from around the world. From renowned researchers at the top of their fields to students and early career scientists bringing new insight and perspectives, the depth and range of expertise at the conference promise fascinating debates over new and emerging data. Keynote speaker Eric Dishman, director of the National Institutes of Health (NIH) Precision Medicine Initiative's All of Us Research Program, will kick off the meeting with a discussion of the need for continuous innovation to address the quickly evolving genomic landscape. The full program presents new research in more than 150 expert panel discussions, individual paper and poster presentations, and workshops on topics ranging from the implications of genetic testing in the criminal justice system to the uses and potential misuses of CRISPR - the very latest in genetic manipulation. "We are honored to play a part in bringing together world-renowned experts to discuss the most pressing issues at the crossroads of genomics and ethics," says Dr. Charles Lee, Scientific Director of The Jackson Laboratory for Genomic Medicine. "As our society rapidly embraces precision medicine, we have an obligation to provide both ethical and scientific guidance to researchers in the lab, clinicians on the frontlines, and patients making important decisions about their health." "The increase in genomic testing and technology are fueling breakthrough discoveries here in Connecticut and around the globe for heart disease, cancer and a host of rare diseases," says Dr. Bruce T. Liang, Dean of UConn School of Medicine. "However, these promising personalized medicine therapies and our greater genetic knowledge may also come with a steep societal price if we don't address the associated concerns in a timely fashion." "NHGRI has been the major funder of ELSI research since the beginning of the Human Genome Project," said Lawrence Brody, Ph.D., NHGRI's Director of the Division of Genomics and Society. "Our aim is to support research that anticipates and addresses the societal impact of genomic science." "NHGRI has also played a major role in bringing the results of ELSI research into the public forum," noted Dr.
Paul Appelbaum, professor at CUMC and director of the Columbia University Center for Research on the Ethical, Legal and Social Implications of Psychiatric, Neurologic and Behavioral Genetics. "As genomic technology expands by leaps and bounds, new ethical issues of consent, of commercialization, of justice and access to genomic medicine confound scientists and the public alike. ELSI research is seeking evidence-based answers to these novel questions. This conference aims to share the latest ELSI research as widely as possible." For more information, visit the ELSI Congress website at: http://www. . Follow #ELSICON for news updates before and during the conference. The conference is funded by NHGRI grant P50 HG007257 04S1. UConn Health is Connecticut's only public academic medical center. Based on a 206-acre campus in Farmington, UConn Health has a three-part mission: research, teaching and patient care. It is home to the UConn School of Medicine, the School of Dental Medicine and UConn John Dempsey Hospital, with over 5,500 employees supporting nearly 1,000 students, over 600,000 annual patient visits, and innovative scientific research contributing to the advancement of medicine. For more information, visit health.uconn.edu. The Jackson Laboratory is an independent, nonprofit biomedical research institution based in Bar Harbor, Maine, with a National Cancer Institute-designated Cancer Center, a facility in Sacramento, Calif., and a genomic medicine institute in Farmington, Conn. It employs 1,800 staff, and its mission is to discover precise genomic solutions for disease and empower the global biomedical community in the shared quest to improve human health. For more information, please visit http://www. . Columbia University Medical Center provides international leadership in basic, preclinical, and clinical research; medical and health sciences education; and patient care. The medical center trains future leaders and includes the dedicated work of many physicians, scientists, public health professionals, dentists, and nurses at the College of Physicians and Surgeons, the Mailman School of Public Health, the College of Dental Medicine, the School of Nursing, the biomedical departments of the Graduate School of Arts and Sciences, and allied research centers and institutions. Columbia University Medical Center is home to the largest medical research enterprise in New York City and State and one of the largest faculty medical practices in the Northeast. The campus that Columbia University Medical Center shares with its hospital partner, NewYork-Presbyterian, is now called the Columbia University Irving Medical Center. For more information, visit cumc.columbia.edu or columbiadoctors.org. The National Human Genome Research Institute's (NHGRI) Ethical, Legal and Social Implications (ELSI) Research Program was established in 1990 as an integral part of the Human Genome Project (HGP) to foster basic and applied research on the ethical, legal and social implications of genetic and genomic research for individuals, families and communities. The ELSI Research Program funds and manages studies, and supports workshops, research consortia and policy conferences related to these topics.


April 27, 2017 (New York) -- Children in low-income families have an increased chance of thriving when their caregiver relationships include certain positive characteristics, according to new research from the National Center for Children in Poverty (NCCP) at Columbia University's Mailman School of Public Health. Using data from more than 2,200 low-income families surveyed as part of the Fragile Families and Child Wellbeing Study, NCCP researchers found that school-age children who reported high levels of parent involvement and supervision were more likely to report behaviors associated with positive emotional development and social growth. According to Strong at the Broken Places: The Resiliency of Low-Income Parents, an estimated 14 million families with at least one child earned below 200 percent of the poverty threshold in 2015 - a total of 65 percent of low-income families. Research has found that living in poverty can produce environmental stressors that lead to negative behaviors in children, such as inattention, impulsivity, aggression, withdrawal, depression, anxiety, or fearfulness. Furthermore, children living in poor families are significantly more likely to have trouble developing social-emotional competence -- the ability to manage emotions, express needs and feelings, deal with conflict, and get along with others. "Too often, when poor families are discussed, the focus is on deficits," said Renée Wilson-Simmons, DrPH, NCCP director and a co-author of the report. "And chief among those deficits is what's seen as parents' inability to successfully parent their children." Dr. Wilson-Simmons challenged the deficits focus, adding that despite the multitude of obstacles that low-income parents face, many of them succeed in helping their children flourish. "They raise children who possess the social-emotional competence needed to develop and keep friendships; establish good relationships with parents, teachers, and other adults; and experience a range of achievements that contribute to their self-confidence, self-esteem, and self-efficacy. These families have something to teach us all about thriving amidst adversity." Available online at http://www. , Strong at the Broken Places presents findings from the survey responses of 2,210 nine-year-olds who lived in low-income families for three to five years. The report also cites additional research involving low-income families from diverse backgrounds and geographic areas showing certain common attributes among parents who are able to function well when faced with challenges. Those effective protective factors range from exhibiting a positive outlook, establishing family routines, and spending sufficient family time together to having good financial management skills, an adequate support network, and the willingness to seek help. 
The major finding presented in the report is that low-income parents who provide their children with warmth and nurturance as well as rules and consequences are helping them develop both socially and emotionally in ways that will serve them well as they move from childhood to adolescence to young adulthood. Overall, most of the nine-year-olds surveyed rated their caregivers highly on all of the factors NCCP researchers used to measure resiliency in low-income families. "The good news is that parents who struggle financially are still finding ways to have the kinds of interactions with their children that help them to develop socially and emotionally, despite the many external stressors competing for their attention," said co-author Yang Jiang, PhD, who led data analysis. "Since we know that children do better when their families do better, it's important that advocates and policymakers bolster families' efforts by supporting policies and programs that help parents develop strong connections with their children." To promote family resiliency, NCCP researchers also recommended two-generation approaches that enhance the well-being and life opportunities of both parents and their children. The policy strategies outlined in Strong at the Broken Places are designed to help stabilize low-income households so that parents are better able to engage with their children. To speak with an NCCP expert about Strong at the Broken Places, contact Tiffany Thomas Smith, communications/media relations consultant for the National Center for Children in Poverty, at 443-986-5621 / TiffanyTSmith@nccp.org. Founded in 1922, Columbia University's Mailman School of Public Health pursues an agenda of research, education, and service to address the critical and complex public health issues affecting New Yorkers, the nation and the world. The Mailman School is the third largest recipient of NIH grants among schools of public health. Its over 450 multi-disciplinary faculty members work in more than 100 countries around the world, addressing such issues as preventing infectious and chronic diseases, environmental health, maternal and child health, health policy, climate change & health, and public health preparedness. It is a leader in public health education with over 1,300 graduate students from more than 40 nations pursuing a variety of master's and doctoral degree programs. The Mailman School is also home to numerous world-renowned research centers including ICAP (formerly the International Center for AIDS Care and Treatment Programs) and the Center for Infection and Immunity. For more information, please visit http://www. . Part of Columbia University's Mailman School of Public Health, the National Center for Children in Poverty (NCCP) is the nation's leading public policy center dedicated to promoting the economic security, health, and well-being of America's low-income families and children. Visit NCCP online at http://www. .


News Article | May 1, 2017
Site: www.eurekalert.org

Development assistance for health largely ignores older age groups, with 90 percent of the assistance going to people below the age of 60, according to a new study led by a researcher at the Robert N. Butler Columbia Aging Center, Mailman School of Public Health. Children below the age of 5 receive the most development assistance for health. Findings from the study, Vast Majority of Development Assistance for Health Funds Target Those Below Age Sixty, will be published online and in the May issue of the journal Health Affairs. Globally, development assistance for health amounted to $3.13 per DALY for people younger than age 60 in recipient countries, where a DALY (disability-adjusted life-year) is defined as the sum of years lived with disability and years of life lost because of premature mortality. This is in contrast to $0.91 per DALY for people aged 60 and older. The gap was even wider at the extremes of the age distribution: people ages 70 and older received only $0.80 per DALY. Funds earmarked for low- and middle-income countries to improve health have more than quadrupled since 1990, reaching $36.4 billion U.S. dollars in 2015. The researchers used publicly available data from the Institute for Health Metrics and Evaluation's Financing Global Health 2015 report and the Global Burden of Disease Study 2015. They examined 27 assistance program areas that identified the cause of disease or the type of intervention targeted for the period 1990-2013. Country- and year-specific disability-adjusted life-years were calculated for each cause. "When we compared changes in development assistance for health and DALYs from 1990 to 2013 -- a period of epidemiological and demographic change during which the disease burden shifted toward older ages -- we found that assistance was directed increasingly to children," said the study's lead author, Vegard Skirbekk, PhD, of the Columbia Aging Center and professor of Population and Family Health at the Mailman School of Public Health. For example, people younger than age five had $6.49 billion more assistance in 2013 than they had in 1990. The largest increases were for people ages 5-14 years. People in their twenties and thirties also received relatively large amounts of the spending for development assistance for health, some of it driven by HIV/AIDS funding. In 2013 the assistance benefited people younger than age five the most, with spending on this age group over three times that of any other age group. Many program areas benefit this age group, especially assistance for child health, maternal and newborn health, and malaria. "Our results revealed that development assistance for health is likely to target diseases that occur early in life," noted Skirbekk. One driver for prioritizing younger over older populations may be that children are seen as representing the future. "Another idea is that younger people--especially children--should be given priority because they are more innocent, and that health risks and diseases that affect them are hardly due to behavior for which they could be held responsible," observed Skirbekk. Co-authors: Trygve Ottersen, Norwegian Institute of Public Health and Centre for Global Health, University of Oslo; Hannah Hamavid, Nafis Sadat, and Joseph L. Dieleman, Institute for Health Metrics and Evaluation, University of Washington. The study was supported by the Robert N. Butler Columbia Aging Center at Columbia University, the Norwegian Institute of Public Health, and the Bill & Melinda Gates Foundation.
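The per-DALY figures above follow from a simple piece of accounting: each DALY is the sum of years lived with disability (YLD) and years of life lost (YLL), and assistance per DALY for an age group is the assistance earmarked for that group divided by the group's total burden. The short Python sketch below illustrates only that arithmetic; the dollar and burden figures in it are invented for the example and are not taken from the study.

# Illustrative sketch of per-DALY accounting (all figures below are made up, not from the study).
def dalys(yld: float, yll: float) -> float:
    """Disability-adjusted life-years: years lived with disability plus years of life lost."""
    return yld + yll

def assistance_per_daly(assistance_usd: float, yld: float, yll: float) -> float:
    """Development assistance for health received per DALY of burden in an age group."""
    return assistance_usd / dalys(yld, yll)

# Hypothetical age groups: (assistance in US$, YLD, YLL), all in consistent units.
groups = {
    "under 60":     (9.0e9, 1.2e9, 1.7e9),
    "60 and older": (0.9e9, 0.4e9, 0.6e9),
}
for name, (usd, yld, yll) in groups.items():
    print(f"{name}: ${assistance_per_daly(usd, yld, yll):.2f} per DALY")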
Founded in 1922, Columbia University's Mailman School of Public Health pursues an agenda of research, education, and service to address the critical and complex public health issues affecting New Yorkers, the nation and the world. The Mailman School is the third largest recipient of NIH grants among schools of public health. Its over 450 multi-disciplinary faculty members work in more than 100 countries around the world, addressing such issues as preventing infectious and chronic diseases, environmental health, maternal and child health, health policy, climate change & health, and public health preparedness. It is a leader in public health education with over 1,300 graduate students from more than 40 nations pursuing a variety of master's and doctoral degree programs. The Mailman School is also home to numerous world-renowned research centers including ICAP (formerly the International Center for AIDS Care and Treatment Programs) and the Center for Infection and Immunity. For more information, please visit http://www. .


News Article | April 26, 2017
Site: www.eurekalert.org

Illicit cannabis use and cannabis use disorders increased at a greater rate in states that passed medical marijuana laws than in other states, according to new research at Columbia University's Mailman School of Public Health and Columbia University Medical Center. The findings will be published online in JAMA Psychiatry. Laws and attitudes regarding cannabis have changed over the last 20 years. In 1991, no Americans lived in states with medical marijuana laws, while by 2012, more than one-third did, and fewer Americans viewed cannabis use as entailing any risk. The new study is among the first to analyze the differences in cannabis use and cannabis use disorders before and after states passed medical marijuana laws, as well as to differentiate between earlier and more recent periods and additionally examine selected states separately. The researchers used data from three national surveys collected from 118,497 adults: the 1991-1992 National Longitudinal Alcohol Epidemiologic Survey, the 2001-2002 National Epidemiologic Survey on Alcohol and Related Conditions and the 2012-2013 National Epidemiologic Survey on Alcohol and Related Conditions-III. Overall, between 1991-1992 and 2012-2013, illicit cannabis use increased significantly more in states that passed medical marijuana laws than in other states, as did cannabis use disorders. In particular, between 2001-2002 and 2012-2013, increases in use ranged from 3.5 percentage points in states with no medical marijuana laws to 7.0 percentage points in Colorado. Rates of increase in the prevalence of cannabis use disorder followed similar patterns. "Medical marijuana laws may benefit some with medical problems. However, changing state laws -- medical or recreational -- may also have adverse public health consequences, including cannabis use disorders," said author Deborah Hasin, PhD, associate professor of Epidemiology at the Mailman School of Public Health and in the Department of Psychiatry at Columbia University Medical Center. "A prudent interpretation of our results is that professionals and the public should be educated on risks of cannabis use and benefits of treatment, and prevention/intervention services for cannabis disorders should be provided." While illicit use of marijuana decreased and marijuana use disorder changed little between 1991-1992 and 2001-2002, both use and disorder rates increased between 2001-2002 and 2012-2013. In 1991-1992, predicted prevalences of use and disorder were higher in California than in other states with early medical marijuana laws (use: 7.6 percent vs. 4.5 percent; disorder: 2 percent vs. 1.15 percent). However, the predicted prevalence of past-year use in California did not differ significantly from states that passed laws more recently. In contrast, the prevalences of use and disorder increased in the other five states with early medical marijuana laws. "Future studies are needed to investigate mechanisms by which increased cannabis use is associated with medical marijuana laws, including increased perceived safety, availability, and generally permissive attitudes," Dr. Hasin also noted. Co-authors: Aaron Sarvet and Malka Stohl, Columbia University Medical Center; Katherine Keyes and Melanie Wall, Mailman School of Public Health; Sandro Galea, Boston University School of Public Health; and Magdalena Cerda, University of California, Davis.
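The comparison at the heart of the study is essentially a before-and-after contrast between places that did and did not pass medical marijuana laws. The Python sketch below illustrates only that difference-in-differences logic; the baseline prevalences are invented for the example, and the published analysis relied on survey weights and covariate-adjusted predicted prevalences rather than raw differences.

# Toy difference-in-differences sketch of the before/after, law vs. no-law comparison.
# Baseline values are hypothetical; only the logic of the contrast is being illustrated.
past_year_use = {
    # group: {survey wave: hypothetical prevalence of past-year cannabis use, in percent}
    "Colorado (early medical marijuana law)": {"2001-2002": 5.6, "2012-2013": 12.6},
    "states with no medical marijuana law":   {"2001-2002": 4.5, "2012-2013": 8.0},
}

def change(group: str) -> float:
    """Percentage-point change in prevalence between the two survey waves."""
    waves = past_year_use[group]
    return waves["2012-2013"] - waves["2001-2002"]

groups = list(past_year_use)
for g in groups:
    print(f"{g}: +{change(g):.1f} percentage points")
excess = change(groups[0]) - change(groups[1])
print(f"Excess increase associated with the law: {excess:.1f} percentage points")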
The study was supported by the National Institute on Alcohol Abuse and Alcoholism (grant K01AA021511), the National Institutes of Health (R01DA034244), the National Institute on Drug Abuse (grants K01AA021511, R01Dao40924-01A), and the New York State Psychiatric Institute. Founded in 1922, Columbia University's Mailman School of Public Health pursues an agenda of research, education, and service to address the critical and complex public health issues affecting New Yorkers, the nation and the world. The Mailman School is the third largest recipient of NIH grants among schools of public health. Its over 450 multi-disciplinary faculty members work in more than 100 countries around the world, addressing such issues as preventing infectious and chronic diseases, environmental health, maternal and child health, health policy, climate change & health, and public health preparedness. It is a leader in public health education with over 1,300 graduate students from more than 40 nations pursuing a variety of master's and doctoral degree programs. The Mailman School is also home to numerous world-renowned research centers including ICAP (formerly the International Center for AIDS Care and Treatment Programs) and the Center for Infection and Immunity. For more information, please visit http://www. . Columbia University Department of Psychiatry Columbia Psychiatry is ranked among the best departments and psychiatric research facilities in the nation and has contributed greatly to the understanding and treatment of psychiatric disorders. Located at the Columbia University Medical Center campus in Washington Heights, the department enjoys a rich and productive collaborative relationship with physicians in various disciplines at Columbia University's College of Physicians and Surgeons. Columbia Psychiatry is home to distinguished clinicians and researchers noted for their clinical and research advances in the diagnosis and treatment of depression, suicide, schizophrenia, bipolar and anxiety disorders, eating disorders, and childhood psychiatric disorders. Visit http://columbiapsychiatry. for more information.


News Article | April 17, 2017
Site: www.eurekalert.org

April 12, 2017 -- B vitamins can mitigate the impact of fine particle pollution on cardiovascular disease, according to new research conducted at Columbia University's Mailman School of Public Health. Healthy non-smokers who took vitamin B supplements nearly reversed any negative effects on their cardiovascular and immune systems, weakening the effects of air pollution on heart rate by 150 percent, total white blood count by 139 percent, and lymphocyte count by 106 percent. This is the first clinical trial to evaluate whether B vitamin supplements change the biologic and physiologic responses to ambient air pollution exposure. The study initiates a course of research for developing preventive pharmacological interventions using B vitamins to contain the health effects of air pollution. The findings are published online in the Nature Publishing Group journal Scientific Reports. Ambient fine particulate pollution contributes to 3.7 million premature deaths annually worldwide, predominantly through acute effects on the cardiovascular system. Particulate matter pollution is the most frequent trigger for myocardial infarction at the population level. "Ambient PM2.5 pollution is one of the most common air pollutants and has a negative effect on cardiac function and the immune system," said Jia Zhong, PhD, principal investigator, and postdoctoral research officer in the Department of Environmental Health Sciences at Columbia's Mailman School. "For the first time, our trial provides evidence that B-vitamin supplementation might attenuate the acute effects of PM2.5 on cardiac dysfunction and inflammatory markers." The paper builds on research published in March that found B vitamins reduce the negative effects of air pollution as measured by epigenetic markers. In the new study, researchers recruited ten healthy, 18 to 60-year-old, non-smoking volunteers who were not on any form of B vitamin supplements or other medication. All volunteers received a placebo for four weeks preceding a two-hour exposure experiment to concentrated ambient PM2.5 (250 μg/m3), after which they were administered B vitamin supplements for four weeks before the next two-hour exposure experiment to PM2.5. A particle-free two-hour exposure was included to provide baseline data. The controlled exposure experiments were conducted from July 2013 to February 2014 at the same time of day and adjusted for season, temperature, and humidity. "Our results showed that a two-hour exposure to concentrated ambient PM2.5 had substantial physiologic impacts on heart rate, heart rate variability, and white blood counts. Further, we demonstrated that these effects are nearly reversed with four-week B-vitamin supplementation," noted Andrea Baccarelli, MD, PhD, chair and Leon Hess Professor of Environmental Health Sciences at the Mailman School. Because the researchers studied healthy adults from a lightly polluted urban environment, they caution that their findings might not be generalizable to populations that are at higher risk for pollution-induced cardiovascular effects, including children, older adults, individuals with pre-existing cardiovascular disease, and individuals residing in heavily polluted areas. "With ambient PM2.5 levels far exceeding air quality standards in many large urban areas worldwide, pollution regulation remains the backbone of public health protection against its cardiovascular health effects.
Studies like ours cannot diminish--nor be used to underemphasize--the urgent need to lower air pollution levels to--at a minimum--meet the air quality standards set forth in the United States and other countries. However, residual risk remains for those who are sensitive, and high exposures are, unfortunately, the rule still in many megacities throughout the world," said Dr. Baccarelli. The study, conducted with colleagues at Harvard's T. H. Chan School of Public Health, in Sweden, China, Singapore, and Canada, was supported by NIH grants (R21ES021895, R01ES021733, R01ES020836, R01ES021357, T32ES007142, P30ES000002) and by U.S. Environmental Protection Agency grants (RD-834798, RD-832416). The authors declare no competing financial interests. Founded in 1922, Columbia University's Mailman School of Public Health pursues an agenda of research, education, and service to address the critical and complex public health issues affecting New Yorkers, the nation and the world. The Mailman School is the third largest recipient of NIH grants among schools of public health. Its over 450 multi-disciplinary faculty members work in more than 100 countries around the world, addressing such issues as preventing infectious and chronic diseases, environmental health, maternal and child health, health policy, climate change & health, and public health preparedness. It is a leader in public health education with over 1,300 graduate students from more than 40 nations pursuing a variety of master's and doctoral degree programs. The Mailman School is also home to numerous world-renowned research centers including ICAP (formerly the International Center for AIDS Care and Treatment Programs) and the Center for Infection and Immunity. For more information, please visit http://www. .


News Article | April 17, 2017
Site: www.prweb.com

Jim Jacobs, President, Macomb Community College, was awarded the 2017 Educational Testing Service O’Banion Prize at the League’s Innovations Conference in San Francisco, California, March 13, 2017. Named in honor of Dr. Terry O’Banion, President Emeritus and Senior Fellow of the League for Innovation in the Community College, the Educational Testing Service O’Banion Prize is given to an individual who has greatly influenced a transformation in teaching and learning, or to a college that best exemplifies the ideals and characteristics of a learning college as established by O’Banion and the League. Jacobs has served as Macomb Community College’s President since 2008. Prior to his appointment, he concurrently served as Director for the Center for Workforce Development and Policy at the college, and as Associate Director, Community College Research Center (CCRC), Teachers College, Columbia University, where he currently serves as a member of its board of directors. He is a Past President of the National Council for Workforce Education and a member of the Manufacturing Extension Partnership Advisory Board of the National Institute of Standards and Technology and the National Assessment of Career and Technical Education. He is also a member of the Community College Advisory Panel to the Educational Testing Service in Princeton, New Jersey. The Educational Testing Service O’Banion Prize is presented annually at the Innovations Conference. Innovations is an international conference dedicated to innovative approaches for teaching, learning, and enhancing the community college experience. About the League for Innovation in the Community College The League for Innovation in the Community College (League) is an international nonprofit organization with a mission to cultivate innovation in the community college environment. The League hosts conferences and institutes, develops print and digital resources, and leads projects and initiatives with almost 500 member colleges, 100 corporate partners, and a host of other government and nonprofit agencies in a continuing effort to advance the community college field and make a positive difference for students and communities. Information about the League and its activities is available at http://www.league.org.


News Article | May 5, 2017
Site: news.yahoo.com

FILE PHOTO: The Dalai Lama greets U.S. Speaker of the House Nancy Pelosi during the Tom Lantos Human Rights Prize award ceremony in the Capitol in Washington October 6, 2009. REUTERS/Kevin Lamarque/File Photo BEIJING/NEW DELHI (Reuters) - When Donald Trump was elected in November, the Dalai Lama said he was keen to meet the incoming U.S. president, but since then Trump has cozied up to China's leader Xi Jinping, making it less likely the man Beijing deems a separatist will get an invite to the White House anytime soon. The United States has long recognised Tibet as part of the People's Republic of China, and does not back Tibetan independence. But that has not deterred all the recent U.S. presidents before Trump from meeting the exiled Tibetan spiritual leader. The United States is widely seen as the last major Western power that still holds meetings with the Dalai Lama despite Beijing's objections that such encounters foment separatism. In past meetings, the U.S. has consistently voiced support for the protection of the human rights of Tibetans in China, and called for formal talks between Beijing and the Dalai Lama and his representatives. At a regular press briefing in Beijing on Friday, China's foreign ministry spokesman Geng Shuang reiterated that China resolutely opposes any foreign country allowing the Dalai Lama to visit or any foreign official having any form of exchange with him. He did not say whether China had specifically requested Trump not meet the Dalai Lama. A U.S. State Department spokeswoman and a White House official referred Reuters to the Dalai Lama's office when asked whether the Tibetan spiritual leader and his representatives had asked for a meeting with Trump and whether any such meetings were planned. "His Holiness was supposed to go (to the U.S.) in April, but it was postponed," Lobsang Sangay, head of the Tibetan government-in-exile, told Reuters. That trip has been delayed until June due to a hectic schedule in the preceding months that had left the Dalai Lama physically exhausted, Sangay said, adding that Washington D.C. wouldn't be part of the June itinerary. The office of the Dalai Lama hasn't reached out to Trump to arrange a meeting yet, he said. The Dalai Lama is taking a more considered approach with regard to any meeting with Trump, said a source with knowledge of the thinking of the winner of the 1989 Nobel Peace Prize. The unpredictable U.S. president upset protocol in December when, weeks before being sworn into office, he took a telephone call from the leader of self-ruled Taiwan, which China regards as a renegade province, only to rebuff Taiwanese suggestions of another call last week. In the interim, Trump has met and phoned Xi, and says he has built a strong relationship with the Chinese leader. He called Xi "a friend of mine" who was "doing an amazing job as a leader" in an interview with Reuters last week, and praised him for trying to rein in nuclear-armed North Korea. In return, the Chinese president has invited Trump to visit China this year. In mid-2008, then-British Prime Minister Gordon Brown met the Dalai Lama, to the anger of Beijing. Months later, the British foreign secretary at the time, David Miliband, ditched Britain's near century-long position on Tibet, describing it as an "anachronism", and explicitly recognised Tibet as part of China.
Based on treaties signed at the turn of the 20th century by British-administered India and Tibet, Britain had previously said it would recognise China's "special position" in Tibet on the condition that Tibet was given significant autonomy. Chinese troops took control of Tibet in 1950 in what Beijing calls a "peaceful liberation". Nine years later, the Dalai Lama fled to India after an abortive uprising and set up a government in exile, which China does not recognise. China sees the Dalai Lama as a dangerous separatist in a monk's robes, even though the Dalai Lama says he wants autonomy for his homeland, not outright independence. There have been no formal talks between Beijing and the Dalai Lama's representatives since 2010. International rights groups and exiles say China stamps on the religious and cultural rights of Tibetans, accusations denied by Beijing. China says its rule has ended serfdom and brought prosperity to a once-backward trans-Himalayan region. Trump has been silent on Tibetan issues. "The main change is that the U.S. approach on Tibet seems likely to become more transactional and therefore less consistent," said Robbie Barnett, director of the Modern Tibetan Studies Program at Columbia University.


News Article | May 4, 2017
Site: physicsworld.com

Scientists in Greece have devised a new form of biometric identification that relies on humans' ability to see flashes of light containing just a handful of photons. The technique involves using very weak laser pulses to measure how a person's sensitivity to light varies across their retina. According to its inventors, such a quantum-based retinal map could provide a more powerful and secure form of identification than is possible with conventional biometrics such as fingerprints or iris scans. It has been known since the 1940s that humans are able to detect light pulses containing very few photons. However, whether we can actually see single photons is still unclear: one group last year said it had carried out experiments showing this to be the case, but others questioned the claim. In the 1940s, Selig Hecht and colleagues at Columbia University in the US showed that variations in our perception of very low light levels are in fact governed by quantum statistics. By exposing several individuals to very dim flashes of light of differing average intensity, they found that the intensity-induced variation in the probability of seeing a flash could be modelled by assuming that the actual number of photons a person sees follows a Poisson distribution. This result held true across the different people examined, although the specific responses depended on an individual's value of alpha – a parameter describing the fraction of photons arriving at a person's eye that are then detected by their retina. Losses caused by absorption or scattering within the cornea, pupil, lens and body of the eyeball, as well as a finite probability of absorption within the retina itself, mean that alpha typically varies between 0 and 0.2. This variation led to a series of curves describing seeing probability versus average intensity, whose precise shape depended on alpha. In the latest work, Iannis Kominis of the University of Crete and colleagues use these variations as the basis of the new biometric scheme. They say that the value of alpha changes by up to a factor of 100 from one point to another on an individual's retina, while variations between retinas can be up to 50%. As such, they argue that people could be uniquely identified by precisely mapping the variation of alpha across their retinas. The "alpha map" of a particular individual, whom the researchers call Alice, would be created by exposing that person to large numbers of very weak laser pulses. The pulses would have a range of average intensities, and the exercise would be repeated across multiple points on Alice's retina. For each pulse, Alice would be asked whether or not she saw a flash of light. With the map stored on a secure database, Alice could then be identified by examining a subset of points on her retina. Again, she would be exposed to a series of weak laser pulses and asked on each occasion whether or not she sees the pulse. Only if her answers closely match what would be expected from her map would she be allowed to proceed. As Kominis and colleagues explain in a preprint uploaded to the arXiv server, Alice must be subject to a sufficient number of yes/no interrogations to limit two types of error as far as possible. One type of error is the "false negative", which means that Alice is not recognized as herself. The other type is the "false positive", in which an impostor, known as Eve, successfully fools the system into thinking that she is Alice.
For the scheme to be implemented on a practical timescale, the number of interrogations must be limited. Simply choosing a random subset of points on Alice's retina would involve 2500 interrogations to reach certain benchmarks – generating a false negative less than once every 10,000 identifications and a false positive less than once every 10 billion. However, by refining their technique in a number of ways – choosing only very low or very high alpha regions on the retina, using Bayesian statistics and employing pattern recognition – the researchers calculated that just 50 interrogations would do the job. In addition, they assessed how well their scheme would cope if Eve was able to measure the number of photons entering Alice's eye as well as monitoring her brain activity. Their conclusion: Eve would need to make extremely precise measurements of both the thermal energy dumped in Alice's eye and the magnetic energy emitted by her head – something that would be very difficult to achieve. Rebecca Holmes of the University of Illinois in the US praises Kominis and colleagues for having "put a lot of thought into how to optimise" their biometric technique. But she says she is "sceptical" about the scheme's practicality, pointing out that up to half an hour would be needed just to acclimatize Alice's eyes to the very dark conditions required. Holmes also disputes the technique's "quantum" label, arguing that although it involves small numbers of photons, it does not provide a physics-based guarantee of complete security, as quantum cryptography (in principle) can do.
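The statistical machinery behind the scheme is simple enough to sketch in a few lines of code. The Python sketch below is an illustration only, not the Crete group's actual protocol: the six-photon seeing threshold, the pulse intensity, the two-spot alpha map and the acceptance rule are all assumptions made for the example. It models the probability of seeing a weak flash with Poisson statistics parameterized by alpha, then runs repeated yes/no interrogations against a stored alpha map to decide whether the person being tested matches the enrolled map.

# Minimal sketch of Poisson-statistics "seeing" curves and alpha-map verification.
# The threshold, intensities and acceptance rule are illustrative assumptions.
import math
import random

K_THRESHOLD = 6  # assumed minimum number of detected photons needed to "see" a flash

def p_see(alpha: float, mean_photons: float, k_threshold: int = K_THRESHOLD) -> float:
    """Probability of seeing a pulse of given mean photon number at a retinal spot
    with detection efficiency alpha, assuming Poisson photon statistics."""
    lam = alpha * mean_photons
    p_below = sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(k_threshold))
    return 1.0 - p_below

def interrogate(true_alpha: float, mean_photons: float) -> bool:
    """Simulate one yes/no answer from a subject whose probed spot has efficiency true_alpha."""
    return random.random() < p_see(true_alpha, mean_photons)

def identify(stored_alpha_map, true_alpha_map, mean_photons=100.0, n_trials=50, tolerance=0.15):
    """Accept the subject if the observed 'yes' rate at each probed spot is close to the
    rate predicted from the stored enrolment map."""
    for spot, stored_alpha in stored_alpha_map.items():
        expected = p_see(stored_alpha, mean_photons)
        observed = sum(interrogate(true_alpha_map[spot], mean_photons) for _ in range(n_trials)) / n_trials
        if abs(observed - expected) > tolerance:
            return False
    return True

if __name__ == "__main__":
    alice_map = {"spot_A": 0.18, "spot_B": 0.02}      # stored enrolment map (illustrative values)
    print("Alice:", identify(alice_map, alice_map))   # genuine subject: usually accepted
    eve_map = {"spot_A": 0.05, "spot_B": 0.15}        # impostor with different retinal alphas
    print("Eve:  ", identify(alice_map, eve_map))     # usually rejected

In this toy version a genuine subject reproduces the expected yes rate at each probed spot while an impostor does not; pushing the false-negative and false-positive rates down to the benchmarks quoted above is what drives the interrogation counts discussed in the article.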


News Article | May 5, 2017
Site: globenewswire.com

At the 2017 Waterfront Conference on May 10, the Waterfront Alliance Previews the Region’s First-Ever Harbor Scorecard How protected are our coastal communities and our infrastructure? Are waterways meeting Clean Water Act standards? Where are opportunities for public access to the water? Turn to the Harbor Scorecard for answers and action, neighborhood by neighborhood NEW YORK, May 05, 2017 (GLOBE NEWSWIRE) -- At this year’s Waterfront Conference, titled “Measuring Our Harbor: Strong, Healthy, and Open,” the Waterfront Alliance will preview the region’s first-ever Harbor Scorecard. Comprehensive and user-friendly, the Scorecard compiles research in public access, ecology, and resiliency for a neighborhood-by-neighborhood evaluation of the waterfronts of New York City and northern New Jersey. Distilling extensive research into an easy-to-use tool, the Harbor Scorecard will be previewed at the Waterfront Alliance’s annual Waterfront Conference on Wednesday, May 10, and rolled out three weeks later in time for the start of hurricane season. The Waterfront Conference takes place aboard the Hornblower Infinity, dockside at Hudson River Park, Pier 40, in the morning, and cruising New York Harbor throughout the afternoon. Featured speakers include Rep. Nydia M. Velázquez, Member of Congress; Lauren Brand, U.S. Department of Transportation, Maritime Administration; Hon. Ras J. Baraka, Mayor, City of Newark; and Alicia Glen, Deputy Mayor for Housing and Economic Development, City of New York, along with dozens of expert panelists and workshop leaders, and hundreds of policy-makers, waterfront advocates, and professionals. Hornblower Cruises & Events is the venue sponsor. For the third year, Arcadis is the premier sponsor of the Waterfront Conference, and for the second year, the sponsor of Arcadis Waterfront Scholars, a program that invites more than 70 undergraduate and graduate students to participate in the conference and engage with professional mentors. On May 10, Arcadis Global Lead for Water Management Piet Dircke will discuss the Harbor Scorecard in a global context using the Arcadis Sustainable Cities Water Index, which assesses the water resources of 50 cities around the world. “New York City is making strides in protecting its coastline, and the Waterfront Conference is an important step in bringing together the best minds to help benchmark our harbors and waterways,” said Mr. Dircke. “Waiting until the next big storm to create safeguards against future disasters is not a sound resiliency strategy.” “As the Waterfront Alliance begins our milestone tenth year of work, we have put together the Harbor Scorecard, an essential tool to take stock of progress along our waterways,” said John Boulé, vice-chair of the Waterfront Alliance and senior vice president at Dewberry, where he is business manager for New York operations. “While we’ve got a lot to be proud of, it’s clear that we need to do better, and the Harbor Scorecard will give citizens and policy-makers alike the information they need to act.” “The Waterfront Alliance cannot solve all of New York City’s waterfront and sea level rise issues, but its Harbor Scorecard will indicate whether we are moving in the right direction, becoming more resilient while at the same time providing more and better access to the waterfront,” said Klaus Jacob of Columbia University’s Lamont-Doherty Earth Observatory. 
“Right now, New York City is making decisions that will affect clean water investments for the next generation,” said Larry Levine, a senior attorney at the Natural Resources Defense Council. “Thanks to the Clean Water Act, our harbor is cleaner than it used to be. But far too often the water is still too polluted to touch. Sewage overflows still foul our waterways after it rains, making them unsafe for eight million New Yorkers to use recreationally. The Harbor Scorecard will provide a call to action for local, state and federal officials, shining a light on where we need to invest in our infrastructure for a cleaner, healthier future.” Learn more about the Waterfront Conference and purchase tickets ($150 regular ticket; $75 government agencies and nonprofits; $50 students). Registration and breakfast are from 8am to 8:45am; the boat is dockside at Pier 40, Hudson River Park until 1pm; the afternoon harbor cruise returns at 5pm. Continuing education credits will be offered (AIA CES, with APA AICP CM and LA CES credits pending). The Waterfront Conference is generously sponsored by: Venue Sponsor: Hornblower Cruises & Events; Premier Sponsor and Waterfront Scholars: Arcadis; Commander: AECOM, GCA, New York City Economic Development Corporation; Supporter: Dewberry, ExxonMobil, GBX Gowanus Bay Terminal, Hudson River Foundation, Newtown Creek Group, Red Hook Container Terminal, Seastreak, Stantec, Studio V, Two Trees, United Metro Energy; Champion: Entertainment Cruises, HDR, Industry City, Langan, New York Water Taxi, Park Tower Group, Queens Chamber of Commerce, Sims Metal Management/Sims Municipal Recycling; Friend: ARUP, Ecology and Environment, HATCH, Kyle Conti Construction, M.G. McLaren Engineering Group, Moffatt & Nichol, Mott MacDonald, NY Waterway, Perkins & Will, Scape Studio, Starr Whitehouse Landscape Architects and Planners, Steer Davies & Gleave, Williams, WSP/Parsons Brinckerhoff; Continuing Education Partners: AIA New York, APA NY Metro Chapter, ASLA NY. The Waterfront Alliance works to protect, transform, and revitalize our harbor and waterfront.


News Article | May 5, 2017
Site: www.nytimes.com

Eric Kandel, a professor at Columbia University, at home in Harlem with his trainer, Chris Mellars.


News Article | May 5, 2017
Site: www.prweb.com

The June 16th release of "Dreaming Big" (GoldFox Records), which marks the recording debut of the Brett Gold New York Jazz Orchestra and features the compositions of Brett Gold, illuminates a most intriguing jazz odyssey. A star trombonist in high school in his native Baltimore, Gold was steered away from a music career by his parents as well as his trombone teacher, of all people. Gold became an attorney and went on to achieve formidable success in the field of international and corporate tax law. But 25 years into his legal career, Gold changed course and reestablished contact with his musical muse. "Dreaming Big" is remarkable not only for its very existence but also for the striking sounds it offers. A tour de force, the music ranges from 12-tone melodies to playful Monkisms to a stirring political statement. While the album introduces one of jazz’s most challenging new instrumental voices, at the same time its warmth, humor, and accessibility convey an easy sophistication one would associate with an artist of far greater experience. Gold enlisted first-call players from New York’s jazz, studio, and Broadway scenes to produce the recording, including saxophonists Charles Pillow and Tim Ries, trumpeter Scott Wendholt, trombonist John Allred, bassist Phil Palombi, and drummer Scott Neumann. Many jazz composers and arrangers, including Gold, cite Gil Evans and Bill Holman as influences. But Gold’s affinity for the odd time-signature music of the late Don Ellis is reflected in a number of pieces on the CD. Among the compositions on "Dreaming Big," the Middle Eastern-themed “Al-Andalus” (featuring a virtuosic turn by trumpeter Jon Owens) is partly in 11/4 and partly in 5/4. “That Latin Tinge” is a 7/4 mambo, not the usual time signature for a salsa piece. Even the fairly straightforward “Stella’s Waltz” can trip someone up with its occasional judiciously placed bar of 5/4. And then there’s “Nakba,” the powerful 11-minute finale, which was composed partly with Gold’s Moroccan sister-in-law in mind. The song is named after the Arabic word for “catastrophe,” used by the Palestinians to describe the Arab-Israeli War of 1948. Featuring Ries on soprano saxophone, it traces the tragic history of the Israeli-Palestinian conflict. “I found out that you can stop playing music, but it’s still there circulating in your head,” Gold says of the years when he was not involved in music full-time. After finishing high school a year early, he attended the University of Rochester as a double major in history and film studies (Magna Cum Laude and Phi Beta Kappa) and continued his music studies at the Eastman School of Music where he played with one of its nationally recognized jazz ensembles. But he soon placed his jazz activities on the back burner, earning a J.D. from Columbia University Law School (1980) and an LL.M in tax law from New York University Law School (1983). When Gold returned to jazz, he had no problem coming up with ideas for compositions—his brain was full of them—but his sabbatical from music left him unprepared to execute those ideas both on paper and on his horn, which he hadn’t touched in 10 years. He first sketched his pieces out and hired professional musicians to record demo-like CDs of them. Then, studying privately with distinguished teachers like Pete McGuinness, Neal Kirkwood, and David Berger, he learned how to write complex compositions for big band. 
Eventually, in 2007, Gold was accepted into the esteemed BMI Jazz Composers Workshop, under the direction of Mike Abene, Jim McNeely, and Mike Holober. During his tenure there, he developed a book of more than two dozen arrangements, of which 11 of the best appear on Dreaming Big. “As a member of BMI, I was pushed to write longer, more abstract orchestral pieces, something I resisted,” he says. “Instead, I looked to the way Duke Ellington wrote for his band—his best pieces were seldom more than three to five minutes long. I also admired his idea of writing for individual members of the band.” Over the years, Gold has absorbed and strongly personalized any number of influences, some more than just musical. A study in diminished chords featuring clarinets and flutes, “Theme from an Unfinished Film” reveals his debt to what he calls the “internalized lyricism” of movie composers such as Bernard Herrmann, David Raksin, and Ennio Morricone. “Exit, Pursued by a Bear (Slow Drag Blues),” was inspired by Shakespeare’s most famous stage direction. And “Al-Andalus” was originally inspired by the hopes raised by the Arab Spring. Gold does not play in the trombone section on "Dreaming Big." “I actually function a lot better in a dark room writing music,” he says. The roles he plays on the new album are those of composer, arranger, producer—and big dreamer.


News Article | May 4, 2017
Site: www.eurekalert.org

New York, NY (May 4, 2017)--Columbia University Medical Center (CUMC) researchers have discovered a molecular mechanism that reprograms tumor cells in patients with advanced prostate cancer, reducing their response to anti-androgen therapy. The findings, based on a study in mice, could help to determine which patients should avoid anti-androgen therapy and identify new treatments for people with advanced prostate cancer. The study was published online April 14th in the journal Cancer Discovery. Since androgens (male hormones) are known to drive prostate cancer, patients with recurrent or advanced disease are typically treated with anti-androgen medications. However, most patients fail treatment and develop an aggressive form of prostate cancer known as castration-resistant prostate cancer, or CRPC. "It's been a mystery why some patients do not respond to anti-androgens, and why a subset of these patients actually get worse after treatment," said study co-leader Cory Abate-Shen, PhD, the Michael and Stella Chernow Professor of Urological Oncology and professor of urology, medicine, systems biology, and pathology and cell biology at CUMC. "Our findings show that in many of these patients, the tumor cells are reprogrammed so that they are no longer dependent on androgens." To learn about the molecular mechanisms that drive resistance to anti-androgens, Drs. Abate-Shen and Michael Shen co-led a team to develop a strain of mice that lack two tumor-suppressor genes, Trp53 and Pten. These genes are both mutated in about 25 percent of patients with advanced prostate cancer. Mice that were treated with the anti-androgen drug abiraterone failed to respond and had accelerated tumor growth--similar to some humans with advanced prostate cancer who do not respond to anti-androgen therapy. "We found a number of genes that were overexpressed in mice with CRPC and also conserved in patients with the disease. Among the most interesting of these was SOX11, which regulates the development of the nervous system," said study co-leader Michael M. Shen, PhD, professor of medical sciences at CUMC. Most localized, slow-growing prostate cancers are largely composed of epithelial cells, which are rich in androgen receptors that increase their susceptibility to anti-androgen therapy. In contrast, aggressive prostate cancers, particularly those that fail treatment, often contain many neuroendocrine-like cells, which lack androgen receptors and are therefore less responsive to anti-androgen therapy. "This raised the question, where are the neuroendocrine-like cells in prostate tumors coming from?" said Dr. Abate-Shen. "While previous research hinted that epithelial tumor cells may be reprogrammed to become neuroendocrine-like cells, our study provides the first direct evidence that this reprogramming is actually occurring and that it is mediated, at least in part, by SOX11." The researchers also demonstrated that SOX11 acts in a similar fashion in human prostate cancer cells. "By giving anti-androgens to patients with CRPC, we are eliminating the cancer cells that need androgen to survive and enriching the tumor with the remaining neuroendocrine-like cells. The net effect is to create an even more aggressive tumor," said Dr. Shen. The researchers also identified several "master regulators"--genes that control SOX11 and other genes involved in prostate cancer reprogramming--that might be targeted for new prostate cancer treatments.
"Based on our findings, genetic testing to identify SOX11 and the master regulators may be considered before embarking on anti-androgen therapy for patients with advanced prostate cancer," said Dr. Shen. The study is titled, "Transdifferentiation as a mechanism of treatment resistance in a mouse model of castration-resistant prostate cancer." The other contributors are: Min Zou (CUMC), Roxanne Toivanen (CUMC), Antonina Mitrofanova (CUMC and State University of New Jersey, Newark, NJ), Nicolas Floch (CUMC), Sheida Hayati (State University of New Jersey, Newark, NJ), Yanping Sun (CUMC), Clémentine Le Magnen (CUMC), Daniel Chester (CUMC), Elahe A. Mostaghel (Fred Hutchinson Cancer Research Center, Seattle, WA), Andrea Califano (CUMC), and Mark A. Rubin (Weill Cornell Medicine/NewYork-Presbyterian, New York, NY). The study was supported by grants from the National Institutes of Health (P30 CA013696, UL1 TR00040, P01 CA173481, CA154293, DK076602, CA196662, U54 CA209997, R35 CA197745, UL1 TR001873), the Pacific Northwest Prostate Cancer SPORE (P50 CA097186), the DOD Prostate Cancer Research Program (PC150051, PC131821), the Prostate Cancer Foundation, the TJ Martell Foundation for Leukemia, Cancer and AIDS Research, the National Health and Medical Research Council of Australia, the Swiss National Science Foundation, and the FM Kirby Foundation. The authors declare no financial or other conflicts of interest. Columbia University Medical Center provides international leadership in basic, preclinical, and clinical research; medical and health sciences education; and patient care. The medical center trains future leaders and includes the dedicated work of many physicians, scientists, public health professionals, dentists, and nurses at the College of Physicians and Surgeons, the Mailman School of Public Health, the College of Dental Medicine, the School of Nursing, the biomedical departments of the Graduate School of Arts and Sciences, and allied research centers and institutions. Columbia University Medical Center is home to the largest medical research enterprise in New York City and State and one of the largest faculty medical practices in the Northeast. The campus that Columbia University Medical Center shares with its hospital partner, NewYork-Presbyterian, is now called the Columbia University Irving Medical Center. For more information, visit cumc.columbia.edu or columbiadoctors.org.


News Article | May 7, 2017
Site: news.yahoo.com

Emmanuel Macron — a 39-year-old former banker who formed his political party around a year ago — will be the next president of France, according to exit polls released around 8 pm local time on Sunday. According to those polls, Macron bested his opponent, Marine Le Pen of the far-right National Front, with roughly 65 percent of the vote; she received around 35 percent. Turnout was 65.3 percent at 5 pm local time — down from 71.96 percent in 2012. Over a quarter of voters abstained — the highest rate France had seen in decades (perhaps because far-left candidate Jean-Luc Mélenchon said he would not endorse either of the candidates after he failed to make the second round). The result, then, went much the way experts and pollsters alike expected it to go — even with an eleventh-hour dump of hacked (and faked) Macron emails and documents just hours before the campaign officially ended on Friday. French media, however, also respected the blackout law — Le Monde, for example, one of the biggest papers in France, announced it would not publish or report on the Macron leaks until after Sunday’s second-round vote. Macron is expected to celebrate at a packed rally at the Louvre. Le Pen’s post-vote party was, much like her candidacy, fraught with scandal even before it got started — after media outlets like Politico and BuzzFeed France were refused admittance, Le Monde and Bloomberg refused to cover the event out of solidarity. At Vincennes Park in Paris, Le Pen thanked the 11 million who voted for her, and all those who wanted to choose patriotism over globalization. “I call on all patriots to take part in the decisive political battles … Long live the republic, long live France.” And with that, she conceded and walked off the stage. Macron thanked those who voted for him, but went on to address every citizen of France. “I’m speaking to each of you tonight, to all of you together who make up the people of France. We have a duty to our country.” He added, “It is our very civilization that is at stake,” and said he would fight against terrorism and global warming, and for the French people and Europe. “A new page of our history is starting today. I want this new page to be one of hope.” Macron’s win was met with “an extraordinary global sense of relief,” Irene Finel-Honigman, a French politics expert at Columbia University, told Foreign Policy. From the perspective of global markets and politics, as well as from a European perspective, she said, this is “still seen as a total positive.” And, indeed, Macron’s win over Le Pen will be widely seen as a clear victory for Europe and a blow to xenophobia and fear. But that doesn’t mean Sunday was a complete victory for Macron — or a total loss for Le Pen. Macron’s next challenge is the parliamentary election in June. His own En Marche (Forward) movement, roughly a year old, faces an uphill battle in winning a legislative majority. “A new election will start immediately,” Pierre Vimont of Carnegie Europe said. In the likely event that Macron’s movement does not win a majority, he will need to try to form a workable governing coalition, bringing together some from the left and the right. With jobs at the top of voters’ concerns, he’ll likely want to move quickly to enact labor market reforms, and that will require the confidence of people and parliament alike. But forming and leading a governing coalition is not so simple.
For one thing, as Martin Michelot of the Prague-based EUROPEUM told FP, the traditional right and left parties — namely, the Republicans and the Socialists — were left in shambles after they both failed to make the second round (the first time in the history of the Fifth Republic that neither was able to do so). “What will also be interesting to see is whether French politicians can develop a coalition and compromise culture,” Michelot said. “In the context of the left/right divide, whoever was in opposition tended to vote against the majority along pretty strict party lines, in what was a rather unconstructive system.” A Macron presidency with a coalition government could be a chance to change all that — or it could mean Macron has to fight with both sides every time he wants to push a policy through, warned Columbia University’s Sheri Berman. That would only see the already massively discontented French electorate grow still more despondent. And if that’s the case, there’s one recently defeated force that will be ready and waiting. What we saw in this election, said Yascha Mounk, an expert on liberal democracy and populism, was “a radically transformed political landscape. [Le Pen] has more than doubled her party’s vote over the course of 15 years.” In 2002, when her father, Jean-Marie Le Pen, faced Jacques Chirac, he got just 18 percent of the vote. “Imagine how she’ll do five years from now.” For Macron, “It’s going to be hard,” said Alessia Lefebure of Columbia University. Under the French constitution, parliament is more empowered than the president, but that hasn’t been the case — or at least hasn’t been perceived as being the case — in recent history. Still, Lefebure — like millions of French voters today — isn’t entirely pessimistic. “I think this could be a very positive moment,” she said. “France will be again very active in Europe, bringing hope to people in Europe that see the anti-democratic movement.” “If he’s smart, he can benefit from this momentum, and then the French will follow him.” Update, May 7, 2017, 3:12 pm ET: This post was updated to include Macron’s statement.


News Article | May 5, 2017
Site: www.eurekalert.org

(New York - May 5, 2017) - Much is known about flu viruses, but little is understood about how they reproduce inside human host cells, spreading infection. Now, a research team headed by investigators from the Icahn School of Medicine at Mount Sinai is the first to identify a mechanism by which influenza A, a family of pathogens that includes the most deadly strains of flu worldwide, hijacks cellular machinery to replicate. The study findings, published online today in Cell, also identify a link between congenital defects in that machinery -- the RNA exosome -- and the neurodegeneration that results in people who have that rare mutation. It was by studying the cells of patients with an RNA exosome mutation, which were contributed by six collaborating medical centers, that the investigators were able to understand how influenza A hijacks the RNA exosome inside a cell's nucleus for its own purposes. "This study shows how we can discover genes linked to disease -- in this case, neurodegeneration -- by looking at the natural symbiosis between a host and a pathogen," says the study's senior investigator, Ivan Marazzi, PhD, an assistant professor in the Department of Microbiology at the Icahn School of Medicine at Mount Sinai. Influenza A is responsible in part not only for seasonal flus but also for pandemics such as H1N1 and other flus that cross from mammals (such as swine) or birds into humans. "We are all a result of co-evolution with viruses, bacteria, and other microbes, but when this process is interrupted, which we call the broken symmetry hypothesis, disease can result," Dr. Marazzi says. The genes affected in these rare cases of neurodegeneration caused by a congenital RNA exosome mutation may offer future insight into more common brain disorders, such as Alzheimer's and Parkinson's diseases, he added. In the case of influenza A, the loss of RNA exosome activity severely compromises viral infectivity, but also manifests in human neurodegeneration, suggesting that viruses target essential proteins implicated in rare disease in order to ensure continual adaptation. Influenza A is an RNA virus that, unusually, reproduces itself inside the nucleus; most viruses replicate in a cell's cytoplasm, outside the nucleus. The researchers found that once inside the nucleus, influenza A hijacks the RNA exosome, an essential protein complex that degrades RNA as a way to regulate gene expression. The flu pathogen needs extra RNA to start the replication process, so it steals these molecules from the hijacked exosome, Dr. Marazzi says. "Viruses have a very intelligent way of not messing too much with our own biology," he says. "It makes use of our by-products, so rather than allowing the exosome to chew up and degrade excess RNA, it tags the exosome and steals the RNA it needs before it is destroyed." "Without an RNA exosome, a virus cannot grow, so the agreement between the virus and host is that it is ok for the virus to use some of the host RNA because the host has other ways to suppress the virus that is replicated," says the study's lead author, Alex Rialdi, MPH, a graduate assistant in Dr. Marazzi's laboratory. Co-authors include investigators from the University of California-San Francisco, Columbia University, Regeneron Pharmaceuticals and Regeneron Genetics Center, Burnham Institute for Medical Research, and the University of California-Los Angeles.
The study was supported by NIH grants 2RO1AI099195 and DP2 2OD008651 (U.B.), and partially supported by HHSN272201400008C - Center for Research on Influenza Pathogenesis (CRIP) a NIAID-funded Center of Excellence for Influenza Research and Surveillance (A.G.S, H.v.B., R.A., and I.M.). Other support includes the Department of Defense W911NF-14-1-0353 (to I.M.) NIH grant 1R56AI114770-01A1 (to I. M.), NIH grant 1R01AN3663134 (I.M. and H.v.B), and NIH grant U19AI106754 FLUOMICS (I.M., R.A., S.C., N.K., A.G.S.). The Mount Sinai Health System is an integrated health system committed to providing distinguished care, conducting transformative research, and advancing biomedical education. Structured around seven hospital campuses and a single medical school, the Health System has an extensive ambulatory network and a range of inpatient and outpatient services -- from community-based facilities to tertiary and quaternary care. The System includes approximately 7,100 primary and specialty care physicians; 12 joint-venture ambulatory surgery centers; more than 140 ambulatory practices throughout the five boroughs of New York City, Westchester, Long Island, and Florida; and 31 affiliated community health centers. Physicians are affiliated with the renowned Icahn School of Medicine at Mount Sinai, which is ranked among the highest in the nation in National Institutes of Health funding per investigator. The Mount Sinai Hospital is in the "Honor Roll" of best hospitals in America, ranked No. 15 nationally in the 2016-2017 "Best Hospitals" issue of U.S. News & World Report. The Mount Sinai Hospital is also ranked as one of the nation's top 20 hospitals in Geriatrics, Gastroenterology/GI Surgery, Cardiology/Heart Surgery, Diabetes/Endocrinology, Nephrology, Neurology/Neurosurgery, and Ear, Nose & Throat, and is in the top 50 in four other specialties. New York Eye and Ear Infirmary of Mount Sinai is ranked No. 10 nationally for Ophthalmology, while Mount Sinai Beth Israel, Mount Sinai St. Luke's, and Mount Sinai West are ranked regionally. Mount Sinai's Kravis Children's Hospital is ranked in seven out of ten pediatric specialties by U.S. News & World Report in "Best Children's Hospitals." For more information, visit http://www. , or find Mount Sinai on Facebook, Twitter and YouTube.


News Article | April 20, 2017
Site: news.yahoo.com

This undated microscope image made available by the National Center for Microscopy and Imaging Research shows HeLa cells. Until these cells came along, whenever human cells were put in a lab dish, they would die immediately or reproduce only a few times. Henrietta Lacks' cells, by contrast, grew indefinitely. They were "perpetual, everlasting, death-defying, or whatever other word you want to use to describe immortal," says Dr. Francis Collins, director of the U.S. National Institutes of Health. (National Center for Microscopy and Imaging Research via AP) NEW YORK (AP) — What happened in the 1951 case of Henrietta Lacks, and could it happen again today? The story of the woman who unwittingly spurred a scientific bonanza made for a best-selling book in 2010. On Saturday, it returns in an HBO film with Oprah Winfrey portraying Lacks' daughter Deborah. Cells taken from Henrietta Lacks have been widely used in biomedical research. They came from a tumor sample taken from Lacks — who never gave permission for their use. A look at the case: HOW DID DOCTORS GET THE CELLS? As the book relates, Lacks was under anesthesia on an operating table at Johns Hopkins Hospital in Baltimore one day in 1951, undergoing treatment for cervical cancer. A hospital researcher had been collecting cervical cancer cells to see if they would grow continuously in the laboratory. So the surgeon treating Lacks shaved a dime-sized piece of tissue from her tumor for that project. Nobody had asked Lacks if she wanted to provide cells for the research. She died later that year. WAS IT ILLEGAL TO TAKE THE CELLS WITHOUT HER PERMISSION? Not at that time. "What happened to Henrietta Lacks was commonly done," says bioethicist Dr. Robert Klitzman of Columbia University in New York. WHAT ARE THE RULES NOW IN THE U.S.? Specimens intended specifically for research can be collected only if the donor gives consent first. If cells or tissues are instead removed for diagnosis and treatment, that is considered part of the patient's general consent for treatment. But there's a twist. Once a specimen is no longer needed for treating the patient and would otherwise be discarded, scientists can use it for research. No further consent is needed, as long as information identifying the patient as the source is removed and the specimen can't be traced back to the patient, says Johns Hopkins University bioethicist Jeffrey Kahn. IF A SPECIMEN LEADS TO A PRODUCT, DOES THE DONOR HAVE A RIGHT TO SHARE IN THE PROFITS? Generally not, because the consent form for donation or treatment usually waives any such legal right. WHAT WAS SO SPECIAL ABOUT LACKS' CELLS? Until they came along, whenever human cells were put in a lab dish, they would die immediately or reproduce only a few times. Her cells, by contrast, could be grown indefinitely. They were "perpetual, everlasting, death-defying, or whatever other word you want to use to describe immortal," as Dr. Francis Collins, director of the U.S. National Institutes of Health, put it. So they provided an unprecedented stock of human cells that could be shipped worldwide for experiments. They quickly became the most popular human cells for research, and have been cited in more than 74,000 scientific publications. HOW HAVE RESEARCHERS USED THE CELLS? The so-called "HeLa" cells became crucial for key developments in such areas as basic biology, understanding viruses and other germs, cancer treatments, in vitro fertilization and development of vaccines, including the polio vaccine. WHAT MAKES THEM GROW SO WELL?
Researchers proposed a possible answer in 2013. Virtually all cases of cervical cancer are caused by infection with human papillomavirus, which inserts its genetic material into a human cell's DNA. Scientists who examined the DNA of HeLa cells suggested that the insertion happened in a place that strongly activated a cancer-promoting gene. That might explain both why Lacks' cancer was so aggressive and why the cells grow so robustly in a lab dish. DID EVERYBODY ALWAYS KNOW THE ORIGIN OF THE CELLS? No. Lacks was named publicly only in 1971, by an article in a medical journal. Her story appeared in some magazines in the 1970s, and in a 1997 BBC documentary. She became famous in 2010 with publication of Rebecca Skloot's best-selling book, "The Immortal Life of Henrietta Lacks." Follow Malcolm Ritter at http://twitter.com/malcolmritter. His recent work can be found at http://tinyurl.com/RitterAP


News Article | April 28, 2017
Site: motherboard.vice.com

For a while now, the most common causes of death around the world have been relatively constant, albeit varied depending on place: heart disease and respiratory illness rank high in the list regardless of country, while in some parts of the developing world infectious diseases like malaria and HIV/AIDS take a larger toll. But what if, in some as-yet undiscovered medical breakthrough, these conditions and all other causes of physical infirmity could be cured? How long would we live if the only way we died was when an accident or injury befell us? According to a calculation by statistics website Polstats.com, the answer is a respectable 8,938 years on average: long enough that humans born in the Neolithic period would still be alive today. To give a sense of what that looks like, the website uses an interactive simulation of a 100-strong population, who die off over time as accidents randomly occur. (Some of the Reddit discussion of the site has focused on the lone outliers who may survive 50,000 years or more.) Of course, the website is mostly an example of how to have fun with statistics and doesn't pretend to be making any scientifically valid claims, but some of the methods used still do resemble professional epidemiology. Dr. Katherine Keyes, Associate Professor of Epidemiology at Columbia University's Mailman School of Public Health, confirmed that the basic technique—calculating the incidence of types of death in a population based on the statistical likelihood of accidents—is sound, even if the list of causes selected by Polstats is incomplete. But rather than asking what would happen if all diseases were eliminated, an epidemiologist is more often concerned with how the average life expectancy would change if just one cause of death could be reduced. "There has been some interesting work that has shown that one of the central reasons the US lags behind a lot of similarly positioned nations [in life expectancy] is the burden of injury, which especially afflicts younger individuals: firearm related homicide and suicide for example, and drug poisoning," Keyes said in a phone call. In this case "injury" refers to causes of death outside illness, what Polstats calls 'unnatural,' although those who study mortality avoid the term. In fact, Keyes says that rather than imagine a drastically raised life expectancy all round, much of her work involves questioning why certain groups have worse outcomes than others. "Trends in life expectancy are troubling right now in the US," she said. "The most recent analyses available indicate that for the first time we're seeing declines in life expectancy for some demographic groups, especially those at the lower end of the socioeconomic ladder. There have been gains in life expectancy, but those have been almost exclusively confined to high socioeconomic status groups, so there's a widening disparity between the haves and the have-nots." While it's fun to imagine what might happen if injury became the only cause of death, the sad truth of the evidence we have is that there's a lot more work to be done before we're anywhere near that point. Subscribe to Science Solved It, Motherboard's new show about the greatest mysteries that were solved by science.
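The back-of-the-envelope model behind a figure like this is easy to reproduce. Below is a minimal Monte Carlo sketch, not Polstats' actual code: it assumes a constant yearly probability of dying in an accident (the 1/8,938 hazard is an illustrative value chosen so the mean lands near the article's figure) and samples each lifespan from the resulting geometric distribution.

# Minimal sketch of an "accidents only" life expectancy estimate.
# Assumption (not Polstats' actual model): a constant annual hazard of fatal
# accident, independent across years, with no other causes of death.
import math
import random

ANNUAL_HAZARD = 1 / 8938      # illustrative yearly probability of a fatal accident
POPULATION = 100_000          # number of simulated lives

def simulate_lifespan(hazard: float) -> int:
    """Sample whole years survived before the first fatal accident
    (inverse-transform sampling of a geometric distribution)."""
    return int(math.log(1.0 - random.random()) / math.log(1.0 - hazard))

lifespans = [simulate_lifespan(ANNUAL_HAZARD) for _ in range(POPULATION)]

print(f"mean lifespan: {sum(lifespans) / len(lifespans):,.0f} years")
print(f"longest simulated lifespan: {max(lifespans):,} years")
print(f"share surviving past 50,000 years: {sum(y > 50_000 for y in lifespans) / POPULATION:.2%}")

Under these assumptions the survival curve is exponential, so the mean comes out near the quoted 8,938 years while a small fraction of simulated people (roughly e^(-50,000/8,938), or about 0.4 percent) outlast 50,000 years, consistent with the lone outliers noted in the discussion of the site.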


"Our students could not have a better role model as they embark on their careers as leaders in transforming healthcare" says Allen M. Spiegel, M.D., the Marilyn and Stanley M. Katz Dean at Einstein and executive vice president and chief academic officer at Montefiore Medicine. "A longtime and dedicated champion for the most vulnerable among us, he embodies the compassion and dedication to our profession that we expect our students will uphold. It is a privilege to have him join us this year." Dr. Frieden, a physician trained in internal medicine, infectious diseases, public health, and epidemiology, spent eight years at the helm of the CDC after his appointment in 2009 by President Barack Obama. From the beginning of his tenure, he led the United States' response to the global H1N1 influenza virus pandemic. Lauded for his calm demeanor, clear communication style, and persistence in pushing for public health improvements, Dr. Frieden is credited with leading the CDC's work to combat the 2014 Ebola epidemic, accelerating progress addressing drug-resistant infections and opioid use, and preventing strokes, heart attacks, and cancer. Prior to his CDC leadership, Dr. Frieden served as NYC Health Commissioner. He led former Mayor Michael Bloomberg's controversial push to ban smoking in bars and restaurants, and institute the nation's first ban on trans fats in chain restaurants. He also focused on smoking cessation campaigns and HIV prevention programs, and designed and launched the Bloomberg Initiative to Reduce Tobacco Use, a global effort to promote tobacco control policies that has so far prevented more than 20 million deaths. Dr. Frieden, author of more than 250 publications, received medical and public health degrees from Columbia University, and infectious diseases training at Yale University. In his first stint with the CDC from 1990 to 1996, he was an Epidemic Intelligence Service Officer, then led the agency's New York City program on tuberculosis control. He later served the CDC as a medical officer in India, where he worked on that nation's tuberculosis control program, which is credited with saving millions of lives. About Albert Einstein College of Medicine Albert Einstein College of Medicine is one of the nation's premier centers for research, medical education and clinical investigation. During the 2016-2017 academic year, Einstein is home to 717 M.D. students, 166 Ph.D. students, 103 students in the combined M.D./Ph.D. program, and 278 postdoctoral research fellows. The College of Medicine has more than 1,900 full-time faculty members located on the main campus and at its clinical affiliates. In 2016, Einstein received more than $160 million in awards from the National Institutes of Health (NIH). This includes the funding of major research centers at Einstein in aging, intellectual development disorders, diabetes, cancer, clinical and translational research, liver disease, and AIDS. Other areas where the College of Medicine is concentrating its efforts include developmental brain research, neuroscience, cardiac disease, and initiatives to reduce and eliminate ethnic and racial health disparities. Its partnership with Montefiore, the University Hospital and academic medical center for Einstein, advances clinical and translational research to accelerate the pace at which new discoveries become the treatments and therapies that benefit patients. 
Einstein runs one of the largest residency and fellowship training programs in the medical and dental professions in the United States through Montefiore and an affiliation network involving hospitals and medical centers in the Bronx, Brooklyn and on Long Island. For more information, please visit www.einstein.yu.edu, read our blog, follow us on Twitter, like us on Facebook and view us on YouTube. To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/former-cdc-director-dr-tom-frieden-to-deliver-2017-commencement-address-at-albert-einstein-college-of-medicine-300446732.html


The U.S. Defense Department is looking for ways to speed up cognitive skills training -- the types of skills useful for specialists such as linguists, intelligence analysts and cryptographers -- and is awarding University of Florida engineers and neuroscientists up to $8.4 million over the next four years to investigate how to do that by applying electrical stimulation to peripheral nerves as a means of strengthening neuronal connections in the brain. Two neuroengineering experts in UF's Herbert Wertheim College of Engineering are among eight team leaders across the country receiving awards announced Wednesday under the Targeted Neuroplasticity Training program of the Defense Advanced Research Projects Agency, or DARPA. The program's goal is to develop safe and effective enhanced training regimens that accelerate the acquisition of cognitive skills while reducing the cost and time of the DoD's extensive training program. A large percentage of the work involves fundamental research to decipher the neural mechanisms that underlie the influence of nerve stimulation on brain plasticity. Under an award of up to $4.2 million, Kevin J. Otto, Ph.D., will lead a team of neuroscientists from the Evelyn F. and William L. McKnight Brain Institute of the University of Florida and the Malcom Randall VA Medical Center to identify which neural pathways in the brain are activated by vagal nerve stimulation. The team will conduct behavioral studies in rodents to determine the impact of vagal nerve stimulation on perception, executive function, decision-making and spatial navigation. This could potentially lead to an expansion of the use of vagal nerve stimulation, a therapy currently applied to prevent seizures in patients with epilepsy and to treat depression and chronic pain. "There are clinical applications, but very little understanding of why it works," said Jennifer L. Bizon, a professor of neuroscience at UF and an investigator on Otto's team. "We are going to do the systematic science to understand how this stimulation actually drives brain circuits and, ultimately, how to maximize the use of this approach to enhance cognition." The research funded by the DARPA awards will test the mechanisms by which peripheral nerve modulations make learning faster and more efficient. For military analysts on the job, "One hypothetical example would be target detection," said co-investigator Barry Setlow, Ph.D., a professor of psychiatry at UF. "So for people who spend hours a day looking for things of interest on a screen, if by stimulating their vagus nerve at just the right time you can help them realize performance improvements more quickly, then they become better attuned to the fine details of images." The technology has the potential to help Defense Department personnel advance through training more quickly, yet effectively. "Currently, they could spend 50 years of their careers, 80 hours a week, just doing training and still wouldn't be qualified to do every single thing," said Otto, an associate professor in the J. Crayton Pruitt Family Department of Biomedical Engineering. "So they're always interested in increasing mechanisms of learning and memory." Otto said if investigators can gain a more complete understanding of how targeted neuroplasticity works, they may be able to figure out how to optimize learning while avoiding potential side effects, such as blood pressure manipulation, heart rate changes and perceived visceral pain. 
In a second UF effort, and with an additional $4.2 million award, Karim Oweiss, Ph.D., a professor of electrical and computer engineering, biomedical engineering and neuroscience, will study the mechanisms by which cranial nerve stimulation can affect brain activity. His lab will use advanced optical imaging that will produce extremely high-resolution images of brain dynamics to map the functional circuitry in areas of the brain responsible for executive function. Additionally, optogenetic interrogation, a technique to drive specific brain cells to fire or go silent in response to targeted illumination, will be used to study the causal involvement of these areas in learning cue salience and working memory formation in rodents trained on auditory discrimination and decision-making tasks. Oweiss will collaborate with Qi Wang, an assistant professor at Columbia University. Wang's lab will focus on the noradrenergic pathway -- which carries a neuromodulator widely responsible for attention and arousal -- and the extent to which it is engaged when rodents learn a tactile discrimination task. Oweiss' project seeks to demonstrate the effects of vagal nerve stimulation on cognitive-skill learning and the brain activity supporting those skills, as well as to optimize the stimulation parameters and training protocols for long-term retention of those skills. "We want to see if it's possible to promote targeted changes in specific brain circuits to accelerate this process by stimulating the vagus nerve, which sends close to 80 percent of its output back to the brain," Oweiss said. "So if one knows that 'brain area A' talks to 'brain area B' when learning a new language, can we develop training protocols that promote the exchange between these two areas while leaving other areas unaltered? Then the person will learn at a faster rate and retain the skills for much longer." The implications of both projects reach beyond accelerated learning speeds. "If we identify specific ways that neural pathways change as a person learns, then if a person loses brain function, we could potentially rewire disconnected brain areas and personalize neural rehabilitation," said Oweiss. "This technology could be used to restore quality of life much quicker if brain function has been compromised."


News Article | April 25, 2017
Site: www.chromatographytechniques.com

The key to unlocking better rechargeable lithium-ion batteries for portable electronics, electric vehicles and grid-level energy storage lies in the material used to construct the battery cell’s cathode, anode and electrolyte. Currently, a liquid electrolyte powers commercial lithium-ion batteries, but this is also what makes the power source flammable and vulnerable to explosions. So, a team from Columbia University is exploring the possibility of using a solid electrolyte via ice-templating, also called freeze-casting. Yuan Yang’s team was interested in using ice-templating to fabricate vertically aligned structures of ceramic solid electrolytes. In experiments, the researchers cooled an aqueous solution of ceramic particles from the bottom and then let ice grow and push away and concentrate the ceramic particles. They then applied a vacuum to transition the solid ice to a gas, leaving a vertically aligned structure. Finally, they combined this ceramic structure with a polymer to provide mechanical support and flexibility to the electrolyte. “We thought that if we combined the vertically aligned structure of the ceramic electrolyte with the polymer electrolyte, we would be able to provide a fast highway for lithium ions and thus enhance the conductivity,” says Haowei Zhai, Yang’s Ph.D. student and the paper’s lead author. “We believe this is the first time anyone has used the ice-templating method to make a flexible solid electrolyte, which is nonflammable and nontoxic, in lithium batteries. This opens a new approach to optimize ion conduction for next-generation rechargeable batteries.” Researchers in earlier studies used either randomly dispersed ceramic particles in polymer electrolyte or fiber-like ceramic electrolytes that were not vertically aligned. Verticality is the key to Yang and Zhai’s newest method, which could lead to lithium batteries that are safer, feature a longer battery life and are bendable, providing new avenues for flexible smartphones and tablets. Next, Yang and Zhai plan to work on optimizing the qualities of the combined electrolyte and assembling the flexible solid electrolyte together with battery electrodes to construct a prototype of a full lithium-ion battery. Elsewhere, researchers are turning to data in hopes of improving rechargeable batteries, specifically for integration into electric vehicles. Toyota Research Institute has pledged $35 million over four years to MIT, Stanford and Purdue to develop a novel, data-driven design of lithium-ion batteries leveraging a unique nanoscale visualization technique devised by Stanford researchers in August 2016. Using brilliant X-rays and high-tech microscopes, the Stanford study visualized the fundamental building blocks of batteries—the charge/discharge reaction in real time. The researchers discovered that the charging process (delithiation) is significantly less uniform than discharge (lithiation). They also found that faster charging improves uniformity. Based on that data, the multi-university team will use theory, multi-scale modeling and simulations and machine learning to develop a scalable predictive modeling framework for rechargeable lithium-ion batteries. The four-year effort seeks to better understand the fundamental science governing how a battery’s internal architecture impacts energy storage, recharging speed and reliability, including how to improve the design of a battery cell’s electrodes.


News Article | April 25, 2017
Site: grist.org

Growing up in New York, Cynthia Malone had clear goals. “I had this vision of becoming the next Jane Goodall — the black Jane Goodall,” she says. She earned a master’s degree in conservation biology from Columbia University, studying oil palm expansion and conflicts between humans and wildlife. Her fieldwork took her to the Solomon Islands, Indonesia, and Cameroon. While she was working toward her degree, Black Lives Matter protests and viral videos of police brutality gripped the country. So she got involved. Malone began working with Columbia’s graduate students of color and Black Youth Project 100, a youth-led organization that coordinates direct action and political education. She also cofounded the Diversity Committee at the Society for Conservation Biology. Her goal there is to make human diversity in the sciences as important as biological diversity. Malone is headed into a Ph.D. at the University of Toronto, studying neocolonialism and who gets a seat at the table in conservation decisions. She’s also building a network of environmental scholars of color and activists to “think through what a decolonized conservation science would look like.” In other words: create a new, more equitable vision for the future of science. Meet all the fixers on this year’s Grist 50. This story was originally published by Grist on Apr 25, 2017, with the headline “Meet the fixer: This scientist brings social justice to her field.”


News Article | May 3, 2017
Site: globenewswire.com

Hooksett, NH, May 03, 2017 (GLOBE NEWSWIRE) -- Manchester, NH: Assignment Magazine, the literary journal of the Mountainview MFA low-residency program, launches issue three on May 16. Included in this issue: fiction by Amie Barrodale, Lyudmila Petrushevskaya, Christine Smallwood, and MFA Student Contest Winner, David Moloney. Plus, David Moloney interviews his mentor, Andre Dubus III. Amie Barrodale is the author of You Are Having a Good Time, a collection of highly compressed and charged tales in which the veneer of normality is stripped from the characters’ lives to reveal the seething and contradictory desires that fuel them. The Moscow-born Lyudmila Petrushevskaya is regarded as one of Russia's most prominent contemporary writers, whose writing combines postmodernist trends with the psychological insights and parodic touches of writers such as Anton Chekhov. Over the last few decades, she has been one of the most acclaimed contemporary writers at work in Eastern Europe; Publishers Weekly has called her "one of the finest living Russian writers." Christine Smallwood holds a Ph.D. in English and Comparative Literature from Columbia University. Her reviews and essays have been published in The New Yorker, Bookforum, T: The New York Times Style Magazine, and Harper’s Magazine, where she writes the monthly “New Books” column. Her fiction has been published in The Paris Review, n+1, and Vice. Andre Dubus III is an American novelist and short story writer. He is a member of the faculty at the University of Massachusetts Lowell. Dubus's work has been included in The Best American Essays 1994, The Best Spiritual Writing 1999, and The Best of Hope Magazine. He has been awarded a Guggenheim Fellowship, the National Magazine Award for fiction, and the Pushcart Prize. He was a finalist for the Rome Prize awarded by the American Academy of Arts and Letters. Dubus's novel House of Sand and Fog was a fiction finalist for the National Book Award, the Los Angeles Times Book Prize, and Booksense Book of the Year. It was an Oprah Book Club selection and headed the New York Times bestseller list. It has been published in twenty languages, and the 2003 film adaptation, directed by Vadim Perelman, was nominated for an Academy Award. A photo accompanying this announcement is available at http://www.globenewswire.com/NewsRoom/AttachmentNg/c20f9e0a-395e-4667-af92-304eb0734e33

