Baltimore, MD, United States

The Johns Hopkins University is a private research university in Baltimore, Maryland. Founded in 1876, the university was named after its first benefactor, the American entrepreneur, abolitionist, and philanthropist Johns Hopkins. His $7 million bequest, half of which financed the establishment of The Johns Hopkins Hospital, was the largest philanthropic gift in the history of the United States at the time. Daniel Coit Gilman, who was inaugurated as the institution's first president on February 22, 1876, led the university to revolutionize higher education in the U.S. by integrating teaching and research.

The first research university in the Western Hemisphere and one of the founding members of the Association of American Universities, Johns Hopkins has ranked among the world's top universities throughout its history. The National Science Foundation has ranked the university #1 among U.S. academic institutions in total science, medical, and engineering research and development spending for 31 consecutive years. Johns Hopkins was ranked #12 in the U.S. News and World Report undergraduate program rankings for 2014 and 11th in the U.S. News and World Report Best Global University Rankings of 2014, outranking Princeton University, Yale University, the University of Pennsylvania, and Cornell University. Over the course of almost 140 years, 36 Nobel Prize winners have been affiliated with Johns Hopkins. Founded in 1883, the Blue Jays men's lacrosse team has captured 44 national titles and joined the Big Ten Conference as an affiliate member in 2014.

Johns Hopkins is organized into ten divisions on campuses in Maryland and Washington, D.C., with international centers in Italy, China, and Singapore. The two undergraduate divisions, the Krieger School of Arts and Sciences and the Whiting School of Engineering, are located on the Homewood campus in Baltimore's Charles Village neighborhood. The medical school, the nursing school, and the Bloomberg School of Public Health are located on the Medical Institutions campus in East Baltimore. The university also comprises the Peabody Institute, the Applied Physics Laboratory, the Paul H. Nitze School of Advanced International Studies, the education school, the Carey Business School, and various other facilities. (Wikipedia)



Patent
Johns Hopkins University | Date: 2016-08-15

A surgical system that provides hands-free control of at least one surgical tool includes a robot having a tool connector, a smart tool attached to the tool connector of the robot, and a feedback control system configured to communicate with the smart tool to provide feedback control of the robot. The smart tool includes a tool that has a tool shaft having a distal end and a proximal end, a strain sensor arranged at a first position along the tool shaft, at least one of a second strain sensor or a torque-force sensor arranged at a second position along the tool shaft, the second position being more towards the proximal end of the tool shaft than the first position, and a signal processor configured to communicate with the strain sensor and the at least one of the second strain sensor or the torque-force sensor to receive detection signals therefrom. The signal processor is configured to process the detection signals to determine a magnitude and position of a lateral component of a force applied to the tool shaft when the position of the applied force is between the first and second positions. The feedback system controls the robot to move in response to at least the magnitude and position of the lateral component of the force applied to the tool shaft when the position of the applied force is between the first and second positions, so as to cancel the force applied to the tool shaft and thereby provide hands-free control of the at least one surgical tool.
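
As a rough illustration of the moment-balance arithmetic such a system could perform, the sketch below estimates the magnitude and position of a lateral force applied between two sensing locations on the shaft. The rigid-shaft assumption, the coordinate convention, the function name and all numerical values are illustrative and are not taken from the patent.

```python
# Simplified, illustrative statics for a tool shaft held by a robot at its
# proximal end.  Assumptions (not from the patent): the shaft is rigid, a
# distal strain sensor reports the bending moment M1 at x1, and a proximal
# force/torque sensor reports the total lateral force F_total and bending
# moment M2 at x2 (x measured from the distal tip, x2 > x1).

def shaft_force(M1, M2, F_total, x1, x2):
    """Estimate the tip force and a point force applied between the sensors.

    Returns (F_tip, F_applied, position), where `position` is the distance
    of the applied force from the distal tip.
    """
    F_tip = M1 / x1                     # only the tip load bends the shaft at x1
    F_applied = F_total - F_tip         # remainder of the lateral load
    if abs(F_applied) < 1e-9:
        return F_tip, 0.0, None         # nothing applied between the sensors
    # M2 collects the tip load plus the applied load acting over (x2 - p)
    position = x2 - (M2 - F_tip * x2) / F_applied
    return F_tip, F_applied, position

# Example: 2 N applied 0.15 m from the tip, no tip contact, sensors at 0.05 m / 0.30 m
print(shaft_force(M1=0.0, M2=2.0 * (0.30 - 0.15), F_total=2.0, x1=0.05, x2=0.30))
```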


Patent
Johns Hopkins University | Date: 2016-08-19

The present invention provides a low-risk, unobtrusive and noninvasive method and device for treatment of obesity and eating disorders. In embodiments, the device is a gastric device suitable for placement in a stomach of a subject. The device may be composed of a sponge material which absorbs fluid upon implantation and expands in volume, thereby functioning as a space occupying device in the stomach to cause early satiety.


The present invention is directed to an innovative pedicle probe that uses a force-sensing electromechanical system coupled with haptic and visual feedback. The probe of the present invention reduces the rate of pedicle screw breaches during spinal fusion surgery. The probe provides an effective guidance system to aid surgeons in detecting and preventing cortical bone breaches, thereby minimizing risk of intraoperative injury to the patient. Moreover, the probe decreases surgeon reliance on intraoperative radiation, reducing harmful exposure to both patients and surgeons.


The present invention provides a method of administering a therapeutic agent directly to the brain parenchyma through a compromised region of the blood-brain barrier in a subject having a brain disorder, which involves first disrupting the blood-brain barrier (BBB) at an isolated region by locally administering an effective amount of a hyperosmolar agent at said region using a catheter, followed by administering a therapeutically effective amount of a therapeutic agent. The step of disrupting the BBB is carried out with non-invasive MR (magnetic resonance) imaging with a contrast agent to visualize local parenchymal transcatheter perfusion at said isolated BBB region, thereby indicating that the BBB region is compromised. The method of the invention allows for highly precise drug delivery to the brain through blood-brain barrier disruption at specifically controlled regions.


Patent
Johns Hopkins University | Date: 2015-02-12

The present invention is directed to a computer application for monitoring and tracking leg and foot movements and positions, and a device for facilitating the computer tracking of the leg and foot movements. The application uses an accelerometer, gyroscope or other movement detectors in available devices such as a phone, movement tracker, personal music device, tablet computing device, other similar device or a device specifically designed to detect movements and positions, and to track the movement and changes in position of the patient's leg and foot. The device can be held onto the patient's leg using a band-type device that is easy to use and comfortable during sleep, or by incorporation into a comfortable wearable band. The application includes a user interface and a backend for use by physicians or other healthcare staff to review and diagnose the patient's leg and foot movement patterns.


A method, computer-readable medium and system of planning, guiding and/or monitoring a therapeutic procedure can include: receiving a non-labeled therapeutic agent by a subject, said non-labeled therapeutic agent comprising at least one type of water-exchangeable proton that is exchangeable with protons in surrounding water molecules so as to enhance detection by a chemical exchange saturation transfer (CEST) process; acquiring a plurality of CEST magnetic resonance images of said non-labeled therapeutic agent within a region of interest of said subject for a corresponding plurality of times; and assessing at least one of a therapeutic plan or therapeutic effect of said non-labeled therapeutic agent in tissue of said subject based on said plurality of magnetic resonance images.


Patent
Johns Hopkins University | Date: 2016-07-27

The presently disclosed subject matter provides methods for continuously generating uniform polyelectrolyte complex (PEC) nanoparticles comprising: flowing a first stream comprising one or more water-soluble polycationic polymers at a first variable flow rate into a confined chamber; flowing a second stream comprising one or more water-soluble polyanionic polymers at a second variable flow rate into the confined chamber; and impinging the first stream and the second stream in the confined chamber until the Reynolds number is from about 1,000 to about 20,000, thereby causing the one or more water-soluble polycationic polymers and the one or more water-soluble polyanionic polymers to undergo a polyelectrolyte complexation process that continuously generates PEC nanoparticles. Compositions produced from the presently disclosed methods and a device for producing the compositions are also disclosed.
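
The abstract specifies a target Reynolds number window of about 1,000 to 20,000 for the impinging streams. The sketch below shows one plausible way to check a jet's Reynolds number from its flow rate; the Re = ρvd/μ definition, the function name and the example flow parameters are illustrative assumptions, not values from the patent.

```python
# Back-of-the-envelope check of the jet Reynolds number used to decide when
# turbulent micromixing is reached in a confined impinging-jet chamber.

def jet_reynolds(flow_rate_m3_s, diameter_m, density=1000.0, viscosity=1.0e-3):
    """Reynolds number of a circular jet from its volumetric flow rate."""
    import math
    area = math.pi * (diameter_m / 2) ** 2
    velocity = flow_rate_m3_s / area
    return density * velocity * diameter_m / viscosity

re = jet_reynolds(flow_rate_m3_s=50e-6 / 60, diameter_m=0.5e-3)  # 50 mL/min, 0.5 mm jet
print(f"Re = {re:.0f}, in target window: {1_000 <= re <= 20_000}")
```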


Patent
University of Illinois at Urbana-Champaign, Vanquish Oncology and Johns Hopkins University | Date: 2016-08-22

The invention provides compositions and methods for the induction of cell death, for example, cancer cell death. Combinations of compounds and related methods of use are disclosed, including the use of compounds in therapy for the treatment of cancer and selective induction of apoptosis in cells. The disclosed drug combinations can have lower neurotoxicity effects than other compounds and combinations of compounds.


Patent
Johns Hopkins University and Showa Pharmaceutical University | Date: 2016-12-12

In DN-DISC1 mice, a mouse model for major mental illnesses that simultaneously expresses pathological phenotypes relevant to schizophrenia, mood disorders, and addiction, the inventors of the present invention found pronounced levels of oxidative stress in the prefrontal cortex, but not in the striatum. These mice also displayed greater amounts of GAPDH-Siah1 binding, a protein-protein interaction that is activated under exposure to oxidative stress. The present inventors investigated the role of oxidative stress in other organ systems. As detailed herein, the inventors found that GAPDH-Siah1 binding was increased in mouse models of cardiac failure. It was also found that certain novel analogs of deprenyl significantly inhibited GAPDH-Siah1 binding in cardiac tissue. Thus, with experimental data provided herein, it is clear that this GAPDH-Siah1 binding cascade is a crucial mechanism involved in major mental illness, such as schizophrenia, mood disorders, and addiction, as well as in stress-associated diseases involving other organs where GAPDH is expressed. The present invention provides compounds and compositions comprising analogs of deprenyl and their use in the inhibition of nuclear GAPDH-Siah1 binding and the activation of p300 and MEF2. Also provided herein are methods of prevention and treatment of stress-induced disorders of the body, including, for example, major mental illness, such as schizophrenia, mood disorders, and addiction, as well as stress-associated diseases involving other organs, such as cardiac hypertrophy, in vivo, comprising administering to a mammal a therapeutically effective amount of analogs of deprenyl.


Patent
Johns Hopkins University | Date: 2016-09-16

The Vertebral Osteotomy Saw Guide allows precise osteotomies to be performed through the vertebral column in conjunction with a thread-wire saw. The guide is designed so that it can mount to rods commonly used during spinal surgery for spinal stabilization. The mount of the guide is a polyaxial mount, allowing the angle of the guide mount to be adjusted and locked to create a desired cutting plane to produce a precise osteotomy. The guide itself consists of two interdigitated pulley wheels that allow the thread-wire saw to pass smoothly through the guide. The simple, but unique design of the guide allows a surgeon to perform an osteotomy through the vertebral column cutting from one side of the vertebral column to the other. This unique orientation allows the osteotomy to be performed away from critical structures in the region (the spinal cord, aorta, and inferior vena cava).


The present invention is directed to a method for combining assessment of different factors of dyssynchrony into a comprehensive, non-invasive toolbox for treating patients with a CRT therapy device. The toolbox provides high spatial resolution, enabling assessment of regional function, as well as enabling derivation of global metrics to improve patient response and selection for CRT therapy. The method allows for quantitative assessment and estimation of mechanical contraction patterns, tissue viability, and venous anatomy from CT scans combined with electrical activation patterns from Body Surface Potential Mapping (BSPM). This multi-modal method is therefore capable of integrating electrical, mechanical, and structural information about cardiac structure and function in order to guide lead placement of CRT therapy devices. The method generates regional electro-mechanical properties overlaid with cardiac venous distribution and scar tissue. The fusion algorithm for combining all of the data suggests cardiac segments and routes for implantation of epicardial pacing leads.


The present invention is directed to a method for real-time characterization of spatially-resolved tissue optical properties using OCT/LCI. Imaging data are acquired, processed, displayed and stored in real-time. The resultant tissue optical properties are then used to determine the diagnostic threshold and to determine the OCT/LCI detection sensitivity and specificity. Color-coded optical property maps are constructed to provide direct visual cues for surgeons to differentiate tumor versus non-tumor tissue. These optical property maps can be overlaid with the structural imaging data and/or Doppler results for efficient data display. Finally, the imaging system can also be integrated with existing systems such as tracking and surgical microscopes. An aiming beam is generally provided for interventional guidance. For intraoperative use, a cap/spacer may also be provided to maintain the working distance of the probe, and also to provide biopsy capabilities. The method is usable for research and clinical diagnosis and/or interventional guidance.


Patent
Johns Hopkins University | Date: 2016-10-27

Synthetic representative HCV subtypes, including a 1a and 1b genome, dubbed Bole1a and Bole1b, are provided using an inventive method of Bayesian phylogenetic tree analysis, ancestral sequence reconstruction and covariance analysis. Bole1a branches centrally among 390 full-genome sequences used in its design, a carefully curated 143 sequence full-genome dataset, and separate genomic regions including an independent set of 214 E1E2 sequences from a Baltimore cohort. Bole1a is phylogenetically representative of widely circulating strains. Full genome non-synonymous diversity comparison and 9-mer peptide coverage analysis showed that Bole1a is able to provide more coverage (94% and 78% respectively) than any other sequence in the dataset including H77, a traditional reference sequence. Bole1a also provides unsurpassed epitope coverage when compared to all known T cell epitopes.
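
The 9-mer peptide coverage comparison mentioned above can be pictured with a short sketch. The scoring below (the fraction of 9-amino-acid windows of each dataset sequence that also occur in the candidate sequence) is one plausible reading of such an analysis, not the patent's exact procedure; the toy sequences and function names are invented for illustration.

```python
# A plausible reading of "9-mer peptide coverage": the fraction of 9-residue
# windows in each circulating sequence that also occur in a candidate
# (e.g. Bole1a-like) sequence.  Scoring details here are assumptions.

def kmers(seq, k=9):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def coverage(candidate, dataset, k=9):
    cand = kmers(candidate, k)
    scores = []
    for seq in dataset:
        windows = [seq[i:i + k] for i in range(len(seq) - k + 1)]
        hits = sum(1 for w in windows if w in cand)
        scores.append(hits / len(windows))
    return sum(scores) / len(scores)   # mean per-sequence coverage

dataset = ["MSTNPKPQRKTKRNTNRRPQDVKF", "MSTNPKPQRQTKRNTNRRPQDVKF"]  # toy peptides
print(f"mean 9-mer coverage: {coverage(dataset[0], dataset):.2f}")
```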


The present inventors employed cyclodextrins for use as a proteoglycan substitute to engineer a biomimetic collagen-based matrix composition. The resulting incorporation of cyclodextrin in the inventive collagen compositions increased collagen thermal stability and reduced collagen fibrogenesis. As a result, a thick, transparent and mechanically strong collagen-based composition was formed. This cyclodextrin-collagen composition holds great potential to be used as a therapeutic eye patch for corneal repair. Methods for making these inventive compositions and their use are also provided.


Patent
Johns Hopkins University | Date: 2016-09-23

We found that FIZZ1/RELM is inducible by hypoxia in lung. The hypoxia-upregulated expression of FIZZ1/RELM was located in the pulmonary vasculature, bronchial epithelial cells, and type II pneumocytes. Recombinant FIZZ1/RELM protein stimulates rat pulmonary microvascular smooth muscle cell (RPSM) proliferation dose-dependently. Therefore, we renamed this gene as hypoxia-induced mitogenic factor (HIMF). HIMF strongly activated Akt phosphorylation. The phosphatidylinositol 3-kinase (PI3K) inhibitor LY294002 inhibits HIMF-activated Akt phosphorylation. It also inhibits HIMF-stimulated RPSM proliferation. Thus, the PI3K/Akt pathway, at least in part, mediates the proliferative effect of HIMF. HIMF also has angiogenic and vasoconstrictive activity. Notably, HIMF increases pulmonary arterial pressure and vascular resistance more potently than either endothelin-1 or angiotensin II.


Patent
Johns Hopkins University | Date: 2016-11-17

The invention provides chimeric proteins and nucleic acids encoding these which can be used to generate vaccines against selected antigens. In one aspect, a chimeric protein comprises an antigen sequence and a domain for trafficking the protein to an endosomal compartment, irrespective of whether the antigen is derived from a membrane or non-membrane protein. In one preferred aspect, the trafficking domain comprises a lumenal domain of a LAMP polypeptide. Alternatively, or additionally, the chimeric protein comprises a trafficking domain of an endocytic receptor (e.g., such as DEC-205 or gp200-MR6). The vaccines (DNA, RNA or protein) can be used to modulate or enhance an immune response against any kind of antigen. In one preferred aspect, the invention provides a method for treating a patient with cancer by providing a chimeric protein comprising a cancer-specific antigen or a nucleic acid encoding the protein to the patient.


The present invention provides novel methods for treating Th2-mediated immune disorders and enhancing Th1-mediated immune responses in a subject comprising administering to the subject, a pharmaceutical composition comprising a serum-glucocorticoid regulated kinase 1 (SGK1) inhibitor and a pharmaceutically acceptable carrier. Methods for treating a wide range of autoimmune diseases are also taught. The present invention also provides methods for augmenting the treatment of subjects having viral or parasitic infections, or which have cancerous tumors.


The present invention provides pharmaceutical compositions for treating neuromyelitis optica (NMO) comprising a therapeutically effective amount of loop C sequence-containing peptide of aquaporin-4 (AQP4) water channel, or a therapeutically effective fragment or variant thereof. The invention also provides methods for treating NMO by administering therapeutically effective amounts of loop C sequence-containing peptide(s) of AQP4, optionally in an immunosuppressive setting, and also provides diagnostics for detection of NMO in a subject, screening methods for identification of NMO-treating therapeutics and NMO model systems.


Patent
Johns Hopkins University | Date: 2016-10-25

Mesothelin can be used as an immunotherapeutic target. It induces a cytolytic T cell response. Portions of mesothelin which induce such responses are identified. Vaccines can be either polynucleotide- or polypeptide-based. Carriers for raising a cytolytic T cell response include bacteria and viruses. A mouse model for testing vaccines and other anti-tumor therapeutics and prophylactics comprises a strongly mesothelin-expressing, transformed peritoneal cell line.


A major challenge in non-viral gene delivery remains finding a safe and effective delivery system. Colloidally stable non-viral gene vector delivery systems capable of overcoming various biological barriers are disclosed. The gene vectors are biodegradable, non-toxic and highly tailorable for use in specific applications. The vectors include a mixture of biodegradable copolymers, such as PBAE, and biodegradable polymers conjugated with a hydrophilic, neutrally charged polymer, such as PEG. The gene vectors demonstrate broad vector distribution and high transgene delivery in vivo, providing an efficient non-viral gene delivery system for localized therapeutic gene transfer. Methods of using the vectors to overcome biological barriers, including mucus gel and extracellular matrix, are provided. Methods of formulating the vectors are also provided.


Patent
Johns Hopkins University and University of Illinois at Chicago | Date: 2015-04-22

The present invention provides novel indoleamide compounds for treating tuberculosis, including drug-resistant M. tuberculosis, compositions comprising the indoleamides, and methods of using the indoleamides in conjunction with other biologically active agents for the treatment of tuberculosis in a subject in need thereof.


The present invention provides compounds, or pharmaceutically acceptable salts, solvates, stereoisomers, or prodrugs thereof, which can block the Atg8-Atg3 protein-protein interaction, which is associated with autophagy in apicomplexan organisms. Pharmaceutical compositions comprising these compounds and their use for the suppression and treatment of various parasitic diseases are also provided.


Patent
Johns Hopkins University and Intonation Research Laboratories | Date: 2015-03-09

A series of phenelzine analogs comprising a phenelzine scaffold linked to an aromatic moiety, and their use as inhibitors of lysine-specific demethylase 1 (LSD1) and/or one or more histone deacetylases (HDACs), is provided. The presently disclosed phenelzine analogs exhibit potency and selectivity for LSD1 versus the MAO and LSD2 enzymes and exhibit bulk as well as gene-specific histone methylation changes, anti-proliferative activity in several cancer cell lines, and neuroprotection in response to oxidative stress. Accordingly, the presently disclosed phenelzine analogs can be used to treat diseases, conditions, or disorders related to LSD1 and/or HDACs, including, but not limited to, cancers and neurodegenerative diseases.


The present invention provides compounds of formula I which are capable of inhibition of the activation of hNav1.1 or hNav1.6 sodium channels in neurons. Pharmaceutical compositions comprising these compounds are also provided. Methods for prevention and treatment of neurological disorders, including, for example, seizures and seizure disorders, including Lennox-Gastaut Syndrome, Dravet syndrome, epileptic encephalopathies, autism, Familial hemiplegic migraine (FHM), anxiety disorders, including Post-traumatic stress disorder (PTSD), panic disorder and obsessive-compulsive disorder, neuropathic pain, and Rett syndrome by administration of these compounds are also provided.


Patent
Johns Hopkins University | Date: 2015-02-17

In one aspect, the present invention is directed to a multi-lumen catheter with a self-sealing hub and an attachable extension assembly. In a preferred aspect, the present invention allows removal of the external fluid connections of an elongated percutaneous medical article, such as a catheter or cannula. The percutaneous medical article suitably contains a distal septum that prevents fluid movement within the intraluminal space of the percutaneous medical article when the fluid connections are removed. A single-use, disposable extension set of one or more single-lumen lines with associated clamp and cap is attached for intraluminal access and removed following clinical use. Retaining the catheter transition allows the catheter to be secured using methods common to the art.


Patent
Johns Hopkins University | Date: 2015-02-17

The present invention describes a securement device that maintains proper placement of a percutaneous catheter and incorporates a universal fitting for the attachment of various, interchangeable, active and passive technologies. The securement device includes a unique catheter hub that enables attachment of active technology to provide diagnostic, therapeutic, and monitoring applications of physiologic, anatomic, and other clinically relevant properties or conditions. The securement device also includes a primary semi-flexible polymeric retention member (the base) positioned atop, or integrated with, a thin flexible adhesive pad. The adhesive pad has a first surface with an adhesive substrate and a second surface configured to receive the base. The hub is received within the base and a cap is used to secure the hub to the base.


Patent
Johns Hopkins University | Date: 2016-08-18

The identification of mutations that are present in a small fraction of DNA templates is essential for progress in several areas of biomedical research. Though massively parallel sequencing instruments are in principle well-suited to this task, the error rates in such instruments are generally too high to allow confident identification of rare variants. We here describe an approach that can substantially increase the sensitivity of massively parallel sequencing instruments for this purpose. One example of this approach, called Safe-SeqS (for Safe-Sequencing System), includes (i) assignment of a unique identifier (UID) to each template molecule; (ii) amplification of each uniquely tagged template molecule to create UID-families; and (iii) redundant sequencing of the amplification products. PCR fragments with the same UID are considered truly mutant (super-mutants) if 95% of them contain the identical mutation. We illustrate the utility of this approach for determining the fidelity of a polymerase, the accuracy of oligonucleotides synthesized in vitro, and the prevalence of mutations in the nuclear and mitochondrial genomes of normal cells.
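
The UID-family logic lends itself to a compact illustration. The sketch below groups reads by UID and applies the 95% concordance rule quoted above; the input format, variant notation and helper names are assumptions for illustration, not the patent's implementation.

```python
# Minimal sketch of the UID-family consensus step described in the abstract:
# group sequenced reads by their unique identifier (UID) and call a family a
# "super-mutant" only when >= 95% of its members report the identical mutation.

from collections import Counter, defaultdict

def call_supermutants(reads, threshold=0.95):
    """reads: iterable of (uid, observed_variant), where observed_variant is
    None for wild-type or e.g. 'chr1:1234A>T'.  Returns {uid: variant}."""
    families = defaultdict(list)
    for uid, variant in reads:
        families[uid].append(variant)
    calls = {}
    for uid, members in families.items():
        variant, count = Counter(members).most_common(1)[0]
        if variant is not None and count / len(members) >= threshold:
            calls[uid] = variant          # mutation is consistent across the family
    return calls

reads = [("AAT", "chr1:1234A>T")] * 20 + [("AAT", None)] + [("GGC", None)] * 10
print(call_supermutants(reads))   # {'AAT': 'chr1:1234A>T'}; 20/21 of the family agree
```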


Patent
Johns Hopkins University and Duke University | Date: 2016-11-16

We found mutations of the R132 residue of isocitrate dehydrogenase 1 (IDH1) in the majority of grade II and III astrocytomas and oligodendrogliomas as well as in glioblastomas that develop from these lower grade lesions. Those tumors without mutations in IDH1 often had mutations at the analogous R172 residue of the closely related IDH2 gene. These findings have important implications for the pathogenesis and diagnosis of malignant gliomas.


Patent
Johns Hopkins University | Date: 2015-02-19

A system for controlling gene expression in yeast comprises a repressible gene expression plasmid that has a regulator binding sequence for camR and a target gene sequence. The system also includes a transcription enhancer expression plasmid; wherein said transcriptional activator protein binds to the regulator binding sequence in the absence of a transcriptional inhibitor. The system is used in a method for controlling expression of the target gene through the use of camphor. The target gene is expressed in the absence of camphor but unexpressed if camphor is added to a solution of cells containing the plasmids.


Patent
Johns Hopkins University | Date: 2015-05-06

A device and methods of use thereof for isolating one or more microorganisms from a biological sample, the device comprising a polymeric surface having one or more cationic polymers covalently grafted thereto, wherein the one or more cationic polymers have a selective affinity for the one or more microorganisms.


Patent
Johns Hopkins University and Lieber Institute For Brain Development | Date: 2015-03-20

RNA polymerase I (Pol I) is a dedicated polymerase for the transcription of the 47S ribosomal RNA precursor, which is subsequently processed into the mature 5.8S, 18S and 28S ribosomal RNAs and assembled into ribosomes in the nucleolus. Pol I activity is commonly deregulated in human cancers. Based on the discovery of the lead molecule BMH-21, a series of pyridoquinazolinecarboxamides were synthesized as inhibitors of Pol I and activators of the destruction of RPA194, the Pol I large catalytic subunit protein. The present invention identifies a set of bioactive compounds, including purified stereoisomers, that potently cause RPA194 degradation and that function in a tightly constrained chemical space. Pharmaceutical compositions comprising these compounds and their uses in cancer and other Pol I-related diseases are also provided.


Rational design of immunotherapeutics relies on clear knowledge of the immunodominant epitopes of antigens. Current methods for identifying kinetically stable peptide-MHC complexes are in many cases inadequate for a number of reasons. Disclosed herein is a reductionistic system incorporating known participants of MHC class II antigen processing in solution to generate peptide pools from antigens, including those for which no immunodominant epitope has yet been identified, that are highly enriched for proteolytic fragments containing their immunodominant epitopes. HLA-DM-mediated editing contributes significantly to immunodominance and is exploited in discovering immunodominant epitopes from novel or previously uncharacterized antigens, particularly antigens associated with pathogens, tumors or autoimmune diseases.


Patent
Johns Hopkins University | Date: 2015-04-30

The present invention provides compositions comprising PAMAM dendrimers conjugated with one or more biologically active agents, and their use systemically to target activated microglia/macrophages in retina/choroid and generally, inflammatory and/or angiogenic diseases of the eye.


Patent
Johns Hopkins University | Date: 2016-10-11

Implantable pressure-actuated systems to deliver a drug and/or other substance in response to a pressure difference between a system cavity and an exterior environment, and methods of fabrication and use. A pressure-rupturable membrane diaphragm may be tuned to rupture at a desired rupture threshold, at a desired rupture site, with a desired rupture pattern, and/or within a desired rupture time. Tuning may include material selection, thickness control, surface patterning, and substrate support patterning. The cavity may be pressurized above or evacuated below the rupture threshold, and a diaphragm-protective layer may be provided to prevent premature rupture in an ambient environment and to dissipate within an implant environment. A drug delivery system may be implemented within a stent to release a substance upon a decrease in blood pressure. The cavity may include a thrombolytic drug or other substance to treat a blood clot.


Patent
Johns Hopkins University | Date: 2015-05-12

A synthetic gene delivery platform with a dense surface coating of hydrophilic and neutrally charged PEG, capable of rapid diffusion and widespread distribution in brain tissue, and highly effective gene delivery to target cells therein, has been developed. Nanoparticles including nucleic acids are formed of a blend of biocompatible hydrophilic cationic polymers and the hydrophilic cationic polymer conjugated to hydrophilic neutrally charged polymers such as polyethylene glycol. The nanoparticles are coated with polyethylene glycol at a density that imparts a near-neutral charge and optimizes rapid diffusion through the brain parenchyma. Methods of treating a disease or disorder of the brain, including administering a therapeutically effective amount of nanoparticles densely coated with polyethylene glycol, are also provided.


Patent
Johns Hopkins University | Date: 2016-12-12

Ophthalmic suture materials made from biocompatible and biodegradable polymers with high tensile strength for use in drug delivery, methods of making them, and methods of using them for ocular surgery and repair have been developed. The suture materials are made from a combination of a biodegradable, biocompatible polymer and a hydrophilic biocompatible polymer. In a preferred embodiment the suture materials are made from a poly(hydroxy acid) such as poly(L-lactic acid) and a polyalkylene oxide such as poly(ethylene glycol) or a polyalkylene oxide block copolymer. The sutures entrap (e.g., encapsulate) one or more therapeutic, prophylactic or diagnostic agents and provide prolonged release over a period of at least a week, preferably a month.


The present invention provides bivalent and multivalent ligands with a view to improving the affinity and pharmacokinetic properties of a urea class of PSMA inhibitors. The compounds and their synthesis can be generalized to multivalent compounds of other target antigens. Because they present multiple copies of the pharmacophore, multivalent ligands can bind to receptors with high avidity and affinity, thereby serving as powerful inhibitors. The modular multivalent scaffolds of the present invention, in one or more embodiments, contain a lysine-based dialkyne residue for incorporating two or more antigen-binding moieties, such as PSMA-binding Lys-Glu urea moieties, exploiting click chemistry, and one or more additional lysine residues for subsequent modification with imaging and/or therapeutic nuclides or cytotoxic ligands for tumor cell killing.


Patent
Johns Hopkins University | Date: 2015-02-19

The present invention relates to the field of wound healing. Specifically, the present invention provides compositions and methods for promoting skin regeneration, more specifically, the generation of de novo hair follicles. In one embodiment, a method for stimulating hair follicle neogenesis in a subject comprises the step of administering to the subject an effective amount of a TLR3 agonist. In certain embodiments, the TLR3 agonist is a double stranded RNA (dsRNA). The present invention is also directed to treating common male pattern hair loss.


The disclosed subject matter provides certain N-substituted hydroxylamine derivative compounds, pharmaceutical compositions and kits comprising such compounds, and methods of using such compounds or pharmaceutical compositions. In particular, the disclosed subject matter provides methods of using such compounds or pharmaceutical compositions for treating, preventing, or delaying the onset and/or development of a disease or condition. In some embodiments, the disease or condition is selected from cardiovascular diseases, ischemia, reperfusion injury, cancerous disease, pulmonary hypertension and conditions responsive to nitroxyl therapy.


Patent
Johns Hopkins University and Northwestern University | Date: 2015-05-06

Low-molecular-weight gadolinium (Gd)-based MR contrast agents for PSMA-specific T1-weighted MR imaging are disclosed. The Gd-based MR contrast agents exhibit high binding affinity for PSMA and exhibit specific T1 contrast enhancement at PSMA+ cells. The PSMA-targeted Gd-based MR contrast agents can be used for PSMA-targeted imaging in vivo. 86Y-labeled PSMA-binding ureas also are provided, wherein the PSMA-binding ureas also are suitable for use with other radiotherapeutics.


Patent
Johns Hopkins University | Date: 2016-04-07

The present invention provides a system and method for increasing construction site safety. The present invention reduces the risk of a construction or similar large vehicle or piece of mobile machinery hitting a construction worker. The system uses low-power wireless beacons embedded in a construction worker's hardhat or otherwise on the construction worker's person. The low-power wireless beacon interacts with sensor modules around the construction site, on construction vehicles, on construction equipment, or any other suitable placement known to or conceivable by one of skill in the art. The sensor modules send alert signals to a display accessible to a driver of the vehicle, and/or a foreman on the construction site. While the present invention is discussed herein in the context of construction safety, it should be noted that such a system can be applied to any situation where tracking and alert generation would be beneficial.


Patent
Johns Hopkins University | Date: 2016-08-30

An attachment device includes a robot-engaging portion having a recess formed in an outer surface thereof for receiving a finger of a robot. The attachment device also includes a tool-engaging portion coupled to the robot-engaging portion. The tool-engaging portion is configured to be coupled to a tool that is to be used by the robot to perform a task. A damping member is positioned at least partially between the robot-engaging portion and the tool-engaging portion. The damping member is configured to be adjusted to vary a magnitude of oscillations that are transferred from the tool-engaging portion to the robot-engaging portion.


Patent
Stichting Katholieke University and Johns Hopkins University | Date: 2016-11-30

The present invention relates, in general, to a prostate-specific antigen, PCA3. In particular, the present invention relates to nucleic acid molecules coding for the PCA3 protein; purified PCA3 proteins and polypeptides; recombinant nucleic acid molecules; cells containing the recombinant nucleic acid molecules; antibodies having binding affinity specifically to PCA3 proteins and polypeptides; hybridomas containing the antibodies; nucleic acid probes for the detection of nucleic acids encoding PCA3 proteins; a method of detecting nucleic acids encoding PCA3 proteins or polypeptides in a sample; kits containing nucleic acid probes or antibodies; bioassays using the nucleic acid sequence, protein or antibodies of this invention to diagnose, assess, or prognose a mammal afflicted with prostate cancer; therapeutic uses; and methods of preventing prostate cancer in an animal.


Patent
Johns Hopkins University | Date: 2016-09-29

The present invention relates to the field of biomarkers. More specifically, the present invention relates to assay reagents useful in detecting neurogranin. In a specific embodiment, the present invention provides an isolated antibody or fragment thereof that specifically binds to neurogranin. In another embodiment, the present invention provides a polynucleotide aptamer that specifically binds neurogranin.


Patent
Johns Hopkins University | Date: 2015-03-11

The present invention provides an in vitro directed evolution selection system for creating modified methyltransferases with improved specificity, and uses it to optimize and provide fusion proteins comprising a zinc finger methyltransferase derived from M.SssI. The resulting fusion proteins show increased target methylation specificity and greatly decreased non-target methylation compared to wild-type enzyme activity. Methods of use of such fusion proteins in both prokaryotic and eukaryotic cells are also provided.


A method for determining past exposure to chemical agents or heavy metals may include coating a capture material with a capture reagent. The capture reagent may be selected based on an ability of the capture reagent to bind with a target antibody, and the target antibody may be an indicator associated with a particular chemical agent or heavy metal. The method may further include interrogating a clinical sample associated with an individual by forming a mixture of the capture material and the clinical sample, and determining an exposure status of the individual to the particular chemical agent or heavy metal based on whether the capture material demonstrates capture of the indicator.


A method of assessing tissue vascular permeability for nanotherapeutics using non-labeled dextran can include: receiving a non-labeled, physiologically-tolerable dextran solution by a subject; acquiring a plurality of magnetic resonance images of a distribution of the dextran solution within at least one region of interest of the subject for a corresponding plurality of times; and assessing a tissue vascular permeability of the at least one region of interest to dextran particles in the dextran solution based on differences between the plurality of magnetic resonance images, wherein the dextran solution is a substantially mono-disperse solution of dextran particles of one size.


The present invention is in the area of pluripotent stem cells and more particularly deals with a method to differentiate a vascular network from stem cells.


Patent
Johns Hopkins University | Date: 2016-09-07

A detection system may include processing circuitry configured to receive synthetic aperture radar image data that has been or will be divided into a plurality of image tiles and perform initial screening to reject image tiles not having a threshold level of energy. The processing circuitry may be further configured to perform advanced screening to eliminate image tiles based on background noise to generate screened image tiles and generate a feature vector for an energy return of the screened image tiles. The processing circuitry may also be configured to determine a classification of a target associated with the feature vector.
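
A schematic of the screening order described above, with made-up thresholds and a toy feature set, might look like the sketch below; none of the specific criteria, feature choices or numbers are taken from the patent.

```python
# Illustrative sketch of the screening order in the abstract: reject tiles
# below an energy threshold, reject tiles whose peak does not stand out from
# background noise, then form a feature vector for the survivors.

import numpy as np

def screen_and_describe(tiles, energy_thresh=1.0, cfar_ratio=5.0):
    features = []
    for tile in tiles:                       # tile: 2-D array of pixel magnitudes
        energy = float(np.sum(tile ** 2))
        if energy < energy_thresh:           # initial screening on total energy
            continue
        background = float(np.median(tile))
        peak = float(tile.max())
        if peak < cfar_ratio * background:   # advanced screening vs. background noise
            continue
        features.append(np.array([energy, peak, peak / (background + 1e-12),
                                  float(tile.std())]))
    return features                          # feed these to a trained classifier

rng = np.random.default_rng(0)
noise = rng.rayleigh(0.1, size=(32, 32))
target = noise.copy(); target[16, 16] = 5.0  # one bright scatterer
print(len(screen_and_describe([noise, target])))  # only the target tile survives
```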


Patent
Johns Hopkins University and Oklahoma Medical Research Foundation | Date: 2016-09-07

The present invention relates to the field of inflammatory bowel disease. More specifically, the present invention relates to the use of cytokines to detect, diagnose, and assess inflammatory bowel disease. In one embodiment, a method for diagnosing Crohn's Disease (CD) in a patient comprises the steps of (a) collecting a sample from the patient; (b) measuring the levels of at least one cytokine in the sample collected from the patient; and (c) comparing the levels of the at least one cytokine with predefined cytokine levels, wherein a correlation between the cytokine levels in the patient sample and predefined cytokine levels indicates that the patient has CD. In a specific embodiment, the at least one cytokine comprises Interferon (IFN)-gamma, Interleukin (IL)-1beta, IL-6, IL-8, IL-12, IL-17 and CXCL10.


Patent
Johns Hopkins University | Date: 2015-11-19

An antenna is provided including an electromagnetic metasurface. The electromagnetic characteristics of the antenna are dynamically tunable.


A system for detecting and tracking a curvilinear object in a three-dimensional space includes an image acquisition system including a video camera arranged to acquire a video image of the curvilinear object and output a corresponding video signal, the video image comprising a plurality n of image frames, each at a respective time t_i, where i = 1, 2, ..., n; and a data processing system adapted to communicate with the image acquisition system to receive the video signal. The data processing system is configured to determine a position, orientation and shape of the curvilinear object in the three-dimensional space at each time t_i by forming a computational model of the curvilinear object at each time t_i such that a projection of the computational model of the curvilinear object at each time t_i onto a corresponding frame of the plurality of image frames of the video image matches a curvilinear image in the frame to a predetermined accuracy, to thereby detect and track the curvilinear object from time t_1 to time t_n.
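
The core fitting step, projecting a candidate 3-D model into an image frame and scoring the mismatch, can be sketched as follows. The pinhole camera model, the nearest-point error measure and the toy curve are illustrative assumptions, not the patented algorithm.

```python
# Sketch of the model-to-image matching step: project a candidate 3-D polyline
# through a pinhole camera and score how far the projection falls from points
# detected on the curvilinear object in that frame.

import numpy as np

def project(points_3d, focal=800.0, cx=320.0, cy=240.0):
    """Pinhole projection of Nx3 camera-frame points to Nx2 pixel coordinates."""
    X, Y, Z = points_3d[:, 0], points_3d[:, 1], points_3d[:, 2]
    return np.stack([focal * X / Z + cx, focal * Y / Z + cy], axis=1)

def reprojection_error(model_points_3d, detected_points_2d):
    """Mean distance from each detected image point to the nearest projected
    model point; minimising this over the model parameters fits the curve."""
    proj = project(model_points_3d)
    d = np.linalg.norm(detected_points_2d[:, None, :] - proj[None, :, :], axis=2)
    return float(d.min(axis=1).mean())

model = np.stack([np.linspace(-0.1, 0.1, 50),
                  0.02 * np.sin(np.linspace(0, np.pi, 50)),
                  np.full(50, 1.0)], axis=1)
print(reprojection_error(model, project(model)))   # 0.0 for a perfect match
```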


A serial digital data acquisition receiver (SDDAR) or system of receivers may include an opto-isolator assembly, sampling logic and a USB interface. Both a CLK signal and a DATA signal may each pass through the opto-isolator assembly upon receipt of the CLK and DATA signals at the SDDAR or system. The sampling logic may be operably coupled to the opto-isolator assembly and be configured to determine a point at which to sample the DATA signal based on state changes in the CLK signal. The USB interface may be operably coupled to the sampling logic and an output terminal. The USB interface may be configured to provide telemetry data for processing, display or recording at the output terminal, and may be configured to enable the SDDAR or system to be powered from the output terminal.
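
The sampling rule, latching DATA on CLK state changes, can be illustrated with a short sketch over already-digitized samples; treating the opto-isolated signals as clean 0/1 streams and latching only on rising edges are simplifying assumptions for illustration.

```python
# Sketch of "sample DATA on CLK state changes": walk digitized CLK/DATA
# streams and latch a DATA bit on each rising CLK edge.

def sample_on_rising_edge(clk, data):
    """clk, data: equal-length sequences of 0/1 samples.  Returns latched bits."""
    bits, prev = [], clk[0]
    for c, d in zip(clk[1:], data[1:]):
        if prev == 0 and c == 1:     # rising edge of the clock
            bits.append(d)
        prev = c
    return bits

clk  = [0, 1, 1, 0, 0, 1, 1, 0, 1]
data = [1, 1, 0, 0, 0, 0, 1, 1, 1]
print(sample_on_rising_edge(clk, data))   # [1, 0, 1]
```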


Patent
Johns Hopkins University | Date: 2015-05-15

A method, system and computer-readable medium of: providing feature data of at least one organ at risk or target volume of said patient from a database of prior patients' data stored as non-transitory data on a data storage device; generating, using a data processor, a distribution of dose points of the at least one organ at risk or target volume of said patient based on said feature data; calculating, using the data processor, at least one of (i) a probability of toxicity for the at least one organ at risk or (ii) a probability of treatment failure for the at least one target volume, based on said distribution of dose points; assessing, using the data processor, a dosimetric-outcome relationship based on the calculated probability; and automatically formulating, using the data processor, a treatment plan using the dosimetric-outcome relationship to minimize the at least one treatment-related risk.
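
As one illustration of turning a distribution of dose points into a toxicity probability, the sketch below reduces the dose points to a generalized equivalent uniform dose and feeds it to a logistic dose-response curve. This is a conventional NTCP-style choice, not the model claimed in the patent, and every parameter value is a placeholder.

```python
# One conventional way to map organ-at-risk dose points to a toxicity risk:
# a generalized equivalent uniform dose (gEUD) plugged into a logistic
# dose-response curve.  Parameters here are illustrative assumptions.

import numpy as np

def geud(dose_points, a=8.0):
    """Generalized EUD; large `a` emphasises hot spots (serial organs)."""
    d = np.asarray(dose_points, dtype=float)
    return (np.mean(d ** a)) ** (1.0 / a)

def toxicity_probability(dose_points, d50=30.0, gamma=2.0, a=8.0):
    """Logistic dose-response: 50% risk at d50 Gy, slope set by gamma."""
    eud = geud(dose_points, a)
    return 1.0 / (1.0 + (d50 / eud) ** (4.0 * gamma))

dose_points = np.random.default_rng(1).normal(22.0, 4.0, size=2000).clip(0)  # toy doses, Gy
print(f"estimated toxicity risk: {toxicity_probability(dose_points):.1%}")
```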


Patent
Johns Hopkins University | Date: 2016-09-02

A method for providing malware protection in connection with processing circuitry including hardware resources and software resources managed by a primary operating system may include providing a trusted operating system to control access to a portion of a local storage area of the hardware resources. In this context, only the trusted operating system is configured to enable writing to the portion of the local storage area. The method may further include storing backup files for the primary operating system in the portion of the local storage area responsive to the trusted operating system granting access to write to the portion of the local storage area.


An embodiment in accordance with the present invention includes a method for estimating the permeability of fractured rock formations from the analysis of a slow fluid pressure wave, which is generated by pressurization of a borehole. Wave propagation in the rock is recorded with TFI. Poroelastic theory is used to estimate the permeability from the measured wave speed. The present invention offers the opportunity of measuring the reservoir-scale permeability of fractured rock, because the method relies on imaging a wave, which propagates through a large rock volume, on the order of kilometers in size. Traditional methods yield permeability for much smaller rock volumes: well logging tools only measure permeability in the vicinity of a borehole. Pressure transient testing accesses larger rock volumes; however, these volumes are much smaller than for the proposed method, particularly in low-permeability rock formations.
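
The link between an imaged slow pressure wave and permeability can be sketched with a simple diffusion scaling. The front law L ≈ sqrt(4·D·t), the relation D = k·M/μ with M an effective poroelastic storage modulus, and every number below are illustrative assumptions, not the method's actual poroelastic inversion.

```python
# Rough scaling sketch of how a slow, diffusive pressure wave constrains
# permeability.  Assumptions (not from the text): the wave front obeys
# L ~ sqrt(4*D*t), and hydraulic diffusivity is D = k*M/mu.

def permeability_from_front(distance_m, travel_time_s,
                            viscosity=1.0e-3,         # Pa*s (water)
                            storage_modulus=1.0e10):  # Pa, effective M (assumed)
    diffusivity = distance_m ** 2 / (4.0 * travel_time_s)   # m^2/s
    return diffusivity * viscosity / storage_modulus        # permeability, m^2

k = permeability_from_front(distance_m=500.0, travel_time_s=3600.0)
print(f"k ~ {k:.2e} m^2 ({k / 9.87e-13:.1f} darcy)")
```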


Patent
Johns Hopkins University and Cardioxyl Pharmaceuticals | Date: 2017-02-01

The invention relates to N-hydroxysulfonamide derivatives that donate nitroxyl (HNO) under physiological conditions and are useful in treating and/or preventing the onset and/or development of diseases or conditions that are responsive to nitroxyl therapy, including heart failure and ischemia/reperfusion injury. Novel N-hydroxysulfonamide derivatives release HNO at a controlled rate under physiological conditions, and the rate of HNO release is modulated by varying the nature and location of functional groups on the N-hydroxysulfonamide derivatives.


Patent
Johns Hopkins University | Date: 2017-01-11

AnnexinA2 (ANXA2), a member of the Annexin family of calcium-dependent, phospholipid-binding proteins, is one of a panel of identified antigens recognized by the post-vaccination sera of patients who demonstrated prolonged disease-free survival following multiple vaccinations. AnnexinA2 is abundantly expressed in pancreatic adenocarcinomas, and cell surface/membrane AnnexinA2 increases with the progression from premalignant lesions to invasive pancreatic adenocarcinomas. The cytoplasmic to cell surface translocation of AnnexinA2 expression is critical for pancreatic cancer cell invasion. In addition, phosphorylation of AnnexinA2 at Tyrosine 23 is important for its localization to the cell surface and for the invasion of pancreatic cancer cells. Finally, loss of AnnexinA2 leads to the loss of the Epithelial-Mesenchymal Transition.


Patent
Seoul National University and Johns Hopkins University | Date: 2017-02-15

The present disclosure relates to a pharmaceutical composition for preventing or treating neurodegenerative diseases, the pharmaceutical composition including a graphene nanostructure as an active ingredient.


The invention provides methods and compositions for administration of allogeneic lymphocytes as an exogenous source of CD4+ T cell help for endogenous, tumor-reactive CD8+ T cells. Depletion of CD8+ T cells from the donor lymphocyte infusion reduces the risk of sustained engraftment and graft-versus-host disease. Removal of regulatory T cells from the infused population may augment the ability of non-regulatory T cells to provide help for endogenous effectors of anti-tumor immunity. Allogeneic T cell therapy is typically given in the context of allogeneic stem cell transplantation, in which the patient receives highly immunosuppressive conditioning followed by an infusion of a stem cell graft containing unselected populations of mature T cells. In the treatment described here, the graft is engineered to minimize the possibility of sustained donor cell engraftment, and the anti-tumor effector T cells derive from the host.


BACKGROUND: Evolocumab, a fully human monoclonal antibody to PCSK9, markedly reduces LDL-C across diverse patient populations. The objective of this study was to assess the safety and tolerability of evolocumab in a pooled safety analysis from phase 2 or 3 randomized and placebo or comparator-controlled trials (integrated parent trials) and the first year of open-label extension (OLE) trials that included a standard-of-care control group. METHODS: This analysis included adverse event (AE) data from 6026 patients in 12 phase 2 and 3 parent trials, with a median exposure of 2.8 months, and of those patients, from 4465 patients who continued with a median follow-up of 11.1 months in two OLE trials. Adverse events were analyzed separately for the parent and OLE trials. Overall AE rates, serious AEs (SAEs), laboratory assessments, and AEs of interest were evaluated. RESULTS: Overall AE rates were similar between evolocumab and control in the parent trials (51.1% vs 49.6%) and in Year 1 of OLE trials (70.0% vs 66.0%), as were those for SAEs. Elevations of serum transaminases, bilirubin and creatine kinase were infrequent and similar between groups. Muscle-related AEs were similar between evolocumab and control. Neurocognitive adverse events were infrequent and balanced during the double-blind parent studies (5 events [0.1%], evolocumab groups vs 6 events [0.3%], control groups). In the OLE trials, 27 patients (0.9%) in the evolocumab groups and 5 patients (0.3%) in the control groups reported neurocognitive AEs. No neutralizing anti-evolocumab antibodies were detected. CONCLUSIONS: Overall, this integrated safety analysis of 6026 patients pooled across phase 2/3 trials and 4465 patients who continued in open-label extension trials for 1 year supports a favorable benefit-risk profile for evolocumab. © 2017 by the American College of Cardiology Foundation and the American Heart Association, Inc.


OBJECTIVES: We sought to explore potential mechanisms underlying hospital sepsis case volume-mortality associations by investigating implementation of evidence-based processes of care. DESIGN: Retrospective cohort study. We determined associations of sepsis case volume with three evidence-based processes of care (lactate measurement during first hospital day, norepinephrine as first vasopressor, and avoidance of starch-based colloids) and assessed their role in mediation of case volume-mortality associations. SETTING: Enhanced administrative data (Premier, Charlotte, NC) from 534 U.S. hospitals. SUBJECTS: A total of 287,914 adult patients with sepsis present at admission between July 2010 and December 2012, of whom 58,045 received a vasopressor for septic shock during the first 2 days of hospitalization. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Among patients with sepsis, 1.9% received starch, and among patients with septic shock, 68.3% had lactate measured and 64% received norepinephrine as initial vasopressor. Patients at hospitals with the highest case volume were more likely to have lactate measured (adjusted odds ratio quartile 4 vs quartile 1, 2.8; 95% CI, 2.1–3.7) and receive norepinephrine as initial vasopressor (adjusted odds ratio quartile 4 vs quartile 1, 2.1; 95% CI, 1.6–2.7). Case volume was not associated with avoidance of starch products (adjusted odds ratio quartile 4 vs quartile 1, 0.73; 95% CI, 0.45–1.2). Adherence to evidence-based care was associated with lower hospital mortality (adjusted odds ratio, 0.81; 95% CI, 0.70–0.94) but did not strongly mediate case volume-mortality associations (point estimate change ≤ 2%). CONCLUSIONS: In a large cohort of U.S. patients with sepsis, select evidence-based processes of care were more likely implemented at high-volume hospitals but did not strongly mediate case volume-mortality associations. Considering processes and case volume when regionalizing sepsis care may maximize patient outcomes. Copyright © 2017 by the Society of Critical Care Medicine and Wolters Kluwer Health, Inc. All Rights Reserved.


Holland A., Johns Hopkins University
Journal of Cell Science | Year: 2017

Andrew received his first degree in natural sciences from the University of Cambridge and a Masters degree from the University of Manchester, followed by a PhD with Stephen Taylor in Manchester. He then moved to California in 2007 with an EMBO long-term fellowship for his postdoctoral research with Don Cleveland at the Ludwig Institute for Cancer Research. In 2013, Andrew started his own lab as an Assistant Professor in the Department of Molecular Biology and Genetics at the Johns Hopkins University School of Medicine, having been named a Kimmel Scholar and a Pew-Stewart Scholar in 2014. Andrew's lab investigates the mechanisms controlling centrosome copy numbers during cell division and the links between centrosome amplification, genome instability and tumorigenesis. © 2017. Published by The Company of Biologists Ltd.


Goutsias J., Johns Hopkins University
Nature Genetics | Year: 2017

Epigenetics is the study of biochemical modifications carrying information independent of DNA sequence, which are heritable through cell division. In 1940, Waddington coined the term “epigenetic landscape” as a metaphor for pluripotency and differentiation, but methylation landscapes have not yet been rigorously computed. Using principles from statistical physics and information theory, we derive epigenetic energy landscapes from whole-genome bisulfite sequencing (WGBS) data that enable us to quantify methylation stochasticity genome-wide using Shannon's entropy, associating it with chromatin structure. Moreover, we consider the Jensen–Shannon distance between sample-specific energy landscapes as a measure of epigenetic dissimilarity and demonstrate its effectiveness for discerning epigenetic differences. By viewing methylation maintenance as a communications system, we introduce methylation channels and show that higher-order chromatin organization can be predicted from their informational properties. Our results provide a fundamental understanding of the information-theoretic nature of the epigenome that leads to a powerful approach for studying its role in disease and aging. © 2017 Nature Publishing Group, a division of Macmillan Publishers Limited. All Rights Reserved.
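
The two information measures named here, Shannon's entropy and the Jensen–Shannon distance, are easy to state concretely. The sketch below applies them to toy per-region methylation distributions; the binning and data are invented for illustration and do not reproduce the paper's energy-landscape derivation from WGBS data.

```python
# Sketch of the two information measures named in the abstract, applied to a
# per-region distribution over discretised methylation levels.

import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))           # bits

def jensen_shannon_distance(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return float(np.sum(a[mask] * np.log2(a[mask] / b[mask])))
    jsd = 0.5 * kl(p, m) + 0.5 * kl(q, m)            # JS divergence, in bits
    return float(np.sqrt(jsd))                       # JS distance is its square root

sample_a = [0.70, 0.20, 0.10]   # region mostly at low methylation levels
sample_b = [0.10, 0.20, 0.70]   # region mostly at high methylation levels
print(shannon_entropy(sample_a), jensen_shannon_distance(sample_a, sample_b))
```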


Miao Y., Johns Hopkins University
Nature Cell Biology | Year: 2017

The diverse migratory modes displayed by different cell types are generally believed to be idiosyncratic. Here we show that the migratory behaviour of Dictyostelium was switched from amoeboid to keratocyte-like and oscillatory modes by synthetically decreasing phosphatidylinositol-4,5-bisphosphate levels or increasing Ras/Rap-related activities. The perturbations at these key nodes of an excitable signal transduction network initiated a causal chain of events: the threshold for network activation was lowered, the speed and range of propagating waves of signal transduction activity increased, actin-driven cellular protrusions expanded and, consequently, the cell migratory mode transitions ensued. Conversely, innately keratocyte-like and oscillatory cells were promptly converted to amoeboid by inhibition of Ras effectors with restoration of directed migration. We use computational analysis to explain how thresholds control cell migration and discuss the architecture of the signal transduction network that gives rise to excitability. © 2017 Nature Publishing Group


PTEN is a PIP3 phosphatase that antagonizes oncogenic PI3-kinase signalling. Due to its critical role in suppressing the potent signalling pathway, it is one of the most mutated tumour suppressors, especially in brain tumours. It is generally thought that PTEN deficiencies predominantly result from either loss of expression or enzymatic activity. By analysing PTEN in malignant glioblastoma primary cells derived from 16 of our patients, we report mutations that block localization of PTEN at the plasma membrane and nucleus without affecting lipid phosphatase activity. Cellular and biochemical analyses as well as structural modelling revealed that two mutations disrupt intramolecular interaction of PTEN and open its conformation, enhancing polyubiquitination of PTEN and decreasing protein stability. Moreover, promoting mono-ubiquitination increases protein stability and nuclear localization of mutant PTEN. Thus, our findings provide a molecular mechanism for cancer-associated PTEN defects and may lead to a brain cancer treatment that targets PTEN mono-ubiquitination. Oncogene advance online publication, 6 March 2017; doi:10.1038/onc.2016.493. © 2017 The Author(s)


Pouget P.,University Pierre and Marie Curie | Murthy A.,Indian Institute of Science | Stuphorn V.,Johns Hopkins University
Philosophical Transactions of the Royal Society B: Biological Sciences | Year: 2017

Voluntary behaviour requires control mechanisms that ensure our ability to act independently of habitual and innate response tendencies. Electrophysiological experiments, using the stop-signal task in humans, monkeys and rats, have uncovered a core network of brain structures that is essential for response inhibition. This network is shared across mammals and seems to be conserved throughout their evolution. Recently, new research building on these earlier findings has started to investigate the interaction between response inhibition and other control mechanisms in the brain. Here we describe recent progress in three different areas: selectivity of movement inhibition across different motor systems, re-orientation of motor actions and action evaluation. © 2017 The Author(s) Published by the Royal Society. All rights reserved.


ABSTRACT: Sharing of pre-exposure prophylaxis (PrEP) medications is a concern for PrEP implementation. For HIV-1 serodiscordant couples, sharing may undermine HIV-1 prevention benefits and could also cause antiretroviral resistance if the drugs are taken by HIV-1 infected partners. Within a PrEP efficacy trial among HIV-1 serodiscordant couples, we assessed the occurrence of PrEP sharing by self-report and by plasma tenofovir concentrations in HIV-1 infected partners. PrEP sharing was self-reported at <0.01% of visits; 0–1.6% of randomly selected specimens and 0% of purposively selected specimens from HIV-1 infected participants had detectable tenofovir concentrations (median: 66.5 ng/mL, range: 1.3–292 ng/mL). PrEP sharing within HIV-1 serodiscordant couples was extremely rare. This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial License 4.0 (CC BY-NC), under which it is permissible to download, share, remix, transform, and build upon the work provided it is properly cited. The work cannot be used commercially without permission from the journal. Copyright © 2017 Wolters Kluwer Health, Inc. All rights reserved.


Kasaie P.,Johns Hopkins University
Journal of Acquired Immune Deficiency Syndromes | Year: 2017

OBJECTIVES: Pre-exposure prophylaxis (PrEP) is recommended for preventing HIV infection among individuals at high risk, including men who have sex with men (MSM). Although its individual-level efficacy is proven, questions remain regarding the population-level impact of PrEP implementation. DESIGN: We developed an agent-based simulation of HIV transmission among MSM, accounting for demographics, the sexual contact network, HIV disease stage, and use of antiretroviral therapy. We use this framework to compare PrEP delivery strategies in terms of impact on HIV incidence and prevalence. RESULTS: The projected reduction in HIV incidence achievable with PrEP reflects both population-level coverage and individual-level adherence (as a proportion of days protected against HIV transmission). For example, provision of PrEP to 40% of HIV-negative MSM reporting more than one sexual partner in the last 12 months, taken with sufficient adherence to provide protection on 40% of days, can reduce HIV incidence by 9.5% (95% uncertainty range: 8–11%) within five years. However, if this could be increased to 80% coverage on 80% of days (e.g., through mass campaigns with a long-acting injectable formulation), a 43% (42–44%) reduction in HIV incidence could be achieved. Delivering PrEP to MSM at high risk for HIV acquisition can augment population-level impact by up to 1.8-fold. CONCLUSIONS: If highly ambitious targets for coverage and adherence can be achieved, PrEP can substantially reduce HIV incidence in the short term. While the reduction in HIV incidence largely reflects the proportion of person-years protected, the efficiency of PrEP delivery can be enhanced by targeting high-risk populations. Copyright © 2017 Wolters Kluwer Health, Inc. All rights reserved.
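The authors' calibrated agent-based model of an MSM sexual-contact network is not available here; the toy stochastic simulation below, with invented parameter values, only illustrates how population-level coverage and individual-level adherence jointly determine the reduction in new infections over five years.

```python
# Toy stochastic transmission model with invented parameters, NOT the calibrated
# agent-based MSM model from the paper. It illustrates how coverage (fraction of the
# susceptible population on PrEP) and adherence (fraction of days protected) jointly
# shape the number of new infections.
import random

def simulate(coverage, adherence, n=2000, init_prev=0.15, beta=0.001,
             contact_prob=0.5, days=5 * 365, efficacy=0.95, seed=1):
    random.seed(seed)
    infected = [random.random() < init_prev for _ in range(n)]
    on_prep = [(not infected[i]) and random.random() < coverage for i in range(n)]
    new_infections = 0
    for _ in range(days):
        for i in range(n):
            if infected[i] or random.random() > contact_prob:
                continue                      # not susceptible, or no contact today
            partner = random.randrange(n)
            if not infected[partner]:
                continue
            p = beta                          # per-contact transmission probability
            if on_prep[i] and random.random() < adherence:
                p *= 1.0 - efficacy           # protected on this particular day
            if random.random() < p:
                infected[i] = True
                new_infections += 1
    return new_infections

base = simulate(coverage=0.0, adherence=0.0)
print("baseline infections over 5 years:", base)
for cov, adh in [(0.4, 0.4), (0.8, 0.8)]:
    cut = 1 - simulate(cov, adh) / base
    print(f"coverage {cov:.0%}, adherence {adh:.0%}: infections reduced by {cut:.0%}")
```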


OBJECTIVE: To describe accurately the pattern, timing, and predictors of disease recurrence after a potentially curative resection for pancreatic ductal adenocarcinoma (PDAC). SUMMARY BACKGROUND DATA: After surgery for PDAC, most patients will develop disease recurrence. Understanding the patterns and timing of disease failure can help guide improvements in therapy. METHODS: Patients who underwent pancreatectomy for PDAC at the Johns Hopkins Hospital between 2000 and 2010 were included. Exclusion criteria were incomplete follow-up records, follow-up <24 months, and neoadjuvant therapy. The first recurrence site was recorded and recurrence-free survival (RFS) was estimated using Kaplan–Meier curves. Predictive factors for specific recurrence patterns were assessed by univariate and multivariate analyses using Cox proportional hazards regression models. RESULTS: From the identified cohort of 1103 patients, 692 patients had comprehensive and detailed follow-up data available. At a median follow-up of 25.3 months, 531 (76.7%) of the 692 had recurred after a median RFS of 11.7 months. Most patients recurred at isolated distant sites (n = 307, 57.8%), while isolated local recurrence was seen in 126 patients (23.7%). Liver-only recurrence (n = 134, 25.2%) tended to occur early (median 6.9 mo), while lung-only recurrence (n = 78, 14.7%) occurred later (median 18.6 mo). A positive lymph node ratio >0.2 was a strong predictor for all distant disease recurrence. Patients receiving adjuvant chemotherapy or chemoradiotherapy had fewer recurrences and a longer RFS of 18.0 and 17.2 months, respectively. CONCLUSIONS: Specific recurrence locations have different predictive factors and possess distinct RFS curves, supporting the hypothesis that unique biological differences exist among tumors leading to distinct patterns of recurrence. Copyright © 2017 Wolters Kluwer Health, Inc. All rights reserved.
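The kind of analysis described in the methods can be sketched with the lifelines package on simulated data; the covariates, effect sizes, and follow-up times below are invented and are not the Hopkins cohort.

```python
# Sketch of the analyses described (Kaplan-Meier RFS estimation and a Cox proportional
# hazards model) using the `lifelines` package on simulated data -- the covariates,
# effect sizes, and follow-up times are invented, not the actual patient cohort.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(0)
n = 200
ln_ratio_gt_02 = rng.integers(0, 2, n)     # lymph node ratio > 0.2 (1 = yes)
adjuvant_tx = rng.integers(0, 2, n)        # adjuvant chemo/chemoradiation (1 = yes)

# Simulated months to recurrence, with worse outcomes for a high node ratio and
# better outcomes with adjuvant therapy (illustrative effect sizes only).
hazard = 0.05 * np.exp(0.8 * ln_ratio_gt_02 - 0.5 * adjuvant_tx)
event_time = rng.exponential(1.0 / hazard)
censor_time = rng.uniform(24, 60, n)       # follow-up ends between 24 and 60 months
df = pd.DataFrame({
    "rfs_months": np.minimum(event_time, censor_time),
    "recurred": (event_time <= censor_time).astype(int),
    "ln_ratio_gt_02": ln_ratio_gt_02,
    "adjuvant_tx": adjuvant_tx,
})

kmf = KaplanMeierFitter().fit(df["rfs_months"], event_observed=df["recurred"])
print("median RFS (months):", kmf.median_survival_time_)

cph = CoxPHFitter().fit(df, duration_col="rfs_months", event_col="recurred")
cph.print_summary()                         # hazard ratios for each covariate
```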


El-Diwany R.,Johns Hopkins University
AIDS | Year: 2017

OBJECTIVE: To assess whether the reduction in HIV-1 RNA in CD4+ T cells is correlated with the persistence of immune activation following early antiretroviral therapy (ART). DESIGN: Clinical trial (NCT01285050). METHODS: Next-generation sequencing was used to study total RNA from activated CD4+ T cells (CD38 and HLA-DR expressing) collected from 19 treatment-naïve HIV-1/HCV infected patients before and early after ART initiation (≥12 weeks after plasma HIV RNA < 50 c/ml). To validate comparisons, pre- and post-ART measures were adjusted for input RNA and overall read number. RESULTS: As expected, ART use was associated with a median (IQR) 4.3 (2.2–8.3) reduction in the proportion of activated CD4+ T cells (P = 0.0008). Whereas in those activated CD4+ T cells no consistent differences in overall gene expression were detected, interferon-stimulated gene expression declined (P


Tahsili-Fahadan P.,Johns Hopkins University | Geocadin R.G.,Johns Hopkins University
Circulation Research | Year: 2017

A complex interaction exists between the nervous and cardiovascular systems. A large network of cortical and subcortical brain regions control cardiovascular function via the sympathetic and parasympathetic outflow. A dysfunction in one system may lead to changes in the function of the other. The effects of cardiovascular disease on the nervous system have been widely studied; however, our understanding of the effects of neurological disorders on the cardiovascular system has only expanded in the past 2 decades. Various pathologies of the nervous system can lead to a wide range of alterations in function and structure of the cardiovascular system ranging from transient and benign electrographic changes to myocardial injury, cardiomyopathy, and even cardiac death. In this article, we first review the anatomy and physiology of the central and autonomic nervous systems in regard to control of the cardiovascular function. The effects of neurological injury on cardiac function and structure will be summarized, and finally, we review neurological disorders commonly associated with cardiovascular manifestations. © 2017 American Heart Association, Inc.


Beer M.A.,Johns Hopkins University
Human Mutation | Year: 2017

We participated in the Critical Assessment of Genome Interpretation eQTL challenge to further test computational models of regulatory variant impact and their association with human disease. Our prediction model is based on a discriminative gapped-kmer SVM (gkm-SVM) trained on genome-wide chromatin accessibility data in the cell type of interest. The comparisons with massively parallel reporter assays (MPRA) in lymphoblasts show that gkm-SVM is among the most accurate prediction models even though all other models used the MPRA data for model training, and gkm-SVM did not. In addition, we compare gkm-SVM with other MPRA datasets and show that gkm-SVM is a reliable predictor of expression and that deltaSVM is a reliable predictor of variant impact in K562 cells and mouse retina. We further show that DHS (DNase-I hypersensitive sites) and ATAC-seq (assay for transposase-accessible chromatin using sequencing) data are equally predictive substrates for training gkm-SVM, and that DHS regions flanked by H3K27Ac and H3K4me1 marks are more predictive than DHS regions alone. © 2017 Wiley Periodicals, Inc.
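The published model uses a gapped k-mer string kernel (implemented in the lsgkm/gkmSVM software); the simplified sketch below conveys the underlying idea with plain k-mer counts and a linear SVM in scikit-learn, on toy sequences rather than real DHS or ATAC-seq peaks.

```python
# Simplified illustration of the idea behind a k-mer sequence SVM. The published gkm-SVM
# uses a gapped k-mer string kernel (lsgkm / gkmSVM software); here we use plain k-mer
# counts with scikit-learn, and toy sequences instead of real chromatin-accessibility peaks.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Toy "accessible" sequences contain a GC-rich motif; "inaccessible" ones do not.
positives = ["ACGCGCGTTA", "TTGCGCGCAA", "AGGCGCGTAT", "CCGCGCGATA"]
negatives = ["ATATATATTA", "TTATTATAAA", "AATTATATAT", "CATATTATAA"]
X = positives + negatives
y = [1] * len(positives) + [0] * len(negatives)

# Character 5-mers as features (a stand-in for gapped k-mers).
model = make_pipeline(
    CountVectorizer(analyzer="char", ngram_range=(5, 5), lowercase=False),
    LinearSVC(),
)
model.fit(X, y)
print(model.predict(["TTGCGCGTAA", "ATTATATATA"]))   # expect [1, 0]
```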


News Article | April 17, 2017
Site: www.newscientist.com

Enceladus’ south pole is wounded, bleeding heat and water. Its injury may have come from a huge rock smashing into this frigid moon of Saturn less than 100 million years ago, leaving the area riddled with leaky cracks. The region near Enceladus’ south pole marks one of the solar system’s most intriguing mysteries. It spews plumes of liquid from an interior ocean, plus an enormous amount of heat. The south pole’s heat emission is about 10 gigawatts higher than expected – equivalent to the power of 4000 wind turbines running at full capacity. The rest of the moon, though, is cold and relatively homogeneous. “We don’t have a really good explanation for why all this activity is so concentrated,” says John Spencer at the Southwest Research Institute in Colorado. On most icy moons and ocean worlds, the main heat source is the tide: the worlds are stretched by the gravity of their parent bodies and neighbours, which causes internal heat. “If it was heated by tides, the north and south should look the same,” says Angela Stickle at Johns Hopkins University in Maryland. “So the fact that Enceladus’s south has these crazy regions with plumes and heat flows is enigmatic.” Stickle and her colleague James Roberts used computer simulations to see if the enigma could be explained by a giant impact in the past. They found that Enceladus’s strange appearance could be explained by a blow large enough to cause huge cracks in the ice. This kind of collision would leave the south pole warm and weakened, they told the Lunar and Planetary Science Conference in Texas on 21 March. “An impact could set up the conditions to form a terrain like what we see now,” says Stickle. To have the desired effect, the blow would have had to be powerful enough to punch right through the 20-kilometre-thick ice covering the ocean, but we wouldn’t see a crater today because the ice would begin to refreeze immediately. By an hour after the impact, the exposed liquid could freeze to a depth of 10 centimetres, starting to rebuild the ice shell. “It heals quickly, but it’ll leave a scar,” says Roberts. Such an impact would deposit energy into Enceladus’ surface, heating and softening the surrounding ice. It would also cause a shock wave and seismic activity that could rip open the shell. The impact didn’t even have to happen exactly at the south pole. Because the dent would cause a dip in the local gravity, the moon would rotate so that the crater or hole would gradually migrate toward the pole. “The impact could have happened anywhere and then Enceladus would have rolled over until the impact point ended up at whichever pole happened to be nearest,” says Francis Nimmo at the University of California Santa Cruz. The patch of weaker ice, riddled with fractures, ends up at the south pole, leaking water and excess heat while it slowly re-freezes, recovering from its ordeal.
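A back-of-envelope Stefan-type estimate, with standard ice properties and an assumed temperature drop of roughly 200 K across the new crust, reproduces the quoted 10-centimetre figure; this is a sanity check, not the researchers' impact simulation.

```python
# Back-of-envelope check (NOT the researchers' simulation): a Stefan-type estimate of how
# thick an ice crust can grow over exposed water in about an hour, using standard ice
# properties and an assumed ~200 K drop between the ocean and Enceladus' cold surface.
k_ice = 2.2        # thermal conductivity of ice, W/(m*K)
rho_ice = 920.0    # density of ice, kg/m^3
L_fus = 3.34e5     # latent heat of fusion of water, J/kg
dT = 200.0         # assumed temperature difference across the new crust, K
t = 3600.0         # one hour, in seconds

# Classic Stefan-problem scaling: crust thickness d ~ sqrt(2 * k * dT * t / (rho * L)).
d = (2.0 * k_ice * dT * t / (rho_ice * L_fus)) ** 0.5
print(f"ice crust after one hour: ~{d * 100:.0f} cm")   # roughly 10 cm
```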


News Article | March 19, 2017
Site: www.techtimes.com

The Solar System currently has eight recognized planets, but before 2006, when Pluto was demoted to a dwarf planet, schoolchildren were taught that there were nine planets in our star system. Now, as a group of scientists proposes a new way to classify planets, our celestial neighborhood may have more than 100 objects that could be called planets. The International Astronomical Union (IAU) changed the definition of a planet in 2006. Pluto's demotion is attributed to discoveries showing that it is actually a Kuiper Belt Object, or KBO. Much of the Kuiper Belt was still unknown when Pluto was discovered in 1930 and included as the ninth planet of the Solar System. Skeptics, however, eventually emerged after the discovery of several objects in the Kuiper Belt whose sizes are comparable to Pluto's, pointing out the existence of bigger objects in Pluto's surroundings. The discovery of Eris, a dwarf planet 27 percent more massive than Pluto, led the IAU to come up with a formal definition of a planet in 2006 during its 26th General Assembly. One of the IAU's new criteria for defining a full planet requires that the body has cleared the neighborhood around its orbit. "A celestial body that (a) is in orbit around the Sun, (b) has sufficient mass for its self-gravity to overcome rigid body forces so that it assumes a hydrostatic equilibrium (nearly round) shape, and (c) has cleared the neighbourhood around its orbit," reads the IAU definition. Because part of this formula requires that a planet and its natural satellites move alone through their orbit, Pluto is not classified as a planet and is now considered a mere dwarf planet. Kirby Runyon of Johns Hopkins University and colleagues proposed that the factors defining whether or not a celestial object is a planet should depend on the body itself and not on external considerations such as its location. Runyon said that no planet has actually fully cleared its orbit; Jupiter, the largest planet in the Solar System, for instance, shares its orbit with clouds of asteroids. Runyon and colleagues defined a planet as "a sub-stellar mass body that has never undergone nuclear fusion." The object also needs to have enough gravitational heft to retain a roughly round shape. The group's definition differs from the IAU's in that it makes no reference to the surroundings of the celestial body. Under the team's definition, Jupiter's moon Europa and even Earth's moon would be classified as planets. Both moons are larger than Pluto, which Runyon and colleagues claim is no less of a planet than Mars, Jupiter, Neptune, or Earth. Although adoption of the team's version of the planetary definition is likely a long shot, its official acceptance could significantly increase the number of recognized planets in the Solar System from eight to about 110. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


News Article | April 7, 2017
Site: www.techtimes.com

NASA’s New Horizons probe has reached the halfway point between Pluto and its second flyby target, a remote Kuiper Belt object known as 2014 MU69. New Horizons reached this milestone at midnight UTC on April 3, at a distance of 486.19 million miles from Pluto and the same distance from the distant asteroid. New Horizons is targeting a flyby of MU69, which is almost 1 billion miles beyond Pluto, on Jan. 1, 2019. This will mark another record for space exploration. “That flyby will set the record for the most distant world ever explored in the history of civilization,” noted principal investigator Alan Stern of the Southwest Research Institute in a statement, dubbing it “fantastic” to have already accomplished half the journey to the next flyby. The probe’s Long Range Reconnaissance Imager (LORRI) will start to observe MU69 in September. And while it continues to zoom along, the spacecraft is slowing slightly as it gets more distant from the sun. Even so, it is still speeding through the Kuiper Belt at around 32,000 miles per hour. Later in the week, New Horizons will enter a five-month hibernation, its first since December 2014, as it faces 466 million more miles of its mission. Given the groundbreaking Pluto flyby and the 16-month transmission of data obtained from Pluto, the spacecraft had to stay “awake” for over 2.5 years. While awake, New Horizons’ instruments also observed 12 Kuiper Belt objects, studied dust as well as charged particles in the solar system’s twilight zone, and evaluated hydrogen gas in the heliosphere, the vast region surrounding the sun that is filled by the solar wind and the sun’s magnetic influence. The New Horizons probe, the first spacecraft from Earth to visit Pluto, flew past the dwarf planet in July 2015 after launching from Cape Canaveral in Florida in January 2006. At present, it is 3.5 billion miles from our planet, and it takes radio signals five hours and 20 minutes to get from the Johns Hopkins University control center in Maryland to the spacecraft. The discoveries made by New Horizons, in the form of images and space environment data, have enhanced scientists’ knowledge of the Pluto system and offer indications of what can be expected from the Kuiper Belt. They hold particular value because Pluto and its largest moon, Charon, are considered ice dwarfs distinguished by solid surfaces. Researchers have long speculated about whether the great ball of ice could also be capable of hosting life. Among New Horizons’ wealth of findings, Stern has several favorites: atmospheric hazes and a lower rate of atmospheric escape that demolished previous flyby models; indications of an internal water-ice ocean; and the gradual demystifying of the dark, red polar cap of Charon. The probe also imaged a weird, snakeskin-like terrain on Pluto: icy ridges around 1,650 feet tall that resemble Earth’s “penitentes,” blade-like snow and ice formations found in cold, high-altitude regions. Apart from these Earth-like icy ridges, the dwarf planet has also been found to host exotic icy mountains, a blue sky, and an actively evolving surface. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.
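The quoted signal delay follows from simple arithmetic; the sketch below uses the article's rounded 3.5-billion-mile figure and the speed of light, and only those two inputs.

```python
# Quick arithmetic behind the quoted one-way signal delay. At the article's rounded
# figure of 3.5 billion miles the delay is about 5 h 13 min; the quoted 5 h 20 min
# corresponds to roughly 3.58 billion miles, so the two figures agree once rounding
# is taken into account.
miles_to_m = 1609.344
distance_m = 3.5e9 * miles_to_m        # ~3.5 billion miles, as stated in the article
c = 299_792_458.0                      # speed of light, m/s
delay_s = distance_m / c
print(f"one-way light time: {delay_s / 3600:.0f} h {delay_s % 3600 / 60:.0f} min")
```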


News Article | April 24, 2017
Site: news.mit.edu

The unpredictable annual flow of the Nile River is legendary, as evidenced by the story of Joseph and the Pharaoh, whose dream foretold seven years of abundance followed by seven years of famine in a land whose agriculture was, and still is, utterly dependent on that flow. Now, researchers at MIT have found that climate change may drastically increase the variability in the Nile’s annual output. Being able to predict the amount of flow variability, and even to forecast likely years of reduced flow, will become ever more important as the population of the Nile River basin, primarily in Egypt, Sudan, and Ethiopia, is expected to double by 2050, reaching nearly 1 billion. The new study, based on a variety of global climate models and records of rainfall and flow rates over the last half-century, projects an increase of 50 percent in the amount of flow variation from year to year. The study, published in the journal Nature Climate Change, was carried out by professor of civil and environmental engineering Elfatih Eltahir and postdoc Mohamed Siam. They found that as a result of a warming climate, there will be an increase in the intensity and duration of the Pacific Ocean phenomenon known as the El Niño/La Niña cycle, which they had previously shown is strongly connected to annual rainfall variations in the Ethiopian highlands and adjacent eastern Nile basins. These regions are the primary sources of the Nile’s waters, accounting for some 80 percent of the river’s total flow. The cycle of the Nile’s floods has been “of interest to human civilization for millennia,” says Eltahir, the Breene M. Kerr Professor of Hydrology and Climate. Originally, the correlation he showed between the El Niño/La Niña cycle and Ethiopian rainfall had been aimed at helping with seasonal and short-term predictions of the river’s flow, for planning storage and releases from the river’s many dams and reservoirs. The new analysis is expected to provide useful information for much longer-term strategies for placement and operation of new and existing dams, including Africa’s largest, the Grand Ethiopian Renaissance Dam, now under construction near the Ethiopia-Sudan border. While there has been controversy about that dam, and especially about how the filling of its reservoir will be coordinated with downstream nations, Eltahir says this study points to the importance of focusing on the potential impacts of climate change and rapid population growth as the most significant drivers of environmental change in the Nile basin. “We think that climate change is pointing to the need for more storage capacity in the future,” he says. “The real issues facing the Nile are bigger than that one controversy surrounding that dam.” Using a variety of global circulation models under “business as usual” scenarios, assuming that major reductions in greenhouse gas emissions do not take place, the study finds that the changing rainfall patterns would likely lead to an average increase in the Nile’s annual flow of 10 to 15 percent. That is, it would grow from its present 80 cubic kilometers per year to about 92 or more cubic kilometers per year averaged over the 21st century, compared to the 20th century average. The findings also suggest that there will be substantially fewer “normal” years, with flows between 70 and 100 cubic kilometers per year. There will also be many more extreme years with flows greater than 100 cubic kilometers per year, and more years of drought.
(Statistically, variability is measured as the standard deviation of the annual flow rates; it is this figure that is projected to rise by 50 percent.) The pattern has in fact played out over the last two years — 2015, an intense El Niño year, saw drought conditions in the Nile basin, while the La Niña year of 2016 saw high flooding. “It’s not abstract,” Eltahir says. “This is happening now.” As with Joseph’s advice to Pharaoh, the knowledge of such likely changes can help planners to be prepared, in this case by storing water in huge reservoirs to be released when it is really needed. "Too often we focus on how climate change might influence average conditions, to the exclusion of thinking about variability," says Ben Zaitchik, an associate professor of earth and planetary sciences at Johns Hopkins University, who was not involved in this work. "That can be a real problem for a place like the Eastern Nile basin, where average rainfall and streamflow might increase with climate change, suggesting that water will be plentiful. But if variability increases as well, then there could be as frequent or more frequent stress events, and significant planning — in infrastructure or management strategies — might be required to ensure water security." Already, Eltahir’s earlier work on the El Niño/La Niña correlation with Nile flow is making an impact. "It's used operationally in the region now in issuing seasonal flood forecasts, with a significant lead time that gives water resources engineers enough time to react. Before, you had no idea," he says, adding that he hopes the new information will enable even better long-term planning. "By this work, we at least reduce some of the uncertainty."


News Article | March 28, 2017
Site: www.techtimes.com

There is a new focus at NASA on small satellite missions as forerunners for larger missions across the Solar System. The NASA program Planetary Science Deep Space SmallSat Studies gives small-satellite projects a window to study the Solar System's celestial bodies. In the latest step, NASA has awarded $3.6 million to ten projects for concept planning, with roll-out expected in a few months. Generally, small satellites weigh less than 400 pounds. Among the 10 projects selected, two are Venus-centric, with a focus on noble gases and isotopes. One CubeSat project will look at ultraviolet absorption and the atmosphere's nightglow emissions. NASA's Goddard Space Flight Center will be sending a 12-unit CubeSat to investigate the hydrogen cycle of the moon. The small satellite from Johns Hopkins University will target an asteroid with a seismometer to examine its surface and interior. Another CubeSat from Purdue University will image Phobos and Deimos — the Martian moons. NASA Ames will deploy a CubeSat to Mars focusing on climate studies. Hampton University's probe will focus on Uranus and its atmosphere. The magnetosphere of Jupiter will be the core area of investigation for the project of the Southwest Research Institute. Basically, SmallSats handle the delivery of preliminary data for upcoming bigger projects. The cost of launching SmallSats is also modest. "These small but mighty satellites have the potential to enable transformational science. They guide NASA's development of small spacecraft technologies for deep space science investigation," noted Jim Green, director of the Planetary Science Division at NASA Headquarters. Green added that the agency is investing in SmallSats after being convinced of their utility for cutting-edge scientific investigations. SmallSats offer a range of merits, such as deployment from larger spacecraft for target-specific investigations that support main missions. NASA's Mars mission will use this approach, dispatching two small satellites to return advance data. NASA is also buoyed by the 2016 report by the US National Academies, which said SmallSat technology has come of age and can provide high-value science. There are many cost benefits from the use of SmallSats. They also offer the flexibility to operate in constellations. "What we're seeing is a capability that we haven't really seen before in terms of small satellites that can do pretty good science at a much-reduced cost compared to the big missions," said Steve Mackwell from the Universities Space Research Association (USRA) in Maryland. Mackwell said miniaturization helps in deploying SmallSats where larger missions had previously been contemplated, offering an unprecedented opportunity to explore inner Solar System bodies like Venus and the Moon. Green noted that miniature satellites had posed challenges in the past, such as difficulties with power and communication. Mackwell, however, points out that this has changed and that critical advances have been made in their functioning. An example is compact propulsion systems that allow them to hitch a ride and then maneuver to their ultimate destination. Also, innovations have emerged to incorporate solar panels into SmallSats to boost their capabilities. More progress is being made on the technology front; for example, engineers at NASA's Glenn Research Center have demonstrated printed electronics suitable for operating in the harsh conditions at Venus. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


News Article | April 25, 2017
Site: www.eurekalert.org

Working with human brain tissue samples and genetically engineered mice, Johns Hopkins Medicine researchers together with colleagues at the National Institutes of Health, the University of California San Diego Shiley-Marcos Alzheimer's Disease Research Center, Columbia University, and the Institute for Basic Research in Staten Island say that consequences of low levels of the protein NPTX2 in the brains of people with Alzheimer's disease (AD) may change the pattern of neural activity in ways that lead to the learning and memory loss that are hallmarks of the disease. This discovery, described online in the April 25 edition of eLife, will lead to important research and may one day help experts develop new and better therapies for Alzheimer's and other forms of cognitive decline. AD currently affects more than five million Americans. Clumps of proteins called amyloid plaques, long seen in the brains of people with AD, are often blamed for the mental decline associated with the disease. But autopsies and brain imaging studies reveal that people can have high levels of amyloid without displaying symptoms of AD, calling into question a direct link between amyloid and dementia. This new study shows that when the protein NPTX2 is "turned down" at the same time that amyloid is accumulating in the brain, circuit adaptations that are essential for neurons to "speak in unison" are disrupted, resulting in a failure of memory. "These findings represent something extraordinarily interesting about how cognition fails in human Alzheimer's disease," says Paul Worley, M.D., a neuroscientist at the Johns Hopkins University School of Medicine and the paper's senior author. "The key point here is that it's the combination of amyloid and low NPTX2 that leads to cognitive failure." Since the 1990s, Worley's group has been studying a set of genes known as "immediate early genes," so called because they're activated almost instantly in brain cells when people and other animals have an experience that results in a new memory. The gene NPTX2 is one of these immediate early genes that gets activated and makes a protein that neurons use to strengthen "circuits" in the brain. "Those connections are essential for the brain to establish synchronized groups of 'circuits' in response to experiences," says Worley. "Without them, neuronal activation cannot be effectively synchronized and the brain cannot process information." Worley says he was intrigued by previous studies indicating altered patterns of activity in brains of individuals with Alzheimer's. Worley's group wondered whether altered activity was linked to changes in immediate early gene function. To get answers, the researchers first turned to a library of 144 archived human brain tissue samples to measure levels of the protein encoded by the NPTX2 gene. NPTX2 protein levels, they discovered, were reduced by as much as 90 percent in brain samples from people with AD compared with age-matched brain samples without AD. By contrast, people with amyloid plaques who had never shown signs of AD had normal levels of NPTX2. This was an initial suggestion of a link between NPTX2 and cognition. Prior studies had shown NPTX2 to play an essential role for developmental brain wiring and for resistance to experimental epilepsy. To study how lower-than-normal levels of NPTX2 might be related to the cognitive dysfunction of AD, Worley and his collaborators examined mice bred without the rodent equivalent of the NPTX2 gene. 
Tests showed that a lack of NPTX2 alone wasn't enough to affect cell function as tested in brain slices. The researchers then added to the mice a gene that increases amyloid generation in the brain. In brain slices from mice with both amyloid and no NPTX2, fast-spiking interneurons could not control brain "rhythms" important for making new memories. Moreover, a glutamate receptor that is normally expressed in interneurons and essential for interneuron function was down-regulated as a consequence of amyloid and NPTX2 deletion in mice and similarly reduced in human AD brain. Worley says the results suggest that the increased activity seen in the brains of AD patients is due to low NPTX2, combined with amyloid plaques, with consequent disruption of interneuron function. And if the effect of NPTX2 and amyloid is synergistic -- one depending on the other for the effect -- it would explain why not all people with high levels of brain amyloid show signs of AD. The team then examined NPTX2 protein in the cerebrospinal fluid (CSF) of 60 living AD patients and 72 people without AD. Lower scores of memory and cognition on standard AD tests, they found, were associated with lower levels of NPTX2 in the CSF. Moreover, NPTX2 correlated with measures of the size of the hippocampus, a brain region essential for memory that shrinks in AD. In this patient population, NPTX2 levels were more closely correlated with cognitive performance than current best biomarkers -- including tau, a biomarker of neurodegenerative diseases, and a biomarker known as A-beta-42, which has long been associated with AD. Overall, NPTX2 levels in the CSF of AD patients were 36 to 70 percent lower than in people without AD. "Perhaps the most important aspect of the discovery is that NPTX2 reduction appears to be independent of the mechanism that generates amyloid plaques. This means that NPTX2 represents a new mechanism, which is strongly founded in basic science research, and that has not previously been studied in animal models or in the context of human disease. This creates many new opportunities," says Worley. "One immediate application may be to determine whether measures of NPTX2 can be helpful as a way of sorting patients and identifying a subset that are most responsive to emerging therapies." For instance, drugs that disrupt amyloid may be more effective in patients with relatively high NPTX2. His group is now providing reagents to companies to assess development of a commercial test that measures NPTX2 levels. More work is needed, Worley adds, to understand why NPTX2 levels become low in AD and how that process could be prevented or slowed. In addition to Paul Worley, the study's authors are Meifang Xiao, Desheng Xu, Chun-Che Chien, Yang Shi, Juhong Zhang, Olga Pletnikova, Alena Savonenko, Roger Reeves, and Juan Troncoso of the Johns Hopkins University School of Medicine; Michael Craig of the University of Exeter; Kenneth Pelkey and Chris McBain of the National Institute of Child Health and Human Development; Susan Resnick of the National Institute on Aging's Intramural Research Program; David Salmon, James Brewer, Steven Edland, and Douglas Galasko of the Shiley-Marcos Alzheimer's Disease Research Center at the University of California San Diego Medical Center; Jerzy Wegiel of the Institute for Basic Research in Staten Island; and Benjamin Tycko of Columbia University Medical Center.
Funding for the studies described in the eLife article was provided by the National Institutes of Health under grant numbers MH100024, R35 NS-097966, P50 AG005146, and AG05131, the Alzheimer's Disease Discovery Foundation, and Lumind.


News Article | April 28, 2017
Site: www.eurekalert.org

In an analysis of Medicare billing data submitted by more than 2,300 United States physicians, researchers have calculated the average number of surgical slices, or cuts, made during Mohs micrographic surgery (MMS), a procedure that progressively removes thin layers of cancerous skin tissue in a way that minimizes damage to healthy skin and the risks of leaving cancerous tissue behind. The study, the researchers say, serves as a first step towards identifying best practices for MMS, as well as identifying and informing physicians who may need re-training because their practice patterns deviate far from their peers. A report of the study, published in the journal JAMA Dermatology April 28, suggests that identifying and informing high outlier physicians of their extreme practice patterns can enable targeted re-training, potentially sparing patients from substandard care. The analysis is part of a medical quality improvement project called "Improving Wisely," funded by the Robert Wood Johnson Foundation and based at The Johns Hopkins University. The initiative focuses on developing and using individual physician-level measures to collect data and improve performance. The U.S. Centers for Medicare and Medicaid Services provided broad access to their records for the study. "The project aims to work by consensus, encouraging outliers to seek educational and re-training tools offered by their professional society," says Martin Makary, M.D., M.P.H., professor of surgery at the Johns Hopkins University School of Medicine and the paper's co-senior author. "That's the spirit of medicine's heritage of learning from the experience of other physicians." He estimates that the initiative could result in Medicare savings of millions of dollars. Ideally, says Makary, those who perform MMS make as few cuts or slices as possible to preserve as much normal tissue as possible while ensuring complete removal of cancers. As each layer of skin is removed, it is examined under a microscope for the presence of cancer cells. However, there can be wide variation in the average number of cuts made by a physician. Measuring a surgeon's average number of cuts was recently endorsed by the American College of Mohs Surgery (ACMS) as a clinical quality metric used to assess its members. "Outlier practice patterns in health care, and specifically Mohs surgery, can represent a burden on patients and the medical system," says John Albertini, M.D., immediate past president of the American College of Mohs Surgery and the paper's other senior author. "By studying the issue of variation in practice patterns, the Mohs College hopes to improve the quality and value of care we provide our patients." Taking their cue from that support, Makary and his research team analyzed Medicare Part B claims data from January 2012 to December 2014 for all physicians who received Medicare payments for MMS procedures on the head, neck, genitalia, hands and feet. These regions of the body account for more than 85 percent of all MMS procedures reimbursed by Medicare during those years. A total of 2,305 physicians who performed MMS were included in the analysis. The researchers also gathered the following data for each physician: sex, years in practice, whether the physician worked in a solo or group practice, whether the physician was a member of ACMS, whether the physician practiced at an Accreditation Council for Graduate Medical Education site for MMS, volume of MMS operations, and whether the physician practiced in an urban or rural setting. 
Physicians had to perform at least 10 MMS procedures each year to be included in the analysis. The researchers found that the average number of cuts among all physicians was 1.74. The median was 1.69 and the range was 1.09 to 4.11 average cuts per case. Of the 2,305 physicians who performed MMS during each of the three years studied, 137 were considered extremely high outliers during at least one of those years. An extremely high outlier was defined as having a personal average more than two standard deviations above the mean for all physicians in the study, which corresponds to more than 2.41 cuts per case. Forty-nine physicians were persistently high outliers during all three years. Physicians in solo practice were 2.35 times more likely to be a persistent high outlier than those in a group practice; 4.5 percent of solo practitioners were persistent high outliers compared to 2.1 percent of physicians who performed MMS in a group practice. Volume of cases per year, practice experience, and geographic location were not associated with being a high outlier. Low extreme outliers, defined as having an average per case in the bottom 2.5 percent of the group distribution, also were identified. Of all physicians in the study, 92 were low outliers in at least one year and 20 were persistently low during all three years. Potential explanations for high outliers include financial incentives, because the current payment model for MMS pays physicians who do more cuts more money, Makary says. These charges are ultimately passed on to Medicare Part B patients, who are expected to pay 20 percent of their health care bill. Low outliers may be explained by incorrect coding, overly aggressive initial cuts, or choice of tumors for which MMS is not necessary, he says. Although the study was limited by lack of information about each patient's medical history, or the diameter or depth of each cut, Makary says it's a meaningful step toward identifying and mitigating physician outliers. "Developing standards based on physicians' actual experience and practices is the home-grown approach needed now to improve health care and lower costs of care," says Makary. Other authors on this paper include Aravind Krishnan, Tim Xu, Susan Hutfless and Angela Park of the Johns Hopkins University School of Medicine; Thomas Stasko of the University of Oklahoma; Allison T. Vidimos of the Cleveland Clinic; Barry Leshin of The Skin Surgery Center; Brett M. Coldiron of the University of Cincinnati Hospital; Richard G. Bennett of Bennett Surgery Center in Santa Monica, California; and Victor J. Marks and Rebecca Brandt of the American College of Mohs Surgery. Funding for this study was provided by a grant from the Robert Wood Johnson Foundation (grant No. 73417) and the American College of Mohs Surgery.
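The outlier definition reported above (a personal average more than two standard deviations above the all-physician mean) can be illustrated on made-up per-physician averages; the 1.74 mean and 2.41 threshold come from the actual cohort, while the numbers below are placeholders.

```python
# Sketch of the outlier definition used in the study: a physician is an extreme high
# outlier if their average number of cuts per case exceeds the all-physician mean by
# more than two standard deviations. The per-physician averages below are made up;
# in the actual cohort the mean was 1.74 and the resulting threshold was 2.41 cuts/case.
import statistics

avg_cuts_per_physician = [1.3, 1.5, 1.6, 1.7, 1.7, 1.8, 1.9, 2.0, 2.6, 3.4]

mean = statistics.mean(avg_cuts_per_physician)
sd = statistics.pstdev(avg_cuts_per_physician)   # population SD over the cohort
threshold = mean + 2 * sd

high_outliers = [x for x in avg_cuts_per_physician if x > threshold]
print(f"mean={mean:.2f}, sd={sd:.2f}, high-outlier threshold={threshold:.2f}")
print("extreme high outliers:", high_outliers)
```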


News Article | May 2, 2017
Site: news.yahoo.com

Venezuela is not the first developed country to put itself on track to fall into a catastrophic economic crisis. But it is in the relatively unusual situation of having done so while in possession of enormous oil assets. There aren’t many precedents to help understand how this could have happened and what is likely to happen next. There is, however, at least one — the Soviet Union’s similar devastation in the late 1980s. Its fate may be instructive for Venezuela — which is not to suggest Venezuelans, least of all the regime of Nicolás Maduro, will like what it portends. Venezuela has been ailing ever since the decline in oil prices that started in June 2014, and there is no reason to think this trend will shift anytime soon. Energy prices move in long quarter-century cycles of one decade of high prices and one decade of low prices, so another decade of low prices is likely. Similarly, the biggest economic blow to the Soviet Union was the fall in oil prices that started in 1981 and got worse from there. But the deeper problem for the Soviet Union wasn’t the oil price collapse; it’s what came before. In his book Collapse of an Empire, Russia’s great post-Soviet reformer Yegor Gaidar pointed out that during the long preceding oil boom, Soviet policymakers thought that they could walk on water and that the usual laws of economic gravity did not apply to them. Soviet policymakers didn’t bother developing a theory to make sense of their spending. They didn’t even bother paying attention to their results. The math seemed to work out, so they just assumed there was a good reason. This is as true of the current Venezuelan leaders as it was of the Soviet leaders. The Venezuelan government, though it doesn’t claim to be full-fledged in its devotion to Marxism-Leninism, has been pursuing as absurd an economic policy mix as its Soviet predecessor. It has insisted for years on maintaining drastic price controls on a wide range of basic goods, including food staples such as meat and bread, for which it pays enormous subsidies. Nonetheless, the Venezuelan government, like the Soviet Union’s, has always felt it could afford these subsidies because of its oil revenues. But as the oil price has fallen by slightly more than half since mid-2014, oil incomes have fallen accordingly. And rather than increase oil production, the Venezuelan government has been forced to watch it decline because of its mismanagement of the dominant state-owned oil company, PDVSA. And now Venezuela seems intent on repeating the Soviet folly of the late 1980s by refusing to change course. This is allowing the budget deficit to swell and putting the country on track toward ultimate devastation. The Soviet Union in its latter years had a skyrocketing budget deficit, too. In 1986 it exceeded 6 percent of GDP, and by 1991 it reached an extraordinary one-third of GDP. Venezuela is now following suit. The Soviet Union used its currency reserves to pay for imports, but when those reserves shrank, the government financed the budget deficit by printing money. The inevitable result was skyrocketing inflation. It seems as if President Nicolás Maduro has adopted this tried-and-failed combination of fiscal and monetary policy. Venezuela already is dealing with massive shortages as a result of its controlled prices, because the government can no longer afford its own subsidies. But it will get worse from here. Maduro seems intent on printing money like crazy, so the next step will be hyperinflation. 
Inflation is already believed to have reached 700 percent a year, and it is heading toward official hyperinflation, that is, an inflation rate of at least 50 percent a month. Hyperinflation is as frightful as it is rare. According to Johns Hopkins University professor Steve Hanke, the world has experienced only 56 hyperinflations, and half of them occurred when communism collapsed. (All of the Soviet Union’s 15 union republics suffered it during the country’s disintegration.) Hyperinflation is profoundly demoralizing. Suddenly, it makes no sense to work any longer. Instead of standing in queues to buy food with the money they’ve earned, people stop working entirely, because they cannot spend the money they would have earned. Smart profiteers indulge in speculation, buying safe assets such as commodities or real estate. As a result, output plummets and enters a downward spiral until financial stability is restored. In 1991, Soviet production probably fell by 10 percent, and oil production plunged by half from 1988 to 1995. Something similar seems to be going on in Venezuela. The Soviet Union had insisted on an unrealistically high official exchange rate of the ruble, usually five times higher than the black market exchange rate. The government did so to make people feel richer than they really were, but this meant that the government subsidized purchases of foreign currency just as it subsidized purchases of food. As the Soviet government spent more money, allowing the budget deficit to balloon, the black market exchange rate plummeted, humiliating its citizens. Gradually, people accepted the black market exchange rate as the real rate. When the Soviet Union fell apart in December 1991, the average Russian monthly salary was a miserable $6. This is where Venezuela is heading. In parallel with these miseries, Soviet foreign debt surged. The Soviet government, just like Maduro’s government, borrowed as much as it could for as long as it could. Foreign governments provided much of the financing, just as Venezuela has received half of its foreign credits from China. The Soviet government refused to acknowledge its poverty while continuing to service its debt for far too long. Venezuela seems to have been caught in the same hamster wheel. Naturally, there are differences. The Soviet Union was a multinational state with union republics unlike Venezuela. However absurd Venezuela’s economic policy may be, the country does not have a Marxist-Leninist system, and it remains far more open than the Soviet Union was, with a lively political opposition and a highly educated elite. But the economic demise of the Soviet Union offers a likely scenario for Venezuela’s future evolution. The financial crisis is likely to worsen because any significant change in policy would imply an admission by Maduro that his policies had been wrong, which would probably lead to his ouster. Thus, the budget deficit, shortages, inflation, exchange rate fall, and public debt are likely to grow much worse. One alternative could be a preemptive political overthrow of the Maduro regime fueled by public discontent or that the rulers just flee the country. Another possible endgame would be that the country runs out of international currency reserves and defaults on its foreign debt. That would deprive Venezuela of all foreign credit, and the natural consequence would be a complete collapse of imports and the exchange rate of the bolivar, the country’s currency. 
Either way, the Maduro regime is not likely to survive for long because it won’t be capable of making the necessary adjustments that avoid abject economic misery for most of the population, and the pressure on it will eventually become intolerable. A successor government will have to make the adjustments instead. But regardless of the nature of the new government, the choices it has available to it won’t be large: In extreme economic crises, the actual policy choices are few. The budget needs to be brought close to balance. That can only be done by cutting expenditures, because more taxes cannot be collected in the short term. The key cut will have to be to the elimination of price subsidies. Venezuela’s foreign aid projects must be cut as well. That might suffice to balance the budget. At the same time, the exchange rate needs to be unified around the market equilibrium, regardless of whether the exchange will be floating or pegged. Venezuela’s depleted international reserves will have to be restored. The only international agency that can do so fast and effectively is the International Monetary Fund. The IMF can quickly restore a country’s reserves and creditworthiness, but Venezuela has to make peace with the macroeconomic reforms the fund will call for. In parallel, other international organizations and friendly countries will have to engage to salvage the country from the ravages of the Maduro regime. The country’s foreign debt burden will need to be restructured. The situation of the former Soviet Union was much more difficult, because 15 new countries had to be formed and the common currency, divided. The negative lesson from Russia is that the country took far too long — seven years — to get the budget deficit under control. The warning for the West is that it failed to help Russia at its moment of greatest need, eventually pushing its politics in an anti-Western direction. When reform finally arrives in Venezuela, it needs to be radical and fast, and the West should offer wholehearted financial support. The collapse of the Maduro regime will not be pretty, but it is difficult to see how it can be avoided. While the politics might be difficult to predict, the main features of a severe economic crisis are quite predictable. The key question is how fast a new government will manage to do the right things.
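The hyperinflation comparison earlier in the piece rests on compounding arithmetic: the sketch below relates the quoted roughly 700 percent annual rate to the conventional 50-percent-per-month threshold, using only those two figures.

```python
# Compounding arithmetic behind the hyperinflation comparison in the article: the quoted
# ~700% annual inflation rate versus the conventional hyperinflation threshold of
# 50% per month. Purely illustrative; no economic data beyond the two quoted figures.
annual_rate = 7.00                      # 700% per year: prices multiply by 8x annually
monthly_equiv = (1 + annual_rate) ** (1 / 12) - 1
print(f"700%/year compounds to about {monthly_equiv:.1%} per month")

hyper_monthly = 0.50                    # hyperinflation threshold: 50% per month
annual_equiv = (1 + hyper_monthly) ** 12 - 1
print(f"50%/month compounds to about {annual_equiv:.0%} per year")
```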


Arturo Casadevall has zero training in forensic science—the techniques used in law enforcement and the courtroom to link individuals to crimes. For most of his career, the microbiologist at Johns Hopkins University in Baltimore, Maryland, paid the discipline little attention, but he did notice the field-shaking 2009 report from the National Academy of Sciences (NAS), which found that many forensic techniques, from fingerprint comparisons to bloodstain pattern analysis, lacked a firm scientific footing. “I remember reading [NAS] about this and I said, ‘Oh my God, I thought fingerprints had been validated,’” he remembers. He would soon play a direct role in the field’s reform, as one of a handful of basic researchers invited to serve alongside lawyers, judges, and forensic practitioners on a panel, created by the U.S. Department of Justice (DOJ), in partnership with the National Institute of Standards and Technology, to advise DOJ on how to respond to the NAS report’s concerns. Since its founding in 2013, the panel has published 43 documents and made 20 official recommendations to the attorney general, including a call for the universal accreditation of forensic practitioners and for the phasing out of the meaningless phrase “reasonable scientific certainty” that is common in courtrooms. Last week, Casadevall and five other scientific members of the commission wrote a letter to Attorney General Jeff Sessions and acting National Institute of Standards and Technology Director Kent Rochford asking them to renew the group’s charter, set to run out 23 April. Instead, Sessions and DOJ announced on Monday that the charter would be allowed to expire, and he requested proposals for a new advisory committee or an office within DOJ that would advance forensic science—a move many fear will exclude mainstream scientific views from future policy decisions. Casadevall spoke to Insider about the commission’s value, its limitations, and what he expects to be its legacy. Q: What was the logic behind this panel? A: If you think about it, the techniques that formed forensic science were created to try to do criminal investigations. … Their ancestry is out of the mainstream of science, in the sense that they were developed for the purpose of trying to help solve crimes, and consequently, many of these techniques never got validated. … The big innovation of the commission was that you had mainstream scientists sitting at the table with judges, prosecutors, and forensic [practitioners]. … We [scientists] were there because science has traditions and approaches that transcend disciplines. Even though our domains are very different, our approaches to inquiry, our approaches to error, are pretty similar. The first five meetings of the commission, very little got done because we were trying even to learn to talk to one another. We come from completely different worlds. Law works as a dichotomy—you’re guilty or not guilty. In science, all knowledge is provisional. I assume that what we’re doing in my lab today is going to be overturned in the future. Q: What do you see as the key products of the commission? A: I think the document that was voted on in September [2016] stating that new techniques of forensic science need to be independently validated is a really important line in the sand. 
… Today, if you develop a new technique to characterize white powders, and you start a company, you can actually sell this to police departments without necessarily going through any [validation] mechanism … and then use it in court, in a situation in which the lawyers don’t know any of the science. So who is going to challenge any of this? The recommendation was that it should go to an independent entity that is not part of the justice department. … And to me, that is probably the single most important output of the committee. That at least is on record. Q: What about all the existing techniques that are still used in court and haven’t been validated, such as bite mark evidence? A: Of course, we wanted [the document] to be bigger. We wanted it to be retrospective. But we had to compromise to get the votes. … We were a deliberative body and we need to get a product out, and if you dig your heels and say all of it needs to be independently validated, you couldn’t even get that statement. You can’t put the genie back in the bottle. The questions are out there. The Innocence Project, for example, is training lawyers on what questions to ask [about certain] techniques. We have an adversarial system. Inevitably, the pressure is going to come in, even from the prosecutors who say, “Hey guys, we’ve got to fix these techniques, because every time I present this, they hit me with these questions.” … There is a greater awareness now that if you’re going to introduce any of these things into court, you may be asked, “Where is your error rate? How do you really know what a match is? What does that mean?” I tend to look at this as progress. On Monday [at the commission’s final meeting], there were two documents that were voted down. … One of the documents was just to recommend that when somebody testified, if you asked them about an error rate, they should be able to tell you that they had it, or that they didn’t know. I think that if I stopped somebody on the street and I asked them, most people would agree that this is a reasonable thing. But some of the people … felt that this recommendation would be used by defense lawyers to question a lot of [forensic] practitioners. If we had continued to meet, those documents would have gone back to committee … and they would have figured out a way to work around … and they may have passed in the summer meeting or the fall meeting. Q: Did you expect the commission’s charter to be renewed? A: The truth of the matter is we did not know. We knew it was a congressional charter and Congress had basically appropriated money to run the commission a certain number of cycles. We thought that with the new administration, there was a high probability that perhaps it didn’t get renewed. A: Sessions’s statement doesn’t say anything bad about science. They basically say that we’re going to take this and we’re going to put it back in the Department of Justice, create a new office of forensic science. On the surface, that sounds good. You could imagine that they could create an office with a lot of scientific input. … The problem of forensic science is that it grew out of an arm of the judiciary, and by putting it back, you basically don’t deal with the questions that are being asked from mainstream science. … The commission was the only mechanism that existed where you had working scientists talking to people who actually do forensic science to generate data that is consumed by the legal system. And we were a minority. 
I served on the anthrax investigation—the National Academy of Sciences study that was done on [the 2001 anthrax mailings]—so I can tell you that the culture of criminal justice feels very uncomfortable when you question their techniques. And that is natural. These people are trying to reduce crime by putting people away. And here you have a situation in which techniques that they have used for many, many years come into question, and I think there is natural instinct to say, “Oh, God.” Some of that may have been there. But there are also a lot of people who recognize that these problems aren’t going away. And the way to deal with this is to do the science.


News Article | April 27, 2017
Site: www.rdmag.com

Working with human brain tissue samples and genetically engineered mice, Johns Hopkins Medicine researchers together with colleagues at the National Institutes of Health, the University of California San Diego Shiley-Marcos Alzheimer's Disease Research Center, Columbia University, and the Institute for Basic Research in Staten Island say that consequences of low levels of the protein NPTX2 in the brains of people with Alzheimer’s disease (AD) may change the pattern of neural activity in ways that lead to the learning and memory loss that are hallmarks of the disease. This discovery, described online in the April 25 edition of eLife, will lead to important research and may one day help experts develop new and better therapies for Alzheimer’s and other forms of cognitive decline. AD currently affects more than five million Americans. Clumps of proteins called amyloid plaques, long seen in the brains of people with AD, are often blamed for the mental decline associated with the disease. But autopsies and brain imaging studies reveal that people can have high levels of amyloid without displaying symptoms of AD, calling into question a direct link between amyloid and dementia. This new study shows that when the protein NPTX2 is “turned down” at the same time that amyloid is accumulating in the brain, circuit adaptations that are essential for neurons to “speak in unison” are disrupted, resulting in a failure of memory. “These findings represent something extraordinarily interesting about how cognition fails in human Alzheimer’s disease,” says Paul Worley, M.D., a neuroscientist at the Johns Hopkins University School of Medicine and the paper’s senior author. “The key point here is that it’s the combination of amyloid and low NPTX2 that leads to cognitive failure.” Since the 1990s, Worley’s group has been studying a set of genes known as “immediate early genes,” so called because they’re activated almost instantly in brain cells when people and other animals have an experience that results in a new memory. The gene NPTX2 is one of these immediate early genes that gets activated and makes a protein that neurons use to strengthen “circuits” in the brain. “Those connections are essential for the brain to establish synchronized groups of ‘circuits’ in response to experiences,” says Worley. “Without them, neuronal activation cannot be effectively synchronized and the brain cannot process information.” Worley says he was intrigued by previous studies indicating altered patterns of activity in brains of individuals with Alzheimer’s. Worley’s group wondered whether altered activity was linked to changes in immediate early gene function. To get answers, the researchers first turned to a library of 144 archived human brain tissue samples to measure levels of the protein encoded by the NPTX2 gene. NPTX2 protein levels, they discovered, were reduced by as much as 90 percent in brain samples from people with AD compared with age-matched brain samples without AD. By contrast, people with amyloid plaques who had never shown signs of AD had normal levels of NPTX2. This was an initial suggestion of a link between NPTX2 and cognition. Prior studies had shown NPTX2 to play an essential role for developmental brain wiring and for resistance to experimental epilepsy. To study how lower-than-normal levels of NPTX2 might be related to the cognitive dysfunction of AD, Worley and his collaborators examined mice bred without the rodent equivalent of the NPTX2 gene. 
Tests showed that a lack of NPTX2 alone wasn't enough to affect cell function as tested in brain slices. But then the researchers added to mice a gene that increases amyloid generation in their brain. In brain slices from mice with both amyloid and no NPTX2, fast-spiking interneurons could not control brain “rhythms” important for making new memories. Moreover, a glutamate receptor that is normally expressed in interneurons and essential for interneuron function was down-regulated as a consequence of amyloid and NPTX2 deletion in mice, and was similarly reduced in human AD brain tissue. Worley says the results suggest that the increased activity seen in the brains of AD patients is due to low NPTX2, combined with amyloid plaques, with consequent disruption of interneuron function. And if the effect of NPTX2 and amyloid is synergistic — one depending on the other for the effect — it would explain why not all people with high levels of brain amyloid show signs of AD. The team then examined NPTX2 protein in the cerebrospinal fluid (CSF) of 60 living AD patients and 72 people without AD. Lower scores of memory and cognition on standard AD tests, they found, were associated with lower levels of NPTX2 in the CSF. Moreover, NPTX2 correlated with measures of the size of the hippocampus, a brain region essential for memory that shrinks in AD. In this patient population, NPTX2 levels were more closely correlated with cognitive performance than current best biomarkers — including tau, a biomarker of neurodegenerative diseases, and a biomarker known as A-beta-42, which has long been associated with AD. Overall, NPTX2 levels in the CSF of AD patients were 36 to 70 percent lower than in people without AD. “Perhaps the most important aspect of the discovery is that NPTX2 reduction appears to be independent of the mechanism that generates amyloid plaques. This means that NPTX2 represents a new mechanism, which is strongly founded in basic science research, and that has not previously been studied in animal models or in the context of human disease. This creates many new opportunities,” says Worley. “One immediate application may be to determine whether measures of NPTX2 can be helpful as a way of sorting patients and identifying a subset that are most responsive to emerging therapies.” For instance, drugs that disrupt amyloid may be more effective in patients with relatively high NPTX2. His group is now providing reagents to companies to assess development of a commercial test that measures NPTX2 levels. More work is needed, Worley adds, to understand why NPTX2 levels become low in AD and how that process could be prevented or slowed. In addition to Paul Worley, the study's authors are Meifang Xiao, Desheng Xu, Chun-Che Chien, Yang Shi, Juhong Zhang, Olga Pletnikova, Alena Savonenko, Roger Reeves, and Juan Troncoso of Johns Hopkins University School of Medicine; Michael Craig of University of Exeter; Kenneth Pelkey and Chris McBain of the National Institute of Child Health and Human Development; Susan Resnick of the National Institute on Aging's Intramural Research Program; David Salmon, James Brewer, Steven Edland, and Douglas Galasko of the Shiley-Marcos Alzheimer's Disease Research Center at the University of California San Diego Medical Center; Jerzy Wegiel of the Institute for Basic Research in Staten Island; and Benjamin Tycko of Columbia University Medical Center. 
Funding for the studies described in the eLife article was provided by the National Institutes of Health under grant numbers MH100024, R35 NS-097966, P50 AG005146, and AG05131, Alzheimer’s Disease Discovery Foundation and Lumind.
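The biomarker comparison reported here comes down to correlating each candidate CSF measure with the same patients' cognitive scores and asking which one tracks cognition most closely. Below is a minimal sketch of that kind of analysis; the data points and the use of a plain Pearson correlation are illustrative assumptions, not the study's actual data or statistical methods.

```python
# Illustrative only: correlate hypothetical CSF biomarker levels with cognitive
# scores, as a sketch of the comparison described above. All numbers below are
# invented placeholders, not data from the eLife study.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical cognitive test scores (higher = better) and CSF measurements.
cognition = [28, 25, 22, 19, 15, 12]
biomarkers = {
    "NPTX2":     [1200, 1000, 800, 600, 450, 300],  # falls as cognition worsens
    "tau":       [300, 350, 420, 500, 520, 610],
    "A-beta-42": [700, 650, 640, 500, 480, 430],
}

for name, levels in biomarkers.items():
    print(f"{name}: r = {pearson_r(levels, cognition):+.2f}")
```

In this framing, the biomarker with the largest absolute correlation is the one most closely tied to cognitive performance, which is the comparison on which the researchers report NPTX2 outperforming the established markers.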


News Article | April 13, 2017
Site: www.chromatographytechniques.com

Maryland would be the first state to empower the state’s attorney general to take action to stop pharmaceutical price gouging under a measure given final passage by lawmakers on Monday. The bill sent to Republican Gov. Larry Hogan would enable the attorney general to bring civil actions against manufacturers of off-patent or generic drugs that make an “unconscionable” price increase, described as an excessive increase unjustified by the cost of producing or distributing the drug. “When a drug company doubles or triples -- or multiplies by 50 -- the price of medication, it imperils the health and finances of patients and their families, and it threatens public health,” said Maryland Attorney General Brian Frosh, a Democrat. Manufacturers could face a fine of up to $10,000 per violation. The attorney general would be able to request additional information from the corporations that instituted the price increases to help determine if price gouging had occurred. “The new law will give Maryland a necessary tool to combat unjustified and extreme prices for medicines that have long been on the market and that are essential to our health and well-being,” Frosh said. Concerns about sky-high drug prices have been growing for years nationally. They boiled over last year after it was revealed that Turing Pharmaceuticals and Canadian drugmaker Valeant Pharmaceuticals were hiking prices on previously low-priced medicines for patients with heart problems and other life-threatening conditions. Still, Congress has failed to take action. Doug Mayer, a spokesman for Hogan, said the governor will review the bill before deciding whether to sign it. The Democratic-controlled legislature passed the measure with bipartisan support in both houses, more than enough to override a veto by the governor. The vote was 38-7 in the Senate and 137-2 in the House. Vincent DeMarco, president of Maryland Citizens’ Health Initiative, said the problem of price gouging has been going on for too long, and it’s time for states to do what the federal government has been unable to tackle. The group describes itself as a coalition of hundreds of faith community, labor, business and health care groups that works for quality and affordable health care in the state. “Our health care advocate colleagues all across the country have been calling us to ask how they can replicate what we just did,” DeMarco said after the final vote Monday. Critics say the measure will chill competition among generic drug manufacturers and end up making drugs more expensive. Chester Davis, president and CEO of the Association for Accessible Medicines, urged the governor to veto the measure. “This bill will harm both Maryland patients and taxpayers alike and thus should be vetoed by Governor Hogan,” Davis said in a statement. Dr. Jeremy Greene, who practices at the East Baltimore Medical Center, praised the legislation. He said he sees patients who have trouble affording off-patent drugs every month. “I find myself seeing evidence of these problems repeatedly in my clinical experience and increasingly in the past five years,” said Greene, a professor of the history of medicine at Johns Hopkins University School of Medicine, on Monday.
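The article notes the margins were more than enough to override a veto. As a rough check, the sketch below compares the reported vote counts against a three-fifths override threshold; the three-fifths rule and the chamber sizes used (47 senators, 141 delegates) are assumptions supplied here for illustration rather than details from the article.

```python
# Rough check of the veto-override arithmetic. Assumes Maryland requires a
# three-fifths vote of the elected membership of each chamber (47 senators,
# 141 delegates); both assumptions are for illustration, not from the article.
from math import ceil

chambers = {
    # name: (yes votes reported, elected members assumed)
    "Senate": (38, 47),
    "House":  (137, 141),
}

for name, (yes_votes, members) in chambers.items():
    threshold = ceil(members * 3 / 5)   # smallest whole number of votes >= 3/5
    margin = yes_votes - threshold
    status = "clears" if margin >= 0 else "falls short of"
    print(f"{name}: {yes_votes} yes votes {status} an override threshold of "
          f"{threshold} by {abs(margin)}")
```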


Although human population studies have linked air pollution to chronic inflammation of nasal and sinus tissues, direct biological and molecular evidence for cause and effect has been scant. Now, Johns Hopkins researchers report that experiments in mice continually exposed to dirty air have revealed that direct biological effect. Researchers have long known that smog, ash and other particulates from industrial smokestacks and other sources that pollute air quality exacerbate and raise rates of asthma symptoms, but had little evidence of similar damage from those pollutants to the upper respiratory system. The new findings, published in the American Journal of Respiratory Cell and Molecular Biology, have broad implications for the health and well-being of people who live in large cities and industrial areas with polluted air, particularly in the developing world. "In the U.S., regulations have kept a lot of air pollution in check, but in places like New Delhi, Cairo or Beijing, where people heat their houses with wood-burning stoves, and factories release pollutants into the air, our study suggests people are at higher risk of developing chronic sinus problems," says Murray Ramanathan, M.D., associate professor of otolaryngology - head and neck surgery at the Johns Hopkins University School of Medicine. According to the Centers for Disease Control and Prevention, more than 29 million people in the U.S. or more than 12 percent of adults have a chronic sinusitis diagnosis. Chronic sinusitis can cause congestion, pain and pressure in the face, and a stuffy, drippy nose. Numerous studies have reported significant social implications of chronic sinonasal disease, including depression, lost productivity and chronic fatigue. To see how pollution may directly affect the biology of the upper airways, the researchers exposed 38 eight-week-old male mice to either filtered air or concentrated Baltimore air with particles measuring 2.5 micrometers or less, which excludes most allergens, like dust and pollen. The aerosolized particles, although concentrated, were 30 to 60 percent lower than the average concentrations of particles of a similar size in cities like New Delhi, Cairo and Beijing. Nineteen mice breathed in filtered air, and 19 breathed polluted air for 6 hours per day, 5 days a week for 16 weeks. The researchers used water to flush out the noses and sinuses of the mice, and then looked at the inflammatory and other cells in the flushed-out fluid under a microscope. They saw many more white blood cells that signal inflammation, including macrophages, neutrophils and eosinophils, in the mice that breathed in the polluted air compared with those that breathed in filtered air. For example, the mice that breathed in the polluted air had almost four times as many macrophages than mice that breathed filtered air. To see if the cells flushed out of the nasal and sinus passages had turned on a generalized inflammatory response, the researchers compared specific genes used by immune system cells from the mice that breathed polluted air with the cells of those that breathed filtered air. They found higher levels of messenger RNA -- the blueprints of DNA needed to make proteins -- in the genes for interleukin 1b, interleukin 13, oncostatin M and eotaxin-1 in the nasal fluid of mice that breathed the polluted air. All those proteins are considered direct biomarkers for inflammation. 
The investigators measured the protein levels of interleukin 1b, interleukin 13 and eotaxin-1, which are chemical messengers called cytokines that cause an immune response. They found five to 10 times higher concentrations of the cytokines involved in inflammation in the mice that breathed the polluted air than in those that breathed filtered air. Interleukin 1b is a chemical messenger that promotes inflammation, and both interleukin 13 and eotaxin-1 are chemical messengers that attract eosinophils. "Inflammation that attracts eosinophils is what happens in the lungs of people with asthma, so essentially the chronic exposure to air pollution in mice is leading to a kind of asthma of the nose," says Ramanathan. Next, the researchers examined layers of cells along the nasal passages and sinuses under a microscope and found that the surface layer - or epithelium - was, notably, 30 to 40 percent thicker in mice that breathed in polluted air than in those that breathed filtered air. Ramanathan says that a thicker epithelium is another sign of inflammation in humans and other animals. The researchers next used glowing antibodies that bind to the proteins claudin-1 and E-cadherin found in between the cells of the epithelium to help hold them together. They report observing far less of both proteins, with up to 80 percent less E-cadherin, in mice that breathed in the polluted air compared with the mice that breathed filtered air. The investigators also said they found much higher levels of the protein serum albumin in the mice that breathed in the polluted air. High levels of serum albumin indicate that barriers to the nasal passages and sinuses were breached. "We've identified a lot of evidence that breathing in dirty air directly causes a breakdown in the integrity of the sinus and nasal air passages in mice," says Ramanathan. "Keeping this barrier intact is essential for protecting the cells in the tissues from irritation or infection from other sources, including pollen or germs." Ramanathan says his team will continue to study the specific molecular changes that occur when the sinus and nasal barriers are breached because of air pollution, as well as investigate possible ways to repair them. Additional authors on the study include Nyall London, Anuj Tharakan, Nitya Surya, Thomas Sussan, Xiaoquan Rao, Sandra Lin, Sanjay Rajagopalan and Shyam Biswal of Johns Hopkins University and Elina Toskala of Temple University. The study was funded by grants from the National Institute of Environmental Health Sciences (ES020859 and U01 ES026721) and the Flight Attendant Medical Research Institute. London has a patent for treating vascular barrier dysfunction licensed to Navigen Pharmaceuticals, and he holds stock in the company.


News Article | April 25, 2017
Site: www.sciencedaily.com

A team of computer engineers and neurosurgeons, with an assist from Hollywood special effects experts, reports successful early tests of a novel, lifelike 3D simulator designed to teach surgeons to perform a delicate, minimally invasive brain operation. A report on the simulator that guides trainees through an endoscopic third ventriculostomy (ETV) was published in the Journal of Neurosurgery: Pediatrics on April 25. The procedure uses endoscopes, which are small, computer-guided tubes and instruments, to treat certain forms of hydrocephalus, a condition marked by an excessive accumulation of cerebrospinal fluid and pressure on the brain. ETV is a minimally invasive procedure that short-circuits the fluid back into normal channels in the brain, eliminating the need for implantation of a shunt, a lifelong device with the associated complications of a foreign body. "For surgeons, the ability to practice a procedure is essential for accurate and safe performance of the procedure. Surgical simulation is akin to a golfer taking a practice swing," says Alan R. Cohen, M.D., professor of neurosurgery at the Johns Hopkins University School of Medicine and a senior author of the report. "With surgical simulation, we can practice the operation before performing it live." While cadavers are the traditional choice for such surgical training, Cohen says they are scarce, expensive, nonreusable, and most importantly, unable to precisely simulate the experience of operating on the problem at hand, which Cohen says requires a special type of hand-eye coordination he dubs "Nintendo Neurosurgery." In an effort to create a more reliable, realistic and cost-effective way for surgeons to practice ETV, the research team worked with 3D printing and special effects professionals to create a lifelike, anatomically correct, full-size head and brain with the touch and feel of human skull and brain tissue. The fusion of 3D printing and special effects resulted in a full-scale reproduction of a 14-year-old child's head, modeled after a real patient with hydrocephalus, one of the most common problems seen in the field of pediatric neurosurgery. Special features include an electronic pump to reproduce flowing cerebrospinal fluid and brain pulsations. One version of the simulator is so realistic that it has facial features, hair, eyelashes and eyebrows. To test the model, Cohen and his team randomly paired four neurosurgery fellows and 13 medical residents to perform ETV on either the ultra-realistic simulator or a lower-resolution simulator, which had no hair, lashes or brows. After completing the simulation, fellows and residents each rated the simulator using a five-point scale. On average, both the surgical fellows and the residents rated the simulator more highly (4.88 out of 5) on its effectiveness for ETV training than on its aesthetic features (4.69). The procedures performed by the trainees were also recorded and later watched and graded by two fully trained neurosurgeons in a way that they could not identify who the trainees were or at what stage they were in their training. The neurosurgeons assessed the trainees' performance using criteria such as "flow of operation," "instrument handling" and "time and motion." Neurosurgeons consistently rated the fellows higher than residents on all criteria measured, which accurately reflected their advanced training and knowledge, and demonstrated the simulator's ability to distinguish between novice and expert surgeons. 
Cohen says that further tests are needed to determine whether the simulator will actually improve performance in the operating room. "With this unique assortment of investigators, we were able to develop a high-fidelity simulator for minimally invasive neurosurgery that is realistic, reliable, reusable and cost-effective. The models can be designed to be patient-specific, enabling the surgeon to practice the operation before going into the operating room," says Cohen.


News Article | April 17, 2017
Site: www.scientificamerican.com

People with Parkinson’s disease may show hints of motor difficulty years before an official diagnosis, but current methods for detecting early symptoms require clinic visits and highly trained personnel. Three recent studies, however, suggest that diagnosis could be as simple as walking, talking and typing. Tests of activities such as these might eventually enable early intervention, which will be crucial for halting progression of the neurodegenerative condition if a cure becomes available. The findings are exciting, says neurologist Zoltan Mari of Johns Hopkins University. But he cautions that larger studies will be necessary to ensure that these techniques are ready for wider use. Walking: Data from wearable sensors attached to 93 Parkinson’s patients and 73 healthy controls revealed distinctive walking patterns: factors such as step distance and heel force helped to differentiate between the two groups with 87 percent accuracy, according to an analysis by Shyam Perumal and Ravi Sankar of the University of South Florida. Talking: In a study by Jan Rusz of Czech Technical University and Charles University, both in Prague, and his colleagues, participants read a list of words aloud, and each made a 90-second recording during which they described their current interests. Fifty of the participants were at high risk for developing Parkinson’s, but only 23 had begun to show symptoms. Simple acoustic features of the short speech samples—including slower speed of talking and longer duration of pauses than healthy controls—pinpointed the symptomatic participants with 70 percent accuracy. Typing: People with and without Parkinson’s were asked to listen to a folktale and transcribe it by typing. The two groups were matched for age and overall typing speed and excluded people with dementia. Luca Giancardo of the Massachusetts Institute of Technology and his colleagues successfully discriminated between the groups solely by analyzing key hold times (the time required to press and release a key). Their analysis performed comparably to or better than motor tests currently used in clinical settings.
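The typing measurement at the center of the MIT analysis, key hold time, is simply the interval between a key's press and its release. The sketch below shows one minimal way such hold times could be pulled from keystroke logs and summarized per group; the log format, the made-up timings, and the mean/standard-deviation summary are illustrative assumptions, not the researchers' actual pipeline.

```python
# Illustrative sketch: derive key hold times (release minus press) from
# keystroke logs and summarize them per group. The event format and all
# timing values are invented for illustration only.
from statistics import mean, stdev

def hold_times(events):
    """events: list of (key, press_time_s, release_time_s) tuples."""
    return [release - press for _, press, release in events]

# Hypothetical keystroke logs: (key, press time in s, release time in s).
control_events = [("t", 0.00, 0.08), ("h", 0.21, 0.29), ("e", 0.40, 0.47)]
patient_events = [("t", 0.00, 0.14), ("h", 0.35, 0.51), ("e", 0.74, 0.86)]

for label, events in [("control", control_events), ("patient", patient_events)]:
    ht = hold_times(events)
    print(f"{label}: mean hold {mean(ht) * 1000:.0f} ms, "
          f"sd {stdev(ht) * 1000:.0f} ms")
```

A real classifier would be trained on full distributions of these hold times rather than a handful of keystrokes, but the underlying feature is this simple.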


News Article | May 3, 2017
Site: news.yahoo.com

Satellite images indicate activity has resumed at North Korea's nuclear test site, US-based analysts said Tuesday, as tensions remain high over fears of a sixth atomic test by the reclusive state. Images of the Punggye-ri site captured on April 25 appear to show workers pumping out water at a tunnel believed to have been prepared for an upcoming nuclear test, monitoring group 38 North said. It also noted that a large number of personnel were seen throughout the facility, with some groups possibly playing volleyball, in what is very likely a propaganda scene. "It is unclear if this activity indicates that a nuclear test has been cancelled, the facility is in stand-by mode or that a test is imminent," said the researchers from the US-Korea Institute at Johns Hopkins University. Workers were also observed playing volleyball at the guard barracks and two other areas at the site in satellite pictures taken on April 19 and 21. 38 North said the latest images were "unusual and almost assuredly a component of an overall North Korean deception and propaganda effort" and the result of media reporting on the earlier volleyball sightings. North Korea is on a mission to develop a long-range missile capable of hitting the US mainland with a nuclear warhead, and has so far staged five nuclear tests, two of them last year. Punggye-ri is a complex of tunnels and testing infrastructure in the mountains in the northeast of the country. 38 North said last month that Punggye-ri was "primed and ready" to conduct a test, amid mounting speculation that Pyongyang would act to coincide with major anniversaries including the birthday of regime founder Kim Il-Sung. A nuclear test has yet to happen, but North Korea's failed ballistic missile launch last week marked the hermit state's latest show of defiance. On Monday it said it would carry out a nuclear test "at any time and at any location" set by its leadership. US President Donald Trump said this week he would be "honored" to meet North Korean leader Kim Jong-Un under the right conditions, dialling down earlier threats of military action. Washington is now exploring options at the UN Security Council to ramp up pressure on the North, with diplomats saying it was in discussion with China on possible sanctions. Over the past 11 years, the Security Council has imposed six sets of sanctions on Pyongyang, including a cap on coal exports among other measures in November.


News Article | April 20, 2017
Site: www.marketwired.com

MCLEAN, VA--(Marketwired - April 20, 2017) - The MITRE Corporation has named Dr. William LaPlante as Senior Vice President and General Manager of MITRE's Center for National Security (CNS). CNS includes two of MITRE's federally funded research and development centers (FFRDCs) -- the National Security Engineering Center (NSEC) and the National Cybersecurity FFRDC. In his new role, Dr. LaPlante will be accountable for increasing MITRE's strategic value across the company's U.S. Department of Defense (DoD), intelligence, and cybersecurity portfolios. "Bill's demonstrated leadership ability, integrity, and drive for excellence provide huge value to MITRE and our government sponsors," said Dr. Jason Providakes, MITRE president and chief executive officer. "With his extensive experience in the private and public sectors combined with his expertise in national defense engineering systems, I am confident he will make an immediate and significant impact in his new role." Dr. LaPlante previously led the company's intelligence portfolio within NSEC, the FFRDC that MITRE operates on behalf of the DoD. Prior to rejoining MITRE in 2015, he served as the Assistant Secretary of the Air Force for acquisition, a position confirmed by the U.S. Senate, where he was responsible for all Air Force research, development and acquisition activities and oversaw a program portfolio of more than $32 billion annually. He was also responsible for development and execution of policies and procedures in support of the operation and improvement of the Air Force's acquisition system and was recognized in 2015 by the Air Force Association with the W. Stuart Symington Award for the most significant contribution by a civilian in the field of national defense. Before entering public service in 2013, LaPlante was the portfolio director for MITRE's Missile Defense Agency work program, where he led a technical team providing analytic and systems engineering expertise for ballistic missile defense systems. He currently serves on the Defense Science Board (DSB) and is a Commissioner on the Section 809 Panel established by Congress to streamline acquisition. He has 30 years of experience in defense technology and has served as a member of the U.S. Strategic Command Senior Advisory Group and the Naval Research Advisory Committee. LaPlante holds a bachelor's degree in engineering physics from the University of Illinois, a master's degree in applied physics from Johns Hopkins University, and a doctorate in mechanical engineering from The Catholic University of America. The MITRE Corporation is a not-for-profit organization that operates research and development centers sponsored by the federal government.


News Article | May 1, 2017
Site: www.futurity.org

More than 40 percent of survivors of acute respiratory distress syndrome (ARDS) who had jobs were unemployed a year after leaving the hospital, losing an average of $27,000 in pay, a new study estimates. ARDS affects approximately 200,000 Americans every year. Survivors often have long-lasting cognitive dysfunction, mental health issues, or physical impairments, any of which can affect their ability to hold down a job. “Health care providers need to start asking themselves, ‘What can we do to help patients regain meaningful employment,’ and not just concern ourselves with their survival,” says Dale Needham, professor of physical medicine and rehabilitation at Johns Hopkins University. ARDS is a lung condition often caused by severe infection or trauma and marked by fluid buildup in the lungs’ air sacs. The resulting damage leads to a substantial decrease in oxygen reaching the bloodstream and rapidly developing difficulty with breathing. Patients are usually hospitalized and placed on a life-supporting ventilator. One important goal of the study, researchers say, is to better identify specific risk factors for joblessness and to inform future interventions aimed at reducing joblessness after ARDS. “Multiple studies have suggested that joblessness is common in people who survive ARDS,” says Biren Kamdar, assistant professor of medicine at the University of California, Los Angeles and the study’s first author. “But to our knowledge, none have carefully tracked those who returned to work or subsequently lost their jobs, performed an in-depth analysis of risk factors for joblessness, and evaluated the impact of joblessness on lost earnings and health care coverage.” The research was part of a larger long-term study of ARDS survivors who have been patients at 43 hospitals across the United States. Investigators recruited 922 survivors and did telephone interviews at six and 12 months after the onset of ARDS. Each survivor was asked about employment, hours per week, how soon after hospital discharge they returned to work, perceived effectiveness at work, and any major change in occupation. Researchers estimated lost earnings using age- and sex-matched wage data from the US Bureau of Labor Statistics. Of the 922 survivors, 386 (42 percent) were employed prior to ARDS. The average age of these previously employed survivors was 45; 4 percent were 65 or older. Overall, previously employed survivors were younger, predominantly male, and had fewer pre-existing health conditions compared with survivors not employed before ARDS. Of the 379 previously employed patients who survived to the 12-month follow-up, 44 percent were jobless by that time. Some 68 percent of survivors had returned to work at some point during the 12-month follow-up period, but many with fewer hours or reduced on-the-job effectiveness. About 24 percent then lost their jobs. Throughout the 12-month follow-up period, non-retired jobless survivors had an average estimated earnings loss of about $27,000 each, or 60 percent of their pre-ARDS annual earnings. The findings also show a substantial decline in private health insurance coverage (from 44 to 30 percent) and a rise in Medicare and Medicaid enrollment (33 to 49 percent). 
“We believe that ARDS survivors are often jobless due to a combination of physical, psychological, and cognitive impairments that may result, in part, from a culture of deep sedation and bed rest that plagues many ICUs,” says Needham, senior author of the study published in the American Journal of Respiratory and Critical Care Medicine. “Perhaps if we can start rehabilitation very early, while patients are still on life support in the intensive care unit, getting them awake, thinking, and moving sooner, this may result in greater cognitive and physical stimulation and improved well-being.” Other researchers involved in the study were from Johns Hopkins, UCLA and Intermountain Medical Center in Utah. Funding came from the National Heart, Lung and Blood Institute, the ARDS Network and the NIH-funded UCLA Clinical and Translational Science Institute.
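The lost-earnings figure rests on a matching step: each previously employed survivor's months out of work are priced using wage data for workers of the same age and sex. The sketch below illustrates that bookkeeping in simplified form; the wage table and survivor records are placeholders rather than Bureau of Labor Statistics or study data.

```python
# Simplified sketch of the lost-earnings estimate described above: price each
# survivor's jobless months with an age- and sex-matched wage. The wage table
# and survivor records below are placeholders, not BLS or study data.

# Hypothetical matched monthly wages by (sex, age band), in dollars.
monthly_wage = {
    ("male", "25-44"): 4200,
    ("male", "45-64"): 4800,
    ("female", "25-44"): 3600,
    ("female", "45-64"): 4000,
}

def age_band(age):
    return "25-44" if age < 45 else "45-64"

# Hypothetical survivors: (sex, age, months jobless during the 12-month follow-up).
survivors = [
    ("male", 45, 7),
    ("female", 38, 12),
    ("male", 52, 3),
]

losses = [monthly_wage[(sex, age_band(age))] * months
          for sex, age, months in survivors]
print(f"average estimated lost earnings: ${sum(losses) / len(losses):,.0f}")
```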


News Article | April 17, 2017
Site: www.nature.com

After 13 years exploring Saturn and its moons, NASA’s Cassini spacecraft has just 5 months left to live. But it will go out with a scientific bang. On 22 April, Cassini will slingshot past Titan, Saturn’s largest moon, for the last time. Four days later, the probe will hurtle into the unexplored region between the giant planet and its rings. Cassini will thread that 2,400-kilometre-wide gap 22 times before its kamikaze dive into Saturn’s atmosphere on 15 September. This unprecedented journey promises to yield fresh discoveries for the venerable spacecraft. “It will be like a whole new mission,” says Linda Spilker, Cassini’s project scientist at NASA’s Jet Propulsion Laboratory (JPL) in Pasadena, California. “There are fundamental new scientific measurements to make.” Those include the first direct tastes of particles in Saturn’s rings, and of its upper atmosphere; the best measurements yet of the planet’s magnetic and gravitational fields, which could answer long-standing questions such as how fast the planet rotates and how old its rings are; and the sharpest look yet at the inner rings. It all begins with the spacecraft’s final fly-by of Titan, the 127th such close encounter. Cassini will scan the moon’s methane lakes one last time, looking for waves, bubbles or other phenomena roiling the surface. Earlier fly-bys have revealed changes in the lakes over time, and the final pass is the last chance to look for seasonal shifts, says Sarah Hörst, a planetary scientist at Johns Hopkins University in Baltimore, Maryland. Titan’s gravitational pull will fling Cassini into its ‘grand finale’ orbits, plunging between Saturn’s innermost ring and the planet’s cloud tops (see ‘Cassini: the final frontier’). The spacecraft will turn its main antenna forward, to act as a protective shield against any errant ring particles as it whizzes along at 110,000 kilometres per hour. Since November, the probe has been climbing higher relative to Saturn’s equatorial plane, providing a new vantage point on the planet’s outer rings. The upcoming inner dives will also reveal spectacular new details, says Carolyn Porco, a planetary scientist at the University of California, Berkeley, who leads the mission’s imaging team. High-resolution photographs have captured mysterious propeller-shaped gaps that ripple through some of the farther-out rings, probably formed by unseen moonlets. “The rings really are changing before our eyes,” says Jeffrey Cuzzi, a planetary scientist at NASA’s Ames Research Center in Moffett Field, California. Cassini’s remote-sensing instruments will get their closest look yet at the rings, on sides both lit and unlit by the Sun. Measurements will show how the chemical make-up of the ring particles varies from place to place — information that is crucial for researchers who are trying to tease out which compounds pollute the rings’ otherwise pure ice. And scientists might finally unravel the rings’ biggest mystery — how old they are and how they formed. Between May and July, Cassini will make its most precise measurements of Saturn’s gravitational field; by tracking the spacecraft’s motion as it flies between the planet and the rings, mission scientists expect to improve their calculations of the mass of the rings by an order of magnitude. A relatively high mass would suggest that the rings were ancient, perhaps formed by a big moon ripped apart billions of years ago. Lighter-weight rings would suggest a more recent formation, perhaps from a visiting comet that disintegrated. 
Other fundamental measurements will tackle the giant planet itself. On the grand-finale orbits, Cassini’s magnetometer will measure Saturn’s magnetic field close to the planet. There, it is roughly ten times stronger — and more complex and scientifically interesting — than in areas already probed, says Marcia Burton, a planetary scientist at JPL. Those data should shed light on long-standing mysteries such as the depth of Saturn’s metallic hydrogen core — which powers its magnetic field — and how quickly the planet rotates. Observations by the Voyager spacecraft in the 1980s suggested that one rotation takes just under 11 hours. But the numbers are different when measured in the northern and southern hemispheres, which hints that something more complicated is going on. “It is hard to imagine how the grand-finale orbits could not lead to a huge improvement in our understanding of Saturn’s magnetic field,” Burton says. On 15 September, with its tanks almost out of fuel, mission controllers will steer Cassini directly into Saturn. But the craft will still radio back observations of the gases that make up Saturn’s atmosphere. “Even in its final moments, Cassini will be doing groundbreaking science,” says Hörst.


News Article | April 27, 2017
Site: www.eurekalert.org

The American Geriatrics Society (AGS) has named Fatima Sheikh, MD, CMD, MPH, Medical Director at FutureCare in Maryland and Assistant Professor at Johns Hopkins University School of Medicine, the 2017 AGS Clinician of the Year. In her work across post-acute and long-term care, Dr. Sheikh is recognized not only as a skilled physician serving the needs of particularly frail older adults in the Baltimore area but also as a dedicated mentor for a diverse and growing interprofessional team. Dr. Sheikh will be honored at the AGS 2017 Annual Scientific Meeting (May 18-20 in San Antonio, Texas). "Geriatrics expertise is complex and multifaceted, and that's especially true when working with frail older adults like those cared for by Dr. Sheikh," notes AGS President Ellen Flaherty, PhD, APRN, AGSF. "Dr. Sheikh and the healthcare professionals fortunate enough to learn from her are setting a new standard for what it means to provide high-quality, person-centered care." With a professional interest in improving long-term care for residents of skilled nursing facilities, Dr. Sheikh has championed several programs at FutureCare for managing multiple chronic conditions and transitions to and from the hospital--both critical components of long-term care for older adults. In 2012, for example, Dr. Sheikh played a key role in establishing the Johns Hopkins Community Partnership Skilled Nursing Facility Collaborative, one of five Johns Hopkins University community health projects funded by the Centers for Medicare and Medicaid Services Innovation Center. And as these programs enhance the broader setting for geriatrics expertise, Dr. Sheikh has remained equally committed to high-quality care as a clinician and a mentor. Dr. Sheikh is recognized for her skill in conducting Advance Care Planning--an innovative service for documenting care preferences and one only recently recognized for Medicare beneficiaries. She continues to work with colleagues and fellows from the Johns Hopkins University School of Medicine on the nuances of geriatrics care, particularly when it comes to navigating cultural competency for diverse older adults and their families. "The most exciting experience for me in taking care of older adults is listening to their life stories," notes Dr. Sheikh. "And the most rewarding experience is when I can help make their end-of-life as comfortable and peaceful as possible." Dr. Sheikh has been a member of the AGS since 2011, when she first began her fellowship in geriatrics. She is board-certified in internal and geriatric medicine, and continues to balance her clinical career in geriatrics with research on health system innovation and improvements to care quality. The AGS Clinician of the Year Award recognizes exceptional health professionals who deliver outstanding care to older adults and who model the importance of geriatrics for our country's growing older adult population. It is one of several honors conferred by the AGS at its Annual Scientific Meeting--held this year in San Antonio, Texas, May 18-20. The 2017 award recipients include more than 15 healthcare leaders representing the depth and breadth of disciplines championing care for older adults. For more information, visit AmericanGeriatrics.org. Founded in 1942, the American Geriatrics Society (AGS) is a nationwide, not-for-profit society of geriatrics healthcare professionals that has--for 75 years--worked to improve the health, independence, and quality of life of older people. 
Its nearly 6,000 members include geriatricians, geriatric nurses, social workers, family practitioners, physician assistants, pharmacists, and internists. The Society provides leadership to healthcare professionals, policymakers, and the public by implementing and advocating for programs in patient care, research, professional and public education, and public policy. For more information, visit AmericanGeriatrics.org. The AGS Clinician of the Year Award was established to recognize the contributions of practitioners to quality health care for older people and the importance of the geriatrics clinician in our healthcare delivery system. Through awardees' efforts, scientific advances are integrated into the practice of geriatrics, resulting in improved well-being and quality of life for older adults. The AGS Annual Scientific Meeting is the premier educational event in geriatrics, providing the latest information on clinical care, research on aging, and innovative models of care delivery. More than 2,500 nurses, pharmacists, physicians, physician assistants, social workers, long-term care and managed care providers, healthcare administrators, and others will convene May 18-20, 2017 (pre-conference program on May 17), at the Henry B. Gonzalez Convention Center in San Antonio, Texas, to advance geriatrics knowledge and skills through state-of-the-art educational sessions and research presentations. For more information, visit AmericanGeriatrics.org.


US President Donald Trump spoke with Japanese Prime Minister Shinzo Abe, discussing the joint drills under way between the US carrier Carl Vinson and Japan's Maritime Self-Defence Force. Washington (AFP) - An aircraft carrier the US Navy said was steaming toward the Korean Peninsula amid rising tensions has not yet departed, a US defense official acknowledged Tuesday. The Navy on April 8 said it was directing a naval strike group headed by the USS Carl Vinson supercarrier to "sail north," as a "prudent measure" to deter North Korea. Pentagon chief Jim Mattis on April 11 said the Vinson was "on her way up" to the peninsula. President Donald Trump the next day said: "We are sending an armada. Very powerful." But a defense official told AFP Tuesday that the ships were still off the northwest coast of Australia. A Navy photograph showed the Vinson off Java over the weekend. "They are going to start heading north towards the Sea of Japan within the next 24 hours," the official said on condition of anonymity. The official added that the strike group wouldn't be in the region before next week at the earliest -- it is thousands of nautical miles from the Java Sea to the Sea of Japan. At the time of the strike group's deployment, many media outlets said the ships were steaming toward North Korea, when in fact they had temporarily headed in the opposite direction. The United States ratcheted up its rhetoric ahead of North Korea's military parade and failed missile launch over the weekend, and Vice President Mike Pence on Monday declared that the era of US "strategic patience" in dealing with Pyongyang was over. North Korean leader Kim Jong-un responded with his own fiery warnings and threatened to conduct weekly missile tests. It was not clear if the issue was the result of poor communication by the Navy, but some observers were critical. Joel Wit, a co-founder of the 38 North program of the US-Korea Institute at Johns Hopkins University, said the matter was "very perplexing" and fed into North Korea's narrative that America is all bluster and doesn't follow through on threats. "If you are going to threaten the North Koreans, you better make sure your threat is credible," Wit said. "If you threaten them and your threat is not credible, it's only going to undermine whatever your policy toward them is." The strike group has been conducting drills with the Australian navy in recent days, the official said, though it scrapped a planned port visit in Australia as a result of the new orders.


News Article | May 1, 2017
Site: news.yahoo.com

Washington (AFP) - US President Donald Trump's administration on Monday put the brakes on a scheme championed by former first lady Michelle Obama meant to ensure healthier meals in schools through lower salt, fats and sugar. The Agriculture Department said in a statement the change would give American schools "greater flexibility" and stop kids from throwing out the less appetizing food mandated under the scheme. Forcing schools to adopt better nutritional standards, introduced in 2012 under the Healthy, Hunger-Free Kids Act, if they wanted federal subsidies for meals was one of Michelle Obama's key achievements. Lauded by supporters in the US as crucial in the fight against childhood obesity, the initiative put restrictions on sodium and sweetened milks, and required school lunches to increase the amount of whole-grain foods they contained. The Trump administration's backtrack on the measure came the same day as a study suggesting that if American children exercised more, tens of billions of dollars in medical costs could be saved over their lifetimes. The finding was made by researchers at Johns Hopkins University's Bloomberg School of Public Health. According to the US Centers for Disease Control and Prevention, about one in six American children are overweight or obese. The Agriculture Department said the nutritional requirements imposed on schools in the past five years had added $1.2 billion in costs to school districts and to states. It said easing those rules would decrease costs, give back greater control to local authorities, and see children eating more enthusiastically. "If kids aren't eating the food, and it's ending up in the trash, they aren't getting any nutrition -- thus undermining the intent of the program," Agriculture Secretary Sonny Perdue said. At the same time, the Trump administration discontinued another signature program by Michelle Obama and her husband Barack Obama. The 2015 scheme designed to promote educational opportunities for adolescent girls in developing countries, called "Let Girls Learn," was being immediately scrapped, CNN reported on the basis of an internal email it had obtained. The email, sent to members of the US Peace Corps, a US government overseas volunteer program, said some parts of "Let Girls Learn" would continue, but the name was being dropped along with its standalone status.


News Article | May 3, 2017
Site: www.eurekalert.org

VIDEO: Helistroke is the doctor flying to the stroke patient. A Johns Hopkins Lifeline helicopter arrives to pick up and transport Dr. Ferdinand Hui to Washington, D.C., to treat a stroke patient. Flying a stroke specialist by helicopter to a nearby stroke patient for emergency care is feasible, saves money and, most importantly, gets critical care to patients faster than transporting the patient to a hospital first, according to a single-patient, proof-of-concept study by a Johns Hopkins Medicine research team. Although the study was not designed to show whether "helistroke service" would improve outcomes for patients, previous research has amply demonstrated that stroke victims do best when they are treated as quickly as possible -- ideally in 100 minutes or less. A report of the findings, published in the Journal of Neurointerventional Surgery on May 3, details what is believed to be the first test of transporting a physician by helicopter to perform a standard intervention for a stroke. "With the development of effective treatments, the most limiting factor to treating acute stroke is infrastructure -- we have to keep evolving our systems to get therapy to as many appropriate patients as possible," says Ferdinand K. Hui, M.D., associate professor of radiology and radiological science at the Johns Hopkins University School of Medicine. Hui, the report's first author, is the physician who was transported via helicopter for the study. In the traditional model of care, people experiencing an acute ischemic stroke (a cutoff of blood supply in a blood vessel to the brain) are taken to a hospital with a specialized center capable of performing a minimally invasive therapy in which a physician inserts a catheter into the groin and threads it up through blood vessels to the blood clot in the brain causing the stroke. Once the catheter is in place, the physician delivers drugs that break up the clot. Patient transport time, however, can be significant and, in many cases, stroke victims are first taken to a nearby community hospital, then transported to the specialized center, often further delaying time to treatment and lowering the odds of recovery or reduced disability. In a recent study analyzing the results of a global, multicenter trial, data show a 91 percent probability of favorable stroke outcome if patients' blood flow was restored within 150 minutes of stroke. The next 60 minutes of delay, researchers found, resulted in a 10 percent reduction of good outcome. An additional 60 minutes resulted in an additional 20 percent reduction of good outcome. For the best chance of a favorable outcome, preintervention time was calculated to be less than 100 minutes. To test the feasibility of a physician-to-patient model that could potentially improve outcomes for a time-sensitive procedure, investigators designed a study to fly Hui by Johns Hopkins Lifeline from Baltimore to a National Institutes of Health Stroke Center at Suburban Hospital in Washington, D.C. -- 39.4 miles away -- to treat a stroke victim. Suburban, part of the Johns Hopkins Health System, has radiologists and the necessary equipment to image blood vessels but no neurointerventional experts on hand to provide immediate, catheter-based treatment. A patient was eligible for treatment in the pilot study if he or she had a large vessel blockage and a National Institutes of Health Stroke Scale rating greater than eight, which is considered a severe stroke. 
The stroke scale is a 15-item neurologic examination used to evaluate the potential damage of stroke as soon as possible after it occurs. In January 2017, such a patient was identified at Suburban at 11:12 a.m. Scans to view the patient's blood vessels and brain tissue were initiated at 11:46 a.m. and completed at 11:58 a.m. Hui, who was at The Johns Hopkins Hospital in Baltimore, was alerted at 12:07 p.m. Johns Hopkins Lifeline, which provides critical care transportation, was called at 12:13 p.m. Weather clearance for helicopter takeoff was obtained at 12:24 p.m., and the helicopter flight from The Johns Hopkins Hospital to Suburban Hospital took 19 minutes. Hui inserted the catheter into the patient at 1:07 p.m. and completed treatment at 1:41 p.m. Total time between decision-to-treat and groin puncture was 43 minutes, and between decision-to-treat and groin closure was 77 minutes. These times are comparable with time to treatment for patients treated at a single institution without a transfer. The patient received tissue plasminogen activator, a clot-dissolving drug, and improved clinically. Hui says the helistroke service model not only has the potential to reduce transport time and improve patient outcomes, but also could expand ideal standards of care to rural and other populations, where specialized care is limited. "Up until now, the model has been that the 'right place' was a central location, like a tertiary facility such as The Johns Hopkins Hospital," says Jim Scheulen, M.B.A., chief administrative officer of emergency medicine at The Johns Hopkins Hospital. "But what we have demonstrated here is that bringing the right resources in the right time to the patient may actually be a better approach than always moving the patient." Hui cautions that the helistroke service is not always the right or best choice: weather restrictions, specialist availability and transportation costs limit the use of the model. But flying a specialist to a patient may also eliminate some of the costs of nursing care, monitoring equipment and ambulance transport to one or more hospitals, and may mean fewer days of hospitalization and rehabilitation for stroke patients, he says. Although costs vary among regions and hospital networks, the cost of transferring a physician in this case was roughly 20 percent ($2,000-$3,000) of the average patient helicopter transfer cost ($6,500-$8,000) for the hospital network. Other authors on this paper include Amgad El Mekabaty, Kelvin Hong, Karen Horton, Victor Urrutia and Shawn Brast of Johns Hopkins Medicine; Jacky Schultz of Suburban Hospital; and Imama Naqvi, John K. Lynch and Zurab Nadareishvili of the National Institutes of Health.
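Because the report gives its timeline as clock times, the intervals it cites can be recomputed directly. The sketch below converts the published timestamps into elapsed minutes; the article does not say which moment counts as the decision to treat, so the last two figures printed are simply the spans that match the reported 43- and 77-minute totals, measured from the 12:24 p.m. weather clearance.

```python
# Recompute elapsed minutes between the clock times reported in the article.
# Only the published timestamps are used; the labels follow the article text.
from datetime import datetime

def t(hhmm):
    return datetime.strptime(hhmm, "%H:%M")

timeline = {
    "patient identified": t("11:12"),
    "imaging started":    t("11:46"),
    "imaging completed":  t("11:58"),
    "Hui alerted":        t("12:07"),
    "Lifeline called":    t("12:13"),
    "weather clearance":  t("12:24"),
    "groin puncture":     t("13:07"),
    "groin closure":      t("13:41"),
}

def minutes(start, end):
    return int((timeline[end] - timeline[start]).total_seconds() // 60)

print("identification to closure:", minutes("patient identified", "groin closure"), "min")
print("clearance to puncture:    ", minutes("weather clearance", "groin puncture"), "min")
print("clearance to closure:     ", minutes("weather clearance", "groin closure"), "min")
```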


News Article | March 24, 2017
Site: www.techtimes.com

In an earlier study, Bert Vogelstein, from Johns Hopkins University, and colleagues argued that random DNA errors that happen when self-renewing cells divide can be blamed for the development of some cases of cancer. Now in a new study, the researchers showed how these unlucky mutations play a more significant role in the development of cancer than environmental, hereditary or lifestyle factors by estimating, for the first time, the percentage of cancer mutations that is due to each of those three factors as well as to random chance. The researchers found that, in general, 66 percent of the genetic mutations that go on to cause cancer are due to simple random errors that occur when cells replace themselves. The findings showed that these bad luck mutations are the biggest cause of cancer when compared with other factors: environmental factors contribute 29 percent, and the remaining 5 percent are inherited. "Every time a perfectly normal cell divides, as you all know, it makes several mistakes -- mutations," explained Vogelstein. "Now most of the time, these mutations don't do any harm. They occur in junk DNA, genes unrelated to cancer, unimportant places with respect to cancer. That's the usual situation and that's good luck." Many people who engage in healthy practices such as eating a healthy diet and not smoking still develop cancer. The findings of the study offer one reason why, despite having healthy lifestyles, some people have DNA mutations that lead to cancer. Despite the findings, the researchers still stressed the importance of healthy living, because some cancers are driven by avoidable factors. Vogelstein, for instance, said that the study does not contradict the generally accepted wisdom that most cases of lung cancer are preventable. Men who smoke have a 23-fold higher risk of developing lung tumors than men who do not smoke, and women who smoke have a 13-fold higher risk than women who do not. Cigarette smoking produces more genetic mutations than would normally occur, and, just like random mutations, smoking-related mutations can either affect cancer-driving genes or stretches of DNA that are not relevant to cancer. While cancers of tissues whose cells divide frequently, such as colon cancer, may have a high input from chance mutations, diet, smoking and physical activity play roles in the development of those cancers as well. The researchers likewise said that the findings of the study fit well with the general advice that 40 percent of all cancers are preventable if people eat plenty of fruits and vegetables, exercise on a regular basis, avoid red meat, do not smoke, and stay away from harmful UV rays such as those that come from tanning beds and sunlight. "Primary prevention is the best way to reduce cancer deaths. Recognition of a third contributor to cancer — R mutations — does not diminish the importance of primary prevention but emphasizes that not all cancers can be prevented by avoiding environmental risk factors," the researchers wrote in their study published in the journal Science. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


News Article | April 25, 2017
Site: www.futurity.org

Low levels of a brain protein may combine with another long-suspected culprit to trigger the learning and memory losses in Alzheimer’s disease, a study shows. The discovery should open up important new research areas, scientists say—and may one day lead to better therapies for the disease and other forms of cognitive decline. “These findings represent something extraordinarily interesting about how cognition fails in human Alzheimer’s disease,” says Paul Worley, a neuroscientist at Johns Hopkins University School of Medicine and the senior scientist in the study. Alzheimer’s is estimated to affect more than 5 million Americans. Clumps of proteins called amyloid plaques, long seen in the brains of Alzheimer’s patients, have often taken the blame for the mental decline associated with the disease. But autopsies and imaging studies reveal that people can have high levels of amyloid in the brain without displaying Alzheimer’s symptoms, which calls into question a direct link between amyloid and dementia. The new study, published in eLife, shows that when the NPTX2 gene produces less of its protein at the same time that amyloid is accumulating in the brain, circuit adaptations that are essential for neurons to work together are disrupted. That results in a failure of memory. “The key point here is that it’s the combination of amyloid and low NPTX2 that leads to cognitive failure,” Worley says. Worley’s lab group studies “immediate early genes,” so called because they’re activated almost instantly in brain cells when people and other animals have an experience that results in a new memory. NPTX2 is one of these immediate early genes; it makes a protein that neurons use to strengthen “circuits” in the brain. “Those connections are essential for the brain to establish synchronized groups of ‘circuits’ in response to experiences,” Worley says. “Without them, neuronal activation cannot be effectively synchronized and the brain cannot process information.” Worley says he was intrigued by studies indicating altered patterns of activity in brains of people with Alzheimer’s and wondered whether altered activity was linked to changes in immediate early gene function. To get answers, researchers first turned to archived human brain tissue samples. They discovered that NPTX2 protein levels were reduced by as much as 90 percent in brain samples from Alzheimer’s patients. Samples with amyloid plaques from people who had never shown signs of Alzheimer’s had normal levels of NPTX2. This was an initial suggestion of a link between NPTX2 and cognition. The scientists then examined mice bred without the rodent equivalent of the NPTX2 gene and discovered that a lack of NPTX2 alone wasn’t enough to affect cell function. But then they added a gene that increases amyloid generation to the mouse brains. In brain slices from mice with both amyloid and no NPTX2, fast-spiking interneurons could not control brain “rhythms” important for making new memories. Examination of cerebrospinal fluid from 60 living Alzheimer’s patients and 72 people without the disease provided further evidence. “Perhaps the most important aspect of the discovery is that NPTX2 reduction appears to be independent of the mechanism that generates amyloid plaques,” Worley says. “This means that NPTX2 represents a new mechanism.” One immediate application, he says, may be figuring out whether NPTX2 levels can help identify patients who can best be helped by new drugs.
For instance, drugs that disrupt amyloid may be more effective in patients with relatively high NPTX2. Worley’s group is helping companies try to develop a commercial test that measures NPTX2 levels. More work is needed, Worley adds, to understand why NPTX2 levels become low in Alzheimer’s and how to prevent or slow that process. Others involved in the study work at Johns Hopkins; the National Institutes of Health; the University of California, San Diego; Shiley-Marcos Alzheimer’s Disease Research Center; Columbia University; the Institute for Basic Research; and the University of Exeter. The National Institutes of Health, the Alzheimer’s Disease Discovery Foundation, and LuMind Research Down Syndrome Foundation funded the work.


News Article | April 27, 2017
Site: www.scientificamerican.com

As if Facebook wasn’t already pervasive enough in everyday life, the company’s newly formed Building 8 “moon shot” factory is working on a device they say would let people type out words via a brain–computer interface (BCI). If all goes according to plan—and that’s a big if—Building 8’s neural prosthetic would strap onto a person’s head, use an optical technique to decode intended speech and then type those thoughts on a computer or smartphone at up to 100 words per minute. This would be an order-of-magnitude faster than today’s state-of-the-art speech decoders. The use of light waves to quickly and accurately read brain waves is a tall order, especially when today’s most sophisticated BCIs, which are surgically implanted in the brain, can translate neural impulses into binary actions—yes/no, click/don’t click—at only a fraction of that speed. Still, Facebook has positioned its Building 8 as an advanced research and development laboratory launched in the model of Google’s X, the lab behind the Waymo self-driving car and Glass augmented-reality headset. So it is no surprise Building 8’s first project out of the gate proposes a pretty far-fetched technology to tackle a problem that neuroscientists have been chipping away at for decades. Here’s how the proposed device would work: the BCI will use optical fibers to direct photons from a laser source through a person’s skull into the cerebral cortex, specifically those areas involved in speech production. The BCI would “sample groups of neurons [in the brain’s speech center] and analyze the instantaneous changes in optical properties as they fire,” says Regina Dugan, head of Building 8 and a former executive at both Google and the Defense Advanced Research Projects Agency (DARPA). Light scattering through the neurons would reveal changes in their shape and configuration as the brain cells and their components—mitochondria, ribosomes and cell nuclei, for example—move. Building 8’s BCI would measure the number and type of photons bounced off of the neurons in the cortex and send that information—wirelessly or via a cable—to a computer that uses machine-learning software to interpret the results. That interpretation would then be typed as text on the screen of a computer, smartphone or some other gadget. The speech production network in your brain executes a series of planning steps before you speak, says Mark Chevillet, Building 8’s technical lead on the BCI project. “In this system we’re looking to decode neural signals from the stage just before you actually articulate what you want to say.” Because the researchers are focusing on a very specific application—speech—they know the prosthetic’s sensors must have millimeter-level resolution and be able to sample brain waves at about 300 times per second in order to measure the brain’s speech signals with high fidelity, Dugan says. “This isn’t about decoding random thoughts. This is about decoding the words you’ve already decided to share by sending them to the speech [production] center of your brain,” she says. The brain’s speech centers usually refer to Wernicke’s area (speech processing) and Broca’s area (speech production). The latter then sends output to the motor cortex to produce the muscle movements that result in speech. Chevillet and Dugan position the project as a potential communication option for the large numbers of people who suffer from amyotrophic lateral sclerosis (ALS) and other conditions that prevent them from being able to type or even speak. 
Furthermore, Dugan points out that the interface would also offer a “more fluid human–computer interface” that supports Facebook’s efforts to promote augmented reality (AR). “Even a very simple capability to do something like a yes/no brain click would be foundational for advances in AR,” Dugan says. “In that respect it becomes a bit like the mouse was in the early computer interface days. Think of it like a ‘brain mouse.’” For all of that to happen, Building 8 must develop a BCI that fits over the head while also being able to produce the high-quality signals needed to decode neural activity into speech, says Chevillet, a former program manager of applied neuroscience at Johns Hopkins University. He and his team want to build a modified version of the functional near-infrared spectroscopy (fNIRS) systems used today for neuroimaging. Whereas conventional fNIRS systems work by bouncing light off a tissue sample and analyzing all of the returning photons, no matter how diffuse, Building 8’s prosthetic would detect only those photons that have scattered a small number of times—so-called quasi-ballistic photons—in order to provide the necessary spatial resolution. Additional challenges remain if and when Chevillet’s team can deliver their proposed prosthetic. One is whether the changes in the returning light will create patterns unique enough to represent each of the letters, words and phrases needed to translate brain waves into words on a screen, says Stephen Boppart, director of the University of Illinois at Urbana–Champaign’s Center for Optical Molecular Imaging. If that is possible, you might be able to train a person to generate different thought patterns over time that would correspond to a particular word or phrase, “but that hasn’t really been demonstrated,” he says. Dugan and Chevillet acknowledge the obstacles but say they intend to build on key research related to their work. One recent study, for example, demonstrated that several paralyzed individuals could communicate using signals recorded directly from parts of the motor cortex that control arm movements, achieving some of the fastest brain-typing speeds to date (ranging from three to eight words per minute). Another study showed machine learning can successfully decode information from neural signals. Both projects, however, relied on electrodes placed in or on the surface of the brain. Chevillet’s team hopes to have a good idea of the technology needed to create their new optical prosthetic within two years, although it is unclear when they might build a working prototype. To meet these ambitious goals Building 8 has, over the past six months, recruited at least 60 scientists and engineers from the University of California, San Francisco; U.C. Berkeley; Johns Hopkins University’s Applied Physics Laboratory; Johns Hopkins Medicine; and Washington University School of Medicine in Saint Louis who specialize in machine-learning methods for decoding speech and language, optical neuroimaging systems and advanced neural prosthetics, Dugan says. Regardless of whether Building 8 succeeds in delivering its BCI prosthetic, Facebook’s investment in the project is a big win for science, says Adam Gazzaley, founder and executive director of U.C. San Francisco’s Neuroscape translational neuroscience center. “We have increasing struggles to squeeze money out of the National Institutes of Health, especially to do high-risk, high-reward projects like what Facebook is describing,” says Gazzaley, who is not involved in the Building 8 research.
“It’s a great sign and should be encouraged and applauded if large companies in the consumer space are taking such serious efforts to be innovative in neuroscience.”
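Building 8 has not published its decoding method, but the general pipeline the article describes, in which an optical signal is sampled from many channels at roughly 300 times per second and machine-learning software maps those measurements to intended words, can be sketched in a few lines. The Python example below is purely illustrative: the channel count, the synthetic data, the per-channel mean features and the scikit-learn classifier are all assumptions made for this sketch, not Facebook's design.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Assumed toy dimensions: 64 optical channels sampled at 300 Hz for a 0.5 s window per "word".
N_CHANNELS, SAMPLE_RATE, WINDOW_S = 64, 300, 0.5
N_SAMPLES = int(SAMPLE_RATE * WINDOW_S)
WORDS = ["yes", "no", "hello", "stop"]        # hypothetical vocabulary

def synthetic_trial(word_idx):
    """Fake optical recording: noise plus a word-specific offset on a subset of channels."""
    signal = rng.normal(0.0, 1.0, size=(N_CHANNELS, N_SAMPLES))
    signal[word_idx::len(WORDS)] += 0.5        # crude class-dependent pattern
    return signal

# Build a dataset of trials and simple per-channel mean features.
X, y = [], []
for _ in range(200):
    w = rng.integers(len(WORDS))
    X.append(synthetic_trial(w).mean(axis=1))  # one feature per channel
    y.append(w)
X, y = np.array(X), np.array(y)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("toy decoding accuracy:", clf.score(X_test, y_test))
print("example prediction:", WORDS[clf.predict(X_test[:1])[0]])

On this synthetic data the classifier separates the classes almost perfectly, which says nothing about real optical recordings; it only shows where the hard problems, signal quality and feature choice, would sit in such a pipeline.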


News Article | April 25, 2017
Site: www.sciencedaily.com

Working with human brain tissue samples and genetically engineered mice, Johns Hopkins Medicine researchers together with colleagues at the National Institutes of Health, the University of California San Diego Shiley-Marcos Alzheimer's Disease Research Center, Columbia University, and the Institute for Basic Research in Staten Island say that consequences of low levels of the protein NPTX2 in the brains of people with Alzheimer's disease (AD) may change the pattern of neural activity in ways that lead to the learning and memory loss that are hallmarks of the disease. This discovery, described online in the April 25 edition of eLife, will lead to important research and may one day help experts develop new and better therapies for Alzheimer's and other forms of cognitive decline. AD currently affects more than five million Americans. Clumps of proteins called amyloid plaques, long seen in the brains of people with AD, are often blamed for the mental decline associated with the disease. But autopsies and brain imaging studies reveal that people can have high levels of amyloid without displaying symptoms of AD, calling into question a direct link between amyloid and dementia. This new study shows that when the protein NPTX2 is "turned down" at the same time that amyloid is accumulating in the brain, circuit adaptations that are essential for neurons to "speak in unison" are disrupted, resulting in a failure of memory. "These findings represent something extraordinarily interesting about how cognition fails in human Alzheimer's disease," says Paul Worley, M.D., a neuroscientist at the Johns Hopkins University School of Medicine and the paper's senior author. "The key point here is that it's the combination of amyloid and low NPTX2 that leads to cognitive failure." Since the 1990s, Worley's group has been studying a set of genes known as "immediate early genes," so called because they're activated almost instantly in brain cells when people and other animals have an experience that results in a new memory. The gene NPTX2 is one of these immediate early genes that gets activated and makes a protein that neurons use to strengthen "circuits" in the brain. "Those connections are essential for the brain to establish synchronized groups of 'circuits' in response to experiences," says Worley. "Without them, neuronal activation cannot be effectively synchronized and the brain cannot process information." Worley says he was intrigued by previous studies indicating altered patterns of activity in brains of individuals with Alzheimer's. Worley's group wondered whether altered activity was linked to changes in immediate early gene function. To get answers, the researchers first turned to a library of 144 archived human brain tissue samples to measure levels of the protein encoded by the NPTX2 gene. NPTX2 protein levels, they discovered, were reduced by as much as 90 percent in brain samples from people with AD compared with age-matched brain samples without AD. By contrast, people with amyloid plaques who had never shown signs of AD had normal levels of NPTX2. This was an initial suggestion of a link between NPTX2 and cognition. Prior studies had shown NPTX2 to play an essential role for developmental brain wiring and for resistance to experimental epilepsy. To study how lower-than-normal levels of NPTX2 might be related to the cognitive dysfunction of AD, Worley and his collaborators examined mice bred without the rodent equivalent of the NPTX2 gene. 
Tests showed that a lack of NPTX2 alone wasn't enough to affect cell function as tested in brain slices. But then the researchers added to the mice a gene that increases amyloid generation in the brain. In brain slices from mice with both amyloid and no NPTX2, fast-spiking interneurons could not control brain "rhythms" important for making new memories. Moreover, a glutamate receptor that is normally expressed in interneurons and essential for interneuron function was down-regulated as a consequence of amyloid and NPTX2 deletion in mice, and was similarly reduced in human AD brain. Worley says the results suggest that the increased activity seen in the brains of AD patients is due to low NPTX2, combined with amyloid plaques, with consequent disruption of interneuron function. And if the effect of NPTX2 and amyloid is synergistic -- one depending on the other for the effect -- it would explain why not all people with high levels of brain amyloid show signs of AD. The team then examined NPTX2 protein in the cerebrospinal fluid (CSF) of 60 living AD patients and 72 people without AD. Lower scores of memory and cognition on standard AD tests, they found, were associated with lower levels of NPTX2 in the CSF. Moreover, NPTX2 correlated with measures of the size of the hippocampus, a brain region essential for memory that shrinks in AD. In this patient population, NPTX2 levels were more closely correlated with cognitive performance than current best biomarkers -- including tau, a biomarker of neurodegenerative diseases, and a biomarker known as A-beta-42, which has long been associated with AD. Overall, NPTX2 levels in the CSF of AD patients were 36 to 70 percent lower than in people without AD. "Perhaps the most important aspect of the discovery is that NPTX2 reduction appears to be independent of the mechanism that generates amyloid plaques. This means that NPTX2 represents a new mechanism, which is strongly founded in basic science research, and that has not previously been studied in animal models or in the context of human disease. This creates many new opportunities," says Worley. "One immediate application may be to determine whether measures of NPTX2 can be helpful as a way of sorting patients and identifying a subset that are most responsive to emerging therapies." For instance, drugs that disrupt amyloid may be more effective in patients with relatively high NPTX2. His group is now providing reagents to companies to assess development of a commercial test that measures NPTX2 levels. More work is needed, Worley adds, to understand why NPTX2 levels become low in AD and how that process could be prevented or slowed.


News Article | April 26, 2017
Site: www.biosciencetechnology.com

Clinical-stage biotech company Capricor Therapeutics Inc. has announced positive six-month results from a randomized 12-month Phase 1/2 trial in patients with Duchenne muscular dystrophy (DMD), which was designed to analyze safety and exploratory efficacy. DMD is a rare genetic disorder that occurs mostly in boys. It causes progressive muscular weakness, and patients often experience frequent falls, have trouble getting up, and struggle with daily tasks such as eating; some also have learning disabilities, and treatment options are limited. Cardiac disease is the most common cause of death among those with DMD, who often don't survive past their twenties. Early results from the trial showed statistically significant improvements in measures of cardiac and upper limb function in patients treated with the cell-based therapy called CAP-1002. CAP-1002 consists of allogeneic cardiosphere-derived cells, or CDCs, which are a type of progenitor cell. The HOPE trial involved 25 patients, aged 12 years and older, with DMD who had cardiomyopathy, or heart disease secondary to DMD. Thirteen patients were randomized to receive a single dose of CAP-1002, while 12 received usual care. The cell therapy was infused into the three main coronary arteries, with a total dose of 75 million cells. MRI assessments showed that those receiving the cell therapy had significant improvements in systolic thickening of the inferior wall of the heart. "The observed signal in global cardiac scar reduction and the increase in the thickening of the left ventricle during contraction are very encouraging," Joao A. C. Lima, M.D., professor of medicine and director of the Magnetic Resonance Imaging Core Lab at Johns Hopkins University School of Medicine, said in a prepared statement. "The population treated in HOPE had very advanced cardiac involvement, and to see such positive results following just a single-dose of CAP-1002 is remarkable." There were also statistically significant improvements in upper limb function, which was assessed using a test designed specifically for use in a DMD setting, called the Performance of the Upper Limb (PUL) test. The test simulates common daily activities that patients with DMD often have trouble performing, such as tearing paper or removing a container lid. The treatment was generally safe and well tolerated; no patients experienced a major adverse cardiac event. "In HOPE, we saw potential effects in both the heart and skeletal muscle that appear quite compelling in an exploratory trial," the trial's principal investigator, John L. Jefferies, M.D., professor of pediatric cardiology and adult cardiovascular diseases at the University of Cincinnati, said in a prepared statement. "These results clearly support the conduct of a confirmatory clinical trial in DMD to further evaluate the potential of CAP-1002. We look forward to an effective medication becoming available for people with this progressive and fatal disease, one that is poorly met by current options." Capricor plans to request Breakthrough Therapy or Regenerative Medicine Advanced Technology designations for the therapy from the Food and Drug Administration. The company said it anticipates releasing top-line 12-month data results from the trial during the fourth quarter of 2017.


News Article | April 26, 2017
Site: www.biosciencetechnology.com

Working with human brain tissue samples and genetically engineered mice, Johns Hopkins Medicine researchers together with colleagues at the National Institutes of Health, the University of California San Diego Shiley-Marcos Alzheimer's Disease Research Center, Columbia University, and the Institute for Basic Research in Staten Island say that consequences of low levels of the protein NPTX2 in the brains of people with Alzheimer’s disease (AD) may change the pattern of neural activity in ways that lead to the learning and memory loss that are hallmarks of the disease. This discovery, described online in the April 25 edition of eLife, will lead to important research and may one day help experts develop new and better therapies for Alzheimer’s and other forms of cognitive decline. AD currently affects more than five million Americans. Clumps of proteins called amyloid plaques, long seen in the brains of people with AD, are often blamed for the mental decline associated with the disease. But autopsies and brain imaging studies reveal that people can have high levels of amyloid without displaying symptoms of AD, calling into question a direct link between amyloid and dementia. This new study shows that when the protein NPTX2 is “turned down” at the same time that amyloid is accumulating in the brain, circuit adaptations that are essential for neurons to “speak in unison” are disrupted, resulting in a failure of memory. “These findings represent something extraordinarily interesting about how cognition fails in human Alzheimer’s disease,” says Paul Worley, M.D., a neuroscientist at the Johns Hopkins University School of Medicine and the paper’s senior author. “The key point here is that it’s the combination of amyloid and low NPTX2 that leads to cognitive failure.” Since the 1990s, Worley’s group has been studying a set of genes known as “immediate early genes,” so called because they’re activated almost instantly in brain cells when people and other animals have an experience that results in a new memory. The gene NPTX2 is one of these immediate early genes that gets activated and makes a protein that neurons use to strengthen “circuits” in the brain. “Those connections are essential for the brain to establish synchronized groups of ‘circuits’ in response to experiences,” says Worley, who is also a member of the Institute for Basic Biomedical Sciences. “Without them, neuronal activation cannot be effectively synchronized and the brain cannot process information.” Worley says he was intrigued by previous studies indicating altered patterns of activity in brains of individuals with Alzheimer’s. Worley’s group wondered whether altered activity was linked to changes in immediate early gene function. To get answers, the researchers first turned to a library of 144 archived human brain tissue samples to measure levels of the protein encoded by the NPTX2 gene. NPTX2 protein levels, they discovered, were reduced by as much as 90 percent in brain samples from people with AD compared with age-matched brain samples without AD. By contrast, people with amyloid plaques who had never shown signs of AD had normal levels of NPTX2. This was an initial suggestion of a link between NPTX2 and cognition. Prior studies had shown NPTX2 to play an essential role for developmental brain wiring and for resistance to experimental epilepsy. 
To study how lower-than-normal levels of NPTX2 might be related to the cognitive dysfunction of AD, Worley and his collaborators examined mice bred without the rodent equivalent of the NPTX2 gene. Tests showed that a lack of NPTX2 alone wasn’t enough to affect cell function as tested in brain slices. But then the researchers added to the mice a gene that increases amyloid generation in the brain. In brain slices from mice with both amyloid and no NPTX2, fast-spiking interneurons could not control brain “rhythms” important for making new memories. Moreover, a glutamate receptor that is normally expressed in interneurons and essential for interneuron function was down-regulated as a consequence of amyloid and NPTX2 deletion in mice, and was similarly reduced in human AD brain. Worley says the results suggest that the increased activity seen in the brains of AD patients is due to low NPTX2, combined with amyloid plaques, with consequent disruption of interneuron function. And if the effect of NPTX2 and amyloid is synergistic — one depending on the other for the effect — it would explain why not all people with high levels of brain amyloid show signs of AD. The team then examined NPTX2 protein in the cerebrospinal fluid (CSF) of 60 living AD patients and 72 people without AD. Lower scores of memory and cognition on standard AD tests, they found, were associated with lower levels of NPTX2 in the CSF. Moreover, NPTX2 correlated with measures of the size of the hippocampus, a brain region essential for memory that shrinks in AD. In this patient population, NPTX2 levels were more closely correlated with cognitive performance than current best biomarkers — including tau, a biomarker of neurodegenerative diseases, and a biomarker known as A-beta-42, which has long been associated with AD. Overall, NPTX2 levels in the CSF of AD patients were 36 to 70 percent lower than in people without AD. “Perhaps the most important aspect of the discovery is that NPTX2 reduction appears to be independent of the mechanism that generates amyloid plaques. This means that NPTX2 represents a new mechanism, which is strongly founded in basic science research, and that has not previously been studied in animal models or in the context of human disease. This creates many new opportunities,” says Worley. “One immediate application may be to determine whether measures of NPTX2 can be helpful as a way of sorting patients and identifying a subset that are most responsive to emerging therapies.” For instance, drugs that disrupt amyloid may be more effective in patients with relatively high NPTX2. His group is now providing reagents to companies to assess development of a commercial test that measures NPTX2 levels. More work is needed, Worley adds, to understand why NPTX2 levels become low in AD and how that process could be prevented or slowed.


News Article | April 17, 2017
Site: www.greentechmedia.com

Amid all the cacophony of President Donald Trump's first two months in office, it can be hard to parse what changes at the Department of Energy fall within the range of normalcy for a new administration -- and which do not. Well, sure, that recently appointed massage therapist from New Hampshire who tweeted about exterminating Muslims proved deviant enough to be relieved of duty. And the political aides assigned to monitor each of the cabinet secretaries for the White House have raised some eyebrows, according to The Washington Post. More generally speaking, though, it's customary for an incoming president to switch things up and appoint supporters to take the cabinet departments in new directions. The investigative team at ProPublica shed some light on this transition by digging up a list of "beachhead" appointments whom the White House has installed at the departments to get things off the ground while the permanent leadership teams materialize. GTM analyzed the new recruits installed at the DOE to see what perspectives they are bringing to the table. This is not a complete list of everyone new at the department -- the DOE declined to comment on specific personnel matters -- but from the information available, a few trends emerged. Of the 28 hires listed by ProPublica, 10 have discernable energy experience. Some worked as energy staffers on the Hill, some worked in the George W. Bush DOE and at least one spent some time at the Federal Energy Regulatory Commission. Most of the appointees with energy experience worked as lawyers, advocates or spokespeople for coal, oil or gas companies. Of the new energy staffers who have not worked in energy-related fields, many served the Trump campaign as state-level coordinators. Several had served Gov. Rick Perry in some capacity before he assumed leadership of the department. One used to work for Sen. Jeff Sessions, the early Trump backer and current attorney general. Then there are the wild cards, like Kyle Yunaska, a tax analyst at Georgetown University whose primary connection to the administration appears to be his status as the brother-in-law of Eric Trump, the president's son. Here's a breakdown of the new recruits and the perspectives and experience they're bringing to DOE. Every new administration brings in its own political appointees. It's a natural reward for hard work on an election that paid off. It's not surprising, then, to see eight of the new staffers come from roles in the Trump campaign. The beachhead team includes operatives from Ohio, Illinois, Texas and North Carolina. A few others advised Gov. Rick Perry and Ben Carson in their respective presidential bids. For instance, Justin Bis served as deputy state director of the Ohio GOP during the election cycle, with previous experience at the Michigan Republican Party. Brett Fetterly is coming from a foreign policy position at Johns Hopkins University's School of Advanced International Studies; he also advised Gov. Rick Perry's 2016 presidential campaign on foreign and security policy. ProPublica lists their titles, and those of most of the beachhead hires, as "assistant to the secretary." Higher up in the campaign hierarchy, Wells Griffith, battleground states director for the Trump campaign, will take on the role of "senior White House advisor." Then there's White House Liaison Joseph Uddo, who made headlines in April for his brusque approach to packing some Trump supporters into Delaware's GOP delegation. 
These and the other campaign vets will now apply their political experience to the day-to-day operations of the department. Of the 10 beachhead hires with a readily apparent background in energy, at least six have professionally represented or advocated for fossil fuel companies. Incoming Executive Advisor G. Michael Brown describes his experience in his LinkedIn summary: as manager for market development and government relations in Texas for Chesapeake Energy, "he handled state-specific issues, advocated for increased demand for compressed natural gas (CNG) vehicles and managed a nationwide public speaking and advocacy portfolio within the oil and gas sector." Attorney Joshua Campbell brings "11 years of oil & gas industry experience." Suzanne Jaworowski served as director of communications at Sunrise Coal, a company active in the Illinois basin that bills itself as the second-largest coal producer in Indiana. Doug Matheney ran the Ohio operation of the Count on Coal campaign, which attacked the Clean Power Plan and other regulations that would make it harder to burn coal for electricity. Resumes like that are sure to make clean energy advocates anxious, but they also denote an understanding of different aspects of the American energy system. "The people who represent these industries are the people who know these industries the best, whether they’re renewables or whether they’re coal," said Frank Maisano, an energy specialist at Bracewell’s policy resolution group, which represents energy companies across the fossil fuel, renewables and utility sectors. "This administration is going to have a different focus than the previous administration," he added. "They don't like renewables as much as the previous administration." The Obama administration wanted to support the growth of renewables, and brought people to DOE with experience in that industry. Trump has long made clear a preference for bolstering domestic coal and gas production, so it's not surprising he would turn to people who have advocated for those industries. The question now is how the fossil fuel professionals will use their positions within DOE to influence the nation's energy mix. Fossil fuels still produce two-thirds of the electricity made in the U.S., so those industries remain relevant to energy policymaking. Renewables advocates, though, are concerned that new arrivals might disadvantage the up-and-coming clean energy technologies in favor of incumbent coal, gas and oil. Scott Sklar has worked for clean energy in Washington for decades, including 15 years at the helm of the Solar Energy Industries Association. He's seen Republican administrations bring in their people to the DOE's renewables offices, and said progress on clean technologies had come out of that process in the past. "In the two different Bush administrations, many of the people in it from the traditional industries held a very open mind on allowing, in principle, new technologies in the marketplace -- that at least you should not put in artificial barriers," Sklar said. "In this administration, because they're picking a set of players who have been ideologically opposed to opening markets for new technologies and have been actively involved in trying to close off the new technologies, we don't expect that more dispassionate scenario to play out." 
Sklar was referring to a number of Trump appointees with ties to Koch Industries, the massive oil and gas conglomerate whose owners have sponsored well-documented efforts to forestall the growth of renewable energy and electric cars at the state and national levels. Trump tapped a former Koch lobbyist, Thomas Pyle, to run the energy transition team. Among the beachhead hires, there's an indirect monetary connection. Travis Fisher and Daniel Simmons worked in policy roles at the Institute for Energy Research, a D.C. nonprofit that produces research critical of clean energy subsidies. Official filings show that Charles Koch was on the board of directors of IER's predecessor, and Politico reported that the Kochs help fund IER. Trump in energy speeches has cited IER research that predicts big economic benefits from increased fossil fuel drilling on public lands. That indicates the organization had some sway in his energy thinking even before some of its members joined his DOE, or at least that its findings fit a narrative he wished to broadcast. Now the administration is hiring people for the DOE whose experience makes them well suited to carry out that vision of expanded extraction of domestic fuels. Unlike the emerging fields of clean energy, where DOE labs and basic research grants have helped new technologies get to market, oil, gas and coal companies have been operating at scale for decades, and tend to cite government regulations rather than the vagaries of cutting-edge technology transfer as the key obstacle to their growth. What service, exactly, can Trump's DOE provide to expand extraction of conventional fuels? There is an Office of Fossil Energy which guides research into "clean coal," carbon capture and storage and unconventional oil and gas recovery. Beachhead hire Mark Maddox was an acting assistant secretary there for the George W. Bush administration. But we haven't heard much talk lately about utilizing that unit, and the president's skinny budget includes fossil energy among the DOE programs slated for budget cuts (coal companies have asked that it be spared). A less active DOE might not hurt conventional fuels, but it certainly would put a damper on the innovation that has helped reduce the costs of renewable energy and expand its deployment in recent years. That could eventually decrease the rate at which renewables eat into conventional fuels' market share. Brash and unqualified hires, like the quickly-fired massage therapist, aren't the most pressing threat to renewables, said Minh Le, who ran the SunShot program at DOE and worked in President Barack Obama's Office of Management and Budget. "The greater concern comes from professional energy-sector appointees with immutable ideological beliefs about climate change and who know how to get things done, rather than from former campaign staffers with little or no relevant energy sector background and whose only qualification was supporting the president in the campaign." As the beachhead hires settle into their new roles, it will become clearer whether they have a positive strategy to enact for boosting American fuel production, or whether stripping away help for advanced clean energy will be more of a priority.


News Article | April 17, 2017
Site: www.newscientist.com

The laws of attraction rule Titan’s sands. Static electricity clumping up sand could explain the strange dunes on Saturn’s largest moon. Titan is a hazy moon with a thick, orange nitrogen atmosphere. Its poles are home to placid methane lakes, and its equatorial regions are covered with dunes up to 100 metres high. The dunes seem to be facing in the wrong direction, though. The prevailing winds on Titan blow toward the west, but the dunes point east. “You’ve got this apparent paradox,” says Josef Dufek at the Georgia Institute of Technology in Atlanta. “The winds are moving one way and the sediments are moving the other way.” To understand the shifting of Titan’s sands, Dufek and his colleagues placed grains of organic materials like those on Titan’s surface in a chamber with conditions simulating Titan’s and spun them in a cylindrical tumbler. When they opened the chamber, static electricity from the grains jostling in the dry air had clumped them together. “It was like when you open a box on a winter morning and the packing peanuts stick everywhere,” says Dufek. “These hydrocarbons on Titan are low density and they stick to everything, just like packing peanuts do.” Grains on Titan can maintain that charge and stick together for much longer than particles on Earth could, because of their low density and the dryness of Titan’s atmosphere. That could explain why the dunes don’t align with the wind. The breeze close to Titan’s surface is relatively mild, generally staying below 5 kilometres per hour. The sand’s “stickiness” would make it difficult for such low winds to move them. More powerful winds from storms or seasonal changes could blow otherwise-stable sands eastward, forming the dunes that we see today. “The relative importance of electrostatic forces on blowing sand are likely to be more significant on Titan,” says Ralph Lorenz at the Johns Hopkins University Applied Physics Lab in Laurel, Maryland. “The wind speed at which particles start to move could be higher than we might otherwise expect.” The unique clumping of Titan’s sands may even explain how the grains got there in the first place. Their make-up is similar to particles suspended in the soupy atmosphere, but the sand grains are much bigger. “The atmospheric particles are very small, so they can’t be the things blowing around in those dunes, but this is one way that we could make them grow,” says Jani Radebaugh at Brigham Young University in Utah. Once enough particles had clumped together, they would fall out of the sky, coating the moon’s surface like electric snow.


News Article | April 17, 2017
Site: www.chromatographytechniques.com

After nearly 40 years of searching, Johns Hopkins researchers report they have identified a part of the human genome that appears to block an RNA responsible for keeping only a single X chromosome active when new female embryos are formed, effectively allowing for the generally lethal activation of more than one X chromosome during development. Because so-called X-inactivation is essential for normal female embryo development in humans and other mammals, and two activated X chromosomes create an inherently fatal condition, the research may help explain the worldwide human sex ratio that has slightly favored males over females for as long as science has been able to measure it. The results appear online in the April 12 issue of the journal PLOS ONE. In each cell, most humans have 23 pairs of chromosomes, for a total of 46. Twenty-two of these pairs are so-called autosomes and are the same in both males and females. The 23rd pair is composed of the sex chromosomes, either two X's, in the case of females, or an X and a Y, in the case of males. Sex chromosome researchers have long known that the vast majority of human and other mammalian females have two X chromosomes, while the vast majority of males have a single X and a Y, and only one X chromosome is active in females. Studies done elsewhere identified the mechanism behind the silencing of X chromosomes: a gene called Xist, short for X-inactive specific transcript. Located on the X chromosome itself, Xist produces an RNA that spreads up and down the chromosome during female embryonic development, turning off its genes. However, says Barbara R. Migeon, professor of pediatrics at the Johns Hopkins University School of Medicine and a pioneer in X-inactivation research, she and her colleagues reported nearly four decades ago that in some human embryos with triploidy—a condition in which there are three sets of chromosomes instead of the usual two—two copies of the X chromosome remained active. The most likely explanation for this phenomenon, Migeon reasons, was that a protein that represses the X chromosome silencing activity of Xist was working overtime, allowing more than one X chromosome to remain activated. However, she says, the gene responsible for this repressor, or even its approximate location in the human genome, has been unclear. To identify the likely location of the repressor protein and the gene that codes for it, the researchers started by looking at cells from human embryos with different forms of chromosomal trisomy, a condition in which cells carry three copies of a particular chromosome instead of two. For example, Down syndrome in humans is marked by a trisomy of chromosome 21. Because having two active X chromosomes is lethal very early in development—before a new embryo even implants into the uterine wall—Migeon and her colleagues focused on autosomal trisomies. The research team reported finding examples of trisomies of every chromosome in embryos that survived at least until later stages, except chromosomes 1 and 19. "Trisomies of these chromosomes were missing, suggesting that the repressor might be located on one of them," says Migeon. Delving deeper, the researchers turned to two different genetic databases: the Online Mendelian Inheritance in Man, developed and maintained at Johns Hopkins; and the University of California, Santa Cruz, Genome Browser, to look for genes or genomic regions of chromosomes 1 and 19 thought to produce proteins that interact with Xist.
The researchers hunted for genes responsible for adding or subtracting so-called epigenetic marks, which attach to DNA and affect whether a cell can use a given gene. They narrowed their search to a few candidate regions, then turned to a third database, Decipher, which makes it possible to compare human genome variants on tens of thousands of patients with genetic disorders worldwide. On Decipher, the research team looked for genes in the "candidate" regions that showed skewed sex ratios linked to the number of DNA duplications and deletions they could count. The team reasoned that if the repressor was in a region that was duplicated, it would work overtime and turn off Xist on both X chromosomes, leaving both the X chromosomes active and selectively changing the survival of male vs. female embryos. Only one section of the human genome fit the bill with these criteria—a stretch of DNA on the short arm of chromosome 19. "We now believe the repressor gene must be located there," Migeon says, "because we've eliminated all the other possibilities." She explains that a gene or gene cluster in this region of the genome, which extends for eight megabases, or 8 million of the 6 billion nucleotides that make up all DNA on the human genome, could hold the key to understanding why the worldwide ratio of males to females is skewed at 1.05-1.06-to-1. "Any genetic glitch that causes a trisomy or partial trisomy of that specific region on chromosome 19 would effectively eliminate a resulting female embryo," Migeon suspects, although it's impossible to know how often such genetic mistakes occur. Virtually all government-funded experiments on human embryos are prohibited by law in the United States, but some are allowed in some European countries, Migeon notes. By eliminating or adding in extra copies of genes in the candidate region that she and her team identified, she says, other researchers might be able to eventually identify the specific gene or genes that encode Xist's repressor. Migeon's career, spanning nearly six decades at Johns Hopkins, has centered on the X chromosome and how the doses of proteins generated by the genes on this chromosome equilibrate between the sexes. Along with her husband, retired Johns Hopkins pediatric endocrinologist Claude Migeon, the pair has made numerous discoveries related to sex chromosomes and sex differentiation.
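The screening logic in that last step can be illustrated with a toy calculation. The sketch below, using invented counts rather than actual Decipher data, flags a candidate region when people carrying a duplication of it skew heavily male, which is what the team predicted would happen if a duplicated Xist repressor were eliminating female embryos; the region names, counts and threshold are all hypothetical.

# Hypothetical counts of individuals in a Decipher-like database who carry a
# duplication spanning each candidate region. All numbers are invented for illustration.
duplication_carriers = {
    "chr1 candidate region":   {"male": 11, "female": 10},
    "chr19p candidate region": {"male": 14, "female": 2},
}

# Reasoning from the article: if the region holding the Xist repressor is duplicated,
# both X chromosomes stay active and female embryos are lost, so surviving carriers
# of that duplication should be mostly male.
for region, counts in duplication_carriers.items():
    ratio = counts["male"] / max(counts["female"], 1)   # male-to-female ratio among carriers
    verdict = "possible repressor location" if ratio > 3 else "no strong skew"
    print(f"{region}: male-to-female ratio {ratio:.1f} -> {verdict}")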


News Article | April 17, 2017
Site: cen.acs.org

Right-handed amino acids—found in small doses in nature—are largely a mystery. In humans, most linger at low concentrations yet play unknown roles in the body. A new survey of right- and left-handed amino acids in mouse brains thickens the plot. The study’s findings imply that the brain tightly regulates the levels of right-handed amino acids and that undiscovered enzymes might flip lefties to righties (ACS Chem. Neurosci. 2017, DOI: 10.1021/acschemneuro.6b00398). Ubiquitous left-handed L-amino acids serve as the building blocks of all proteins. The roles of their right-handed counterparts, D-amino acids, are still not fully understood. In 2000, scientists figured out that one of them, D-serine, is a neurotransmitter. But, to find the functions of others, scientists must conduct baseline surveys that determine normal levels of L- and D-amino acids in various parts of the body, says Daniel W. Armstrong of the University of Texas, Arlington. Armstrong’s group collaborated with Adam L. Hartman’s laboratory at Johns Hopkins University to measure amounts of L- and D-amino acids in the cortices and hippocampi of mice. The collaborators separated blood from brain tissue and then purified the amino acids in the samples. They measured them by separating the L and D varieties with chiral high-performance liquid chromatography. “Many curious things turned up,” Armstrong says. Levels of most of the 12 D-amino acids measured were 10 to 2,000 times as high in the brain as in the blood. The concentration of the neurotransmitter D-serine was among the highest, but D-aspartate and D-glutamine were even higher. Such quantities suggest many of the D-amino acids play an active role in the brain, Armstrong says. Particularly striking was the conspicuous absence of D-glutamate anywhere in the brain or blood. L-Glutamate is the most abundant amino acid in the brain—it is also a neurotransmitter—so Armstrong expected to see at least some D-glutamate. Yet his team couldn’t detect it at all with a detection threshold of 0.05 ng per mg of tissue. The unexpected absence, Armstrong says, implies that the body keeps D-glutamate low for a physiological reason. It also implies that the brain has a mechanism for efficiently removing D-glutamate or keeping its levels very low. Armstrong surmises that undiscovered enzymes may be at work. A known enzyme converts L- and D-glutamate to L- and D-glutamine, so perhaps a stereoselective enzyme takes L-glutamine back the other way but doesn’t take D-glutamine, making D-glutamine a sink for D-glutamate, he says. High brain levels of D-glutamine support that hypothesis, Armstrong says. Herman Wolosker of Technion–Israel Institute of Technology notes that for some of the less abundant amino acids, relatively high fractions exist in the D form. Isoleucine, for instance, is 24% right-handed in the hippocampus. “It points to the possibility of additional enzymes in the brain that transform L- into D-amino acids,” he says.
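The percentages quoted here are simple enantiomeric fractions, the amount of the D form divided by the total of D plus L. A minimal Python sketch of that bookkeeping is shown below; the peak areas are made-up numbers chosen only to illustrate the calculation (the isoleucine entry is set so it reproduces the 24% figure mentioned above), not the study's chromatographic data.

# Hypothetical chiral-HPLC peak areas (arbitrary units) for a single tissue sample.
# These numbers are illustrative only; they are not the measured values from the study.
peak_areas = {
    "isoleucine": {"D": 24.0, "L": 76.0},
    "serine":     {"D": 12.5, "L": 87.5},
    "glutamate":  {"D": 0.0,  "L": 100.0},   # D-glutamate was below the detection limit
}

def percent_d(d_area, l_area):
    """Fraction of the D enantiomer as a percentage of total (D + L)."""
    total = d_area + l_area
    return 0.0 if total == 0 else 100.0 * d_area / total

for amino_acid, areas in peak_areas.items():
    print(f"{amino_acid}: {percent_d(areas['D'], areas['L']):.1f}% D-form")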


News Article | April 17, 2017
Site: www.prweb.com

Dr. Jai Joshi is an experienced, board-certified medical oncologist. Dr. Joshi received his medical degree and completed his internal medicine residency at the Christian Medical College in Ludhiana, India. He trained at some of the best institutions in the United States, completing a fellowship in hematology/oncology at the National Cancer Institute (NIH) and the University of Colorado Medical Center, then serving as assistant professor of medicine and oncology at the Johns Hopkins School of Medicine and at MD Anderson Cancer Center. Dr. Joshi was awarded the prestigious R01 Grant Award from the NCI, NIH to study infections in patients with acute leukemia. He has authored book chapters and editorials and has published extensively in peer-reviewed journals, making an impact in the field of cancer. He has been awarded Excellence in Teaching Awards at Hopkins, including one from the Department of Medicine, the Johns Hopkins University/Sinai Hospital of Baltimore, for enthusiastic support of and dedication to resident education, including the compilation of the “Joshi Handbooks in Medicine”. He is the subject of biographical records in Who’s Who in Frontier Science and Technology, International Who’s Who of Contemporary Achievement and Men of Achievement. His accomplishments appear in the International Biographical Center Register of Profiles of Personalities of America. Since then, Dr. Joshi has helped initiate new cancer programs and academic oncology practices in various small towns and hospitals (Jasper, IN; Laredo and Eagle Pass, TX), with citywide hospital-based tumor boards, and has been actively engaged in community seminars and radio and television talk shows. Dr. Joshi’s patient philosophy is that interactions between a good doctor and his patients will always bring forth the physician’s humanity in amazing light.


News Article | April 17, 2017
Site: www.prweb.com

A noted economist will visit Hood College on April 5 to discuss the economic implications of the new administration’s policies. Anirban Basu, J.D., will give his talk, “Markets, He Wrote: Looking for Clues into the Economy’s Direction,” at 5:30 p.m. in Hodson Auditorium in Rosenstock Hall. His presentation will provide detailed discussions of global, national and regional economies using the most up-to-date data available. He will give special attention to critical elements of economic life, including the performance of financial, labor, and real estate markets. Basu is chairman and CEO of Sage Policy Group, Inc., an economic and policy consulting firm headquartered in Baltimore, Maryland, with offices in Pennsylvania and Indonesia. The firm provides strategic analytical services to energy suppliers, law firms, medical systems, government agencies and real estate developers, among others. Basu is chair of the Maryland Economic Development Commission and the Baltimore County Economic Advisory Committee. He is also the chief economist to Associated Builders and Contractors and chief economic adviser to the Construction Financial Management Association. He lectures on global strategy at Johns Hopkins University. In both 2007 and 2016, the Daily Record newspaper selected him as one of Maryland’s 50 most influential people. The Baltimore Business Journal named him one of the region’s 20 most powerful business leaders in 2010. He earned his bachelor’s degree in foreign service at Georgetown University, a master’s in public policy from Harvard University’s John F. Kennedy School of Government, a master’s in economics from the University of Maryland, College Park, and a juris doctor from the University of Maryland School of Law. Basu’s lecture is the latest installment of the La Fleur Management Lecture Series, sponsored by Bruce La Fleur, a Hood College MBA alumnus. This event is sponsored by the Hood Department of Economics and Business Administration. For more information, please contact Anita Jose at ajose(at)hood.edu or 301-696-3691.


News Article | March 28, 2017
Site: www.techtimes.com

Australian scientists are recruiting budding and amateur stargazers in the search for the elusive ninth planet believed to orbit at the far edge of the solar system. Astronomers from the Australian National University recently released thousands of images for the public to help pinpoint the location of Planet Nine, which is speculated to be located beyond Neptune and Pluto. The thousands of images were captured by the SkyMapper telescope at the university’s observatory in New South Wales. The robotic telescope has been producing a digital map of the southern sky, prompting researchers to share the output with anyone interested in discovering the theorized planet. “[B]ecause it’s produced hundreds of thousands of images we’re inviting the public, everyone, to access our images and try and find this planet,” said ANU astronomer Dr. Brad Tucker in an ABC News report. “Planet Nine” is merely a working title, and stargazers have been promised a chance to name it if they spot it on the website showcasing the digital images. Rules set by the International Astronomical Union, however, will guide the naming. A similar public search of the northern sky, dubbed Backyard Worlds, was launched by NASA last month. “If this planet exists, it’s already in one of our thousands and thousands of images,” Tucker told the BBC, explaining that using the website is much like “spot the difference.” When a user clicks on an object in the images, the site runs calculations to determine whether it lies on an orbit fitting the planet’s proposed position and characteristics. The site will then transmit the information to the scientists, who will track the answers with their telescopes from around the world. The team is expecting the project will last a few months. “But the bulk of it we hope to plough through really quick,” Tucker added. Calculations from January 2016 suggest Planet Nine may be orbiting the sun, despite the fact that it is yet to be eyeballed by scientists. It has been projected to be about 10 times the size of Earth and 800 times more distant from the sun. According to Tucker, experts concluded that the planet existed after a study of Pluto’s orbit, which could have been affected by another planet’s gravity. Neptune was predicted in exactly the same way, he noted. Recent findings from New Mexico State University researchers showed that Planet Nine could actually be a “rogue planet,” a free-moving object that was not bound to a specific star in the past and eventually got snatched into our solar system by the gravitational pull of the sun. The solar system currently has eight recognized planets, after Pluto was demoted in 2006. But science is still all agog with the prospect of finding so many more, with a group proposing a new way to classify planets and potentially bringing the count to over 100. Johns Hopkins University’s Kirby Runyon and colleagues, defining a planet as “a sub-stellar mass body that has never undergone nuclear fusion,” proposed that factors defining a celestial object’s planetary qualifications should depend on the body itself, not just things such as location. And based on this proposed definition, Jupiter’s moon Europa and our own moon would be classifiable as planets. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.
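The article does not describe the site's screening math, but one crude consistency check conveys the idea: a body roughly 800 times farther from the sun than Earth barely moves on its own, so most of its apparent motion between images comes from Earth's orbital motion, amounting to only a few arcseconds per day. The Python sketch below estimates that rate and compares it with a hypothetical candidate; the formula is a back-of-envelope approximation for an object near opposition, and the candidate numbers are invented, so this is an illustration rather than the project's actual calculation.

AU_KM = 1.496e8               # kilometres in one astronomical unit
EARTH_ORBITAL_SPEED = 29.8    # Earth's mean orbital speed, km/s
ARCSEC_PER_RADIAN = 206265.0
SECONDS_PER_DAY = 86400.0

def reflex_rate_arcsec_per_day(distance_au):
    """Rough apparent motion of a distant, nearly stationary body near opposition,
    driven mainly by Earth's own orbital motion (parallactic reflex)."""
    rate_rad_per_s = EARTH_ORBITAL_SPEED / (distance_au * AU_KM)
    return rate_rad_per_s * SECONDS_PER_DAY * ARCSEC_PER_RADIAN

# The article places the proposed planet about 800 times farther from the sun than Earth.
expected = reflex_rate_arcsec_per_day(800)
print(f"expected apparent motion at ~800 au: about {expected:.1f} arcsec/day")

# Hypothetical candidate: an object that shifted 13.5 arcsec over 3 nights.
measured_rate = 13.5 / 3.0
verdict = "plausible" if 0.5 * expected < measured_rate < 2.0 * expected else "unlikely"
print(f"candidate rate: {measured_rate:.1f} arcsec/day -> {verdict}")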


News Article | April 25, 2017
Site: www.eurekalert.org

A team of computer engineers and neurosurgeons, with an assist from Hollywood special effects experts, reports successful early tests of a novel, lifelike 3D simulator designed to teach surgeons to perform a delicate, minimally invasive brain operation. A report on the simulator that guides trainees through an endoscopic third ventriculostomy (ETV) was published in the Journal of Neurosurgery: Pediatrics on April 25. The procedure uses endoscopes, which are small, computer-guided tubes and instruments, to treat certain forms of hydrocephalus, a condition marked by an excessive accumulation of cerebrospinal fluid and pressure on the brain. ETV is a minimally invasive procedure that short-circuits the fluid back into normal channels in the brain, eliminating the need for implantation of a shunt, a lifelong device with the associated complications of a foreign body. "For surgeons, the ability to practice a procedure is essential for accurate and safe performance of the procedure. Surgical simulation is akin to a golfer taking a practice swing," says Alan R. Cohen, M.D., professor of neurosurgery at the Johns Hopkins University School of Medicine and a senior author of the report. "With surgical simulation, we can practice the operation before performing it live." While cadavers are the traditional choice for such surgical training, Cohen says they are scarce, expensive, nonreusable, and most importantly, unable to precisely simulate the experience of operating on the problem at hand, which Cohen says requires a special type of hand-eye coordination he dubs "Nintendo Neurosurgery." In an effort to create a more reliable, realistic and cost-effective way for surgeons to practice ETV, the research team worked with 3D printing and special effects professionals to create a lifelike, anatomically correct, full-size head and brain with the touch and feel of human skull and brain tissue. The fusion of 3D printing and special effects resulted in a full-scale reproduction of a 14-year-old child's head, modeled after a real patient with hydrocephalus, one of the most common problems seen in the field of pediatric neurosurgery. Special features include an electronic pump to reproduce flowing cerebrospinal fluid and brain pulsations. One version of the simulator is so realistic that it has facial features, hair, eyelashes and eyebrows. To test the model, Cohen and his team randomly paired four neurosurgery fellows and 13 medical residents to perform ETV on either the ultra-realistic simulator or a lower-resolution simulator, which had no hair, lashes or brows. After completing the simulation, fellows and residents each rated the simulator using a five-point scale. On average, both the surgical fellows and the residents rated the simulator more highly (4.88 out of 5) on its effectiveness for ETV training than on its aesthetic features (4.69). The procedures performed by the trainees were also recorded and later watched and graded by two fully trained neurosurgeons in a way that they could not identify who the trainees were or at what stage they were in their training. The neurosurgeons assessed the trainees' performance using criteria such as "flow of operation," "instrument handling" and "time and motion." Neurosurgeons consistently rated the fellows higher than residents on all criteria measured, which accurately reflected their advanced training and knowledge, and demonstrated the simulator's ability to distinguish between novice and expert surgeons. 
Cohen says that further tests are needed to determine whether the simulator will actually improve performance in the operating room. "With this unique assortment of investigators, we were able to develop a high-fidelity simulator for minimally invasive neurosurgery that is realistic, reliable, reusable and cost-effective. The models can be designed to be patient-specific, enabling the surgeon to practice the operation before going into the operating room," says Cohen. Other authors on this paper include Roberta Rehder from the Johns Hopkins School of Medicine, and Peter Weinstock, Sanjay P. Parbhu, Peter W. Forbes and Christopher Roussin from Boston Children's Hospital. Funding for the study was provided by a grant from the Boston Investment Conference. The research team acknowledges the contribution of FracturedFX, an Emmy Award-winning special effects group from Hollywood, California, in the development of the surgical models. The investigators report no financial stake or interests in the success of the simulator.


News Article | March 27, 2017
Site: www.techtimes.com

NASA has chosen an airborne observatory led by the University of Arizona (UA) over eight other proposed missions vying for NASA's Explorer program. With a target launch date of Dec. 15, 2021, the Galactic/Extragalactic ULDB Spectroscopic Terahertz Observatory (GUSTO) mission with its airborne observatory will fly across Antarctica at an elevation between 110,000 and 120,000 feet, or 17 miles above a typical commercial flight's cruising altitude. Basically, the Ultralong-Duration Balloon (ULDB) carries a telescope with carbon, oxygen, and nitrogen emission line detectors mounted on a gondola. With a science payload of almost 2 tons, GUSTO will run on about 1 kilowatt of electrical power produced by solar panels. "NASA has a great history of launching observatories in the Astrophysics Explorers Program with new and unique observational capabilities. GUSTO continues that tradition," Paul Hertz, astrophysics division director in the Science Mission Directorate in Washington, stated. After launching from McMurdo, Antarctica, GUSTO is expected to stay in the air for up to 170 days, depending on weather conditions. The total project cost is approximately $40 million, including expenses for the balloon launch, post-launch operations, and data analysis. GUSTO will measure emissions from the interstellar medium, helping scientists get a clearer picture of the life cycle of interstellar gas in the Milky Way galaxy and the birth and death of star-forming clouds. According to experts, the interstellar medium is the material "from which most of the observable universe is made: stars, planets, rocks, oceans, and all living creatures." According to principal investigator Christopher Walker, a professor of astronomy at the UA's Steward Observatory, understanding the interstellar medium is key to understanding where we came from, "because 4.6 billion years ago, we were interstellar medium." Aside from the Milky Way, GUSTO will also map the Large Magellanic Cloud, which, according to Walker, is a hallmark of a galaxy more commonly found in the early universe. Walker and his team will use cutting-edge superconducting detectors and other instruments that will enable them to listen in at very high frequencies. Walker said that with the measurements from the GUSTO mission, experts can have enough data to develop a model for earlier galaxies and our home galaxy, the Milky Way, which are the two "bookends" of evolution through cosmic time. As a prelude to the GUSTO mission, Walker's team successfully launched a balloon with a smaller telescope — the Stratospheric Terahertz Observatory, or STO — above the South Pole back in December 2016. Johns Hopkins University is reportedly in charge of the GUSTO balloon's gondola. Other participating organizations in the GUSTO mission include NASA's Jet Propulsion Laboratory, Massachusetts Institute of Technology, Arizona State University, and the SRON Netherlands Institute for Space Research. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


News Article | April 24, 2017
Site: www.futurity.org

Patients are far more willing to disclose their sexual orientation to health care providers than those providers believe, new research suggests. More than three-quarters of hospital emergency room doctors and nurses thought patients would refuse to discuss their sexual orientation, while only 10.3 percent of patients said they would balk at the question, the study shows. Closing that disclosure gap, the investigators say, has the potential to improve the care of lesbian, gay, and bisexual patients, a population with historically poorer overall health and less access to health care and insurance than the straight population. About 8 million American adults identify as lesbian, gay, or bisexual, according to government estimates. The research, published in JAMA Internal Medicine, is part of the EQUALITY Study, a collaboration among researchers at the Johns Hopkins University School of Medicine, Brigham and Women’s Hospital, and Harvard Medical School. “Unlike racial/ethnic and age data, information about sexual orientation and gender identity has not been collected routinely in health care settings, which limits the ability of researchers and clinicians to determine the unique needs of the lesbian, gay and bisexual communities,” says Brandyn Lau, assistant professor of surgery at Johns Hopkins and the study’s senior author. “Health care providers haven’t collected these data, at least in part due to fear of offending patients, but this study shows that most patients actually would not be offended,” he says. The study focused on emergency departments, which get more than 130 million patient visits annually in the United States. EDs are the source of nearly half of US inpatient hospital admissions and the primary point of entry for uninsured and underinsured patients. The research team first interviewed 53 adult patients and 26 health care professionals from three community hospitals and two academic medical centers. Although clinicians recognized the importance of knowing a patient’s sexual orientation when medically relevant, most patients believed that sexual orientation was always relevant. Many felt health care professionals needed to know sexual orientation not just to enable them to provide relevant care, but also to help them recognize the lesbian, gay, and bisexual population. “Our patients are telling us,” says lead author Adil Haider, director of the Center for Surgery and Public Health at Brigham and Women’s Hospital, “that routinely asking all patients who come to the ED about this information creates a sense of normalcy toward people of all sexual orientations and signals that each patient is equally welcome here, including the 3 to 10 percent of Americans who identify as lesbian, gay, or bisexual.” The second phase of the research was an online survey of 1,516 potential adult patients (244 lesbian, 289 gay, 179 bisexual, and 804 straight) and 429 ED health care professionals (209 physicians and 220 nurses). Just under 78 percent of the clinicians said patients would refuse to provide their sexual orientation if asked, but only 10.3 percent of patients said they would decline. Specifically, only about 10 percent of straight patients, 4.8 percent of lesbians, 12 percent of gay men, and 16.4 percent of bisexual patients said they would refuse to answer a question in the ED about sexual orientation.
“We need to make collecting sexual orientation information a regular part of our practice, similar to how other demographic information such as age and race is collected,” Lau says. “Because I don’t think providers will start consistently collecting these data on their own, clinics and hospitals need to mandate it.” The Patient-Centered Outcomes Research Institute funded the work. Source: Johns Hopkins University


News Article | April 28, 2017
Site: www.prweb.com

In response to the significant number of Healthcare Associated Infections (HAIs) in hospitals, healthcare and long-term care facilities, two industry marketing veterans launched Design Manage Deliver (DMD), a technology company that delivers easy-to-use, cost-effective communications tools to promote infection prevention in healthcare environments. “HAIs are a major challenge in the industry, because only 40 percent of hospital staff, and a much smaller percentage of visitors, comply with recommended hand hygiene guidelines,” said Natalie Rose-Miller, a 20-year veteran of Sodexo Healthcare and co-founder of DMD. “The numbers are astounding, with over 700,000 infections and 75,000 deaths each year because of HAIs, along with annual costs of up to $147 billion including over $3 billion due to hospital readmissions, which are mostly avoidable.” A technology-based platform, DMD is a portal that provides quick, customized access to messaging and material to increase awareness of the need for hand hygiene among clinical and nonclinical staff, visitors, patients and residents in healthcare communities. DMD co-founder Tom Cancelmo said, “Healthcare and long-term care executives need to think differently and start communicating the importance of hand hygiene in a consistent and engaging manner if the problem is going to be solved.” DMD helps healthcare professionals assess which communication needs exist, and then enables users to create customized material to communicate inside hospitals and at satellite locations where germs and infections often spread, including doctors’ offices, long-term care and rehab facilities, and urgent and surgical care centers. “Germs and infections are entering, exiting, and moving around these facilities all day,” added Rose-Miller. “Consider that the average patient has five visits from three different people each hour; staff might need to clean their hands 100 times in a 12-hour shift. You have to constantly remind them to be effective.” Research shows that 70 percent compliance is needed to improve hand hygiene and reduce infections. A recent study by Johns Hopkins University concluded that a program that includes multimedia communications, education, leadership engagement and performance measurement, along with regular feedback and observation, can increase compliance to over 70 percent, when measured by an independent audit. Most importantly, it will help sustain increased compliance levels. “There is clearly an industry need, and Design Manage Deliver can save healthcare professionals time and money, while increasing brand recognition around an important topic,” concluded Cancelmo. “You are communicating potentially life-saving messages to staff, patients and visitors; that’s the power of DMD’s portal.” Design Manage Deliver’s portal is an easy-to-use, cost-effective communications tool for promoting infection prevention throughout your healthcare community. Through our technology-based platform, you can quickly access fresh, sustainable messaging that heightens awareness of hand hygiene and engages every audience: your clinical and nonclinical staff, visitors, patients and residents. You’ll be able to customize and efficiently execute powerful communications at the click of a mouse.


News Article | April 27, 2017
Site: www.rdmag.com

Working with human brain tissue samples and genetically engineered mice, Johns Hopkins Medicine researchers together with colleagues at the National Institutes of Health, the University of California San Diego Shiley-Marcos Alzheimer's Disease Research Center, Columbia University, and the Institute for Basic Research in Staten Island say that consequences of low levels of the protein NPTX2 in the brains of people with Alzheimer’s disease (AD) may change the pattern of neural activity in ways that lead to the learning and memory loss that are hallmarks of the disease. This discovery, described online in the April 25 edition of eLife, will lead to important research and may one day help experts develop new and better therapies for Alzheimer’s and other forms of cognitive decline. AD currently affects more than five million Americans. Clumps of proteins called amyloid plaques, long seen in the brains of people with AD, are often blamed for the mental decline associated with the disease. But autopsies and brain imaging studies reveal that people can have high levels of amyloid without displaying symptoms of AD, calling into question a direct link between amyloid and dementia. This new study shows that when the protein NPTX2 is “turned down” at the same time that amyloid is accumulating in the brain, circuit adaptations that are essential for neurons to “speak in unison” are disrupted, resulting in a failure of memory. “These findings represent something extraordinarily interesting about how cognition fails in human Alzheimer’s disease,” says Paul Worley, M.D., a neuroscientist at the Johns Hopkins University School of Medicine and the paper’s senior author. “The key point here is that it’s the combination of amyloid and low NPTX2 that leads to cognitive failure.” Since the 1990s, Worley’s group has been studying a set of genes known as “immediate early genes,” so called because they’re activated almost instantly in brain cells when people and other animals have an experience that results in a new memory. The gene NPTX2 is one of these immediate early genes that gets activated and makes a protein that neurons use to strengthen “circuits” in the brain. “Those connections are essential for the brain to establish synchronized groups of ‘circuits’ in response to experiences,” says Worley. “Without them, neuronal activation cannot be effectively synchronized and the brain cannot process information.” Worley says he was intrigued by previous studies indicating altered patterns of activity in brains of individuals with Alzheimer’s. Worley’s group wondered whether altered activity was linked to changes in immediate early gene function. To get answers, the researchers first turned to a library of 144 archived human brain tissue samples to measure levels of the protein encoded by the NPTX2 gene. NPTX2 protein levels, they discovered, were reduced by as much as 90 percent in brain samples from people with AD compared with age-matched brain samples without AD. By contrast, people with amyloid plaques who had never shown signs of AD had normal levels of NPTX2. This was an initial suggestion of a link between NPTX2 and cognition. Prior studies had shown NPTX2 to play an essential role for developmental brain wiring and for resistance to experimental epilepsy. To study how lower-than-normal levels of NPTX2 might be related to the cognitive dysfunction of AD, Worley and his collaborators examined mice bred without the rodent equivalent of the NPTX2 gene. 
Tests showed that a lack of NPTX2 alone wasn’t enough to affect cell function as tested in brain slices. But then the researchers added to the mice a gene that increases amyloid generation in the brain. In brain slices from mice with both amyloid and no NPTX2, fast-spiking interneurons could not control brain “rhythms” important for making new memories. Moreover, a glutamate receptor that is normally expressed in interneurons and essential for interneuron function was down-regulated as a consequence of amyloid and NPTX2 deletion in mice and similarly reduced in human AD brain. Worley says the results suggest that the increased activity seen in the brains of AD patients is due to low NPTX2, combined with amyloid plaques, with consequent disruption of interneuron function. And if the effect of NPTX2 and amyloid is synergistic — one depending on the other for the effect — it would explain why not all people with high levels of brain amyloid show signs of AD. The team then examined NPTX2 protein in the cerebrospinal fluid (CSF) of 60 living AD patients and 72 people without AD. Lower scores of memory and cognition on standard AD tests, they found, were associated with lower levels of NPTX2 in the CSF. Moreover, NPTX2 correlated with measures of the size of the hippocampus, a brain region essential for memory that shrinks in AD. In this patient population, NPTX2 levels were more closely correlated with cognitive performance than current best biomarkers — including tau, a biomarker of neurodegenerative diseases, and a biomarker known as A-beta-42, which has long been associated with AD. Overall, NPTX2 levels in the CSF of AD patients were 36 to 70 percent lower than in people without AD. “Perhaps the most important aspect of the discovery is that NPTX2 reduction appears to be independent of the mechanism that generates amyloid plaques. This means that NPTX2 represents a new mechanism, which is strongly founded in basic science research, and that has not previously been studied in animal models or in the context of human disease. This creates many new opportunities,” says Worley. “One immediate application may be to determine whether measures of NPTX2 can be helpful as a way of sorting patients and identifying a subset that are most responsive to emerging therapies.” For instance, drugs that disrupt amyloid may be more effective in patients with relatively high NPTX2. His group is now providing reagents to companies to assess development of a commercial test that measures NPTX2 levels. More work is needed, Worley adds, to understand why NPTX2 levels become low in AD and how that process could be prevented or slowed. In addition to Paul Worley, the study’s authors are Meifang Xiao, Desheng Xu, Chun-Che Chien, Yang Shi, Juhong Zhang, Olga Pletnikova, Alena Savonenko, Roger Reeves, and Juan Troncoso of Johns Hopkins University School of Medicine; Michael Craig of University of Exeter; Kenneth Pelkey and Chris McBain of the National Institute of Child Health and Human Development; Susan Resnick of the National Institute on Aging’s Intramural Research Program; David Salmon, James Brewer, Steven Edland, and Douglas Galasko of the Shiley-Marcos Alzheimer's Disease Research Center at the University of California San Diego Medical Center; Jerzy Wegiel of the Institute for Basic Research in Staten Island; and Benjamin Tycko of Columbia University Medical Center.
Funding for the studies described in the eLife article was provided by the National Institutes of Health under grant numbers MH100024, R35 NS-097966, P50 AG005146, and AG05131, Alzheimer’s Disease Discovery Foundation and Lumind.


News Article | April 2, 2017
Site: www.techtimes.com

The world's strongest coffee has just been named, and it has dangerously high levels of caffeine in it. Caffeine is an organic substance that can naturally be found in as many as 60 plant sources — including coffee beans, tea leaves, cacao pods, and kola nuts. It can also be present in various prescription and over-the-counter drugs, such as allergy medications and pain relievers. It's also a common additive in most fat-loss supplements. Both the U.S. Food and Drug Administration and the International Food Information Council recommend limiting caffeine intake to 400 milligrams daily. Caffeine works as a natural stimulant by giving the central nervous system a kick. It blocks receptors in the brain for adenosine, a neurotransmitter that relaxes the brain and makes one feel tired. Caffeine also amps up blood adrenaline levels and increases brain activity of the neurotransmitters dopamine and norepinephrine. In effect, caffeine is lauded for its incredible ability to keep a person awake all night, sharpen focus, improve concentration, and keep energy levels up. The stimulating effects of caffeine can start as early as 15 minutes after consumption and last up to 6 hours, based on an article by the University of Michigan Health Services. But aside from that, mounting research also shows that caffeine, especially in a hot cup of coffee, can bring amazing health benefits. But of course, as with all things, too much of something is bad. How much is too much? For the FDA, 600 milligrams, which is roughly four to seven cups of coffee, is considered too much. In excess, the common side effects of caffeine may include migraine, insomnia, irritability, stomach problems, and palpitations. Too much caffeine may also lead to sleep deprivation and eventually result in mood disorders and anxiety-related feelings, such as extreme nervousness, sweating, and tremors. In children, experts believe caffeine may negatively impact a developing brain. "Notably, caffeine interferes with sleep, and sleep plays a critical role in learning. Some laboratory research suggests that caffeine interferes with sleep and learning among adolescent rodents, which, in turn, hinders normal neurological development that is noticeable into adulthood," Steven E. Meredith, post-doctoral research fellow at the Johns Hopkins University School of Medicine, told Medical News Today. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


News Article | April 30, 2017
Site: www.npr.org

Being A Guinea Pig For Science Can Be A Long, Slow Slog "Why am I doing this, again?" I've asked myself that question several mornings over the past few months as my stomach begins growling, usually after I smell popcorn in my coworker's office. He's on a strict 10 a.m. popcorn schedule that coincides with my strict 10 a.m. hunger pang schedule. I am following an intermittent fasting program as part of a clinical trial for people with multiple sclerosis. For the past five months, I have tried to eat only between noon and 8 p.m., and am allowed only water, tea or coffee during the remaining 16 hours. It's part of a study at Johns Hopkins Medicine in which researchers are looking at bacteria in the guts of patients with multiple sclerosis to determine whether intermittent fasting changes the number, the types, or the functions of our bacteria. They're also looking to see if any of those changes affect inflammation and the symptoms we experience. Scientists know that fasting can affect the microbiome, according to Dr. Ellen Mowry, associate professor of neurology and epidemiology at the Johns Hopkins University and lead researcher for the study. But they don't yet know how. Right now, researchers are simply trying to determine which dietary changes affect the microbiome and what the effects are. It's a complicated interplay of microbes, the human body, the environment and genetics. The science is hard, yes, but so is the intermittent fasting! Mowry said in an email that when she tries to fast along with participants, she has found it difficult to maintain. "But often for me," she says, "this is related more to my mental stamina rather than physical." I agree. Over five months, it's been the same nearly every day — I do get a little hungry in the mornings, but I'm thinking about eating more often. I have only eaten earlier than noon once or twice during the study, like the infamous O'Hare Airport incident when I just couldn't resist that bagel. I still have no regrets. My slip-ups tend to be when I'm running late and eat after 8 p.m. I don't think I've screwed up enough to affect the tests, and I've been honest when it comes to food logging. In a previous fasting study looking at calorie restriction, researchers had more to rely on than participants' word that they were following the diet. "We can predict the amount of weight people should lose in a given time period, so if they aren't doing it," Mowry says, "we can guess adherence isn't accurate." But for this study, it's not so easy. The biggest challenge is sample size — 54 people are enrolled in the study I'm doing. Some are fasting, some are on a restricted calorie diet, and some are in the control group, doing nothing different. "The studies are too small to be certain that any change in symptoms is related to the intervention," Mowry says. She wants to do larger studies, but finding funding for dietary studies is difficult. Also, self-reporting isn't always the most accurate way to get data. Mowry says ideally participants would log meals during food recalls every 24 hours with trained professionals. In other words, I would go in on a Tuesday and tell someone what I ate and drank on Monday. I would explain how food was prepared, how much I ate and drank, and so on. This isn't easy to do because, again, money. There's just not enough to hire the professionals needed. Instead of food recalls, we intermittent fasters text pictures twice a week of what we ate and drank that day.
As I entered the final few months of the study, I found myself forgetting more and more to take the pictures before I eat. I have sent more than one photo of a mostly-empty plate with the note, "Sorry! I forgot to snap a picture." I was entering a period of fasting fatigue. Obviously, I needed to get pumped again. So I set up a Google Alert for "microbiome." I cozied up with Ed Yong's book I Contain Multitudes, about the human microbiome, before bed every night — I highly recommend it. I got sucked into thousands of journal articles, skimming everything from "Role of the Gut Microbiome in Obesity and Diabetes Mellitus" in Nutrition in Clinical Practice to "HOW RESEARCH INTO THE MICROBIOME CAN BE USED TO SOLVE CRIMES" in the Southern California Interdisciplinary Law Journal. (I have no idea why the title was in all caps, but I took that as a sign I should read it.) If I decide to continue with intermittent fasting once the study is over, new motivation may come in the form of results from a previous study that Mowry will present this fall. The results should show if calorie restriction or a more extreme form of intermittent fasting called 5:2 fasting affect metabolism and how they affect the microbiome. The thing is, I'm not sure if I want to continue. In the beginning, my boyfriend, who is particularly observant and good at catching subtle changes that I may not notice, said I seemed to have a bit more energy and was not complaining (my word, not his!) about pain as often. And fasting has been a good way to maintain my weight. But a few months ago, my MS symptoms seemed to be worsening. This could be for a couple of reasons, but the big one is likely stress. Finishing my thesis, working, freelancing, gymming ... things reached a frenzied pace, and my pain levels skyrocketed. So did intermittent fasting help? A little, maybe? Not at all? The jury is still out, and it will likely end in deadlock. There are too many factors for me to consider. In about a month I'll return to the doctor with a stool sample and leave with permission to eat breakfast again. If I decide to keep up the fasting, at least I won't have the pressure of worrying that my slip-ups might compromise the work that Dr. Mowry, Research Study Coordinator Sam Roman and others have put into trying to help me and the estimated 2.5 million worldwide who have multiple sclerosis. And if I feel a pang of guilt if I want cream in my morning coffee — or heck, maybe I want breakfast! — I'll just remember what Mowry told me: "It can't be a totally inflexible diet plan; otherwise, it definitely won't be sustainable." Brandie Michelle Jefferson is a communications manager and freelance reporter who loves a good science story. She's on Twitter, too: @b_m_jefferson.


In the future, you may be able to send a text to a friend without taking your hands off what you're doing. How? By typing directly with your brain. At least, that's how Facebook envisions the technology it is developing. Facebook revealed its ongoing research on a "silent speech system" during the second day of the Facebook F8 developer conference. This is in line with the company's move toward the development of virtual reality and augmented reality. Not surprisingly, Facebook has also unveiled its concept products for AR and VR use. "So what if you could type directly from your brain?" This is the thought-provoking question posed by Regina Dugan, vice president of engineering and head of Building 8, the research and development department of Facebook, during her keynote speech at the F8 developer conference. Dugan revealed that Facebook has 60 people working on a technology that will tap into people's brains, decode the signals related to speech, then type these words instantly. This technology, dubbed the "silent speech system," is designed to type up to 100 words per minute. For comparison, the average typing speed is 40 words per minute. And all this without any invasive surgical procedure. Dugan, a former director of the Defense Advanced Research Projects Agency, and her team are neck-deep in work researching the use of optical imaging (via lasers capturing changes in the speech-associated neurons) to decode brain signals and translate them into words, which are then transmitted to other people. Think of texting via telepathy. For this project, Facebook is collaborating with various academic and scientific institutions such as UC San Francisco, Johns Hopkins Medicine, UC Berkeley, Washington University School of Medicine, and Johns Hopkins University's Applied Physics Laboratory. This idea isn't exactly new. Stanford University researchers have been working on brain-to-computer technology that enables paralyzed people to type using electrodes implanted in their brains. This can help people with ALS and spinal cord injuries. This effort by Facebook is another one in the line of Silicon Valley companies trying to push the envelope when it comes to technology and the human body. Tesla CEO Elon Musk recently announced that he founded a company that wants to fuse computers with the human brain. Google also has a science division called Verily that is working on high-tech contact lenses. This technology may sound amazing, but it can also be scary. After all, people will let a company that makes its money from mining user data get a free pass into their minds, the very sanctuary of their private thoughts. However, Facebook was quick to assuage this privacy concern; the company likened it to the kind of information we share online. "This isn't about decoding your random thoughts. Think of it like this: You take many photos and choose to share only some of them. Similarly, you have many thoughts and choose to share only some of them," wrote Facebook in its F8 blog. Dugan also described the technology as a mode of communication "with the convenience of voice and the privacy of text." This brain-typing technology could eventually make conventional interfaces obsolete, since there won't be a need for them; the brain will pretty much do the typing. Thus, this tech could be the foundation of Facebook's move toward a future filled with augmented and virtual reality. The "brain mouse," as Dugan called it, could control text input in this AR-led future.
So if you can send a message via your brain, how can your recipient receive it? Via the brain, too? According to Facebook, through your skin. Dugan explained that we have 2 square meters (21.5 square feet) of skin in the body "packed with sensors" that are wired to the brain via nerves. Referring to the intuitive function of the Braille system for the blind, Facebook wants to translate electronic messages into signals that can be transmitted through the skin, then translated by the brain. "Today we demonstrated an artificial cochlea of sorts and the beginnings of a new 'haptic vocabulary,'" Dugan said. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.


A study that surveyed a national sample of emergency department health care providers and adult patients suggests that patients are substantially more willing to disclose their sexual orientation than health care workers believe. In a report on the study, published in JAMA Internal Medicine on April 24, the researchers found that nearly 80 percent of health care professionals believed patients would refuse to provide sexual orientation information, while only 10.3 percent of patients reported they would refuse. Closing that disclosure gap, the investigators say, has the potential to improve the care of lesbian, gay and bisexual patients, a population with historically poorer overall health and less access to health care and insurance compared to the straight population. An estimated 8 million American adults identify as lesbian, gay or bisexual, according to government sources. The new research, part of the EQUALITY Study, is a collaborative effort among researchers at the Johns Hopkins University School of Medicine, Brigham and Women's Hospital and Harvard Medical School. "Unlike racial/ethnic and age data, information about sexual orientation and gender identity has not been collected routinely in health care settings, which limits the ability of researchers and clinicians to determine the unique needs of the lesbian, gay and bisexual communities," says Brandyn Lau, M.P.H., C.P.H., assistant professor of surgery at the Johns Hopkins University School of Medicine and the report's senior author. "Health care providers haven't collected these data, at least in part due to fear of offending patients, but this study shows that most patients actually would not be offended," he says. For the study, which the researchers believe is the first to compare patients' and clinicians' views about routine collection of sexual identity data, the scientists focused on emergency departments (EDs), which see more than 130 million visits annually in the U.S. EDs are the source of nearly half of inpatient hospital admissions in the U.S. and the primary point of entry for uninsured and underinsured patients, making them ideal locations for collecting sexual orientation information. In the study's first phase, the research team recruited 53 adult patients and 26 health care professionals from three community hospitals and two academic medical centers for qualitative interviews that lasted an average of one hour. The interviews took place between August 2014 and January 2015, and one of two researchers had a guided conversation with each participant about sexual orientation collection in the ED, barriers and facilitators to collection, and preferred methods of collection. The interviews revealed that although clinicians recognized the importance of disclosure of sexual orientation when medically relevant, most patients believed that sexual orientation was always relevant. Similarly, many patients stated that sexual orientation was something health care professionals needed to know, not only for the sake of patients' health care, but also for recognizing and normalizing the lesbian, gay and bisexual population. 
"Our patients are telling us that routinely asking all patients who come to the ED about this information creates a sense of normalcy toward people of all sexual orientations and signals that each patient is equally welcome here, including the three to 10 percent of Americans who identify as lesbian, gay or bisexual," says Adil Haider, M.D., M.P.H., Kessler Director of the Center for Surgery and Public Health at Brigham and Women's Hospital and the paper's first author. In the next phase of the study, the research team conducted an online survey using a nationally representative sample from GfK's KnowledgePanel and GfK's Physician Consulting Network. GfK is a marketing research company with survey tools tailored to a variety of interests. Surveys were sent to 1,516 potential adult patients (244 lesbian, 289 gay, 179 bisexual and 804 straight) and 429 ED health care professionals (209 physicians and 220 nurses) between March and April 2015. The 53-question survey for patients and 45-question survey for health care professionals consisted of multiple-choice responses, Likert scale choices (agree/strongly agree, neutral, and disagree/strongly disagree) and open-ended questions. 333 (77.8 percent) of all surveyed clinicians said patients would refuse to provide their sexual orientation if asked, but only 154 patients (10.3 percent) reported they would refuse to do so, indicating a significant discrepancy between what physicians and patients believe, the researchers said. In the population-weighted results, 143 straight patients (10.1 percent of the total), one lesbian (4.8 percent), three gay (12.0 percent) and five bisexual (16.4 percent) patients would refuse to share their sexual orientation in the ED. Bisexual patients were almost twice as likely to refuse to provide their sexual orientation as were straight patients. Both patients and clinicians indicated nonverbal self-report as the preferred method of sexual orientation information collection. "We need to make collecting sexual orientation information a regular part of our practice, similar to how other demographic information such as age and race is collected, and because I don't think providers will start consistently collecting these data on their own, clinics and hospitals need to mandate it," says Lau. As a next step, the research team will test different approaches to data collection, also as part of the EQUALITY Study. Other authors on this paper include Lisa M. Kodadek, Claire Snyder, Laura Vail, Danielle German and Susan Peterson of the Johns Hopkins University; Eric B. Schneider, Rachel R. Adler and Anju Ranjit of Harvard Medical School; Maya Torain of Duke University School of Medicine; Ryan Y. Shields of Yale School of Medicine; and Jeremiah D. Schuur of Brigham & Women's Hospital. This study was funded by PCORI (contract AD-1306-03980). Brandyn Lau is supported by the Institute for Excellence in Education Berkheimer Faculty Education Scholar Grant contract CE-12-11-4489 from the Patient Centered Outcomes Research Institute (PCORI) and grant 1R01HS024547 from the Agency for Healthcare Research and Quality.


News Article | April 17, 2017
Site: cen.acs.org

Right-handed amino acids—found in small doses in nature—are largely a mystery. In humans, most linger at low concentrations yet play unknown roles in the body. A new survey of right- and left-handed amino acids in the mouse brain thickens the plot. The study’s findings imply that the brain tightly regulates the levels of right-handed amino acids and hint at undiscovered enzymes that flip lefties to righties (ACS Chem. Neurosci. 2017, DOI: 10.1021/acschemneuro.6b00398). Ubiquitous left-handed L-amino acids serve as the building blocks of all proteins. However, their mirror images, right-handed D-amino acids, are rarer and more perplexing. Their purpose in humans was unknown until 2000, when scientists figured out that one of them, D-serine, is a neurotransmitter. Many researchers think the presence of other D-amino acids in the brain and body suggests that they must have significant functions, too. To find those functions, though, requires baseline surveys that determine normal levels of D- and L-amino acids in various parts of the body, says Daniel W. Armstrong at the University of Texas, Arlington. Armstrong’s group collaborated with Adam L. Hartman’s laboratory at Johns Hopkins University to measure baseline values for D- and L-amino acids in the cortex and hippocampus of mice, brain areas of interest because of Hartman’s previous work on epilepsy. The collaborators separated blood from brain tissue and then purified and fluorescently tagged the amino acids in the samples. They measured the amino acids by separating the D- and L-varieties with highly sensitive chiral columns and then detecting the fluorescent peaks. “Many curious things turned up,” Armstrong says. Most D-amino acid levels were 10 to 2000 times as high in the brain as in the blood for the 12 amino acids measured. The concentration of the neurotransmitter D-serine, near 0.05 μg per mg brain tissue, was among the highest, but D-aspartate and D-glutamine were even higher. Such quantities suggest neural roles for many of the D-amino acids, Armstrong says. Particularly striking was the conspicuous absence of D-glutamate anywhere in the brain or blood. L-glutamate is the most abundant amino acid in the brain, so Armstrong expected to see at least some D-glutamate. Yet his team couldn’t detect it at all with a detection threshold of 0.00005 μg per mg tissue. The unexpected absence, Armstrong says, implies that the body keeps D-glutamate low for a physiological reason. It also implies that the brain has a mechanism for efficiently removing D-glutamate or keeping it quite low compared with everything else. Armstrong surmises there may be undiscovered enzymes at work. A known enzyme converts L- and D-glutamate to L- and D-glutamine, so perhaps a stereoselective enzyme takes only L-glutamine back the other way, making D-glutamine a sink for D-glutamate, he says. High brain levels of D-glutamine support that theory, Armstrong says. Or, Herman Wolosker of Technion Israel Institute of Technology suggests that the brain may also lack transporters that recognize D-glutamate, so it cannot be imported from the blood into the brain. Other findings from the study raise the specter of ghost enzymes, too: Wolosker points out that for some of the less abundant amino acids, relatively high fractions exist in the D-form. Isoleucine, for instance, is 24% right-handed in the hippocampus. “It points to the possibility of additional enzymes in the brain that transform L- into D-amino acids,” he says.
For the moment, Armstrong and his group are concentrating on pinning down the role of D-amino acid oxidase, an enzyme known to degrade D-amino acids, in particular D-serine. By repeating the baseline study in mice lacking the gene for D-amino acid oxidase, the scientists hope to elucidate what other D-amino acids it regulates. They are also conducting a D-amino acid survey on human blood and are curious to see whether D-glutamate goes missing there too.
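Since the survey reports both mirror-image forms, a statement like "isoleucine is 24% right-handed in the hippocampus" comes from the percent D-form, 100 × D / (D + L). The snippet below is a minimal sketch of that arithmetic; the concentration values are invented placeholders, not measurements from the paper.

def percent_d(d_conc: float, l_conc: float) -> float:
    """Percent of an amino acid present in the D- (right-handed) form."""
    total = d_conc + l_conc
    if total == 0:
        raise ValueError("amino acid not detected in either form")
    return 100.0 * d_conc / total

# Hypothetical concentrations in micrograms per mg of tissue (placeholders only).
samples = {
    "serine":     {"D": 0.050, "L": 0.600},
    "isoleucine": {"D": 0.012, "L": 0.038},  # works out to roughly 24% D-form
    "glutamate":  {"D": 0.000, "L": 9.000},  # D-form below the detection threshold
}

for name, conc in samples.items():
    print(f"{name}: {percent_d(conc['D'], conc['L']):.1f}% D-form")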


News Article | May 1, 2017
Site: www.businesswire.com

BLOOMINGTON, Minn.--(BUSINESS WIRE)--Minnesota Masonic Charities (MMC) today announced the recipients of its 2017 Scholarships Program. As part of its continuing commitment to building a better future for Minnesota, the nonprofit organization provides annual awards to some of the state’s most promising scholars. Since 2008, the organization has provided more than $2 million to fund Minnesota students seeking higher education. By 2018, Minnesota Masonic Charities plans to distribute $1 million annually in merit scholarship awards. “Our scholars reflect the values and character that are important to Masons,” said Eric Neetenbeek, Minnesota Masonic Charities president and CEO. “They demonstrate integrity and dedication – two traits we believe exemplify leadership. We have great faith in the individuals we select for these awards each year.” MMC offers up to 95 scholarship awards annually. The Signature, Legacy, Heritage and Vocational scholarships are made available to high school seniors on an equal opportunity basis, with no discrimination for age, gender, religion, national origin or Masonic affiliation; and an Undergraduate scholarship for up to 20 current college students is also available. All awards range from $1,000 to $5,000 per year, and students may renew their scholarship awards annually, provided they maintain scholastic performance. The complete list of 2017 Masonic Scholars appears below. For more information about the Minnesota Masonic Charities Scholarships Program, please contact Kelly Johns, Director of Communications for MMC, at 952-948-6202 or kelly.johns@mnmasonic.org.
Colton Mowers, Albert Lea (University of Wisconsin, Madison)
Lucas Fleissner, Rochester (Iowa State University)
Seth Cattanach, Lake Elmo (University of Notre Dame)
Katelynne Schatz, Kettle River (College of St. Scholastica)
Rachel Pompa, Hermantown (University of Minnesota, Duluth)
Karli Weisz, Mora (University of North Dakota)
Brock Drevlow, Thief River Falls (Johns Hopkins University)
Jack Hedberg, Roseville (University of Minnesota, Twin Cities)
Sophia Vrba, Maple Grove (University of Minnesota, Twin Cities)
Za Vang, Minneapolis (University of St. Thomas)
Sela Fadness, Austin (Hamline University)
Tess Hatfield, Hill City (University of Wisconsin, Superior)
Isabel Brown, White Bear Lake (University of Minnesota, Twin Cities)
Taylor Schmidt, Duluth (College of St. Scholastica)
Jenifer Weyer, St. Cloud (Winona State University)
Anthony Tran Vu, St. Paul (University of St. Thomas)
Ryan McMahon, Mahtomedi (University of Minnesota, Twin Cities)
Alex Sellner, Fairfax (Gustavus Adolphus College)
Nathan Kuhn, Eagan (Southwest Minnesota State University)
Caroline Sullivan, Fridley (University of Minnesota, Twin Cities)


News Article | May 4, 2017
Site: www.prweb.com

Anesthesiology expert Mohamed Rehman, M.D., joins Johns Hopkins All Children's Hospital as chair of the Department of Anesthesia and will also serve as professor of anesthesiology and critical care (pending academic review) with the Johns Hopkins University School of Medicine. Nationally recognized for his medical and clinical informatics expertise, Dr. Rehman is also establishing the Perioperative Health Informatics Unit within the hospital’s Health Informatics Core, which uses electronic health data to improve care and increase understanding of children's illnesses, while lowering the cost of care. “Dr. Rehman’s expertise will help innovate our anesthesia program and perioperative care by the use of data analytics for real time assessment of patients in the operating room – anticipating problems before they happen via trends,” says Jonathan Ellen, M.D., president and vice dean of Johns Hopkins All Children’s Hospital. Previously, Dr. Rehman was a professor of clinical anesthesiology and critical care and professor of pediatrics at the University of Pennsylvania School of Medicine. He held numerous leadership roles at the Children’s Hospital of Philadelphia (CHOP), including director of transplant anesthesia, and was the anesthesia team leader for the world’s first bilateral hand transplant and several conjoined twin separations. He also developed the first biomedical informatics program within a pediatric anesthesia and critical care program while at CHOP, where he was the first endowed chair in biomedical informatics and entrepreneurial sciences. Dr. Rehman is the author of more than 50 original research publications and review articles and more than 70 scientific abstracts. At the national level, he currently chairs the Biomedical Informatics and Technology Group of the Society for Pediatric Anesthesia and is a past president of the Society for Technology in Anesthesia. About Johns Hopkins All Children’s Hospital Johns Hopkins All Children’s Hospital in St. Petersburg is a leader in children’s health care, combining a legacy of compassionate care focused solely on children since 1926 with the innovation and experience of one of the world’s leading health care systems. The 259-bed teaching hospital, ranked as a U.S. News & World Report Best Children’s Hospital, stands at the forefront of discovery, leading innovative research to cure and prevent childhood diseases while training the next generation of pediatric experts. With a network of Johns Hopkins All Children’s Outpatient Care centers and collaborative care provided by All Children’s Specialty Physicians at regional hospitals, Johns Hopkins All Children’s brings care closer to home. Johns Hopkins All Children’s Hospital consistently keeps the patient and family at the center of care while continuing to expand its mission in treatment, research, education and advocacy. For more information, visit HopkinsAllChildrens.org.


News Article | May 4, 2017
Site: www.futurity.org

Even a relatively mild Zika outbreak in the continental United States could cost more than $183 million in medical bills and productivity losses, and a worse epidemic could come with a price tag of $1.2 billion or even more. Experts estimated the potential impact of epidemics of various sizes in five Southeastern states and Texas, the US locations most populated by Aedes aegypti, the mosquito most likely to carry the disease. “This is a threat that has not gone away. Zika is still spreading silently and we are just now approaching mosquito season in the United States, which has the potential of significantly increasing the spread,” says study leader Bruce Y. Lee, an associate professor of international health at Johns Hopkins University’s Bloomberg School of Public Health. “There’s still a lot we don’t know about the virus but it is becoming clear that more resources will be needed to protect public health. Understanding what a Zika epidemic might look like, however, can really help us with planning and policy-making as we prepare.” While many infected by the Zika virus suffer mild symptoms, if any, a Zika infection during a woman’s pregnancy can cause severe birth defects such as microcephaly or other brain problems. In regions affected by Zika, there have also been increased reports of Guillain-Barré syndrome, a rare illness of the nervous system. There is no treatment nor is there a vaccine to prevent Zika. Policymakers need estimates of Zika costs to help guide funding decisions, researchers say. It is unclear how many people in the United States have already been infected and how many more cases will occur this summer, but the findings, published in the journal PLOS Neglected Tropical Diseases, are further evidence that the costs of any Zika outbreak would be high. For the study, researchers developed and ran a computational model estimating the impact of different rates of spread if Zika were to hit Florida, Georgia, Alabama, Mississippi, Louisiana, and Texas. The model considered health care costs—such as visits to the doctor, laboratory tests, and the lifetime cost of caring for a child born with microcephaly—as well as productivity losses. Even assuming an attack rate—the percentage of the population eventually infected—of only 0.01 percent, the model estimates that Zika would cost more than $183 million and cause more than 7,000 infections, two cases of microcephaly, and four cases of Guillain-Barré. An attack rate of 1 percent would cause more than 704,000 infections, 200 cases of microcephaly, and 423 cases of Guillain-Barré. The 1-percent attack rate could result in $1.2 billion in medical costs and productivity losses to the economy. A 10-percent attack rate could cost more than $10.3 billion. These attack rates would be far lower than those seen in French Polynesia (66 percent), on Yap Island in Micronesia (73 percent), and in the state of Bahia in Brazil (32 percent), where the current Zika outbreak in the Americas and the Caribbean is believed to have originated. They are also lower than recent outbreaks of chikungunya, a virus spread the same way as Zika, including one in Puerto Rico (23.5 percent). After much delay last year, Congress allocated $1.1 billion for mosquito control efforts and vaccine development, as well as for emergency health care for Puerto Rico, where more than 35,000 people contracted the virus. But, Lee believes far more money may be necessary, given his estimates for medical care. 
“Without details regarding the Zika-prevention measures that would be implemented and how effective these may be, it is unclear what percentage of these costs may be averted,” Lee says. “But our model shows it is very likely that preventing an epidemic—or at least finding ways to slow one down—would save money, especially since epidemics like Zika have hidden costs that aren’t always considered.” Other researchers from Johns Hopkins and from Yale University and the National School of Tropical Medicine at Baylor College of Medicine are coauthors of the study. National Institutes of Health, the Agency for Healthcare Research and Quality, and the US Agency for International Development funded the work.


News Article | April 27, 2017
Site: www.eurekalert.org

Washington, DC (April 27, 2017) -- Researchers have developed a risk calculator to provide personalized risk estimates of developing kidney failure after donation. The findings, which appear in an upcoming issue of the Journal of the American Society of Nephrology (JASN), may be useful for individuals considering donation, for living donors wishing to understand their long-term risk, and for clinicians who monitor the long-term health of living donors. Research suggests that there are minimal health consequences for individuals who selflessly donate a kidney, but only a few comprehensive studies have looked at this issue. Also, although long-term studies of living kidney donors have reported low rates of premature death and kidney failure, personalized estimates based on donor characteristics have not previously been available. To help provide accurate estimates of long-term risks, a team led by Dorry Segev, MD, PhD, of the Johns Hopkins University School of Medicine and the Johns Hopkins School of Public Health, studied information on 133,824 living kidney donors from 1987 to 2015, as reported to the Organ Procurement and Transplantation Network. Overall risk was quite low: the investigators predicted that the median risk of kidney failure was only 1 case per 10,000 donors at 5 years after donation and only 34 per 10,000 donors at 20 years after donation. Nevertheless, black race and male sex were associated with 3.0- and 3.9-times increased risks of developing kidney failure, respectively. Among nonblack donors, older age was linked with greater risk, but this was not seen in black donors. Higher body mass index was also associated with an increased risk of kidney failure. The findings suggest that greater permissiveness may be warranted in older black candidate donors, and that young black candidates should be evaluated carefully. "Because living kidney donors voluntarily undergo surgery for no direct medical benefit to themselves, it is incumbent upon the transplant community to provide them with accurate estimates of long-term risk," said Dr. Segev. "Our risk prediction model may be helpful to individuals considering donation, and to living donors and their care providers as they plan long-term follow-up care and health maintenance," added lead author Allan Massie, PhD, MHS. Disclosures: The authors reported no financial disclosures. The article, entitled "Quantifying post-donation risk of ESRD in living kidney donors," will appear online at http://jasn. on April 27, 2017, doi: 10.1681/ASN. 2016101084. The content of this article does not reflect the views or opinions of The American Society of Nephrology (ASN). Responsibility for the information and views expressed therein lies entirely with the author(s). ASN does not offer medical advice. All content in ASN publications is for informational purposes only, and is not intended to cover all possible uses, directions, precautions, drug interactions, or adverse effects. This content should not be used during a medical emergency or for the diagnosis or treatment of any medical condition. Please consult your doctor or other qualified health care provider if you have any questions about a medical condition, or before taking any drug, changing your diet or commencing or discontinuing any course of treatment. Do not ignore or delay obtaining professional medical advice because of information accessed through ASN. Call 911 or your doctor for all medical emergencies. 
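As a rough illustration of how a donor risk calculator combines a baseline rate with donor-specific relative risks, here is a hypothetical sketch. It is not the published JASN model, which was fit to registry data and includes covariates such as age and body mass index; the baseline and multipliers below merely echo figures quoted in this article, and multiplying a population-wide median by relative risks is a deliberate oversimplification.

```python
# Hypothetical post-donation risk sketch -- illustrative only, NOT the
# published JASN prediction model. The baseline and multipliers only echo
# figures quoted in the article above.

BASELINE_20_YEAR_RISK = 34 / 10_000      # median 20-year risk per donor quoted in the article

RELATIVE_RISK = {                        # approximate multipliers quoted in the article
    "black_race": 3.0,
    "male_sex": 3.9,
}

def rough_20_year_risk(black_race: bool, male_sex: bool) -> float:
    """Crude 20-year post-donation kidney-failure risk for a donor profile."""
    risk = BASELINE_20_YEAR_RISK
    if black_race:
        risk *= RELATIVE_RISK["black_race"]
    if male_sex:
        risk *= RELATIVE_RISK["male_sex"]
    return risk

# Example: a hypothetical black male donor under these toy assumptions
print(f"~{rough_20_year_risk(True, True) * 10_000:.0f} per 10,000 donors at 20 years")
```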
Since 1966, ASN has been leading the fight to prevent, treat, and cure kidney diseases throughout the world by educating health professionals and scientists, advancing research and innovation, communicating new knowledge, and advocating for the highest quality care for patients. ASN has nearly 17,000 members representing 112 countries. For more information, please visit http://www.  or contact the society at 202-640-4660.


Patent
Johns Hopkins University and Lieber Institute For Brain Development | Date: 2017-01-25

RNA polymerase I (Pol I) is a dedicated polymerase for the transcription of the 47S ribosomal RNA precursor, which is subsequently processed into the mature 5.8S, 18S and 28S ribosomal RNAs and assembled into ribosomes in the nucleolus. Pol I activity is commonly deregulated in human cancers. Based on the discovery of the lead molecule BMH-21, a series of pyridoquinazolinecarboxamides was synthesized as inhibitors of Pol I and activators of the destruction of RPA194, the Pol I large catalytic subunit protein. The present invention identifies a set of bioactive compounds, including purified stereoisomers, that potently cause RPA194 degradation and that function within a tightly constrained chemical space. Pharmaceutical compositions comprising these compounds, and their uses in cancer and other Pol I-related diseases, are also provided.


The presently disclosed subject matter provides methods and kits for treating solid tumors in a subject by using a combination of anti-CTLA-4 and/or anti-PD-1 antibodies with at least one member of the group consisting of a bacterium, bacterial product, and an immunoregulatory entity. In particular embodiments, the bacteria are toxin-depleted, anaerobic bacteria, such as Clostridium novyi-NT.


News Article | May 8, 2017
Site: www.eurekalert.org

The concept of marriage may be in flux, but a new study of commuter marriages--in which a married couple lives apart in service to their dual professional careers--appears to confirm that married people still see interdependence as a key feature of their unions. The study, "Going the Distance: Individualism and Interdependence in the Commuter Marriage," draws on data from in-depth interviews with 97 people who are married but live apart from their spouses due to their individual career pursuits. In it, the author, Danielle Lindemann, assistant professor of sociology at Lehigh University, explores how the seemingly conflicting cultural norms of personal autonomy and a commitment to the institution of marriage play out "on the ground" from the viewpoint of the participants. Her analysis--which will be published in an upcoming issue of the Journal of Marriage and Family--finds that commuter couples indeed engage in discourses about two subjects that operate in tension: independence and interdependence. "Although the study participants positioned themselves as highly individualistic, interdependence was a key theme in their responses as well," says Lindemann. "Perhaps more surprisingly, a substantial minority of respondents indicated that their non-cohabitation, in fact, enhanced their interdependence." Lindemann acknowledges that married couples may live apart for a number of reasons. However, her study specifically focuses on college-educated, dual-earning couples as prior research has suggested that commuter marriage is more common within this group than in other segments of the population. Lindemann always sought to interview both spouses in a relationship, but it was not a necessary criterion for inclusion in the study. Fifty-six of the respondents were married to other people in the sample. Lindemann presents commuter marriage as particularly fertile ground to examine the cultural tension between marital interdependence and the shift toward the "individualization" of the American marriage. This shift, she writes--citing the work of Andrew J. Cherlin of Johns Hopkins University--has been largely driven by "...the decline of the male breadwinner/female homemaker model, decreasing task specialization between the genders, the increasing democratization of marital decision-making, and the increasing ability of each partner to provide financially for himself or herself." "Commuter marriages may be viewed as an extreme manifestation of major transitions in the nature of work and family that have been taking place in the U.S. since the 1970s," says Lindemann. "The study results not only shed light on this under-studied population but also broaden our understanding of the evolving cultural meaning of marriage." In addition to engaging in parallel narratives around individualism and interdependence, nearly one half (48.5%) of participants in the study engaged with the theme of "apart togetherness"--seeing themselves as connected, despite the distance. According to Lindemann, this frequently came up in response to the question, "What do you like the most about being married?"
From the study (all names are pseudonyms): "For instance, Katie, a banking professional in her mid-30's, replied that she enjoyed having her husband 'there,' adding 'We've learned that just because you don't see each other, it doesn't mean you're not together.'" Lindemann writes that 80 respondents received this question and, perhaps paradoxically for non-cohabitating couples, "enjoying each other's company" (41.3%) and "companionship" (30.0%) were the most common themes. One respondent, Matthew, a 60-year-old company director, described both the emotional and practical aspects of the "apart togetherness" he has experienced with spouse Trudy, from whom he has been living apart due to their individual career pursuits for twelve years. "Emphasizing both the emotional and task-sharing aspects of marriage, Matthew gave his relationship an interdependent frame, despite the fact that he and his wife had not lived in the same household, except on weekends, for over a decade," writes Lindemann. When asked a series of questions about their communication, more than three-fourths of study respondents discussed the usefulness of communication technologies for managing and sharing tasks. In contrast to previous studies of non-cohabitating couples (largely based on research from the 1970s and 1980s), this study's respondents described being in near constant contact via cell phones, texting, email, instant messaging, and video chat. From the study: "...respondents saw these technologies as facilitating inter-reliance. That is, [they] had the capacity to be reachable at virtually any time, so that they could rely on each other--not only emotionally, but financially and logistically as well." "One of the more surprising findings is that 15.5% of respondents--a substantial minority--interpreted their non-cohabitation as paradoxically facilitating their interdependence," says Lindemann. "Some went so far as to suggest that their communication with their spouses in fact improved when they were geographically separated."


Patent
Johns Hopkins University | Date: 2017-03-08

The present invention provides compositions comprising PAMAM dendrimers conjugated with one or more biologically active agents, and their use systemically to target activated microglia/macrophages in retina/choroid and generally, inflammatory and/or angiogenic diseases of the eye.


News Article | May 5, 2017
Site: phys.org

The challenges that this presents have led to some rather novel ideas, ranging from balloons and landers to floating drones and submarines. But it is the proposal for a "Dragonfly" drone by researchers at the Johns Hopkins University Applied Physics Laboratory (JHUAPL) that seems particularly adventurous. This eight-rotor drone would be capable of vertical takeoff and landing (VTOL), enabling it to explore both the atmosphere and the surface of Titan in the coming decades. The mission concept was proposed by a science team led by Elizabeth Turtle, a planetary scientist at JHUAPL. Back in February, the concept was presented at the "Planetary Science Vision 2050 Workshop" – which took place at NASA's headquarters in Washington, DC – and again in late March at the 48th Lunar and Planetary Science Conference in The Woodlands, Texas. Such a mission, as Turtle explained to Universe Today via email, is both timely and necessary. Not only would it build on many recent developments in robotic explorers (such as the Curiosity rover and the Cassini orbiter), but on Titan there is simply no shortage of opportunities for scientific research. As she put it: "Titan's an ocean world with a unique twist, which is the rich and complex organic chemistry occurring in its atmosphere and on its surface. This combination makes Titan a particularly good target for studying planetary habitability. One of the big questions about the development of life is how chemical interactions led to biological processes. Titan's been doing experiments in prebiotic chemistry for millions of years – timescales that are impossible to reproduce in the lab – and the results of these experiments are there to be collected." Their proposal is based in part on previous Decadal Surveys, such as the Campaign Strategy Working Group (CSWG) on Prebiotic Chemistry in the Outer Solar System. This survey emphasized that a mobile aerial vehicle (i.e., an airship or a balloon) would be well-suited to exploring Titan. Not only is Titan the only known body other than Earth that has a dense, nitrogen-rich atmosphere – four times as dense as Earth's – but its gravity is also about 1/7th that of Earth's. However, balloons and airships would be unable to study Titan's methane lakes, which are one of the most exciting draws as far as research into prebiotic chemistry goes. What's more, an aerial vehicle would not be able to conduct in-situ chemical analysis of the surface, much like what rovers such as Spirit, Opportunity and Curiosity have been doing on Mars. As such, Turtle and her colleagues began looking for a proposal that represented the best of both worlds – i.e., an aerial platform and a lander. This was the genesis of the Dragonfly concept. "Several different methods have been considered for in-situ aerial exploration of Titan (helicopters, different types of balloons, airplanes)," said Turtle. "Dragonfly takes advantage of the recent developments in multi-rotor aircraft to provide aerial mobility for a lander with a sophisticated payload. Because Dragonfly would be able to travel long distances – a few tens of kilometers at a time, and up to a few hundred kilometers over the course of the mission – it would be possible to make measurements at multiple sites with very different geologic histories."
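The advantage of a rotorcraft in that environment can be made concrete with a standard momentum-theory estimate of ideal hover power. The sketch below is not a Dragonfly design calculation; the vehicle mass and rotor disk area are arbitrary assumptions, and the 4x density and 1/7th gravity figures are simply the rounded values quoted above.

```python
# Back-of-the-envelope momentum-theory comparison of ideal hover power on
# Titan vs. Earth for the same vehicle mass and rotor disk area. The density
# and gravity ratios are the rounded values from the article; the mass and
# disk area are illustrative assumptions, not Dragonfly design numbers.
import math

RHO_EARTH = 1.225             # sea-level air density, kg/m^3
RHO_TITAN = 4 * RHO_EARTH     # Titan's atmosphere is roughly four times denser
G_EARTH = 9.81                # m/s^2
G_TITAN = G_EARTH / 7         # Titan's surface gravity is roughly 1/7th of Earth's

def ideal_hover_power(mass_kg: float, disk_area_m2: float, rho: float, g: float) -> float:
    """Ideal (momentum-theory) hover power: P = (m*g)^1.5 / sqrt(2*rho*A)."""
    thrust = mass_kg * g
    return thrust ** 1.5 / math.sqrt(2 * rho * disk_area_m2)

mass, area = 400.0, 2.0       # assumed vehicle mass (kg) and total rotor disk area (m^2)
p_earth = ideal_hover_power(mass, area, RHO_EARTH, G_EARTH)
p_titan = ideal_hover_power(mass, area, RHO_TITAN, G_TITAN)
print(f"Earth: {p_earth / 1000:.1f} kW, Titan: {p_titan / 1000:.2f} kW "
      f"(~{p_earth / p_titan:.0f}x less hover power needed on Titan)")
```

For the same mass and rotor area, hovering on Titan comes out to roughly one-fortieth of the power needed on Earth under this estimate, which is the basic physical reason a heavier-than-air, multi-rotor vehicle is attractive there.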
Initially, Turtle and her colleagues – who include Ralph Lorenz (also from JHUAPL), Melissa Trainer of the Goddard Space Flight Center, and Jason Barnes of the University of Idaho – had proposed a mission that would combine a Montgolfière-style balloon with a Pathfinder-like lander. Whereas the balloon would explore Titan from a low altitude, the lander would explore the surface up close. However, by the 48th Lunar and Planetary Science Conference, they had made some adjustments to their idea. Instead of a balloon and multiple landers, they presented a concept for a "Dragonfly" quadcopter to conduct both aerial and surface studies. This four-rotor vehicle, it was argued, would be able to take advantage of Titan's thick atmosphere and low gravity to obtain samples and determine surface compositions in multiple geological settings. In its latest iteration, the Dragonfly incorporates eight rotors (two positioned at each of its four corners) to achieve and maintain flight. Much like the Curiosity and upcoming Mars 2020 rovers, the Dragonfly would be powered by a Multimission Radioisotope Thermoelectric Generator (MMRTG). This system uses the heat generated by decaying plutonium-238 to generate electricity, and can keep a robotic mission going for years. This design, says Turtle, would offer scientists the ideal in-situ platform for studying Titan's environment: "Dragonfly would be able to measure compositional details of different surface materials, which would show how far organic chemistry has progressed in different environments. These measurements could also reveal chemical signatures of water-based life (like that on Earth) or even hydrocarbon-based life, if either were present on Titan. Dragonfly would also study Titan's atmosphere, surface, and sub-surface to understand current geologic activity, how materials are transported, and the possibility of exchange of organic material between the surface and the interior water ocean." This concept incorporates a lot of recent advances in technology, which include modern control electronics and advances in commercial unmanned aerial vehicle (UAV) designs. On top of that, the Dragonfly would do away with chemically powered retrorockets and could power up between flights, giving it a potentially much longer lifespan. "And now is the perfect time," says Turtle, "because we can build on what we've learned from the Cassini-Huygens mission to take the next steps in Titan exploration." Currently, NASA's Jet Propulsion Laboratory is developing a similar concept. Known as the Mars Helicopter "Scout," this aerial drone for use on Mars is expected to be launched aboard the Mars 2020 mission. In this case, the design calls for two coaxial counter-rotating rotors, which would provide the best thrust-to-weight ratio in Mars' thin atmosphere. This sort of VTOL platform could become a mainstay in the coming decades for long-term missions to bodies that have atmospheres. On Mars and Titan alike, such aerial drones could hop from one area to the next, obtaining samples for in-situ analysis and combining surface studies with atmospheric readings at various altitudes to get a more complete picture of the planet.


News Article | May 4, 2017
Site: www.eurekalert.org

Working with mouse, fly and human cells and tissue, Johns Hopkins researchers report new evidence that disruptions in the movement of cellular materials in and out of a cell's control center -- the nucleus -- appear to be a direct cause of brain cell death in Huntington's disease, an inherited adult neurodegenerative disorder. Moreover, they suggest, laboratory experiments with drugs designed to clear up these cellular "traffic jams" restored normal transport in and out of the nucleus and saved the cells. In the featured article published online on April 5 in Neuron, the researchers also conclude that potential treatments targeting the transport disruptions they identified in Huntington's disease neurons may also work for other neurodegenerative diseases, such as ALS and forms of dementia. Huntington's disease is a relatively rare fatal inherited condition that gradually kills off healthy nerve cells in the brain, leading to loss of language, thinking and reasoning abilities, memory, coordination and movement. Its course and effects are often described as Alzheimer's disease, Parkinson's disease and ALS rolled into one, making Huntington's disease a rich focus of scientific investigation. "We're trying to get at the heart of the mechanism behind neurodegenerative diseases and with this research believe we've found one that seems to be commonly disrupted in many of them, suggesting that similar drugs may work for some or all of these disorders," says Jeffrey Rothstein, M.D., Ph.D., a professor of neurology and neuroscience, and director of the Brain Science Institute and the Robert Packard Center for ALS Research at the Johns Hopkins University School of Medicine. In 2015, Rothstein's team found out how a mutation in a gene -- implicated in 40 percent of inherited ALS cases and 25 percent of inherited frontotemporal dementia cases -- gums up transport in and out of the nucleus in neurons, ultimately shutting the cell down and leading to its death. The mutant gene makes RNA molecules that stick to a transport protein, RanGAP1. RanGAP1 in turn helps move molecules through nuclear pores that serve as passageways in the nucleus, letting proteins and genetic material flow in and out of it. Jonathan Grima, currently a fourth-year neuroscience graduate student in Rothstein's laboratory, learned that this same mutation is also the most common cause of another disorder in which patients have Huntington's -like symptoms without having the causative Huntington's disease mutation. Additionally, he realized that other researchers previously showed that mutations in the nuclear pore protein NUP62 caused Huntington's disease-like pathology. Because of such clues from others' research, Grima took on the task of investigating whether problems with nuclear transport and the nuclear pores also happened in neurons with Huntington's disease. Huntington's disease is caused by a mutation in the Huntingtin protein, resulting in too many repeats of the amino acid glutamine in the protein's sequence, making the protein sticky and clumpy. Grima used two mouse models of Huntington's disease: one with a human version of the mutant Huntingtin protein and another with an aggressive form of the disease that contains only the first portion of the mouse Huntingtin protein. 
By using antibodies with glowing markers that bind to specific proteins and viewing the neurons under the microscope, Grima saw that the mutant Huntingtin protein clumped up in the same location of the cell as abnormal clumps of RanGAP1, the nuclear transport protein. It also clumped up in the same location as abnormal clumps of nuclear pore proteins NUP88 and NUP62. "This finding was quite tantalizing given the fact that mutations in the NUP62 protein were shown by other researchers to cause an infantile form of Huntington's disease called infantile bilateral striatal necrosis," says Grima. Grima also observed this same clumping of Huntingtin protein with RanGAP1 and nuclear pore proteins to the wrong place in the cell in brain tissue and cultured brain cells derived from deceased patients with Huntington's disease. To further explore nuclear transport's role in Huntington's disease, Grima took lab-grown mouse neurons and used chemical switches to a) turn on both an additional healthy copy of the RanGAP1 gene and a mutant version of Huntingtin; b) just turn on the mutant Huntingtin; or c) just turn on a healthy version of Huntingtin. He then measured cell death and found that neurons with the healthy version of Huntingtin had about 17 percent of the neurons die off. Neurons with only the mutant version of Huntingtin were more likely to die, with about 33 percent dying off, but in neurons with both the mutant Huntingtin and the RanGAP1, only 24 percent of the neurons died off. The researchers think that some of the extra healthy RanGAP1 they introduced into diseased cells wasn't bound up to the mutant Huntingtin and resumed normal nuclear transport. Next, Grima looked at cell death in cultured neurons with a healthy or a mutant form of Huntingtin, or with a mutant form of Huntingtin that was treated with small amounts of an experimental drug called KPT-350, one that prevents a nuclear export protein, Exportin-1, from shuttling proteins and RNA out of the nucleus. Neurons with the healthy version of Huntingtin had about 18 percent die off, and neurons with the mutant version of Huntingtin had about 38 percent die off. Those treated with the nuclear export blocking drug had improved survival, with only about 22 percent of the cells die off. Blocking nuclear export seemed to prevent cells from dying and counteracted the defects in neurons with mutant Huntingtin, the researchers say. "Our studies show that broken-down components of the nuclear transport machinery lead to traffic jams within brain neurons of essential information and eventually brain cell death," says Grima. "We believe that the reestablishment of proper cell transport could provide a promising therapeutic target for Huntington's disease, and potentially other neurodegenerative disorders." "Although the disrupted nuclear transport seems to be killing neurons in multiple neurodegenerative diseases, these diseases have very different properties and symptoms," cautions Rothstein. "We need to do more work to find out why one disease causes a certain set of symptoms and another disease causes others with respect to what is happening with nuclear transport." According to the researchers, there is an average of 2000 nuclear pores per cell and each individual nuclear pore consists of multiple copies of more than 30 different proteins that each serve different functions. 
It may be that nuclear pores on neurons and other types of brain cells like glia are constructed of different combinations of these proteins, some of which may be more or less critical in various neurodegenerative diseases. Grima is currently working on answering this question using a new mouse model developed at Johns Hopkins that will allow him to isolate these nuclear pore proteins from different cell types in the mouse brain to identify whether these nuclear pore components are in fact different based on brain cell types and brain locations. "We sincerely hope our new findings may help bring us a step closer to treating this and potentially other horrific neurodegenerative disorders," says Grima. According to the Huntington's Society of America, about 30,000 people in the United States have Huntington's symptoms and 200,000 people are at risk of inheriting the disease from a parent. Additional authors include J. Gavin Daigle, Nicolas Arbez, Kathleen Cunningham, Ke Zhang, Jenna Glatzer, Jacqueline Pham, Ishrat Ahmed, Qi Peng, Harsh Wadhwa, Olga Pletnikova, Juan Troncoso, Wenzhen Duan, Solomon Snyder, Thomas Lloyd, and Christopher Ross of Johns Hopkins Medicine; Joseph Ochaba, Charlene Geater, Eva Morozko, Jennifer Stocksdale and Leslie Thompson of University of California, Irvine; and Laura Ranum of University of Florida, Gainesville. This work was funded by grants from the National Institute of Neurological Disorders and Stroke (R01NS094239 and R01NS085207); CIRM Training Grant; National Science Foundation Graduate Research Fellowship Award; Thomas Shortman Training Fund Graduate Award; Axol Science Award; NIH Training in Neurotherapeutics Discovery and Development for Academic Scientists; MDI Laboratory QFM Chroma Fellowship Award; and the Johns Hopkins Brain Science Institute.


News Article | May 8, 2017
Site: phys.org

But the new virtual money could face a tough battle integrating into the wider financial system. After debuting on currency trading platforms in October, Zcash took off, hitting an exchange rate of $1,000 per unit, putting it in league with the much better established Bitcoin, the virtual currency pioneer created in 2009. While its value has since come down to earth, Zcash is attracting the interest of Russian, Chinese, Venezuelan and, as of May 4, South African consumers. Brazilians now use Zcash to pay taxes and electricity bills and make purchases. To make its mark in the world of virtual currencies, Zcash boasts that it protects user privacy. But because of that guarantee it does not offer the transparency demanded by authorities who want to prevent these new forms of tender from being used for money laundering, terrorism financing, tax evasion or fraud. Zcash was developed by researchers at Johns Hopkins University and the Massachusetts Institute of Technology in the United States and Tel Aviv University and the Technion-Israel Institute of Technology in Israel. Only five of the six people who developed the cryptography have been publicly identified. It is based on a technology dubbed zk-Snark, which allows untraceable transactions. The resulting data are encrypted but users are free to identify themselves. Other cryptocurrencies such as Dash and Monero offer a level of privacy, but Zcash goes further, even obscuring the origin of a payment. This is the opposite of Bitcoin, which uses blockchain technology that publicly records transaction details including the unique alphanumeric strings that identify buyers and sellers. "You don't expose all of your communications or all of your transactions to random people on the internet you barely know," said Zooko Wilcox, CEO of Zerocoin Electric Coin Company, which manages Zcash. Virtual currencies are produced, or "mined," by banks of computers solving complex algorithms, an operation that can be expensive. Wilcox told AFP he hoped the expanded privacy protection could overcome businesses' reluctance to adopt Zcash as a trustworthy alternative to traditional state-controlled currencies. But Jonathan Levin, co-founder of Chainalysis, a start-up that helps banks and authorities trace the origins and destinations of virtual currency payments, doubts Zcash will find its place in the wider financial system. "It is hard for existing financial institutions to integrate these types of crypto currencies as information on the origin of funds is very hard to ascertain," he said. Financial institutions began to take an interest in Bitcoin, and in particular in its blockchain technology, once the darknet marketplace known as Silk Road was closed in 2013. Silk Road facilitated Bitcoin transactions but was also a platform for the sale of illegal drugs. "Nobody has ever used Zcash for any kind of crime as far as anyone knows," Wilcox said, while conceding that "all technologies can be misused." Wilcox said he gave a presentation on Zcash to Canadian and US authorities in November and their attitude was "very pragmatic." Virtual currencies are not regulated by any central bank. In the United States, trading is authorized by individual states, which issue licenses to exchanges, and so far there is no regulation at the federal level. Unlike central bank-issued denominations, virtual currencies can be "mined" by anyone with sophisticated software skills to gather up the code.
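The article's description of mining as banks of computers solving complex algorithms can be illustrated with a generic, toy proof-of-work puzzle. Nothing below is Zcash-specific: Zcash's actual mining uses the Equihash proof-of-work, and its privacy comes from a separate zk-SNARK proving layer; this is only a sketch of the general trial-and-error idea behind mining.

```python
# Toy proof-of-work "mining" puzzle -- a generic illustration only. Real
# Zcash mining uses the Equihash proof-of-work, and its shielded transactions
# rely on a separate zk-SNARK layer; nothing here is Zcash-specific.
import hashlib
from itertools import count

def mine(block_data: bytes, difficulty: int) -> int:
    """Search for a nonce whose SHA-256 hash of (data + nonce) starts with
    `difficulty` zero hex digits -- the expensive trial-and-error step."""
    target = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce

nonce = mine(b"example transactions", difficulty=4)
print(f"found nonce {nonce} after roughly 16**4 = {16**4:,} expected attempts")
```

Raising the difficulty parameter makes the search exponentially more expensive, which is why mining at scale consumes the computing power the article describes.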
Nevertheless, despite Zcash's efforts to protect users, the currency itself may be vulnerable to hacking or counterfeiting. In a June attack against another cryptocurrency called ether, hackers reportedly made off with 3.6 million units with a value of $50 million. Cryptography consultant Peter Todd said in a November blog that Zcash's encryption could be weak, allowing hackers to crack the code. "The threat here is that an attacker may be able to create fake zk-Snark proofs by breaking the crypto directly, even without having access to the trusted setup backdoor," he wrote. Wilcox said Zerocoin Electric was alert to such risks and pays hackers to test the currency's security. In total, Zerocoin Electric expects a maximum of 21 million Zcash units will be mined, or produced, of which 10 percent will go to Zcash Electric shareholders, including founders, employees and investors. Explore further: Towards equal access to digital coins


News Article | May 8, 2017
Site: www.prweb.com

Johns Hopkins All Children's Hospital named cardiothoracic surgeon Jeffrey Jacobs, M.D., FACS, FACC, FCCP and cardiologist Gary Stapleton, M.D. as co-directors of the Johns Hopkins All Children's Heart Institute. The two are overseeing the U.S. News and World Report ranked pediatric cardiac surgery and cardiology programs at the hospital, as well as the team of specialists in cardiac surgery, pediatric cardiology, cardiac anesthesia, critical care and nursing working together to provide excellence in clinical care, education and research. “Drs. Jacobs and Stapleton will provide strong leadership, vision and clear strategy focused on innovation and excellence which will help us push quality and safety in cardiac care forward, as well as improve the overall care and outcomes for our heart patients,” says Jonathan Ellen, M.D., president and vice dean of Johns Hopkins All Children’s Hospital. Dr. Jacobs serves as chief of the Division of Cardiovascular Surgery and director of the Andrews/Daicoff Cardiovascular Program within the heart institute. He is a professor of cardiac surgery and pediatrics at Johns Hopkins University and surgical director of the Heart Transplantation Program and director of the Extracorporeal Life Support Program at Johns Hopkins All Children’s Heart Institute. In addition to his research in cardiothoracic surgery, he also serves as editor in chief of Cardiology in the Young, one of the most widely read journals dedicated to patients with pediatric and congenital cardiac disease. He also chairs the Society of Thoracic Surgeons Workforce on National Databases. Dr. Stapleton also serves as chief of pediatric cardiology and medical director of the cardiac catheterization lab, where more than 400 diagnostic and interventional procedures are performed annually. Additionally, Dr. Stapleton is active in research and education in interventional cardiology and has launched innovative techniques at Johns Hopkins All Children’s to treat congenital heart disease without the need for open heart surgery. About Johns Hopkins All Children’s Hospital Johns Hopkins All Children’s Hospital in St. Petersburg is a leader in children’s health care, combining a legacy of compassionate care focused solely on children since 1926 with the innovation and experience of one of the world’s leading health care systems. The 259-bed teaching hospital, ranked as a U.S. News & World Report Best Children’s Hospital, stands at the forefront of discovery, leading innovative research to cure and prevent childhood diseases while training the next generation of pediatric experts. With a network of Johns Hopkins All Children’s Outpatient Care centers and collaborative care provided by All Children’s Specialty Physicians at regional hospitals, Johns Hopkins All Children’s brings care closer to home. Johns Hopkins All Children’s Hospital consistently keeps the patient and family at the center of care while continuing to expand its mission in treatment, research, education and advocacy. For more information, visit HopkinsAllChildrens.org.


The Everyone Graduates Center, part of the Center for Social Organization of Schools at the Johns Hopkins University School of Education, seeks to identify the barriers to high school graduation, find strategic solutions to overcome those barriers, and build local capacity to implement and sustain those solutions so that all students graduate prepared to succeed as adults. www.every1graduates.org


News Article | April 20, 2017
Site: news.yahoo.com

This undated microscope image made available by the National Center for Microscopy and Imaging Research shows HeLa cells. Until these cells came along, whenever human cells were put in a lab dish, they would die immediately or reproduce only a few times. Henrietta Lacks' cells, by contrast, grew indefinitely. They were "perpetual, everlasting, death-defying, or whatever other word you want to use to describe immortal," says Dr. Francis Collins, director of the U.S. National Institutes of Health. (National Center for Microscopy and Imaging Research via AP) NEW YORK (AP) — What happened in the 1951 case of Henrietta Lacks, and could it happen again today? The story of the woman who unwittingly spurred a scientific bonanza made for a best-selling book in 2010. On Saturday, it returns in an HBO film with Oprah Winfrey portraying Lacks' daughter Deborah. Cells taken from Henrietta Lacks have been widely used in biomedical research. They came from a tumor sample taken from Lacks — who never gave permission for their use. A look at the case: HOW DID DOCTORS GET THE CELLS? As the book relates, Lacks was under anesthesia on an operating table at Johns Hopkins Hospital in Baltimore one day in 1951, undergoing treatment for cervical cancer. A hospital researcher had been collecting cervical cancer cells to see if they would grow continuously in the laboratory. So the surgeon treating Lacks shaved a dime-sized piece of tissue from her tumor for that project. Nobody had asked Lacks if she wanted to provide cells for the research. She died later that year. WAS IT ILLEGAL TO TAKE THE CELLS WITHOUT HER PERMISSION? Not at that time. "What happened to Henrietta Lacks was commonly done," says bioethicist Dr. Robert Klitzman of Columbia University in New York. WHAT ARE THE RULES NOW IN THE U.S.? Specimens intended specifically for research can be collected only if the donor gives consent first. If cells or tissues are instead removed for diagnosis and treatment, that is considered part of the patient's general consent for treatment. But there's a twist. Once a specimen is no longer needed for treating the patient and would otherwise be discarded, scientists can use it for research. No further consent is needed, as along as information identifying the patient as the source is removed and the specimen can't be traced back to the patient, says Johns Hopkins University bioethicist Jeffrey Kahn. IF A SPECIMEN LEADS TO A PRODUCT, DOES THE DONOR HAVE A RIGHT TO SHARE IN THE PROFITS? Generally not, because the consent form for donation or treatment usually waives any such legal right. WHAT WAS SO SPECIAL ABOUT LACKS' CELLS? Until they came along, whenever human cells were put in a lab dish, they would die immediately or reproduce only a few times. Her cells, by contrast, could be grown indefinitely. They were "perpetual, everlasting, death-defying, or whatever other word you want to use to describe immortal," as Dr. Francis Collins, director of the U.S. National Institutes of Health, put it. So they provided an unprecedented stock of human cells that could be shipped worldwide for experiments. They quickly became the most popular human cells for research, and have been cited in more than 74,000 scientific publications. HOW HAVE RESEARCHERS USED THE CELLS? The so-called "HeLa" cells became crucial for key developments in such areas as basic biology, understanding viruses and other germs, cancer treatments, in vitro fertilization and development of vaccines, including the polio vaccine. WHAT MAKES THEM GROW SO WELL? 
Researchers proposed a possible answer in 2013. Virtually all cases of cervical cancer are caused by infection with human papillomavirus, which inserts its genetic material into a human cell's DNA. Scientists who examined the DNA of HeLa cells suggested that happened in a place that strongly activated a cancer-promoting gene. That might explain both why Lacks' cancer was so aggressive and why the cells grow so robustly in a lab dish. DID EVERYBODY ALWAYS KNOW THE ORIGIN OF THE CELLS? No. Lacks was named publicly only in 1971, by an article in a medical journal. Her story appeared in some magazines in the 1970s, and in a 1997 documentary on BBC. She became famous in 2010 with publication of Rebecca Skloot's best-selling book, "The Immortal Life of Henrietta Lacks." Follow Malcolm Ritter at http://twitter.com/malcolmritter. His recent work can be found at http://tinyurl.com/RitterAP


WASHINGTON (April 19, 2017) -- Defective HIV proviruses, long thought to be harmless, produce viral proteins and distract the immune system from killing intact proviruses needed to reduce the HIV reservoir and cure HIV. Researchers at the George Washington University (GW) and Johns Hopkins University publish their findings in Cell Host & Microbe. Current HIV cure research focuses on eliminating intact proviruses in infected patients. However, the ratio of intact and defective proviruses is about one to 1,000, creating a "needle in a haystack problem," according to Brad Jones, Ph.D., co-first author of the paper and assistant professor of microbiology, immunology, and tropical medicine at the GW School of Medicine and Health Sciences. "For a long time, most of the field has thought that we don't have to worry about defective proviruses, because they could never restart infection," said Jones. "Our research shows that these defective proviruses can actually produce some viral proteins. While they can't produce an infection, they do harm by acting as decoy viruses and distracting the immune system." Researchers in the field have been frustrated with defective proviruses because they interfere with measurement -- most assays used to measure HIV will measure both the intact and defective proviruses. However, this research details their role as much more active. By producing viral proteins, the immune system expends resources on defective proviruses, rather than intact viruses. "It's a much bigger issue than we expected," said Jones. "In a way, this is a setback, but every time we learn what the obstacles are, we are moving forward. Perhaps we didn't quite know how far we had to go at the beginning." Further research may lead to different courses of treatment for HIV patients. If one therapy kills defective proviruses, it may still be considered of benefit, even if it doesn't kill all intact proviruses. Also, efforts to kill defective proviruses may lead to much stronger immune responses to clear both defective and intact proviruses. This research was supported in part by the BELIEVE grant - a multimillion-dollar HIV/AIDS cure research grant awarded to GW as part of the second iteration of the Martin Delaney Collaboratory at the National Institutes for Health. amfAR generationCURE also had a significant role in funding this research. In addition, this research was supported by the Johns Hopkins University Center for AIDS Research and the District of Columbia Center for AIDS Research. "Defective HIV-1 Proviruses Are Expressed and Can Be Recognized by Cytotoxic T Lymphocytes, which Shape the Proviral Landscape," published in Cell Host & Microbe, is available at http://www.cell.com/cell-host-microbe/fulltext/S1931-3128(17)30118-X. Media: To interview Dr. Jones, please contact Lisa Anderson at lisama2@gwu.edu or 202-994-3121. Founded in 1824, the GW School of Medicine and Health Sciences (SMHS) was the first medical school in the nation's capital and is the 11th oldest in the country. Working together in our nation's capital, with integrity and resolve, the GW SMHS is committed to improving the health and well-being of our local, national and global communities. smhs.gwu.edu


A gene previously identified as critical for tumor growth in many human cancers also maintains intestinal stem cells and encourages the growth of cells that support them, according to results of a study led by Johns Hopkins researchers. The finding, reported in the Apr. 28 issue of Nature Communications, adds to evidence for the intimate link between stem cells and cancer, and advances prospects for regenerative medicine and cancer treatments. Study leader Linda M. S. Resar, M.D., professor of medicine, oncology and pathology at the Institute for Cellular Engineering at the Johns Hopkins University School of Medicine, has been studying genes in the high-mobility group (HMG) family for over two decades. Several years ago, while creating a genetically engineered mouse that expresses high levels of the mouse HMGA1 gene to investigate its role in leukemia, Resar and her colleagues made the chance finding that the intestines of these animals were much larger and heavier than those of "wild-type" animals (or control mice that were not genetically modified). The mouse intestines were also riddled with polyps, abnormal growths projecting from the intestinal lining that can be precursors of cancer. In fact, polyps in humans frequently progress to colon cancer, which is why they are removed during screening colonoscopies in people over 50 and others at risk for colon cancer. To better understand how HMGA1 affected the rodents' intestines, Resar and Lingling Xian, M.D., Ph.D., research associate at the Johns Hopkins University School of Medicine, and their colleagues examined the transgenic animals' intestinal cells to determine which ones were expressing this gene. Several different experiments localized the active gene and its protein to stem cells buried within the crypts, or deep grooves in the intestinal lining. After isolating these stem cells from both transgenic and wild-type mice, the researchers found that those carrying the HMGA1 transgene multiplied far more rapidly, forming identical daughter cells in a process called self-renewal, which is a defining property of all stem cells. These transgenic stem cells also readily created intestinal tissues called "organoids" in laboratory dishes. These organoids had more stem cells than those isolated from wild-type mice. Further investigation, says Resar, showed that these unusual properties arise from the ability of HMGA1 to turn on several genes involved in the Wnt pathway, a network of proteins necessary for embryonic development and stem cell activity. Stem cells do not function in isolation, explains Resar. They need a "niche" to survive and maintain an undifferentiated state. From the French word nicher, which means to build a nest, a niche is a nest-like compartment comprised of cells that secrete growth factors and other proteins that help stem cells survive. The niche also prevents stem cells from morphing into mature intestinal cells until new intestinal cells are needed. Intestinal stem cells are particularly important because a new intestinal lining is generated about every 4-5 days. Looking further into the intestinal crypts of both the transgenic and wild-type mice, the research team made what they consider a surprising finding: Not only was HMGA1 causing the stem cells themselves to self-renew or proliferate more rapidly in the transgenic animals, but it was also increasing the number of Paneth cells, a type of niche cell known to support intestinal stem cells. 
Additional experiments showed that the protein produced by HMGA1 activates another gene called Sox9, which is directly responsible for turning stem cells into Paneth cells. "We suspected that HMGA1 might generate new stem cells, but we were extremely surprised that it also helps support these cells by building a niche," Resar says. "We believe that our experiments provide the first example of a factor that both expands the intestinal stem cell compartment and builds a niche." Many genes that are involved in the growth and development of embryos or adult stem cells also play roles in cancer, Resar adds. After scanning the Cancer Genome Atlas, a database of genes expressed in human cancers, the research team discovered that the activities of the HMGA1 and SOX9 genes are tightly correlated in normal colon tissue, and both genes become highly overexpressed in colon cancer. "This tells us that the pathway turned on by HMGA1 in normal intestinal stem cells becomes disrupted and hyperactive in colon cancer," Resar says. Resar says the team plans to continue investigating the function of HMGA1 and SOX9 in intestinal and other cancers as well as their role in stem cells. Both avenues of investigation could eventually lead to clinical applications, she adds. For example, if scientists can find a way to turn down overexpression of these genes in cancer, they could disrupt cancer growth and prevent tumor progression. On the flip side, turning up expression of these genes or their pathways could help researchers grow new intestinal tissue to replace tissue destroyed by diseases such as inflammatory bowel disease or radiation treatment for cancer. "What we discovered is something referred to as the Goldilocks paradox," she says. "Too little of this protein disrupts normal stem cell function, but too much can promote abnormal growth and lead to cancer. For our work to help patients, we will need to find ways to get the amount just right and in the appropriate cell context." Other Johns Hopkins researchers who participated in this research include Dan Georgess, Tait Huso, Leslie Cope, Amy Belton, Yu-Ting Chang, Wenyong Kuang, Qihua Gu, Xiaoyan Zhang, David L. Huso and Andrew J. Ewald. This work also included Alessio Fasano and Stefania Senger from Massachusetts General Hospital for Children. This work was supported by grants from the National Cancer Institute (grant numbers CA182679, CA164677, and CA149550) and the Maryland Stem Cell Research Fund (grant numbers 2011-MSCRFE-0102 and 2015-MSCRFE-1759).
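For readers unfamiliar with the kind of co-expression check described above, here is a generic sketch of how such a correlation is typically computed from expression data. The values are invented for illustration and are not Cancer Genome Atlas data; the published analysis would have used its own normalization and statistics.

```python
# Generic co-expression check of the kind described above. The expression
# values are made-up placeholders, NOT Cancer Genome Atlas data.
import numpy as np

# Hypothetical log2 expression values for HMGA1 and SOX9 across ten samples
# (first five imagined as normal colon tissue, last five as tumor).
hmga1 = np.array([2.1, 2.4, 2.0, 2.6, 2.3, 6.8, 7.1, 6.5, 7.4, 6.9])
sox9  = np.array([3.0, 3.3, 2.9, 3.5, 3.2, 7.9, 8.3, 7.6, 8.5, 8.0])

r = np.corrcoef(hmga1, sox9)[0, 1]   # Pearson correlation across all samples
print(f"Pearson r between HMGA1 and SOX9 expression: {r:.2f}")
```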


News Article | April 20, 2017
Site: www.chromatographytechniques.com

Despite what may seem like a head-on collision between science and politics, officials involved in the March for Science do not believe the march should be partisan. During an April 19 press conference Rush Holt, Ph.D., CEO of the American Association for the Advancement of Science, said the April 22 march that will take place throughout the world is not about partisan politics but rather for advocating for the advancement of science. The march, which coincides with Earth Day, is expected to take place in over 400 locations around the world, including marches scheduled to occur in all 50 states. It was organized by numerous science-based organizations including the AAAS, the American Geophysical Union, the National Center for Science Education and others. “I’d say the organizers of the march have taken great pain to say that it is not partisan,” Holt said during the press conference. “It is not about any particular public official or political figure or particular funding situation but more generally about a statement about the relevance or the significance or the value of science. “There’s been a concern amongst scientists and friends of science that evidence has been crowded out by ideology and opinion in public debate, in policy making,” Holt added. Holt, who is a physicist and also formerly represented New Jersey’s 12th district as a Democrat for eight terms in Congress, said the March for Science was birthed when various scientists participating in the January 21st Women’s March, the day after the inauguration of President Donald Trump, began to break out into conversations about the future of science. However, the feelings that led to the march started long before Trump was ever elected. “It’s true that the March for Science began in January but it was built on a growing level of concern that reached the level of anxiety about the conditions under which science can thrive,” he said. “Conditions that have been challenged and threatened in a lot of ways for a number of years. “Scientists find it appalling that evidence has been crowded out by ideological assertions, raw opinion and wishful thinking. We’ve seen that from both parties, from policy makers on all levels,” he added. Holt said while the march isn’t partisan, there are certain initiatives of the Trump administration that will negatively impact science including the potential of a travel ban and a gag order on government scientists. Lydia Villa-Komaroff, an honorary national co-chair of the March for Science, a molecular and cellular biologist and co-founding member of the Society for the Advancement of Chicanos/Hispanics and Native Americans in Science, explained the importance in advocating for evidence based science decisions. “I think it is incredibly important that we try to make the case that the fundament base of science really underlies all of modern life,” Villa-Komaroff said. “I think if we use evidence-based science we will make better decisions.” Elias Zerhouni, president of global research and development at the Sanofi pharmaceutical company and the former director of the National Institutes of Health (NHI), said maintaining federal funding is crucial for the next generation of scientists and science work. “What’s most critical is that federal funding is absolutely critical to create the human capital that any society will need to push the fountains of knowledge,” Zerhouni said. 
“The new knowledge will allow not only society to use these ideas but in the process train thousands of scientists.” Zerhouni said there are about 300,000 scientists in the U.S. and he expects many of the younger or aspiring scientists to be discouraged from the field if funding dries up. “Federal funding is not something you can turn on or off when it comes to human capital,” he said. “To me the march for science is a way to say this is not a partisan issue, this is not one administration vs. another.” Carol Greider, a professor of molecular biology at Johns Hopkins University and the 2009 Nobel Laureate of Medicine, agreed that funding is crucial to the future of science. “Without the support from the NIH we will lose the next generation of scientists and the next wave of breakthroughs and therapies,” Greider said. “If there is a 20 percent reduction in the NIH budget there will be no new grants for young scientists starting out and we will potentially lose an entire generation of people who are now trained and have the talents and are ready and eager to make the next breakthroughs.” For more information on the March for Science visit their official website.


News Article | April 20, 2017
Site: www.eurekalert.org

Working with genetically engineered mice -- and especially their whiskers -- Johns Hopkins researchers report they have identified a group of nerve cells in the skin responsible for what they call "active touch," a combination of motion and sensory feeling needed to navigate the external world. The discovery of this basic sensory mechanism, described online April 20 in the journal Neuron, advances the search for better "smart" prosthetics for people, ones that provide more natural sensory feedback to the brain during use. Study leader Daniel O'Connor, Ph.D., assistant professor of neuroscience at the Johns Hopkins University School of Medicine, explains that over the past several decades, researchers have amassed a wealth of knowledge about the sense of touch. "You can open up textbooks and read all about the different types of sensors or receptor cells in the skin," he says. "However, almost everything we know is from experiments where tactile stimulation was applied to the stationary skin--in other words, passive touch." Such "passive touch," O'Connor adds, isn't how humans and other animals normally explore their world. For example, he says, people entering a dark room might search for a light switch by actively feeling the wall with their hands. To tell if an object is hard or soft, they'd probably need to press it with their fingers. To see if an object is smooth or rough, they'd scan their fingers back and forth across an object's surface. Each of these forms of touch combined with motion, he says, is an active way of exploring the world, rather than waiting to have a touch stimulus presented. They each also require the ability to sense a body part's relative position in space, an ability known as proprioception. While some research has suggested that the same populations of nerve cells, or neurons, might be responsible for sensing both proprioception and touch necessary for this sensory-motor integration, whether this was true and which neurons accomplish this feat have been largely unknown, O'Connor says. To find out more, O'Connor and his team developed an experimental system with mice that allowed them to record electrical signals from specific neurons located in the skin during both touch and motion. The researchers accomplished this, they report, by working with members of a laboratory led by David Ginty, Ph.D., a former Johns Hopkins University faculty member, now at Harvard Medical School, to develop genetically altered mice. In these animals, sensory neurons in the skin called Merkel afferents were altered so that they responded to touch -- their "native" stimulus, and one long documented in previous research -- but also to blue light, which skin nerve cells don't normally respond to. The scientists trained the rodents to run on a mouse-sized treadmill with a small, motorized pole attached to the front that could move to different locations. Before the mice started running, the researchers used their touch-and-light sensitized system to find a single Merkel afferent near each animal's whiskers and used an electrode to measure the electrical signals from this neuron. Much like humans use their hands to explore the world through touch, mice use their whiskers, explains O'Connor. Consequently, as the animals began running on the treadmill, they moved their whiskers back and forth in a motion that researchers call "exploratory whisking." 
Using a high-speed camera focused on the animals' whiskers, the researchers took nearly 55,000,000 frames of video while the mice ran and whisked. They then used computer-learning algorithms to separate the movements into three different categories: when the rodents weren't whisking or in contact with the pole; when they were whisking with no contact; or when they were whisking against the pole. They then connected each of these movements -- using video snapshots captured 500 times every second -- to the electrical signals coming from the animals' blue-light-sensitive Merkel afferents. The results show that the Merkel afferents produced action potentials -- the electrical spikes that neurons use to communicate with each other and the brain -- when their associated whiskers contacted the pole. That finding wasn't particularly surprising, O'Connor says, because of these neurons' well-established role in touch. However, he says, the Merkel afferents also responded robustly when they were moving in the air without touching the pole. By delving into the specific electrical signals, the researchers discovered that the action potentials were precisely related to a whisker's position in space. These findings suggest that Merkel afferents play a dual role in touch and proprioception, and in the sensory-motor integration necessary for active touch, O'Connor says. Although these findings are particular to mouse whiskers, he cautions, he and his colleagues believe that Merkel afferents in humans could serve a similar function, because many anatomical and physiological properties of Merkel afferents appear similar across a range of species, including mice and humans. Besides shedding light on a basic biological question, O'Connor says, his team's research could also eventually improve artificial limbs and digits. Some prosthetics are now able to interface with the human brain, allowing users to move them using directed brain signals. While this motion is a huge advance beyond traditional static prosthetics, it still doesn't allow the smooth movement of natural limbs. By integrating signals similar to those produced by Merkel afferents, he explains, researchers might eventually be able to create prosthetics that can send signals about touch and proprioception to the brain, allowing movements akin to native limbs. Other Johns Hopkins researchers who participated in this study include Kyle S. Severson, Duo Xu, Margaret Van de Loo, and Ling Bai. Funding for this work was provided by the National Institutes of Health under grant numbers R01NS34814 and P30NS050274. O'Connor is supported by the Whitehall Foundation, Klingenstein Fund and the National Institutes of Health under grant number R01NS089652.
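The relationship between spiking and whisker position described above can be illustrated with a simple tuning-curve calculation. The sketch below uses simulated data and is not the authors' analysis pipeline; it assumes a whisker-angle trace sampled 500 times per second (2 milliseconds per video frame) and a matching spike count for each frame, then averages firing rate within position bins.

import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-ins for the real measurements: a whisker-angle trace and
# a per-frame spike count, sampled 500 times per second (2 ms per frame).
angle = 20 * np.sin(np.linspace(0, 40 * np.pi, 30_000))  # whisker angle in degrees
rate = 30.0 * np.exp(angle / 20.0)                       # assumed position-dependent rate, spikes/s
spikes = rng.poisson(rate * 0.002)                       # spikes observed in each 2 ms frame

# Tuning curve: average firing rate as a function of whisker position.
bins = np.linspace(-20, 20, 9)
idx = np.digitize(angle, bins)
for b in range(1, len(bins)):
    sel = idx == b
    if sel.any():
        hz = spikes[sel].mean() / 0.002  # convert counts per frame back to spikes/s
        print(f"{bins[b-1]:6.1f} to {bins[b]:6.1f} deg: {hz:5.1f} spikes/s")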


News Article | April 21, 2017
Site: www.techrepublic.com

Advice on the internet flows freely. With so much information available, how does one know what to believe? For example, there is still significant confusion regarding the now defunct FCC regulation requiring ISPs to get permission from their customers before they collect web-browsing data. So who do we trust to give good advice about being safe and private on the internet? SEE: The real reason behind the new law for ISPs and what it means for internet users (TechRepublic) Elissa Redmiles, a Ph.D. student in computer science at the University of Maryland, wrote a commentary for The Conversation titled Can better advice keep you safer online? in which she offers insight about who to trust when it comes to cybersecurity advice. "One key to staying safer online may be getting advice from the right places—people and sources with accurate, helpful information that can let you take control of your online privacy and security," writes Redmiles. "My research, in collaboration with Sean Kross (Johns Hopkins University) and Michelle Mazurek (University of Maryland), explores where people get their advice about online security, and how useful it actually is." SEE: Your internet history is now for sale. Here's how you can protect it. (TechRepublic) Redmiles, Kross, and Mazurek used a survey of 3,000 internet users who are in the US to determine where people receive their advice about online security and privacy. The researchers published their findings in the paper Where is the Digital Divide? A Survey of Security, Privacy, and Socioeconomics (PDF). "We found that no matter how wealthy or how poor a person is, no matter her education level, the speed of her internet service or whether she has a smartphone, a person's online safety is closely related to where, and from whom, she gets advice about online security," reports Redmiles. "Approximately 70 percent of Americans learn about online security behaviors as a result of advice shared by friends, family and co-workers, or on websites they visit." Figure A provides a comparative overview of the respondents' advice sources and the percentages of respondents who were eventually victims of an online security/privacy issue. In her commentary, Redmiles offered the following additional information. One of the findings by Redmiles, Kross, and Mazurek of particular interest is that 13% of people participating in the survey received advice from teachers or librarians, and of those only 8%—the lowest percentage reported—had an online safety problem. "Our findings also suggest that librarians are underutilized but potentially very valuable sources of online safety information," explains Redmiles. "We asked local librarians for a few suggestions of good resources for getting started with protecting your information. They recommended Get Started With Privacy and Security Starter Pack & Tutorials as good first steps to making an online security plan." To help keep children safe while online, Redmiles mentions the librarians recommended the National Cyber Security Alliance website, with security and privacy activities and information for kids and parents. With so much security advice available, Redmiles suggests people should not accept any answer wholesale, and follow these recommendations instead. 
SEE: Online security 101: Tips for protecting your privacy from hackers and spies (ZDNet) Redmiles, Kross, and Mazurek report a strong relationship between respondents' security and privacy experiences and their advice sources; however, the details are murky. "The direction of this relationship is unclear: do people receive bad advice that leads to worse experiences, or do they wait to seek advice until after a negative experience?" the researchers explain. "We hypothesize some of both." What is crystal clear to the three authors is that the current advice ecosystem is not working and should be reevaluated.
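The kind of cross-tabulation behind the figures quoted above, such as the share of respondents using each advice source and the rate of incidents within each group, can be sketched with a small pandas example. The rows below are invented for illustration and are not the authors' survey data.

import pandas as pd

# Hypothetical survey rows: each respondent names a main advice source and
# reports whether they experienced an online security or privacy incident.
df = pd.DataFrame({
    "advice_source": ["friends_family", "websites", "teacher_librarian",
                      "friends_family", "websites", "teacher_librarian",
                      "coworkers", "websites", "friends_family", "coworkers"],
    "had_incident": [1, 1, 0, 0, 1, 0, 1, 0, 1, 0],
})

# Respondent counts and incident rate per advice source, plus each source's share of the sample.
summary = df.groupby("advice_source")["had_incident"].agg(respondents="size", incident_rate="mean")
summary["share_of_sample"] = summary["respondents"] / len(df)
print(summary)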


News Article | April 20, 2017
Site: www.rdmag.com

This Earth Day, thousands are expected to take a stand for science—gathering in Washington, D.C. and in over 400 locations around the world as part of the March for Science. However, despite what may seem like a head-on collision between science and politics, officials involved in the April 22 event do not believe the march should be partisan. Rush Holt, Ph.D., CEO of the American Association for the Advancement of Science (AAAS), said the march—which was organized in partnership with the AAAS, the American Geophysical Union, the National Center for Science Education and other science-based organizations—is not about partisan politics, but rather about advocating for the advancement of science. “I’d say the organizers of the march have taken great pain to say that it is not partisan,” Holt said during an April 19 press conference about the upcoming event. “It is not about any particular public official or political figure or particular funding situation, but more generally about a statement about the relevance or the significance or the value of science. “There’s been a concern amongst scientists and friends of science that evidence has been crowded out by ideology and opinion in public debate, in policy making,” Holt added. Holt, who is a physicist and formerly represented New Jersey’s 12th district as a Democrat for eight terms in Congress, said the March for Science was birthed when various scientists participating in the January 21st Women’s March began to have conversations about the future of science. However, the feelings that led to the march started long before Trump was ever elected. “It’s true that the March for Science began in January but it was built on a growing level of concern that reached the level of anxiety about the conditions under which science can thrive,” he said. “Conditions have been challenged and threatened in a lot of ways for a number of years. Scientists find it appalling that evidence has been crowded out by ideological assertions, raw opinion and wishful thinking. We’ve seen that from both parties, from policy makers on all levels." Holt said while the march isn’t partisan, there are certain initiatives coming from the Trump administration that will negatively impact science, including the potential of a travel ban and a gag order on government scientists. Lydia Villa-Komaroff, an honorary national co-chair of the March for Science, a molecular and cellular biologist and co-founding member of the Society for the Advancement of Chicanos/Hispanics and Native Americans in Science, explained at the press conference the importance of advocating for evidence-based science decisions. “I think it is incredibly important that we try to make the case that the fundament base of science really underlies all of modern life,” Villa-Komaroff said. “I think if we use evidence-based science we will make better decisions.” During the same press conference, Elias Zerhouni, president of global research and development at the Sanofi pharmaceutical company and the former director of the National Institutes of Health (NIH), said maintaining federal funding is crucial for the next generation of scientists and science work. “What’s most critical is that federal funding is absolutely critical to create the human capital that any society will need to push the fountains of knowledge,” Zerhouni said. “The new knowledge will allow not only society to use these ideas, but in the process train thousands of scientists.” Zerhouni said there are about 300,000 scientists in the U.S. 
and he expects many of the younger or aspiring scientists to be discouraged from the field if funding dries up. “Federal funding is not something you can turn on or off when it comes to human capital,” he said. “To me the March for Science is a way to say this is not a partisan issue, this is not one administration vs. another.” Carol Greider, a professor of molecular biology at Johns Hopkins University and the 2009 Nobel Laureate of Medicine, agreed that funding is crucial to the future of science. “Without the support from the NIH we will lose the next generation of scientists and the next wave of breakthroughs and therapies,” Greider said during the press conference. “If there is a 20 percent reduction in the NIH budget there will be no new grants for young scientists starting out and we will potentially lose an entire generation of people who are now trained and have the talents and are ready and eager to make the next breakthroughs.” For more information on the March for Science visit their official website.


News Article | April 19, 2017
Site: www.eurekalert.org

The latest in a series of studies led by researchers at Johns Hopkins Medicine shows that addition of a widely available, noninvasive imaging test called 99mTc-sestamibi SPECT/CT to CT or MRI increases the accuracy of kidney tumor classification. The research team reports that the potential improvement in diagnostic accuracy could spare thousands of patients each year in the United States alone from having to undergo unnecessary surgery. In a recent report on ongoing work to improve kidney tumor classification, published in the April issue of the journal Clinical Nuclear Medicine, the team reports that the sestamibi SPECT/CT test--short for 99mTc-sestamibi single-photon emission computed tomography/computed tomography (CT) -- adds diagnostic information when used in conjunction with conventional CT and MRI and improves physicians' ability to differentiate between benign and malignant kidney tumors. "Sestamibi SPECT/CT lets radiologists and urologists 'see' the most common benign kidney tumor, something CT and MRI have not succeeded in doing alone," says Mohamad E. Allaf, M.D., MEA Endowed Professor of Urology at the Johns Hopkins University School of Medicine. "This noninvasive scan may prevent patients with a potentially benign kidney tumor from having to undergo a surgery to remove the tumor or potentially the entire kidney, along with its associated risks and high costs. At Johns Hopkins, use of this test has already spared a number of our patients from unnecessary surgery and unnecessary removal of a kidney that would require them to be on dialysis. These results are hugely encouraging, but we need to do more studies." For this study, 48 patients who were diagnosed with a kidney tumor on conventional CT or MRI were imaged with sestamibi SPECT/CT at Johns Hopkins prior to surgery. Radiologists, who were not allowed to talk to each other or know the results of the surgeries, graded the conventional and sestamibi SPECT/CT images as benign or malignant using a 5-point scale (1 = definitely benign, 5 = definitely cancerous). Following surgery, similarly 'blinded' pathologists analyzed the tumors without knowing the radiologists' imaging results. Pathology results of surgically removed tumors showed that 8 of the 48 were benign. The remaining 40 were classified as a variety of other tumor types, including malignant renal cell carcinomas. Reviewing sestamibi SPECT/CT scan results in conjunction with CT or MRI changed the initial rating levels from cancerous (scores 3, 4 and 5) toward benign (scores 1 and 2) in 9 cases, and changed reviewers' score from likely cancerous (score 4) to definitely cancerous (score 5) in 5 cases, or about 10 percent of all cases. The addition of sestamibi SPECT/CT increased the reviewers' diagnostic certainty in 14 of the 48 patients, or in nearly 30 percent of all cases. Overall, the investigators said, adding sestamibi SPECT/CT helped identify 7 of 9 benign tumors, and conventional imaging with added sestamibi SPECT/CT outperformed conventional imaging alone, as measured by a statistical analysis that captures the tradeoff between sensitivity and specificity. On this measure, a value of 0.50 indicates that a diagnostic test is no better than chance. Conventional imaging combined with sestamibi SPECT/CT had a value of 0.85, while conventional imaging alone had a value of 0.60. 
Even for patients whose tumors were not reclassified, the addition of sestamibi SPECT/CT increased physicians' ability to more confidently classify malignant tumors, which reduces the risk of misdiagnosis and unnecessary surgery for all patients, the researchers say. Radiologists and urologists have been frustrated for decades by the inability of conventional imaging tests, such as CT and MRI, to distinguish benign from malignant kidney tumors. At Johns Hopkins, multispecialty teams work together to determine the best care for patients and as partners on research innovations and quality improvement initiatives. "This collaborative venue enabled two then-residents [Drs. Michael Gorin and Steven Rowe] from different departments and specialties to design a clinical trial based on a few reports in the literature suggesting a potential role for sestamibi SPECT/CT in this diagnostic conundrum, and their hypothesis proved correct," says Mehrbod Som Javadi, M.D., assistant professor of radiology at Johns Hopkins University School of Medicine and the senior author on the paper. Pamela T. Johnson, M.D., associate professor of radiology at the Johns Hopkins University School of Medicine, notes, "These types of advances are critical to our precision medicine initiative, Hopkins inHealth, designed for individualized patient management, and to our mission of high-value health care, where the highest quality care is safely delivered at the lowest personal and financial cost to the patient." "As radiologists, we have struggled to find noninvasive ways to better classify patients and spare unnecessary surgery, but this has not been easy," says Steven P. Rowe, M.D., Ph.D., one of the two former residents who developed this approach, and now assistant professor of radiology and radiological science at the Johns Hopkins University School of Medicine. "Sestamibi SPECT/CT offers an inexpensive and widely available means of better characterizing kidney tumors, and the identical test is now being performed as part of a large trial in Sweden, for which the first results have just recently been published and appear to confirm our conclusions." Although further study is needed to validate the accuracy of sestamibi SPECT/CT, this test appears to be a less expensive, faster, noninvasive alternative to surgery, says Michael A. Gorin, M.D., the other resident involved in developing this approach and now chief resident with The James Buchanan Brady Urological Institute of the Johns Hopkins University School of Medicine. "In the absence of diagnostic certainty, surgeons tend to remove kidney tumors in an abundance of caution, leading to an estimated 5,600 surgically removed benign kidney tumors each year in the United States." Other authors on this paper include Sara Sheikhbahaei, Christopher S. Jones, Kristin K. Porter, Alex S. Baras, Phillip M. Pierorazio, Mark W. Ball, Lilja B. Solnes, Jonathan I. Epstein, and Mehrbod S. Javadi, all of the Johns Hopkins University School of Medicine, and Takahiro Higuchi of Würzburg University in Germany.
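The sensitivity/specificity trade-off summarized above behaves like an area under a receiver operating characteristic (ROC) curve, where 0.50 is chance and 1.0 is perfect discrimination. The snippet below is a hypothetical illustration only: the 5-point reads and pathology labels are invented, and it simply shows how such a value could be computed from scored images and surgical ground truth.

from sklearn.metrics import roc_auc_score

# Hypothetical reads (not the study data): scores run from 1 = definitely benign
# to 5 = definitely cancerous; labels come from pathology after surgery.
pathology_malignant = [0, 0, 0, 1, 1, 1, 1, 0, 1, 1]
conventional_score = [3, 4, 2, 4, 3, 5, 4, 3, 3, 4]  # CT/MRI alone
combined_score = [1, 2, 2, 4, 4, 5, 5, 1, 4, 5]      # CT/MRI plus sestamibi SPECT/CT

print("conventional AUC:", roc_auc_score(pathology_malignant, conventional_score))
print("combined AUC:   ", roc_auc_score(pathology_malignant, combined_score))
# An AUC of 0.50 corresponds to chance; values nearer 1.0 indicate better discrimination.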


News Article | April 26, 2017
Site: www.eurekalert.org

In a preclinical study in mice and human cells, researchers report that selectively removing old or 'senescent' cells from joints could stop and even reverse the progression of osteoarthritis. The findings, published April 24 in Nature Medicine, support growing evidence that senescent cells contribute to age-related diseases and demonstrate that using drug therapies to remove them from the joint not only reduces the development of post-traumatic osteoarthritis, but creates an environment for new cartilage to grow and repair joints. Senescent cells accumulate in tissues as we age and are a normal part of wound healing and injury repair. They secrete important signals that call immune cells and other cell types into damaged tissue so they can clean up and rebuild. However, in articular joints such as the knee, and cartilage tissue in particular, these senescent cells often are not cleared from the area after injury. Their prolonged presence causes a cascade of events, which starts the development of osteoarthritis. "Combine age-related increases in senescent cells, plus trauma, and it's a double whammy," says Jennifer Elisseeff, Ph.D., director of the Translational Tissue Engineering Center and Morton Goldberg Professor of Ophthalmology at the Johns Hopkins Wilmer Eye Institute. The researchers took young mice and performed surgery on them, cutting their anterior cruciate ligaments (ACL) to mimic injury. The researchers then administered injections of an experimental drug named UBX0101, which was recently shown in laboratory studies to kill senescent cells. Researchers injected UBX0101 into the mice's joints 14 days after trauma, when degradation was already starting, and observed that the presence of senescent cells was reduced by roughly 50 percent. In addition, the researchers monitored gene expression in treated mice and found that genes associated with reparative cartilage growth were activated in the joint after treatment. Similar experiments were conducted in older mice, which showed some key differences from the treatment in younger mice. The older mice had thinner cartilage in the joint and increased pain levels before the experiment. After treatment with UBX0101 injections, the older mice exhibited reduced pain like their more youthful counterparts, but did not exhibit signs of cartilage regeneration. To gauge the potential for UBX0101 to be translated to a human treatment, researchers tested the drug in cultures of human cartilage cells taken from donors with clinically severe osteoarthritis (i.e., patients who had undergone total knee replacement surgery due to damage from osteoarthritis). Elisseeff's group then grew these cartilage cells into 3D structures in the lab. The 3D structures mimic how cartilage tissues grow in the body, Elisseeff explains. They then exposed these cells to UBX0101 for four days. The researchers observed that not only was the number of senescent cells dramatically reduced, but the tissue derived from these patients began forming new cartilage after the elimination of senescent cells. "What was most striking about the results in human tissue is the fact that removal of senescent cells had a profound effect on tissue from very advanced osteoarthritis patients, suggesting that even patients with advanced disease could benefit," says Elisseeff. Although the treatment appears promising, Elisseeff says one limitation in the current study is the short time that UBX0101 remains in the joint. 
However, Unity Biotechnology, which co-developed UBX0101, is working on single-injection formulations. The researchers are hopeful that with further development, UBX0101 may one day offer a one-dose treatment for osteoarthritis. Elisseeff explains, "Because the drug targets and kills the senescent cells directly, once they are eliminated, patients will not need to return for frequent treatments." Prior to this study, Johns Hopkins Technology Ventures (JHTV), the commercialization arm of The Johns Hopkins University, licensed intellectual property around the senescent cell technology to Unity Biotechnology Inc., a company aiming to develop therapeutics that address age-related diseases; both jointly own the patent. "The promising results from this collaboration between Johns Hopkins and Unity showcase how industry and academia can work together to develop innovative therapies," says Neil Veloso, JHTV's executive director of technology transfer. "We are excited that the results from this collaboration may develop into a product that will positively impact people around the world." Other researchers involved in this study include: Ok Hee Jeon, Sona Rathod, Jae Wook Chung and Do Hun Kim from the Johns Hopkins University School of Medicine; Alain P. Vasserot, Yan Poon and Nathaniel David of Unity Biotechnology; Darren J. Baker and Jan M. van Deursen of the Mayo Clinic College of Medicine; Judith Campisi of the Buck Institute for Research on Aging; Chaekyu Kim of the Johns Hopkins University School of Medicine and the Ulsan National Institute of Science and Technology; Remi-Martin Laberge of the Buck Institute for Research on Aging and Unity Biotechnology; and Marco Demaria of the Buck Institute for Research on Aging and the University Medical Center Groningen. This research was supported by Unity Biotechnology, the Morton Goldberg professorship, the National Cancer Institute (R01CA96985), the Paul F. Glenn Foundation, a Fulbright scholarship from the Institute of International Education, and the Bloomberg-Kimmel Institute for Cancer Immunotherapy. Jennifer Elisseeff owns equity in Unity Biotechnology. Johns Hopkins University and Unity Biotechnology own intellectual property related to the research. Ok Hee Jeon, Chaekyu Kim and Jennifer Elisseeff are inventors of JHU intellectual property licensed to Unity. This arrangement has been reviewed and approved by The Johns Hopkins University in accordance with its conflict of interest policies.


News Article | April 19, 2017
Site: www.eurekalert.org

Researchers at Johns Hopkins and George Washington universities report new evidence that proteins created by defective forms of HIV, long believed to be harmless, actually interact with our immune systems and are actively monitored by a specific type of immune cell, called cytotoxic T cells. In a report on the study, conducted on laboratory-grown human cells and published April 12 in the journal Cell Host and Microbe, the investigators say their experiments show that while defective HIV proviruses -- the viral genetic material -- cannot create functional infectious HIVs, a specific subset called "hypermutated" HIV proviruses creates proteins that cytotoxic T cells recognize as HIV. Defective HIV proviruses can outnumber functional HIV by 1,000 copies to one, and the faulty proteins they create can complicate efforts to measure a patient's viral load, exhaust immune systems, shield functional HIV from attack by natural means or drugs, and seriously complicate the development of a cure. Researchers believe that if they can exploit the "hypermutated" form of these proviruses, it could help them eliminate more of the defective HIV proviruses and develop a cure for HIV infection. "The virus has a lot of ways, even in its defective forms, to distract our immune systems, and understanding how they do this is essential in finding a cure," says Ya Chi Ho, M.D., Ph.D., instructor of medicine at the Johns Hopkins University School of Medicine, and the lead study investigator. In the study, the scientists collected nine different defective HIV proviruses from six people infected with HIV, then transfected cultures of human immune cells with them in the laboratory. They grew and tested the transfected cells for markers of HIV proliferation -- such as RNA and proteins -- and found that all of them were capable of creating these components despite their mutations. "The fact that defective proviruses can contribute to viral RNA and protein production is concerning, because it means that the measurements of HIV load in infected patients may not be as accurate as we thought. Part of the count is coming from defective viruses," says Ho. After verifying that defective HIV proviruses created HIV proteins, the researchers then tested whether human immune system cells could biologically recognize and interact with those proteins. The group again transfected cells in the lab with six different types of defective HIV provirus taken from patients. In collaboration with R. Brad Jones, Ph.D., co-first author of the paper and assistant professor of microbiology, immunology and tropical medicine at the George Washington School of Medicine and Health Sciences, Ho's team matched cytotoxic T lymphocytes, the immune cells responsible for recognizing and destroying HIV, from the corresponding patient to the infected cells. The researchers observed that cells containing the "hypermutated" HIV can be recognized by an infected patient's cytotoxic T cells. "If we identify and find a way to use the right protein, perhaps one of those expressed by the "hypermutated" HIV we found in this study, we could create a potent vaccine which could boost the immune system enough to eliminate HIV altogether," says Ho. However, defective HIV proviruses can distract the immune cells from attacking fully infectious normal HIV. "The cytotoxic T lymphocytes' ability to identify and target the real threat appears to be greatly impaired, because they may attack proteins from defective proviruses instead of the real thing," says Ho. 
Ho believes that further information about the mutant proviruses could give scientists the tools to target them, get around them, and create a cure for HIV -- a long-elusive goal for virologists. Other researchers involved in this study include Ross A. Pollack, Mihaela Pertea, Katherine M. Bruner, Alyssa R. Martin, Adam A. Capoferri, Subul A. Beg and Robert F. Siliciano from the Johns Hopkins University School of Medicine; R. Brad Jones, Allison S. Thomas, Szu-Han Huang and Sara Karandish of the George Washington University; Eitan Halper-Stromberg of the University of Colorado; Patrick C. Young of the Icahn School of Medicine at Mount Sinai; Colin Kovacs of the University of Toronto & The Maple Leaf Medical Clinic; and Erika Benko of the Maple Leaf Medical Clinic. This work was supported by the National Institute of Allergy and Infectious Diseases Extramural Activities (1R21AI118402-01, AI096114, 1U1AI096109), the Martin Delaney CARE and DARE Collaboratories, the ARCHE Collaborative Research Grant from the Foundation for AIDS Research Generature Cure initiative, the Johns Hopkins Center for AIDS Research, the W.W. Smith Charitable Trust AIDS Research Grant, a Gilead Sciences HIV Cure Research Grant, the Howard Hughes Medical Institute and the Bill and Melinda Gates Foundation.


News Article | April 28, 2017
Site: www.eurekalert.org

According to a new multicenter study, nearly half of previously employed adult survivors of acute respiratory distress syndrome were jobless one year after hospital discharge, and are estimated to have lost an average of $27,000 in earnings. A summary of the research was published on April 28 in the American Journal of Respiratory and Critical Care Medicine. Acute respiratory distress syndrome (ARDS) is a lung condition often caused by severe infection or trauma and marked by fluid buildup in the lungs' air sacs. The resulting damage leads to a substantial decrease in oxygen reaching the bloodstream and to rapidly developing difficulty with breathing. Patients are usually hospitalized and placed on a life-supporting ventilator. ARDS affects approximately 200,000 Americans every year. ARDS survivors often have long-lasting impairments such as cognitive dysfunction, mental health issues and physical impairments, all of which may affect employment. "This study is important and novel given its comprehensive evaluation of joblessness among almost 400 previously employed ARDS survivors from multiple sites across the U.S.," says Dale Needham, F.C.P.A., M.D., Ph.D., professor of medicine and of physical medicine and rehabilitation at the Johns Hopkins University School of Medicine and senior author of the study. "Multiple studies have suggested that joblessness is common in people who survive ARDS, but to our knowledge, none have carefully tracked those who returned to work or subsequently lost their jobs, performed an in-depth analysis of risk factors for joblessness, and evaluated the impact of joblessness on lost earnings and health care coverage," adds Biren Kamdar, M.D., M.B.A., M.H.S., assistant professor of medicine at the David Geffen School of Medicine at UCLA and the study's first author. One important goal of the research, the scientists say, is to better identify specific risk factors for joblessness and to inform future interventions aimed at reducing joblessness after ARDS. The new study was conducted as part of the ARDS Network Long-Term Outcome Study (ALTOS), a national multicenter prospective study longitudinally evaluating ARDS survivors recruited from 2006 to 2014, including patients from 43 hospitals across the U.S. For the analysis, the investigators recruited 922 survivors and interviewed them by telephone at six months and 12 months after the onset of their ARDS. Each survivor was asked about employment status, hours worked per week, how long before they returned to work following hospital discharge, perceived effectiveness at work and major change in occupation. The research team estimated lost earnings using age- and sex-matched wage data from the U.S. Bureau of Labor Statistics. Individual survivors' matched wages were multiplied by the number of hours worked prior to hospitalization to determine potential earnings and by current hours worked to determine estimated earnings. Estimated lost earnings were calculated as the difference between potential and estimated earnings. Of the 922 survivors, 386 (42 percent) were employed prior to ARDS. The average age of these previously employed survivors was 45 years, 56 percent were male and 4 percent were 65 years or older. Overall, previously employed survivors were younger, predominantly male and had fewer pre-existing health conditions compared with survivors not employed before ARDS. Of the 379 previously employed patients who survived to 12-month follow-up, nearly half (44 percent) were jobless a year after discharge. 
Some 68 percent of survivors eventually returned to work during the 12-month follow-up period, but 24 percent of these survivors subsequently lost their jobs. Throughout the 12-month follow-up, non-retired jobless survivors had an average estimated earnings loss of about $27,000 each, or 60 percent of their pre-ARDS annual earnings. The research team also saw a substantial decline in private health insurance coverage (from 44 to 30 percent) and a rise in Medicare and Medicaid enrollment (33 to 49 percent), with little change in uninsured status. For the 68 percent of ARDS survivors who returned to work by the end of the follow-up year, the median time to return was 13 weeks after discharge. Of those, 43 percent never returned to the number of previous hours worked, 27 percent self-reported reduced effectiveness at work, and 24 percent later lost their jobs. The team found that older, non-white survivors, and those experiencing a longer hospitalization for their ARDS had greater delays in returning to work. Severity of illness and sex, however, did not affect time to return to work. "These results cry out for those in our medical field to investigate occupational rehabilitation strategies and other interventions to address the problem of post-discharge joblessness," Needham says. "Health care providers need to start asking themselves, 'What can we do to help patients regain meaningful employment,' and not just concern ourselves with their survival." "We believe that ARDS survivors are often jobless due to a combination of physical, psychological and cognitive impairments that may result, in part, from a culture of deep sedation and bed rest that plagues many ICUs. Perhaps if we can start rehabilitation very early, while patients are still on life support in the intensive care unit, getting them awake, thinking and moving sooner, this may result in greater cognitive and physical stimulation and improved well-being. This change in culture can occur and is part of regular clinical practice in our medical ICU at The Johns Hopkins Hospital." Other authors on this paper include Minxuan Huang, Victor D. Dinglas and Elizabeth Colantuoni of The Johns Hopkins University, Till M. von Wachter of the University of California at Los Angeles, and Ramona O. Hopkins of Intermountain Medical Center in Utah. Funding for this study is provided by the National Heart, Lung and Blood Institute (N01HR56170, R01HL091760 and 3R01HL091760-02S1), the ARDS Network trials (contracts HHSN268200536165C to HHSN268200536176C and HHSN268200536179C) and the UCLA Clinical and Translational Science Institute (CTSI) (NIH-National Center for Advancing Translational Science (NCATS) UCLA UL1TR000124 & UL1TR001881).
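The lost-earnings estimate described in the study methodology reduces to simple arithmetic. The sketch below uses invented figures (the matched wage, hours and follow-up period are hypothetical, not study data) to show how potential, estimated and lost earnings relate.

# A minimal sketch of the lost-earnings calculation described above; all numbers are illustrative.
matched_hourly_wage = 25.00  # age- and sex-matched wage from BLS data (hypothetical)
hours_before = 40            # weekly hours worked before hospitalization
hours_now = 16               # weekly hours currently worked (0 if jobless)
weeks = 52                   # follow-up period covered by the estimate

potential_earnings = matched_hourly_wage * hours_before * weeks
estimated_earnings = matched_hourly_wage * hours_now * weeks
lost_earnings = potential_earnings - estimated_earnings

print(f"potential: ${potential_earnings:,.0f}  estimated: ${estimated_earnings:,.0f}  lost: ${lost_earnings:,.0f}")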


News Article | April 27, 2017
Site: www.biosciencetechnology.com

Johns Hopkins researchers report that an analysis of survey responses and health records of more than 10,000 American adults for nearly 20 years suggests a “synergistic” link between exercise and good vitamin D levels in reducing the risk of heart attacks and strokes. Both exercise and adequate vitamin D have long been implicated in reducing heart disease risks, but in a new study — one not designed to show cause and effect — the researchers investigated the relationship between these two health factors and their joint role in heart health. Their findings, which were published in the April 1 issue of The Journal of Clinical Endocrinology & Metabolism, identified a positive and direct relationship between exercise and vitamin D levels in the blood, which may provide evidence that exercise may boost vitamin D stores. They also found that the two factors working together seemed to somehow do more than either factor alone to protect the cardiovascular system. The researchers caution that their study is an observational one and that long-term, carefully controlled clinical trials would be needed to establish evidence for cause and effect. Nevertheless, the study does support the notion that exposure to the “sunshine” vitamin D and exercise are indicators of good health. “In our study, both failure to meet the recommended physical activity levels and having vitamin D deficiency were very common” said Erin Michos, M.D., M.H.S., associate director of preventive cardiology and associate professor of medicine at the Ciccarone Center for the Prevention of Heart Disease at the Johns Hopkins University School of Medicine. “The bottom line is we need to encourage people to move more in the name of heart health.” Michos adds that exposure to a few minutes a day of sunlight in non-winter seasons, eating a well-balanced meal that includes oily fish such as salmon, along with fortified foods like cereal and milk, may be enough to provide adequate levels of vitamin D for most adults. For their data analysis, the Johns Hopkins researchers used previously gathered information from the federally funded Atherosclerosis Risk in Communities study beginning in 1987 and collected from 10,342 participants initially free of heart or vascular disease. Information about participants was updated and followed until 2013, and included adults from Forsyth County, North Carolina; Jackson, Mississippi; greater Minneapolis, Minnesota; and Washington County, Maryland. The participants were an average age of 54 at the start of the study and 57 percent were women. Twenty-one percent were African-American, with the remaining participants identifying as white. In the first visit between 1987 and 1989, participants self-reported their exercise levels, which were compared to the American Heart Association recommendations of more than 150 minutes per week of moderate intensity exercise or 75 minutes per week or more of vigorous intensity. The researchers used the information to classify each participant’s exercise level as adequate, intermediate or poor. People with adequate exercise levels met the AHA’s recommendations, those with intermediate levels exercised vigorously for up to 74 minutes per week or exercised moderately for less than 149 minutes a week, and those classified as poor didn’t exercise at all. About 60 percent of the participants had inadequate exercise in the poor or intermediate categories. 
The researchers converted the exercise to metabolic equivalent tasks (METs), an exercise intensity scale used by cardiologists and other clinicians to assess fitness. They then calculated physical activity levels by multiplying METs by minutes per week of exercise. Reviewing data from the second study visit by each participant between 1990 and 1992, the researchers measured vitamin D levels in the blood by detecting the amount of 25-hydroxyvitamin D. Anyone with less than 20 nanograms per milliliter of 25-hydroxyvitamin D was considered deficient for vitamin D, and levels above 20 nanograms per milliliter were considered adequate. Thirty percent of participants had inadequate vitamin D levels. In the first part of their study, the Johns Hopkins team showed that exercise levels positively corresponded to vitamin D levels in a direct relationship, meaning that the more one exercised, the higher their vitamin D levels seemed. For example, people with adequate exercise had an average 25-hydroxyvitamin D level of 26.6 nanograms per milliliter, those with intermediate exercise had 24.4 nanograms per milliliter, and those with poor exercise had 22.7 nanograms per milliliter. Those meeting recommended levels of exercise at visit 1 had a 31 percent lower risk of being vitamin D deficient at visit 2. Yet, the researchers only saw such a positive relationship between exercise and vitamin D in whites and not African-Americans. In the next part of the study, they found that the most active participants with the highest vitamin D levels had the lowest risk for future cardiovascular disease. Over the 19 years of the study, 1800 adverse cardiac events occurred, including heart attack, stroke or death due to heart disease or stroke. After adjusting the data for age, sex, race, education, smoking, alcohol use, blood pressure, diabetes, high blood pressure medication, cholesterol levels, statin use and body mass index, the researchers found that those people who met both the recommended activity levels and had vitamin D levels above 20 nanograms per milliliter experienced about a 23 percent less chance of having an adverse cardiovascular event than those people with poor physical activity who were deficient for vitamin D. On the other hand, people who had adequate exercise but were vitamin D deficient didn’t have a reduced risk of an adverse event. In other words, the combined benefit of having adequate vitamin D and exercise levels was better than either health factor alone. But Michos said that sun exposure may not be the whole story of the direct relationship found between exercise and vitamin D levels, since vitamin D produced by the skin after exposure to sunlight tends to level off when the body makes enough, and the levels in these participants didn’t show signs of doing so.  She said this points to evidence that there may be something else going on in the body that causes vitamin D and exercise to positively influence levels of each other. For example, people who exercise may also have other healthy habits that influence vitamin D levels such as lower body fat and a healthier diet. Alternatively, people who exercise may take more vitamin supplements. As for the racial disparity they saw, this could mean promoting physical activity may not be as effective for raising vitamin D levels in African-Americans as in whites. Michos notes that people with darker skin produce vitamin D less efficiently after sun exposure, possibly due to the greater amount of melanin pigment, which acts as a natural sunscreen. 
African-Americans also tend to have lower levels of 25-hydroxyvitamin D overall but they don’t seem to experience the same consequences, such as bone fractures, that whites have with similarly low levels. Michos cautions that people who meet the recommended amount of 600 to 800 international units a day and who have adequate levels of vitamin D don’t need to take additional vitamin supplements. “More isn’t necessarily better once your blood levels are above 20 nanograms per milliliter,” said Michos. “People who are at risk of bone diseases, have seasonal depression or are obese should have their physicians measure vitamin D levels to ensure they’re adequate, but for many, the best way to ensure adequate blood levels of the vitamin is from sun exposure, healthy diet, being active and maintaining a normal body weight.” She adds, “Just 15 minutes of sunlight in the summer produces about 3,000 international units of vitamin D depending on latitude and skin pigmentation, which is equivalent to 30 glasses of milk. Just be sure to use sunscreen if you plan to be outside longer than 15 minutes.” While the health boost from regular physical activity is undisputed, the benefits of vitamin D supplements haven’t yet been proven for heart health. Michos notes that a recent randomized clinical trial published in JAMA Cardiology failed to show any cardiovascular benefit with high doses of monthly vitamin D supplements among participants living in New Zealand. She said that larger studies including more diverse populations of patients and different dosing regimens are currently ongoing and, when published, will provide further insight and guide recommendations for patients.
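The activity and vitamin D classifications used in the analysis above can be expressed as simple rules. The sketch below is a hypothetical illustration (the MET value and example inputs are invented) of the MET-minutes calculation and the thresholds the article describes: 150 minutes of moderate or 75 minutes of vigorous exercise per week, and 20 nanograms per milliliter of 25-hydroxyvitamin D.

def met_minutes(met: float, minutes_per_week: float) -> float:
    """Physical activity volume: exercise intensity (METs) times weekly minutes."""
    return met * minutes_per_week

def activity_category(moderate_min: float, vigorous_min: float) -> str:
    """Categories used in the article: adequate, intermediate or poor."""
    if moderate_min >= 150 or vigorous_min >= 75:
        return "adequate"
    if moderate_min > 0 or vigorous_min > 0:
        return "intermediate"
    return "poor"

def vitamin_d_status(ng_per_ml: float) -> str:
    """Below 20 ng/mL of 25-hydroxyvitamin D was counted as deficient in the study."""
    return "deficient" if ng_per_ml < 20 else "adequate"

# Example participant: 120 min/week of moderate exercise (assumed 4 METs), 25-hydroxyvitamin D of 24 ng/mL.
print(met_minutes(4.0, 120), activity_category(120, 0), vitamin_d_status(24.0))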


In the study, "Measurement of the Cosmic Optical Background using the Long Range Reconnaissance Imager on New Horizons," lead author Michael Zemcov used archival data from the instrument onboard New Horizons—the Long Range Reconnaissance Imager, or LORRI—to measure visible light from other galaxies. The light shining beyond the Milky Way is known as the cosmic optical background. Zemcov's findings give an upper limit to the amount of light in the cosmic optical background. "Determining how much light comes from all the galaxies beyond our own Milky Way galaxy has been a stubborn challenge in observational astrophysics," said Zemcov, assistant professor in RIT's School of Physics and Astronomy and member of RIT's Center for Detectors and Future Photon Initiative. Light from the cosmic optical background can reveal the number and location of stars, how galaxies work and give insights into the peculiar nature of exotic physical processes, such as light that may be produced when dark matter decays. Dark matter is the invisible substance thought to comprise 85 percent of matter in the universe. "This result shows some of the promise of doing astronomy from the outer solar system," Zemcov said. "What we're seeing is that the optical background is completely consistent with the light from galaxies and we don't see a need for a lot of extra brightness; whereas previous measurements from near the Earth need a lot of extra brightness. The study is proof that this kind of measurement is possible from the outer solar system, and that LORRI is capable of doing it." Spacecraft in the outer solar system give scientists virtual front-row seats for observing the cosmic optical background. The faint light from distant galaxies is hard to see from the inner solar system because it is polluted by the brightness of sunlight reflected off interplanetary dust in the inner solar system. Cosmic dust is sooty bits of rock and small debris that moved, over time, from the outer solar system toward the sun. Scientists launching experiments on sounding rockets and satellites must account for the dust that makes the Earth's atmosphere many times brighter than the cosmic optical background. NASA's New Horizons mission has been funded through 2021, and Zemcov is hopeful for the chance to use the Long Range Reconnaissance Imager to re-measure the brightness of the cosmic optical background. "NASA sends missions to the outer solar system once a decade or so," Zemcov said. "What they send is typically going to planets and the instruments onboard are designed to look at them, not to do astrophysics. Measurements could be designed to optimize this technique while LORRI is still functioning." Zemcov's method harkens back to NASA's first long-distance missions, Pioneer 10 and 11, launched in 1972 and 1973. Light detectors on the instruments measured the brightness of objects outside the Milky Way and made the first direct benchmark of the cosmic optical background. "With a carefully designed survey, we should be able to produce a definitive measurement of the diffuse light in the local universe and a tight constraint on the light from galaxies in the optical wavebands," Zemcov said. Archived data from New Horizons' Long Range Reconnaissance Imager show "the power of LORRI for precise low-foreground measurements of the cosmic optical background," Zemcov wrote in the paper. Chi Nguyen, a Ph.D. student in RIT's astrophysical sciences and technology program, mined data sets from New Horizons' 2006 launch, Jupiter fly-by and cruise phase. 
She isolated four different spots on the sky between Jupiter and Uranus, captured in 2007, 2008 and 2010, that met their criteria: looking away from the solar system and out of the galaxy. Poppy Immel, an undergraduate majoring in math and computer science, generated the data cuts and determined the photometric calibration of the instrument. Other co-authors include Asantha Cooray from the University of California, Irvine; Carey Lisse from Johns Hopkins University; and Andrew Poppe from UC Berkeley. Zemcov is affiliated with the Jet Propulsion Laboratory. More information: Michael Zemcov et al, Measurement of the cosmic optical background using the long range reconnaissance imager on New Horizons, Nature Communications (2017). DOI: 10.1038/ncomms15003
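An upper limit of the kind reported here is, in general, a statistical statement about the combined residual brightness of several fields after known foregrounds are subtracted. The sketch below is purely illustrative and uses invented numbers, not values from the paper; it shows one common way such a limit could be formed from per-field residuals and their uncertainties.

import numpy as np

# Hypothetical residual sky brightness estimates from four fields after
# foreground subtraction, with 1-sigma uncertainties (units: nW m^-2 sr^-1).
resid = np.array([10.0, 2.0, 7.0, 5.0])
sigma = np.array([8.0, 9.0, 7.5, 8.5])

w = 1.0 / sigma**2                    # inverse-variance weights
mean = np.sum(w * resid) / np.sum(w)  # weighted mean residual brightness
err = np.sqrt(1.0 / np.sum(w))        # uncertainty on the weighted mean

upper_limit = mean + 2.0 * err        # a simple 2-sigma upper limit
print(f"mean = {mean:.1f} +/- {err:.1f}; 2-sigma upper limit ~ {upper_limit:.1f} nW m^-2 sr^-1")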


(PR NewsChannel) / May 5, 2017 / Burien, Washington. The International Association of HealthCare Professionals is pleased to welcome Rodney D. Skoglund, MD, Endocrinologist and Internist, to their prestigious organization with his upcoming publication in The Leading Physicians of the World. He is a highly trained and qualified physician with extensive expertise in all facets of his work. Dr. Skoglund has been in practice for more than four decades and is currently serving patients at Three Tree Internal Medicine, located in Burien, Washington. Dr. Skoglund’s career in medicine began in 1976 when he graduated from Johns Hopkins University School of Medicine in Baltimore, Maryland. Upon receiving his Medical Degree, he completed an Internal Medicine residency at Vanderbilt University in Nashville, Tennessee, before undertaking his Endocrinology fellowship at the University of Washington in Seattle. Dr. Skoglund is board certified in Endocrinology by the American Board of Internal Medicine, and maintains a professional membership with the American Medical Association, the American College of Physicians, and the American Association of Clinical Endocrinologists. He attributes his success to having had great teachers, preceptors, and colleagues, and when he is not working, Dr. Skoglund likes to relax by fishing. View Dr. Rodney D. Skoglund’s Profile Here: https://www.findatopdoc.com/doctor/8138292-Rodney-Skoglund-Internist-Burien-Washington-98166 Learn more about Dr. Skoglund here: http://www.3treeim.com/ and be sure to read his upcoming publication in The Leading Physicians of the World. About FindaTopDoc.com FindaTopDoc.com is a hub for all things medicine, featuring detailed descriptions of medical professionals across all areas of expertise, and information on thousands of healthcare topics. Each month, millions of patients use FindaTopDoc to find a doctor nearby and instantly book an appointment online or create a review. FindaTopDoc.com features each doctor’s full professional biography highlighting their achievements, experience, patient reviews and areas of expertise. A leading provider of valuable health information that helps empower patient and doctor alike, FindaTopDoc enables readers to live a happier and healthier life. For more information about FindaTopDoc, visit http://www.findatopdoc.com


LEESBURG, Va., May 01, 2017 (GLOBE NEWSWIRE) -- K2M Group Holdings, Inc. (NASDAQ:KTWO) (the "Company" or "K2M"), a global leader of complex spine and minimally invasive solutions focused on achieving three-dimensional Total Body Balance™, hosted more than 100 international spine surgeons from 22 countries for its annual Meeting of Minds™ in Lisbon, Portugal, from April 28–29, 2017. Meeting of Minds is a premier, world-class curriculum in the latest approaches and techniques for the operative treatment of spinal disorders. The Company also demonstrated its Balance ACS™ (or BACS™) platform, which applies three-dimensional solutions across the entire clinical care continuum to help drive quality outcomes in spine patients. “K2M just concluded a highly successful and educational meeting in Lisbon,” stated Harry Shufflebarger, MD, director of the Division of Spinal Surgery at Nicklaus Children’s Hospital in Miami, Florida, past president of the Scoliosis Research Society, former professor of orthopedic surgery and neurosurgery, and current clinical professor of orthopaedics and rehabilitation at the University of Miami, and a guest lecturer at this year’s Meeting of Minds. “Participants heard world experts lecture and discuss a variety of spinal topics. These include pediatric and adult topics of all etiologies. I rate this as a top-level educational event.” Meeting of Minds—the largest of K2M’s many Medical Education programs—featured more than 60 interactive discussions, case presentations and hands-on demonstrations on key deformity topics in areas as diverse as adult, adolescent and cervical deformity, revision surgery, minimally invasive surgery, proximal junctional kyphosis, as well as transformative solutions for achieving three-dimensional Total Body Balance in spine patients. "Meeting of Minds was a great, well-planned meeting,” stated Anant Tambe, FRCS, MCh, MS, DNB, a scoliosis surgeon at Royal Manchester Children’s Hospital in Manchester, U.K. “The opportunity to listen to senior pioneers in spinal surgery and hear their views made it really worthwhile to be there. It lived up to, and exceeded, all expectations." Meeting of Minds was chaired by distinguished leaders in spine surgery, including Oheneba Boachie-Adjei, MD, FOCOS Orthopaedic Hospital, Accra, Ghana; René Castelein, MD, PhD, University Medical Centre Utrecht, The Netherlands; Martin Gehrchen, MD, PhD, Copenhagen University, Denmark; Kan Min, MD, Swiss Scoliosis, Zurich, Switzerland; and John P. Kostuik, MD, Chief Medical Officer and Co-founder of K2M, and professor emeritus at Johns Hopkins University. The faculty also featured 14 additional leaders in spine surgery to meet the increasing global demand for comprehensive deformity education. “K2M is proud to have welcomed some of the world’s top spine surgeons to our annual Meeting of Minds, which featured the latest clinical solutions and advancements in spine surgery,” said Dr. Kostuik. “At this year’s meeting, we also showcased BACS—a comprehensive platform of products, services and research—allowing surgeons to holistically manage the patient experience across the entire episodic care continuum. 
We are pleased to offer medical education curriculums, coupled with our comprehensive BACS platform, with the goal of transforming spine surgery and improving the lives of patients around the world.” BACS provides solutions focused on achieving balance of the spine by addressing each anatomical vertebral segment with a 360-degree approach to the axial, coronal, and sagittal planes, emphasizing Total Body Balance as an important component of surgical success. For more information on K2M’s comprehensive Medical Education offering, as well as K2M's complete product portfolio, visit www.K2M.com. For more information on Balance ACS, visit www.BACS.com. K2M Group Holdings, Inc. is a global leader of complex spine and minimally invasive solutions focused on achieving three-dimensional Total Body Balance. Since its inception, K2M has designed, developed, and commercialized innovative complex spine and minimally invasive spine technologies and techniques used by spine surgeons to treat some of the most complicated spinal pathologies. K2M has leveraged these core competencies into Balance ACS, a platform of products, services, and research to help surgeons achieve three-dimensional spinal balance across the axial, coronal, and sagittal planes, with the goal of supporting the full continuum of care to facilitate quality patient outcomes. The Balance ACS platform, in combination with the Company's technologies, techniques, and leadership in the 3D-printing of spinal devices, enables K2M to compete favorably in the global spinal surgery market. For more information, visit www.K2M.com and connect with us on Facebook, Twitter, Instagram, LinkedIn, and YouTube. This press release contains forward-looking statements that reflect current views with respect to, among other things, operations and financial performance. Forward-looking statements include all statements that are not historical facts, such as our statements about our expected financial results and guidance and our expectations for future business prospects. In some cases, you can identify these forward-looking statements by the use of words such as “outlook,” “guidance,” “believes,” “expects,” “potential,” “continues,” “may,” “will,” “should,” “could,” “seeks,” “predicts,” “intends,” “plans,” “estimates,” “anticipates” or the negative version of these words or other comparable words.
Such forward-looking statements are subject to various risks and uncertainties including, among other things: our ability to achieve or sustain profitability in the future; our ability to demonstrate to spine surgeons the merits of our products; pricing pressures and our ability to compete effectively generally; collaboration and consolidation in hospital purchasing; inadequate coverage and reimbursement for our products from third-party payors; lack of long-term clinical data supporting the safety and efficacy of our products; dependence on a limited number of third-party suppliers; our ability to maintain and expand our network of direct sales employees, independent sales agencies and international distributors and their level of sales or distribution activity with respect to our products; proliferation of physician-owned distributorships in our industry; decline in the sale of certain key products; loss of key personnel; our ability to enhance our product offerings through research and development; our ability to manage expected growth; our ability to successfully acquire or invest in new or complementary businesses, products or technologies; our ability to educate surgeons on the safe and appropriate use of our products; costs associated with high levels of inventory; impairment of our goodwill and intangible assets; disruptions in our main facility or information technology systems;  our ability to ship a sufficient number of our products to meet demand; our ability to strengthen our brand; fluctuations in insurance cost and availability; our ability to comply with extensive governmental regulation within the United States and foreign jurisdictions; our ability  to maintain or obtain regulatory approvals and clearances within the United States and foreign jurisdictions; voluntary corrective actions by us or our distribution or other business partners or agency enforcement actions; recalls or serious safety issues with our products; enforcement actions by regulatory agencies for improper marketing or promotion; misuse or off-label use of our products; delays or failures in clinical trials and results of clinical trials; legal restrictions on our procurement, use, processing, manufacturing or distribution of allograft bone tissue; negative publicity concerning methods of tissue recovery and screening of donor tissue; costs and liabilities relating to environmental laws and regulations;  our failure or the failure of our agents to comply with fraud and abuse laws; U.S. 
legislative or Food and Drug Administration regulatory reforms; adverse effects of medical device tax provisions; potential tax changes in jurisdictions in which we conduct business; our ability to generate significant sales; potential fluctuations in sales volumes and our results of operations over the course of the year; uncertainty in future capital needs and availability of capital to meet our needs; our level of indebtedness and the availability of borrowings under our credit facility; restrictive covenants and the impact of other provisions in the indenture governing our convertible  senior notes and our credit facility;  continuing worldwide economic instability; our ability to protect our intellectual property rights; patent litigation and product liability lawsuits; damages relating to trade secrets or non-competition or non-solicitation agreements; risks associated with operating internationally; fluctuations in foreign currency exchange rates; our ability to comply with the Foreign Corrupt Practices Act and similar laws; increased costs and additional regulations and requirements as a result of being a public company; our ability to implement and maintain effective internal control over financial reporting; potential volatility in our stock due to sales of additional shares by our pre-IPO owners or otherwise; our lack of current plans to pay cash dividends; our ability to take advantage of certain reduced disclosure requirements and exemptions as a result of being an emerging growth company; potential dilution by the future issuances of additional common stock in connection with our incentive plans, acquisitions or otherwise; anti-takeover provisions in our organizational documents and our ability to issue preferred stock without shareholder approval; potential limits on our ability to use our net operating loss carryforwards; and other risks and uncertainties, including those described under the section entitled “Risk Factors” in our most recent Annual Report on Form 10-K filed with the SEC, as such factors may be updated from time to time in our periodic filings with the SEC, which are accessible on the SEC’s website at www.sec.gov.  Accordingly, there are or will be important factors that could cause actual outcomes or results to differ materially from those indicated in these statements.  These factors should not be construed as exhaustive and should be read in conjunction with the other cautionary statements that are included in this release and our filings with the SEC. We operate in a very competitive and challenging environment.  New risks and uncertainties emerge from time to time, and it is not possible for us to predict all risks and uncertainties that could have an impact on the forward-looking statements contained in this release.  We cannot assure you that the results, events and circumstances reflected in the forward-looking statements will be achieved or occur, and actual results, events or circumstances could differ materially from those described in the forward-looking statements. The forward-looking statements made in this press release relate only to events as of the date on which the statements are made.  We undertake no obligation to publicly update or review any forward-looking statement, whether as a result of new information, future developments or otherwise, except as required by law.  
We may not actually achieve the plans, intentions or expectations disclosed in our forward-looking statements and you should not place undue reliance on our forward-looking statements. Unless specifically stated otherwise, our forward-looking statements do not reflect the potential impact of any future acquisitions, mergers, dispositions, joint ventures, investments or other strategic transactions we may make.


Dr. Mansikka brings significant and relevant experience to Chromocell, having served as medical director at AbbVie across multiple therapeutic areas, including pain management. In his role at AbbVie, Dr. Mansikka provided strategic oversight for multiple preclinical-stage, small molecule compounds across the spectrum of pain management. This work enabled the advancement of multiple candidates into clinical testing. Dr. Mansikka also led cross-functional development, evaluation, and clinical validation of various translational tools supporting the Company's pain portfolio. Dr. Mansikka was also responsible for leading clinical development of multiple assets targeting immune-mediated inflammatory diseases. Before his tenure at AbbVie, Dr. Mansikka held positions of increasing responsibility at several biopharmaceutical companies, including Pfizer, Mundipharma Research Limited, and Grünenthal. In these roles, he made impactful contributions across many functions, including clinical research, regulatory, and commercial, spanning multiple therapeutic areas. Dr. Mansikka has experience across all phases of drug development, from translational medicine through Phase III, and has led clinical development programs through approval. Dr. Mansikka has specialist training in Anesthesiology and Pain Management and has authored more than 30 publications in the areas of pain and inflammation research. He earned his M.D. and Ph.D. from the University of Helsinki, Finland, and completed his post-doctoral research at Johns Hopkins University. About CC8464/ASP1807 Chromocell's lead compound, CC8464 (Astellas' Development Code: ASP1807), is an oral, potent, highly selective, peripherally restricted inhibitor of NaV1.7, which is believed to be efficacious in the treatment of neuropathic and inflammatory pain. NaV1.7 is an ion channel involved in pain transmission. CC8464 was developed using Chromocell's proprietary drug discovery platform, Chromovert®. This technology enables the company to identify rare cells suited for effective high-throughput screening, resulting in the discovery of promising drug candidates. About Chromocell Corporation Chromocell is a life sciences company that improves consumer products and patient lives through breakthrough science and technologies. Chromocell is focused on the discovery and development of therapeutics and flavors through the use of pioneering Chromovert® technology. Chromovert® technology enables Chromocell to use rare cells ideally suited for effective high-throughput screening. Chromocell's therapeutics pipeline is currently focused on analgesics and rare diseases, where Chromovert® technology has proven highly effective in the rapid identification of potential new drug candidates, as well as in the discovery and development of novel flavor ingredients and natural taste enhancers. For more information, please visit our website at www.chromocell.com. To view the original version on PR Newswire, visit:http://www.prnewswire.com/news-releases/chromocell-appoints-heikki-mansikka-md-phd-vice-president-of-clinical-development-300452999.html


News Article | April 18, 2017
Site: www.prnewswire.com

The conference will also feature video messages from former public officials including Paul Volcker and James Baker, as well as from Lewis Lehrman, and a letter from Dr. Mundell. Featured speakers include Dr. Judy Shelton, former IMF economist Dr. Warren Coats, Dr. Benn Steil, Dr. Brian Domitrovic, Mr. John Mueller, and Dr. Xiang Songzuo of China. The event's co-chairmen are Mr. James Kemp of the Jack Kemp Foundation and Dr. Steve Hanke of Johns Hopkins University. The event is inspired by a 1983 conference on exchange rates, hosted by Jack Kemp and Dr. Mundell, that helped lay the groundwork for the Plaza Accord of 1985. For the full conference lineup, see our website here. Register here. Contact: Sean Rushton, director, Project on Exchange Rates and the Dollar, srushton@jackkempfoundation.org / (202) 487-6439 To view the original version on PR Newswire, visit:http://www.prnewswire.com/news-releases/jack-kemp-foundation-to-host-forum-on-exchange-rates-and-the-dollar-300441180.html


News Article | May 1, 2017
Site: www.prweb.com

Dr. Farooq Ashraf, medical director of the Atlanta Vision Institute, is pleased to announce that starting May 1, he will be offering the SMILE procedure to patients wishing to reduce or eliminate myopia (nearsightedness). SMILE – short for Small Incision Lenticule Extraction – is an innovative new method of performing laser eye surgery that combines the benefits of PRK (photorefractive keratectomy) and LASIK (laser-assisted in-situ keratomileusis). Like LASIK, SMILE surgery is bladeless; however, it requires no laser tissue ablation. Like PRK, SMILE is “flapless” (meaning no large flap is created), which contributes to quicker healing and may make it a viable option for patients who have thin corneas and cannot undergo LASIK. “Being both bladeless and flapless, SMILE has the potential of being a safer procedure than LASIK, with less pain and discomfort than either PRK or LASIK,” Dr. Ashraf says. SMILE was developed by Zeiss, a German manufacturer of optical systems and lasers. This technology was approved for use in the United States by the Food and Drug Administration in September 2016. The Atlanta Vision Institute is among the first eye clinics in the U.S. and the first in Atlanta to perform this procedure commercially. Dr. Ashraf has experience performing SMILE overseas. He is the medical director of the Atlanta Vision Clinic in Dubai, UAE. Dr. Ashraf introduced SMILE to his patients there when the UAE approved the technology many years ago. During the SMILE procedure, computer-guided, highly focused laser light is used to create a lenticule (a disc-shaped piece of tissue within the cornea), which is then extracted through a tiny keyhole incision. This removal of tissue reshapes the cornea, correcting the nearsightedness. SMILE surgery takes less than 5 minutes, with minimal recovery time afterwards; most patients are able to resume regular activities within 24 hours after treatment. “SMILE is a groundbreaking milestone in laser vision correction for people who have always wanted freedom from eye glasses and contact lenses, but may have been scared of the LASIK procedure,” Dr. Ashraf says. “SMILE adds an additional treatment modality and I am very pleased to be able to offer this technology to my patients in Atlanta.” To learn more about SMILE bladeless, flapless laser vision correction, or to schedule a consultation with Dr. Ashraf, please call the Atlanta Vision Institute at (770) 622-2488 or visit the Atlanta Vision Institute’s website at https://www.atlanta2020.com/. About the Atlanta Vision Institute: Dr. Ashraf is the founder of the Atlanta Vision Institute and is a board-certified ophthalmologist who specializes in corneal and refractive surgery, as well as other treatments for astigmatism, glaucoma, cataracts and other eye conditions. He obtained his advanced training in ocular surgery at Johns Hopkins University and has performed over 40,000 LASIK procedures. In addition to his Atlanta practice, Dr. Ashraf has also founded the Atlanta Vision Clinic in Dubai, UAE. For more information, visit https://www.atlanta2020.com/.


"Car crashes remain the number one killer of teens[i]. Certainly distractions—including smartphones, infotainment systems built right into the car and even peer passengers—are risks teen drivers need to avoid," said Deborah A.P. Hersman, president and CEO of the National Safety Council. "But it all boils down to inexperience. One of the best things parents can do is to stay involved and help their teen build the experience needed to become a safer driver." The survey also found that 60 percent of teens describe driving as somewhat or very stressful. By staying involved and helping teens become more experienced and confident drivers, parents can help alleviate some of that stress. Here are some key tips for parents: The National Safety Council encourages parents with new teen drivers to use resources from DriveitHOME.org to help them become effective driving coaches. DriveitHOME.org includes tips, driving lessons and a New Driver Deal that parents and teens can use to outline household driving rules. Parents also can see all the risks their new teen drivers face, including drowsy driving. Finally, DriveitHOME's new monitoring technology page can help guide parents on the best options for extending their involvement, even when they can't be in the passenger seat. About the National Safety Council Founded in 1913 and chartered by Congress, the National Safety Council, nsc.org, is a nonprofit organization whose mission is to save lives by preventing injuries and deaths at work, in homes and communities, and on the road through leadership, research, education and advocacy. NSC advances this mission by partnering with businesses, government agencies, elected officials and the public in areas where we can make the most impact – distracted driving, teen driving, workplace safety, prescription drug overdoses and Safe Communities. [i] According to NSC Injury Facts 2017 [ii] According to Johns Hopkins University School of Public Health [iii] According to an NSC survey [iv] According to an NSC survey To view the original version on PR Newswire, visit:http://www.prnewswire.com/news-releases/national-safety-council-encourages-parent-education-during-prom-graduation-season-300451052.html


News Article | April 17, 2017
Site: co.newswire.com

" Leadership development begins with a support system, that helps all team members reach their potential, focusing on their gifts, talents and capabilities. The purpose is not exploitation, but functional benefit for the mission of the team. This requires a fine balance between the need for tunnel vision during execution of a mission and capabilities that support stability, health, happiness and prosperity in the bigger picture of life. Though paradoxical, the objective is a team of leaders."  -- Stephen M. Apatow. From "Living On The Edge" to being the "Cutting Edge" In 1994, a small nonprofit organization named Humanitarian Resource Institute (HRI), was formed in Carson City, Nevada.  The mission was to address the cross section of needs defined during two national touch outreach projects, the first for substance abuse in 1990, and second for hunger, homelessness and poverty in 1993.  HRI's first project was named Focus On America.  Through the assistance of the Federal Emergency Management Agency (FEMA) and Emergency Food and Shelter National Board Program (EFSNBP), the mission was to take lessons learned, and "bridge unmet needs to untapped resources."   This project reached front-line programs and EFSNBP directors in over 3100 U.S. counties, all 50 states and territories.  In 1999, the successful completion of United States networks, led to the development the International Disaster Information Network (IDIN), to assist FEMA with remediation for the Year 2000 Conversion, and then complex emergencies in 193 UN member countries. Formation of the Humanitarian University Consortium in 2002, helped connect subject matter experts at colleges and universities, public, private and defense organizations in every UN member country.  Through this consortium initiative, the worlds top reference points in medicine, veterinary medicine and law helped HRI be a global reference point for health care, education, agricultural and economic development. Shortly thereafter, HRI was recognized as one of nine leading educational and research institutions by the National Academy of Sciences, with the Center for Nonproliferation Studies, Columbia University: Center for Public Health Preparedness, Harvard University John F. Kennedy School of Government: Belfer Center for Science and International Affairs, Humanitarian Resource Institute, Johns Hopkins University: Center for Civilian Biodefense Studies, Massachusetts Institute of Technology: Center for International Studies, National Academy of Sciences, University of Maryland: Center for International and Security Studies at Maryland,  University of Minnesota: Center for Infectious Disease Research and Policy. -- See:  Biological Threats and Terrorism, Assessing the Science and Response Capabilities: Workshop Summary:  Forum on Emerging Infections, Board on Global Health. "Front Matter, " Washington, DC: The National Academies Press, 2002.   In 2009, HRI formed the United Nations Arts Initiative to promote "Arts Integration Into Education," connecting educators, artists and entertainment industry, who have the innovation, creativity and intimate connection with the grassroots level, to impact prioritized humanitarian emergencies and relief operations. The United Nations Arts Initiative helps both artist and grassroots leaders with strategic planning, critical analysis, expert think tank development for background discussions, peer reviewed data compilation and communications that engage decision makers and audiences in a target demographic. 
In 2011, H-II OPSEC Expeditionary Operations was developed to assist with defense support for humanitarian and security emergencies that are currently beyond the capabilities of governmental, UN, NGO and relief organizations. Though it has functioned outside the mainstream spotlight for 23 years, the Humanitarian Resource Institute has been a reference point for unconventional, asymmetric strategic planning. Today, Stephen M. Apatow, President and Director of Research and Development for HRI, is focused on helping young leaders and executive leadership teams understand how to operate in complex environments and strategic areas viewed as critical to the CEO level of operations.  Lead from the Front: Development Programs help the CEO level break down walls and barriers, establishing a focus on optimization of the mission objective, through:


News Article | April 21, 2017
Site: news.yahoo.com

US President Donald Trump's budget proposals include slashing funding for the National Institutes of Health and eliminating one third of the staff at the Environmental Protection Agency (AFP Photo/JIM WATSON) Miami (AFP) - Budget cuts and political assaults on science are expected to draw thousands of demonstrators to the streets in more than 500 cities worldwide Saturday for the first March for Science. Organizers insist that the demonstrations -- anchored by a major rally in Washington on Earth Day, April 22 -- are not aimed specifically at US President Donald Trump or any political party. Rather, they say, the goal is to defend the vital role of evidence and scientific research when formulating public policies, and to speak out against travel restrictions that prevent the free flow of information and expertise. "The organizers of the march have taken great pains to say this is not partisan, it is not about any particular public official or political figure," Rush Holt, a physicist and former US congressman, told reporters on a conference call, noting that scientists are "often reticent" to wade into the political fray. "For years now, going back far before the election of last fall, there has been a concern among scientists and friends of science that evidence has been crowded out by ideology and opinion in public debate and policy making." Holt, who heads the American Association for the Advancement of Science, described the trend as "appalling" and said it has driven anxiety to new heights. The idea for the science-specific march arose during the Women's March on January 21, which drew more than two million protesters into the streets worldwide in support of human rights, he said. "Scientists started breaking out spontaneously in the Women's March," and several of them connected on social media to forge the plans for a demonstration in support of science, he said. In the months and years prior to the 2016 election, Trump declared climate change a hoax perpetrated by the Chinese, but since taking office he has delivered mixed messages regarding his views on global warming. He has signed, however, an executive order to roll back environmental protections enacted by his predecessor Barack Obama, and has nominated climate-science skeptics to top posts in his administration. Trump has also kept people guessing on whether or not the United States will remain committed to the Paris Climate Accord of 2015, which called for curbing fossil-fuel emissions. Media reports have pointed to deep divisions within his administration on the matter and no announcement is expected before May. One of Trump's most alarming moves, according to many scientists, was his budget proposal -- yet to be approved by lawmakers -- that would slash funding for the National Institutes of Health and would eliminate one third of the staff at the Environmental Protection Agency while boosting spending on the military. This decrease in research funding could "prevent an entire new generation of scientists from ever getting started," said Nobel laureate Carol Greider, professor of molecular biology at Johns Hopkins University. Lydia Villa-Komaroff, a molecular cellular biologist and honorary national co-chair of the March for Science, said the problem is not new, and that federal support for research has been declining since the 1960s. "I think it is fair to say that this administration catalyzed the happening of this march, there is no doubt about that," she told reporters. "But it is nonpartisan. 
It is aimed not only at both sides of the aisle, where there are people who are dismissing the use of evidence in decisions and policy, but at the public at large where there seems to have become this disconnect between what science is and its value to society." A 2009 Pew Research poll found that most scientists identify as Democrats (55 percent), while 32 percent said they are independent and six percent claimed to be Republican. Celebrities set to appear at the March for Science in Washington will include the musician Questlove of the hip-hop group The Roots, and television personality Bill Nye the Science Guy, who currently heads the Planetary Society. The US capital rally begins Saturday at 8:00 am (1200 GMT), and will be capped with a march from the National Mall to the Capitol at 2:00 pm. More than 500 satellite marches are planned across the United States and worldwide, including in Australia, Brazil, Canada, many nations in Europe, Japan, Mexico, Nepal, Nigeria and South Korea.


News Article | April 12, 2017
Site: www.techtimes.com

NASA is holding a major press conference this week to reveal its latest results on ocean worlds in our own solar system. At the mysterious event, the U.S. space agency will disclose findings on its “broader search for life beyond Earth,” as well as things that could potentially affect “future ocean world exploration.” The announcements, to be given in a news briefing at 2 p.m. EDT this Thursday, April 13, will relate to findings from its Cassini spacecraft as well as the Hubble Space Telescope. They will be made at the NASA headquarters’ James Webb Auditorium in Washington, with remote participation from experts across the United States. “These new discoveries will help inform future ocean world exploration — including NASA’s upcoming Europa Clipper mission planned for launch in the 2020s — and the broader search for life beyond Earth,” NASA said in a statement. Participants in the media briefing will include Thomas Zurbuchen, associate administrator for Science Mission Directorate; Jim Green, Planetary Science Division director; and astrobiology senior scientist Mary Voytek. The panel and remotely positioned experts will entertain questions during the event, while the public can also ask their own questions using the hashtag #AskNASA. The Cassini spacecraft, which launched in 1997 and arrived in the Saturn system in 2004, is bound to end its two-decade mission on Sept. 15 with a planned probe kill. In a final maneuver, Cassini, currently facing a fuel crunch, will be set on a collision course with Saturn’s atmosphere and is expected to “break apart, melt, vaporize, and become a part of the very planet it left Earth 20 years ago to explore,” according to project manager Earl Maize. This crash has been planned by the agency to avoid contamination of a nearby moon hoped to potentially harbor alien life. Afterward, NASA’s planned Europa Clipper will position a spacecraft in orbit around Jupiter to perform a detailed probe of moon Europa. The giant planet’s moon has exhibited strong proof of an ocean of liquid water, situated beneath its icy crust, which could be hospitable to life. Cassini has been deemed generally productive given the high-value investigation of Saturn, its rings, and moons. It transmitted, for instance, images that highlighted Enceladus’ geysers with a hint of an ocean underneath, as well as the Earth-like moon Titan. Other NASA missions are marking their own milestones. Just recently, the New Horizons probe successfully reached the halfway point between Pluto and its second target for flyby, the remote Kuiper Belt object 2014 MU69. The probe reached this milestone on April 3 at midnight UTC (April 2, 8 p.m. ET), at a distance of 486.19 million miles from Pluto and the same distance to the remote asteroid. New Horizons eyes swooping past the object, located about 1 billion miles beyond Pluto, on Jan. 1, 2019 — another record for space exploration. New Horizons reached Pluto in July 2015 after its launch from Cape Canaveral in Florida back in January 2006. As Pluto’s first guest from Earth, it is currently 3.5 billion miles from our planet, taking radio signals five hours and 20 minutes to get from the control center at Johns Hopkins University in Maryland to the spacecraft. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.
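As a quick sanity check on the signal-delay figure quoted above, the one-way light time can be recomputed from the article's rounded 3.5-billion-mile distance. The short Python sketch below is an illustrative aside, not part of any NASA tooling; its only inputs are the rounded distance and the speed of light in miles per second.

```python
# Back-of-the-envelope check of the New Horizons signal delay quoted above.
# Assumes the article's rounded distance of 3.5 billion miles; the result
# (about 5 h 13 min) is consistent with the quoted "five hours and 20 minutes"
# once the spacecraft's unrounded distance is used.
SPEED_OF_LIGHT_MILES_PER_S = 186_282

distance_miles = 3.5e9  # New Horizons' approximate distance from Earth

one_way_seconds = distance_miles / SPEED_OF_LIGHT_MILES_PER_S
hours, remainder = divmod(one_way_seconds, 3600)
minutes = remainder / 60
print(f"One-way light time: {int(hours)} h {minutes:.0f} min")
```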


News Article | April 17, 2017
Site: www.prweb.com

Today, the Arnold and Mabel Beckman Foundation announced a $12.5 million investment in cryogenic electron microscopes at five leading research universities throughout the United States. The investment underscores the Foundation’s mission of supporting research breakthroughs in chemistry and the life sciences, and will go toward installing state-of-the-art Cryo-EM instrumentation at Johns Hopkins University School of Medicine, Massachusetts Institute of Technology, Perelman School of Medicine at the University of Pennsylvania, University of Utah and University of Washington School of Medicine, which were selected based on their potential to accelerate fundamental research and discovery already underway. Cryo-EM microscopes have generated excitement in the field of structural biology because of their ability to reveal an unprecedented level of detail of molecules, better enabling scientists to conduct advanced research and address important biological issues. The Foundation is eager to support Cryo-EM initiatives at some of the nation’s foremost research universities and increase scientists’ access to these leading-edge instruments. “The Beckman Foundation recognizes that Cryo-EM has potential to transform the structural biology research community,” explained Dr. Anne Hultgren, Executive Director of the Beckman Foundation. “While the expense can make acquiring this technology via federal grants prohibitive, we as a private foundation are in a unique position to support major infrastructure investments to enable broader deployment of this new tool and increase access for young scientists to this exciting field of study.” The funds will be provided to the universities this spring, allowing for microscope installation by Fall 2018. About the Arnold and Mabel Beckman Foundation Located in Irvine, California, the Arnold and Mabel Beckman Foundation supports researchers and nonprofit research institutions in making the next generation of breakthroughs in chemistry and the life sciences. Founded in 1977 by 20th century scientific instrumentation pioneer Dr. Arnold O. Beckman, the Foundation supports institutions and young scientists whose creative, high-risk, and interdisciplinary research will lead to innovations and new tools and methods for scientific discovery. For more information, visit http://www.beckman-foundation.org/.


For the first time, researchers find that a majority of cancerous mutations result from random errors in DNA replication, while smaller shares are caused by environmental factors, lifestyle choices, and inherited genes, according to a numerical model based on DNA sequencing and epidemiological data. But this does not mean your lifestyle no longer affects your chances of getting sick, experts asserted. In general, 66 percent of mutations resulted from random errors as cells replaced themselves, the research from a Johns Hopkins University team noted. Environmental factors, on the other hand, form 29 percent, and the remaining 5 percent are attributed to genetics. Researchers Dr. Bert Vogelstein and Cristian Tomasetti previously proposed that the risk of cancer development is primarily anchored in random DNA mistakes occurring when self-renewing cells are in the process of division. Their new paper details the prevalent role of “chance” in the disease. In a press briefing, Vogelstein explained that a completely normal cell could make several errors or mutations as it divides. “Now most of the time, these mutations don't do any harm … That's the usual situation and that's good luck,” he said in a CNN report. “Bad luck” occurs when one of these random mistakes takes place in a cancer-driving gene, he added, noting that this finding might be comforting for people with cancer in their family history. Even mutations from environmental or lifestyle factors can be quite random as well, Tomasetti added. Smoking, for instance, causes more DNA mutations than normal, but the location of the DNA defect on a smoker’s genome is also accidental. But there remains a strong case for making smart lifestyle choices to prevent the disease. According to Tomasetti, a single mutation is not enough to lead to cancer. Typically there should be three or more, and factors such as poor diet, obesity, lack of exercise, and smoking can supply the needed gene defect that brings the body to a diseased state. American Cancer Society chief medical officer Dr. Otis Brawley favored this new paper over the 2015 one, which caused a stir and elicited hundreds of responses in the community. "And it really upset the anti-smoking people, it upset the folks who are in the nutrition and physical activity for cancer prevention - he really upset the prevention crowd," Brawley said, believing the current paper explains the theory better. Experts therefore continue to recommend the usual precautions. Don't use tobacco. Using any type of this substance can set one off on a collision course with cancer, including that of the lung, mouth, throat, and pancreas. Consume a healthy diet, which includes eating plenty of fruits and vegetables, maintaining a healthy weight, drinking alcohol in moderation, and limiting intake of processed meats. Stay physically active, which directly assists in weight management and might lower the risk of cancer of the breast, prostate, colon, lung, and kidney. Get sun protection, including avoiding midday sun, staying in the shade, and avoiding tanning beds and sunlamps. Get immunized against certain viral infections, from hepatitis B to the human papillomavirus (HPV). Avoid risky behaviors. Practice safe sex and do not share needles, which can lead to HIV, hepatitis B, and hepatitis C. These conditions can up the risk of liver cancer. Get regular medical care. Undergo regular self-exams and screenings, from skin and breast to colon and cervix health. The study is discussed in the journal Science and has an accompanying editorial penned by Harvard professor Martin A. Nowak and University of Edinburgh research fellow Bartlomiej Waclaw. Recently, researchers developed a blood test that can detect cancer and identify its location in the body. The test involves a computer program and works by analyzing the amount of tumor DNA circulating in a person’s blood. © 2017 Tech Times, All rights reserved. Do not reproduce without permission.
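To make the "bad luck" reasoning above concrete, the toy simulation below sketches how random copying errors alone can occasionally supply the roughly three driver mutations the article says a cell typically needs, and how an added environmental mutation rate raises that chance. This is a hypothetical illustration with invented rates, not the Johns Hopkins team's actual model or parameters.

```python
import random

# Toy Monte Carlo illustration (NOT the published Tomasetti-Vogelstein model).
# Each simulated cell lineage undergoes a fixed number of divisions, and every
# division has a small chance of producing a "driver" mutation, either from a
# random copying error or from an added environmental exposure. A lineage is
# counted as cancerous once it accumulates the ~3 drivers cited in the article.
# All rates below are invented purely for illustration.
DRIVER_THRESHOLD = 3        # drivers typically needed, per the article
DIVISIONS = 5_000           # assumed lifetime divisions in one lineage
P_RANDOM_DRIVER = 1e-4      # assumed chance of a random driver per division


def fraction_cancerous(p_environmental: float, trials: int = 5_000) -> float:
    """Estimate the fraction of lineages reaching DRIVER_THRESHOLD drivers."""
    hits = 0
    for _ in range(trials):
        drivers = 0
        for _ in range(DIVISIONS):
            if random.random() < P_RANDOM_DRIVER + p_environmental:
                drivers += 1
                if drivers >= DRIVER_THRESHOLD:
                    hits += 1
                    break
    return hits / trials


if __name__ == "__main__":
    print("random copying errors only:  ", fraction_cancerous(0.0))
    print("plus environmental exposure: ", fraction_cancerous(5e-5))
```

With these invented rates the second figure comes out a few times higher than the first, echoing the article's point that chance dominates the mutation count while lifestyle choices still shift the odds.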


News Article | April 26, 2017
Site: www.businesswire.com

WESTPORT, Conn.--(BUSINESS WIRE)--PreScience Labs, LLC (“PreScience”) announced today that it has closed on an initial tranche of its first institutional round of financing. The money raised will support the Company’s pursuit of an Investigational New Drug (“IND”) application from the U.S. Food and Drug Administration (“FDA”) for evaluation of its new reformulated, systemically delivered anti-cancer drug known as PS-102. Camden Partners is the lead investor of this $2 million Series AA Round. Proceeds of this round will fund the Company to a Phase I clinical study. This current round of financing is the initial stage of a larger and focused effort to accelerate PreScience’s development. Commenting on these events, Jeff Geschwind, MD, Founder and CEO of PreScience, stated, “Over the past several years, we have been dedicated to pursue a patentable systemic formulation of our core technology. This infusion of capital not only provides the needed resources to proceed with this development, but also validates our approach.” Over the next few months, PreScience will work closely with Camden Partners’ lead investor in early-stage biomedical companies, R. Jacob Vogelstein, PhD. Commenting on Camden’s involvement with this program, Dr. Vogelstein added, “We are thrilled to be working with such cutting-edge science, a tremendous scientific team, and look forward to adding value in the form of resources and strategic insight. As PreScience gains momentum and fills additional key executive positions, the Company will better position itself for success, and we are excited by these next steps.” PreScience Labs is a bio-pharmaceutical company focused on the development of new drugs targeting the metabolism of tumors. Dr. Jeff Geschwind, Professor of Radiology and Oncology at Yale School of Medicine, founded the Company in 2008 to develop cancer therapies based on novel formulations of halopyruvates. The Company’s newest lead compound, PS-102, is a novel formulation of a particular halopyruvate called 3-bromopyruvate. PreScience has tested 3-bromopyruvate in numerous animal models and in limited human trials (through compassionate use protocols), and it has shown very high efficacy and tolerability. The data suggests that PS-102 will be effective against most, if not all, types of cancers, including pancreatic, lung, and breast cancer, because it attacks a fundamental component of the metabolic pathway of tumors. The Company already holds an IND for one formulation of 3-bromopyruvate (PS-101), and anticipates submitting a new IND application for PS-102 in 2017, with enrollment of a Phase I study targeting pancreatic cancer in 2018. PS-102 was created at the Johns Hopkins University School of Medicine, and PreScience holds a full and exclusive license to the associated patent portfolio. PS-102 is one of a new class of drugs that targets the tumor glycolysis pathway. This pathway is a signature of cancer cells and is considered one of the hallmarks of cancer. Tumor glycolysis has been exploited for diagnostic purposes (PET imaging) and is now being explored for therapeutic intervention. One of the key enzymes in tumor glycolysis, GAPDH, is the primary target of PS-102. PS-102 irreversibly binds to GAPDH, resulting in a multi-prong assault on cancer cells, ultimately leading to their death. 
The predominant effect of this interaction is the profound depletion of ATP, depriving the cancer cells of any energy. Because glycolysis is the dominant metabolic pathway in cancer cells, those cells are acutely sensitive to any disruption of that pathway. In addition, because normal cells do not rely on glycolysis, but rather on oxidative phosphorylation for their energy needs, disruption of glycolysis is highly specific to cancer cells. The combination of high sensitivity and specificity makes targeting tumor glycolysis highly attractive. Through its ability to inhibit GAPDH, PS-102 has proven extremely effective at shutting down the energy-producing capabilities of cancer cells, which in turn destroys them. PreScience’s core technology and patent protection rely on both the novel PS-102 compound and other core technology licensed from Johns Hopkins University. The company plans to further evaluate PS-102 in a Phase I clinical study.
