Mullane K., Profectus Pharma Consulting Inc
Biochemical Pharmacology | Year: 2011

The prevalence of asthma continues to rise. Current drugs provide symptomatic relief to some, but not all, patients. Despite the need for new therapeutics, and a huge research effort, only four novel agents from two classes of drugs - the antileukotrienes and an anti-IgE antibody - have been approved in the last 30 years. This review highlights three particular issues that contribute to the challenge of identifying new therapeutics. First is an over-reliance on animal models of allergy to define targets and expectations of efficacy that has met with poor translation to the clinical setting. While sensitivity to particular aeroallergens is one key risk factor for asthma, atopy and asthma are not synonymous, and while about half of adult asthmatics are atopic, the incidence of allergic asthma is probably <50%. The second issue is a fundamental disconnect between the directions of basic research and clinical research. Basic research has developed a detailed, reductive, unifying mechanism of antigen-induced, T helper type 2 cell-mediated airway inflammation as the root cause of asthma. In contrast, clinical research has started to identify multiple asthma phenotypes with differing cellular components, mediators and sensitivities to asthma drugs, and probably varying underlying factors including susceptibility genes. Finally, different features of asthma - bronchoconstriction, symptoms, and exacerbations - respond diversely to treatment; effects that are not captured in animal models, which do not develop asthma per se but utilize unvalidated surrogate markers. Basic research needs to better integrate and utilize the clinical research findings to improve its relevance to drug discovery efforts. © 2011 Elsevier Inc. All rights reserved.

Mullane K., Profectus Pharma Consulting Inc | Williams M., Northwestern University
Biochemical Pharmacology | Year: 2014

Animal models of disease represent the pinnacle of hierarchical research efforts to validate targets and compounds for therapeutic intervention. Yet models of asthma, particularly in the mouse, which, for practical reasons, has become the sine qua non of asthma research, have been a bone of contention for decades. With barely a nod to their limitations and an extensive history of translational failures, they continue to be used for target identification and to justify the clinical evaluation of new compounds. Recent improvements - including sensitization directly to the airways; use of more relevant allergens; development of a chronic rather than short-term condition; utilization of techniques to measure lung function beyond uninterpretable measures of airway hyperresponsiveness - are laudable but cannot bridge the chasm between the models and the myriad complexities of the human disorder and multiple asthma endophenotypes. While further model developments are necessary, including recognition of key environmental factors beyond allergens, the judicious integration with newer ex vivo and in vitro techniques, including human precision-cut lung slices, reprogramming of patient-derived induced pluripotent stem cells and fibroblasts to epithelial and smooth muscle cells, and use of other clinical samples to create a more holistic depiction of activities, might improve their translational success. © 2013 Elsevier Inc.

Kenakin T., University of North Carolina at Chapel Hill | Bylund D.B., University of Nebraska Medical Center | Toews M.L., University of Nebraska Medical Center | Mullane K., Profectus Pharma Consulting Inc | And 2 more authors.
Biochemical Pharmacology | Year: 2014

A pharmacological experiment is typically conducted to: i) test or expand a hypothesis regarding the potential role of a target in the mechanism(s) underlying a disease state using an existing drug or tool compound in normal and/or diseased tissue or animals; or ii) characterize and optimize a new chemical entity (NCE) targeted to modulate a specific disease-associated target to restore homeostasis as a potential drug candidate. Hypothesis testing necessitates an intellectually rigorous, null hypothesis approach that is distinct from a high-throughput fishing expedition in search of a hypothesis. In conducting an experiment, the protocol should be transparently defined along with its powering, design, appropriate statistical analysis and consideration of the anticipated outcome(s) before it is initiated. Compound-target interactions often involve the direct study of phenotype(s) unique to the target at the cell, tissue or animal/human level. However, in vivo studies are often compromised by a lack of sufficient information on the compound pharmacokinetics necessary to ensure target engagement and also by the context-free analysis of ubiquitous cellular signaling pathways downstream from the target. The use of single tool compounds/drugs at one concentration in engineered cell lines frequently results in reductionistic data that have no physiological relevance. This overview, focused on trends in the peer-reviewed literature, discusses the execution and reporting of experiments and the criteria recommended for the physiologically-relevant assessment of target engagement to identify viable new drug targets and facilitate the advancement of translational studies. © 2013 Elsevier Inc.
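The "powering" that the abstract says should be fixed before an experiment begins can be sketched with a standard normal-approximation sample-size calculation for comparing two group means. This is an illustrative sketch only - the function name, defaults, and approximation are not from the article:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate animals/subjects needed per group to detect a
    standardized effect size (Cohen's d) in a two-sided, two-sample
    comparison of means, using the normal approximation.
    Illustrative sketch; names and defaults are assumptions."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value, two-sided test
    z_beta = z.inv_cdf(power)           # quantile for desired power
    n = 2 * (z_alpha + z_beta) ** 2 / effect_size ** 2
    return ceil(n)                      # round up to whole subjects

# A "medium" effect (d = 0.5) at alpha = 0.05 and 80% power
# requires roughly 63 subjects per group under this approximation.
print(sample_size_per_group(0.5))
```

The exact t-test answer is slightly larger (about 64 per group for d = 0.5), which is why dedicated power software is used in practice; the point here is that the calculation is cheap to run before, rather than after, an experiment.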

Winquist R.J., Vertex Pharmaceuticals | Mullane K., Profectus Pharma Consulting Inc | Williams M., Northwestern University
Biochemical Pharmacology | Year: 2014

Pharmacology is an integrative discipline that originated from activities, now nearly 7000 years old, to identify therapeutics from natural product sources. Research in the 19th Century that focused on the Law of Mass Action (LMA) demonstrated that compound effects were dose-/concentration-dependent, eventually leading to the receptor concept, now a century old, that remains the key to understanding disease causality and drug action. As pharmacology evolved in the 20th Century through successive biochemical, molecular and genomic eras, the precision in understanding receptor function at the molecular level increased and, while providing important insights, led to an overtly reductionistic emphasis. This resulted in the generation of data lacking physiological context that ignored the LMA and was not integrated at the tissue/whole organism level. As reductionism became a primary focus in biomedical research, it led to the fall of pharmacology. However, concerns regarding the disconnect between basic research efforts and the approval of new drugs to treat 21st Century disease tsunamis, e.g., neurodegeneration, metabolic syndrome, etc., have led to the reemergence of pharmacology, its rise, often in the semantic guise of systems biology. Against a background of limited training in pharmacology, this has resulted in issues in experimental replication with a bioinformatics emphasis that often has a limited relationship to reality. The integration of newer technologies within a pharmacological context where research is driven by testable hypotheses rather than technology, together with renewed efforts in teaching pharmacology, is anticipated to improve the focus and relevance of biomedical research and lead to novel therapeutics that will contain health care costs. © 2013 Elsevier Inc.
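The concentration-dependence the Law of Mass Action describes reduces, for a simple reversible ligand-receptor interaction at equilibrium, to the familiar hyperbolic occupancy relationship: occupancy = [L] / ([L] + Kd). A minimal sketch, with illustrative names (the specific values below are examples, not data from the article):

```python
def fractional_occupancy(ligand_conc, kd):
    """Fraction of receptors occupied at equilibrium for a simple
    one-site reversible binding model (Law of Mass Action):
    occupancy = [L] / ([L] + Kd). Concentrations and Kd must be
    in the same units (e.g., molar)."""
    if ligand_conc < 0 or kd <= 0:
        raise ValueError("concentration must be >= 0 and Kd > 0")
    return ligand_conc / (ligand_conc + kd)

# Hallmarks of mass-action behavior: half-maximal occupancy at
# [L] = Kd, and 90% occupancy at [L] = 9 x Kd.
kd = 1e-9  # illustrative 1 nM affinity
print(fractional_occupancy(1e-9, kd))  # 0.5
print(fractional_occupancy(9e-9, kd))  # 0.9
```

It is precisely this saturable, concentration-dependent behavior that single-concentration experiments in engineered cell lines (criticized in the preceding abstract) fail to characterize.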

Mullane K., Profectus Pharma Consulting Inc | Williams M., Northwestern University
Biochemical Pharmacology | Year: 2015

The credibility and consequent sustainability of the biomedical research "ecosystem" is in jeopardy, in part due to an inability to reproduce data from the peer-reviewed literature. Despite obvious and relatively inexpensive solutions to improve reproducibility - ensuring that experimental reagents, specifically cancer cell lines and antibodies, are authenticated/validated before use and that best practices in statistical usage are incorporated into the design, analysis, and reporting of experiments - these are routinely ignored, a reflection of hubris and a comfort with the status quo on the part of many investigators. New guidelines for the peer review of publications and grant applications introduced in the past year, while well-intended, lack the necessary consequences, e.g., denial of funding, that would result in sustained improvements when scientific rigor is lacking and/or transparency is, at best, opaque. An additional factor contributing to irreproducibility is a reductionist mindset that prioritizes certainty in research outcomes over the ambiguity intrinsic to biological systems that is often reflected in "unknown unknowns". This has resulted in a tendency towards codifying "rules" that can provide "yes-no" outcomes that represent a poor substitute for the intellectual challenge and skepticism that leads to an awareness and consideration of "unknown unknowns". When acknowledged as potential causes of unexpected experimental outcomes, these can often transition into the "knowns" that facilitate positive, disruptive innovation in biomedical research like the human microbiome. Changes in investigator mindset, both in terms of validating reagents and embracing ambiguity, are necessary to aid in reducing issues with reproducibility. © 2015 Elsevier Inc. All rights reserved.