Profectus Pharma Consulting Inc

Federal Way, WA, United States

Mullane K.,Profectus Pharma Consulting Inc. | Enna S.J.,University of Kansas Medical Center | Piette J.,University of Liège | Williams M.,Northwestern University
Biochemical Pharmacology | Year: 2015

Recent reports have highlighted studies in biomedical research that cannot be reproduced, tending to undermine the credibility, relevance and sustainability of the research process. To address this issue, a number of factors can be monitored to improve the overall probability of reproducibility. These include: (i) shortcomings in experimental design and execution that involve hypothesis conceptualization, statistical analysis, and data reporting; (ii) investigator bias and error; (iii) validation of reagents including cells and antibodies; and (iv) fraud. Historically, research data that have undergone peer review and are subsequently published are then subject to independent replication via the process of self-correction. This often leads to refutation of the original findings and retraction of the paper, by which time considerable resources have been wasted in follow-on studies. New NIH guidelines focused on experimental conduct and manuscript submission are being widely adopted in the peer-reviewed literature. These, in their various iterations, are intended to improve the transparency and accuracy of data reporting via the use of checklists that are often accompanied by "best practice" guidelines that aid in validating the methodologies and reagents used in data generation. The present Editorial provides background and context to a newly developed checklist for submissions to Biochemical Pharmacology that is intended to be clear, logical, useful and unambiguous in assisting authors in preparing manuscripts and in facilitating the peer review process. While use of the checklist is currently optional, its refinement based on user feedback will result in it becoming mandatory within the next 12 months. © 2015 Elsevier Inc. All rights reserved.


Winquist R.J.,Vertex Pharmaceuticals | Mullane K.,Profectus Pharma Consulting Inc. | Williams M.,Northwestern University
Biochemical Pharmacology | Year: 2014

Pharmacology is an integrative discipline that originated from activities, now nearly 7000 years old, to identify therapeutics from natural product sources. Research in the 19th Century that focused on the Law of Mass Action (LMA) demonstrated that compound effects were dose-/concentration-dependent, eventually leading to the receptor concept, now a century old, that remains the key to understanding disease causality and drug action. As pharmacology evolved in the 20th Century through successive biochemical, molecular and genomic eras, the precision in understanding receptor function at the molecular level increased and, while providing important insights, led to an overtly reductionistic emphasis. This resulted in the generation of data lacking physiological context that ignored the LMA and was not integrated at the tissue/whole organism level. As reductionism became a primary focus in biomedical research, it led to the fall of pharmacology. However, concerns regarding the disconnect between basic research efforts and the approval of new drugs to treat 21st Century disease tsunamis, e.g., neurodegeneration, metabolic syndrome, etc., have led to the reemergence of pharmacology, its rise, often in the semantic guise of systems biology. Against a background of limited training in pharmacology, this has resulted in issues in experimental replication with a bioinformatics emphasis that often bears a limited relationship to reality. The integration of newer technologies within a pharmacological context, where research is driven by testable hypotheses rather than technology, together with renewed efforts in teaching pharmacology, is anticipated to improve the focus and relevance of biomedical research and lead to novel therapeutics that will contain health care costs. © 2013 Elsevier Inc.
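The dose-/concentration-dependence described by the LMA can be illustrated with the classic hyperbolic occupancy relationship for simple 1:1 reversible ligand-receptor binding. The sketch below is a minimal illustration of that relationship, not material from the article itself; the dissociation constant and concentration range are hypothetical placeholders.

```python
# Minimal sketch of Law of Mass Action receptor occupancy for simple
# 1:1 reversible binding at equilibrium: occupancy = [L] / ([L] + Kd).
# The Kd and concentration range are illustrative placeholders.
import numpy as np

def fractional_occupancy(ligand_nM: np.ndarray, kd_nM: float) -> np.ndarray:
    """Fraction of receptors occupied at equilibrium (hyperbolic in [L])."""
    return ligand_nM / (ligand_nM + kd_nM)

concentrations = np.logspace(-1, 3, 9)  # 0.1 nM to 1000 nM
for conc, occ in zip(concentrations, fractional_occupancy(concentrations, kd_nM=10.0)):
    print(f"[L] = {conc:8.2f} nM -> occupancy = {occ:.2f}")
```

At [L] = Kd the receptor is half-occupied, which is why compound effects track concentration rather than any fixed dose.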


Kenakin T.,University of North Carolina at Chapel Hill | Bylund D.B.,University of Nebraska Medical Center | Toews M.L.,University of Nebraska Medical Center | Mullane K.,Profectus Pharma Consulting Inc. | And 2 more authors.
Biochemical Pharmacology | Year: 2014

A pharmacological experiment is typically conducted to: i) test or expand a hypothesis regarding the potential role of a target in the mechanism(s) underlying a disease state using an existing drug or tool compound in normal and/or diseased tissue or animals; or ii) characterize and optimize a new chemical entity (NCE) targeted to modulate a specific disease-associated target to restore homeostasis as a potential drug candidate. Hypothesis testing necessitates an intellectually rigorous, null hypothesis approach that is distinct from a high throughput fishing expedition in search of a hypothesis. In conducting an experiment, the protocol should be transparently defined along with its powering, design, appropriate statistical analysis and consideration of the anticipated outcome(s) before it is initiated. Compound-target interactions often involve the direct study of phenotype(s) unique to the target at the cell, tissue or animal/human level. However, in vivo studies are often compromised by a lack of sufficient information on the compound pharmacokinetics necessary to ensure target engagement and also by the context-free analysis of ubiquitous cellular signaling pathways downstream from the target. The use of single tool compounds/drugs at one concentration in engineered cell lines frequently results in reductionistic data that have no physiological relevance. This overview, focused on trends in the peer-reviewed literature, discusses the execution and reporting of experiments and the criteria recommended for the physiologically-relevant assessment of target engagement to identify viable new drug targets and facilitate the advancement of translational studies. © 2013 Elsevier Inc.
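The powering step called for above can be made concrete with an a priori sample-size calculation performed before the protocol is initiated. The sketch below is a hypothetical illustration using statsmodels, not a procedure from the article; the effect size, alpha and power values are assumed placeholders.

```python
# Minimal sketch of an a priori power calculation for a two-group,
# two-sided comparison of means. Effect size, alpha and power are
# assumed placeholders, not values from the article.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.8,         # assumed Cohen's d (a "large" effect)
    alpha=0.05,              # two-sided significance level
    power=0.80,              # desired probability of detecting the effect
    alternative="two-sided",
)
print(f"Subjects required per group: {n_per_group:.1f}")  # ~25.5 -> round up to 26
```

Recording such a calculation in the protocol before any animals or samples are committed is part of what distinguishes rigorous hypothesis testing from a post hoc fishing expedition.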


Mullane K.,Profectus Pharma Consulting Inc. | Winquist R.J.,Vertex Pharmaceuticals | Williams M.,Northwestern University
Biochemical Pharmacology | Year: 2014

The translational sciences represent the core element in enabling and utilizing the output from the biomedical sciences and in improving drug discovery metrics by reducing the attrition rate as compounds move from preclinical research to clinical proof of concept. Key to understanding the basis of disease causality and to developing therapeutics is an ability to accurately diagnose the disease and to identify and develop safe and effective therapeutics for its treatment. The former requires validated biomarkers and the latter, qualified targets. Progress has been hampered by semantic issues, specifically those that define the end product, and by scientific issues that include data reliability, an overtly reductionistic cultural focus and a lack of hierarchically integrated data gathering and systematic analysis. A necessary framework for these activities is represented by the discipline of pharmacology, efforts and training in which require recognition and revitalization. © 2013 Elsevier Inc.


Mullane K.,Profectus Pharma Consulting Inc
Biochemical Pharmacology | Year: 2011

The prevalence of asthma continues to rise. Current drugs provide symptomatic relief to some, but not all, patients. Despite the need for new therapeutics, and a huge research effort, only four novel agents from two classes of drugs - the antileukotrienes and an anti-IgE antibody - have been approved in the last 30 years. This review highlights three particular issues that contribute to the challenge of identifying new therapeutics. First is an over-reliance on animal models of allergy to define targets and expectations of efficacy, which has met with poor translation to the clinical setting. While sensitivity to particular aeroallergens is one key risk factor for asthma, atopy and asthma are not synonymous, and while about half of adult asthmatics are atopic, the incidence of allergic asthma is probably <50%. The second issue is a fundamental disconnect between the directions of basic research and clinical research. Basic research has developed a detailed, reductive, unifying mechanism of antigen-induced, T helper type 2 cell-mediated airway inflammation as the root cause of asthma. In contrast, clinical research has started to identify multiple asthma phenotypes with differing cellular components, mediators and sensitivities to asthma drugs, and probably varying underlying factors including susceptibility genes. Finally, different features of asthma - bronchoconstriction, symptoms, and exacerbations - respond diversely to treatment; effects that are not captured in animal models, which do not develop asthma per se but rely on unvalidated surrogate markers. Basic research needs to better integrate and utilize the clinical research findings to improve its relevance to drug discovery efforts. © 2011 Elsevier Inc. All rights reserved.


Mullane K.,Profectus Pharma Consulting Inc. | Williams M.,Northwestern University
Biochemical Pharmacology | Year: 2014

Animal models of disease represent the pinnacle of hierarchical research efforts to validate targets and compounds for therapeutic intervention. Yet models of asthma, particularly in the mouse, which, for practical reasons, has become the sine qua non of asthma research, have been a bone of contention for decades. With barely a nod to their limitations and an extensive history of translational failures, they continue to be used for target identification and to justify the clinical evaluation of new compounds. Recent improvements - including sensitization directly to the airways; use of more relevant allergens; development of a chronic rather than short-term condition; and utilization of techniques to measure lung function beyond uninterpretable measures of airway hyperresponsiveness - are laudable but cannot bridge the chasm between the models and the myriad complexities of the human disorder and its multiple endophenotypes. While further model developments are necessary, including recognition of key environmental factors beyond allergens, judicious integration with newer ex vivo and in vitro techniques - including human precision-cut lung slices, reprogramming of patient-derived induced pluripotent stem cells and fibroblasts to epithelial and smooth muscle cells, and use of other clinical samples to create a more holistic depiction of activities - might improve their translational success. © 2013 Elsevier Inc.


Mullane K.,Profectus Pharma Consulting Inc. | Williams M.,Northwestern University
Biochemical Pharmacology | Year: 2013

The worldwide incidence of Alzheimer's disease (AD) is increasing, with estimates that 115 million individuals will have AD by 2050, creating an unsustainable healthcare challenge due to a lack of effective treatment options, as highlighted by multiple clinical failures of agents designed to reduce the brain amyloid burden considered synonymous with the disease. The amyloid hypothesis that has been the overarching focus of AD research efforts for more than two decades has been questioned in terms of its causality but has not been unequivocally disproven despite multiple clinical failures. This is due to issues related to the quality of compounds advanced to late stage clinical trials and the lack of validated biomarkers that would allow the recruitment of AD patients into trials before the disease reaches a stage so advanced that therapeutic intervention is deemed futile. Pursuit of a linear, reductionistic amyloidocentric approach to AD research, which some have compared to a religious faith, has resulted in other, equally plausible but as yet unvalidated AD hypotheses being underfunded, leading to a disastrous roadblock in the search for urgently needed AD therapeutics. Genetic evidence supporting amyloid causality in AD is reviewed in the context of the clinical failures, as is progress in tau-based and alternative approaches to AD, where an evolving modus operandi in biomedical research fosters excessive optimism and a preoccupation with unproven, and often ephemeral, biomarker/genome-based approaches that override transparency, objectivity and data-driven decision making, resulting in low-probability environments where data are subordinate to self-propagating hypotheses. © 2012 Elsevier Inc. All rights reserved.


Mullane K.,Profectus Pharma Consulting Inc.
Biochemical Pharmacology | Year: 2011

Over the last 30 years, scientific research into asthma has focused almost exclusively on one component of the disorder - airway inflammation - as being the key underlying feature. These studies have provided a remarkably detailed and comprehensive picture of the events following antigen challenge that lead to an influx of T cells and eosinophils in the airways. Indeed, in basic research, even the term "asthma" has become synonymous with a T helper 2 cell-mediated disorder. From this cascade of cellular activation processes and mediators that have been identified, it has been possible to pinpoint critical junctures for therapeutic intervention, leading experimentalists to produce therapies that are very effective in decreasing airway inflammation in animal models. Many of these compounds have now completed early Phase 2 "proof-of-concept" clinical trials, so the translational success of the basic research model can be evaluated. This commentary discusses clinical results from 39 compounds and biologics acting at 23 different targets, and while 6 of these drugs can be regarded as qualified successes, none benefit the bulk of asthma sufferers. Despite this disappointing rate of success, the same immune paradigm and basic research models, with a few embellishments to incorporate newly identified cells and mediators, continue to drive target identification and drug discovery efforts. It is time to re-evaluate the focus of these efforts. © 2011 Elsevier Inc. All rights reserved.


Mullane K.,Profectus Pharma Consulting Inc. | Williams M.,Northwestern University
Biochemical Pharmacology | Year: 2015

The credibility and consequent sustainability of the biomedical research "ecosystem" is in jeopardy, in part due to an inability to reproduce data from the peer-reviewed literature. Despite obvious and relatively inexpensive solutions to improve reproducibility - ensuring that experimental reagents, specifically cancer cell lines and antibodies, are authenticated/validated before use and that best practices in statistical usage are incorporated into the design, analysis, and reporting of experiments - these are routinely ignored, a reflection of hubris and a comfort with the status quo on the part of many investigators. New guidelines for the peer review of publications and grant applications introduced in the past year, while well-intended, lack the necessary consequences, e.g., denial of funding, that would result in sustained improvements when scientific rigor is lacking and/or transparency is, at best, opaque. An additional factor contributing to irreproducibility is a reductionist mindset that prioritizes certainty in research outcomes over the ambiguity intrinsic to biological systems that is often reflected in "unknown unknowns". This has resulted in a tendency towards codifying "rules" that can provide "yes-no" outcomes that represent a poor substitute for the intellectual challenge and skepticism that leads to an awareness and consideration of "unknown unknowns". When acknowledged as potential causes of unexpected experimental outcomes, these can often transition into the "knowns" that facilitate positive, disruptive innovation in biomedical research like the human microbiome. Changes in investigator mindset, both in terms of validating reagents and embracing ambiguity, are necessary to aid in reducing issues with reproducibility. © 2015 Elsevier Inc. All rights reserved.
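The "yes-no" outcomes criticized above can be contrasted with reporting an effect size together with its uncertainty. The sketch below is a hypothetical illustration of that practice, not an analysis from the article; the simulated data, group sizes and seed are placeholders.

```python
# Minimal sketch contrasting a binary significance call with an
# effect-size estimate plus a bootstrap confidence interval.
# All data here are simulated placeholders, not values from the article.
import numpy as np

rng = np.random.default_rng(seed=1)
control = rng.normal(loc=10.0, scale=2.0, size=12)  # simulated control group
treated = rng.normal(loc=12.0, scale=2.0, size=12)  # simulated treated group

# Bootstrap the difference in group means (10,000 resamples with replacement).
boot_diffs = np.array([
    rng.choice(treated, treated.size).mean() - rng.choice(control, control.size).mean()
    for _ in range(10_000)
])
ci_low, ci_high = np.percentile(boot_diffs, [2.5, 97.5])
print(f"Mean difference: {treated.mean() - control.mean():.2f}")
print(f"95% bootstrap CI: [{ci_low:.2f}, {ci_high:.2f}]")
```

Reporting the interval keeps the ambiguity intrinsic to biological systems visible rather than collapsing the result into a single pass/fail verdict.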

