Valerio Jr. L.G., Center for Drug Evaluation and Research |
Long A., Lhasa Ltd
Current Drug Discovery Technologies | Year: 2010
In this study we used the Meteor computational software program to perform in silico predictions on 17 hepatotoxic drugs in order to determine human-specific drug metabolites. Congruence of the in silico predictions, from a qualitative standpoint of drug metabolite structures, was established by comparison with human in vivo drug metabolic profiles characterized in publicly available clinical studies. A total of 87 human-specific metabolites were identified from the 17 drugs. We found that Meteor's positive predictions included 4 out of the 9 reported major metabolites (detected in excreta at a level of >10% of the administered p.o. dose) and 10 out of the 15 major phase II metabolites, giving a total of 14 correctly predicted drug metabolite structures out of 23 major metabolites. A significant number of unconfirmed positive predictions resulted, and the reasons for this are discussed. An example is given in which the in silico metabolism prediction succeeded in predicting the putative toxic pathway of one of the drugs, while conventional rodent liver microsomal assays failed to predict it. Overall, we describe a reasonable simulation of human metabolic profiling using this in silico method with this data set of hepatotoxic drugs, now withdrawn from the market. We provide an in-depth and objective discussion of this first-of-its-kind validation test using clinical study data, with a focus on the prediction of human-specific metabolism. Further research is discussed on which areas need to be investigated to improve the predictive data. The strong potential of this method to predict human-specific drug metabolites suggests the utility of this computational tool in supporting not only the discovery and development of therapeutics but also safety assessment, by identifying drug metabolites early to protect patients prior to initiating clinical studies. © 2010 Bentham Science Publishers Ltd.
Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-33-2015 | Award Amount: 30.12M | Year: 2016
The vision of EU-ToxRisk is to drive a paradigm shift in toxicology towards an animal-free, mechanism-based integrated approach to chemical safety assessment. The project will unite all relevant disciplines and stakeholders to establish: i) pragmatic, solid read-across procedures incorporating mechanistic and toxicokinetic knowledge; and ii) ab initio hazard and risk assessment strategies for chemicals with little background information. The project will focus on repeated dose systemic toxicity (liver, kidney, lung and nervous system) as well as developmental/reproduction toxicity. Different tiered human test systems are integrated to balance speed, cost and biological complexity. EU-ToxRisk extensively integrates the adverse outcome pathway (AOP)-based toxicity testing concept. Advanced technologies, including high-throughput transcriptomics, RNA interference, and high-throughput microscopy, will therefore provide quantitative and mechanistic underpinning of AOPs and key events (KEs). The project combines in silico tools and in vitro assays through computational modelling approaches to provide quantitative data on the activation of the KEs of an AOP. This information, together with detailed toxicokinetics data and in vitro-in vivo extrapolation algorithms, forms the basis for improved hazard and risk assessment. The EU-ToxRisk work plan is structured along a broad spectrum of case studies, driven by the cosmetics, (agro)chemical and pharma industries together with regulators. The approach involves iterative training, testing, optimization and validation phases to establish fit-for-purpose integrated approaches to testing and assessment with key EU-ToxRisk methodologies. The test systems will be combined into a flexible service package for exploitation and continued impact across industry sectors and regulatory application. The proof-of-concept for the new mechanism-based testing strategy will make EU-ToxRisk the flagship in Europe for animal-free chemical safety assessment.
Agency: GTR | Branch: EPSRC | Program: | Phase: Training Grant | Award Amount: 4.52M | Year: 2014
Moore's Law states that the number of active components on a microchip doubles every 18 months. Variants of this law can be applied to many measures of computer performance, such as memory and hard disk capacity, and to reductions in the cost of computations. Remarkably, Moore's Law has held for over 50 years, during which time computer speeds have increased by a factor of more than 1 billion! This remarkable rise in computational power has affected all of our lives in profound ways, through the widespread usage of computers, the internet and portable electronic devices such as smartphones and tablets. Unfortunately, Moore's Law is not a fundamental law of nature, and sustaining this extraordinary rate of progress requires continuous hard work and investment in new technologies, most of which relate to advances in our understanding of, and ability to control, the properties of materials. Computer software plays an important role in enhancing computational performance, and in many cases it has been found that for every factor of 10 increase in computational performance achieved by faster hardware, improved software has further increased computational performance by a factor of 100. Furthermore, improved software is also essential for extending the range of physical properties and processes that can be studied computationally. Our EPSRC Centre for Doctoral Training in Computational Methods for Materials Science aims to provide training in numerical methods and modern software development techniques so that the students in the CDT are capable of developing innovative new software that can be used, for instance, to help design new materials and to understand the complex processes that occur in materials. The UK, and in particular Cambridge, has been a pioneer in both software and hardware since the earliest programmable computers, and through this strategic investment we aim to ensure that this lead is sustained well into the future.
Briggs K.A., Lhasa Ltd
Drug Discovery Today | Year: 2016
Is preclinical data sharing the new norm? In my experience, it is certainly becoming more commonplace. However, it is not yet standard practice and remains the preserve of special projects. Here, I expound the benefits of sharing proprietary preclinical data using examples of successful initiatives. The main barriers to data sharing are then described, with suggestions for how these might be overcome. To maximise the benefits and minimise the risks involved, I suggest that organisations look to develop standard operating procedures for data sharing. © 2016 Elsevier Ltd.
Segall M.D., Optibrium Ltd |
Barber C., Lhasa Ltd
Drug Discovery Today | Year: 2014
Prioritising compounds with a lower chance of causing toxicity, early in the drug discovery process, would help to address the high attrition rate in pharmaceutical R&D. Expert knowledge-based prediction of toxicity can alert chemists if their proposed compounds are likely to have an increased likelihood of causing toxicity. We will discuss how multiparameter optimisation approaches can be used to balance the potential for toxicity with other properties required in a high-quality candidate drug, giving appropriate weight to the alert in the selection of compounds. Furthermore, we will describe how information about the region of a compound that triggers a toxicity alert can be interactively visualised to guide the modification of a compound to reduce the likelihood of toxicity. © 2014 Elsevier Ltd.
Judson P., Lhasa Ltd
Methods in Molecular Biology | Year: 2012
Prediction of mutagenicity by computer is now routinely used in research and by regulatory authorities. Broadly, two different approaches are in wide use. The first is based on statistical analysis of data to find patterns associated with mutagenic activity; the resultant models are generally termed quantitative structure-activity relationships (QSARs). The second is based on capturing human knowledge about the causes of mutagenicity and applying it in ways that mimic human reasoning; these systems are generally called knowledge-based systems. Other methods for finding patterns in data, such as the application of neural networks, are in use but less widely so. © 2012 Springer Science+Business Media, LLC.
Marchant C.A., Lhasa Ltd
Wiley Interdisciplinary Reviews: Computational Molecular Science | Year: 2012
Statistical, expert system, and machine learning methods among others have been used to develop in silico tools for the prediction of toxicological hazard from chemical structure. The models are being applied to the mammalian and environmental toxicological assessment of chemicals across a range of industries including cosmetics, foods, industrial chemicals, and pharmaceuticals. Their use within a regulatory environment has also been encouraged by recent legislation. Generally, the models address the potential toxicity of low to medium molecular weight organic chemicals but models for other chemical types such as proteins and nanoparticles have also received some attention. © 2011 John Wiley & Sons, Ltd.
Parenty A.D.C., Lhasa Ltd |
Button W.G., Lhasa Ltd |
Ott M.A., Lhasa Ltd
Molecular Pharmaceutics | Year: 2013
In this paper we describe Zeneth, a new expert computational system for the prediction of forced degradation pathways of organic compounds. Intermolecular reactions, such as dimerization and reactions between the query compound and its degradants, as well as interactions with excipients, can be predicted. The program employs a knowledge base of patterns and reasoning rules to suggest the most likely transformations under various environmental conditions relevant to the pharmaceutical industry. Building the knowledge base is facilitated by data sharing among its users. © 2013 American Chemical Society.
Ali M., Lhasa Ltd
SAR and QSAR in environmental research | Year: 2013
Development of accurate quantitative structure-activity relationship (QSAR) models requires the availability of high-quality validated data. International regulations, such as REACH in Europe, will now accept (Q)SAR-based evaluations for risk assessment. The number of toxicity datasets available for knowledge sharing, data mining and modelling is continually expanding. The challenge is the current use of a multitude of different data formats; the issues of comparing or combining disparate data apply to both public and proprietary sources. The ToxML project addresses the need for a common data exchange standard that allows these data to be represented and communicated in a well-structured electronic format. It is an open standard based on Extensible Markup Language (XML). Supporting information for overall toxicity endpoint data can be included within ToxML files, making it possible to assess the quality and detail of the data used in a model. The data file model allows the aggregation of experimental data to the compound level in the detail needed to support (Q)SAR work. The standard is published on a website together with tools to view, edit and download it.
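The idea of aggregating experimental data to the compound level from an XML exchange format can be sketched in a few lines of Python. Note that the element names used below (compound, study, endpoint, result) are hypothetical illustrations, not the actual ToxML schema; the sketch only shows the general pattern of parsing a compound-centred XML file and collecting its study records.

```python
# Minimal sketch of reading a ToxML-style XML file and aggregating
# study records to the compound level. Element names here are
# hypothetical illustrations, not the real ToxML schema.
import xml.etree.ElementTree as ET
from collections import defaultdict

SAMPLE = """\
<toxml>
  <compound id="CHEM-001">
    <study><endpoint>Ames</endpoint><result>positive</result></study>
    <study><endpoint>Ames</endpoint><result>negative</result></study>
  </compound>
  <compound id="CHEM-002">
    <study><endpoint>Ames</endpoint><result>negative</result></study>
  </compound>
</toxml>
"""

def aggregate(xml_text: str) -> dict:
    """Collect per-compound (endpoint, result) pairs, keyed by compound id."""
    root = ET.fromstring(xml_text)
    data = defaultdict(list)
    for compound in root.findall("compound"):
        cid = compound.get("id")
        for study in compound.findall("study"):
            data[cid].append(
                (study.findtext("endpoint"), study.findtext("result"))
            )
    return dict(data)

print(aggregate(SAMPLE))
```

Keeping all studies attached to their parent compound, rather than flattening them into independent records, is what lets a modeller judge data quality and detail per compound, as the abstract describes.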
Long A., Lhasa Ltd
Drug Discovery Today: Technologies | Year: 2013
Drug metabolism in silico is briefly discussed in terms of the importance of understanding the mechanistic basis of drug molecule biotransformation in vivo and its consequences in terms of changes in the properties of metabolites relative to those of the parent compound. A basic overview of an expert system is presented, along with how these general principles apply to expert systems for the prediction of xenobiotic metabolism. A brief history of the development of these systems is also presented. Methods for increasing both the sensitivity and selectivity of prediction are outlined and the benefits of using complementary prediction systems in a conjoint manner are proposed. © 2012 Elsevier Ltd. All rights reserved.