Agency: European Commission | Branch: H2020 | Program: RIA | Phase: PHC-33-2015 | Award Amount: 30.12M | Year: 2016
The vision of EU-ToxRisk is to drive a paradigm shift in toxicology towards an animal-free, mechanism-based integrated approach to chemical safety assessment. The project will unite all relevant disciplines and stakeholders to establish: i) pragmatic, solid read-across procedures incorporating mechanistic and toxicokinetic knowledge; and ii) ab initio hazard and risk assessment strategies for chemicals with little background information. The project will focus on repeated-dose systemic toxicity (liver, kidney, lung and nervous system) as well as developmental/reproductive toxicity. Tiered human test systems will be integrated to balance speed, cost and biological complexity. EU-ToxRisk extensively integrates the adverse outcome pathway (AOP)-based toxicity testing concept. To this end, advanced technologies, including high-throughput transcriptomics, RNA interference, and high-throughput microscopy, will provide quantitative and mechanistic underpinning of AOPs and key events (KEs). The project combines in silico tools and in vitro assays through computational modelling approaches to provide quantitative data on the activation of the KEs of AOPs. This information, together with detailed toxicokinetics data and in vitro-in vivo extrapolation algorithms, forms the basis for improved hazard and risk assessment. The EU-ToxRisk work plan is structured along a broad spectrum of case studies, driven by the cosmetics, (agro)chemical and pharma industries together with regulators. The approach involves iterative training, testing, optimization and validation phases to establish fit-for-purpose integrated approaches to testing and assessment with key EU-ToxRisk methodologies. The test systems will be combined into a flexible service package for exploitation and continued impact across industry sectors and regulatory applications. The proof-of-concept for the new mechanism-based testing strategy will make EU-ToxRisk the flagship in Europe for animal-free chemical safety assessment.
Agency: GTR | Branch: EPSRC | Program: | Phase: Training Grant | Award Amount: 4.52M | Year: 2014
Moore's Law states that the number of active components on a microchip doubles every 18 months. Variants of this law can be applied to many measures of computer performance, such as memory and hard disk capacity, and to reductions in the cost of computations. Remarkably, Moore's Law has applied for over 50 years, during which time computer speeds have increased by a factor of more than 1 billion! This remarkable rise of computational power has affected all of our lives in profound ways, through the widespread usage of computers, the internet and portable electronic devices, such as smartphones and tablets. Unfortunately, Moore's Law is not a fundamental law of nature, and sustaining this extraordinary rate of progress requires continuous hard work and investment in new technologies, most of which relate to advances in our understanding of, and ability to control, the properties of materials. Computer software plays an important role in enhancing computational performance, and in many cases it has been found that for every factor of 10 increase in computational performance achieved by faster hardware, improved software has further increased computational performance by a factor of 100. Furthermore, improved software is also essential for extending the range of physical properties and processes which can be studied computationally. Our EPSRC Centre for Doctoral Training in Computational Methods for Materials Science aims to provide training in numerical methods and modern software development techniques so that the students in the CDT are capable of developing innovative new software which can be used, for instance, to help design new materials and understand the complex processes that occur in materials. The UK, and in particular Cambridge, has been a pioneer in both software and hardware since the earliest programmable computers, and through this strategic investment we aim to ensure that this lead is sustained well into the future.
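The "factor of more than 1 billion" figure follows directly from the doubling period: a back-of-the-envelope calculation, sketched below, shows that 50 years of doubling every 18 months gives roughly 33 doublings.

```python
# Back-of-the-envelope check of the Moore's Law growth claim:
# doubling every 18 months, sustained for 50 years.
years = 50
doubling_period_months = 18

doublings = years * 12 / doubling_period_months  # ~33.3 doublings
growth_factor = 2 ** doublings

print(f"{doublings:.1f} doublings -> growth factor ~{growth_factor:.2e}")
# A growth factor around 1e10 is consistent with "more than 1 billion".
```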
Segall M.D.,Optibrium Ltd |
Barber C.,Lhasa Ltd
Drug Discovery Today | Year: 2014
Prioritising compounds with a lower chance of causing toxicity, early in the drug discovery process, would help to address the high attrition rate in pharmaceutical R&D. Expert knowledge-based prediction of toxicity can alert chemists if their proposed compounds are likely to have an increased likelihood of causing toxicity. We will discuss how multiparameter optimisation approaches can be used to balance the potential for toxicity with other properties required in a high-quality candidate drug, giving appropriate weight to the alert in the selection of compounds. Furthermore, we will describe how information about the region of a compound that triggers a toxicity alert can be interactively visualised to guide the modification of a compound to reduce the likelihood of toxicity. © 2014 Elsevier Ltd.
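The abstract's central idea, weighting a toxicity alert against other desirable properties rather than using it as a hard veto, can be sketched as a simple weighted scoring function. The property names, weights and score values below are invented for illustration; real multiparameter optimisation tools use more sophisticated desirability functions and uncertainty handling.

```python
# Toy multiparameter optimisation (MPO) score: a weighted average of
# per-property desirability values in [0, 1]. A fired toxicity alert
# lowers its desirability term instead of rejecting the compound outright,
# so the alert receives "appropriate weight" in compound selection.
def mpo_score(desirabilities: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted mean of desirability scores; higher is better."""
    total_weight = sum(weights.values())
    return sum(weights[k] * desirabilities[k] for k in desirabilities) / total_weight

# Hypothetical compound: good potency, but a toxicity alert fired,
# so its toxicity desirability is reduced to 0.5 rather than set to 0.
compound = {"potency": 0.9, "toxicity_alert": 0.5}
weights = {"potency": 1.0, "toxicity_alert": 0.5}
print(f"MPO score: {mpo_score(compound, weights):.3f}")
```

The design choice here mirrors the abstract: because the alert is weighted rather than absolute, a compound with an alert can still rank highly if its other properties are strong enough.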
Judson P.,Lhasa Ltd
Methods in Molecular Biology | Year: 2012
Prediction of mutagenicity by computer is now routinely used in research and by regulatory authorities. Broadly, two different approaches are in wide use. The first is based on statistical analysis of data to find patterns associated with mutagenic activity. The resultant models are generally termed quantitative structure-activity relationships (QSAR). The second is based on capturing human knowledge about the causes of mutagenicity and applying it in ways that mimic human reasoning. These systems are generally called knowledge-based systems. Other methods for finding patterns in data, such as the application of neural networks, are in use but less widely so. © 2012 Springer Science+Business Media, LLC.
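The knowledge-based approach described above can be illustrated with a minimal sketch. The alert names and patterns below are hypothetical, and the naive SMILES substring test stands in for the true substructure (graph) matching and reasoning that production knowledge-based systems actually perform.

```python
# Toy sketch of a knowledge-based structural alert check for mutagenicity.
# Illustration only: real systems use curated alert knowledge bases,
# proper substructure matching and reasoning over supporting evidence,
# not the naive SMILES substring comparison shown here.
ALERTS = {
    "aromatic nitro group": "[N+](=O)[O-]",  # hypothetical alert pattern
    "epoxide": "C1OC1",                      # hypothetical alert pattern
}

def fired_alerts(smiles: str) -> list[str]:
    """Return the names of alerts whose pattern occurs in the SMILES string."""
    return [name for name, pattern in ALERTS.items() if pattern in smiles]

print(fired_alerts("c1ccccc1[N+](=O)[O-]"))  # nitrobenzene: alert fires
print(fired_alerts("CCO"))                   # ethanol: no alerts
```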
Marchant C.A.,Lhasa Ltd
Wiley Interdisciplinary Reviews: Computational Molecular Science | Year: 2012
Statistical, expert system, and machine learning methods, among others, have been used to develop in silico tools for the prediction of toxicological hazard from chemical structure. The models are being applied to the mammalian and environmental toxicological assessment of chemicals across a range of industries including cosmetics, foods, industrial chemicals, and pharmaceuticals. Their use within a regulatory environment has also been encouraged by recent legislation. Generally, the models address the potential toxicity of low to medium molecular weight organic chemicals, but models for other chemical types such as proteins and nanoparticles have also received some attention. © 2011 John Wiley & Sons, Ltd.
Parenty A.D.C.,Lhasa Ltd |
Button W.G.,Lhasa Ltd |
Ott M.A.,Lhasa Ltd
Molecular Pharmaceutics | Year: 2013
In this paper we describe Zeneth, a new expert computational system for the prediction of forced degradation pathways of organic compounds. The program can predict intermolecular reactions such as dimerization, reactions between the query compound and its degradants, and interactions with excipients. It employs a knowledge base of patterns and reasoning rules to suggest the most likely transformations under various environmental conditions relevant to the pharmaceutical industry. Building the knowledge base is facilitated by data sharing between the users. © 2013 American Chemical Society.
Ali M.,Lhasa Ltd
SAR and QSAR in Environmental Research | Year: 2013
Development of accurate quantitative structure-activity relationship (QSAR) models requires the availability of high-quality validated data. International regulations such as REACH in Europe will now accept (Q)SAR-based evaluations for risk assessment. The number of toxicity datasets available for knowledge sharing, data mining and modelling is continually expanding. The challenge is the current use of a multitude of different data formats; the difficulties of comparing or combining disparate data apply to both public and proprietary sources. The ToxML project addresses the need for a common data exchange standard that allows the representation and communication of these data in a well-structured electronic format. It is an open standard based on Extensible Markup Language (XML). Supporting information for overall toxicity endpoint data can be included within ToxML files, making it possible to assess the quality and detail of the data used in a model. The data file model allows the aggregation of experimental data to the compound level in the detail needed to support (Q)SAR work. The standard is published on a website together with tools to view, edit and download it.
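The kind of well-structured, compound-level XML record the abstract describes can be sketched as follows. The element and attribute names here are invented for illustration and do not reproduce the actual ToxML schema, which is published on the project's website.

```python
# Sketch of a hierarchical, compound-level toxicity record in XML,
# in the spirit of ToxML. Element names are hypothetical, not the
# real ToxML schema.
import xml.etree.ElementTree as ET

record = """\
<compound id="example-1">
  <structure format="smiles">c1ccccc1</structure>
  <study endpoint="bacterial_mutagenicity">
    <result strain="TA98" call="negative"/>
  </study>
</compound>"""

root = ET.fromstring(record)
# Experimental results are aggregated under a single compound element,
# so a (Q)SAR workflow can read structure and endpoint data together.
print(root.find("structure").text)            # the compound's structure
print(root.find("study/result").get("call"))  # the study outcome
```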
Long A.,Lhasa Ltd
Drug Discovery Today: Technologies | Year: 2013
Drug metabolism in silico is briefly discussed in terms of the importance of understanding the mechanistic basis of drug molecule biotransformation in vivo and its consequences in terms of changes in the properties of metabolites relative to those of the parent compound. A basic overview of an expert system is presented, along with how these general principles apply to expert systems for the prediction of xenobiotic metabolism. A brief history of the development of these systems is also presented. Methods for increasing both the sensitivity and selectivity of prediction are outlined and the benefits of using complementary prediction systems in a conjoint manner are proposed. © 2012 Elsevier Ltd. All rights reserved.
LHASA Ltd | Date: 2016-01-25
Computer software for use as an expert decision support system for the prediction of the chemical degradation pathways of organic compounds. Brochures exclusively relating to computer software and computer databases concerned with the chemical degradation pathways of organic compounds, the fates of chemicals in biological and environmental systems, or toxicological information. Scientific and technological services, namely, the development of predictive expert systems and databases concerned with toxicology, metabolism, chemical synthesis, environmental science and chemical degradation, research and database design relating thereto; research and development services, namely, research and development services in toxicology, metabolism, chemical synthesis, environmental science and chemical degradation; industrial analysis and research services, namely, industrial analysis relating to chemical synthesis, chemical degradation and toxicology; design and development of computer software exclusively relating to the chemical degradation pathways of organic compounds, the fates of chemicals in biological and environmental systems, or toxicological information; computer software installation and maintenance services.
LHASA Ltd | Date: 2016-03-18
Computer software for use in searching databases concerned with toxicity and toxicity data associated with chemical structures and with toxicity prediction. Printed matter, namely, instructional and teaching materials relating to toxicity and toxicity data associated with chemical structures and with toxicity prediction; books, printed periodicals, newsletters, pamphlets and brochures relating to toxicity and toxicity data associated with chemical structures and with toxicity prediction. Scientific and technological services and research and design relating thereto; research and development services, including research and development services in toxicology, chemical synthesis, environmental science and chemical degradation; design and development of computer software for use in storing and searching data concerned with the toxicity and toxicity data associated with chemical structures and with toxicity prediction; software installation and maintenance services.