National Center for Computational Toxicology

Pumpkin Center, NC, United States


Van Vliet E.,Innovitox Consultation and Services | Daneshian M.,University of Konstanz | Beilmann M.,Boehringer Ingelheim | Davies A.,Irish National Center for High Content Screening and Analysis | And 14 more authors.
Altex | Year: 2014

High content imaging combines automated microscopy with image analysis approaches to simultaneously quantify multiple phenotypic and/or functional parameters in biological systems. The technology has become an important tool in the fields of safety sciences and drug discovery, because it can be used for mode-of-action identification, determination of hazard potency and the discovery of toxicity targets and biomarkers. In contrast to conventional biochemical endpoints, high content imaging provides insight into the spatial distribution and dynamics of responses in biological systems. This allows the identification of signaling pathways underlying cell defense, adaptation, toxicity and death. Therefore, high content imaging is considered a promising technology to address the challenges of the 'Toxicity testing in the 21st century' approach. Currently, high content imaging technologies are frequently applied in academia for mechanistic toxicity studies and in the pharmaceutical industry for the ranking and selection of lead drug compounds or to identify/confirm mechanisms underlying effects observed in vivo. A recent workshop gathered scientists working on high content imaging in academia, the pharmaceutical industry and regulatory bodies with the objective of compiling the state-of-the-art of the technology in the different institutions. Together they defined technical and methodological gaps, proposed quality control measures and performance standards, highlighted cell sources and new readouts, and discussed future requirements for regulatory implementation. This review summarizes the discussion, proposed solutions and recommendations of the specialists contributing to the workshop.
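
As an illustration of the per-cell, multi-parameter readout described above, the sketch below segments nuclei in a nuclear-stain channel and reports simple morphology and marker-intensity features per cell. It is a minimal example using scikit-image, not any specific pipeline discussed at the workshop; the image arrays, threshold choice, and feature set are assumptions for illustration.

```python
# Minimal sketch of a high content imaging readout: segment nuclei in a
# nuclear-stain channel and quantify per-cell morphology plus a second
# marker's intensity. Image arrays are hypothetical; real HCI pipelines
# add QC, illumination correction, and many more features.
import numpy as np
from skimage import filters, measure, morphology

def per_cell_features(nuclei_img: np.ndarray, marker_img: np.ndarray) -> list[dict]:
    """Return one feature dictionary per segmented nucleus."""
    mask = nuclei_img > filters.threshold_otsu(nuclei_img)   # crude foreground split
    mask = morphology.remove_small_objects(mask, min_size=50)
    labels = measure.label(mask)                              # one label per nucleus
    features = []
    for region in measure.regionprops(labels, intensity_image=marker_img):
        features.append({
            "area": region.area,                              # morphology readout
            "eccentricity": region.eccentricity,
            "marker_mean": region.mean_intensity,             # functional readout
        })
    return features
```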


Joubert B.R.,U.S. Environmental Protection Agency | Joubert B.R.,U.S. National Institutes of Health | Reif D.M.,National Center for Computational Toxicology | Edwards S.W.,National Health and Environmental Effects Research Laboratory | And 5 more authors.
BMC Medical Genetics | Year: 2011

Background: Asthma and allergy represent complex phenotypes, which disproportionately burden ethnic minorities in the United States. Strong evidence for genomic factors predisposing subjects to asthma/allergy is available. However, methods to utilize this information to identify high risk groups are variable and replication of genetic associations in African Americans is warranted. Methods: We evaluated 41 single nucleotide polymorphisms (SNPs) and a deletion corresponding to 11 genes demonstrating association with asthma in the literature, for association with asthma, atopy, testing positive for food allergens, eosinophilia, and total serum IgE among 141 African American children living in Detroit, Michigan. Independent SNP and haplotype associations were investigated for association with each trait, and subsequently assessed in concert using a genetic risk score (GRS). Results: Statistically significant associations with asthma were observed for SNPs in GSTM1, MS4A2, and GSTP1 genes, after correction for multiple testing. Chromosome 11 haplotype CTACGAGGCC (corresponding to MS4A2 rs574700, rs1441586, rs556917, rs502581, rs502419 and GSTP1 rs6591256, rs17593068, rs1695, rs1871042, rs947895) was associated with a nearly five-fold increase in the odds of asthma (Odds Ratio (OR) = 4.8, p = 0.007). The GRS was significantly associated with higher odds of asthma (OR = 1.61, 95% Confidence Interval = 1.21, 2.13; p = 0.001). Conclusions: Variation in genes associated with asthma in predominantly non-African ethnic groups contributed to increased odds of asthma in this African American study population. Evaluating all significant variants in concert helped to identify the highest risk subset of this group. © 2011 Joubert et al; licensee BioMed Central Ltd.
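
To make the genetic risk score concrete, here is a hedged sketch of how an unweighted GRS can be computed (sum of risk alleles across candidate SNPs) and tested against case status with logistic regression. The genotype coding and toy data are hypothetical and are not taken from the study.

```python
# Hedged sketch of an unweighted genetic risk score (GRS): count risk
# alleles across candidate SNPs and test the score against asthma case
# status with logistic regression. The 0/1/2 risk-allele coding and the
# example data are hypothetical, not the study's actual data.
import numpy as np
import statsmodels.api as sm

def genetic_risk_score(genotypes: np.ndarray) -> np.ndarray:
    """genotypes: (n_subjects, n_snps) counts of risk alleles (0, 1, or 2)."""
    return genotypes.sum(axis=1)

# toy example: 6 subjects, 3 SNPs, binary asthma status (1 = case)
geno = np.array([[1, 0, 1], [2, 2, 1], [2, 1, 1],
                 [1, 1, 1], [0, 0, 1], [1, 1, 0]])
asthma = np.array([1, 1, 1, 0, 0, 0])

grs = genetic_risk_score(geno)
model = sm.Logit(asthma, sm.add_constant(grs)).fit(disp=0)
odds_ratio = np.exp(model.params[1])      # OR per additional risk allele
print(f"OR per risk allele: {odds_ratio:.2f}")
```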


Paun A.,Louisiana Tech University | Paun A.,Technical University of Madrid | Paun A.,Romanian National Institute for Research and Development for Biological Sciences | Paun M.,Louisiana Tech University | And 4 more authors.
2011 IEEE 1st International Conference on Computational Advances in Bio and Medical Sciences, ICCABS 2011 | Year: 2011

We will describe the memory enhancement and give a simple three-reaction model to illustrate the differences between our technique and a continuous, concentration-based approach using a system of ordinary differential equations. Furthermore, we provide our results from the modeling of two well-known systems: the Lotka-Volterra predator-prey model and a circadian rhythm model. For these models, we provide the results of our simulation technique in comparison to results from ordinary differential equations and the Gillespie Algorithm. We show that our algorithm, while being faster than Gillespie's approach, is capable of generating oscillatory behavior where ordinary differential equations do not. © 2011 IEEE.
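
For context, the sketch below implements the baseline Gillespie stochastic simulation algorithm for the Lotka-Volterra predator-prey system, the comparison point named in the abstract; the authors' memory-enhanced technique itself is not reproduced here. Rate constants and initial populations are illustrative assumptions.

```python
# Baseline Gillespie (SSA) simulation of the Lotka-Volterra predator-prey
# system: prey birth, predation (prey -> predator), predator death.
# Rate constants and initial populations are illustrative only.
import random

def gillespie_lotka_volterra(prey=100, pred=50, t_end=20.0,
                             k_birth=1.0, k_predation=0.01, k_death=0.5):
    t, traj = 0.0, [(0.0, prey, pred)]
    while t < t_end:
        # propensities for: prey birth, predation, predator death
        a = [k_birth * prey, k_predation * prey * pred, k_death * pred]
        a_total = sum(a)
        if a_total == 0:                       # both populations extinct
            break
        t += random.expovariate(a_total)       # exponential waiting time
        r = random.uniform(0, a_total)         # pick which reaction fires
        if r < a[0]:
            prey += 1
        elif r < a[0] + a[1]:
            prey -= 1
            pred += 1
        else:
            pred -= 1
        traj.append((t, prey, pred))
    return traj

trajectory = gillespie_lotka_volterra()
print(f"{len(trajectory)} reaction events simulated")
```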


Gangwal S.,National Center for Computational Toxicology | Brown J.S.,U.S. Environmental Protection Agency | Wang A.,National Center for Computational Toxicology | Houck K.A.,National Center for Computational Toxicology | And 3 more authors.
Environmental Health Perspectives | Year: 2011

Background: Little justification is generally provided for selection of in vitro assay testing concentrations for engineered nanomaterials (ENMs). Selection of concentration levels for hazard evaluation based on real-world exposure scenarios is desirable. Objectives: Our goal was to use estimates of lung deposition after occupational exposure to nanomaterials to recommend in vitro testing concentrations for the U.S. Environmental Protection Agency's ToxCast™ program. Here, we provide testing concentrations for carbon nanotubes (CNTs) and titanium dioxide (TiO2) and silver (Ag) nanoparticles (NPs). Methods: We reviewed published ENM concentrations measured in air in manufacturing and R&D (research and development) laboratories to identify input levels for estimating ENM mass retained in the human lung using the multiple-path particle dosimetry (MPPD) model. Model input parameters were individually varied to estimate alveolar mass retained for different particle sizes (5-1,000 nm), aerosol concentrations (0.1 and 1 mg/m3), aspect ratios (2, 4, 10, and 167), and exposure durations (24 hr and a working lifetime). The calculated lung surface concentrations were then converted to in vitro solution concentrations. Results: Modeled alveolar mass retained after 24 hr is most affected by activity level and aerosol concentration. Alveolar retention for Ag and TiO2 NPs and CNTs for a working-lifetime (45 years) exposure duration is similar to high-end concentrations (~30-400 μg/mL) typical of in vitro testing reported in the literature. Conclusions: Analyses performed are generally applicable for providing ENM testing concentrations for in vitro hazard screening studies, although further research is needed to improve the approach. Understanding the relationship between potential real-world exposures and in vitro test concentrations will facilitate interpretation of toxicological results.
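
The conversion from a modeled lung burden to an in vitro testing concentration can be illustrated with a short, hedged sketch: take a retained alveolar mass (as would come from the MPPD model, which is not reproduced here), express it as a lung surface concentration, and translate that to a solution concentration assuming complete settling onto the culture-well bottom. All numeric values below (alveolar surface area, well geometry, retained mass) are illustrative assumptions, not the paper's inputs.

```python
# Hedged sketch of the dose-conversion step described in the abstract.
# Every numeric value is an illustrative assumption.

ALVEOLAR_SURFACE_AREA_CM2 = 102 * 1e4   # ~102 m^2 human alveolar surface (assumed)

def lung_surface_concentration(retained_mass_ug: float) -> float:
    """Retained alveolar mass (ug) -> lung surface concentration (ug/cm^2)."""
    return retained_mass_ug / ALVEOLAR_SURFACE_AREA_CM2

def equivalent_in_vitro_concentration(surface_conc_ug_cm2: float,
                                      well_area_cm2: float = 0.32,   # 96-well plate (assumed)
                                      media_volume_ml: float = 0.1) -> float:
    """Match the same ug/cm^2 delivered dose in a culture well, in ug/mL,
    assuming all suspended particles settle onto the well bottom."""
    return surface_conc_ug_cm2 * well_area_cm2 / media_volume_ml

# example: 10 mg (10,000 ug) retained over a working lifetime (illustrative)
surface = lung_surface_concentration(10_000.0)
print(f"{surface:.4f} ug/cm^2 -> {equivalent_in_vitro_concentration(surface):.3f} ug/mL")
```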


Segal D.,National Center for Environmental Assessment | Makris S.L.,National Center for Environmental Assessment | Kraft A.D.,National Center for Environmental Assessment | Bale A.S.,National Center for Environmental Assessment | And 8 more authors.
Regulatory Toxicology and Pharmacology | Year: 2015

Regulatory agencies often utilize results from peer reviewed publications for hazard assessments. A problem in doing so is the lack of well-accepted tools to objectively, efficiently and systematically assess the quality of published toxicological studies. Herein, we evaluated the publicly available software-based ToxRTool (Toxicological data Reliability assessment Tool) for use in human health hazard assessments. The ToxRTool was developed by the European Commission's Joint Research Center in 2009. It builds on Klimisch categories, a rating system established in 1997, by providing additional criteria and guidance for assessing the reliability of toxicological studies. It also transparently documents the study-selection process. Eight scientists used the ToxRTool to rate the same 20 journal articles on thyroid toxicants. Results were then compared using the Finn coefficient and "AC1" to determine inter-rater consistency. Ratings were most consistent for high-quality journal articles, but less consistent as study quality decreased. Primary reasons for inconsistencies were that some criteria were subjective and some were not clearly described. It was concluded, however, that the ToxRTool has potential and, with refinement, could provide a more objective approach for screening published toxicology studies for use in health risk evaluations, although the ToxRTool ratings are primarily based on study reporting quality. © 2015.
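
For readers unfamiliar with the agreement statistics mentioned, the sketch below computes Gwet's AC1 for a multi-rater, multi-category design, following the commonly cited formulation (Gwet, 2008); the toy ratings are hypothetical and are not the workshop's actual scores.

```python
# Sketch of the Gwet AC1 inter-rater agreement coefficient named in the
# abstract, for n items each rated by the same number of raters into one
# of q categories. The toy ratings matrix is hypothetical.
import numpy as np

def gwet_ac1(ratings: np.ndarray, categories: list) -> float:
    """ratings: (n_items, n_raters) matrix of category labels."""
    n_items, n_raters = ratings.shape
    q = len(categories)
    # r[i, k] = number of raters assigning category k to item i
    r = np.array([[(ratings[i] == c).sum() for c in categories]
                  for i in range(n_items)])
    # observed agreement: chance that two raters of the same item agree
    pa = ((r * (r - 1)).sum(axis=1) / (n_raters * (n_raters - 1))).mean()
    # chance agreement under Gwet's model
    pi_k = (r / n_raters).mean(axis=0)
    pe = (pi_k * (1 - pi_k)).sum() / (q - 1)
    return (pa - pe) / (1 - pe)

# toy example: 5 studies rated "R"(eliable) / "N"(ot reliable) by 3 raters
ratings = np.array([["R", "R", "R"],
                    ["R", "R", "N"],
                    ["N", "N", "N"],
                    ["R", "R", "R"],
                    ["N", "R", "N"]])
print(f"AC1 = {gwet_ac1(ratings, ['R', 'N']):.3f}")
```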

