Cincinnati, OH, United States


Hack C.E., Toxicology Excellence for Risk Assessment TERA | Hack C.E., U.S. Air Force | Haber L.T., Toxicology Excellence for Risk Assessment TERA | Maier A., Toxicology Excellence for Risk Assessment TERA | And 4 more authors.
Risk Analysis | Year: 2010

A Bayesian network model was developed to integrate diverse types of data to conduct an exposure-dose-response assessment for benzene-induced acute myeloid leukemia (AML). The network approach was used to evaluate and compare individual biomarkers and quantitatively link the biomarkers along the exposure-disease continuum. The network was used to perform the biomarker-based dose-response analysis, and various other approaches to the dose-response analysis were conducted for comparison. The network-derived benchmark concentration was approximately an order of magnitude lower than that from the usual exposure concentration versus response approach, which suggests that the presence of more information in the low-dose region (where changes in biomarkers are detectable but effects on AML mortality are not) helps inform the description of the AML response at lower exposures. This work provides a quantitative approach for linking changes in biomarkers of effect both to exposure information and to changes in disease response. Such linkage can provide a scientifically valid point of departure that incorporates precursor dose-response information without being dependent on the difficult issue of a definition of adversity for precursors. © 2010 Society for Risk Analysis.
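The core of the network approach described above is a chain from exposure through biomarkers to disease, with inference obtained by marginalizing over the intermediate nodes. As a minimal sketch of that idea only, the toy network below uses invented probability values (they are not from the benzene/AML assessment) and a single binary biomarker node:

```python
# Minimal sketch of a chain-structured network: an exposure node
# feeding a biomarker node feeding a disease node. All probability
# values are invented for illustration; they are NOT the benzene/AML
# assessment's values.

# P(biomarker elevated | exposure level) -- hypothetical
p_bio_given_exp = {"low": 0.10, "high": 0.60}
# P(AML | biomarker state) -- hypothetical
p_aml_given_bio = {False: 0.001, True: 0.010}

def p_aml(exposure: str) -> float:
    """Marginalize out the biomarker node: P(AML | exposure)."""
    pb = p_bio_given_exp[exposure]
    return pb * p_aml_given_bio[True] + (1.0 - pb) * p_aml_given_bio[False]

for exp_level in ("low", "high"):
    print(exp_level, p_aml(exp_level))
```

The point of the structure is the one the abstract makes: biomarker observations in the low-dose region constrain the disease node even where direct mortality data are uninformative.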


Hertzberg R.C., Biomathematics Consulting | Pan Y., Emory University | Pan Y., National Center for Environmental Health | Li R., Emory University | And 6 more authors.
Toxicology | Year: 2013

Mixture risk assessment is often hampered by the lack of dose-response information on the mixture being assessed, forcing reliance on component formulas such as dose addition. We present a four-step approach for evaluating chemical mixture data for consistency with dose addition for use in supporting a component-based mixture risk assessment. Following the concepts in the U.S. EPA mixture risk guidance (U.S. EPA, 2000a,b), toxicological interaction for a defined mixture (all components known) is departure from a clearly articulated definition of component additivity. For the common approach of dose additivity, the EPA guidance identifies three desirable characteristics, foremost of which is that the component chemicals are toxicologically similar. The other two characteristics are empirical: the mixture components have toxic potencies that are fixed proportions of each other (throughout the dose range of interest), and the mixture dose term in the dose additive prediction formula, which we call the combined prediction model (CPM), can be represented by a linear combination of the component doses. A consequent property of the proportional toxic potencies is that the component chemicals must share a common dose-response model, where only the dose coefficients depend on the chemical components. A further consequence is that the mixture data must be described by the same mathematical function ("mixture model") as the components, but with a distinct coefficient for the total mixture dose. The mixture response is predicted from the component dose-response curves by using the dose additive CPM and the prediction is then compared with the observed mixture results.
The four steps are to evaluate: (1) toxic proportionality, by determining how well the CPM matches the single chemical models regarding mean and variance; (2) fit of the mixture model to the mixture data; (3) agreement between the mixture data and the CPM prediction; and (4) consistency between the CPM and the mixture model. Because there are four evaluations instead of one, some involving many parameters or dose groups, there are more opportunities to reject statistical hypotheses about dose addition, so statistical adjustment for multiple comparisons is necessary. These four steps contribute different pieces of information about the consistency of the component and mixture data with the two empirical characteristics of dose additivity. We examine how this four-step approach can show empirical support for dose addition as a predictor for an untested mixture in a screening-level risk assessment. The decision whether to apply dose addition should be based on all four of those evidentiary pieces as well as toxicological understanding of these chemicals, and should include interpretations of the numerical and toxicological issues that arise during the evaluation. This approach is demonstrated with neurotoxicity data on carbamate mixtures. © 2012 Elsevier Ireland Ltd.
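The CPM described in the abstract can be sketched in a few lines: when all components share a common dose-response form and differ only in their dose coefficients, the dose-additive prediction is that common function applied to a potency-weighted linear combination of the component doses. The exponential model form and the coefficient values below are hypothetical placeholders, not the carbamate results:

```python
import math

# Hedged sketch of a dose-additive combined prediction model (CPM).
# Each component is assumed to follow the same one-parameter
# exponential dose-response form, differing only in its dose
# coefficient; the mixture response is then predicted from a linear
# combination of component doses. Coefficients are invented.

def response(dose: float, b: float) -> float:
    """Common single-chemical model: fraction responding at a dose."""
    return 1.0 - math.exp(-b * dose)

coeffs = {"chem_A": 0.05, "chem_B": 0.20}   # per mg/kg, hypothetical

def cpm_response(doses: dict) -> float:
    """Dose-additive CPM: apply the shared model to the
    potency-weighted sum of component doses."""
    combined = sum(coeffs[c] * d for c, d in doses.items())
    return 1.0 - math.exp(-combined)

mix = {"chem_A": 10.0, "chem_B": 2.5}
print(cpm_response(mix))   # to be compared with observed mixture data
```

Step (3) of the four-step evaluation then amounts to comparing `cpm_response` at the tested mixture doses against the observed mixture responses, with appropriate statistical adjustment.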


Rider C.V., National Health Research Institute | Dourson M.L., Toxicology Excellence for Risk Assessment TERA | Hertzberg R.C., Biomathematics Consulting | Mumtaz M.M., Agency for Toxic Substances and Disease Registry | And 2 more authors.
Toxicological Sciences | Year: 2012

The role of nonchemical stressors in modulating the human health risk associated with chemical exposures is an area of increasing attention. On 9 March 2011, a workshop titled "Approaches for Incorporating Nonchemical Stressors into Cumulative Risk Assessment" took place during the 50th Anniversary Annual Society of Toxicology Meeting in Washington D.C. Objectives of the workshop included describing the current state of the science from various perspectives (i.e., regulatory, exposure, modeling, and risk assessment) and presenting expert opinions on currently available methods for incorporating nonchemical stressors into cumulative risk assessments. Herein, distinct frameworks for characterizing exposure to, joint effects of, and risk associated with chemical and nonchemical stressors are discussed. Published by Oxford University Press 2012.


Reichard J.F., University of Cincinnati | Haber L.T., Toxicology Excellence for Risk Assessment TERA
Food and Chemical Toxicology | Year: 2016

The purpose of this work is to systematically consider the data relating to the mode of action (MOA) for the effects of industrially produced trans fatty acid (iTFA) on plasma low-density lipoprotein (LDL) levels. The hypothesized MOA is composed of two key events: increased LDL production and decreased LDL clearance. A substantial database supports this MOA, although the key events are likely to be interdependent, rather than sequential. Both key events are functions of nonlinear biological processes including rate-limited clearance, receptor-mediated transcription, and both positive and negative feedback regulation. Each key event was evaluated based on weight-of-evidence analysis and for human relevance. We conclude that the data are inadequate for a detailed dose-response analysis in the context of the evolved Bradford Hill considerations; however, the weight of evidence is strong and the overall shape of the dose-response curves for markers of the key events and the key determinants of those relationships is well understood in many cases and is nonlinear. Feedback controls are responsible for maintaining homeostasis of cholesterol and triglyceride levels and underlie both of the key events, resulting in a less-than-linear or thresholded relationship between TFA and LDL-C. The inconsistencies and gaps in the database are discussed. © 2016 Elsevier Ltd.


Dourson M.L., Toxicology Excellence for Risk Assessment TERA | York R.G., R G York & Associates LLC
Regulatory Toxicology and Pharmacology | Year: 2016

The safety of food ingredients will be assessed in the 21st century by a mixture of traditional methods, such as the “safe” dose concept, which is thought to be an accurate but imprecise estimation of a dose below the population threshold for adverse effect, and contemporary methods, such as the Benchmark Dose (BMD), Chemical Specific Adjustment Factors (CSAF), physiologically-based pharmacokinetic models, and biologically-informed dose-response modeling. New research on the horizon related to Toxicology 21 may also improve these risk assessment methods, or suggest new ones. These traditional, contemporary and new methods and research are briefly described. © 2016 Elsevier Inc.
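The Benchmark Dose concept named above is, at its core, an inversion of a fitted dose-response model: pick a benchmark response (BMR, commonly 10% extra risk) and solve for the dose that produces it. As an illustration only, the sketch below inverts a one-parameter exponential model with an invented slope; real BMD work fits models to data and reports a lower confidence bound (BMDL) as well:

```python
import math

# Illustrative sketch of the Benchmark Dose (BMD) idea: the dose
# producing a specified benchmark response (BMR), here extra risk
# under a one-parameter exponential model with zero background.
# The slope value is hypothetical, not from any real assessment.

def extra_risk(dose: float, slope: float) -> float:
    """Extra risk at a dose for the exponential model."""
    return 1.0 - math.exp(-slope * dose)

def bmd(bmr: float, slope: float) -> float:
    """Invert the model: the dose at which extra risk equals the BMR."""
    return -math.log(1.0 - bmr) / slope

slope = 0.02                 # hypothetical, per mg/kg-day
print(bmd(0.10, slope))      # BMD10 for this toy model
```

In practice the BMD is computed from a model fitted to the study data, and the BMDL (not the BMD itself) usually serves as the point of departure.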


DeBord D.G., U.S. National Institute for Occupational Safety and Health | Burgoon L., U.S. Environmental Protection Agency | Edwards S.W., U.S. Environmental Protection Agency | Haber L.T., Toxicology Excellence for Risk Assessment TERA | And 5 more authors.
Journal of Occupational and Environmental Hygiene | Year: 2015

In a recent National Research Council document, new strategies for risk assessment were described to enable more accurate and quicker assessments.(1) This report suggested that evaluating individual responses through increased use of biomonitoring could improve dose-response estimations. Identification of specific biomarkers may be useful for diagnostics or risk prediction as they have the potential to improve exposure assessments. This paper discusses systems biology, biomarkers of effect, and computational toxicology approaches and their relevance to the occupational exposure limit setting process. The systems biology approach evaluates the integration of biological processes and how disruption of these processes by chemicals or other hazards affects disease outcomes. This type of approach could provide information used in delineating the mode of action of the response or toxicity, and may be useful to define the low adverse and no adverse effect levels. Biomarkers of effect are changes measured in biological systems and are considered to be preclinical in nature. Advances in computational methods and experimental omics methods that allow the simultaneous measurement of families of macromolecules such as DNA, RNA, and proteins in a single analysis have made these systems approaches feasible for broad application. The utility of the information for risk assessments from omics approaches has shown promise and can provide information on mode of action and dose-response relationships. As these techniques evolve, estimation of internal dose and response biomarkers will be a critical test of these new technologies for application in risk assessment strategies. While proof-of-concept studies have been conducted that provide evidence of their value, challenges with standardization and harmonization still need to be overcome before these methods are used routinely. © 2015 Published with license by Taylor and Francis.


PubMed | Independent Consultant, Toxicology Excellence for Risk Assessment TERA and BioFortis Innovation Services
Type: Journal Article | Journal: Food and chemical toxicology : an international journal published for the British Industrial Biological Research Association | Year: 2016

We conducted a meta-regression of controlled clinical trial data to investigate quantitatively the relationship between dietary intake of industrial trans fatty acids (iTFA) and increased low-density lipoprotein cholesterol (LDL-C). Previous regression analyses included insufficient data to determine the nature of the dose response in the low-dose region and have nonetheless assumed a linear relationship between iTFA intake and LDL-C levels. This work contributes to the previous work by 1) including additional studies examining low-dose intake (identified using an evidence mapping procedure); 2) investigating a range of curve shapes, including both linear and nonlinear models; and 3) using Bayesian meta-regression to combine results across trials. We found that, contrary to previous assumptions, the linear model does not acceptably fit the data, while the nonlinear, S-shaped Hill model fits the data well. Based on a conservative estimate of the degree of intra-individual variability in LDL-C (0.1 mmol/L), as an estimate of a change in LDL-C that is not adverse, a change in iTFA intake of 2.2% of energy intake (%en) (corresponding to a total iTFA intake of 2.2-2.9%en) does not cause adverse effects on LDL-C. The iTFA intake associated with this change in LDL-C is substantially higher than the average iTFA intake (0.5%en).
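The contrast the abstract draws between a linear model and an S-shaped Hill model can be made concrete with the Hill function itself: when the Hill coefficient exceeds 1, the curve is nearly flat at low intakes and steepens later, which is exactly where the linear assumption overstates low-dose effects. The parameter values below are invented for illustration; the fitted posterior values from the meta-regression are not reproduced here:

```python
# Sketch of the S-shaped Hill model favored by the meta-regression
# for the iTFA/LDL-C relationship. All parameter values are
# hypothetical illustrations, not the paper's fitted estimates.

def hill(intake_pct_en: float, emax: float, ec50: float, n: float) -> float:
    """Change in LDL-C (mmol/L) as a Hill function of iTFA intake (%en)."""
    x = intake_pct_en ** n
    return emax * x / (ec50 ** n + x)

# With n > 1 the curve is nearly flat at low intake, unlike a line:
for intake in (0.5, 2.0, 6.0):
    print(intake, hill(intake, emax=1.0, ec50=4.0, n=3.0))
```

A flat-then-steep shape of this kind is what allows a nonzero intake level to be associated with a change in LDL-C smaller than intra-individual variability.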


PubMed | Toxicology Excellence for Risk Assessment TERA
Type: Journal Article | Journal: Regulatory toxicology and pharmacology : RTP | Year: 2013

A biomathematical model was previously developed to describe the long-term clearance and retention of particles in the lungs of coal miners. The model structure was evaluated and parameters were estimated in two data sets, one from the United States and one from the United Kingdom. The three-compartment model structure consists of deposition of inhaled particles in the alveolar region, competing processes of either clearance from the alveolar region or translocation to the lung interstitial region, and very slow, irreversible sequestration of interstitialized material in the lung-associated lymph nodes. Point estimates of model parameter values were estimated separately for the two data sets. In the current effort, Bayesian population analysis using Markov chain Monte Carlo simulation was used to recalibrate the model while improving assessments of parameter variability and uncertainty. When model parameters were calibrated simultaneously to the two data sets, agreement between the derived parameters for the two groups was very good, and the central tendency values were similar to those derived from the deterministic approach. These findings are relevant to the proposed update of the ICRP human respiratory tract model with revisions to the alveolar-interstitial region based on this long-term particle clearance and retention model.
