Bias, Conflict of Interest, Ignorance, and Uncertainty
Published in Ted W. Simon, Environmental Risk Assessment, 2019
In 2012, the National Research Council released a report titled Exposure Science in the 21st Century: A Vision and a Strategy.124 The report recommended that information on biomarkers be combined with data derived from remote sensing, Global Positioning System, satellite imaging, and other sources using an informatics approach. The development of the informatics necessary to combine these diverse data in a meaningful way was emphasized in the document. EPA’s ExpoCast program was recently developed as a complement to ToxCast™, with the goal of developing high-throughput exposure measures that integrate many data sources.
Using the Matrix to bridge the epidemiology/risk assessment gap: a case study of 2,4-D
Published in Critical Reviews in Toxicology, 2021
Carol J. Burns, Judy S. LaKind
The fields of environmental epidemiology and exposure science have provided consequential data for use in meta-analyses, systematic reviews, and ultimately, public health decision-making. Yet while activities such as the development of reference doses have been based on data from epidemiology studies, it is often the case that these data are secondary to toxicological data or are judged to be insufficient to examine exposure-outcome associations (Nachman et al. 2011; EFSA Panel on Plant Protection Products et al. 2017; Deener et al. 2018). While calls have been made for improving the utility of epidemiology for risk assessment, hurdles remain (Burns et al. 2014; Christensen et al. 2015; Birnbaum et al. 2016). In an effort to move the needle on this issue, an international, multi-sector group with expertise in risk assessment, toxicology, epidemiology, and exposure science developed the Matrix (Table 1), a structured approach to bridging the risk assessment-epidemiology gap (Burns et al. 2019).
Is current risk assessment of non-genotoxic carcinogens protective?
Published in Critical Reviews in Toxicology, 2018
Hedwig M. Braakhuis, Wout Slob, Evelyn D. Olthof, Gerrit Wolterink, Edwin P. Zwart, Eric R. Gremmer, Emiel Rorije, Jan van Benthem, Ruud Woutersen, Jan Willem van der Laan, Mirjam Luijten
Currently, much effort is invested in transforming (cancer) risk assessment approaches by incorporating enhanced mechanistic understanding, toxicokinetics, and interactions with biological processes (NRC 2007; Committee on Human and Environmental Exposure Science in the 21st Century et al. 2012; Thomas et al. 2013; Embry et al. 2014; Pastoor et al. 2014; Simon et al. 2014; Burden et al. 2015; National Academies of Sciences, Engineering, and Medicine et al. 2017). It will take time before such advanced information is available for a large number of routinely assessed chemicals; until then, in vivo toxicity studies may remain necessary. Our results indicate that forgoing carcinogenicity studies and basing HBGVs on subchronic toxicity studies would be adequately protective, provided that a cancer risk of roughly 1% (in the sensitive subpopulation, after lifelong exposure) is considered acceptable. In that case, carcinogenicity studies could be avoided in most cases (as under REACH). However, if this risk level is not considered acceptable, other solutions need to be found, some of which may be costly. One option is to apply an (arbitrary) additional assessment factor. Alternatively, carcinogenicity studies would need to be conducted at an even larger scale (e.g. with an increased number of animals) than currently required, an option that appears neither realistic nor desirable. Moreover, it would also require fully abandoning the NOAEL approach, as that approach does not allow for estimating a dose with a risk below 1%.
Robert T. Drew, Ph.D. (1936–2018)
Published in Inhalation Toxicology, 2018
Bob’s doctoral research focused on the relationship between areas of high natural background radiation in Minas Gerais, Brazil, and the occurrence of lung and other cancers in the local populations. However, it was his training at the NYU Center for Environmental Medicine in Sterling Forest, NY with Drs. Norton Nelson, Sid Laskin, and Marvin Kushner that put Bob at the epicenter of the burgeoning science of inhalation toxicology. The technology of rodent inhalation exposure systems and associated generation/measurement methods was sharpened in this hotbed of inhalation icons as public concern over urban air pollution and the risk of lung cancer grew. Bob published several papers on inhalation methods and on studies of the carcinogenicity of combustion organics, including benzo(a)pyrene and other polycyclic organics, as well as the potential for these complex organics to interact synergistically with other pollutants, such as SO2. He well appreciated the reality of mixed exposures in the real world, where interactions may enhance lung cancer outcomes – a theme he often returned to through his many years of inhalation testing. These early days of inhalation exposure science propelled inhalation toxicology to the forefront of the expanding discipline of toxicology as air pollution in the late ’60s and ’70s drew political action. The work emerging from NYU at this time provided innovative technologies for pollutant generation and measurement, and exposure designs that led to the standardization of inhalation exposure scenarios carried out to this day under the auspices of the National Toxicology Program and regulated industrial testing operations.