
      The Office of Health Assessment and Translation: A Problem-Solving Resource for the National Toxicology Program

      editorial


          Abstract

The National Toxicology Program (NTP) Center for the Evaluation of Risks to Human Reproduction (CERHR) was established in 1998. CERHR served as an environmental health resource providing in-depth scientific assessments of effects on reproduction and development caused by agents to which humans are exposed. To our knowledge, CERHR was the only resource of its kind, producing evaluations that considered toxicity findings in the context of current human exposures to derive “level-of-concern” conclusions. This qualitative integration step is what distinguished CERHR documents from more traditional hazard evaluations prepared by other agencies. When CERHR was established, the focus on reproduction and development was appropriate because of a strong interest in these health outcomes by the public, regulatory and health agencies, and the scientific community. In addition, a rationale for creating CERHR was the sense of a lack of uniformity across state and federal agencies in interpreting experimental animal studies of reproduction and development. CERHR was envisioned as a mechanism to apply a consistent strategy for interpreting these data.

Although this need remains, we believe that the approaches used for CERHR evaluations should also be extended to other important health outcomes. Many chemicals display more than one type of toxicity, that is, carcinogens are often immunotoxicants, and reproductive and developmental toxicants may influence many endocrine-sensitive systems. A strict focus on reproductive and developmental end points evaluated in the context of current human exposures may not result in the most health protective levels of concern, and could be confusing to the public. From a public health perspective, understanding the implications of current human exposures should include consideration of all relevant health effects. Also, the NTP and the broader toxicology community need to confront the challenging scientific questions involved in utilizing information from the Toxicology in the 21st Century initiative (Collins et al. 2008). To do this we need a mechanism to systematically explore linkages between “toxicity pathways” and disease outcomes.

To provide this, CERHR has spent the last 2 years in transition, laying the groundwork to become a more flexible scientific analysis program, while continuing to be grounded and recognized as a unique and important public health resource for the interpretation of reproductive and developmental hazards to humans. This evolution of CERHR is a response to the changing and increasing demands on both the NTP analysis and research programs. “What does it mean?” is a question we increasingly want to answer, as our research and testing tools become more sophisticated and mechanistically based.

A change in CERHR’s scope will also bring its work more in line with two recent initiatives established within the NTP that have mandates to address a broad range of health effects (Bucher 2008). In 2007 the NTP established a biomolecular screening program to administer its High Throughput Screening (HTS) Initiative in collaboration with our Tox21 partners (Schmidt 2009). This program takes advantage of technological advances in molecular biology and computer science to screen for mechanistic targets or “toxicity pathways” considered critical to adverse health effects.
The host susceptibility program was also established in 2007 to study the genetic basis for differences in susceptibility that may lead to a better understanding of how substances in our environment may be hazardous to some individuals and not to others. On 11–13 January 2011, CERHR launched its expanded role by convening a diverse group of experts in toxicology, epidemiology, bioinformatics, and endocrinology to assess the strength of the literature linking selected environmental agents and exposures with diabetes and obesity (NTP 2011). Consideration was given to an array of information ranging from epidemiological findings and experimental animal and mechanistic data to screens of toxicity and disease pathways using HTS and literature curation methodologies. The use of several new analysis tools revealed novel linkages between a number of environmental agents and obesity or diabetes. These exciting findings are now being collated for publication. To fulfill its mission, the NTP is developing more innovative and flexible approaches for information and data integration, both across different programs within the NTP and across the different types of data that are generated and utilized (i.e., mechanistic or high throughput; “hypothesis-driven” animal studies of the type undertaken by National Institute of Environmental Health Sciences (NIEHS)-funded extramural grantees; and toxicology studies conducted for the purpose of safety assessment). Recent experience with bisphenol A highlights the public’s confusion and the waste of scientific resources that can occur when these different types of scientific literature are developed on parallel, but separate, paths (Bucher 2009). The evolution of CERHR is an important part of this information integration effort, and CERHR’s new role calls for a new name: the Office of Health Assessment and Translation. Under the leadership of Kristina Thayer, the Office of Health Assessment and Translation will be the NTP focal point for the thoughtful and deliberative integration of relevant information of all types in health assessments for the protection of public health.


          Most cited references (4)


          TOX 21: New Dimensions of Toxicity Testing

On the ground floor of the National Institutes of Health Chemical Genomics Center (NCGC) in Rockville, Maryland, a $10-million automated laboratory spends all day and night screening chemicals at speeds no team of human researchers could ever match. In a week, depending on the nature of the assay, it can yield up to 2.2 million molecular data points derived from thousands of chemicals tested at 15 concentrations each. Is this the new face of toxicology? Many experts say the answer could be yes. High-throughput screening tools such as the NCGC’s robotic system—combined with a growing assortment of in vitro assays and computational methods—are revealing how chemicals interact with biologic targets. Scientists increasingly believe these tools could generate more accurate assessments of human toxicity risk than those predicted by animal tests now. What’s more, in vitro analytical approaches are seen as the best hope for evaluating the enormous backlog of untested chemicals in commerce. Estimates vary, but tens of thousands of industrial chemicals are used in consumer products without any knowledge of their potential toxicity. Meanwhile, it takes years and millions of dollars to assess risks for a single chemical using animal testing. “In almost all aspects, this looks like a paradigm shift in the field,” says John Bucher, associate director of the National Toxicology Program (NTP). “It’s a major change to move from using studies in animals, with which we’re comfortable, to relying mainly on results from biochemical or cell-based assays to make health policy decisions. This is a totally different approach that provides a different kind of information.”

The Tox21 Partnership

Enabled by new technology, the NTP, the NCGC, and the U.S. Environmental Protection Agency (EPA) are partnering to advance the state of toxicity testing. Specifically, the partners seek to identify new mechanisms of chemical activity in cells, to prioritize the backlog of untested chemicals for more extensive evaluations, and to develop better predictive models of human response to toxicants. Formalized last year in a Memorandum of Understanding, the partnership, dubbed Tox21, responds to a challenge made by the National Research Council (NRC) in its 2007 report Toxicity Testing in the 21st Century: A Vision and a Strategy. This report called for transforming toxicology “from a system based on whole-animal testing to one founded primarily on in vitro methods that evaluate changes in biologic processes using cells, cell lines, or cellular components, preferably of human origin.” In March 2009, the EPA published its own Tox21 agenda, The U.S. Environmental Protection Agency’s Strategic Plan for Evaluating the Toxicity of Chemicals, which asserts that “the explosion of new scientific tools in computational, informational, and molecular sciences offers great promise to . . . strengthen toxicity testing and risk assessment approaches.” The concept of adding more mechanistic data to risk assessment isn’t new. Before Tox21, physiologically based pharmacokinetic (PBPK) models, toxicogenomics, and related approaches were already making risk assessment more mechanistically based. But that research didn’t necessarily translate into changes in regulatory policies that govern human exposure, argues Lorenz Rhomberg, a principal with Gradient Corporation, a risk assessment consulting firm in Cambridge, Massachusetts.
Despite the availability of mechanistic data, health officials at the EPA have been reluctant to use these data in setting exposure standards because in many cases they would justify higher allowable exposures than those suggested by more conservative default assumptions. Instead, the EPA relies more often on conservative default assumptions about how chemicals affect human beings. “EPA goes by precedent and does things as it did in the past so as to not be arbitrary,” Rhomberg explains. “So, there’s a lot of inertia in the system.” Robert Kavlock, director of the EPA National Center for Computational Toxicology, says the main difference between Tox21 and prior molecular research in toxicology is one of scale. Scientists have generally focused on hypothesis-driven investigations, such as how a chemical interacts with a specific cell target assumed to play a role in toxicity, he explains. Tox21, on the other hand, relies on unbiased screening methods that don’t assume any prior knowledge about what a chemical might do in the cell. Those investigations ideally will reveal entirely new molecular networks that coordinate toxicity, he says. Kavlock emphasizes that with its new strategy the EPA is demonstrating a willingness to take mechanistic data seriously. “Tox21 was produced with input from senior members across all the EPA offices,” he says. “There’s an explicit recognition that we’re in a scientific transition and that the business part of the agency needs to come along with it.”

A New Focus on Pathways

Tox21’s essential premise is that scientists can infer human harm from chemicals on the basis of how they activate toxicity pathways in cells. “Toxicity pathway” refers to a chemically induced chain of events that leads to an adverse effect such as tumor formation, explains Raymond Tice, chief of the NTP Biomolecular Screening Branch. Tice emphasizes that these pathways ordinarily coordinate normal processes such as hormone signaling or gene expression. It’s only when they are altered by chemicals or other stressors that harm occurs, he says. “We’re talking about pathways that occur all the time under typical circumstances,” Tice explains. Estrogen-receptor signaling, for instance, is an ordinary feature of normal cell biology, “but if it’s inappropriately up- or down-regulated,” Tice says, “it can cause developmental problems.” Scientists are now attempting to identify and map toxicity pathways and the ways chemicals interact with the biochemical processes involved in cell function, communication, and the ability to adapt to environmental changes. Ideally, these efforts will identify molecular “nodes” vulnerable to chemical exposure. An example of such a node could be a protein that—upon chemical binding—blocks or amplifies estrogen-receptor signaling, altering the pathway’s normal function. This is called a “pathway perturbation.” After identifying a perturbation, scientists have to put it into a broader context of toxicity in living animals. Doing so requires them to extrapolate a toxic blood or tissue dose from a cell-based response, which can be accomplished with PBPK modeling and computational methods based on human cell circuitry, says Gina Solomon, a senior scientist with the nonprofit Natural Resources Defense Council. Cell-based assays offer some advantages in this respect.
Unlike animal tests, which are limited by cost and resource constraints to just a few doses, in vitro assays can test chemicals at a broad range of doses that might provide better information about low-dose human effects, scientists say. The whole process requires a leap of faith that perturbations and associated modeling efforts will accurately predict human effects from chemical exposure, Solomon says. “And this is why risk assessors at EPA have such a hard time with this type of data,” she explains. “It’s not easy to extrapolate from [the results of] a cell-based assay to [exposure effects] in a real population of humans. This is the toughest aspect of pathway-based risk assessment, and it’s one of the main reasons why it’s going to take years for these new approaches to come into widespread use.”

The Path Forward

Experts anticipate Tox21 will roll out in two phases. In the first, perturbations could guide the selection of chemicals for further testing in animals. With this approach, chemicals that, for instance, trigger oxidative stress (which can lead to inflammation) or impede DNA repair (thus potentially increasing the risk for cancer) could be given high-priority status for testing, whereas those that don’t induce such immediately worrisome effects could be relegated to a lesser concern. The EPA, through its ToxCast™ program, is already exploring how high-throughput systems can be used for prioritization, as is the NTP, in accordance with its own research program for the twenty-first century—the NTP Roadmap that was introduced in 2004. Kavlock says there’s a crucial need to prioritize chemicals on more of a biological basis. “Right now we’re prioritizing chemicals on the basis of other criteria, such as production volume, the likelihood for human exposure, or their structural similarity to other chemicals with known liabilities,” he says. “By incorporating more biology into prioritization, we think we can do a better job selecting the right chemicals for animal testing. We could also be more efficient in terms of how we conduct these tests.” In Tox21’s second phase, which some stakeholders say may roll out several decades from now, pathway perturbations could replace animal tests in setting chemical safety standards. Compared with prioritization, this is a far more challenging and elusive goal. Toxicologists have based human standards on the results of animal tests for more than 50 years. Standards for noncarcinogenic chemicals, for instance, are defined by the maximum dose that causes no harm to animals in a toxicity study, divided by numerical factors to reflect data uncertainties. Humans can theoretically tolerate this “reference dose” every day, risk-free, for a lifetime. Alternatively, carcinogens are regulated with a “cancer slope factor” that scientists extrapolate mathematically from doses that cause tumors in rodents. Tumors often appear only with high doses given for up to two years. Still, EPA regulators cautiously assume dose linearity for carcinogens, meaning even a single molecule of toxicant could, in theory, interact with DNA and cause cancer—in other words, until they can be convinced otherwise, EPA regulators assume there is no dose threshold for carcinogens below which cancer risk is negligible. The cancer slope factor, therefore, aims to limit the number of expected cancers in the exposed population to no more than 1 in 1 million people.
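The arithmetic behind these two kinds of standards can be made concrete with a minimal worked sketch; the no-observed-adverse-effect level (NOAEL), uncertainty factors, and slope factor below are hypothetical values chosen for illustration, not figures from any actual assessment.

\[
\mathrm{RfD} \;=\; \frac{\mathrm{NOAEL}}{\mathrm{UF}_{\text{animal-to-human}} \times \mathrm{UF}_{\text{human variability}}} \;=\; \frac{10~\mathrm{mg/kg/day}}{10 \times 10} \;=\; 0.1~\mathrm{mg/kg/day}
\]

\[
\text{excess lifetime cancer risk} \;=\; \mathrm{SF} \times \text{dose} \quad\Rightarrow\quad \text{dose at } 10^{-6} \text{ risk} \;=\; \frac{10^{-6}}{0.05~(\mathrm{mg/kg/day})^{-1}} \;=\; 2 \times 10^{-5}~\mathrm{mg/kg/day}
\]

The first line is the “divided by numerical factors” step described above for noncarcinogens, yielding a reference dose; the second applies the linear, no-threshold assumption for carcinogens to back-calculate the daily dose corresponding to a 1-in-1-million excess risk.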
The fact that animal tests rely on doses far higher than those found in the environment raises difficult questions about their relevance to humans. “I’ve spent nearly forty years as a toxicologist trying to relate high-dose animal studies to low-dose human risk,” says Melvin E. Andersen, director of the Program in Chemical Safety Sciences at the nonprofit Hamner Institutes for Health Sciences. “I now believe that’s impossible to do.” But experts are divided over the degree to which in vitro tests can completely replace animals in risk assessment. Andersen’s view—backed by the NRC report, he says—is that testing for perturbations of toxicity pathways, leading to the elimination of animal tests, should be a fundamental goal. “EPA and the NTP want to use in vitro results to predict high-dose outcomes in animals,” he says. “But that’s backwards—we need to identify cellular targets and then predict what’s going to happen to people at environmentally relevant concentrations. In vitro methods will provide better information for such health risk assessment than animal studies. We have to stay current with where modern biology is going. If we don’t, much of what we do in toxicity testing will be regarded as irrelevant.” Daniel Krewski, director of the R. Samuel McLaughlin Centre for Population Health Risk Assessment at the University of Ottawa and chair of the NRC panel that produced the 2007 report, shares that view. “Let me say this in plain English,” he says. “The thrust of our vision, and also its beauty, is that we will no longer have to regulate on the basis of avoiding what we see in animals but on avoiding perturbations that we see in cell-based tests.” The EPA approach is more conservative, however, and focuses on prioritizing chemicals for further screening in animals rather than eliminating animals altogether. Kavlock emphasizes that if new technologies help scientists select appropriate chemicals for animal testing, they will go a long way toward making the process more effective and more efficient. “Predicting the future isn’t easy,” Kavlock says. “So, I wouldn’t rule in or rule out that someday we might be able to do [toxicity testing] without animals. But for the foreseeable future, the state of the science just doesn’t allow for that.”

Overcoming the Status Quo

What animal tests have going for them—apart from a long history in toxicology and a regulatory structure built around their results—is that they integrate responses across physiologic systems. Toxicity is sometimes caused not by a “parent” compound—the actual chemical to which an animal or human is exposed—but by a metabolite of that compound. Moreover, some chemicals, including some developmental and neurotoxic compounds, aren’t toxic at the point of exposure but rather at locations elsewhere in the body. John Doull, professor emeritus at the University of Kansas Medical Center, gives the example of chemicals that target certain regions in the brain whose toxic effects are reflected elsewhere, perhaps in terms of gait or vision. Cell-based assays might not pick up these metabolic or downstream effects, however. A study done in isolated liver hepatocytes, for example, might miss toxicity that occurs only in whole liver, where adjoining cells can metabolize parent chemicals to toxic forms, for instance by what’s known as cytochrome P450-mediated activation. Christopher Austin, director of the NCGC, concedes metabolic activation poses a tough challenge for in vitro research, but not one that can’t be overcome.
“This is a very hard problem to deal with,” Austin says. “And we’re approaching it through a major technology development initiative involving co-cultures of hepatocytes and P450-responsive cells. That way, we only see the P450-mediated response if the parent compound is metabolized.” Another shortcoming with in vitro testing is compound integrity. Most laboratories store chemicals in dimethyl sulfoxide (DMSO), a popular solvent that can dissolve both polar (i.e., miscible with water) and nonpolar compounds. But DMSO can also absorb water from the atmosphere and thus degrade the compounds stored in it, explains Adam Yasgar, a research associate at NCGC. “The absorbed water can lead compounds to precipitate, which interferes with the analysis,” he says. “You might not know exactly what you’re testing.” Yasgar adds that NCGC gets around this problem by testing compounds at many different concentrations. The redundancy of that process leads to more reliable data, he says. But laboratories that rely on single-dose analyses could run into problems, he adds. Kavlock points out that Tox21 plans on chemical characterization of solutions being tested in order to confirm the identity of the chemical, its purity, and its stability in DMSO—an expensive but necessary step, he says, to build confidence in the resulting data.

The Current Agenda

Tox21 investigators are now conducting proof-of-principle experiments to show that pathway perturbations can predict toxicities already documented in completed animal studies. Their research focuses in part on roughly 10,000 compounds, including industrial chemicals, pesticide active and inert ingredients, drinking water contaminants, and approved drugs, among others. According to a review in the May 2009 issue of EHP by Richard Judson and colleagues, there is at least limited hazard information for about two-thirds of these compounds and detailed toxicology information for about one-quarter of them. The compounds are being screened both at the EPA—through ToxCast—and at NCGC, which is about to purchase yet another robotic laboratory devoted exclusively to Tox21 research. Kavlock says the screens test for a range of end points, such as interactions with nuclear receptors, up-regulation of the p53 tumor suppressor gene, and effects on DNA repair mechanisms. Meanwhile, scientists are working to identify and map as many toxicity pathways as possible. Just how many pathways might participate in toxicity is a matter of some disagreement, however. Arguing that biology has definable boundaries that are set by the genome, Andersen claims the number is finite. “How many pathways could there be?” he asks. “I don’t know. I’ve suggested, somewhat tongue-in-cheek, that there are exactly 132 of them! The main point is that biology has to be robust, which compels us to believe these pathways are conserved across species [and through evolution]. My personal view is that all toxicity pathways revolve around stress responses and the control of gene expression.” Seen this way, Andersen adds, multiple classes of chemicals could share the same toxicity pathways in spite of differences in their physical structure. But Katrina Waters, a senior research scientist at Pacific Northwest National Laboratory in Richland, Washington, asserts the number of toxicity pathways might be virtually unlimited.
“When you consider the diversity of chemicals facing testing and their potential effects, I don’t think it’s possible to say that some finite number of pathways will predict all adverse events,” she says. “I think it’s probable that each chemical class will have its own set of toxicity pathways for whatever adverse events are characterized for that class.” The debate is far from semantic—the number of toxicity pathways reflects the amount of work ultimately needed to meet the goals of Tox21. For example, Waters explains, if there were only 25 pathways involved in apoptosis, or programmed cell death, scientists could model those pathways mathematically and assume they capture adverse events for every chemical class. “You wouldn’t have to create a new mathematical model for each class; you could simply reuse the same models [and apply them to different chemicals],” she says. “But if you have a limitless number of pathways and conditional interactions between pathways, then you have to repeat the modeling process for every new chemical class [under investigation].” Going forward, Tox21 offers the opportunity to confer the advantages of high-throughput research on toxicology and risk assessment. But its promise is tempered by the vast research challenges that lie ahead. Scientists are aiming for nothing less than a complete map of the cell circuits that dictate toxicity, assembled from untold millions of data points, converted somehow into something useful. Regulatory officials will have to devise ways to replace decisions made on traditional end points with ones made on cell-based findings, Andersen says. Officials will also have to craft new strategies to explain those findings to the public. “Your average person on the street understands that when something causes birth defects in a rat, that’s something for humans to be concerned about,” says Solomon. “But when you base policies on perturbations of thyroid hormone homeostasis, well, it’s going to be harder for the public to know what to think about that.”

            NTP Workshop: Role of Environmental Chemicals in the Development of Diabetes and Obesity.

            (2011)

              Bisphenol A: Where to Now?

              Recently, Time magazine published an article titled “The Year in Medicine: From A to Z” (Park et al. 2008). The letter “B” was represented by the controversy over bisphenol A, a ubiquitous chemical used in polycarbonate and polyvinyl chloride plastics and epoxy resins and found in the urine of > 90% of Americans. The debate over whether bisphenol A poses a threat to human health has been brewing for the better part of the past decade. On 3 September 2008, the National Toxicology Program (NTP) Center for the Evaluation of Risks to Human Reproduction (CERHR) weighed in by releasing a report that significantly contributes to this ongoing discussion. The NTP-CERHR Monograph on the Potential Human Reproductive and Developmental Effects of Bisphenol A (NTP 2008a) identified evidence from experimental animal studies that raised “some concern” that current levels of exposure to human fetuses, infants, and children may result in developmental changes in the prostate gland and brain and diminish sexually dimorphic behaviors. “Some concern” represents the mid-point of a five-level scale of concern used by the NTP that ranges from “negligible” to “serious” concern. A lower level, “minimal concern,” was also expressed for possible changes in development of the mammary gland and an earlier age of attaining puberty in females. The NTP’s opinion on the level of concern for effects of bisphenol A on human reproduction and development stemmed from a 2-year analysis of a very limited number of available human studies but nearly 1,000 studies in experimental animals. Many of the laboratory studies explored effects on offspring of pregnant rodents receiving “low doses” of bisphenol A (< 5 mg/kg body weight/day, and including studies performed with much lower doses) during critical periods of development. The NTP Board of Scientific Counselors (2008) provided peer review and suggestions for refinement of the NTP CERHR’s conclusions (NTP 2008a), and the Science Board to the Food and Drug Administration (FDA 2008a, 2008b) also expressed agreement with the evaluation. The NTP’s evaluation of bisphenol A expressed “some concern” because many of the developmental effects reported in laboratory animals were observed at exposures to bisphenol A similar to those experienced by humans. Collectively, the findings could not be dismissed. Similar conclusions were reached by Health Canada (2008) and by participants at a workshop examining the potential relationship between bisphenol A and negative trends in human health (vom Saal et al. 2007). However, the NTP CERHR report (NTP 2008a), as well as other reviews, identified many areas of uncertainty and data gaps that should be addressed to fully understand bisphenol A’s potential to harm human development. In the months since release of the NTP-CERHR report (NTP 2008a), the literature on exposures and potential human health effects of bisphenol A has continued to grow (Calafat et al. 2008; Hugo et al. 2008; Lang et al. 2008; Leranth et al. 2008), raising public concern and generating more questions. Lists of research needs have been assembled (NTP 2008a; vom Saal et al. 2007). The NTP and the National Institute of Environmental Health Sciences (NIEHS) Division of Extramural Research and Training (DERT) recently issued a request for information (RFI) to the scientific community seeking information to help focus future research and testing activities (NTP 2008b). 
The RFI seeks information about a) ongoing research on the health effects of bisphenol A; b) unmet research needs; and c) suggestions for collaboration and cooperation between investigators to improve efficiency and timeliness in filling the information gaps. Together, the NTP and DERT will carefully consider the responses to this RFI as we develop research programs and explore other ways to address these issues in the future. The RFI (NTP 2008b) listed a number of general topics that scientists have consistently raised as areas where research is needed: a) the need to better understand sources of human exposures; b) the need to compare the metabolism of bisphenol A among rodents, nonhuman primates, and humans and understand how it changes with age; c) the need for physiologically based pharmacokinetic (PBPK) models to provide a scaffold for quantitatively assessing the consistency of outcomes across studies performed with widely different doses and designs; and d) the need for additional developmental toxicology studies of traditional design and power, but with modifications to provide the capability to detect the range of effects reported in academic studies as well as functional consequences as the animals age. The NTP has begun work in several areas. In collaboration with the Centers for Disease Control and Prevention and academic investigators, we are facilitating an evaluation of exposures to bisphenol A in infants in neonatal care settings and in children < 6 years of age. Together with the FDA National Center for Toxicological Research, we have initiated studies to obtain the data for constructing PBPK models in rodents and nonhuman primates, and we are planning studies to explore the long-term consequences of perinatal exposure to bisphenol A in order to understand the potential impact to humans of the developmental changes reported in numerous laboratory animal studies. The NTP and DERT are considering a number of strategies to provide the academic community access to the animals and tissues generated in these studies. We will provide additional details on the status and direction of our bisphenol A testing program through public meetings of the NTP Board of Scientific Counselors. Collectively, the results of these studies should begin to chip away at the uncertainties and research gaps and provide a better perspective of the potential threat that exposure to bisphenol A poses to public health.

                Author and article information

                Journal
                Environ Health Perspect
                Environmental Health Perspectives
                National Institute of Environmental Health Sciences
                ISSN (print): 0091-6765
                ISSN (electronic): 1552-9924
                May 2011
                Volume: 119
                Issue: 5
                Pages: A196-A197
                Affiliations
                National Institute of Environmental Health Sciences, National Institutes of Health, Department of Health and Human Services, Research Triangle Park, North Carolina. E-mail: bucher@niehs.nih.gov
                Author notes

                The authors declare they have no actual or potential competing financial interests.

                John R. Bucher is the associate director of the NTP, an interagency program headquartered at the NIEHS. Along with participating programs at the National Center for Toxicological Research, Food and Drug Administration, and laboratories of the National Institute for Occupational Safety and Health in Morgantown, West Virginia, and Cincinnati, Ohio, the NTP is the nation’s principal comprehensive toxicology analysis, research, and testing effort. Bucher holds a Ph.D. in pharmacology from the University of Iowa, an M.S. in biochemistry from the University of North Carolina, and a B.A. in biology from Knox College, and he was an NIH Postdoctoral Fellow in biochemistry and environmental toxicology at Michigan State University. He is a Diplomate of the American Board of Toxicology and a Fellow of the Collegium Ramazzini.

                Kristina A. Thayer, director of the NTP Office of Health Assessment and Translation, holds a Ph.D. in biological sciences from the University of Missouri–Columbia. She has been with the NIEHS since 2003, serving in the NTP Office of Liaison, Policy, and Review and the NIEHS Office of Risk Assessment Research prior to assuming her current position. She has authored numerous NTP reports and manuscripts on the toxicological potential of environmental substances.

                Linda S. Birnbaum, director of the NIEHS and the NTP, oversees a budget that funds multidisciplinary biomedical research programs and prevention and intervention efforts that encompass training, education, technology transfer, and community outreach. She recently received an honorary Doctor of Science from the University of Rochester, the distinguished alumna award from the University of Illinois, and was elected to the Institute of Medicine. She is the author of > 700 peer-reviewed publications, book chapters, abstracts, and reports. Birnbaum received her M.S. and Ph.D. in microbiology from the University of Illinois, Urbana. A board-certified toxicologist, she has served as a federal scientist for 31 years, 19 with the U.S. EPA Office of Research and Development, preceded by 10 years at the NIEHS as a senior staff fellow, a principal investigator, a research microbiologist, and a group leader for the institute’s Chemical Disposition Group.

                Article
                Publisher ID: ehp-119-a196
                DOI: 10.1289/ehp.1103645
                PMCID: 3094430
                PMID: 21531652
                This is an Open Access article: verbatim copying and redistribution of this article are permitted in all media for any purpose, provided this notice is preserved along with the article's original DOI.
                Categories
                Perspectives
                Editorial

                Public health
