Research Article: Toxicity Testing in the 21st Century Beyond Environmental Chemicals

Date Published: May 24, 2018

Author(s): Costanza Rovida, Shoji Asakura, Mardas Daneshian, Hana Hofman-Huether, Marcel Leist, Leo Meunier, David Reif, Anna Rossi, Markus Schmutz, Jean-Pierre Valentin, Joanne Zurlo, Thomas Hartung.

Abstract

After the publication of the report titled Toxicity Testing in the 21st Century – A Vision and a Strategy, many initiatives were launched to foster a major paradigm shift in toxicity testing – from apical endpoints in animal-based tests to mechanistic endpoints based on the delineation of pathways of toxicity (PoT) in human cell-based systems. The US EPA has funded a major program to develop new high-throughput methods built on human cell-based in vitro systems, and these methods are currently being incorporated into the chemical risk assessment process. In the pharmaceutical industry, the efficacy and toxicity of new drugs are evaluated during preclinical investigations that include drug metabolism, pharmacokinetics, pharmacodynamics and safety toxicology studies. The results of these studies are analyzed and extrapolated to predict efficacy and potential adverse effects in humans. However, given the high failure rate of drugs during the clinical phases, a more predictive approach to assessing both the efficacy and the adverse effects of drugs is urgently needed. The food industry faces the challenge of assessing novel foods and food ingredients for the general population, for which extrapolation from animal safety testing is often of limited relevance. The question is whether the paradigm shift proposed by the Tox21c report for chemicals may also provide a useful tool to improve the risk assessment of drugs and food ingredients.

Partial Text

In 2004, the US Environmental Protection Agency (EPA) requested that the National Academy of Sciences review existing strategies and develop a vision for the future of toxicity testing. A committee comprising 22 experts in various fields of toxicology, epidemiology, environmental health, risk assessment and animal welfare, representing academia, industry and non-governmental organizations, worked together for four years and produced its final report titled Toxicity Testing in the 21st Century – A Vision and a Strategy (Tox21c) (NRC, 2007). In this report, the committee proposed a major paradigm shift for toxicity testing – from apical endpoints in animal-based tests to mechanistic endpoints through delineation of pathways of toxicity (PoT) in human cell-based systems.

In the period 2008–2010, the EU FP7 program supported the project START-UP (Scientific and Technological issues in 3Rs Alternatives Research in The process of drug development and Union Politics), with the intention of covering all the 3Rs bottlenecks in pharmaceutical research and development. More than two hundred representatives from industry, academia and regulatory agencies met regularly, and while the concluding report is valuable, it focuses principally on 2Rs (reduction and refinement). It offers little on how to predict the effect of a substance in the human organism with advanced cell technology; the present report partially addresses this issue.

The Tox21c report highlighted the limitations of animal models in predicting complex toxicity outcomes in humans. The 2007 NRC report was directed at environmental chemicals rather than drugs or food additives. ToxCast is a program within the US EPA that was created to explore the 2007 NRC vision. It incorporates more than 700 diverse assay endpoints in a high-throughput screening (HTS) paradigm to assess the toxicity of thousands of chemicals. The assays include both cell-free and cellular systems, derived from multiple species and tissues. ToxCast aims to profile the bioactivity of all test chemicals in an unbiased way by applying the same concentration ranges and experimental protocols to every chemical, regardless of class. While this necessitates a broad, screening-level concentration range covering several orders of magnitude, it allows direct chemical-to-chemical comparison of potencies across all assays or subsets thereof. Phase I of ToxCast mainly studied pesticides, while Phase II covered a much broader area of chemical space (Kavlock et al., 2012). Compounds (plus associated human testing data) donated by six pharmaceutical companies (GSK, Hoffmann-La Roche, Sanofi-Aventis, Pfizer, Merck, Astellas), cosmetics ingredients (sponsored by L’Oréal) and some food additives are included in the list of tested substances. These compounds normally fall under the purview of the FDA rather than the EPA, so their inclusion in an otherwise environmentally focused chemical set is a noteworthy opportunity for trans-disciplinary comparison. All data and results are made publicly available to allow modeling by interested research groups, use by stakeholders, and analysis in the context of external data on the same compounds. The program is ongoing, with major efforts directed toward computational methods to model these massive data (Reif et al., 2010), as well as extensions such as Tox21™ (an alliance between EPA, NIEHS/NTP, NIH/NCATS and FDA to screen over 8,000 unique substances across a subset of ~50 high-throughput assays) (Huang et al., 2014), and alternative in vivo models amenable to HTS (Truong et al., 2014).
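To illustrate how testing every chemical over an identical concentration series enables direct potency comparison in an HTS setting, the following is a minimal sketch in Python of fitting a Hill concentration-response model to per-chemical readouts and extracting an AC50 value for ranking. The chemical names, noise level, parameter bounds and simple response model are illustrative assumptions with simulated data; this is not the actual ToxCast analysis pipeline, whose curve-fitting procedures are considerably more elaborate.

```python
# Minimal illustrative sketch (not the actual ToxCast pipeline): fit a Hill
# concentration-response model to HTS readouts so that potencies (AC50) can
# be compared directly across chemicals tested under an identical protocol.
# Chemical names and the response model are assumptions; data are simulated.
import numpy as np
from scipy.optimize import curve_fit


def hill(conc, top, ac50, slope):
    """Hill model: response rises from 0 toward `top`, half-maximal at `ac50`."""
    return top / (1.0 + (ac50 / conc) ** slope)


# Identical concentration series (uM) applied to every chemical, spanning
# several orders of magnitude as in a screening-level design.
conc = np.logspace(-3, 2, 8)

rng = np.random.default_rng(0)
chemicals = {
    "chem_A": hill(conc, 100.0, 0.5, 1.2) + rng.normal(0, 3, conc.size),
    "chem_B": hill(conc, 100.0, 20.0, 1.0) + rng.normal(0, 3, conc.size),
}

for name, response in chemicals.items():
    # Fit top, AC50 and slope; bounds keep the optimizer in a plausible range.
    params, _ = curve_fit(
        hill, conc, response,
        p0=[100.0, 1.0, 1.0],
        bounds=([0.0, 1e-4, 0.1], [200.0, 1e3, 10.0]),
    )
    top, ac50, slope = params
    print(f"{name}: AC50 ~ {ac50:.2f} uM (lower AC50 = higher potency in this assay)")
```

Because every chemical is screened over the same concentration range and protocol, the fitted AC50 values can be compared directly across chemicals within an assay, which is what allows the unbiased, class-independent potency profiling described above.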

Regulatory safety assessments of pharmaceuticals are largely harmonized via the International Conference on Harmonisation (ICH), which issues guidelines that represent recommendations rather than protocols. ICH guidelines are complemented on a regional scale, for example by those drafted by the relevant Working Parties of the Committee for Medicinal Products for Human Use (CHMP) at the European Medicines Agency (EMA). With respect to non-clinical testing requirements for human medicinal products, new in vitro methods have been accepted for regulatory use through multiple, flexible routes, including formal validation, whether as pivotal, supportive or exploratory mechanistic studies, wherever applicable. Pharmaceutical regulators can also drop redundant testing requirements: data analysis following the publication of the concept paper on the need for revision of the EMA guideline on single-dose toxicity led to the complete removal of this guideline and its requirements, and thus a significant reduction in animal use (Chapman et al., 2010).

The food industry differs from the pharmaceutical industry in that its primary aim is not to cure diseases but to improve food quality and taste and to provide adequate nutrition that helps prevent the occurrence of disease. Importantly, and in contrast to pharmaceuticals, which in most cases target a specific population group, food is intended for the general population, including babies, pregnant women and the elderly. Unlike pharmaceuticals, for which a benefit/risk evaluation is systematically considered, a benefit/risk approach is rarely used for ingredients voluntarily added to regular foodstuffs, since typically no risk is accepted for the general population.

The real question is whether current procedures are satisfactory and, if the answer is negative, whether a different approach could do better, keeping in mind that the ultimate goal for all sectors is prediction of effects in humans.

The application of Tox21c approaches may aid safety prediction for drugs and novel foods. The process is complex and needs further work, especially considering that implementation will not proceed simply by replacing individual tests one by one.

 
