I work in the field of chemical toxicology, with a focus on how science is translated into regulation and EU policy. I have a background in computational toxicology, physiological sciences and linguistics.
Within linguistics, I am particularly interested in theoretical syntax, comparative linguistics, language evolution and language change.
Toxicology is my day job, linguistics my hobby!
Address: Via Enrico Fermi 2749
The assessment of the allergenic potential of chemicals, crucial for ensuring public health safety, faces challenges in accuracy and raises ethical concerns due to reliance on animal testing. This paper presents a novel bioinformatic protocol designed to address the critical challenge of predicting immune responses to chemical sensitizers without the use of animal testing. The core innovation lies in the integration of advanced bioinformatics tools, including the Universal Immune System Simulator (UISS), which models detailed immune system dynamics. By leveraging data from structural predictions and docking simulations, our approach provides a more accurate and ethical method for chemical safety evaluations, especially in distinguishing between skin and respiratory sensitizers. Our approach integrates a comprehensive eight-step process, beginning with the meticulous collection of chemical and protein data from databases like PubChem and the Protein Data Bank. Following data acquisition, structural predictions are performed using cutting-edge tools such as AlphaFold to model proteins whose structures have not been previously elucidated. This structural information is then utilized in subsequent docking simulations, leveraging both ligand-protein and protein-protein interactions to predict how chemical compounds may trigger immune responses. The core novelty of our method lies in the application of UISS, an advanced agent-based modelling system that simulates detailed immune system dynamics. By inputting the results from earlier stages, including docking scores and potential epitope identifications, UISS forecasts the type and severity of immune responses, distinguishing between Th1-mediated skin and Th2-mediated respiratory allergic reactions. This ability to predict distinct immune pathways is a crucial advance over current methods, which often cannot differentiate between the sensitization mechanisms. To validate the accuracy and robustness of our approach, we applied the protocol to well-known sensitizers: 2,4-dinitrochlorobenzene for skin allergies and trimellitic anhydride for respiratory allergies. The results clearly demonstrate the protocol's ability to differentiate between these distinct immune responses, underscoring its potential for replacing traditional animal-based testing methods. The results not only support the potential of our method to replace animal testing in chemical safety assessments but also highlight its role in enhancing the understanding of chemical-induced immune reactions. Through this innovative integration of computational biology and immunological modelling, our protocol offers a transformative approach to toxicological evaluations, increasing the reliability of safety assessments.
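The eight-step workflow lends itself to a pipeline-style implementation. The sketch below is only an illustration of how the stages chain together; every helper (fetch_pubchem_record, fetch_pdb_or_predict, run_docking, predict_epitopes, run_uiss) is a hypothetical stub standing in for PubChem/PDB retrieval, AlphaFold modelling, docking and the UISS simulation, and all values are placeholders rather than the authors' implementation.

```python
# Illustrative pipeline skeleton for the protocol described above; all helpers are
# hypothetical stubs, not the authors' code or any real API.
from dataclasses import dataclass

def fetch_pubchem_record(chemical_id: str) -> dict:
    return {"id": chemical_id, "smiles": "SMILES_PLACEHOLDER"}     # placeholder record

def fetch_pdb_or_predict(protein_id: str) -> dict:
    return {"id": protein_id, "source": "PDB or AlphaFold model"}  # placeholder structure

def run_docking(ligand: dict, protein: dict) -> float:
    return -7.2                                   # placeholder docking score (e.g. kcal/mol)

def predict_epitopes(protein: dict, ligand: dict) -> list[str]:
    return ["EPITOPE_1"]                          # placeholder hapten-protein epitopes

def run_uiss(docking_score: float, epitopes: list[str]) -> dict:
    return {"th1": 0.8, "th2": 0.2}               # placeholder simulated Th1/Th2 signal

@dataclass
class SensitiserPrediction:
    chemical: str
    docking_score: float
    predicted_pathway: str                        # "Th1 (skin)" or "Th2 (respiratory)"

def assess_sensitiser(chemical_id: str, protein_id: str) -> SensitiserPrediction:
    ligand = fetch_pubchem_record(chemical_id)    # step 1: chemical data
    protein = fetch_pdb_or_predict(protein_id)    # steps 2-3: protein data and structure
    score = run_docking(ligand, protein)          # steps 4-5: docking simulations
    epitopes = predict_epitopes(protein, ligand)  # step 6: epitope identification
    profile = run_uiss(score, epitopes)           # steps 7-8: UISS immune simulation
    pathway = "Th1 (skin)" if profile["th1"] > profile["th2"] else "Th2 (respiratory)"
    return SensitiserPrediction(chemical_id, score, pathway)

print(assess_sensitiser("DNCB", "HSA"))
```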
The European regulatory framework on chemicals is at a crossroads. There are calls for the framework to be more effective, by better protecting people and the environment. There is also room for it to be more efficient and cost-effective, by harmonizing assessment practices across sectors and avoiding the need for unnecessary testing. At the same time, there is a political commitment to phase out animal testing in chemical safety assessments. In this commentary, we argue that these needs are not at odds with each other. On the contrary, the European Commission's roadmap to phase out animal testing could also be the transition pathway to a more efficient, effective, and sustainable regulatory ecosystem. Central to our proposal is a framework based on biological reasoning in which biological questions can be answered by a choice of methods, with non-animal methods progressively becoming the only choice. Within this framework, a tiered approach to testing and assessment allows for greater efficiency and effectiveness, while also introducing considerations of proportionality and cost-effectiveness. Testing strategies, and their component methods, should be developed in tandem and judged in terms of their outcomes, and the protection levels they inform, rather than their ability to predict the outputs of animal tests.
The body of EU chemicals legislation has evolved since the 1960s, producing the largest knowledge base on chemicals worldwide. Like any evolving system, however, it has become increasingly diverse and complex, resulting in inefficiencies and potential inconsistencies. In the light of the EU Chemicals Strategy for Sustainability, it is therefore timely and reasonable to consider how aspects of the system could be simplified and streamlined, without losing the hard-earned benefits to human health and the environment. In this commentary, we propose a conceptual framework that could be the basis of Chemicals 2.0, a future safety assessment and management approach that is based on the application of New Approach Methodologies (NAMs), mechanistic reasoning and cost-benefit considerations. Chemicals 2.0 is designed to be a more efficient and more effective approach for assessing chemicals, and to comply with the EU goal to completely replace animal testing, in line with Directive 2010/63/EU. We propose five design criteria for Chemicals 2.0 to define what the future system should achieve. The approach is centred on a classification matrix in which NAMs for toxicodynamics and toxicokinetics are used to classify chemicals according to their level of concern. An important principle is the need to ensure an equivalent, or higher, protection level.
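To make the classification-matrix idea concrete, the snippet below sketches one possible lookup in which a toxicodynamic hazard band and a toxicokinetic exposure band jointly determine a level of concern. The band names, ranking rule and concern labels are illustrative assumptions, not the matrix proposed in the commentary.

```python
# Illustrative level-of-concern lookup; band names and the mapping rule are assumptions,
# not the published Chemicals 2.0 matrix.
def level_of_concern(td_band: str, tk_band: str) -> str:
    # td_band: toxicodynamic hazard band from NAMs (e.g. in vitro potency)
    # tk_band: toxicokinetic band from NAMs (e.g. predicted internal exposure)
    rank = {"low": 0, "medium": 1, "high": 2}
    combined = rank[td_band] + rank[tk_band]
    if combined >= 3:
        return "higher concern"
    if combined == 2:
        return "moderate concern"
    return "lower concern"

print(level_of_concern("high", "medium"))   # -> "higher concern"
```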
A physiologically-based pharmacokinetic (PBPK) model represents the structural components of the body with physiologically relevant compartments, connected via blood flow rates and described by mathematical equations to determine drug disposition. PBPK models are used in the pharmaceutical sector for drug development and precision medicine, and in the chemical industry to predict safe levels of exposure during the registration of chemical substances. However, one area of application where PBPK models have been scarcely used is forensic science. In this review, we give an overview of PBPK models successfully developed for several illicit drugs and environmental chemicals that could be applied for forensic interpretation, highlighting the gaps, uncertainties, and limitations.
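As a concrete illustration of compartments connected by blood flow, the sketch below implements a minimal flow-limited model with a blood pool and a single liver compartment cleared by hepatic metabolism. The parameter values and the 10 mg intravenous bolus are illustrative assumptions, not taken from the review.

```python
# Minimal flow-limited PBPK sketch: a blood pool and one liver compartment with hepatic
# clearance. All parameter values and the dose are illustrative assumptions.
from scipy.integrate import solve_ivp

Q_liv = 90.0              # liver blood flow (L/h)
V_bld, V_liv = 5.0, 1.8   # compartment volumes (L)
P_liv = 2.0               # liver:blood partition coefficient
CL_int = 30.0             # intrinsic hepatic clearance (L/h)

def pbpk(t, y):
    c_bld, c_liv = y                      # concentrations (mg/L)
    c_out = c_liv / P_liv                 # venous blood leaving the liver
    dc_bld = Q_liv * (c_out - c_bld) / V_bld
    dc_liv = (Q_liv * (c_bld - c_out) - CL_int * c_out) / V_liv
    return [dc_bld, dc_liv]

sol = solve_ivp(pbpk, (0.0, 24.0), y0=[10.0 / V_bld, 0.0], t_eval=[1, 6, 24])
print(sol.y[0])   # blood concentrations at 1, 6 and 24 h after a 10 mg i.v. bolus
```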
Prediction of chemical toxicity is very useful in risk assessment. With the current paradigm shift towards the use of in vitro and in silico systems, we present herein a theoretical mathematical description of a quasi-diffusion process to predict chemical concentrations in 3-D spheroid cell cultures. By extending a 2-D Virtual Cell Based Assay (VCBA) model into a 3-D spheroid cell model, we assume that cells are arranged in a series of concentric layers within the sphere. We formulate the chemical quasi-diffusion process by simplifying the spheroid with respect to the number of cells in each layer. The system was calibrated and tested with acetaminophen (APAP). Simulated predictions of APAP toxicity were compared with empirical data from in vitro measurements using a 3-D spheroid model. The results of this first attempt to extend the VCBA model are promising: they show that the VCBA model captures the close relationship between compound concentration and the viability of the HepaRG 3-D cell culture. The 3-D VCBA model provides a complement to current in vitro procedures to refine experimental setups, to fill data gaps and to help in the interpretation of in vitro data for the purposes of risk assessment.
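A minimal way to picture the concentric-layer assumption is a layer-to-layer exchange scheme in which chemical moves between adjacent layers in proportion to their concentration difference. The sketch below illustrates that idea with made-up layer counts, rate constants and medium concentration; it is not the published VCBA equations.

```python
# Toy quasi-diffusion over concentric spheroid layers; layer count, exchange rate, time
# step and medium concentration are illustrative assumptions, not VCBA parameters.
import numpy as np

n_layers, k, dt, steps = 5, 0.2, 0.1, 500   # layers, exchange rate (1/h), time step (h), steps
c = np.zeros(n_layers)                      # layer concentrations, index 0 = outermost layer
c_medium = 100.0                            # external medium concentration (µM)

for _ in range(steps):
    outer = np.r_[c_medium, c[:-1]]         # neighbour just outside each layer (medium for the rim)
    inner = np.r_[c[1:], c[-1]]             # neighbour just inside (the core has no inner neighbour)
    c = c + dt * k * ((outer - c) - (c - inner))

print(np.round(c, 1))                       # rim-to-core concentration gradient after 50 h
```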
In this editorial we reflect on the past decade of developments in predictive toxicology, and in particular on the evolution of the Adverse Outcome Pathway (AOP) paradigm. Starting out as a concept, AOPs have become the focal point of a community of scientists, regulators and decision-makers. AOPs provide the mechanistic knowledge underpinning the development of Integrated Approaches to Testing and Assessment (IATA), including computational models now referred to as quantitative AOPs (qAOPs). With reference to recent and related works on qAOPs, we take a brief historical perspective and ask what is the next stage in modernising chemical toxicology beyond animal testing.
The adverse outcome pathway (AOP) is a conceptual construct that facilitates organisation and interpretation of mechanistic data representing multiple biological levels and deriving from a range of methodological approaches including in silico, in vitro and in vivo assays. AOPs are playing an increasingly important role in the chemical safety assessment paradigm and quantification of AOPs is an important step towards a more reliable prediction of chemically induced adverse effects. Modelling methodologies require the identification, extraction and use of reliable data and information to support the inclusion of quantitative considerations in AOP development. An extensive and growing range of digital resources are available to support the modelling of quantitative AOPs, providing a wide range of information, but also requiring guidance for their practical application. A framework for qAOP development is proposed based on feedback from a group of experts and three qAOP case studies. The proposed framework provides a harmonised approach for both regulators and scientists working in this area.
Editorial on the Research Topic: Advances and Refinements in the Development and Application of Threshold of Toxicological Concern. The Threshold of Toxicological Concern (TTC) is an exposure threshold below which there is no appreciable risk to human health. There are two main approaches: TTC values based on cancer potency data, from which a one in a million excess lifetime risk is estimated, and TTC values based on non-cancer effects. For the latter approach, a distribution is typically fitted to the No Observed Adverse Effect Levels (NOAELs) from repeat dose toxicity studies, from which a 5th percentile value is taken and adjusted using an uncertainty factor (usually 100). Established TTC values are those based on oral chronic studies that were first developed by Munro et al. (1996), who subcategorised chemicals into one of three Cramer structural classes (Cramer et al., 1978). Kroes et al. (2004) presented a tiered TTC approach that established several human exposure thresholds spanning four orders of magnitude, from the lowest TTC for substances presenting structural alerts for genotoxicity (0.15 μg/d), through the next tier for organophosphates/carbamates (18 μg/d), to the remaining higher TTC values representing the same three Cramer classes originally derived by Munro et al. (1996). The World Health Organization (WHO) and the European Food Safety Authority (EFSA) (European Food Safety Authority and World Health Organization, 2016; EFSA et al., 2019) have determined that the TTC approach is a sound and fit-for-purpose risk assessment tool, with a number of caveats, in cases where chemical-specific repeat dose toxicity data are not available.
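To make the non-cancer derivation concrete, the snippet below fits a lognormal distribution to a handful of synthetic NOAELs, takes the 5th percentile, applies the default 100-fold uncertainty factor, and scales the result to a daily exposure for a 60 kg person. The NOAEL values are invented for illustration and are not the Munro dataset.

```python
# Worked illustration of the non-cancer TTC derivation with synthetic NOAELs.
import numpy as np

noael = np.array([3.0, 10.0, 25.0, 50.0, 120.0, 300.0, 750.0])   # mg/kg bw/day, illustrative
mu, sigma = np.log(noael).mean(), np.log(noael).std(ddof=1)      # lognormal fit on the log scale
p5 = np.exp(mu - 1.6449 * sigma)            # 5th percentile of the fitted distribution
ttc_mg_per_kg = p5 / 100.0                  # default 100-fold uncertainty factor
print(round(ttc_mg_per_kg * 60 * 1000, 1))  # µg/person/day for a 60 kg body weight
```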
Computational and Structural Biotechnology Journal, 2022
Immunotoxicity hazard identification of chemicals aims to evaluate the potential for unintended effects of chemical exposure on the immune system. Perfluorinated alkylate substances (PFAS), such as perfluorooctane sulfonate (PFOS) and perfluorooctanoic acid (PFOA), are persistent, globally disseminated environmental contaminants known to be immunotoxic. Elevated PFAS exposure is associated with lower antibody responses to vaccinations in children and in adults. In addition, some studies have reported a correlation between PFAS levels in the body and lower resistance to disease, in other words an increased risk of infections or cancers. In this context, modelling and simulation platforms could be used to simulate the human immune system with the aim of evaluating the adverse effects that immunotoxicants may have. Here, we show the conditions under which a mathematical model developed for one purpose and application (e.g., in the pharmaceutical domain) can be successfully translated and transferred to another (e.g., in the chemicals domain) without undergoing significant adaptation. In particular, we demonstrate that the Universal Immune System Simulator was able to simulate the effects of PFAS on the immune system, introducing entities and new interactions that are biologically involved in the phenomenon. This also revealed a potentially exploitable pathway for assessing immunotoxicity through a computational model.
Computational and Structural Biotechnology Journal, 2022
In many domains regulating chemicals and chemical products, there is a legal requirement to determine skin sensitivity to allergens. While many in vitro assays to detect contact hypersensitivity have been developed as alternatives to animal testing over the past ten years, and significant progress has been made in this area, there is still a need for continued investment in the creation of techniques and strategies that allow accurate identification of potential contact allergens and their potency in vitro. In silico models are promising tools in this regard. However, none of the state-of-the-art systems seems to function well enough to serve as a stand-alone hazard identification tool, especially in evaluating possible allergenicity effects in humans. The Universal Immune System Simulator, a mechanistic computational platform that simulates the human immune system response to a specific insult, provides a means of predicting the immunotoxicity induced by skin sensitisers, enriching the collection of computational models for the assessment of skin sensitisation. Here, we present a specific disease layer implementation of the Universal Immune System Simulator for the prediction of allergic contact dermatitis induced by specific skin sensitisers.
Structure-activity relationships (SARs) in toxicology have enabled the formation of structural rules which, when coded as structural alerts, are essential tools in in silico toxicology. Whilst other in silico methods have approaches for their evaluation, there is no formal process to assess the confidence that may be associated with a structural alert. This investigation proposes twelve criteria to assess the uncertainty associated with structural alerts, allowing for an assessment of confidence. The criteria are based around the stated purpose, description of the chemistry, toxicology and mechanism, performance and coverage, as well as corroborating and supporting evidence of the alert. Alerts can be given a confidence assessment and score, enabling the identification of areas where more information may be beneficial. The scheme to evaluate structural alerts was placed in the context of various use cases for industrial and regulatory applications. The analysis of alerts, and consideration of the evaluation scheme, identifies the different characteristics an alert may have, such as being highly specific or generic. These characteristics may determine when an alert can be used for specific uses such as identification of analogues for read-across or hazard identification.
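A simple way to picture a criteria-based assessment of this kind is a checklist whose per-criterion scores are summed into a confidence band. The criterion names, weights and bands below are hypothetical stand-ins for illustration; the paper itself defines twelve specific criteria.

```python
# Hypothetical scoring sketch; criterion names, weights and bands are illustrative
# and do not reproduce the twelve published criteria.
alert_assessment = {
    "stated purpose defined": 2,
    "chemistry described": 2,
    "toxicological endpoint described": 1,
    "mechanistic rationale given": 1,
    "performance statistics reported": 1,
    "chemical space coverage assessed": 0,
    "corroborating evidence cited": 2,
}
score = sum(alert_assessment.values())           # 0-2 per criterion in this sketch
band = "high" if score >= 10 else "moderate" if score >= 6 else "low"
print(f"confidence score {score} -> {band}")     # criteria scoring 0 flag where more information helps
```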
New Approach Methodologies (NAMs) are considered to include any in vitro, in silico or chemistry-based method, as well as the strategies to implement them, that may provide information that could inform chemical safety assessment. Current chemical legislation in the European Union is limited in its acceptance of the widespread use of NAMs. The European Partnership for Alternative Approaches to Animal Testing (EPAA) therefore convened a 'Deep Dive Workshop' to explore the use of NAMs in chemical safety assessment, the aim of which was to support regulatory decisions whilst continuing to protect human health. The workshop recognised that NAMs are currently used in many industrial sectors, with some considered as fit for regulatory purpose. Moreover, the workshop identified key discussion points that can be addressed to increase the use and regulatory acceptance of NAMs. These are based on the changes needed in frameworks for regulatory requirements and the essential needs in education, training and greater stakeholder engagement, as well as the gaps in the scientific basis of NAMs.
In a century where toxicology and chemical risk assessment are embracing alternative methods to animal testing, there is an opportunity to understand the causal factors of neurodevelopmental disorders such as learning and memory disabilities in children, as a foundation to predict adverse effects. New testing paradigms, along with the advances in probabilistic modelling, can help with the formulation of mechanistically-driven hypotheses on how exposure to environmental chemicals could potentially lead to developmental neurotoxicity (DNT). This investigation aimed to develop a Bayesian hierarchical model of a simplified AOP network for DNT. The model predicted the probability that a compound induces each of three selected common key events (CKEs) of the simplified AOP network and the adverse outcome (AO) of DNT, taking into account correlations and causal relations informed by the key event relationships (KERs). A dataset of 88 compounds representing pharmaceuticals, industrial chemicals and pesticides was compiled including physicochemical properties as well as in silico and in vitro information. The Bayesian model was able to predict DNT potential with an accuracy of 76%, classifying the compounds into low, medium or high probability classes. The modelling workflow achieved three further goals: it dealt with missing values; accommodated unbalanced and correlated data; and followed the structure of a directed acyclic graph (DAG) to simulate the simplified AOP network. Overall, the model demonstrated the utility of Bayesian hierarchical modelling for the development of quantitative AOP (qAOP) models and for informing the use of new approach methodologies (NAMs) in chemical risk assessment.
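One simple way to combine key-event probabilities along a DAG is a noisy-OR gate in which each common key event contributes independently to the adverse outcome, as sketched below. The probabilities and weights are invented for illustration, and the gate itself is only a toy stand-in for the full Bayesian hierarchical model fitted to the 88-compound dataset.

```python
# Toy noisy-OR combination over a simplified AOP DAG; probabilities and KER weights are
# invented and do not come from the published Bayesian model.
p_cke = {"CKE1": 0.7, "CKE2": 0.6, "CKE3": 0.4}   # P(compound induces each common key event)
w = {"CKE1": 0.8, "CKE2": 0.6, "CKE3": 0.5}       # strength of each key event relationship to the AO

p_not_ao = 1.0
for cke, p in p_cke.items():
    p_not_ao *= 1.0 - w[cke] * p                  # each active key event independently fails to trigger the AO
p_ao = 1.0 - p_not_ao

band = "high" if p_ao > 0.66 else "medium" if p_ao > 0.33 else "low"
print(round(p_ao, 2), band)                       # e.g. 0.77 -> high probability of the DNT adverse outcome
```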
Physiologically Based Kinetic (PBK) models are valuable tools to help define safe external levels of chemicals based on internal doses at target organs in experimental animals, humans and organisms used in environmental risk assessment. As the toxicity testing paradigm shifts to alternative testing approaches, PBK model development has started to rely (mostly or entirely) on model parameters quantified using in vitro or in silico methods. Recently, the Organisation for Economic Cooperation and Development (OECD) published a guidance document (GD) describing a scientific workflow for characterising and validating PBK models developed using in vitro and in silico data. The GD provides an assessment framework for evaluating these models, with emphasis on the major uncertainties underlying model inputs and outputs. To help end-users submit or evaluate a PBK model for regulatory purposes, the GD also includes a template for documenting the model characteristics, and a checklist for evaluating the quality of a model. This commentary highlights the principles, criteria and tools laid out in the OECD PBK model GD, with the aim of facilitating the dialogue between model developers and risk assessors.
New approaches in toxicology based on in vitro methods and computational modelling offer considerable potential to improve the efficiency and effectiveness of chemical hazard and risk assessment in a variety of regulatory contexts. However, this presents challenges both for developers and regulatory assessors because often these two communities do not share the same level of confidence in a new approach. To address this challenge, various assessment frameworks have been developed over the past 20 years with the aim of creating harmonised and systematic approaches for evaluating new methods. These frameworks typically focus on specific methodologies and technologies, which has proven useful for establishing the validity and credibility of individual methods. However, given the increasing need to compare methods and combine their use in integrated assessment strategies, the multiplicity of frameworks is arguably becoming a barrier to their acceptance. In this commentary, we explore the concepts of model validity and credibility, and we illustrate how a set of seven credibility factors provides a method-agnostic means of comparing different kinds of predictive toxicology approaches. It is hoped that this will facilitate communication and cross-disciplinarity among method developers and users, with the ultimate aim of increasing the acceptance and use of predictive approaches in toxicology.
With current progress in science, there is growing interest in developing and applying Physiologically Based Kinetic (PBK) models in chemical risk assessment, as knowledge of internal exposure to chemicals is critical to understanding potential effects in vivo. In particular, a new generation of PBK models is being developed in which the model parameters are derived from in silico and in vitro methods. To increase the acceptance and use of these "Next Generation PBK models", there is a need to demonstrate their validity. However, this is challenging in the case of data-poor chemicals that are lacking in kinetic data and for which predictive capacity cannot, therefore, be assessed. The aim of this work is to lay down the fundamental steps in using a read-across framework to inform modellers and risk assessors on how to develop, or evaluate, PBK models for chemicals without in vivo kinetic data. The application of a PBK model that takes into account the absorption, distribution, metabolism and excretion characteristics of the chemical reduces the uncertainties in the biokinetics and biotransformation of the chemical of interest. A strategic flow-charting application, proposed herein, allows users to identify the minimum information needed to perform a read-across from a data-rich chemical to its data-poor analogue(s). The workflow analysis is illustrated by means of a real case study using the alkenylbenzene class of chemicals, showing the reliability and potential of this approach. It was demonstrated that a consistent quantitative relationship between model simulations could be achieved using models for estragole and safrole (source chemicals) when applied to methyleugenol (target chemical). When the PBK model code for the source chemicals was adapted to use input values relevant to the target chemical, simulations were consistent between the models. The resulting PBK model for methyleugenol was further evaluated by comparing the results to an existing, published model for methyleugenol, providing further evidence that the approach was successful. This can be considered a "read-across" approach, enabling a valid PBK model to be derived to aid the assessment of a data-poor chemical.
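The core move, re-running the same model code with input values relevant to the target chemical, can be pictured with the one-compartment sketch below. The kinetic structure and all parameter values are illustrative assumptions, not the estragole, safrole or methyleugenol PBK models.

```python
# Sketch of the parameter-swap idea behind PBK read-across: the same model code is
# re-run with target-chemical inputs. Structure and values are illustrative only.
import numpy as np

def one_compartment(dose_mg, vd_l, cl_l_per_h, t_h):
    """Plasma concentration-time course for an i.v. bolus in a one-compartment model."""
    k_el = cl_l_per_h / vd_l
    return (dose_mg / vd_l) * np.exp(-k_el * t_h)

t = np.linspace(0, 24, 25)
source_params = {"vd_l": 40.0, "cl_l_per_h": 12.0}   # data-rich source chemical (illustrative)
target_params = {"vd_l": 35.0, "cl_l_per_h": 9.0}    # data-poor analogue, parameters from in vitro/in silico
c_source = one_compartment(50.0, t_h=t, **source_params)
c_target = one_compartment(50.0, t_h=t, **target_params)
print(round(c_source[6], 2), round(c_target[6], 2))  # plasma concentrations at 6 h
```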
In view of the need to enhance the assessment of consumer products called for in the EU Chemicals Strategy for Sustainability, we developed a methodology for evaluating hazard by combining information across different systemic toxicity endpoints and integrating the information with new approach methodologies. This integrates mechanistic information with a view to avoiding redundant in vivo studies, minimising reliance on apical endpoint tests and ultimately devising efficient testing strategies. Here, we present the application of our methodology to carcinogenicity assessment, mapping the available information from toxicity test methods across endpoints to the key characteristics of carcinogens. Test methods are deconstructed to allow the information they provide to be organised in a systematic way, enabling the description of the toxicity mechanisms leading to the adverse outcome. This integrated approach provides a flexible and resource-efficient means of fully exploiting test methods for which test guidelines are available to fulfil regulatory requirements for systemic toxicity assessment as well as identifying where new methods can be integrated.
The EU Directive 2010/63/EU on the protection of animals used for scientific purposes and other EU regulations, such as REACH and the Cosmetic Products Regulation, advocate for a change in the way toxicity testing is conducted. Whilst the Cosmetic Products Regulation bans animal testing altogether, REACH aims for a progressive shift from in vivo testing towards quantitative in vitro and computational approaches. Several endpoints can already be addressed using non-animal approaches, including skin corrosion and irritation, serious eye damage and irritation, skin sensitisation, and mutagenicity and genotoxicity. However, for systemic effects such as acute toxicity, repeated dose toxicity and reproductive and developmental toxicity, evaluation of chemicals under REACH still heavily relies on animal tests. Here we summarise current EU regulatory requirements for the human health assessment of chemicals under REACH and the Cosmetic Products Regulation, considering the more critical endpoints and identifying the main challenges in introducing alternative methods into regulatory testing practice. This supports a recent initiative taken by the International Cooperation on Alternative Test Methods (ICATM) to summarise current regulatory requirements specific for the assessment of chemicals and cosmetic products for several human health-related endpoints, with the aim of comparing different jurisdictions and coordinating the promotion and ultimately the implementation of non-animal approaches worldwide. Recent initiatives undertaken at European level to promote the 3Rs and the use of alternative methods in current regulatory practice are also discussed.
The COSMOS Database (DB) was originally established to provide reliable data for cosmetics-related chemicals within the COSMOS Project funded as part of the SEURAT-1 Research Initiative. The database has subsequently been maintained and developed further into COSMOS Next Generation (NG), a combination of database and in silico tools, essential components of a knowledge base. COSMOS DB provided a cosmetics inventory as well as other regulatory inventories, accompanied by assessment results and in vitro and in vivo toxicity data. In addition to data content curation, much effort was dedicated to data governance – data authorisation, characterisation of quality, documentation of meta information, and control of data use. Through this effort, COSMOS DB was able to merge and fuse data of various types from different sources. Building on the previous effort, the COSMOS Minimum Inclusion (MINIS) criteria for a toxicity database were further expanded to quantify the reliability of studies. COSMOS NG features multiple fingerprints for analysing structure similarity, and new tools to calculate molecular properties and screen chemicals with endpoint-related public profilers, such as DNA and protein binders, liver alerts and genotoxic alerts. The publicly available COSMOS NG enables users to compile information and execute analyses such as category formation and read-across. This paper provides a step-by-step guided workflow for a simple read-across case, starting from a target structure and culminating in an estimation of a NOAEL confidence interval. Given its strong technical foundation, inclusion of quality-reviewed data, and provision of tools designed to facilitate communication between users, COSMOS NG is a first step towards building a toxicological knowledge hub leveraging many public data systems for chemical safety evaluation. We continue to monitor the feedback from the user community at support@mn-am.com.
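A stripped-down version of the read-across workflow, from a target structure to a rough NOAEL interval over the nearest analogues, can be sketched with open-source tools such as RDKit, as below. The SMILES strings, NOAEL values and the choice of two nearest analogues are placeholders; the actual workflow runs inside COSMOS NG with its own fingerprints, profilers and quality-reviewed data.

```python
# Stripped-down read-across sketch using RDKit Morgan fingerprints; structures and NOAELs
# are placeholders, not COSMOS DB records, and the interval is only indicative.
import numpy as np
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

target = Chem.MolFromSmiles("CCOc1ccccc1")        # hypothetical target structure
candidates = {                                    # SMILES -> oral NOAEL (mg/kg bw/day), invented
    "CCOc1ccccc1O": 55.0,
    "CCOc1ccccc1C": 80.0,
    "c1ccccc1": 200.0,
}

def fingerprint(mol):
    return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

fp_target = fingerprint(target)
ranked = sorted(
    ((DataStructs.TanimotoSimilarity(fp_target, fingerprint(Chem.MolFromSmiles(s))), noael)
     for s, noael in candidates.items()),
    reverse=True,
)
log_noaels = np.log([noael for _, noael in ranked[:2]])   # two most similar analogues
mean, sd = log_noaels.mean(), log_noaels.std(ddof=1)
interval = np.exp([mean - 1.96 * sd, mean + 1.96 * sd])   # rough read-across NOAEL interval
print(np.round(interval, 1))
```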
Prenatal and postnatal co-exposure to multiple chemicals at the same time may have deleterious effects on the developing nervous system. We previously showed that chemicals acting through a similar mode of action (MoA), and grouped based on perturbation of brain derived neurotrophic factor (BDNF), induced greater neurotoxic effects on human induced pluripotent stem cell (hiPSC)-derived neurons and astrocytes compared to chemicals with dissimilar MoA. Here we assessed the effects of repeated dose (14 days) treatments with mixtures containing the six chemicals tested in our previous study (Bisphenol A, Chlorpyrifos, Lead(II) chloride, Methylmercury chloride, PCB138 and Valproic acid) along with 2,2′,4,4′-tetrabromodiphenyl ether (BDE47), Ethanol, Vinclozolin and 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD), on hiPSC-derived neural stem cells undergoing differentiation toward mixed neurons/astrocytes up to 21 days. Chemicals with a similar MoA in mixtures caused an increase of BDNF levels and neurite outgrowth, and a decrease of synapse formation, which led to inhibition of electrical activity. Perturbations of these endpoints are described as common key events in adverse outcome pathways (AOPs) specific for DNT. When compared with the mixtures tested in our previous study, adding similarly acting chemicals (BDE47 and EtOH) to the mixture resulted in a stronger downregulation of synapses. A synergistic effect on some synaptogenesis-related features (PSD95 in particular) was hypothesized upon treatment with the tested mixtures, as indicated by mathematical modelling. Our findings confirm that the use of human iPSC-derived mixed neuronal/glial models applied to a battery of in vitro assays anchored to key events in DNT AOP networks, combined with mathematical modelling, is a suitable testing strategy to assess in vitro DNT induced by chemical mixtures.
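Where the abstract notes that synergy on synaptogenesis-related features was indicated by mathematical modelling, one common reference model for flagging synergy is Bliss independence, sketched below with invented effect fractions. This is offered only as an illustration of the concept, not as the specific mixture model used in the study.

```python
# Bliss-independence sketch for flagging synergy; effect fractions are invented and this
# is not necessarily the mixture model applied in the study.
e_a, e_b = 0.30, 0.25                 # fractional effects of chemicals A and B alone (0-1)
e_observed = 0.55                     # fractional effect observed for the mixture (invented)

e_expected = e_a + e_b - e_a * e_b    # Bliss independence expectation for the combination
excess = e_observed - e_expected      # > 0 suggests synergy, < 0 antagonism
print(round(e_expected, 3), round(excess, 3))
```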
The assessment of the allergenic potential of chemicals, crucial for ensuring public health safet... more The assessment of the allergenic potential of chemicals, crucial for ensuring public health safety, faces challenges in accuracy and raises ethical concerns due to reliance on animal testing. This paper presents a novel bioinformatic protocol designed to address the critical challenge of predicting immune responses to chemical sensitizers without the use of animal testing. The core innovation lies in the integration of advanced bioinformatics tools, including the Universal Immune System Simulator (UISS), which models detailed immune system dynamics. By leveraging data from structural predictions and docking simulations, our approach provides a more accurate and ethical method for chemical safety evaluations, especially in distinguishing between skin and respiratory sensitizers. Our approach integrates a comprehensive eight-step process, beginning with the meticulous collection of chemical and protein data from databases like PubChem and the Protein Data Bank. Following data acquisition, structural predictions are performed using cutting-edge tools such as AlphaFold to model proteins whose structures have not been previously elucidated. This structural information is then utilized in subsequent docking simulations, leveraging both ligand-protein and protein-protein interactions to predict how chemical compounds may trigger immune responses. The core novelty of our method lies in the application of UISS-an advanced agent-based modelling system that simulates detailed immune system dynamics. By inputting the results from earlier stages, including docking scores and potential epitope identifications, UISS meticulously forecasts the type and severity of immune responses, distinguishing between Th1mediated skin and Th2-mediated respiratory allergic reactions. This ability to predict distinct immune pathways is a crucial advance over current methods, which often cannot differentiate between the sensitization mechanisms. To validate the accuracy and robustness of our approach, we applied the protocol to well-known sensitizers: 2,4-dinitrochlorobenzene for skin allergies and trimellitic anhydride for respiratory allergies. The results clearly demonstrate the protocol's ability to differentiate between these distinct immune responses, underscoring its potential for replacing traditional animal-based testing methods. The results not only support the potential of our method to replace animal testing in chemical safety assessments but also highlight its role in enhancing the understanding of chemicalinduced immune reactions. Through this innovative integration of computational biology and immunological modelling, our protocol offers a transformative approach to toxicological evaluations, increasing the reliability of safety assessments.
The European regulatory framework on chemicals is at a crossroads. There are calls for the framew... more The European regulatory framework on chemicals is at a crossroads. There are calls for the framework to be more effective, by better protecting people and the environment. There is also room for it to be more efficient and cost-effective, by harmonizing assessment practices across sectors and avoiding the need for unnecessary testing. At the same time, there is a political commitment to phase out animal testing in chemical safety assessments. In this commentary, we argue that these needs are not at odds with each other. On the contrary, the European Commission's roadmap to phase out animal testing could also be the transition pathway to a more efficient, effective, and sustainable regulatory ecosystem. Central to our proposal is a framework based on biological reasoning in which biological questions can be answered by a choice of methods, with non-animal methods progressively becoming the only choice. Within this framework, a tiered approach to testing and assessment allows for greater efficiency and effectiveness, while also introducing considerations of proportionality and cost-effectiveness. Testing strategies, and their component methods, should be developed in tandem and judged in terms of their outcomes, and the protection levels they inform, rather than their ability to predict the outputs of animal tests.
The body of EU chemicals legislation has evolved since the 1960s, producing the largest knowledge... more The body of EU chemicals legislation has evolved since the 1960s, producing the largest knowledge base on chemicals worldwide. Like any evolving system, however, it has become increasingly diverse and complex, resulting in inefficiencies and potential inconsistencies. In the light of the EU Chemicals Strategy for Sustainability, it is therefore timely and reasonable to consider how aspects of the system could be simplified and streamlined, without losing the hard-earned benefits to human health and the environment. In this commentary, we propose a conceptual framework that could be the basis of Chemicals 2.0-a future safety assessment and management approach that is based on the application of New Approach Methodologies (NAMs), mechanistic reasoning and cost-benefit considerations. Chemicals 2.0 is designed to be a more efficient and more effective approach for assessing chemicals, and to comply with the EU goal to completely replace animal testing, in line with Directive 2010/63/EU. We propose five design criteria for Chemicals 2.0 to define what the future system should achieve. The approach is centered on a classification matrix in which NAMs for toxicodynamics and toxicokinetics are used to classify chemicals according to their level of concern. An important principle is the need to ensure an equivalent, or higher, protection level.
A physiologically-based pharmacokinetic (PBPK) model represents the structural components of the ... more A physiologically-based pharmacokinetic (PBPK) model represents the structural components of the body with physiologically relevant compartments connected via blood flow rates described by mathematical equations to determine drug disposition. PBPK models are used in the pharmaceutical sector for drug development, precision medicine, and the chemical industry to predict safe levels of exposure during the registration of chemical substances. However, one area of application where PBPK models have been scarcely used is forensic science. In this review, we give an overview of PBPK models successfully developed for several illicit drugs and environmental chemicals that could be applied for forensic interpretation, highlighting the gaps, uncertainties, and limitations.
Prediction of chemical toxicity is very useful in risk assessment. With the current paradigm shif... more Prediction of chemical toxicity is very useful in risk assessment. With the current paradigm shift towards the use of in vitro and in silico systems, we present herein a theoretical mathematical description of a quasi-diffusion process to predict chemical concentrations in 3-D spheroid cell cultures. By extending a 2-D Virtual Cell Based Assay (VCBA) model into a 3-D spheroid cell model, we assume that cells are arranged in a series of concentric layers within the sphere. We formulate the chemical quasi-diffusion process by simplifying the spheroid with respect to the number of cells in each layer. The system was calibrated and tested with acetaminophen (APAP). Simulated predictions of APAP toxicity were compared with empirical data from in vitro measurements by using a 3-D spheroid model. The results of this first attempt to extend the VCBA model are promisingthey show that the VCBA model simulates close correlation between the influence of compound concentration and the viability of the HepaRG 3-D cell culture. The 3-D VCBA model provides a complement to current in vitro procedures to refine experimental setups, to fill data gaps and help in the interpretation of in vitro data for the purposes of risk assessment.
In this editorial we reflect on the past decade of developments in predictive toxicology, and in ... more In this editorial we reflect on the past decade of developments in predictive toxicology, and in particular on the evolution of the Adverse Outcome Pathway (AOP) paradigm. Starting out as a concept, AOPs have become the focal point of a community of scientists, regulators and decision-makers. AOPs provide the mechanistic knowledge underpinning the development of Integrated Approaches to Testing and Assessment (IATA), including computational models now referred to as quantitative AOPs (qAOPs). With reference to recent and related works on qAOPs, we take a brief historical perspective and ask what is the next stage in modernising chemical toxicology beyond animal testing.
The adverse outcome pathway (AOP) is a conceptual construct that facilitates organisation and int... more The adverse outcome pathway (AOP) is a conceptual construct that facilitates organisation and interpretation of mechanistic data representing multiple biological levels and deriving from a range of methodological approaches including in silico, in vitro and in vivo assays. AOPs are playing an increasingly important role in the chemical safety assessment paradigm and quantification of AOPs is an important step towards a more reliable prediction of chemically induced adverse effects. Modelling methodologies require the identification, extraction and use of reliable data and information to support the inclusion of quantitative considerations in AOP development. An extensive and growing range of digital resources are available to support the modelling of quantitative AOPs, providing a wide range of information, but also requiring guidance for their practical application. A framework for qAOP development is proposed based on feedback from a group of experts and three qAOP case studies. The proposed framework provides a harmonised approach for both regulators and scientists working in this area.
Editorial on the Research Topic Advances and Refinements in the Development and Application of Th... more Editorial on the Research Topic Advances and Refinements in the Development and Application of Threshold of Toxicological Concern The Threshold of Toxicological Concern (TTC) is an exposure threshold below which there is no appreciable risk to human health. There are two main approaches: TTC values based on cancer potency data from which a one in a million excess lifetime risk is estimated and TTC values based on non-cancer effects. For the latter approach, a distribution is typically fitted to the No Observed Adverse Effect Levels (NOAELs) from repeat dose toxicity studies from which a 5th percentile value is taken and adjusted using an uncertainty factor (usually this is 100). Established TTC values are those based on oral chronic studies that were first developed by Munro et al. (1996) who subcategorised chemicals into one of three Cramer structural classes (Cramer et al., 1978). Kroes et al. (2004) presented a tiered TTC approach that established several human exposure thresholds spanning four orders of magnitude where the lowest TTC was for substances presenting structural alerts for genotoxicity (0.15 μg/d), to the next tier for organophosphates/carbamates (18 μg/d) and the remaining higher TTC values representing the same three Cramer classes originally derived by Munro et al. (1996). The World Health Organization (WHO) and the European Food Safety Authority (EFSA) (European Food Safety Authority and World Health Organization, 2016; EFSA et al., 2019) have determined that the TTC approach is a sound and fit-for-purpose risk assessment tool, with a number of caveats, in cases where chemical-specific repeat dose toxicity data are not available.
Computational and Structural Biotechnology Journal, 2022
Immunotoxicity hazard identification of chemicals aims to evaluate the potential for unintended e... more Immunotoxicity hazard identification of chemicals aims to evaluate the potential for unintended effects of chemical exposure on the immune system. Perfluorinated alkylate substances (PFAS), such as perfluorooctane sulfonate (PFOS) and perfluorooctanoic acid (PFOA), are persistent, globally disseminated environmental contaminants known to be immunotoxic. Elevated PFAS exposure is associated with lower antibody responses to vaccinations in children and in adults. In addition, some studies have reported a correlation between PFAS levels in the body and lower resistance to disease, in other words an increased risk of infections or cancers. In this context, modelling and simulation platforms could be used to simulate the human immune system with the aim to evaluate the adverse effects that immunotoxicants may have. Here, we show the conditions under which a mathematical model developed for one purpose and application (e.g., in the pharmaceutical domain) can be successfully translated and transferred to another (e.g., in the chemicals domain) without undergoing significant adaptation. In particular, we demonstrate that the Universal Immune System Simulator was able to simulate the effects of PFAS on the immune system, introducing entities and new interactions that are biologically involved in the phenomenon. This also revealed a potentially exploitable pathway for assessing immunotoxicity through a computational model.
Computational and Structural Biotechnology Journal, 2022
In many domains regulating chemicals and chemical products, there is a legal requirement to deter... more In many domains regulating chemicals and chemical products, there is a legal requirement to determine skin sensitivity to allergens. While many in vitro assays to detect contact hypersensitivity have been developed as alternatives to animal testing over the past ten years and significant progress has been made in this area, there is still a need for continued investment in the creation of techniques and strategies that will allow accurate identification of potential contact allergens and their potency in vitro. In silico models are promising tools in this regard. However, none of the state-of-the-art systems seems to function well enough to serve as a stand-alone hazard identification tool, especially in evaluating the possible allergenicity effects in humans. The Universal Immune System Simulator, a mechanistic computational platform that simulates the human immune system response to a specific insult, provides a means of predicting the immunotoxicity induced by skin sensitisers, enriching the collection of computational models for the assessment of skin sensitization. Here, we present a specific disease layer implementation of the Universal Immune System Simulator for the prediction of allergic contact dermatitis induced by specific skin sensitizers.
Structure-activity relationships (SARs) in toxicology have enabled the formation of structural ru... more Structure-activity relationships (SARs) in toxicology have enabled the formation of structural rules which, when coded as structural alerts, are essential tools in in silico toxicology. Whilst other in silico methods have approaches for their evaluation, there is no formal process to assess the confidence that may be associated with a structural alert. This investigation proposes twelve criteria to assess the uncertainty associated with structural alerts, allowing for an assessment of confidence. The criteria are based around the stated purpose, description of the chemistry, toxicology and mechanism, performance and coverage, as well as corroborating and supporting evidence of the alert. Alerts can be given a confidence assessment and score, enabling the identification of areas where more information may be beneficial. The scheme to evaluate structural alerts was placed in the context of various use cases for industrial and regulatory applications. The analysis of alerts, and consideration of the evaluation scheme, identifies the different characteristics an alert may have, such as being highly specific or generic. These characteristics may determine when an alert can be used for specific uses such as identification of analogues for read-across or hazard identification.
New Approach Methodologies (NAMs) are considered to include any in vitro, in silico or chemistry-... more New Approach Methodologies (NAMs) are considered to include any in vitro, in silico or chemistry-based method, as well as the strategies to implement them, that may provide information that could inform chemical safety assessment. Current chemical legislation in the European Union is limited in its acceptance of the widespread use of NAMs. The European Partnership for Alternative Approaches to Animal Testing (EPAA) therefore convened a 'Deep Dive Workshop' to explore the use of NAMs in chemical safety assessment, the aim of which was to support regulatory decisions, whilst intending to protect human health. The workshop recognised that NAMs are currently used in many industrial sectors, with some considered as fit for regulatory purpose. Moreover, the workshop identified key discussion points that can be addressed to increase the use and regulatory acceptance of NAMs. These are based on the changes needed in frameworks for regulatory requirements and the essential needs in education, training and greater stakeholder engagement as well the gaps in the scientific basis of NAMs.
In a century where toxicology and chemical risk assessment are embracing alternative methods to a... more In a century where toxicology and chemical risk assessment are embracing alternative methods to animal testing, there is an opportunity to understand the causal factors of neurodevelopmental disorders such as learning and memory disabilities in children, as a foundation to predict adverse effects. New testing paradigms, along with the advances in probabilistic modelling, can help with the formulation of mechanistically-driven hypotheses on how exposure to environmental chemicals could potentially lead to developmental neurotoxicity (DNT). This investigation aimed to develop a Bayesian hierarchical model of a simplified AOP network for DNT. The model predicted the probability that a compound induces each of three selected common key events (CKEs) of the simplified AOP network and the adverse outcome (AO) of DNT, taking into account correlations and causal relations informed by the key event relationships (KERs). A dataset of 88 compounds representing pharmaceuticals, industrial chemicals and pesticides was compiled including physicochemical properties as well as in silico and in vitro information. The Bayesian model was able to predict DNT potential with an accuracy of 76%, classifying the compounds into low, medium or high probability classes. The modelling workflow achieved three further goals: it dealt with missing values; accommodated unbalanced and correlated data; and followed the structure of a directed acyclic graph (DAG) to simulate the simplified AOP network. Overall, the model demonstrated the utility of Bayesian hierarchical modelling for the development of quantitative AOP (qAOP) models and for informing the use of new approach methodologies (NAMs) in chemical risk assessment.
Physiologically Based Kinetic (PBK) models are valuable tools to help define safe external levels... more Physiologically Based Kinetic (PBK) models are valuable tools to help define safe external levels of chemicals based on internal doses at target organs in experimental animals, humans and organisms used in environmental risk assessment. As the toxicity testing paradigm shifts to alternative testing approaches, PBK model development has started to rely (mostly or entirely) on model parameters quantified using in vitro or in silico methods. Recently, the Organisation for Economic Cooperation and Development (OECD) published a guidance document (GD) describing a scientific workflow for characterising and validating PBK models developed using in vitro and in silico data. The GD provides an assessment framework for evaluating these models, with emphasis on the major uncertainties underlying model inputs and outputs. To help end-users submit or evaluate a PBK model for regulatory purposes, the GD also includes a template for documenting the model characteristics, and a checklist for evaluating the quality of a model. This commentary highlights the principles, criteria and tools laid out in the OECD PBK model GD, with the aim of facilitating the dialogue between model developers and risk assessors.
New approaches in toxicology based on in vitro methods and computational modelling offer consider... more New approaches in toxicology based on in vitro methods and computational modelling offer considerable potential to improve the efficiency and effectiveness of chemical hazard and risk assessment in a variety of regulatory contexts. However, this presents challenges both for developers and regulatory assessors because often these two communities do not share the same level of confidence in a new approach. To address this challenge, various assessment frameworks have been developed over the past 20 years with the aim of creating harmonised and systematic approaches for evaluating new methods. These frameworks typically focus on specific methodologies and technologies, which has proven useful for establishing the validity and credibility of individual methods. However, given the increasing need to compare methods and combine their use in integrated assessment strategies, the multiplicity of frameworks is arguably becoming a barrier to their acceptance. In this commentary, we explore the concepts of model validity and credibility, and we illustrate how a set of seven credibility factors provides a method-agnostic means of comparing different kinds of predictive toxicology approaches. It is hoped that this will facilitate communication and cross-disciplinarity among method developers and users, with the ultimate aim of increasing the acceptance and use of predictive approaches in toxicology.
With current progress in science, there is growing interest in developing and applying Physiologi... more With current progress in science, there is growing interest in developing and applying Physiologically Based Kinetic (PBK) models in chemical risk assessment, as knowledge of internal exposure to chemicals is critical to understanding potential effects in vivo. In particular, a new generation of PBK models is being developed in which the model parameters are derived from in silico and in vitro methods. To increase the acceptance and use of these "Next Generation PBK models", there is a need to demonstrate their validity. However, this is challenging in the case of data-poor chemicals that are lacking in kinetic data and for which predictive capacity cannot, therefore, be assessed. The aim of this work is to lay down the fundamental steps in using a read across framework to inform modellers and risk assessors on how to develop, or evaluate, PBK models for chemicals without in vivo kinetic data. The application of a PBK model that takes into account the absorption, distribution, metabolism and excretion characteristics of the chemical reduces the uncertainties in the biokinetics and biotransformation of the chemical of interest. A strategic flow-charting application, proposed herein, allows users to identify the minimum information to perform a read-across from a data-rich chemical to its data-poor analogue(s). The workflow analysis is illustrated by means of a real case study using the alkenylbenzene class of chemicals, showing the reliability and potential of this approach. It was demonstrated that a consistent quantitative relationship between model simulations could be achieved using models for estragole and safrole (source chemicals) when applied to methyleugenol (target chemical). When the PBK model code for the source chemicals was adapted to utilise input values relevant to the target chemical, simulation was consistent between the models. The resulting PBK model for methyleugenol was further evaluated by comparing the results to an existing, published model for methyleugenol, providing further evidence that the approach was successful. This can be considered as a "read-across" approach, enabling a valid PBK model to be derived to aid the assessment of a data poor chemical.
In view of the need to enhance the assessment of consumer products called for in the EU Chemicals... more In view of the need to enhance the assessment of consumer products called for in the EU Chemicals Strategy for Sustainability, we developed a methodology for evaluating hazard by combining information across different systemic toxicity endpoints and integrating the information with new approach methodologies. This integrates mechanistic information with a view to avoiding redundant in vivo studies, minimising reliance on apical endpoint tests and ultimately devising efficient testing strategies. Here, we present the application of our methodology to carcinogenicity assessment, mapping the available information from toxicity test methods across endpoints to the key characteristics of carcinogens. Test methods are deconstructed to allow the information they provide to be organised in a systematic way, enabling the description of the toxicity mechanisms leading to the adverse outcome. This integrated approach provides a flexible and resource-efficient means of fully exploiting test methods for which test guidelines are available to fulfil regulatory requirements for systemic toxicity assessment as well as identifying where new methods can be integrated.
The EU Directive 2010/63/EU on the protection of animals used for scientific purposes and other EU regulations, such as REACH and the Cosmetic Products Regulation, advocate a change in the way toxicity testing is conducted. Whilst the Cosmetic Products Regulation bans animal testing altogether, REACH aims for a progressive shift from in vivo testing towards quantitative in vitro and computational approaches. Several endpoints can already be addressed using non-animal approaches, including skin corrosion and irritation, serious eye damage and irritation, skin sensitisation, and mutagenicity and genotoxicity. However, for systemic effects such as acute toxicity, repeated dose toxicity and reproductive and developmental toxicity, the evaluation of chemicals under REACH still relies heavily on animal tests. Here we summarise current EU regulatory requirements for the human health assessment of chemicals under REACH and the Cosmetic Products Regulation, considering the more critical endpoints and identifying the main challenges in introducing alternative methods into regulatory testing practice. This supports a recent initiative taken by the International Cooperation on Alternative Test Methods (ICATM) to summarise current regulatory requirements specific to the assessment of chemicals and cosmetic products for several human health-related endpoints, with the aim of comparing different jurisdictions and coordinating the promotion, and ultimately the implementation, of non-animal approaches worldwide. Recent initiatives undertaken at European level to promote the 3Rs and the use of alternative methods in current regulatory practice are also discussed.
The COSMOS Database (DB) was originally established to provide reliable data for cosmetics-related chemicals within the COSMOS Project, funded as part of the SEURAT-1 Research Initiative. The database has subsequently been maintained and developed further into COSMOS Next Generation (NG), a combination of database and in silico tools that forms the essential components of a knowledge base. COSMOS DB provided a cosmetics inventory as well as other regulatory inventories, accompanied by assessment results and in vitro and in vivo toxicity data. In addition to data content curation, much effort was dedicated to data governance: data authorisation, characterisation of quality, documentation of meta-information, and control of data use. Through this effort, COSMOS DB was able to merge and fuse data of various types from different sources. Building on the previous effort, the COSMOS Minimum Inclusion (MINIS) criteria for a toxicity database were further expanded to quantify the reliability of studies. COSMOS NG features multiple fingerprints for analysing structure similarity, and new tools to calculate molecular properties and screen chemicals with endpoint-related public profilers, such as DNA and protein binders, liver alerts and genotoxic alerts. The publicly available COSMOS NG enables users to compile information and execute analyses such as category formation and read-across. This paper provides a step-by-step guided workflow for a simple read-across case, starting from a target structure and culminating in the estimation of a NOAEL confidence interval. Given its strong technical foundation, inclusion of quality-reviewed data, and provision of tools designed to facilitate communication between users, COSMOS NG is a first step towards building a toxicological knowledge hub leveraging many public data systems for chemical safety evaluation. We continue to monitor feedback from the user community at support@mn-am.com.
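For readers unfamiliar with fingerprint-based similarity screening of the kind used to select read-across analogues, the snippet below is an illustrative sketch using RDKit Morgan fingerprints and Tanimoto similarity; it is not the COSMOS NG implementation, and the SMILES strings and analogue names are arbitrary examples.

```python
# Illustrative sketch only: ranking candidate read-across analogues against a target
# structure by Tanimoto similarity of Morgan fingerprints. This stands in for the
# multiple fingerprints implemented in COSMOS NG; structures are arbitrary examples.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def morgan_fp(smiles, radius=2, n_bits=2048):
    mol = Chem.MolFromSmiles(smiles)
    return AllChem.GetMorganFingerprintAsBitVect(mol, radius, nBits=n_bits)

target = morgan_fp("CCOc1ccc(C=O)cc1")                  # hypothetical target structure
candidates = {"analogue_1": "COc1ccc(C=O)cc1",
              "analogue_2": "CCOc1ccc(CC=O)cc1"}

ranked = sorted(((name, DataStructs.TanimotoSimilarity(target, morgan_fp(smi)))
                 for name, smi in candidates.items()),
                key=lambda pair: pair[1], reverse=True)
for name, sim in ranked:
    print(f"{name}: Tanimoto similarity = {sim:.2f}")
```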
Prenatal and postnatal co-exposure to multiple chemicals at the same time may have deleterious effects on the developing nervous system. We previously showed that chemicals acting through a similar mode of action (MoA), grouped based on perturbation of brain-derived neurotrophic factor (BDNF), induced greater neurotoxic effects on human induced pluripotent stem cell (hiPSC)-derived neurons and astrocytes compared to chemicals with dissimilar MoA. Here we assessed the effects of repeated-dose (14 days) treatments with mixtures containing the six chemicals tested in our previous study (Bisphenol A, Chlorpyrifos, Lead(II) chloride, Methylmercury chloride, PCB138 and Valproic acid) along with 2,2′,4,4′-tetrabromodiphenyl ether (BDE47), Ethanol, Vinclozolin and 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD), on hiPSC-derived neural stem cells undergoing differentiation toward mixed neurons/astrocytes up to 21 days. Chemicals with similar MoA in mixtures caused an increase of BDNF levels and neurite outgrowth, and a decrease of synapse formation, which led to inhibition of electrical activity. Perturbations of these endpoints are described as common key events in adverse outcome pathways (AOPs) specific for DNT. When compared with mixtures tested in our previous study, adding similarly acting chemicals (BDE47 and EtOH) to the mixture resulted in a stronger downregulation of synapses. A synergistic effect on some synaptogenesis-related features (PSD95 in particular) was hypothesised upon treatment with the tested mixtures, as indicated by mathematical modelling. Our findings confirm that the use of human iPSC-derived mixed neuronal/glial models applied to a battery of in vitro assays anchored to key events in DNT AOP networks, combined with mathematical modelling, is a suitable testing strategy to assess in vitro DNT induced by chemical mixtures.
In Silico Methods for Predicting Drug Toxicity. Methods in Molecular Biology, 2022
In this chapter, we give a brief overview of the regulatory requirements for acute systemic toxicity information in the European Union, and we review structure-based computational models that are available and potentially useful in the assessment of acute systemic toxicity. Emphasis is placed on quantitative structure–activity relationship (QSAR) models implemented by means of a range of software tools. The most recently published literature models for acute systemic toxicity are also discussed, and perspectives for future developments in this field are offered.
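To make the QSAR idea concrete, the following minimal sketch fits an ordinary least-squares model of acute toxicity (as log LD50) against a few calculated descriptors; it is not one of the published models discussed in the chapter, and both the descriptor values and the toxicity values are invented solely to show the shape of the workflow.

```python
# Minimal QSAR-style sketch: ordinary least-squares regression of acute toxicity
# (log10 LD50) on a few molecular descriptors. All numbers are invented placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Columns: logP, molecular weight, topological polar surface area (hypothetical values)
X = np.array([[1.2, 120.0, 30.0],
              [2.5, 180.0, 20.0],
              [0.3,  90.0, 60.0],
              [3.1, 220.0, 15.0],
              [1.8, 150.0, 40.0]])
y = np.array([2.9, 2.3, 3.4, 2.0, 2.6])   # hypothetical log10(LD50, mg/kg)

model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("training R^2:", r2_score(y, model.predict(X)))
```

In practice such models are only meaningful with curated training data, external validation and a defined applicability domain, which is what the software tools reviewed in the chapter provide.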
The History of Alternative Test Methods in Toxicology, 2019
In view of the time and space available to us, and given the wealth of literature and experience on the subject, we have not attempted to provide a fully comprehensive and all-embracing review of the history of alternative test methods in toxicology. For example, the application of alternative approaches in environmental toxicology and developments in nanotoxicology are out of the scope of this work. Instead, we have gathered together a group of co-authors from a wide variety of geographical locations and with considerable scientific achievements in different areas, and invited them to give their impressions of how it all began, where we are now and what the future may hold. We have tried assiduously not to impose our own experiences and prejudices on our co-authors, but this means that we are entitled to say that we do not necessarily agree with all that they have included in their chapters, or with the interpretations and views they have recorded. To set the scene, we offer a quotation from Guinevere Glasfurd's novel, The Words In My Hand, based on time spent in Amsterdam in the 1630s by the French philosopher, René Descartes. At a party held in his honour, Descartes is criticised for his belief that everything should be questioned and is asked what can be believed. He replies: "Truth: that which can be proven. Knowledge is insufficient. Knowledge is nothing without understanding." Our aim is to find better ways of searching for the truth, to increase understanding and thereby reduce uncertainty, and to provide a relevant and reliable, valid and humane, basis for the policies and decisions which profoundly affect the wellbeing of humans and the world in which we live. We believe that there is great value in historical perspectives, however subjective the individual accounts may be, as the collective lessons learned should guide us toward the truth.
The History of Alternative Test Methods in Toxicology, 2019
Established in 1991 by a Commission Communication and in the context of Directive 86/609/EEC, ECVAM was given the mandate to coordinate the EU-level validation of non-animal methods and to promote dialogue between legislators, industries, biomedical scientists, consumer organisations and animal welfare groups, with a view to the development, validation and international recognition of alternative test methods. In 2011, these duties were reaffirmed under Directive 2010/63/EU, which also formalised ECVAM as an EU Reference Laboratory. In the 1990s, ECVAM's scientific activities were focused mainly on the needs of EU cosmetics legislation, with the new EU chemicals legislation, REACH, becoming important in the early 2000s. In the 2010s, additional cross-sector policy areas, such as chemical mixtures and endocrine disruptors, have become important drivers of ECVAM's activities. From its establishment, ECVAM has also played an important role in promoting the uptake of the Three Rs in the biologicals area. ECVAM's training, dissemination and stakeholder engagement activities have also served to promote better regulatory science through alternatives to animal testing.
The History of Alternative Test Methods in Toxicology, 2019
In the early 1980s, the Organisation for Economic Cooperation and Development (OECD) began to publish test guidelines (TGs) on how in vivo toxicity tests should be conducted in laboratory animals, although where and when such TGs should be applied was the responsibility of national and regional regulatory authorities. The first TGs for non-animal tests involved in vitro methods for genotoxicity, but, by the mid-1990s, alternative tests for other types of toxicity began to be validated and proposed for regulatory acceptance. The OECD published guidance on the development of TGs and on the validation and acceptance of new or updated methods for hazard assessment, and by the mid-1990s, TGs for validated individual non-animal tests began to be published. Later, guidance was published on the development, validation and acceptance of Quantitative Structure-Activity Relationship models and integrated testing strategies.
The History of Alternative Test Methods in Toxicology, 2019
In this chapter, we explain how Integrated Approaches to Testing and Assessment (IATA) offer a means of integrating and translating the data generated by toxicity testing methods, thereby serving as flexible and suitable tools for toxicological decision making in the twenty-first century. In addition to traditional in vitro and in vivo testing methods, IATA are increasingly incorporating newly developed in vitro systems and measurement technologies such as high throughput screening and high content imaging. Computational approaches are also being used in IATA development, as a means of generating data (e.g. QSARs), interpreting data (bioinformatics and chemoinformatics), and integrating multiple sources of data (e.g. expert systems, Bayesian models). Decision analytic methods derived from socioeconomic theory can also play a role in developing flexible and optimal IATA solutions. Some of the challenges involved in the development, validation and implementation of IATA are also discussed.
The History of Alternative Test Methods in Toxicology, 2019
This chapter explains why toxicity testing is carried out for the purposes of protecting human health or the environment. The traditional reliance on animal testing and the challenges faced in the journey from developing to accepting non-animal methods are discussed, along with the roles of different players, including academia, industry, contract research organisations, governmental authorities and non-governmental organisations.
The History of Alternative Test Methods in Toxicology, 2019
As non-animal toxicity tests began to be developed in the late 1980s, the need for their validation, i.e. the independent assessment of their relevance and reliability for particular purposes, was recognised, and international discussions on the principles of validation and their practical application took place, notably in Europe and the USA. This came to involve recognition of a prevalidation stage to ensure that a method was ready for formal validation and of a prediction model, an unambiguous algorithm for converting test data into a prediction of a relevant in vivo pharmacotoxicological endpoint. By the mid-1990s, validated non-animal methods could be proposed for the development of test guidelines and for regulatory acceptance. Meanwhile, the validation process itself evolved as a result of practical experience, and processes such as catch-up validation, weight-of-evidence validation, QSAR model validation and integrated testing strategies came to be seen as important. Major roles in these developments were played by ECVAM in Europe, ICCVAM in the USA, JaCVAM in Japan, and the OECD, but validation centres now operate in a number of other countries.
The History of Alternative Test Methods in Toxicology, 2019
The development of alternatives to the use of living animals, originally in the area of toxicity testing for chemical safety assessment and more recently in biomedical research, has a rich history going back to the 1950s. In the early days, the drivers for change were mostly based on animal welfare concerns, but the evidence has increasingly pointed to the scientific and practical limitations of reliance on traditional animal methods. As toxicology transitioned into the 21st century, the need to assess an ever-increasing range of chemicals, products and exposure scenarios has been accompanied by a plethora of new tools and assessment approaches. This rapid pace of change, both in terms of societal expectations for safety assessment science and the need to evaluate and intelligently apply the emerging technologies, continues to present challenges, as well as opportunities. This chapter highlights a number of lessons learned from a historical review of the field, and reflects on how far we have come since the seminal work of Russell and Burch in 1959.
The History of Alternative Test Methods in Toxicology, 2019
This chapter provides an overview of the Threshold of Toxicological Concern (TTC) concept and the ways in which it has been applied, or proposed for application, in the EU-wide risk assessment of chemicals in food. The basis of the TTC approach is described in relation to key historical developments, including policy and research initiatives.
In this chapter, we describe different theories relevant to understanding the behavior, fate, and effects of manufactured nanomaterials (NMs). In the first part, background information on regulatory requirements related to NMs is reported, along with an overview of risk assessment as an approach to address risks posed by exposure to NMs. The second part is dedicated to the identification of key physicochemical properties that are relevant for characterizing and understanding the behavior (fate and biological effects) of NMs. An understanding of the scientific basis of NM behavior is important in the development and application of standard and alternative approaches to animal testing.
This chapter gives an overview of the EU project research landscape in computational nanotoxicology, highlighting the challenges for development and use of computational approaches in the safety assessment of nanomaterials and analyzing recent progress against these needs.
Although considerable progress has been made, and is ongoing, towards overcoming these shortcomings, there are still gaps in the knowledge of nanomaterial behavior; the scientific results are fragmented, and there is a need for more harmonization and standardization. A lack of public access to the results and tools is preventing uptake and use of the models in regulatory decision-making. More guidance is needed and, in particular, more coordination is required to direct model development towards elucidating relevant knowledge gaps and addressing questions and endpoints relevant for regulatory applications.
Overall, a number of recommendations are made with a view to increasing the availability and uptake of computational methods for regulatory nanosafety assessment. They concern inherent scientific uncertainties in the understanding of nanomaterial behavior; data quality and variability; standardization and harmonization; the availability of data and predictive models; and the accessibility and visibility of models, including the need for infrastructures.
In this chapter, we give an overview of the regulatory requirements for acute systemic toxicity information in the European Union, and we review the structure-based computational models that are available and potentially useful in the assessment of acute systemic toxicity. The most recently published literature models for acute systemic toxicity are also discussed, and perspectives for future developments in this field are offered.
Validation of Alternative Methods for Toxicity Testing, 2016
In this chapter, we provide an overview of how (Quantitative) Structure Activity Relationships, (Q)SARs, are validated and applied for regulatory purposes. We outline how chemical categories are derived to facilitate endpoint-specific read-across using tools such as the OECD QSAR Toolbox and discuss some of the current difficulties in addressing the residual uncertainties of read-across. Finally, we put forward a perspective of how non-testing approaches may evolve in light of the advances in new and emerging technologies and how these fit within the Adverse Outcome Pathway (AOP) framework.
Advances in Experimental Medicine and Biology, 2016
In this chapter, we explain how Integrated Approaches to Testing and Assessment (IATA) offer a means of integrating and translating the data generated by toxicity testing methods, thereby serving as flexible and suitable tools for toxicological decision making in the twenty-first century. In addition to traditional in vitro and in vivo testing methods, IATA are increasingly incorporating newly developed in vitro systems and measurement technologies such as high throughput screening and high content imaging. Computational approaches are also being used in IATA development, as a means of generating data (e.g. QSARs), interpreting data (bioinformatics and chemoinformatics), and integrating multiple sources of data (e.g. expert systems, Bayesian models). Decision analytic methods derived from socioeconomic theory can also play a role in developing flexible and optimal IATA solutions. Some of the challenges involved in the development, validation and implementation of IATA are also discussed.
In this chapter, a range of computational tools for applying QSAR and grouping/read-across methods is described, and their integrated use in the computational assessment of genotoxicity is illustrated through the application of selected tools to two case-study compounds: 2-amino-9H-pyrido[2,3-b]indole (AαC) and 2-aminoacetophenone (2-AAP). The first case-study compound (AαC) is an environmental pollutant and a food contaminant that can be formed during the cooking of protein-rich food. The second case-study compound (2-AAP) is a naturally occurring compound in certain foods and is also proposed for use as a flavouring agent. The overall aim is to describe and illustrate a possible way of combining different information sources and software tools for genotoxicity and metabolism prediction by means of a simple stepwise approach. The chapter is aimed at researchers and assessors who have a basic knowledge of computational toxicology and some familiarity with the practical use of computational tools. The emphasis is on how to evaluate the data generated by multiple tools, rather than the practical use of any specific tool.
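A possible way to organise the final evaluation step, combining qualitative calls from several in silico tools, is sketched below; the tool names, predictions and decision rules are placeholders chosen for illustration and do not correspond to the specific software applied in the case studies.

```python
# Illustrative sketch only: combining qualitative genotoxicity calls from several in silico
# tools into a simple consensus, in the spirit of the stepwise evaluation described above.
from collections import Counter

predictions = {
    "statistical_QSAR":   "positive",
    "expert_rule_system": "positive",
    "read_across":        "equivocal",
}

def consensus(calls):
    counts = Counter(calls.values())
    if counts["positive"] > counts["negative"]:
        return "positive (follow up with targeted testing / expert review)"
    if counts["negative"] > counts["positive"]:
        return "negative (check the applicability domain of each model)"
    return "equivocal (weight-of-evidence assessment needed)"

print("consensus call:", consensus(predictions))
```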
In the last few decades, society has become increasingly concerned about the possible impacts of chemicals to which humans and environmental organisms are exposed. In many industrialised countries, this has led to the implementation of stringent chemicals legislation and to the initiation of ambitious risk assessment and management programmes (see Chapter 1). However, it has become increasingly apparent that the magnitude of the task exceeds the availability of resources (experts, time, money) if traditional test methods are employed. This realization, coupled with increasing attention to animal welfare concerns, has prompted the development and application of various (computer-based) estimation methods in the regulatory assessment of chemicals.
Complex in vitro models (CIVMs), such as 3D cell cultures and spheroids, microphysiological systems including organ-on-chip devices, bioreactor cultures or bioprinted tissues, aim to represent higher-level anatomical and physiological aspects of human biology in experimental studies. The underpinning technologies are developing fast, and the models and methods being created constitute promising approaches in several different scientific areas with many applications in research and regulatory testing. However, the successful implementation and acceptance of CIVMs is likely to require their proper characterisation, validation or qualification to demonstrate to end-users that they are fit for a particular purpose or context of use. To explore this, the EU Reference Laboratory for alternatives to animal testing (EURL ECVAM) of the European Commission's Joint Research Centre (JRC) conducted a survey to investigate stakeholder opinions and perceived needs. The outcome of the survey showed that i) there is high interest in establishing some kind of assessment approach for CIVMs; ii) assessment approaches (even if conducted differently) should be adequate not only for regulatory use-contexts but should also address applications in research; and iii) CIVMs are still under (technological and biological) development and are thus not yet mature or standardised enough to enable a consensus on how their assessment should be conducted.
Under the European Commission's Better Regulation initiative, the JRC is leading a Fitness Check of the EU legislation for the identification and control of endocrine disrupting substances. Public consultation is an important part of any Fitness Check. This report provides a brief factual overview of the responses received over a 12-week period, from 16/12/19 to 09/03/20, to questions addressed to the general public. The aims of this public consultation were to collect views on the concerns of citizens related to endocrine disruptors, to establish the extent to which the current EU legislation meets these concerns, and to identify possible opportunities for improvement. Two other consultations conducted under this Fitness Check, one focused on stakeholder organisations and the other on small and medium-sized enterprises, will be reported separately. Responses will provide an essential input to the Fitness Check analysis carried out by the JRC. A more detailed analysis of the responses from the three consultations will be published in a synopsis report at the end of the process.
Currently, the identification of chemicals that have the potential to induce developmental neurotoxicity (DNT) is based on animal testing, since there are no alternative methods accepted for regulatory use for this purpose. Since systematic DNT testing is not a standard requirement within EU chemical safety legislation, DNT testing is only performed in higher-tier tests, triggered on the basis of structure-activity relationships or evidence of neurotoxicity in systemic studies in adult animals. However, these triggers are rarely used and, in addition, do not always serve as reliable indicators of DNT, as they are observed in adult rodents. Consequently, to date only a limited number of chemicals (Grandjean and Landrigan, 2006; Smirnova et al., 2014), mainly pesticides (Bjørling-Poulsen et al., 2008), have been tested under US EPA (OPPTS 870.630) or OECD DNT TG 426. Therefore, there is a pressing need to develop alternative methodologies that can more rapidly and cost-effectively screen large numbers of chemicals for their potential to cause DNT. In this report we propose that in vitro studies could contribute to the identification of potential triggers for DNT evaluation, since existing cellular models permit the evaluation of a chemical's impact on key neurodevelopmental processes, mimicking different windows of human brain development, especially if human models derived from induced pluripotent stem cells are applied. Furthermore, the battery of currently available alternative DNT test methods, anchored to critical neurodevelopmental processes and key events identified in DNT Adverse Outcome Pathways (AOPs), could be applied to generate in vitro data useful for various regulatory purposes. Incorporation of in vitro mechanistic information would increase scientific confidence in decision making, by decreasing uncertainty and leading to refinement of chemical grouping according to biological activity. In this report, the development of AOP-informed IATA (Integrated Approaches to Testing and Assessment) based on key neurodevelopmental processes is proposed as a tool not only for speeding up chemical screening, but also for providing mechanistic data in support of hazard assessment and the evaluation of chemical mixtures. Such mechanistically informed IATA for DNT evaluation could be developed by integrating various sources of information (e.g., non-testing methods, in vitro approaches, as well as in vivo animal and human data), contributing to screening for prioritisation, hazard identification and characterisation, and possibly safety assessment of chemicals, speeding up the evaluation of the thousands of compounds present in industrial, agricultural and consumer products that lack safety data on DNT potential. It is planned that the data and knowledge generated from such testing will feed into the development of an OECD guidance document on alternative approaches to DNT testing.
Humans and the environment are continuously exposed to a multitude of substances via different routes of exposure. However, the risk assessment of chemicals for regulatory purposes does not generally take into account the "real life" exposure to multiple substances, but mainly relies on the assessment of individual substances. This report summarises the different methodologies that are used to assess the toxic effects of mixtures. It also provides an overview of current legislation in the European Union (EU) that deals with the safety assessment of chemicals in different matrices and the extent to which the current legislation addresses the toxicological risk of mixtures. Relevant Guidance Documents from the EU and other countries (USA, Canada) and international organisations, such as the World Health Organisation (WHO) and the Organisation for Economic Cooperation and Development (OECD) are also included in this review.
This is the final report of the Nanocomput project, the main aims of which were to review the current status of computational methods that are potentially useful for predicting the properties of engineered nanomaterials, and to assess their applicability in order to provide advice on the use of these approaches for the purposes of the REACH regulation. Since computational methods cover a broad range of models and tools, emphasis was placed on Quantitative Structure-Property Relationship (QSPR) and Quantitative Structure-Activity Relationship (QSAR) models, and their potential role in predicting NM properties. In addition, the status of a diverse array of compartment-based mathematical models was assessed. These models comprised toxicokinetic (TK), toxicodynamic (TD), in vitro and in vivo dosimetry, and environmental fate models. Finally, based on systematic reviews of the scientific literature, as well as the outputs of the EU-funded research projects, recommendations for further research and development were also made. The Nanocomput project was carried out by the European Commission's Joint Research Centre (JRC) for the Directorate-General (DG) for Internal Market, Industry, Entrepreneurship and SMEs (DG GROW) under the terms of an Administrative Arrangement between JRC and DG GROW. The project lasted 39 months, from January 2014 to March 2017, and was supported by a steering group with representatives from DG GROW, DG Environment and the European Chemicals Agency (ECHA).
In order to reliably assess the risk of adverse systemic effects of chemicals by using in vitro methods, there is a need to simulate their absorption, distribution, metabolism, and excretion (ADME) in vivo to determine the target organ bioavailable concentration, and to compare this predicted internal concentration with an effective internal concentration. The effective concentration derived from in vitro toxicity studies should ideally take into account the fate of chemicals in the in vitro test system, since there can be significant differences between the applied nominal concentration and the in vitro bioavailable concentration. Whereas PBK models have been developed to simulate ADME properties in vivo, the Virtual Cell Based Assay (VCBA) has been developed to simulate in vitro fate. In this project, the VCBA model, implemented in R code, was applied to better interpret previously obtained in vitro acute toxicity data and to study how they can be compared to results from acute toxicity in vivo. For 178 chemicals previously tested in vitro with the 3T3 BALB/c cell line using the Neutral Red Uptake cytotoxicity assay, physicochemical parameters were retrieved and curated. Of these chemicals, 83 were run in the VCBA to simulate a 96-well microplate set-up with 5% serum supplementation, and their no-effect concentration (NEC) and killing rate (Kr) were optimised against the experimental data. Analyses of the partitioning results show a strong relation with lipophilicity, expressed here as the logarithm of the octanol/water partition coefficient (logKow), with highly lipophilic chemicals binding mostly to medium lipids. Among the chemicals analysed, only benzene and xylene were modelled to evaporate by more than 10%, and these were also the chemicals with the highest degradation rates during the 48-hour assay. Chemical degradation depends not only on the air and water degradation rates but also on the extent of binding of the chemical. Due to the strong binding of some chemicals to medium lipids and proteins, we analysed the impact of different serum supplementations (0%, 5% and 10%) on the dissolved chemical concentrations. As expected, for the more lipophilic chemicals, different serum levels result in different dissolved concentrations, with lipid and protein binding reducing chemical loss by evaporation. Nevertheless, the lack of saturation modelling might be misleading for the 0% supplementation condition, since the lipids originating solely from cell exudates are able to sequester chemical to a large extent; e.g. after 48 hours, 63% (1.2E-5 M) of dimethyldioctadecylammonium chloride was bound to lipid from the cells. Although highly lipophilic chemicals have a very small bioavailable fraction, the cellular uptake rate is also dependent on logKow, which compensates for this lack of bioavailability to some extent. Based on the relevance of lipophilicity to in vitro chemical bioavailability, we developed an alert system based on logKow, creating four classes of chemicals for the experimental condition with 10% serum supplementation: logKow 5-10 (A), logKow <5 (B), logKow <2.5 (C), and logKow <2 (D). New chemicals from Classes A and B, which will in the future be tested in vitro, were first run in the VCBA without considering toxicity (NEC and Kr set to 0). VCBA simulations indicated that these chemicals are more than 50% bound to medium proteins, lipids and plastic.
Therefore, for chemicals with logKow values falling in these classes, special care should be taken when extrapolating the obtained in vitro toxic concentrations to in vivo relevant doses. A comparison of the VCBA-predicted dissolved concentrations corresponding to nominal IC50 values with the available rat oral LD50 values did not improve the previously obtained correlations. This is probably because other in vivo kinetic processes, which were not considered in this in vitro-in vivo extrapolation, play an important role; they should be taken into account in future in vitro-in vivo extrapolations. A local sensitivity analysis showed the relatively low impact of Molar Volume and Molecular Diffusion Volume on the final dissolved concentration, supporting the use of approximated values obtained through the QSARs created herein. The logKow and Henry's Law constant showed, as expected, a high impact on partitioning. The killing rate was also shown to have a relatively low impact on the final chemical concentration, indicating that although its optimisation is important, finding the Kr that leads to the absolute best correlation between experimental and predicted concentration-viability curves is not imperative. The VCBA can be applied to virtually any chemical as long as the physicochemical data (for the fate model) and the experimental toxicity data (including cell growth/death) are available. However, being such a generic model, several assumptions had to be made: i) no distinction of chemical classes (inorganic, polar organic chemicals); ii) no consideration of metabolism; iii) no saturation kinetics; and iv) fixed external in vitro conditions. The advantage of having a generic model is that the VCBA can fit several experimental set-ups and can be used in an exploratory manner to help refine experimental conditions. The partitioning results obtained here with the VCBA should be checked experimentally with a set of chemical compounds to better understand to what extent the VCBA represents chemicals of different properties. In future developments, it would be important to reduce the uncertainties of the model, such as binding saturation, and to consider the inclusion of other endpoints such as metabolic activity.
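The logKow-based alert classes described above can be expressed as a simple lookup; in the sketch below the class boundaries are interpreted as non-overlapping bands, which, like the example logKow values, is an assumption made for illustration rather than part of the original report.

```python
# Sketch of the logKow-based bioavailability alert classes for the 10% serum set-up.
# Boundary interpretation and example logKow values are illustrative assumptions.
def logkow_alert_class(logkow: float) -> str:
    """Assign an alert class (A-D) from logKow."""
    if 5.0 <= logkow <= 10.0:
        return "A"   # highly lipophilic: largely bound to medium lipids/proteins/plastic
    if 2.5 <= logkow < 5.0:
        return "B"   # lipophilic: substantial binding expected
    if 2.0 <= logkow < 2.5:
        return "C"
    if logkow < 2.0:
        return "D"   # mostly dissolved: nominal concentration close to bioavailable
    raise ValueError("logKow outside the range considered here")

# Approximate, illustrative logKow values (not authoritative measurements).
for chem, kow in {"benzene": 2.1, "xylene": 3.2, "highly lipophilic surfactant": 8.0}.items():
    print(f"{chem}: class {logkow_alert_class(kow)}")
```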
The approaches for evaluating the carcinogenic potential of substances, including whether carcinogenicity studies should be conducted, differ substantially across sectors. Despite variations in testing schemes, the two-year bioassay study in rodents represents the standard element across all sectors.
However, the validity of the two-year bioassay has been questioned over the last decade. Uncertainty is associated with the extrapolation of data from rodents to humans. Furthermore, these studies are extremely time- and resource-consuming, and the high animal burden has raised ethical concerns. For all these reasons, there is a strong demand for alternative strategies and methods. The development of new in vitro methods for carcinogenicity testing, however, has progressed slowly, and those available are far from being accepted for regulatory decision making, especially when evaluating the carcinogenicity of non-genotoxic chemicals or specific classes of compounds such as biologicals, microorganisms and nanomaterials.
The European Union Reference Laboratory for alternatives to animal testing (EURL ECVAM) has carried out an analysis of carcinogenicity testing requirements and assessment approaches across different sectors. This consisted of: a systematic review of the different testing requirements; a review of the number of animals used per sector; an estimation of the number of carcinogenicity and genotoxicity studies conducted or waived in respect of the number of substances authorized per sector per year; a review of the type of justifications for waiving the two-year bioassay and how opportunities for waiving are being used.
Results from this analysis will provide context for initiatives aimed at: 1) reducing the need for animal use where animal testing is still a requirement; 2) ensuring an adequate carcinogenicity assessment in sectors where animal use is banned or limited; and 3) improving the process of cancer hazard identification where existing methods are not suitable.
The assessment of aquatic toxicity is an important component of the environmental hazard and risk assessment of all types of chemicals, and is therefore included in several pieces of EU chemicals legislation. Aquatic toxicity refers to the effects of chemicals on organisms living in the water and is usually determined by testing on organisms representing three trophic levels, i.e. plants (or algae), invertebrates (crustaceans such as Daphnia spp.) and vertebrates (fish). Whereas acute aquatic toxicity testing is a basic requirement in most pieces of EU chemicals legislation, chronic aquatic toxicity testing may be required when the outcome of the acute testing indicates a risk, or when long-term exposure is expected. EU chemicals legislation encourages the use of all available information for hazard and risk assessment before new tests on vertebrates are proposed or conducted. In this context, scientific options for avoiding chronic fish testing on the basis of existing data and extrapolation approaches have been explored. For the purposes of this work, data on acute and chronic aquatic toxicity (Daphnia and fish) from several databases (the US EPA Ecotox database, Aquatic ECETOC, Aquatic OASIS, Aquatic Japan MoE databases, and the ECHA database as implemented in the OECD QSAR Toolbox Version 2.3) were collated and analysed. Simple linear relationships and interspecies sensitivity ratios were calculated using either acute Daphnia data (48 h LC50) or chronic Daphnia data (14-day NOEC) and chronic fish data (>21-day NOEC). Acute-to-chronic relationships and acute-to-chronic ratios (ACR) were also calculated based on acute fish data (96 h LC50) and chronic fish data. These analyses were carried out on the whole set of chemicals and on subgroups of chemicals classified according to the Verhaar mode of action (MOA) scheme, which attributes a general mode of acute aquatic toxic action based on the chemical structure of the molecule. Outliers were identified by applying the Robust regression and Outlier removal (ROUT) method. Our results show that the best-fitting relationships for the prediction of chronic fish toxicity are obtained from acute fish data (r2=0.87) and acute Daphnia data (r2=0.64) when dealing with the whole set of chemicals regardless of the MOA. The quality of the relationships was increased by using the geometric mean (calculated across all the values extracted for a given chemical and a given endpoint) instead of the lowest value for a given endpoint. When considering the MOA, MOA 3 and MOA 1 chemicals give the strongest acute Daphnia to chronic fish and chronic Daphnia to chronic fish relationships; however, the relationships obtained with acute Daphnia data (r2=0.83 and 0.69 for MOA 3 and MOA 1, respectively) are better than those obtained with chronic Daphnia data (r2=0.66 and 0.65 for MOA 1 and 3, respectively). When considering acute fish data, all the MOA classes give strong relationships (r2=0.88 for MOA 3 and MOA 5 chemicals, 0.85 for MOA 4 chemicals, and 0.83 for MOA 1 and MOA 2 chemicals). Therefore, when acute fish toxicity data are available, they may provide a reliable basis for extrapolating chronic fish toxicity as a first-tier assessment or within a weight-of-evidence approach. There is a correlation between chemicals with high ACR values or interspecies sensitivity ratios and the outliers identified in the above-mentioned relationships.
When considering chemicals with a high interspecies sensitivity ratio, with Daphnia being more sensitive than fish, several aniline derivatives and pesticides acting through cholinesterase inhibition were identified. When considering chemicals with a high interspecies sensitivity ratio for which Daphnia is less sensitive than fish, we found pesticides and known endocrine disruptors such as ethinyl oestradiol and 17β-oestradiol. Extreme (i.e. <1 or >100) interspecies sensitivity ratios were mainly evident for MOA 2, 4 and 5 chemicals. Regarding the ACR for fish, around 50% of the chemicals in each MOA class have an ACR within a factor of 10, whereas 100% of MOA 3, 90.9% of MOA 2, 88.3% of MOA 4 and 85.5% of MOA 1 chemicals have an ACR within a factor of 100. Therefore, the safety factor of 100 commonly applied in environmental risk assessment does not seem to be equally protective for every MOA.
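The ratios referred to above are straightforward to compute; the sketch below derives a fish acute-to-chronic ratio (ACR) and a fish/Daphnia interspecies sensitivity ratio from geometric means of repeated endpoint values, using invented concentrations and an assumed ratio orientation (fish value divided by Daphnia value).

```python
# Numerical sketch of the ratios discussed above, on invented endpoint data (mg/L).
import numpy as np

def geomean(values):
    """Geometric mean across repeated values for the same chemical and endpoint."""
    return float(np.exp(np.mean(np.log(values))))

acute_fish_lc50    = [1.8, 2.4, 2.0]   # 96 h LC50
chronic_fish_noec  = [0.05, 0.08]      # >21-day NOEC
acute_daphnia_lc50 = [1.1, 1.5]        # 48 h LC50

acr_fish = geomean(acute_fish_lc50) / geomean(chronic_fish_noec)
interspecies_ratio = geomean(acute_fish_lc50) / geomean(acute_daphnia_lc50)

print(f"fish ACR: {acr_fish:.1f}")                     # > 100 would flag the default factor
print(f"fish/Daphnia sensitivity ratio: {interspecies_ratio:.2f}")
```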
Humans and wildlife can be exposed to a vast number of different combinations of chemicals in mixtures via food, consumer products and the environment, which might impact health. The number of chemicals and the composition of the chemical mixtures one might be exposed to are often unknown and change over time. To gain further insight into current practices and limitations, the published peer-reviewed literature was searched for case studies showing risk assessments of chemical mixtures. The aim was to find examples of mixture assessments in order to identify chemical mixtures of potential concern, the methodologies used, factors hampering mixture risk assessments, data gaps, and future perspectives. Twenty-one case studies were identified, which included human and environmental risk assessments. Several compound classes and environmental media were covered, i.e. pesticides, phthalates, parabens, PBDEs, pharmaceuticals, food contact materials, dioxin-like compounds, anti-androgenic chemicals, contaminants in breast milk, mixtures of contaminants in surface water, ground water and drinking water, and indoor air. However, the selection of chemical classes is not necessarily representative, as many compound groups have not been covered. The selection of these chemical classes is often based on data availability, recent concerns about certain chemical classes or legislative requirements. Several of the case studies revealed a concern due to combined exposure for certain chemical classes, especially when considering specific vulnerable population groups. This is very relevant information, but it needs to be interpreted with caution, considering the related assumptions, model parameters and associated uncertainties. Several parameters that could lead to an over- or underestimation of risks were identified. However, there is clear evidence that chemicals need to be addressed not only in single-substance risk assessment and that mixtures should also be considered across chemical classes and legislative sectors. Furthermore, several issues hampering mixture risk assessments were identified. In order to perform a mixture risk assessment, the composition of the mixture in terms of chemical components and their concentrations needs to be known, and relevant information on their uptake and toxicity is required. Exposure data are often lacking and need to be estimated based on production and use/consumption information. Relevant toxicity data are also not always available. Toxicity data gaps can be filled, e.g. using the Threshold of Toxicological Concern approach. Reference values used in single-substance risk assessments can be found for several chemical classes; however, they are usually derived based on the lowest endpoint. If a refined toxicity assessment of a mixture for a specific effect/cumulative assessment group is envisaged, this is often hampered by a lack of specific toxicity and mode of action information. In all case studies, concentration addition based assessments were made, mainly applying the Hazard Index. To further characterise the drivers of the mixture risk, the maximum cumulative ratio was calculated in several case studies. This showed that the scientific methodologies to address mixtures are largely agreed and lead to reasonable predictions.
However, especially for some groups of compounds that are designed as active substances, it cannot be excluded that interactions occur, and these should therefore be addressed on a case-by-case basis. Most of the mixtures addressed in the identified case studies examined specific chemical groups. Only a few of them looked at mixtures comprising chemicals regulated under different legislative frameworks. The examples indicated that there is evidence of combined exposure to chemicals regulated under different legislation, as well as evidence that such chemicals can elicit similar effects or have a similar mode of action. A mixture risk assessment across regulatory sectors should therefore be further investigated.
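For readers unfamiliar with the screening metrics mentioned above, the sketch below computes a Hazard Index (HI) as the sum of hazard quotients under concentration addition and the Maximum Cumulative Ratio (MCR) as HI divided by the largest single hazard quotient; the exposure and reference values are invented.

```python
# Minimal sketch of concentration-addition screening metrics: Hazard Index (HI) and
# Maximum Cumulative Ratio (MCR). Exposure and reference values are invented placeholders.
exposures  = {"chem_A": 0.02, "chem_B": 0.10, "chem_C": 0.01}   # e.g. mg/kg bw/day
references = {"chem_A": 0.05, "chem_B": 0.50, "chem_C": 0.10}   # e.g. ADI or DNEL

hq  = {c: exposures[c] / references[c] for c in exposures}      # hazard quotients
hi  = sum(hq.values())                                          # Hazard Index
mcr = hi / max(hq.values())                                     # Maximum Cumulative Ratio

print("hazard quotients:", {c: round(v, 2) for c, v in hq.items()})
print(f"HI  = {hi:.2f}  (HI > 1 indicates potential concern under concentration addition)")
print(f"MCR = {mcr:.2f}  (values close to 1 mean a single component drives the mixture risk)")
```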
This state-of-the-art review is based on the final report of a project carried out by the European Commission's Joint Research Centre (JRC) for the European Chemicals Agency (ECHA). The aim of the project was to review the state of the science of non-standard methods that are available for assessing the toxicological and ecotoxicological properties of chemicals. Non-standard methods refer to alternatives to animal experiments, such as in vitro tests and computational models, as well as animal methods that are not covered by current regulatory guidelines. This report therefore reviews the current scientific status of non-standard methods for a range of human health and ecotoxicological endpoints, and provides a commentary on the mechanistic basis and regulatory applicability of these methods. For completeness, and to provide context, currently accepted (standard) methods are also summarised. In particular, the following human health endpoints are covered: a) skin irritation and corrosion; b) serious eye damage and eye irritation; c) skin sensitisation; d) acute systemic toxicity; e) repeat dose toxicity; f) genotoxicity and mutagenicity; g) carcinogenicity; h) reproductive toxicity (including effects on development and fertility); i) endocrine disruption relevant to human health; and j) toxicokinetics. In relation to ecotoxicological endpoints, the report focuses on non-standard methods for acute and chronic fish toxicity. While specific reference is made to the information needs of REACH, the Biocidal Products Regulation and the Classification, Labelling and Packaging Regulation, this review is also expected to be informative in relation to the possible use of alternative and non-standard methods in other sectors, such as cosmetics and plant protection products.
Significant progress has been made in the development, validation and regulatory acceptance of in chemico and in vitro test methods for skin sensitisation. Although these methods have been shown to perform relatively well (about 80% accuracy in predicting Local Lymph Node Assay (LLNA) classifications), a concern was raised regarding the regulatory acceptability of negative results, since it was questioned whether these methods are able to predict chemicals that need to be activated to act as sensitisers. In order to inform ongoing discussions at the regulatory level in the EU, EURL ECVAM held an expert meeting on 10-11 November 2015 to analyse the extent to which in chemico and in vitro methods are able to correctly identify chemicals that need to be activated either through abiotic activation (pre-haptens) and/or through biotic (enzyme-mediated) mechanisms (pro-haptens) to acquire skin sensitisation potential. The expert group analysed a list of 127 chemicals with available LLNA and in vitro data, 22% of which were considered to be pre- and/or pro-haptens. The pre-haptens, constituting the vast majority of chemicals requiring activation, were mostly correctly identified by both the in chemico and the in vitro assays, whereas the pro-haptens, which represent a small subset of sensitising chemicals, were identified correctly by at least one of the cell-based assays. As a result, the expert group recommended that negative in vitro data should be accepted unless there is a compelling scientific argument that a substance is likely to be an exclusively metabolically activated pro-hapten.
Exposure of humans and wildlife to chemicals via food, consumer products, the environment etc. can imply exposure to a vast number of different combinations of chemicals in mixtures. It is practically impossible to test all these possible mixtures experimentally, and it is therefore necessary to find smart strategies to assess the potential hazards using new tools that rely less on in vivo testing and instead incorporate alternative experimental and computational tools. In this report, the current state of the art for the application of these alternative tools for assessing the hazard of chemical mixtures is briefly reviewed. The focus is on the adverse outcome pathway (AOP) concept, in vitro methods, omics techniques, in silico approaches such as quantitative structure-activity relationships (QSARs) and read-across, toxicokinetic and dynamic energy budget (DEB) modelling, and integrated approaches to testing and assessment (IATA). Furthermore, an expert survey was performed to collect up-to-date information and experience on the current use of different approaches for assessing human and environmental health risks from exposure to chemical mixtures, with a view to informing the development of a consistent assessment approach. An online survey was performed among experts in the field of combined exposure assessment in the period January to March 2014, addressing both human health and environmental risk assessment. Fifty-eight experts from 21 countries, different stakeholder groups and sectors of legislation participated in the survey. The sectors where most experience has already been gained in assessing mixtures are plant protection products and chemicals under REACH. These were also rated highest regarding the priority for performing mixture assessments. Experts have experience with both whole-mixture and component-based approaches, applying them to both intentional and unintentional mixtures. Mostly concentration addition (CA) based methods are used for predicting mixture effects. Regarding the use of novel and alternative tools in the risk assessment of mixtures, expert opinions are split between those applying them (often more in a research context) and those who generally think these tools are valuable but whose use is currently limited because of a lack of guidance, data or expertise. A general need for clear guidance for combined exposure assessments was identified. Overall, a high potential in applying novel tools and scientific methodologies for the assessment of chemical mixtures can be identified. They allow meaningful information to be derived on individual mixture components or whole mixtures, enabling a better understanding of the underlying mechanisms of mixture effects. Their main strengths lie in their integrated use and smart combination to put different aspects of the hazard from combined exposure to multiple chemicals into context. In order to benefit from these tools in the hazard assessment of mixtures, more guidance on their use is needed to facilitate more widespread application.
Information on human toxicokinetics plays an important role in the safety assessment of chemicals, even though there are few data requirements in the EU regulatory framework. While existing EU test methods and OECD test guidelines are mostly based on animal procedures, there are increasing opportunities to achieve a 3Rs impact in this area by exploiting modern developments. For example, whole-body toxicokinetic information can be obtained by using physiologically-based toxicokinetic (PBTK) models that integrate data generated by in vitro methods for absorption, distribution, metabolism and excretion (ADME). The development of an infrastructure providing access to such models and their underlying data needs to be accompanied by the establishment of standards for human in vitro ADME methods, the development of guidance on the development and application of such models and the creation of regulatory incentives. Taking these needs into account, this report describes the EURL ECVAM strategy to achieve a 3Rs impact in the area of toxicokinetics and systemic toxicity. The proposed activities are expected to lay the foundation for a risk assessment approach that is increasingly based on human data. Implementation of the strategy will rely on the coordinated efforts of multiple stakeholders.
Information on acute systemic toxicity represents a standard requirement within several pieces of chemicals legislation in the EU. One of the main drivers of conducting the test is classification and labelling. Currently, only in vivo tests are accepted by regulatory bodies and most of the standard tests use lethality as endpoint. Based on an assessment of the regulatory needs and the scientific state-of-the art in the area, EURL ECVAM considers that efforts should be directed towards a) the reduction and replacement of animal tests for the identification and classification of acute systemic toxicity, and b) the refinement of in vivo studies. Consideration should be given to collecting, organising and applying mechanistic knowledge related to this endpoint, to provide a strong mechanistic basis for the design and validation of integrated prediction models. EURL ECVAM proposes to evaluate promising components of integrated approaches for testing and assessment (IATA), including the better use of existing alternative methods, such as mechanistically relevant in vitro assays. Information on repeated dose toxicity might also be useful in supporting classification and labelling for acute systemic toxicity. One clear target is minimising animal use for satisfying information requirements for acute systemic toxicity in relation to the 2018 REACH registration deadline. The aims and objectives underpinning the EURL ECVAM strategy can only be achieved through the coordinated and concerted efforts of all stakeholders.
The assessment of aquatic toxicity and bioaccumulation is an important component of the environmental hazard and risk assessment of all types of chemicals and is therefore included in several pieces of European Union and international legislation. In this document, the European Union Reference Laboratory for Alternatives to Animal Testing (EURL ECVAM) outlines approaches which will deliver an impact on the replacement, reduction and refinement (3Rs) of fish tests used for aquatic toxicity and bioaccumulation testing. The document is based on an assessment of the regulatory needs for these endpoints, the scientific state of the art and recent activities in these areas. It highlights ongoing efforts at research, validation, guideline development and regulatory level. The proposed strategy is also intended to provide a framework for the prioritisation of alternative test methods submitted to EURL ECVAM for validation. Implementation of the strategy will rely on the coordinated efforts of multiple stakeholders.
The assessment of genotoxicity represents an important component of the safety assessment of all types of substances. Although several in vitro tests are available at different stages of development and acceptance, they cannot at present be considered to fully replace animal tests needed to evaluate the safety of substances. Based on an analysis of regulatory requirements for this endpoint within different pieces of EU legislation, EURL ECVAM proposes a pragmatic approach to improve the traditional genotoxicity testing paradigm that offers solutions in both the short- and medium-term and that draws on the considerable experience of 40 years of regulatory toxicology testing in this area. EURL ECVAM considers that efforts should be directed towards the overall improvement of the current testing strategy for better hazard and risk assessment approaches, which either avoids or minimises the use of animals, whilst satisfying regulatory information requirements, irrespective of regulatory context. Several opportunities for the improvement of the testing strategy have been identified which aim to i) enhance the performance of the in vitro testing battery so that fewer in vivo follow-up tests are necessary and ii) guide more intelligent in vivo follow-up testing to reduce unnecessary use of animals. The implementation of this strategic plan will rely on the cooperation of EURL ECVAM with other existing initiatives and the coordinated contribution from various stakeholders.
Provisions of Regulation No 1223/2009 on cosmetic products require that the European Commission reports on a yearly basis to the European Parliament and Council on the progress made in the development, validation and regulatory acceptance of alternative methods and on the compliance with the deadlines of the animal testing and marketing bans. This EURL ECVAM technical report provides an update since 2010 on the state of play of alternative methods for all the toxicological areas relevant to the Cosmetics Regulation and supplements the 2013 Commission Communication on the animal testing and marketing ban and on the state of play in relation to alternative methods in the field of cosmetics. Overall good progress has been made in the validation and regulatory acceptance in areas such as local toxicity where the underpinning science is more advanced and mature alternative methods are available. For very complex endpoints on the other hand, such as chronic systemic toxicity, carcinogenicity or reproductive toxicity, efforts are predominantly focused on research and development where the emphasis is on the integration of a variety of methods based on mechanistic understanding. The future is bright however, since considerable advances in new in vitro technologies, systems biology, bioinformatics and computational modelling are driving a paradigm shift in toxicological testing and assessment where non-animal methods will ultimately become the tools of choice.
In the absence of validated and regulatory-accepted alternative methods, the assessment of the skin sensitisation potential of chemicals still relies on animal testing. Progress in the development of alternative methods has been prompted by the increasing knowledge on the key mechanisms of the skin sensitisation pathway, as recently documented in the OECD Adverse Outcome Pathway for skin sensitisation. Based on an analysis of the regulatory requirements for this endpoint within relevant pieces of EU chemicals legislation, EURL ECVAM has decided to focus its efforts on the development of non-animal testing strategies for skin sensitisation hazard identification and classification, including the subcategorisation of sensitisers, according to the GHS classification system. This would satisfy the majority of the regulatory requirements within the EU and would have a significant impact in terms of replacing animal experiments. This report describes the EURL ECVAM strategy for achieving this goal.
This report describes the application of chemoinformatic methods to explore the applicability of the Threshold of Toxicological Concern (TTC) approach to cosmetic ingredients. For non-cancer endpoints, the most widely used TTC approach is the Cramer classification scheme, which categorises chemicals into three classes (I, II and III) depending on their expected level of concern for oral systemic toxicity (low, medium, high, respectively). The chemical space of the Munro non-cancer dataset was characterised to assess whether this underlying TTC dataset is representative of the “world” of cosmetic ingredients, as represented by the COSMOS Cosmetics Inventory. In addition, the commonly used Cramer-related Munro threshold values were applied to a toxicological dataset of cosmetic ingredients, the COSMOS TTC dataset, to assess the degree of protectiveness resulting from the application of the Cramer classification scheme. This analysis is considered preliminary, since the COSMOS TTC dataset and Cosmetics Inventory are subject to an ongoing process of extension and quality control within the COSMOS project. The results of this preliminary analysis show that the Munro dataset is broadly representative of the chemical space of cosmetics, although certain structural classes are missing, notably organometallics, silicon-containing compounds, and certain types of surfactants (non-ionic and cationic classes). Furthermore, compared with the Cosmetics Inventory, the Munro dataset has a higher prevalence of reactive chemicals and a lower prevalence of larger, long linear chain structures. The COSMOS TTC dataset, comprising repeat dose toxicity data for cosmetics ingredients, shows a good representation of the Cosmetics Inventory, both in terms of physicochemical property ranges, structural features and chemical use categories. Thus, this dataset is considered to be suitable for investigating the applicability of the TTC approach to cosmetics. The results of the toxicity data analysis revealed a number of cosmetic ingredients in Cramer Class I with No Observed Effect Level (NOEL) values lower than the Munro threshold of 3000 µg/kg bw/day. The prevalence of these “false negatives” was less than 5%, which is the percentage expected by chance resulting from the use of the 5th percentile of cumulative probability distribution of NOELs in the derivation of TTC values. Furthermore, the majority of these false negatives do not arise when structural alerts for DNA-binding are used to identify potential genotoxicants, to which a lower TTC value of 0.0025 µg/kg bw/day is typically applied. Based on these preliminary results, it is concluded that the current TTC approach is broadly applicable to cosmetics, although a number of improvements can be made, through the quality control of the underlying TTC datasets, modest revisions/extensions of the Cramer classification scheme, and the development of explicit guidance on how to apply the TTC approach.
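A minimal sketch of the kind of threshold comparison described above, using the Munro Class I NOEL threshold and the genotoxicity TTC quoted in the abstract; the helper function, its decision logic and the example value are illustrative and are not part of the report:

```python
# Cramer Class I NOEL threshold quoted in the abstract (5th percentile of the
# Munro Class I NOEL distribution); the derived TTC is this value divided by 100.
MUNRO_CLASS_I_NOEL_THRESHOLD = 3000.0   # µg/kg bw/day
GENOTOX_TTC = 0.0025                    # µg/kg bw/day, applied when a DNA-binding alert fires

def is_false_negative(noel, cramer_class, has_dna_binding_alert=False):
    """Flag a Cramer Class I chemical whose measured NOEL falls below the class
    threshold, unless it is already caught by a DNA-binding structural alert
    (and hence handled with the much lower genotoxicity TTC)."""
    if cramer_class != "I" or has_dna_binding_alert:
        return False
    return noel < MUNRO_CLASS_I_NOEL_THRESHOLD

# Hypothetical ingredient in Cramer Class I with a NOEL of 1500 µg/kg bw/day
print(is_false_negative(1500.0, "I"))  # True -> potential false negative
```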
We have assessed the abilities of five alternative (non-animal) approaches to predict acute oral toxicity, a toxicological endpoint relevant to multiple pieces of legislation on chemicals and consumer products. In particular, we have investigated four QSAR models (ToxSuite, TOPKAT, TEST and ADMET Predictor) and one in vitro method (3T3 NRU). Based on a test set of in vitro and in vivo data for 180 compounds, we have characterized the predictive performance of each method when used alone (both for LD50 prediction and acute toxicity classification into three categories), as well as multiple test combinations (batteries) and stepwise testing strategies (for acute toxicity classification into three categories). When used individually, the alternative methods showed an ability to predict LD50 with correlation coefficients in the range from 49% to 84%, and to classify into three toxicity groups with accuracies in the range from 41% to 72%. When the alternative methods were combined into batteries or testing strategies, the overall accuracy of prediction could reach 76%. We also illustrate how different combinations of methods can be used to optimize sensitivity or specificity.
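A schematic sketch of one way calls from several methods could be combined into a battery by majority vote; the batteries and stepwise strategies evaluated in the paper use their own decision rules, so this is only an illustration and all names and values are ours:

```python
from collections import Counter

def battery_classification(predictions):
    """Combine three-category acute oral toxicity calls (1 = most toxic,
    3 = least toxic) from several methods by simple majority vote;
    ties fall back to the most conservative (most toxic) category."""
    counts = Counter(predictions)
    highest = max(counts.values())
    tied = [category for category, n in counts.items() if n == highest]
    return min(tied)  # lower category number = more toxic = more conservative

# Hypothetical calls from four QSAR tools and one in vitro assay
print(battery_classification([2, 3, 2, 1, 2]))  # -> 2
```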
This special issue entitled “The Virtual Cell Based Assay” is dedicated to our former colleague and friend Dr J.M. Zaldívar Comenges, “Josema”, who passed away on 14 September 2012, after a period of prolonged illness. At the time, he was a senior scientist at the European Commission's Joint Research Centre (JRC), where he had been working for over 25 years.
There is a need to interpret in vitro concentration-viability data in terms of the actual concentration that the cells are exposed to, rather than the nominal concentration applied to the test system. We have developed a process-based model to simulate the kinetics and dynamics of a chemical compound in cell-based in vitro assays. In the present paper we describe the mathematical equations governing this model as well as the parameters that are needed to run the model. The Virtual Cell Based Assay (VCBA) is an integrated model composed of: (1) a fate and transport model; (2) a cell partitioning model; (3) a cell growth and division model; (4) a toxicity and effects model; and (5) the experimental set-up. The purpose of the VCBA is to simulate the medium and intracellular concentrations, which can be used on their own to design and interpret in vitro experiments, or in combination with physiologically based kinetic (PBK) models to perform in vitro to in vivo extrapolation. The results can be used in chemical risk assessment to link an external dose to an internal effect or vice versa, using solely in vitro and in silico tools and thereby avoiding animal testing.
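A deliberately simplified two-compartment (medium/cell) kinetic sketch in the spirit of the fate-and-transport and cell-partitioning sub-models; this is not the published VCBA equation set, it omits the headspace/plastic partitioning, cell growth and effects components, and all parameter values are invented for illustration:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters only (not VCBA defaults)
V_MEDIUM, V_CELLS = 2.0e-4, 1.0e-9   # compartment volumes, litres
K_UPTAKE, K_EFFLUX = 1.0e-7, 5.0e-8  # first-order exchange constants, L/h
K_DEGRADATION = 0.01                 # abiotic loss in the medium, 1/h

def rhs(t, y):
    c_med, c_cell = y                                   # µM in medium and cells
    flux = K_UPTAKE * c_med - K_EFFLUX * c_cell         # net uptake, µmol/h
    dc_med = -flux / V_MEDIUM - K_DEGRADATION * c_med
    dc_cell = flux / V_CELLS
    return [dc_med, dc_cell]

# Nominal concentration of 10 µM applied to the medium, simulated over 48 h
sol = solve_ivp(rhs, (0.0, 48.0), y0=[10.0, 0.0])
print(sol.y[:, -1])   # medium and intracellular concentrations at 48 h
```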
In order to replace the use of animals in toxicity testing, there is a need to predict human in vivo toxic doses from concentrations that cause adverse effects in in vitro test systems. The virtual cell based assay (VCBA) has been developed to simulate intracellular concentrations as a function of time, and can be used to interpret in vitro concentration-response curves. In this study we refine and extend the VCBA model by including additional target-organ cell models and by simulating the fate and effects of chemicals at the organelle level. In particular, we describe the extension of the original VCBA to simulate chemical fate in liver (HepaRG) cells and cardiomyocytes (iCell cardiomyocytes), and we explore the effects of chemicals at the mitochondrial level. This includes a comparison of: a) in vitro results on cell viability and mitochondrial membrane potential (mmp) from two cell models (HepaRG cells and iCell cardiomyocytes); and b) VCBA simulations, including the cell and mitochondrial compartments, simulating the mmp for both cell types. This proof-of-concept study illustrates how the relationship between intracellular concentration, intra-mitochondrial concentration, mmp and cell toxicity can be obtained by using the VCBA.
The Virtual Cell Based Assay (VCBA) was applied to simulate the long-term (repeat dose) toxic effects of chemicals, including substances in cosmetics and personal care products. The presented model is an extension of the original VCBA for simulation of single exposure and is implemented in a KNIME workflow. This work illustrates the steps taken to simulate the repeated dose effects of two reference compounds, caffeine and amiodarone. Using caffeine, in vitro experimental viability data in single exposure from two human liver cell lines, HepG2 and HepaRG, were measured and used to optimize the VCBA; subsequently, repeated exposure simulations were run. Amiodarone was then tested and simulations were performed under repeated exposure conditions in HepaRG. The results show that the VCBA can adequately predict repeated exposure experiments in liver cell lines. The refined VCBA model can be used not only to support the design of long-term in vitro experiments but also for practical applications in risk assessment. Our model is a step towards the development of in silico predictive approaches to replace, refine, and reduce the in vivo repeated dose systemic toxicity studies in the assessment of human safety.
Physiologically based kinetic (PBK) models and the virtual cell based assay can be linked to form so-called physiologically based dynamic (PBD) models. This study illustrates the development and application of a PBK model for prediction of estragole-induced DNA adduct formation and hepatotoxicity in humans. To address the hepatotoxicity, HepaRG cells were used as a surrogate for liver cells, with cell viability being used as the in vitro toxicological endpoint. Information on DNA adduct formation was taken from the literature. Since estragole-induced cell damage is not directly caused by the parent compound, but by a reactive metabolite, information on the metabolic pathway was incorporated into the model. In addition, a user-friendly tool was developed by implementing the PBK/D model into a KNIME workflow. This workflow can be used to perform in vitro to in vivo extrapolation and forward as well as backward dosimetry in support of chemical risk assessment.
Automation is universal in today's society, from the operation of machinery in factory processes to self-parking automobile systems. While these examples show the efficiency and effectiveness of automated mechanical processes, automated procedures that support the chemical risk assessment process are still in their infancy. Future human safety assessments will rely increasingly on the use of automated models, such as physiologically based kinetic (PBK) and dynamic models and the virtual cell based assay (VCBA). These biologically-based models will be coupled with chemistry-based prediction models that also automate the generation of key input parameters such as physicochemical properties. The development of automated software tools is an important step in harmonising and expediting the chemical safety assessment process. In this study, we illustrate how the KNIME Analytics Platform can be used to provide a user-friendly graphical interface for these biokinetic models, such as PBK models and the VCBA, which simulate the fate of chemicals in vivo within the body and in in vitro test systems, respectively.
In order to replace the use of animals in toxicity testing, there is a need to predict in vivo toxic doses from concentrations that cause toxicological effects in relevant in vitro systems. The Virtual Cell Based Assay (VCBA) estimates the time-dependent concentration of a test chemical in the cell and cell culture for a given in vitro system. The concentrations in the different compartments of the cell and test system are derived from ordinary differential equations, physicochemical parameters of the test chemical and properties of the cell line. The VCBA has been developed for a range of cell lines including BALB/c 3T3 cells, HepG2, HepaRG, lung A549 cells, and cardiomyocytes. The model can be used to design and refine in vitro experiments and to extrapolate in vitro effective concentrations to in vivo doses that can be applied in risk assessment. In this paper, we first discuss potential applications of the VCBA: i) design of in vitro High Throughput Screening (HTS) experiments; ii) hazard identification (based on acute systemic toxicity); and iii) risk assessment. Further extension of the VCBA is discussed in the second part, exploring potential application to i) manufactured nanomaterials and ii) additional cell lines and endpoints, and considering iii) other opportunities.
Journal of Chemical Information and Modeling, 2015
Chemotypes are a new approach for representing molecules, chemical substructures and patterns, reaction rules, and reactions. Chemotypes are capable of integrating types of information beyond what is possible using current representation methods (e.g., SMARTS patterns) or reaction transformations (e.g., SMIRKS, reaction SMILES). Chemotypes are expressed in the XML-based Chemical Subgraphs and Reactions Markup Language (CSRML), and can be encoded not only with connectivity and topology but also with properties of atoms, bonds, electronic systems, or molecules. CSRML has been developed in parallel with a public set of chemotypes, i.e., the ToxPrint chemotypes, which are designed to provide excellent coverage of environmental, regulatory, and commercial-use chemical space, as well as to represent chemical patterns and properties especially relevant to various toxicity concerns. A software application, ChemoTyper, has also been developed and made publicly available in order to enable chemotype searching and fingerprinting against a target structure set. The public ChemoTyper houses the ToxPrint chemotype CSRML dictionary, as well as a reference implementation so that the query specifications may be adopted by other chemical structure knowledge systems. The full specifications of the XML-based CSRML standard used to express chemotypes are publicly available to facilitate and encourage the exchange of structural knowledge.
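For readers unfamiliar with the baseline representation that chemotypes extend, the sketch below shows plain SMARTS substructure matching with RDKit; the pattern and molecule are illustrative examples and are not ToxPrint chemotypes or CSRML queries:

```python
from rdkit import Chem

# Plain SMARTS matching: the baseline that CSRML chemotypes extend with
# atom/bond/electronic-system properties. Pattern and molecule are illustrative.
aromatic_nitro = Chem.MolFromSmarts("[c][N+](=O)[O-]")   # nitro group on an aromatic carbon
mol = Chem.MolFromSmiles("O=[N+]([O-])c1ccc(Cl)cc1")      # 1-chloro-4-nitrobenzene

print(mol.HasSubstructMatch(aromatic_nitro))    # True
print(mol.GetSubstructMatches(aromatic_nitro))  # matched atom indices
```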
Background: Safety assessment for repeated dose toxicity is one of the largest challenges in the process to replace animal testing. This is also one of the proof-of-concept ambitions of SEURAT-1, the largest ever European Union research initiative on alternative testing, co-funded by the European Commission and Cosmetics Europe. This review is based on the discussion and outcome of a workshop organized at the initiative of the SEURAT-1 consortium and joined by a group of international experts with complementary knowledge to further develop traditional read-across and include new approach data.
Objectives: The aim of the suggested strategy for chemical read-across is to show how a traditional read-across based on structural similarities between source and target substance can be strengthened with additional evidence from new approach data (for example, information from in vitro molecular screening, "-omics" assays and computational models) to reach regulatory acceptance.
Methods: We identified four read-across scenarios that cover typical human health assessment situations. For each such decision context, we suggested several chemical groups as examples to prove when read-across between group members is possible, considering both chemical and biological similarities.
Conclusions: We agreed to carry out the complete read-across exercise for at least one chemical category per read-across scenario in the context of SEURAT-1, and the results of this exercise will be completed and presented by the end of the research initiative in December 2015.
Critical Reviews in Food Science and Nutrition, 2015
Herbs, herbal extracts, or phytochemicals are broadly used as foods, drugs, and as traditional medicines. These are well regulated in Europe, with thorough controls on both safety and efficacy or validity of health claims. However, the distinction between medicines and foods with health claims is not always clear. In addition, there are several cases of herbal products that claim benefits that are not scientifically demonstrated. This review details the European Union (EU) legislative framework that regulates the approval and marketing of herbal products bearing health claims as well as the scientific evidence that is needed to support such claims. To illustrate the latter, we focus on phytoecdysteroid (PE)-containing preparations, generally sold to sportsmen and bodybuilders. We review the limited published scientific evidence that supports claims for these products in humans. In addition, we model the in silico binding between different PEs and human nuclear receptors and discuss the implications of these putative bindings in terms of the mechanism of action of this family of compounds. We call for additional research to validate the safety and health-promoting properties of PEs and other herbal compounds, for the benefit of all consumers.
In the two years since the last workshop report, the environment surrounding the prediction of skin sensitisation hazards has experienced major change. Validated non-animal tests are now OECD Test Guidelines. Accordingly, the recent cross-sector workshop focused on how to use in vitro data for regulatory decision-making. After a review of general approaches and six case studies, there was broad consensus that a simple, transparent stepwise process involving non-animal methods was an opportunity waiting to be seized. There was also a strong feeling that the approach should not be so rigidly defined that assay variations/additional tests are locked out. Neither should it preclude more complex integrated approaches being used for other purposes, e.g. potency estimation. All agreed the ultimate goal is a high level of protection of human health. Thus, experience in the population will be the final arbiter of whether toxicological predictions are fit for purpose. Central to this is the reflection that none of the existing animal assays is perfect; the non-animal methods should not be expected to be so either, but by integrated use of methods and all other relevant information, including clinical feedback, we have the opportunity to continue to improve toxicology whilst avoiding animal use.
In spite of recent advances in describing the health outcomes of exposure to nanoparticles (NPs), it still remains unclear how exactly NPs interact with their cellular targets. Size, surface, mass, geometry, and composition may all play a beneficial role as well as causing toxicity. Concerns of scientists, politicians and the public about potential health hazards associated with NPs need to be answered. With the variety of exposure routes available, there is potential for NPs to reach every organ in the body, but we know little about the impact this might have. The main objective of the FP7 NanoTEST project (www.nanotest-fp7.eu) was to gain a better understanding of the mechanisms of interaction of NPs employed in nanomedicine with cells, tissues and organs, and to address critical issues relating to toxicity testing, especially with respect to alternatives to tests on animals. Here we describe an approach towards alternative testing strategies for hazard and risk assessment of nanomaterials, highlighting the adaptation of standard methods demanded by the special physicochemical features of nanomaterials and bioavailability studies. The work has assessed a broad range of toxicity tests, cell models and NP types and concentrations, taking into account the inherent impact of NP properties and the effects of changes in experimental conditions using well-characterized NPs. The results of the studies have been used to generate recommendations for a suitable and robust testing strategy which can be applied to new medical NPs as they are developed.
This work illustrates the use of Physiologically-Based Toxicokinetic (PBTK) modelling for the healthy Caucasian population in in vitro-to-in vivo correlation of kinetic measures of caffeine skin penetration and liver clearance (based on literature experiments), as well as dose metrics of measured caffeine-induced HepaRG toxicity. We applied a simple correlation factor to quantify the in vitro and in vivo differences in the amount of caffeine permeated through the skin and concentration-time profiles of caffeine in the liver. We developed a multi-scale computational approach by linking the PBTK model with a Virtual Cell-Based Assay to relate an external oral and dermal dose with the measured in vitro HepaRG cell viability. The results revealed higher in vivo skin permeation profiles than those determined in vitro using identical exposure conditions. Liver clearance of caffeine derived from in vitro metabolism rates was found to be much slower than the optimised in vivo clearance with respect to caffeine plasma concentrations. Finally, HepaRG cell viability was shown to remain almost unchanged for external caffeine doses of 5-400 mg for both oral and dermal absorption routes. We modelled single exposure to caffeine only.
Joint physiologically-based toxicokinetic and toxicodynamic (PBTK/TD) modelling was applied to simulate concentration-time profiles of nicotine, a well-known stimulant, in the human body following single and repeated dosing. Both kinetic and dynamic models were first calibrated by using in vivo literature data for the Caucasian population. The models were then used to estimate the blood and liver concentrations of nicotine in terms of the Area Under Curve (AUC) and the peak concentration (Cmax) for selected exposure scenarios based on inhalation (cigarette smoking), oral intake (nicotine lozenges) and dermal absorption (nicotine patches). The model simulations indicated that whereas frequent cigarette smoking gives rise to high AUC and Cmax in blood, the use of nicotine-rich dermal patches leads to high AUC and Cmax in the liver. Venous blood concentrations were used to estimate one of the most common acute effects, mean heart rate, both at rest and during exercise. These estimations showed that cigarette smoking causes a high peak heart rate, whereas dermal absorption causes a high mean heart rate over 48 h. This study illustrates the potential of using PBTK/TD modelling in the safety assessment of nicotine-containing products.
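The dose metrics named above (AUC and Cmax) can be computed directly from any simulated concentration-time profile; the sketch below applies the trapezoidal rule to made-up values, not to the paper's nicotine simulations:

```python
import numpy as np

# Illustrative concentration-time profile (invented numbers, not model output)
t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 12.0, 24.0, 48.0])   # hours
c = np.array([0.0, 12.0, 18.0, 15.0, 9.0, 4.0, 2.0, 0.5, 0.05])  # ng/mL

auc = float(np.sum(0.5 * (c[1:] + c[:-1]) * np.diff(t)))  # trapezoidal AUC, ng*h/mL
cmax = float(c.max())                                     # peak concentration, ng/mL
tmax = float(t[c.argmax()])                               # time of the peak, h
print(auc, cmax, tmax)
```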
The application of physiologically based toxicokinetic (PBTK) modelling in route-to-route (RtR) extrapolation of three cosmetic ingredients: coumarin, hydroquinone and caffeine is shown in this study. In particular, the oral no-observed-adverse-effect-level (NOAEL) doses of these chemicals are extrapolated to their corresponding dermal values by comparing the internal concentrations resulting from oral and dermal exposure scenarios. The PBTK model structure has been constructed to give a good simulation performance of biochemical processes within the human body. The model parameters are calibrated based on oral and dermal experimental data for the Caucasian population available in the literature. Particular attention is given to modelling the absorption stage (skin and gastrointestinal tract) in the form of several sub-compartments. This gives better model prediction results when compared to those of a PBTK model with a simpler structure of the absorption barrier. In addition, the role of quantitative structure-property relationships (QSPRs) in predicting skin penetration is evaluated for the three substances with a view to incorporating QSPR-predicted penetration parameters in the PBTK model when experimental values are lacking. Finally, PBTK modelling is used, first to extrapolate oral NOAEL doses derived from rat studies to humans, and then to simulate internal systemic/liver concentrations (Area Under Curve (AUC) and peak concentration) resulting from specified dermal and oral exposure conditions. Based on these simulations, AUC-based dermal thresholds for the three case study compounds are derived and compared with the experimentally obtained oral threshold (NOAEL) values.
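A highly simplified sketch of the route-to-route idea of matching an internal dose metric across routes, assuming the AUC scales linearly with dose over the range of interest; the paper itself simulates both routes explicitly with the full PBTK model rather than using such a shortcut, and all numbers below are invented:

```python
def dermal_equivalent_dose(oral_noael, auc_per_unit_oral_dose, auc_per_unit_dermal_dose):
    """Route-to-route sketch: find the dermal dose giving the same internal AUC
    as the oral NOAEL, assuming AUC scales linearly with dose in this range."""
    target_auc = oral_noael * auc_per_unit_oral_dose
    return target_auc / auc_per_unit_dermal_dose

# Illustrative numbers only (mg/kg bw/day and AUC per unit dose)
print(dermal_equivalent_dose(10.0, 0.8, 0.2))  # -> 40.0 mg/kg bw/day dermal
```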
Compared with traditional animal methods for toxicity testing, in vitro and in silico methods are widely considered to permit a more cost-effective assessment of chemicals. However, how to assess the cost-effectiveness of alternative methods has remained unclear. This paper offers a user-oriented tutorial for applying cost-effectiveness analysis (CEA) to alternative (non-animal) methods. The purpose is to illustrate how CEA facilitates the identification of the alternative method, or the combination of methods, that offers the highest information gain per unit of cost. We illustrate how information gains and costs of single methods and method combinations can be assessed. By using acute oral toxicity as an example, we apply CEA to a set of four in silico methods (ToxSuite, TOPKAT, TEST, ADMET Predictor), one in vitro method (the 3T3 Neutral Red Uptake cytotoxicity assay), and various combinations of these methods. Our results underline that in silico tools are more cost-effective than the in vitro test. Battery combinations of alternative methods, however, do not necessarily outperform single methods, because additional information gains from the battery are easily outweighed by additional costs.
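The core CEA quantity is information gain per unit of cost; a minimal ranking sketch with invented gains and costs (not the values estimated in the paper) is:

```python
def rank_by_cost_effectiveness(methods):
    """methods: dict mapping method name -> (information_gain, cost).
    Returns methods sorted by information gain per unit of cost, descending."""
    return sorted(methods.items(),
                  key=lambda kv: kv[1][0] / kv[1][1],
                  reverse=True)

# Hypothetical gains (arbitrary units) and costs (EUR): illustrative only
methods = {"QSAR tool A": (0.60, 50.0),
           "QSAR tool B": (0.55, 20.0),
           "3T3 NRU assay": (0.75, 1500.0)}
for name, (gain, cost) in rank_by_cost_effectiveness(methods):
    print(f"{name}: {gain / cost:.4f} gain per EUR")
```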
There is demand for methodologies to establish levels of safety concern associated with dietary exposures to chemicals for which no toxicological data are available. In such situations, the application of in silico methods appears promising. Making a safety statement requires quantitative predictions of toxicological reference points such as the no observed adverse effect level and the carcinogenic potency for DNA-reactive chemicals. A decision tree (DT) has been developed to aid the integration of exposure information and predicted toxicological reference points obtained with quantitative structure-activity relationship ((Q)SAR) software and read-across techniques. The predicted toxicological values are compared with exposure to obtain margins of exposure (MoE). The size of the MoE defines the level of safety concern and should account for a number of uncertainties, such as the classical interspecies and inter-individual variability as well as others determined on a case-by-case basis. An analysis of the uncertainties of in silico approaches, together with results from case studies, suggests that establishing safety concern based on application of the DT is unlikely to be significantly more uncertain than doing so based on experimental data. The DT makes full use of all available data, ensuring an adequate degree of conservatism. It can be used when fast decision making is required.
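The margin-of-exposure comparison at the heart of the decision tree is a simple ratio; the sketch below shows only that comparison step, with invented numbers and a single illustrative required margin standing in for the case-by-case uncertainty factors described above:

```python
def margin_of_exposure(reference_point, exposure):
    """MoE = predicted toxicological reference point / estimated exposure
    (both in the same units, e.g. mg/kg bw/day)."""
    return reference_point / exposure

def level_of_concern(moe, required_margin):
    """The required margin is set case by case (interspecies and inter-individual
    factors plus extra allowance for the in silico prediction); here it is just
    one illustrative number."""
    return "low concern" if moe >= required_margin else "potential concern"

moe = margin_of_exposure(reference_point=5.0, exposure=0.001)  # -> 5000
print(moe, level_of_concern(moe, required_margin=1000))
```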
Chemical regulation is challenged by the large number of chemicals requiring assessment for potential human health and environmental impacts. Current approaches are too resource intensive in terms of time, money and animal use to evaluate all chemicals under development or already on the market. The need for timely and robust decision making demands that regulatory toxicity testing becomes more cost-effective and efficient. One way to realize this goal is by being more strategic in directing testing resources; focusing on chemicals of highest concern, limiting testing to the most probable hazards, or targeting the most vulnerable species. Hypothesis driven Integrated Approaches to Testing and Assessment (IATA) have been proposed as practical solutions to such strategic testing. In parallel, the development of the Adverse Outcome Pathway (AOP) framework, which provides information on the causal links between a molecular initiating event (MIE), intermediate key events (KEs) and an adverse outcome (AO) of regulatory concern, offers the biological context to facilitate development of IATA for regulatory decision making. This manuscript summarizes discussions at the Workshop entitled "Advancing AOPs for Integrated Toxicology and Regulatory Applications" with particular focus on the role AOPs play in informing the development of IATA for different regulatory purposes.
The potential (eco)toxicological hazard posed by engineered nanoparticles is a major scientific and societal concern since several industrial sectors (e.g. electronics, biomedicine, and cosmetics) are exploiting the innovative properties of nanostructures resulting in their large-scale production. Many consumer products contain nanomaterials and, given their complex life-cycle, it is essential to anticipate their (eco)toxicological properties in a fast and inexpensive way in order to mitigate adverse effects on human health and the environment. In this context, the application of the structure-toxicity paradigm to nanomaterials represents a promising approach. Indeed, according to this paradigm, it is possible to predict toxicological effects induced by chemicals on the basis of their structural similarity with chemicals for which toxicological endpoints have been previously measured. These structure-toxicity relationships can be quantitative or qualitative in nature and they can predict toxicological effects directly from the physicochemical properties of the entities (e.g. nanoparticles) of interest. Therefore, this approach can aid in prioritizing resources in toxicological investigations while reducing the ethical and monetary costs that are related to animal testing. The purpose of this review is to provide a summary of recent key advances in the field of QSAR modelling of nanomaterial toxicity, to identify the major gaps in research required to accelerate the use of quantitative structure-activity relationship (QSAR) methods, and to provide a roadmap for future research needed to achieve QSAR models useful for regulatory purposes.
National legislations for the assessment of the skin sensitization potential of chemicals are increasingly based on the globally harmonized system (GHS). In this study, experimental data on 55 non-sensitizing and 45 sensitizing chemicals were evaluated according to GHS criteria and used to test the performance of computer (in silico) models for the prediction of skin sensitization. Statistical models (Vega, Case Ultra, TOPKAT), mechanistic models (Toxtree, OECD (Q)SAR toolbox, DEREK) or a hybrid model (TIMES-SS) were evaluated. Between three and nine of the substances evaluated were found in the individual training sets of various models. Mechanism-based models performed better than statistical models and gave better predictivities depending on the stringency of the domain definition. The best performance was achieved by TIMES-SS, with a perfect prediction, although only 16% of the substances were within its reliability domain. Some models offer modules for potency; however, predictions did not correlate well with the GHS sensitization subcategory derived from the experimental data. In conclusion, although mechanistic models can be used to a certain degree under well-defined conditions, at present the in silico models are not sufficiently accurate for broad application to predict skin sensitization potentials.
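Performance figures of the kind reported for such models come from standard confusion-matrix statistics; a short sketch (with made-up counts, not the study's results) is:

```python
def binary_performance(tp, fn, tn, fp):
    """Sensitivity, specificity and accuracy from a 2x2 confusion matrix
    (sensitiser = positive class)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Made-up counts for a hypothetical model on 45 sensitisers / 55 non-sensitisers
print(binary_performance(tp=36, fn=9, tn=44, fp=11))  # -> (0.8, 0.8, 0.8)
```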
The use of Integrated Testing Strategies (ITS) permits the combination of diverse types of chemical and toxicological data for the purposes of hazard identification and characterisation. In November 2008, the European Partnership for Alternative Approaches to Animal Testing (EPAA), together with the European Centre for the Validation of Alternative Methods (ECVAM), held a workshop on Overcoming Barriers to Validation of Non-animal Partial Replacement Methods/Integrated Testing Strategies, in Ispra, Italy, to discuss the extent to which current ECVAM approaches to validation can be used to evaluate partial replacement in vitro test methods (i.e. as potential ITS components) and ITS themselves. The main conclusions of these discussions were that formal validation was only considered necessary for regulatory purposes (e.g. the replacement of a test guideline), and that current ECVAM approaches to validation should be adapted to accommodate such test methods (1). With these conclusions in mind, a follow-up EPAA–ECVAM workshop was held in October 2009, to discuss the extent to which existing validation principles are applicable to the validation of ITS test methods, and to develop a draft approach for the validation of such test methods and/or overall ITS for regulatory purposes. This report summarises the workshop discussions that started with a review of the current validation methodologies and the presentation of two case studies (skin sensitisation and acute toxicity), before covering the definition of ITS and their components, including their validation and regulatory acceptance. The following main conclusions/recommendations were made: that the validation of a partial replacement test method (for application as part of a testing strategy) should be differentiated from the validation of an in vitro test method for application as a stand-alone replacement, especially with regard to its predictive capacity; that, in the former case, the predictive capacity of the whole testing strategy (rather than of the individual test methods) would be more important, especially if the individual test methods had a high biological relevance; that ITS allowing for flexible and ad hoc approaches cannot be validated, whereas the validation of clearly defined ITS would be feasible, although practically quite difficult; and that test method developers should be encouraged to develop and submit to ECVAM not only full replacement test methods, but also partial replacement methods to be placed as parts of testing strategies. The added value from the formal validation of testing strategies, and the requirements needed in view of regulatory acceptance of the data, require further informed discussion within the EPAA forum on the basis of case studies provided by industry.
The distinctive characteristics of nanoparticles, resulting from properties that arise at the nano-scale, underlie their potential applications in the biomedical sector. However, the very same characteristics also result in widespread concerns about the potentially toxic effects of nanoparticles. Given the large number of nanoparticles that are being developed for possible biomedical use, there is a need to develop rapid screening methods based on in silico methods. This study illustrates the application of conceptual Density Functional Theory (DFT) to some carbon nanotubes (CNTs) optimized by means of static DFT calculations. The computational efforts are focused on the geometry of a family of packed narrow-diameter CNTs formed by units of four to twelve carbons, evaluating the strength of the C-C bonds by means of Mayer Bond Orders (MBO). Thus, width and length are geometrical features that might be used to tune the electronic properties of the CNTs. At infinite length, partial semi-conductor characteristics are expected.
The COSMOS Project (Integrated In Silico Models for the Prediction of Human Repeated Dose Toxicity of COSMetics to Optimise Safety, www.cosmostox.eu) is a unique international collaboration addressing the safety assessment needs of the cosmetics industry without the use of animals. COSMOS is developing an integrated suite of computational workflows including models based on the threshold of toxicological concern (TTC) approach, innovative toxicity prediction strategies based on chemical categories, read-across and quantitative structure activity relationships (QSARs) related to key events in adverse outcome pathways and multi-scale modeling based on physiologically-based pharmacokinetics to predict target organ concentrations and extrapolate from in vitro to in vivo exposure scenarios. The KNIME technology is being used to integrate access to databases and modeling approaches into flexible computational workflows that will be made publicly accessible and provide a transparent method for use in the safety assessment of cosmetics. First results include establishing the COSMOS chemical inventory of cosmetic ingredients and their associated chemical structures; a new dataset for TTC analysis for assessment of the applicability of the TTC approach to cosmetics; developing the COSMOS database for repeated dose toxicity data; as well as KNIME workflows to identify structural rules, fragments and properties associated with particular mechanisms of toxicity.
The toxicological assessment of genotoxic impurities is important in the regulatory framework for pharmaceuticals. In this context, the application of promising computational methods (e.g. Quantitative Structure-Activity Relationships (QSARs), Structure-Activity Relationships (SARs) and/or expert systems) for the evaluation of genotoxicity is needed, especially when very limited information on impurities is available. To gain an overview of how computational methods are used internationally in the regulatory assessment of pharmaceutical impurities, the current regulatory documents were reviewed. The software recommended in the guidelines (e.g. MCASE, MC4PC, Derek for Windows) or used practically by various regulatory agencies (e.g. US Food and Drug Administration, US and Danish Environmental Protection Agencies), as well as other existing programs were analysed. Both statistically based and knowledge-based (expert system) tools were analysed. The overall conclusions on the available in silico tools for genotoxicity and carcinogenicity prediction are quite optimistic, and the regulatory application of QSAR methods is constantly growing. For regulatory purposes, it is recommended that predictions of genotoxicity/carcinogenicity should be based on a battery of models, combining high-sensitivity models (low rate of false negatives) with high-specificity ones (low rate of false positives) and in vitro assays in an integrated manner.
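One possible way to combine a high-sensitivity and a high-specificity model, in the spirit of the battery recommendation above, is sketched below; the decision rule is illustrative only and is not taken from any of the cited guidelines or tools:

```python
def battery_call(high_sensitivity_positive, high_specificity_positive):
    """Illustrative combination of a high-sensitivity model (few false negatives)
    with a high-specificity model (few false positives):
    - both negative -> 'negative' (the sensitive model rarely misses positives)
    - both positive -> 'positive' (the specific model rarely over-calls)
    - discordant    -> 'equivocal, follow up in vitro'
    """
    if high_sensitivity_positive and high_specificity_positive:
        return "positive"
    if not high_sensitivity_positive and not high_specificity_positive:
        return "negative"
    return "equivocal, follow up in vitro"

print(battery_call(True, False))  # -> 'equivocal, follow up in vitro'
```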
A thorough understanding of the relationships between the physicochemical properties and the behavior of nanomaterials in biological systems is mandatory for designing safe and efficacious nanomedicines. Quantitative structure-activity relationship (QSAR) methods help to establish such relationships, although their application to model the behavior of nanomaterials requires new ideas and applications to account for the novel properties of this class of compounds. This review presents and discusses a number of recent inspiring applications of QSAR modeling and descriptors for nanomaterials with a focus on approaches that attempt to describe the interactions that take place at the nano/bio-interface. The paradigm shift from classic to nano-QSAR currently relies on both theoretically and experimentally derived descriptors, and the solutions adopted for modeling are diverse, mirroring the structural and behavioral heterogeneity of nanomaterials. Research should focus on both aspects of a QSAR study: the generation of nanospecific theoretical descriptors and experimental test data.
The 2023 EURL ECVAM Status Report outlines research and development activities, along with initiatives that foster the implementation and utilisation of non-animal methods and approaches in scientific research and regulation. The Three Rs principle, which advocates for Replacement, Reduction, and Refinement of animal use in basic, applied, and translational research, as well as for regulatory purposes, is firmly established in EU legislation, with the ultimate goal of fully replacing animal testing. New approach methodologies encompassing a range of innovative technologies, including in vitro methods employing 3D tissues and cells, organ-on-chip technologies, computational models (including machine learning and artificial intelligence), and 'omics (transcriptomics and metabolomics), are developed, evaluated, and integrated into assessment frameworks in order to enhance the efficiency and effectiveness of hazard and risk assessment of chemicals and products across various regulatory contexts. Furthermore, substantial efforts are directed at promoting the development and utilisation of non-animal approaches in fundamental and applied research, where the majority of animal testing occurs, as well as for educational purposes. The achievements and accomplishments documented in this report are the culmination of collaborative efforts with EURL ECVAM's dedicated partners and stakeholders.
The 2022 EURL ECVAM Status Report describes research, dissemination and promotion activities undertaken recently by the Joint Research Centre's EU Reference Laboratory for alternatives to animal testing (EURL ECVAM) to further the uptake and use of non-animal methods and approaches in science and regulation. The principle of the Three Rs, i.e. Replacement, Reduction and Refinement of animal use in basic, applied and translational research, as well as for regulatory testing purposes, is firmly anchored in EU legislation, with complete phasing out of animal procedures being the ultimate goal. This is achievable through a transition to new approach methodologies and models based on innovative non-animal technologies and better understanding of biology and disease. The activities and results described in this report have only been possible through fruitful collaboration with EURL ECVAM's many committed partners and stakeholders.
The 2021 EURL ECVAM status report describes research, development and validation activities, as well as initiatives that promote the uptake and use of non-animal methods and approaches in science and regulation. The principle of the Three Rs, i.e. Replacement, Reduction and Refinement of animal use in basic, applied and translational research, as well as for regulatory purposes, is firmly anchored in EU legislation, full replacement of animal testing being the ultimate goal. New Approach Methodologies including a variety of innovative technologies, such as in vitro methods using 3D tissues and cells, organ-on-chip, computational models (including AI) and 'omics (genomics, proteomics, metabolomics) technology are developed, evaluated and integrated in assessment frameworks with a view to improving the efficiency and effectiveness of chemical and product hazard and risk assessment in a variety of regulatory contexts. Important activities to promote the development and use of non-animal approaches are also pursued in the areas of basic and applied research, where most of the animals are used, as well as for education purposes.
The annual EURL ECVAM status report describes research, development and validation activities, as well as initiatives that promote the regulatory use and international adoption of non-animal methods and approaches and their dissemination in the regulatory and research arenas. EU policies and legislation call for innovative and more efficient ways of safety testing and chemical risk assessment that do not depend on animal testing. Advanced technologies such as computational models, in vitro methods and organ-on-chip devices are being developed, evaluated and integrated to translate mechanistic understanding of toxicity into safety testing strategies. The ultimate goal is to achieve better protection of human health and the environment while supporting EU innovation and industrial competitiveness, without the use of animals. The development and use of non-animal models and methods are also essential for advancing basic, applied and translational research. Education also plays an essential role in enabling a shift to non-animal methods through the introduction of the Three Rs (Replacement, Reduction and Refinement of animal use in science) into secondary school curricula and programmes of higher education.
The European Union Reference Laboratory for alternatives to animal testing (EURL ECVAM) is an integral part of the European Commission's Joint Research Centre. The EURL ECVAM status report informs about research and development activities, test method submissions, validation and peer reviews, activities that promote the regulatory acceptance and international adoption of alternative methods and approaches, and their dissemination. Replacement, reduction and refinement of animal testing (Three Rs) in basic, applied and translational research, as well as for regulatory purposes, is anchored in EU legislation. Innovative technologies (e.g., computational models, in vitro methods, organ-on-chip devices) are developed, evaluated and integrated in assessment strategies for chemicals and products safety testing in order to preserve a high level of protection for human health and the environment. Important activities to promote the development and use of alternative methods and approaches are also pursued in the areas of basic and applied research as well as for education purposes.
The European Union Reference Laboratory for Alternatives to Animal Testing (EURL ECVAM) is an integral part of the European Commission's Joint Research Centre (JRC). The EURL ECVAM status report provides updates on the progress made in the development, validation and regulatory acceptance and use of alternative methods and approaches and their dissemination. The status report describes research, development and validation activities, as well as initiatives that promote the regulatory and international adoption and use of alternative approaches and their dissemination.
Every year, EURL ECVAM prepares a Status Report, the primary purpose of which is to inform its stakeholders and all interested parties (the public, press etc.) about the status of alternative methods and approaches. The EURL ECVAM status report provides updates on activities since the last report published in October 2016. It reports on research and development, validation activities, as well as activities which promote the regulatory and international adoption and use of alternative approaches and their dissemination. It describes primarily, but not exclusively, all the activities that EURL ECVAM has undertaken or has been involved in since the publication of the last report.
Replacement, Reduction and Refinement of animal testing is anchored in EU legislation. Alternative non-animal approaches facilitate a shift away from animal testing. Cell-based methods and computational technologies are integrated to translate molecular mechanistic understanding of toxicity into safety testing strategies.
The EURL ECVAM status report provides an update on the progress made in the development, validation and regulatory acceptance of alternative methods and approaches and their dissemination since the last report published in June 2014. It reports on ongoing research and development activities, validation studies, peer reviews, recommendations, strategies, regulatory/international acceptance of alternative methods and approaches, and dissemination activities. R&D activities within large European or international consortia continued in toxicity areas where 3Rs solutions are more difficult to find due to the underlying complexity of the area. On the other hand, in toxicity areas where promising non-animal approaches have been developed, their validation and regulatory acceptance/international adoption could be progressed. Particular emphasis was given to the best and most intelligent combination and integration of these different non-animal approaches to ultimately obtain the required information without resorting to animal testing.
The EURL ECVAM status report provides an update on the progress made in the development, validation and regulatory acceptance of alternative methods and approaches since the last report published in April 2013. It reports on ongoing research and development activities, validation studies, peer reviews, recommendations, strategies and international acceptance of alternative methods and approaches. R&D activities are ongoing for the complex endpoints where the toxicological processes and the mechanistic understanding have not yet been sufficiently elucidated and for which 3Rs solutions are more difficult to find. On the other hand, good progress in validation and regulatory acceptance is being made in areas where non-animal alternative methods have been developed and validated and where the focus lies on an intelligent combination/integration of the various non-animal approaches.
Provisions of Regulation No 1223/2009 on cosmetic products require that the European Commission reports on a yearly basis to the European Parliament and Council on the progress made in the development, validation and regulatory acceptance of alternative methods and on the compliance with the deadlines of the animal testing and marketing bans. This EURL ECVAM technical report provides an update since 2010 on the state of play of alternative methods for all the toxicological areas relevant to the Cosmetics Regulation and supplements the 2013 Commission Communication on the animal testing and marketing ban and on the state of play in relation to alternative methods in the field of cosmetics. Overall good progress has been made in the validation and regulatory acceptance in areas such as local toxicity where the underpinning science is more advanced and mature alternative methods are available. For very complex endpoints on the other hand, such as chronic systemic toxicity, carcinogenicity or reproductive toxicity, efforts are predominantly focused on research and development where the emphasis is on the integration of a variety of methods based on mechanistic understanding. The future is bright however, since considerable advances in new in vitro technologies, systems biology, bioinformatics and computational modelling are driving a paradigm shift in toxicological testing and assessment where non-animal methods will ultimately become the tools of choice.
This paper reviews different approaches described in the literature for estimating the aquatic toxicity of chemical substances. It is based on an extended review performed by the European Chemicals Bureau of the European Commission's Joint Research Centre in support of the development of technical guidance for the implementation of the REACH legislation, and is one of a series of minireviews in this journal. The paper is organised by approach to (Q)SAR development and includes a review of: (i) (Q)SARs for acute aquatic toxicity by chemical class, (ii) (Q)SARs for acute aquatic toxicity by mode of action, (iii) statistically derived (Q)SARs, (iv) structural alerts for excess aquatic toxicity and (v) expert systems that combine structural rules and multiple (Q)SAR models to predict aquatic toxicological endpoints. Directions and recommendations for further research are also provided.
This paper reviews the current status of structure-based methods for predicting adverse reproductive effects in mammals. The methods described include (Quantitative) Structure–Activity Relationships ((Q)SARs), expert systems and the less formalised approaches of read-across within (chemical) categories. There are a number of problems with applying QSARs to reproductive toxicology, notably the complexity, subtlety and sometimes ill-defined nature of the endpoints, and the lack of data available for modelling. A small number of (Q)SARs have been developed for individual classes of compounds for well-defined effects. These are supplemented by expert system approaches of all types [e.g. DEREK for Windows, TOPKAT, MC4PC, PASS, Organisation for Economic Co-operation and Development (OECD) QSAR Application Toolbox] for a variety of endpoints associated with reproductive toxicology. By far the largest, and best developed, group of models are those for receptor binding effects related to endocrine disruption, in particular to the Oestrogen Receptor (ER) and, to a lesser extent, the Androgen Receptor (AR). Strategies to improve predictive capabilities for reproductive toxicology are suggested.
This paper reviews the state-of-the-art of in silico methods for assessing dermal and ocular irritation and corrosion. It is based on an in-depth review performed by the European Chemicals Bureau of the European Commission's Joint Research Centre in support of the development of technical guidance for the implementation of the REACH legislation, and is one of a series of minireviews in this journal. The most widely used in silico approaches are classified into methods to assess (1) skin irritation, (2) skin corrosion and (3) eye irritation. In this review, emphasis is placed on literature-based (Q)SAR models.
This paper is based on an in-depth review performed by the European Chemicals Bureau of the European Commission's Joint Research Centre in support of the development of technical guidance for the implementation of the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) legislation, and is one of a series of minireviews in this journal. Compared with the QSAR modelling of ecotoxicity, the modelling of mammalian toxicity is more complicated. This is reflected by the relatively small number of QSAR studies on mammalian toxicity published in the literature. The main problem is related to wide variations in the accuracy of the in vivo data, in the organisms used, and also in the complexity and poor understanding of the mechanisms involved, especially for long-term effects. Nevertheless, the efforts already made to develop QSAR models for mammalian toxicity demonstrate the usefulness of the approach not only for predictive purposes but also for a better understanding of the multiple mechanisms involved in the toxicity. However, there is a need to better characterise existing models in accordance with the OECD QSAR validation principles, to develop new models, and to investigate the applicability of individual models in the regulatory context, such as for the purposes of the REACH legislation.
This paper reviews the state-of-the-art of in silico methods for estimating the ready biodegradability of organic chemicals, and describes their (potential) applicability in the identification of chemicals that are persistent or very persistent in the environment, such as Persistent Organic Pollutants (POPs). The paper is based on an in-depth review performed by the European Chemicals Bureau (ECB) of the European Commission's Joint Research Centre in support of the development of technical guidance for the implementation of the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) legislation, and is one of a series of minireviews in this journal. The review covers the most commonly used approaches for estimating the ready biodegradability of organic compounds, such as group contribution approaches, expert systems, probabilistic approaches and hybrid approaches. The main sources of experimental data used to develop the models are also briefly described. The review concludes with an appraisal of the challenges of developing new models, and some recommendations are made for further research.
This paper reviews the state of the art of in silico methods for assessing the tendency of a substance to bioconcentrate in aquatic organisms, usually expressed as its Bioconcentration Factor (BCF). It is based on an in-depth review performed by the European Chemicals Bureau of the European Commission's Joint Research Centre in support of the development of technical guidance for the implementation of the REACH legislation, and is one of a series of minireviews in this journal. The most widely used in silico approaches to estimate BCF are described, with emphasis on literature-based (Quantitative) Structure-Activity Relationship [(Q)SAR] models: (1) linear relationships that correlate the BCF with the octanol–water partition coefficient; (2) (Q)SAR models based on experimentally derived descriptors; and (3) new (Q)SAR models aimed at evaluating the potential of other descriptors. Recommendations for further research are also provided.
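As a hedged illustration of the first category above, the sketch below implements a generic linear log BCF vs logKow correlation of the form log10(BCF) = a·logKow + b. The coefficient values are placeholders of the kind reported in the fish BCF literature, not a model endorsed by this review; any real assessment would require a validated model and its applicability domain.

```python
# Illustrative sketch (not the review's model): a simple linear (Q)SAR of the form
# log10(BCF) = a * logKow + b, with placeholder coefficients.

def predict_log_bcf(log_kow: float, a: float = 0.85, b: float = -0.70) -> float:
    """Return an estimated log10 BCF from logKow using a linear correlation.

    The default coefficients are illustrative placeholders; a validated model
    and its applicability domain would be needed for regulatory use.
    """
    return a * log_kow + b

if __name__ == "__main__":
    for log_kow in (2.0, 4.0, 6.0):
        print(f"logKow={log_kow:4.1f}  ->  predicted log BCF={predict_log_bcf(log_kow):5.2f}")
```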
This paper reviews the applicability of different types of non-testing methods and in silico tools in the framework of a structured workflow that aids their exploitation for the prediction of properties that contribute to hazard and risk assessments of chemicals. These properties include basic physicochemical properties, metabolic and environmental fate, and ecological and health effects of chemicals. The workflow for the use of the methods comprises a structured sequence of operations that integrates the functionalities of a wide array of in silico tools. The workflow could be used for in-house decision making (e.g. screening the properties of potential drugs and commercial chemicals) as well as for generating data required in regulatory submissions. The general workflow presented here is intended to be broadly applicable to all endpoints and different regulatory frameworks, including the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) legislation in the European Union (EU). The general framework can be adapted to meet the needs of specific chemicals, endpoints and regulatory purposes. This review is one of a series of minireviews in this journal.
A physiologically-based pharmacokinetic (PBPK) model represents the structural components of the body with physiologically relevant compartments connected via blood flow rates described by mathematical equations to determine drug disposition. PBPK models are used in the pharmaceutical sector for drug development and precision medicine, and in the chemical industry to predict safe levels of exposure during the registration of chemical substances. However, one area of application where PBPK models have been scarcely used is forensic science. In this review, we give an overview of PBPK models successfully developed for several illicit drugs and environmental chemicals that could be applied for forensic interpretation, highlighting the gaps, uncertainties, and limitations.
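To make the compartmental idea concrete, here is a minimal, purely illustrative flow-limited PBPK sketch in Python (not a model from the review): venous blood, a liver compartment with hepatic clearance, and a lumped "rest of body" compartment connected by blood flows. All volumes, flows, partition coefficients and the clearance value are hypothetical placeholders.

```python
# Minimal flow-limited PBPK sketch (illustrative only). All parameters are
# hypothetical placeholders, not values from any published model.
import numpy as np
from scipy.integrate import solve_ivp

V_blood, V_liver, V_rest = 5.0, 1.8, 35.0   # compartment volumes (L)
Q_liver, Q_rest = 90.0, 260.0               # blood flows (L/h)
P_liver, P_rest = 3.0, 1.5                  # tissue:blood partition coefficients
CL_int = 20.0                               # intrinsic hepatic clearance (L/h)

def pbpk(t, y):
    c_blood, c_liver, c_rest = y
    # venous return from each tissue at concentration C_tissue / P_tissue
    dc_blood = (Q_liver * (c_liver / P_liver - c_blood)
                + Q_rest * (c_rest / P_rest - c_blood)) / V_blood
    dc_liver = (Q_liver * (c_blood - c_liver / P_liver)
                - CL_int * c_liver / P_liver) / V_liver
    dc_rest = Q_rest * (c_blood - c_rest / P_rest) / V_rest
    return [dc_blood, dc_liver, dc_rest]

dose_mg = 10.0
y0 = [dose_mg / V_blood, 0.0, 0.0]          # IV bolus into the blood compartment
sol = solve_ivp(pbpk, (0.0, 24.0), y0, t_eval=np.linspace(0.0, 24.0, 7))
for t, cb in zip(sol.t, sol.y[0]):
    print(f"t={t:5.1f} h   blood concentration = {cb:6.3f} mg/L")
```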
This report describes the application of chemoinformatic methods to explore the applicability of the Threshold of Toxicological Concern (TTC) approach to cosmetic ingredients. For non-cancer endpoints, the most widely used TTC approach is the Cramer classification scheme, which categorises chemicals into three classes (I, II and III) depending on their expected level of concern for oral systemic toxicity (low, medium, high, respectively). The chemical space of the Munro non-cancer dataset was characterised to assess whether this underlying TTC dataset is representative of the “world” of cosmetic ingredients, as represented by the COSMOS Cosmetics Inventory. In addition, the commonly used Cramer-related Munro threshold values were applied to a toxicological dataset of cosmetic ingredients, the COSMOS TTC dataset, to assess the degree of protectiveness resulting from the application of the Cramer classification scheme. This analysis is considered preliminary, since the COSMOS TTC data...
Safety assessment for repeated dose toxicity is one of the largest challenges in the process to replace animal testing. This is also one of the proof-of-concept ambitions of SEURAT-1, the largest ever EU research initiative on alternative testing, co-funded by the European Commission and Cosmetics Europe. This paper is based on the discussion and outcome of a workshop organised on the initiative of the SEURAT-1 consortium, joined by a group of international experts with complementary knowledge to further develop traditional read-across and include new approach data. The aim of the suggested strategy for chemical read-across is to show how a traditional read-across based on structural similarities between source and target substance can be strengthened with additional evidence from new approach data, e.g., information from in vitro molecular screening, '-omics' assays and computational models, to reach regulatory acceptance. We identified four read-across scenarios that cover typi...
Papers (2016-Present) by Andrew P Worth
Although considerable progress has been made, and is ongoing, towards overcoming these shortcomings, there are still gaps in the knowledge of nanomaterial behavior; the scientific results are fragmented, and there is a need for more harmonization and standardization. A lack of public access to the results and tools is preventing uptake and use of the models in regulatory decision-making. More guidance is needed and, in particular, more coordination is required to direct model development towards elucidating relevant knowledge gaps and addressing questions and endpoints relevant for regulatory applications.
Overall, a number of recommendations are made with a view to increasing the availability and uptake of computational methods for regulatory nanosafety assessment. They concern inherent scientific uncertainties in the understanding of nanomaterial behavior; data quality and variability; standardization and harmonization; the availability of data and predictive models; and the accessibility and visibility of models, including the need for infrastructures.
Two other consultations conducted under this Fitness Check, one focused on stakeholder organisations and the other on small and medium-sized enterprises, will be reported separately. Responses will provide an essential input to the Fitness Check analysis carried out by the JRC. A more detailed analysis of the responses from the three consultations will be published in a synopsis report at the end of the process.
In this report we propose that in vitro studies could contribute to the identification of potential triggers for DNT evaluation, since existing cellular models permit the evaluation of a chemical's impact on key neurodevelopmental processes, mimicking different windows of human brain development, especially if human models derived from induced pluripotent stem cells are applied. Furthermore, the battery of currently available DNT alternative test methods, anchored to critical neurodevelopmental processes and key events identified in DNT Adverse Outcome Pathways (AOPs), could be applied to generate in vitro data useful for various regulatory purposes. Incorporation of in vitro mechanistic information would increase scientific confidence in decision making, by decreasing uncertainty and leading to refinement of chemical grouping according to biological activity. In this report, the development of AOP-informed IATA (Integrated Approaches to Testing and Assessment) based on key neurodevelopmental processes is proposed as a tool not only for speeding up chemical screening, but also for providing mechanistic data in support of hazard assessment and the evaluation of chemical mixtures. Such mechanistically informed IATA for DNT evaluation could be developed by integrating various sources of information (e.g., non-testing methods, in vitro approaches, as well as in vivo animal and human data), contributing to screening for prioritization, hazard identification and characterization, and possibly safety assessment of chemicals, thereby speeding up the evaluation of the thousands of compounds present in industrial, agricultural and consumer products that lack safety data on DNT potential. It is planned that the data and knowledge generated from such testing will be fed into the development of an OECD guidance document on alternative approaches to DNT testing.
For 178 chemicals previously tested in vitro with the 3T3 BALB/c cell line using the Neutral Red Uptake cytotoxicity assay, physicochemical parameters were retrieved and curated. Of these chemicals, 83 were run in the VCBA to simulate a 96-well microplate set-up with 5% serum supplementation, and their no effect concentration (NEC) and killing rate (Kr) were optimized against the experimental data. Analysis of the chemicals' partitioning results shows a strong relation with their lipophilicity, expressed here as the logarithm of the octanol/water partitioning coefficient, with highly lipophilic chemicals binding mostly to medium lipid. Among the chemicals analysed, only benzene and xylene were modelled to evaporate by more than 10%, and these were also the chemicals with the highest degradation rates during the 48-hour assay. Chemical degradation depends not only on the air and water degradation rates but also on the extent of binding of the chemical.
Due to the strong binding of some chemicals to medium lipids and proteins, we analysed the impact of different serum supplementations (0%, 5% and 10%) on the chemical dissolved concentrations. As expected, for the more lipophilic chemicals, different serum levels result in different dissolved concentrations, with lipid and protein binding reducing chemical loss by evaporation. Still, the lack of saturation modelling might be misleading for the 0% supplementation condition, since the lipids coming solely from cell exudates are able to sequester chemical to a large extent; e.g. after 48 hours, 63% (1.2E-5 M) of dimethyldioctadecylammonium chloride was bound to lipid from the cells. Although highly lipophilic chemicals have a very small bioavailable fraction, the cellular uptake rate is also dependent on logKow, which compensates for this lack of bioavailability to some extent.
Based on the relevance of lipophilicity to in vitro chemical bioavailability, we have developed an alert system based on logKow, creating four classes of chemicals for the experimental condition with 10% serum supplementation: logKow 5-10 (A), logKow <5 (B), logKow <2.5 (C), and logKow <2 (D). New chemicals from Classes A and B, which will in the future be tested in vitro, were first run on the VCBA without considering toxicity (NEC and Kr set to 0). VCBA simulations indicated that these chemicals are more than 50% bound to medium proteins, lipids and plastic. Therefore, for chemicals with logKow falling in these classes, special care should be taken when extrapolating the obtained in vitro toxic concentrations to in vivo relevant doses.
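A small sketch of how such a logKow-based alert scheme could be applied is given below. It reads the class limits as non-overlapping bands (our assumption: A 5-10, B 2.5-5, C 2-2.5, D <2) and flags Classes A and B, for which the VCBA simulations indicated more than 50% binding to medium proteins, lipids and plastic; the example logKow values are approximate literature figures used only for illustration.

```python
# Sketch of a logKow-based alert classification, assuming non-overlapping bands:
# A: 5 <= logKow <= 10, B: 2.5 <= logKow < 5, C: 2 <= logKow < 2.5, D: logKow < 2.
# Classes A and B are flagged for extra care in in vitro-in vivo extrapolation.

def logkow_alert_class(log_kow: float) -> tuple[str, bool]:
    """Return (class label, caution flag) for a chemical's logKow."""
    if 5.0 <= log_kow <= 10.0:
        return "A", True
    if 2.5 <= log_kow < 5.0:
        return "B", True
    if 2.0 <= log_kow < 2.5:
        return "C", False
    return "D", False

if __name__ == "__main__":
    # approximate literature logKow values, for illustration only
    for name, kow in [("caffeine", -0.07), ("benzene", 2.13), ("DDT", 6.91)]:
        cls, flag = logkow_alert_class(kow)
        note = "check medium binding before extrapolating" if flag else "low binding concern"
        print(f"{name:10s} logKow={kow:5.2f} -> class {cls} ({note})")
```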
A comparison of the VCBA-predicted dissolved concentrations corresponding to nominal IC50 values with the available rat oral LD50 values did not improve the previously obtained correlations. This is probably because other in vivo kinetic processes play an important role but were not considered in this in vitro-in vivo extrapolation.
A local sensitivity analysis showed the relatively low impact of Molar Volume and Molecular Diffusion Volume on the final dissolved concentration, supporting the use of approximate values obtained through the QSARs created here. LogKow and the Henry's Law constant showed, as expected, a high impact on partitioning. The killing rate was also shown to have a relatively low impact on the final chemical concentration, indicating that, although its optimization is important, finding the Kr that leads to the absolute best correlation between experimental and predicted concentration-viability curves is not imperative. The VCBA can be applied to virtually any chemical as long as the physicochemical data (for the fate model) and the experimental toxicity data (including cell growth/death) are available. However, being such a generic model, several assumptions had to be made: i) no distinction of chemical classes (inorganic, polar organic chemicals), ii) no consideration of metabolism, iii) no modelling of binding saturation, and iv) external in vitro conditions.
The advantage of having a generic model is that the VCBA can fit several experimental set-ups; it should be used in an exploratory manner, to help refine experimental conditions. The partitioning results obtained here with the VCBA should be double-checked experimentally with a set of chemical compounds, to better understand to what extent the VCBA represents chemicals of different properties.
In future developments, it would be important to reduce the uncertainties of the model, such as binding saturation, and to consider the inclusion of other endpoints such as metabolic activity.
The validity of the two-year bioassay has, however, been questioned over the last decade. Uncertainty is associated with the extrapolation of data from rodents to humans. Furthermore, these studies are extremely time- and resource-consuming, and the high animal burden has raised ethical concerns. For all these reasons, there is a strong demand for alternative strategies and methods. The development of new in vitro methods for carcinogenicity testing, however, has progressed slowly, and those available are far from being accepted for regulatory decision making, especially when evaluating the carcinogenicity of non-genotoxic chemicals or specific classes of compounds such as biologicals, microorganisms and nanomaterials.
The European Union Reference Laboratory for alternatives to animal testing (EURL ECVAM) has carried out an analysis of carcinogenicity testing requirements and assessment approaches across different sectors. This consisted of: a systematic review of the different testing requirements; a review of the number of animals used per sector; an estimation of the number of carcinogenicity and genotoxicity studies conducted or waived in respect of the number of substances authorized per sector per year; a review of the type of justifications for waiving the two-year bioassay and how opportunities for waiving are being used.
Results from this analysis will provide context for initiatives aimed at: 1) reducing the need for animal use where animal testing is still a requirement; 2) ensuring an adequate carcinogenicity assessment in sectors where animal use is banned or limited; and 3) improving the process of cancer hazard identification where existing methods are not suitable.
For the purposes of this work, data on acute and chronic aquatic toxicity (Daphnia and fish) from several databases (US EPA Ecotox database, Aquatic ECETOC, Aquatic OASIS, Aquatic Japan MoE databases and the ECHA database as implemented in the OECD QSAR Toolbox Version 2.3) were collated and analysed. Simple linear relationships and interspecies sensitivity ratios were calculated using either acute Daphnia data (48 h LC50) or chronic Daphnia data (14-day NOEC) and chronic fish data (>21-day NOEC). Acute-to-chronic relationships and acute-to-chronic ratios (ACR) were also calculated based on acute fish data (96 h LC50) and chronic fish data. These analyses were carried out on the whole set of chemicals and on subgroups of chemicals classified according to the Verhaar mode of action (MOA) scheme, which attributes a general mode of acute aquatic toxic action based on the chemical structure of the molecule. Outliers were identified by applying the Robust regression and Outlier removal (ROUT) method.
Our results show that the best-fitting relationships for the prediction of chronic fish toxicity are obtained based on acute fish data (r2=0.87) and acute Daphnia data (r2=0.64) when dealing with the whole set of chemicals regardless of the MOA. The quality of the relationships was improved by using the geometric mean (calculated across all the values extracted for a given chemical and a given endpoint) instead of the lowest value for a given endpoint.
When considering the MOA, MOA 3 and MOA 1 chemicals give the strongest acute Daphnia to chronic fish and chronic Daphnia to chronic fish relationships; however, the relationships obtained with acute Daphnia data are better (r2=0.83 and 0.69 for MOA 3 and MOA 1, respectively) than those obtained with chronic Daphnia data (r2=0.66 and 0.65 for MOA 1 and 3, respectively). When considering acute fish data, all the MOA classes give strong relationships (r2=0.88 for MOA 3 and MOA 5 chemicals, 0.85 for MOA 4 chemicals and 0.83 for MOA 1 and MOA 2 chemicals). Therefore, when acute toxicity data on fish are available, they might provide a reliable basis for extrapolating chronic fish toxicity as a first-tier assessment or within a weight-of-evidence approach.
There is a correlation between chemicals with high ACR values or interspecies sensitivity ratios and the outliers identified in the above-mentioned relationships. Among the chemicals with a high interspecies sensitivity ratio where Daphnia is more sensitive than fish, several aniline derivatives and pesticides acting through cholinesterase inhibition were identified. Among high interspecies sensitivity ratio chemicals for which Daphnia is less sensitive than fish, we found pesticides and known endocrine disruptors such as ethinyl oestradiol and 17β-oestradiol. Extreme (i.e. <1 or >100) interspecies sensitivity ratios were mainly evident for MOA 2, 4 and 5 chemicals. Regarding the ACR for fish, around 50% of the chemicals in each MOA class have an ACR within a factor of 10, whereas 100% of MOA 3, 90.9% of MOA 2, 88.3% of MOA 4 and 85.5% of MOA 1 chemicals have an ACR within a factor of 100. Therefore, the safety factor of 100 commonly applied in environmental risk assessment does not seem to be equally protective for every MOA.
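For readers less familiar with these quantities, the short Python sketch below reproduces the basic calculations referred to above (per-chemical geometric means, acute-to-chronic ratios and a log-log regression of chronic on acute fish toxicity) on invented placeholder data; it is not the analysis pipeline used in the study.

```python
# Illustrative calculations: geometric means, ACR = acute LC50 / chronic NOEC,
# and a log-log regression of chronic NOEC on acute LC50. All values are invented.
import numpy as np

def geometric_mean(values):
    return float(np.exp(np.mean(np.log(values))))

# hypothetical acute 96 h LC50 values (mg/L), possibly several per chemical,
# reduced to geometric means, and matching hypothetical chronic NOECs (mg/L)
acute = np.array([geometric_mean(v) for v in ([12.0, 9.5], [1.4, 2.0], [0.30], [45.0, 60.0])])
chronic = np.array([1.1, 0.15, 0.02, 6.0])

acr = acute / chronic
print("ACR per chemical:", np.round(acr, 1),
      "| share within a factor of 100:", np.mean(acr <= 100) * 100, "%")

# log-log regression: log10(chronic NOEC) = a * log10(acute LC50) + b
a, b = np.polyfit(np.log10(acute), np.log10(chronic), 1)
r2 = np.corrcoef(np.log10(acute), np.log10(chronic))[0, 1] ** 2
print(f"log10(NOEC) = {a:.2f} * log10(LC50) + {b:.2f}   (r2 = {r2:.2f})")
```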
Twenty-one case studies were identified, which included human and environmental risk assessments. Several compound classes and environmental media were covered, i.e. pesticides, phthalates, parabens, PBDEs, pharmaceuticals, food contact materials, dioxin-like compounds, anti-androgenic chemicals, contaminants in breast milk, mixtures of contaminants in surface water, ground water and drinking water, and indoor air. However, the selection of chemical classes is not necessarily representative, as many compound groups have not been covered. The selection of these chemical classes is often based on data availability, recent concerns about certain chemical classes or legislative requirements. Several of the case studies revealed a concern due to combined exposure for certain chemical classes, especially when considering specific vulnerable population groups. This is very relevant information, but it needs to be interpreted with caution, considering the underlying assumptions, model parameters and associated uncertainties. Several parameters that could lead to an over- or underestimation of risks were identified. However, there is clear evidence that chemicals need to be addressed not only in single-substance risk assessments, and that mixtures should also be considered across chemical classes and legislative sectors.
Furthermore, several issues hampering mixture risk assessments were identified. In order to perform a mixture risk assessment, the composition of the mixture in terms of chemical components and their concentrations needs to be known, and relevant information on their uptake and toxicity is required. Exposure data are often lacking and need to be estimated based on production and use/consumption information. Relevant toxicity data are also not always available. Toxicity data gaps can be filled, e.g. by using the Threshold of Toxicological Concern approach. Reference values used in single-substance risk assessments can be found for several chemical classes; however, they are usually derived based on the lowest endpoint. If a refined toxicity assessment of a mixture for a specific effect/cumulative assessment group is envisaged, this is often hampered by a lack of specific toxicity and mode of action information.
In all case studies, concentration addition based assessments were made, mainly applying the Hazard Index. To further characterise the drivers of the mixture risk, the maximum cumulative ratio was calculated in several case studies. This showed that the scientific methodologies to address mixtures are mostly agreed and lead to reasonable predictions. However, especially for some groups of compounds that are designed as active substances, it cannot be excluded that interactions occur, and these should therefore be addressed on a case-by-case basis.
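As a hedged illustration of the metrics mentioned above, the sketch below computes hazard quotients, the Hazard Index as their sum under concentration addition, and the maximum cumulative ratio (MCR, the Hazard Index divided by the largest single hazard quotient) for a toy mixture; the exposure and reference values are invented placeholders.

```python
# Sketch of concentration-addition metrics: HQ_i = exposure_i / reference_i,
# HI = sum(HQ_i), MCR = HI / max(HQ_i). All numbers are invented placeholders.

def mixture_metrics(exposures: dict[str, float], reference_values: dict[str, float]):
    """Return hazard quotients, the Hazard Index and the maximum cumulative ratio."""
    hq = {c: exposures[c] / reference_values[c] for c in exposures}
    hi = sum(hq.values())
    mcr = hi / max(hq.values())
    return hq, hi, mcr

if __name__ == "__main__":
    exposures = {"substance_A": 0.010, "substance_B": 0.002, "substance_C": 0.030}  # mg/kg bw/day
    refs = {"substance_A": 0.050, "substance_B": 0.004, "substance_C": 0.300}       # mg/kg bw/day
    hq, hi, mcr = mixture_metrics(exposures, refs)
    print("HQ:", {k: round(v, 2) for k, v in hq.items()})
    print(f"HI = {hi:.2f} (potential concern if > 1), MCR = {mcr:.2f}")
```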
Most of the mixtures addressed in the identified case studies examined specific chemical groups. Only a few of them looked at mixtures comprising chemicals regulated under different legislative frameworks. The examples indicated that there is evidence for combined exposure to chemicals regulated under different legislation, as well as evidence that such chemicals can elicit similar effects or have a similar mode of action. A mixture risk assessment across regulatory sectors should therefore be further investigated.
The results of this preliminary analysis show that the Munro dataset is broadly representative of the chemical space of cosmetics, although certain structural classes are missing, notably organometallics, silicon-containing compounds, and certain types of surfactants (non-ionic and cationic classes). Furthermore, compared with the Cosmetics Inventory, the Munro dataset has a higher prevalence of reactive chemicals and a lower prevalence of larger, long linear chain structures. The COSMOS TTC dataset, comprising repeat dose toxicity data for cosmetics ingredients, shows a good representation of the Cosmetics Inventory in terms of physicochemical property ranges, structural features and chemical use categories. Thus, this dataset is considered to be suitable for investigating the applicability of the TTC approach to cosmetics. The results of the toxicity data analysis revealed a number of cosmetic ingredients in Cramer Class I with No Observed Effect Level (NOEL) values lower than the Munro threshold of 3000 µg/kg bw/day. The prevalence of these “false negatives” was less than 5%, which is the percentage expected by chance resulting from the use of the 5th percentile of the cumulative probability distribution of NOELs in the derivation of TTC values. Furthermore, the majority of these false negatives do not arise when structural alerts for DNA-binding are used to identify potential genotoxicants, to which a lower TTC value of 0.0025 µg/kg bw/day is typically applied. Based on these preliminary results, it is concluded that the current TTC approach is broadly applicable to cosmetics, although a number of improvements can be made, through the quality control of the underlying TTC datasets, modest revisions/extensions of the Cramer classification scheme, and the development of explicit guidance on how to apply the TTC approach.
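The protectiveness check described above can be expressed compactly in code. The sketch below flags a Cramer Class I ingredient as a potential "false negative" when its NOEL falls below the Munro Class I threshold of 3000 µg/kg bw/day, unless a DNA-binding structural alert would already route it to the much lower genotoxicity TTC; the ingredient names, NOELs and alert flags are invented placeholders.

```python
# Sketch of the "false negative" check for the TTC protectiveness analysis.
# Only the 3000 µg/kg bw/day Class I NOEL threshold and the use of DNA-binding
# alerts are taken from the text; all example data are invented.

MUNRO_CLASS_I_NOEL = 3000.0  # µg/kg bw/day (5th percentile NOEL, Cramer Class I)

def is_false_negative(noel_ug_kg_day: float, cramer_class: str, dna_binding_alert: bool) -> bool:
    """Flag Cramer Class I chemicals whose NOEL undercuts the Class I threshold,
    excluding those caught by a DNA-binding alert (handled by the genotoxicity TTC)."""
    if cramer_class != "I" or dna_binding_alert:
        return False
    return noel_ug_kg_day < MUNRO_CLASS_I_NOEL

if __name__ == "__main__":
    dataset = [  # (name, NOEL µg/kg bw/day, Cramer class, DNA-binding alert) - hypothetical
        ("ingredient_1", 5000.0, "I", False),
        ("ingredient_2", 1200.0, "I", False),
        ("ingredient_3", 800.0, "I", True),
    ]
    flagged = [name for name, noel, cls, alert in dataset if is_false_negative(noel, cls, alert)]
    rate = 100 * len(flagged) / len(dataset)
    print(f"Potential false negatives: {flagged} ({rate:.0f}% of this toy set)")
```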
Objectives: The aim of the suggested strategy for chemical read-across is to show how a traditional read-across based on structural similarities between source and target substance can be strengthened with additional evidence from new approach data--for example, information from in vitro molecular screening, "-omics" assays and computational models--to reach regulatory acceptance.
Methods: We identified four read-across scenarios that cover typical human health assessment situations. For each such decision context, we suggested several chemical groups as examples to demonstrate when read-across between group members is possible, considering both chemical and biological similarities.
Conclusions: We agreed to carry out the complete read-across exercise for at least one chemical category per read-across scenario in the context of SEURAT-1, and the results of this exercise will be completed and presented by the end of the research initiative in December 2015.
New approach methodologies encompassing a range of innovative technologies, including in vitro methods employing 3D tissues and cells, organ-on-chip technologies, computational models (including machine learning and artificial intelligence), and 'omics (transcriptomics and metabolomics), are developed, evaluated, and integrated into assessment frameworks in order to enhance the efficiency and effectiveness of hazard and risk assessment of chemicals and products across various regulatory contexts. Furthermore, substantial efforts are directed at promoting the development and utilisation of non-animal approaches in fundamental and applied research, where the majority of animal testing occurs, as well as for educational purposes. The achievements and accomplishments documented in this report are the culmination of collaborative efforts with EURL ECVAM's dedicated partners and stakeholders.
The principle of the Three Rs, i.e. Replacement, Reduction and Refinement of animal use in basic, applied and translational research, as well as for regulatory purposes, is firmly anchored in EU legislation, with full replacement of animal testing being the ultimate goal.
New Approach Methodologies, encompassing a variety of innovative technologies such as in vitro methods using 3D tissues and cells, organ-on-chip, computational models (including AI) and 'omics (genomics, proteomics, metabolomics) technologies, are developed, evaluated and integrated into assessment frameworks with a view to improving the efficiency and effectiveness of chemical and product hazard and risk assessment in a variety of regulatory contexts. Important activities to promote the development and use of non-animal approaches are also pursued in the areas of basic and applied research, where most of the animals are used, as well as for education purposes.
EU policies and legislation call for innovative and more efficient ways of safety testing and chemical risk assessment that do not depend on animal testing. Advanced technologies such as computational models, in vitro methods and organ-on-chip devices are being developed, evaluated and integrated to translate mechanistic understanding of toxicity into safety testing strategies. The ultimate goal is to achieve better protection of human health and the environment while supporting EU innovation and industrial competitiveness, without the use of animals.
The development and use of non-animal models and methods are also essential for advancing basic, applied and translational research. Education also plays an essential role in enabling a shift to non-animal methods through the introduction of the Three Rs (Replacement, Reduction and Refinement of animal use in science) into secondary school curricula and programmes of higher education.
Replacement, Reduction and Refinement of animal testing (the Three Rs) in basic, applied and translational research, as well as for regulatory purposes, is anchored in EU legislation. Innovative technologies (e.g., computational models, in vitro methods, organ-on-chip devices) are developed, evaluated and integrated in assessment strategies for chemical and product safety testing in order to preserve a high level of protection for human health and the environment. Important activities to promote the development and use of alternative methods and approaches are also pursued in the areas of basic and applied research as well as for education purposes.