We do not rely on the strength of the science alone to engineer change.
Precision Toxicology is based on three core concepts. PhyloToxicology replaces mammal models with an evolutionarily diverse suite of non-sentient animal species from across the tree of life, establishing toxicity by evolutionary descent. Quantitative Susceptibility derives safety factors from genetic variability by establishing causation and the sources of variation in susceptibility to chemicals. Embedded Translation engages chemical risk managers, regulators and lawmakers directly in project planning and execution, and in case studies for regulatory application.
Problem: Whole-organism toxicity testing is crucial, but no single model is a perfect human surrogate. Although mammals such as rats are considered the ‘gold standard’ for toxicology research, ample evidence demonstrates that these animals are not perfect predictors of human toxicological adversity. This limitation, coupled with the desire to reduce expenses, improve throughput, and reduce experimentation on animals, has driven recent toxicology efforts to use human-derived cell lines instead. Unfortunately, this approach has major drawbacks when used in isolation to replace whole-organism research: a focus on cells rather than whole animals fails to provide a viable animal-free means of addressing systemic health effects in humans, or of capturing the complex higher-order cell-to-cell interactions that govern biological processes necessary for health. The need for whole-organism testing, combined with the present push for animal alternatives, has encouraged research involving non-mammalian model organisms, an area in which our Centre excels. While this approach has merit owing to the conservation of gene functions and of biochemical and cellular processes across species, reliance on a single model organism cannot overcome the extrapolation limitations of rodent testing. Additionally, outside the scientific community, an invertebrate model such as Drosophila, Daphnia, or Caenorhabditis elegans used in isolation can be met with skepticism (despite conserved or homologous genes and pathways involved in toxicity) owing to a reflexive resistance to associating humans with “lesser” animals.
Solution: Evolutionarily diverse non-sentient organisms plus human cell lines (PhyloToxicology). Our approach overcomes all three limitations (cross-species extrapolation, cell-line constraints, and skepticism toward individual species) by leveraging the power of phylogenetic toxicology to examine not just one species but a suite of biomedically relevant model species representing the major branches of the animal kingdom, in addition to human cell lines (Figure 1). Furthermore, this approach reveals the fundamental biological mechanisms by which organisms respond to toxic perturbation, enabling the prediction of adverse outcomes from molecular and physiological processes rather than from extrapolations based on endpoints such as reproductive failure and death.
Method Overview: Identify evolutionarily conserved molecular toxicity pathways. To identify molecular toxicity pathways, we investigate not just how one model species responds to chemical exposure, but how an evolutionarily diverse suite of organisms responds. Where toxicological pathways are evident throughout the animal kingdom, we can confidently hypothesize that humans share these responses. Moreover, having identified the biochemical processes by which these responses occur, we can determine the appropriate human cell lines for further investigation and perform empirically targeted in vitro research. Effects are observed across this diverse suite of alternative model species by: (1) automated, unbiased measurements of behaviour, reproduction, embryology and morphology, enabling an integrative diagnostic assessment of complex interactions through phenomics; (2) detecting the products of metabolism that indicate changes in health state, using metabolomics; and (3) identifying the genes responsible for these metabolic events via RNA sequencing, using transcriptomics. This multi-omics triangulation of key toxicity events is then used to determine whether the mechanisms of toxicity are shared among organisms, including humans, by evolutionary descent (Figure 1). The conservation of molecular response pathways across vertebrates and invertebrates is inferred by comparative omic approaches computed at each node of the phylogenetic tree. Machine learning is then employed to link mechanistic and chemical characteristics to human toxicity. Algorithms from the emerging field of ‘explainable AI’ reveal statistical associations between responding elements of animal genomes, adverse outcomes, and known pathways of disease, informing the design of toxicity classification biomarkers.
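The logic of inferring conservation by descent can be sketched in a few lines. The toy example below is illustrative only: the species, gene names, and two-branch tree are hypothetical placeholders, not PrecisionTox data or methods. It shows the core idea of computing, at each node of a phylogeny, the set of orthologous responses shared by all descendant species.

```python
# Illustrative sketch only: species names, gene sets, and the tiny tree
# below are hypothetical placeholders, not project data.

# A minimal rooted phylogeny: each internal node lists its children.
tree = {
    "Animalia": ["Vertebrata", "Invertebrata"],
    "Vertebrata": ["zebrafish", "Xenopus"],
    "Invertebrata": ["Drosophila", "Daphnia"],
}

# Orthologous genes observed to respond to the same chemical in each species.
responding = {
    "zebrafish":  {"cyp1a", "hsp70", "gclc"},
    "Xenopus":    {"cyp1a", "hsp70"},
    "Drosophila": {"cyp1a", "gclc"},
    "Daphnia":    {"cyp1a", "hsp70", "gclc"},
}

def conserved_at(node):
    """Genes whose response is shared by every descendant species of `node`,
    i.e. a candidate response conserved by evolutionary descent."""
    if node in responding:            # leaf: a single species
        return responding[node]
    shared = None
    for child in tree[node]:          # internal node: intersect children
        genes = conserved_at(child)
        shared = genes if shared is None else shared & genes
    return shared

# Responses conserved across the whole (toy) animal kingdom:
print(sorted(conserved_at("Animalia")))   # ['cyp1a']
```

A response recovered at the root of the tree, as `cyp1a` is here, is shared by vertebrates and invertebrates alike and is therefore a strong candidate for a human-relevant toxicity pathway worth targeted in vitro follow-up.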
Problem: The adjustment factor is arbitrary. Current methods for establishing safe exposure levels employ benchmark dose estimation, which fits a dose-response curve to data from genetically restricted animal experiments. Thresholds for human exposure are then set by applying an arbitrary adjustment factor that lowers the benchmark dose estimate 100-fold. The adjustment factor creates a buffer to account for inherent differences in susceptibility between the animal test population and a human population, as well as for the individual variation in sensitivity that exists within the genetically diverse human population. But the arbitrary nature of the adjustment factor is problematic: an underestimated factor could put exposed populations at undue risk, while an overestimated factor could incur unnecessary regulatory costs to mitigate exposure that is not actually dangerous, and create undue anxiety in a population. Further scientific development must improve methods for determining exposure thresholds so that they rationally account for genetic variation in susceptibility. By studying the general principles underlying the relationship between genetic variation and susceptibility to harm from chemical exposure, and by using new designs that introduce genetic variability into non-sentient test species, we can harness the power of quantitative genetics to identify the genetic targets controlling the shape of the dose-response relationship, dramatically improving estimates of the intra-species variability component of the adjustment factor.
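The arithmetic being criticized can be made concrete. The sketch below is illustrative only: the one-parameter Hill curve, the EC50 value, and the 10% benchmark response are hypothetical assumptions, not a regulatory calculation. It shows how a benchmark dose is read off a fitted dose-response curve and then divided by the conventional 100-fold factor (often decomposed as 10x inter-species times 10x intra-species), regardless of how much susceptibility actually varies.

```python
# Illustrative arithmetic only: the Hill model, EC50 value, and benchmark
# response level are hypothetical, not a regulatory calculation.

def hill_response(dose, ec50):
    """Fraction of maximal effect under a simple Hill curve (coefficient 1)."""
    return dose / (dose + ec50)

def benchmark_dose(bmr, ec50):
    """Dose producing benchmark response `bmr`, inverting the Hill curve:
    bmr = d / (d + ec50)  =>  d = bmr * ec50 / (1 - bmr)."""
    return bmr * ec50 / (1.0 - bmr)

ec50 = 18.0                          # hypothetical EC50 from an animal study
bmd10 = benchmark_dose(0.10, ec50)   # dose giving a 10% response
reference_dose = bmd10 / 100.0       # conventional 100-fold adjustment factor
                                     # (10x inter-species, 10x intra-species)

print(round(bmd10, 2), round(reference_dose, 4))  # 2.0 0.02
```

The fixed division in the last step is exactly where a data-driven, genetics-based estimate of variability would replace the default 10-fold intra-species component.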
Solution: Study markers of susceptibility. PrecisionTox addresses this challenge by conducting a detailed study of the population genetic variation that modulates the response of molecular toxicity pathways and their biomarkers, and by evaluating linkages between these genetic targets and individual susceptibility using unique, genetically diverse model systems to obtain quantitative estimates of intra-species variability.
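One way such a quantitative estimate could replace the default intra-species factor is sketched below. This is a hedged illustration, not the project's actual statistical design: the per-genotype EC50 values are simulated placeholders standing in for measurements from a genetically diverse test panel, and the percentile-ratio summary is one simple choice among many.

```python
# Illustrative sketch: the individual EC50s below are simulated placeholders
# standing in for measurements from a genetically diverse test panel.
import random
import statistics

random.seed(1)
# Hypothetical per-genotype EC50s (log-normally distributed susceptibility).
ec50s = sorted(random.lognormvariate(3.0, 0.5) for _ in range(500))

median = statistics.median(ec50s)
p01 = ec50s[int(0.01 * len(ec50s))]   # ~1st percentile: most susceptible tail

# A data-driven intra-species factor: how much more susceptible the sensitive
# tail is than the typical individual, replacing a fixed default of 10.
intra_species_factor = median / p01
print(round(intra_species_factor, 1))
```

Under this toy distribution the estimated factor comes out near 3, but the point is not the number: a measured ratio between typical and highly susceptible genotypes gives the adjustment factor an empirical basis that a fixed 10-fold default lacks.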
Problem: Scientific answers are not sufficient; change requires buy-in from the stakeholders. Our direct work with risk managers and regulators has shown us that despite a conceptual commitment to ‘advancing the safety assessment of chemicals without the use of animal testing’, the prospect of departing from mammal-based assays incites anxiety in those who must evaluate evidence of chemical safety. Although ‘replacement’ is the goal, people are reluctant to implement alternative processes based on science that they perceive to be less certain than mammalian testing.
Solution: Co-produce PrecisionTox with key decision makers. Rather than push results towards stakeholders and end users, PrecisionTox integrates them into project planning, implementation, and dissemination to ensure that stakeholder needs for evidence of human applicability are met. A formal Stakeholder Advisory Group is integrated into project management and actively participates in defining project activities, including the identification of socio-technical barriers to uptake of new approach methodologies and the implementation of actions to overcome these barriers, the selection of chemicals for testing, the validation of case studies, and the innovation and exploitation of project results. We translate the discovery of molecular toxicity pathway biomarkers into tailored solutions for specific regulatory needs through case studies conducted in collaboration with risk managers and regulators, advancing the use of biomarkers for the protection of health from chemical exposures. The methods developed by the project will enable harmful chemicals to be classified as carcinogens, hormone disruptors, neurotoxins and other disease-causing agents using reliable molecular toxicity measurements. This will offer a faster and cheaper alternative to animal testing that will help industry design safer chemicals, and will also provide biomarkers of toxicity to detect and control harmful chemicals that are already in the environment.