Last year, a long-planned European Union ban on using animals to test the safety of cosmetic ingredients went into effect. The motivation for the new rule stems in part from concerns about the ethics of testing the safety of chemicals on living creatures, but scientists say there is more at play here than a moral conundrum.
Reliance on animal testing, according to several experts, has actually hindered the evaluation of many chemicals and ingredients inside and outside the cosmetics industry. Animal-based tests take too long and are too expensive, they say, often requiring several years and millions of dollars or more to carry out.
The significant physiological differences between humans and the mice, rats, and other animals used to evaluate the safety of chemicals also can limit the validity of the results. “We need a better way to test toxic chemicals,” says toxicologist Kristie Sullivan, director of regulatory testing issues at the non-profit Physicians Committee for Responsible Medicine.
In 2009, industry trade group Cosmetics Europe, together with the European Commission's Research & Innovation directorate, launched a $68-million research initiative to develop lab technologies and computational models capable of predicting the toxicity of chemicals in humans. Although cosmetics companies are anxiously awaiting these alternatives, the so-called Safety Evaluation Ultimately Replacing Animal Testing (SEURAT) program was not meant to produce a quick solution. The scientists involved with SEURAT, which is made up of five individual research clusters and involves more than 70 universities, research groups, and biotechnology companies, say these new technologies will take at least five more years to develop.
As the researchers push the boundaries of their fields to build these new tests, one thing is clear. “The future direction clearly has to involve computation,” says Elmar Heinzle, a chemical engineer at the University of Saarland in Germany and a leader of NOTOX, one of the SEURAT research clusters.
Mechanisms of Action
Pharmaceutical companies and regulatory agencies have been using computational tools to evaluate toxicity for years. Generally, these methods look at the structure of the chemical compound or molecule under consideration, compare it against a database of chemicals with known toxicological effects, and search for substances with similar chemical structures. If a new chemical or ingredient lines up with one that has proven toxic in previous tests, this suggests the new compound could be similarly troublesome.
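The structure-comparison idea can be reduced to a toy sketch: represent each compound as a set of structural features, score similarity with the Tanimoto coefficient, and flag any new compound that closely matches a known toxin. The feature names and threshold below are invented stand-ins; real screening tools use chemical fingerprints generated by cheminformatics software.

```python
def tanimoto(a, b):
    """Tanimoto similarity between two sets of structural features."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def flag_candidates(new_compound, known_toxins, threshold=0.7):
    """Return names of known toxins structurally similar to the new compound."""
    return [name for name, feats in known_toxins.items()
            if tanimoto(new_compound, feats) >= threshold]

# Feature sets here are invented stand-ins for real substructure fingerprints.
known = {
    "toxin_A": {"benzene_ring", "nitro_group", "chloride"},
    "toxin_B": {"epoxide", "aldehyde"},
}
candidate = {"benzene_ring", "nitro_group", "chloride", "hydroxyl"}
print(flag_candidates(candidate, known))  # → ['toxin_A']
```

As the article notes, a high similarity score is only a warning sign, not proof of toxicity; the threshold controls how aggressively candidates are flagged.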
Instead of relying on structural similarities, NOTOX technologies will base predictions on actual biological mechanisms of action. Heinzle says scientists have reached a consensus that there are similarities in terms of how toxic chemicals act within the body. “Although the whole systems are very, very complex and the number of possible interactions is huge,” he says, “we have found that the number of pathways that lead to adverse outcomes is very limited.”
The NOTOX plan is to analyze these pathways in the lab, picking out key events such as a molecule binding to a particular protein, and then using these data to bolster virtual, or in silico, models. A combination of laboratory and computer modeling would then be used to evaluate whether a new chemical would spark reactions that match those critical, harm-inducing events. Heinzle acknowledges this approach leaves open the possibility of a molecule acting via an unknown mechanism, but he adds the same holds true for animal testing, and that it would be difficult to eliminate all risk. The combination of powerful models and robust lab tests, he argues, should reduce that risk to acceptable levels.
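The key-event logic described above can be sketched in a few lines: each adverse outcome pathway is reduced to an ordered chain of events, and a compound is flagged when lab or model data show it triggering a pathway's initiating event. The pathway and event names below are invented for illustration and do not come from the SEURAT program.

```python
# Each pathway maps to its chain of key events; the first entry is the
# molecular initiating event (e.g., a molecule binding a particular protein).
# Names are hypothetical placeholders, not real NOTOX pathway definitions.
PATHWAYS = {
    "cholestasis": ["binds_bile_salt_pump", "bile_acid_accumulation", "liver_injury"],
    "steatosis":   ["activates_lipid_receptor", "lipid_accumulation", "liver_injury"],
}

def pathways_triggered(observed_events, pathways=PATHWAYS):
    """Return pathways whose molecular initiating event was observed for a compound."""
    return [name for name, events in pathways.items()
            if events[0] in observed_events]

# Suppose lab assays show the compound binding the bile-salt export pump:
print(pathways_triggered({"binds_bile_salt_pump", "membrane_binding"}))
# → ['cholestasis']
```

This is the screening half of the approach; as Heinzle cautions, a compound acting through a mechanism absent from the pathway catalog would slip past such a check.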
The scientists say the time is right for the new approach, in part because of more advanced electron microscopes and other imaging technologies that allow scientists to capture the inner architecture of organs and their cells. Additionally, in vitro, or cell-culture-based, studies of toxicity have vastly improved. In the past, scientists often relied on short-lived, two-dimensional cultures of human liver cells to test for possible toxic effects; now they not only have more accurate 3D cell cultures at their disposal, but these cultures also survive longer, allowing researchers to study the effects of a substance over an extended period.
Virtual Body
Cosmetics present a unique challenge for the researchers. Although generally designed to cling to the surface of skin and not move through the body like a drug, the compounds in cosmetics can still seep inside. An ingredient in a facial lotion, especially if it is lipophilic (attracted to fats), might move through the upper layers of skin into the subcutaneous tissue, which contains fat and small blood vessels; some of the compound might then leak into those vessels, travel through the bloodstream, heart, and lungs, and eventually reach the liver.
Modeling this complex chain of events is a difficult task for the researchers. “In the near future, one of the big challenges will be understanding exposure,” says Mark Cronin, a computational toxicologist at Liverpool John Moores University in the U.K., and the leader of one of the SEURAT research groups. “If you apply cosmetics to your skin, how much will get to a particular organ? How much will get into your blood? How much will get to the liver? And then, we need to be able to predict whether or not that will be toxic.”
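The exposure questions Cronin poses can be framed as a compartment model. Below is a minimal sketch that tracks a skin-applied dose moving into the blood and then the liver via simple first-order transfers, integrated with Euler steps. The rate constants are arbitrary illustrative values, not measured parameters, and real physiologically based models involve many more compartments and processes.

```python
def simulate(dose_on_skin, k_absorb=0.05, k_liver=0.1, dt=0.1, t_end=24.0):
    """Euler integration of skin -> blood -> liver transfer.

    Amounts are in mg, time in hours; rate constants (per hour) are
    hypothetical placeholders, not fitted values.
    """
    skin, blood, liver = dose_on_skin, 0.0, 0.0
    t = 0.0
    while t < t_end:
        absorbed = k_absorb * skin * dt   # fraction crossing the skin into blood
        extracted = k_liver * blood * dt  # fraction taken up by the liver
        skin -= absorbed
        blood += absorbed - extracted
        liver += extracted
        t += dt
    return skin, blood, liver

skin, blood, liver = simulate(100.0)
print(f"skin={skin:.1f} blood={blood:.1f} liver={liver:.1f} mg")
```

The sketch answers, in miniature, exactly the questions in the quote: how much remains on the skin, how much is in the blood, and how much has reached the liver after a given time. Toxicity prediction would then compare the liver amount against a threshold derived from the pathway work.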
To do so, the SEURAT scientists will eventually interweave models of each of these systems—the skin, heart, lungs, and more—to trace that path. In these first few years of the program, several NOTOX researchers have been focusing on developing a virtual model of the liver, the main detoxifying organ in the human body. Dirk Drasdo, a biophysicist and bioinformatics expert at the French research agency INRIA, toxicologist Jan Hengstler of the University of Dortmund, and their colleagues are working toward a more realistic interpretation of the liver, from the shape and architecture of the organ down to the individual hepatocytes (liver cells).
“In vitro and in silico testing will play a much larger part in how we assess chemicals in the future.”
The model is still in its early stages, but Drasdo, Hengstler, and their co-workers have already shown its potential. In a 2010 paper in the Proceedings of the National Academy of Sciences (see Hoehme et al. below), the pair and a team of colleagues detailed an early version that showed how the liver would respond when exposed to a known toxic compound. The model’s findings matched up with laboratory tests of the same exposure, verifying its accuracy, but it also revealed a never-before-seen process that proved central to the liver’s ability to heal itself. While this result showed the potential of a virtual liver, the ultimate goal is a model that can recreate the actual processes taking place inside. “This would be as if you could look into the liver and follow the fate of all the molecules you are interested in,” Drasdo says.
Outlook
The liver is only one important aspect of the overall toxicological picture, and even the predictive model Drasdo describes remains a relatively distant goal. “To really establish this as a solid tool could take another five to 10 years,” he notes.
In the more immediate future, the SEURAT effort has other goals to achieve. Cronin notes that one of the program’s first tasks has been to create good databases upon which the computer models can draw. “That might sound trivial,” he says, “but organizing the toxicological data has turned out to be very challenging.”
The range of scientific and logistical hurdles remaining is not good news to the cosmetics companies. When the ban went into effect last year, Cosmetics Europe, the industry group funding the research, protested that it was too soon to eliminate animal testing as the alternatives were not yet robust enough. In the meantime, they are effectively hamstrung; if they identify a new ingredient that has not previously been proven safe, they basically have to wait, as there are few ways to validate its safety now.
Still, experts say the research is progressing faster than they expected. Regardless of the exact timing, the scientists insist some combination of laboratory-based and virtual work is the future of testing.
“I don’t think there is any going back at this point in time. I think the train has left the station,” says toxicologist Bette Meek of the University of Ottawa. “In vitro and in silico testing will play a much larger part in how we assess chemicals in the future. It will happen; it is just a question of how quickly.”
Further Reading
SEURAT Research Initiative. Towards the replacement of in vivo repeated dose systemic toxicity testing. The Annual Scientific Report of the SEURAT Research Initiative, Vol. 3, 2013. http://www.seurat-1.eu/
Niklas, J., Bucher, J., et al. Quantitative evaluation and prediction of drug effects and toxicological risk using mechanistic multiscale models. Molecular Informatics, Nov. 2012.
Gunness, P., Mueller, D., et al. 3D organotypic cultures of human HepaRG cells: A tool for in vitro toxicity studies. Toxicological Sciences, 133:1, 2013.
Hoehme, S., et al. Prediction and validation of cell alignment along microvessels as order principle to restore tissue architecture in liver regeneration. PNAS, June 8, 2010.
NOTOX
A video introduction to the computer-modeling-focused research cluster: http://notox-sb.eu/film