Toxicity testing aims to evaluate the harm that exposure to chemicals, including medicines, industrial and consumer products, and food additives, may cause to humans and the environment. Currently, however, toxicity testing is usually performed on animals, including rabbits, rats, mice, dogs, cats, primates, hamsters, and even fish. Tests are performed by exposing animals to very high doses of chemicals, often at levels 100 to 1,000 times higher than those to which humans would typically be exposed. Some tests are required by agencies such as the Environmental Protection Agency (EPA) before a company can market a product, such as a pesticide. Generally, the United States does not require tests on cosmetics or household products, with the exception of antibacterial cleaning products, which must be tested.
Toxicity tests can last anywhere from four hours to several days or months, up to animals' entire life spans. The animals are observed for toxic effects, including vomiting, diarrhea, convulsions, respiratory distress, appetite or weight loss, rashes, salivation, paralysis, lethargy, bleeding, organ abnormalities, tumors, and, ultimately, death. In some tests, animals are exposed to chemicals and then bred with other animals; these animals are then observed for harm to the reproductive system, or their offspring are observed for birth defects.
Toxicity tests conducted on nonhuman animals are very controversial, for both ethical and scientific reasons. But nonanimal methods can now test thousands of potentially toxic chemicals at once and provide toxicity information more relevant to humans.
What are some concerns with the use of animals in toxicity testing?
Animal-based testing—which causes millions of animals to suffer unalleviated pain and death—has questionable value for predicting human health effects.
To use the results of tests performed on animals to predict human reactions, scientists and regulators must assume that the biological and physical processes of other mammals respond to chemicals much as those of humans do. But as scientists are beginning to understand, that assumption is often incorrect. All animals share broad aspects of biology, but the details of how each species absorbs, metabolizes, and excretes chemicals vary widely. These species differences, combined with the use of very high testing doses, make animal toxicity testing results difficult to extrapolate to humans. This can lead to more testing, delaying appropriate chemical regulations.
Testing on animals is also very costly and time consuming. A series of tests on one chemical costs around $6 million and can take three years to plan, conduct, and summarize. Because of these constraints, toxicity assessment needs are increasingly outpacing the capacity of toxicity testing laboratories.
In addition, these animal-based methods are clearly not the best science. A paradigm shift in regulatory testing is needed to achieve more human-relevant results. Some in vitro (cell culture) methods that provide more accurate chemical hazard information are already available, and more are being developed. These methods, which identify specific human toxicity responses at the molecular level, are much faster because they use high-throughput robotic technology to screen thousands of chemicals at once.
What is the Toxic Substances Control Act (TSCA)? Is TSCA in need of modernization?
The Toxic Substances Control Act (TSCA) is federal legislation passed in 1976 that regulates industrial chemicals. Congress is currently reviewing TSCA to see if it should be modernized to meet current standards in science and regulation. It is being revisited in light of legislative changes to the regulation of industrial chemicals in other regions of the world, most notably Europe's Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) program. Although REACH will lead to millions of additional animals being killed in toxicity tests, it also requires nonanimal testing methods and integrated testing strategies for chemical risk assessment when possible.
If toxicity-testing methods are not updated in tandem with TSCA reform, tens of millions of additional animals could be killed to fulfill new data requirements. Furthermore, reliance on animal testing will only delay effective regulation; the longer toxicity testing takes, the longer the public is exposed to untested and potentially toxic chemicals.
How can chemical regulatory legislation be more effective?
To be effective, a revised TSCA needs to include strong language requiring toxicity testing techniques that rely less on animal testing and more on humane and human-relevant methods that adequately and efficiently determine risks to human health before chemicals reach the market. Chemicals already in commerce should be prioritized and tested based on potential risk to public health.
What is the Safe Chemicals Act of 2011 (S. 847)?
On April 14, 2011, Sen. Frank Lautenberg introduced the Safe Chemicals Act of 2011 (SCA). The bill was similar to the Safe Chemicals Act of 2010 in that it would extensively amend the existing law and require producers of industrial chemicals to better assess the potential hazards of chemicals before they reach the market. Because of the limitations of animal-based toxicity testing, the goals of SCA will not be accomplished through reliance on current toxicity testing methods.
SCA contains a number of provisions designed to reduce the number of animals killed in chemical testing and to direct the EPA to fund and develop nonanimal test methods, as recommended by the National Academy of Sciences. The bill gives the EPA the authority and flexibility to modify testing requirements based on a chemical's characteristics or on what is already known about that chemical or a group of similar chemicals. While this is excellent news, these provisions are only a first step. Most importantly, the EPA must be given the authority to require nonanimal tests.
What is “21st-Century Toxicology” and how will it improve chemical safety?
“Twenty-first Century Toxicology” or “Tox21” refers to a diverse set of efforts intended to implement a report written by the National Research Council called Toxicity Testing in the 21st Century: A Vision and a Strategy. The report details a step-by-step plan to modernize toxicity testing using methods that better predict human responses to chemicals, rather than relying on animal tests to guess the effects in humans. Through Tox21, university, government, and private efforts and partnerships are working to achieve toxicity testing that is more efficient, uses fewer and eventually no animals, and is more beneficial to human and environmental safety. This new paradigm rests on methods that can identify biologically based mechanisms of toxicity, prioritize chemicals for evaluation, and develop more predictive models of human responses to chemicals.
Tox21 strives to:
achieve broad coverage of chemicals, outcomes, and life stages through scientific methods and data sharing;
reduce the cost and time required for toxicity testing;
develop a more robust scientific basis for assessing health effects of environmental chemicals; and
minimize the use of animals.
With the success of Tox21, the expenses of toxicity testing (time, money, and animal suffering) would be greatly reduced. Christopher Austin, M.D., Director of the NIH Chemical Genomics Center, has observed that "it would take a person eight hours a day, five days a week, for 12 years to do what we do in three days." Scientists will also be able to determine how chemical effects, such as cancer, are caused with great accuracy and precision, creating a more efficient chemical regulation process. Tox21 offers a path to a more advanced, and safer, chemical market.
Using more human-relevant methods, as the NRC Vision recommends, will allow regulatory agencies like the EPA to better understand the direct effects of a chemical on humans at a molecular level. Because many tests can be run at once, scientists can investigate the effects of chemical mixtures, and a larger range of low and high doses, on human cells and tissues. The NRC Vision also recommends a nationally integrated biomonitoring program to detect potential chemical exposures before they become harmful. Eventually, these methods will allow a much more thorough examination of the potential effects of chemicals on sensitive populations, such as developing children, workers, or those with genetic susceptibilities. Finally, they will save a great deal of time and money and make it possible to assess many more chemicals than animal tests currently allow.
What about wildlife? How will Tox21 protect our environment?
Currently, some chemicals, mostly pesticides, are tested in one species of bird and a few species of fish in a laboratory setting. Just as with animal testing intended to predict human health effects, this scheme has crucial weaknesses. First, it is extremely unethical to confine undomesticated animals in a laboratory setting. Also, wild animals are exposed to many chemicals at once, not just one, and each species may respond to a chemical exposure differently: a test in a quail does not necessarily predict the response in a robin. Finally, the dearth of information on chemicals' potential effects on wildlife dwarfs even the human health question; this problem is too big to be tested away using inefficient animal-based methods. Because the toxicity testing scheme proposed by the National Research Council is intended to be cost- and time-efficient and species-specific, it presents a real solution to the currently imperfect protection of wild animals.
How will the Tox21 vision be reached?
The NRC report lays out specific steps to modernize toxicity testing. A focused research effort will be needed to develop, assess, and implement the alternative methods recommended in the report. This vision would be accomplished in phases, beginning with elucidating toxicity pathways and developing relevant databases. This information would be used to develop testing methods that address specific biological elements of toxicity. These assays would then be evaluated for reliability and relevance to human responses and used in place of current chemical toxicity tests. Some of these efforts are already underway in several areas of the federal government and in private and public research institutes, but a much larger and more concerted effort has yet to emerge.
But we need not wait for these developments to apply some of the scientific advances that have already been made. Right now, many nonanimal methods are well developed and have been shown to be reliable. They can be used, along with exposure information, to prioritize chemicals for further testing. High-throughput screens can screen thousands of compounds quickly and relatively inexpensively. These screens have been used by pharmaceutical companies to prioritize lead compounds for development; these same approaches are currently being applied to chemicals for toxicity assessment.
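As a rough illustration of how such screens feed prioritization, the sketch below ranks compounds by how many assays flag them as active. The assay names, compounds, scores, and threshold are all invented for this example; real high-throughput programs involve thousands of compounds and statistically derived activity calls.

```python
# Hypothetical sketch: prioritizing compounds from high-throughput screening
# results. Assays, compounds, and scores are invented for illustration.

def prioritize(screen_results, activity_threshold=0.5):
    """Rank compounds by how many assays flag them as active.

    screen_results maps compound name -> {assay name: activity score 0..1}.
    Returns compound names sorted from most to least concerning.
    """
    def hit_count(item):
        _, assays = item
        return sum(1 for score in assays.values() if score >= activity_threshold)

    return [name for name, _ in
            sorted(screen_results.items(), key=hit_count, reverse=True)]

results = {
    "compound_A": {"nuclear_receptor": 0.9, "stress_response": 0.7, "cytotoxicity": 0.2},
    "compound_B": {"nuclear_receptor": 0.1, "stress_response": 0.3, "cytotoxicity": 0.1},
    "compound_C": {"nuclear_receptor": 0.6, "stress_response": 0.8, "cytotoxicity": 0.9},
}

# Compounds with the most assay hits come first.
print(prioritize(results))  # → ['compound_C', 'compound_A', 'compound_B']
```

Regulators could then direct scarce follow-up testing at the top of such a ranked list rather than testing every chemical exhaustively.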
What is an “Integrated Testing Strategy”?
An Integrated Testing Strategy, or ITS, is a process of holistic and iterative chemical evaluation based on all available information. Such an evaluation can be used to tailor a testing program to the particular characteristics of a chemical, as an alternative to conducting a long list of animal tests on each chemical. To assess a chemical or group of chemicals, one gathers all available information, including uses and exposures, physico-chemical properties, in vitro, in silico, toxicological, and epidemiological data, and all data on any similar chemicals. Then one assesses this information to determine the likely hazards a chemical may pose, before considering any testing. Testing then proceeds in a step-wise or tiered manner, and an assessment of the weight of the available evidence is made again before further testing is considered.
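The tiered logic described above can be sketched in code. The tier names, confidence numbers, and the simple weight-of-evidence arithmetic below are hypothetical simplifications; a real ITS is an expert-driven evaluation, not a formula.

```python
# Hypothetical sketch of a tiered Integrated Testing Strategy decision loop.
# Tier names and the confidence model are invented for illustration only.

TIERS = [
    "existing data and read-across",   # no new testing
    "in silico predictions",           # computer models
    "in vitro assays",                 # cell-based tests
]

def assess(chemical, evidence_for, confidence_needed=0.8):
    """Walk the tiers, stopping as soon as the weight of evidence suffices.

    evidence_for maps a tier name to the confidence (0..1) that tier's data
    provides about the chemical's hazard. Returns (decision, tiers_used).
    """
    confidence, used = 0.0, []
    for tier in TIERS:
        used.append(tier)
        # Weight-of-evidence step: each tier fills part of the remaining gap.
        confidence += (1 - confidence) * evidence_for.get(tier, 0.0)
        if confidence >= confidence_needed:
            return "sufficient evidence, no further testing", used
    return "escalate to targeted testing", used

decision, tiers_used = assess(
    "example chemical",
    {"existing data and read-across": 0.6, "in silico predictions": 0.7},
)
print(decision, tiers_used)  # stops after two tiers; no animal test reached
```

The key design point is the early exit: each tier is only entered when the cheaper, faster tiers before it have failed to settle the question.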
As new tools evolve, ITS approaches have become popular as a way to reduce the use of animals in toxicity tests. Under the EPA's High Production Volume (HPV) Challenge Program, PCRM and Dow Chemical used such an approach to determine that a reproductive and developmental toxicity test of commercial hydroxyethylpiperazine (CHEP) did not need to be conducted. Using information about similar chemicals and a computer model based on known properties of human skin, the company determined that because CHEP is not absorbed through the skin, and workers wear protective equipment, toxic effects from skin exposure were extremely unlikely. This approach spared approximately 675 animals.
Check the Resources page to see a few more examples of ITS approaches.
What are some Tox21 approaches for chemical reform?
Basic strategies that save animals include the release of existing data on industrial chemicals, whether from industry files or other testing programs like HPV or REACH, and the formation of categories of chemicals to allow read-across of hazard data among similar chemicals. Chemicals should be prioritized based on human or environmental exposures, and priority should be given to chemicals of known concern, such as those that bioaccumulate or persist in the environment.
A minimum list of toxicity tests to be conducted on all chemicals is not appropriate—and simply not feasible. Instead, integrated testing strategies, which rely on all existing information in a weight-of-evidence approach, should be created for chemicals or groups of chemicals.
In vitro and in silico (computer-based) techniques can be used now to test many chemicals at once. This would reduce the uncertainty that currently exists and give regulatory agencies much more information than is currently possible using animal tests, allowing prioritization of potentially toxic chemicals for further evaluation. This information could also be used to group chemicals into categories. Chemicals with similar structures or functions often have similar toxicities, so toxicity information that already exists for one chemical can be applied to other chemicals in a category.
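A minimal sketch of that category-based read-across idea, assuming invented chemical names, category labels, and numeric hazard scores, might look like this:

```python
# Hypothetical read-across sketch: group chemicals into structural categories
# and fill hazard data gaps from tested members of the same category.
# All names, categories, and hazard values are invented for illustration.

from collections import defaultdict

def read_across(chemicals):
    """chemicals: list of dicts with 'name', 'category' (structural class),
    and optional 'hazard' score. Returns name -> (hazard, source)."""
    by_category = defaultdict(list)
    for chem in chemicals:
        by_category[chem["category"]].append(chem)

    inferred = {}
    for members in by_category.values():
        known = [m["hazard"] for m in members if "hazard" in m]
        for m in members:
            if "hazard" in m:
                inferred[m["name"]] = (m["hazard"], "measured")
            elif known:
                # Borrow the most conservative (worst) hazard in the category.
                inferred[m["name"]] = (max(known), "read-across")
            else:
                inferred[m["name"]] = (None, "data gap")
    return inferred

chems = [
    {"name": "phthalate_1", "category": "phthalate", "hazard": 2},
    {"name": "phthalate_2", "category": "phthalate"},   # untested analog
    {"name": "glycol_1",    "category": "glycol"},      # untested, no analog
]
print(read_across(chems))
```

Here the untested phthalate inherits its tested analog's hazard score, while the chemical with no tested analog is flagged as a true data gap needing evaluation.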
To be successful, chemical reform needs to provide strong incentives encouraging the modernization of toxicity testing approaches. Revised legislation should require the use of nonanimal methods before any animal testing is considered, and it should include a public review process before new animal testing takes place. Finally, and most importantly, Congress should provide a significant amount of funding and organizational support for the development, evaluation, and implementation of alternative methods in accordance with NRC's Vision for Toxicity Testing in the 21st Century.
What can I do to get involved?
Receive our action alerts and join our Facebook cause to stay up to date on TSCA reform. PCRM will get in touch with you when it's time to take action.
If you are a scientist or physician and would like to help PCRM advocate for toxicity testing reform, please contact research at pcrm dot org.
How can I get more information?
Visit the Resources page for articles and abstracts about this topic. The National Research Council report and more information about Tox21 can be found at Chemical Testing Basics. If you have comments or questions, please e-mail us at research at pcrm dot org or call 202-686-2210.