Artificial radioactive contamination of Earth's environment began with nuclear weapons testing during World War II, but did not become a prominent topic of public discussion until the 1980s.[2]
As demand for the construction of nuclear power plants increased, it became necessary to understand how radioactive material interacts with various ecosystems in order to prevent or minimize potential damage.
Independent researchers collected data on dosage levels and geographical differences among the areas affected by the Chernobyl disaster, allowing them to draw conclusions about the nature and intensity of the damage it caused to ecosystems.[5]
These local studies were the best available resources for containing the effects of Chernobyl, yet the researchers themselves recommended a more cohesive effort among the neighboring countries to better anticipate and control future radioecological issues, especially given the terrorism threats of the time and the potential use of a "dirty bomb."[6]
Japan faced similar issues when the Fukushima Daiichi nuclear disaster occurred, as its government also experienced difficulty organizing collective research efforts.[7]
European scientists from various countries had been pushing for joint efforts to combat radioactivity in the environment for three decades, but governments were hesitant because of the secrecy involved in nuclear research, as technological and military developments remained competitive.[10]
Mosses, lichens, and bivalves such as clams and mussels are often the first organisms in an ecosystem to be affected by fallout,[11] as they are in closest proximity to the abiotic sources of radionuclides (atmospheric, geological, or aquatic transfer).
Radioecology often raises the ethical question of whether to prioritize the protection of human health or the preservation of the environment and the prevention of other species' extinction,[13] but public opinion on this matter is shifting.