It is used in the preparation of environmental impact assessments, and in many circumstances in which human activities may cause harmful effects on the natural environment.
In some cases this may involve collecting data related to events in the distant past, such as gases trapped in ancient glacier ice.
The development of new chemicals and industrial processes has introduced new pollutants into the atmosphere, or raised the levels of existing ones, and has been accompanied by environmental research and regulation, all of which has increased the demand for air quality monitoring.
Passive samplers, such as diffusion tubes, have the advantage of typically being small, quiet, and easy to deploy, and they are particularly useful in air quality studies that determine key areas for future continuous monitoring.[19]
Soil monitoring has historically focused on more classical conditions and contaminants, including toxic elements (e.g., mercury, lead, and arsenic) and persistent organic pollutants (POPs).
Monitoring programmes have varied over the years, from long-term academic research on university plots to reconnaissance-based surveys of biogeoclimatic areas.[28]
The monitoring process itself may be performed using technologies such as remote sensing and geographic information systems (GIS) to identify salinity via greenness, brightness, and whiteness at the surface level.
In the last 20 years, acid rain, synthetic hormone analogues, halogenated hydrocarbons, greenhouse gases and many others have required changes to monitoring strategies.[30]
One of the most familiar examples is the monitoring of numbers of salmonid fish, such as brown trout or Atlantic salmon, in river systems and lakes to detect slow trends in adverse environmental effects.
Although pathogens are the primary focus of attention, the principal monitoring effort is almost always directed at much more common indicator species such as Escherichia coli,[35] supplemented by overall coliform bacteria counts.
Monitoring strategies can produce misleading answers when they rely on counts of species, or on the presence or absence of particular organisms, without regard to population size.
Therefore, conclusions about the target population are limited and depend entirely on the validity and accuracy of professional judgment; probabilistic statements about parameters are not possible.
Simple random sampling is most useful when the population of interest is relatively homogeneous; i.e., no major patterns of contamination or “hot spots” are expected.
Random systematic sampling is used to search for hot spots and to infer means, percentiles, or other parameters and is also useful for estimating spatial patterns or trends over time.
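As a minimal illustration of the difference between the two designs, the sketch below draws both a simple random sample and a systematic (evenly spaced) sample from a hypothetical grid of candidate sampling points; the grid size, sample sizes and random seed are arbitrary illustrative choices, not part of any monitoring standard.

```python
import random

# Hypothetical study site: a 10 x 10 grid of candidate sampling points
# (coordinates in arbitrary units).
GRID = [(x, y) for x in range(10) for y in range(10)]

def simple_random_sample(points, n, seed=0):
    """Pick n locations entirely at random; suited to fairly homogeneous
    sites where no hot spots are expected."""
    return random.Random(seed).sample(points, n)

def systematic_sample(points, step, seed=0):
    """Pick every `step`-th location after a random start; the even spatial
    coverage helps to reveal hot spots and spatial patterns."""
    start = random.Random(seed).randrange(step)
    return points[start::step]

print(simple_random_sample(GRID, 10))
print(systematic_sample(GRID, 10))
```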
One sampling unit from each set is then selected (based on the observed ranks) for subsequent measurement using a more accurate and reliable (hence, more expensive) method for the contaminant of interest.
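The procedure described here is ranked set sampling. The sketch below is a simplified version of the idea; the field_reading value is a hypothetical stand-in for the inexpensive ranking measurement, and the selected units are those that would go forward for the expensive analysis.

```python
import random

def ranked_set_sample(units, set_size, cheap_score, seed=0):
    """Group units into sets of `set_size`, rank each set with an
    inexpensive screening score, and keep a different rank position from
    each successive set for the more expensive laboratory measurement."""
    rng = random.Random(seed)
    shuffled = list(units)
    rng.shuffle(shuffled)
    chosen = []
    for i in range(0, len(shuffled) - set_size + 1, set_size):
        ranked = sorted(shuffled[i:i + set_size], key=cheap_score)
        rank_to_keep = (i // set_size) % set_size  # rotate through the ranks
        chosen.append(ranked[rank_to_keep])
    return chosen

# Hypothetical data: a quick field reading stands in for the cheap ranking
# method; the selected sites would then be analysed by the expensive method.
rng = random.Random(1)
sites = [{"site": i, "field_reading": rng.uniform(0.0, 5.0)} for i in range(12)]
print(ranked_set_sample(sites, set_size=3,
                        cheap_score=lambda s: s["field_reading"]))
```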
It enables the boundaries of hot spots to be delineated, while all of the data collected can also be used, with appropriate weighting, to give unbiased estimates of the population mean.
For example, an autosampler can be programmed to start taking samples of a river at 8-minute intervals when the rainfall intensity rises above 1 mm / hour.
The trigger in this case may be a remote rain gauge communicating with the sampler by using cell phone or meteor burst[46] technology.
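The trigger logic itself is straightforward; the sketch below shows a hypothetical controller loop in which read_rain_gauge_mm_per_hour and take_sample are placeholders for the gauge telemetry link and the sampler hardware command, not a real instrument API.

```python
import time

RAIN_THRESHOLD_MM_PER_HOUR = 1.0   # trigger level from the example above
SAMPLE_INTERVAL_SECONDS = 8 * 60   # 8-minute sampling interval

def read_rain_gauge_mm_per_hour():
    """Placeholder for the telemetry link to the remote rain gauge."""
    raise NotImplementedError

def take_sample():
    """Placeholder for the command that fires one bottle on the autosampler."""
    raise NotImplementedError

def run_controller(poll_seconds=60):
    """Poll the gauge and, while the rainfall intensity exceeds the trigger
    threshold, take a sample every 8 minutes."""
    last_sample = float("-inf")
    while True:
        if read_rain_gauge_mm_per_hour() > RAIN_THRESHOLD_MM_PER_HOUR:
            now = time.monotonic()
            if now - last_sample >= SAMPLE_INTERVAL_SECONDS:
                take_sample()
                last_sample = now
        time.sleep(poll_seconds)
```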
Such systems routinely provide data on parameters such as pH, dissolved oxygen, conductivity, turbidity and ammonia using sondes.[47]
It is also possible to use gas liquid chromatography with mass spectrometry (GLC/MS) to examine a wide range of potential organic pollutants.
However, dissolved oxygen concentration is difficult to sustain through a pumped system, and GLC/MS facilities can detect micro-organic contaminants from the pipework and glands.
Passive samplers are semi-disposable and relatively inexpensive to produce, so they can be deployed in large numbers, allowing better coverage and more data to be collected.
The use of remote surveillance also allows for the installation of very discreet monitoring equipment, which can often be buried, camouflaged or tethered at depth in a lake or river with only a short whip aerial protruding.
The output of data analysis from remote sensing is typically a set of false colour images that highlight small differences in the radiation characteristics of the environment being monitored.
Active remote sensing emits energy and uses a passive sensor to detect and measure the radiation that is reflected or backscattered from the target.
Orbital platforms collect and transmit data from different parts of the electromagnetic spectrum, which, in conjunction with larger-scale aerial or ground-based sensing and analysis, provides the information needed to monitor trends such as El Niño and other natural long- and short-term phenomena.
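As a minimal sketch of how such a false colour image can be assembled from individual spectral bands, the fragment below maps near-infrared, red and green bands to the red, green and blue display channels; the band choice, the percentile stretch and the synthetic data are illustrative assumptions rather than any particular agency's processing chain.

```python
import numpy as np

def false_colour_composite(nir, red, green):
    """Map the near-infrared, red and green bands to the R, G and B channels
    after a simple percentile stretch; strong near-infrared reflectance
    (e.g. healthy vegetation) then appears bright red in the output."""
    def stretch(band):
        band = band.astype(float)
        lo, hi = np.percentile(band, (2, 98))          # clip extreme values
        return np.clip((band - lo) / (hi - lo + 1e-9), 0.0, 1.0)
    return np.dstack([stretch(nir), stretch(red), stretch(green)])

# Synthetic 100 x 100 pixel bands stand in for real satellite data, which
# would normally be read from an image file or data archive.
rng = np.random.default_rng(0)
nir, red, green = (rng.uniform(0, 4000, size=(100, 100)) for _ in range(3))
rgb = false_colour_composite(nir, red, green)
print(rgb.shape, float(rgb.min()), float(rgb.max()))
```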
The interpretation of environmental data produced from a well designed monitoring programme is a large and complex topic addressed by many publications.
Regrettably, scientists sometimes approach the analysis of results with a preconceived outcome in mind and use or misuse statistics to demonstrate that their own particular point of view is correct.
Since the start of science-based environmental monitoring, a number of quality indices have been devised to help classify and clarify the meaning of the considerable volumes of data involved.[55]
The Environment Agency and its devolved partners in Wales (Countryside Council for Wales, CCW) and Scotland (Scottish Environment Protection Agency, SEPA) now employ a system of biological, chemical and physical classification for rivers and lakes that corresponds with the EU Water Framework Directive.