Risk Assessment for Water

Risk assessment framework

Tables 1 and 2 show the essential terminology and the methodological framework followed for risk assessment. Hazards, i.e. potential sources of contamination that may cause harm to the local environment, should first be identified and evaluated. The next step is to establish a potential source-pathway-target relationship, such that targets could be exposed to or affected by a hazard. A risk assessment should then be undertaken for each plausible pollutant pathway identified. The risk is evaluated by assessing both the probability of the hazard reaching the target and the significance of the impact should the hazard reach the target (Harris and Herbert, 1994; DOE, 1996; Komnitsas et al., 1998).

Table 1: Risk assessment terminology (Komnitsas et al., 1998)

Table 2: Methodology for risk assessment (Komnitsas et al., 1998)

The aim of a site investigation is to provide appropriate and reliable data for estimating the risk to selected and identified receptors. Site investigation is therefore required to identify the sources of contamination and to gather all available information on site contamination. The data required for risk assessment should provide information on (Ferguson et al., 1998):

  • Location, extent and types of contaminants that may exist
  • Geological and geochemical conditions of the site and surrounding land
  • Hydrogeological and hydrological regime for the broader area
  • Known/anticipated presence and behaviour of receptors
  • Ecological and ecotoxicological characteristics of the site

Many researchers use geostatistics in order to predict the extent of soil and groundwater contamination as well as to calculate risk due to waste disposal (Komnitsas and Modis, 2006 and 2009; Modis et al., 2008).
The evaluation of hazards by comparing test results with established standards is not always adequate, since no risk exists if there is no relationship between source, pathway and target. In line with this approach, the study of fate and transport is mainly concerned with describing and understanding the various pathways or routes through which a receptor might be placed at risk from contamination. Routes through which a contaminant might be transported include soil, groundwater, surface water, uptake or absorption by plants, dust, aerosols, etc. In addition, a contaminant may undergo biological, chemical or physical transformations that affect its toxicity, availability and mobility.
During dispersion in soils (as in the case of OOMW), contaminants (e.g. polyphenols) are affected by a number of physical, geochemical and biological processes, which may attenuate, concentrate, immobilize, liberate, degrade or otherwise transform them. Therefore the precise calculation of risk depends on both the concentration of a contaminant and the route of exposure (Deliverable, PROSODOL project, Action 7, 2011).
The risk for each potential pathway is considered to be a combination of the probability that a hazard will reach the target (e.g. contamination of surface- and groundwater from OOMW) and the magnitude of harm if the target is exposed to the hazard (e.g. if contaminated surface or groundwater is used for irrigation purposes). The probability that a contaminant will reach a target in sufficient concentration to cause harm may be assessed qualitatively according to the scale: high (certain or near certain to occur), medium (reasonably likely to occur), low (seldom likely to occur) or negligible (never likely to occur). The magnitude of harm is assessed as: severe (human fatality or irreparable damage to the ecosystem), moderate (e.g. human illness or injury, negative effects on ecosystem function), mild (minor human illness or injury, minor changes to ecosystem) or negligible (nuisance rather than harm to humans and the ecosystem). The qualitative level of risk associated with each pollutant pathway is then assigned by the combination of the aforementioned probability with the magnitude of harm as shown in Table 3 (Xenidis et al., 2003).

Table 3: Risk assessment rating
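The qualitative combination of probability and magnitude of harm described above can be sketched as a simple lookup. This is a minimal illustration: the matrix entries below are plausible assumptions and not the exact ratings of Table 3 (Xenidis et al., 2003), which should be consulted for an actual assessment.

```python
# Qualitative risk rating for one pollutant pathway, combining the
# probability that a hazard reaches the target with the magnitude of
# harm if it does. Matrix entries are illustrative assumptions only.
RISK_MATRIX = {
    "high":       {"severe": "high",   "moderate": "high",   "mild": "medium",     "negligible": "low"},
    "medium":     {"severe": "high",   "moderate": "medium", "mild": "medium",     "negligible": "low"},
    "low":        {"severe": "medium", "moderate": "medium", "mild": "low",        "negligible": "negligible"},
    "negligible": {"severe": "low",    "moderate": "low",    "mild": "negligible", "negligible": "negligible"},
}

def rate_risk(probability: str, magnitude: str) -> str:
    """Return the qualitative risk level for a pollutant pathway."""
    return RISK_MATRIX[probability][magnitude]

# Example: groundwater contamination from OOMW that is reasonably
# likely to occur ("medium") and would cause moderate harm.
print(rate_risk("medium", "moderate"))  # -> medium
```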

In order to mitigate different risks, the diagram shown in Figure 1 (Ganoulis, 2009) may be used in practice. However, cases where risk analysis may be used as an analytical tool should be distinguished from those where knowledge of the risks is limited and the precautionary principle is therefore applied when deciding on alternative preparedness measures.
A residual or unexpected risk, which has a very small but never zero probability of being realized, is always possible. For this reason, restoration and rehabilitation plans should be developed in advance and effectively applied in case of unexpected natural or technological disasters (e.g. the 1986 Chernobyl nuclear explosion in Europe or the August 2005 hurricane Katrina and the resulting catastrophic floods in New Orleans, USA) (Ganoulis, 2009).

Assessing priorities for risk management

Safe drinking water is a basic need for human development, health and well-being. The World Health Organization (WHO) has published procedures for assessing chemical health risks in drinking water and has established guideline values (WHO, 1994; 1999). These assessments may be used to manage chemical risks to water safety through the development of control and monitoring programs and of national standards for drinking water quality.
However, to make such assessments and develop management strategies for every chemical would be impractical and would require considerable resources, posing problems for many countries. A more effective approach where resources are limited is to identify and focus on priority chemicals for which significant human exposure is expected to occur, taking into consideration that priorities may vary from country to country as well as within countries.
In many countries, the development of appropriate risk management strategies is hampered by a lack of information on the presence and concentration of chemicals in drinking water. Water authorities attempting to identify priority chemicals despite having limited information would benefit from guidance on simple and rapid assessment methods. These could be applied at a national or local level to provide a shortlist of priority chemicals, which could then be more rigorously assessed for health risks.
The WHO Guidelines for Drinking-water Quality (WHO, 2004; 2006) cover both microbial and chemical contaminants of drinking water and describe in detail the scientific approaches used in deriving guideline values. They therefore provide sound guidance for ensuring an appropriate level of safety and acceptability of drinking water and for the development of national standards, while taking into consideration the specific problems and the cultural, social, economic and environmental conditions of a particular country.
The criteria for including specific chemicals in the WHO Guidelines for drinking water quality are any of the following:

  • there is credible evidence of occurrence of the chemical in drinking water, combined with evidence of actual or potential toxicity
  • the chemical is of significant international concern

Applying these criteria, the guidelines list nearly 200 chemicals for which guideline values have been set or considered; this number may vary over time. It is important to note that the lists of chemicals for which WHO guideline values have been set do not imply that all those chemicals will always be present, nor do they imply that specific chemicals for which no guideline values currently exist will not be present in a water supply. However, it is not necessary for national or local authorities to develop risk management strategies for each and every chemical for which guideline values have been set, but rather to identify and select those chemicals that may be of greatest priority for risk management purposes in the particular setting.

Fault tree analysis for risk in water systems

Drinking water systems are vulnerable and subject to a wide range of risks. To avoid suboptimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events.
Fault tree analysis is a risk estimation tool with the ability to model interactions between events. A fault tree models the occurrence of an event based on the occurrence or non-occurrence of other events. An integrated risk analysis of drinking water systems may therefore be carried out by estimating not only the probability of failure but also the mean failure rate of the system.
A fault tree illustrates the interactions between different events using logic gates and shows how the events may lead to system failure, i.e. the top event. Starting with the top event, the tree is developed until the required level of detail is reached. Events whose causes have been further developed are intermediate events, and events that terminate branches are basic events. While the top event can be seen as a system failure, the basic events are component failures (Bedford and Cooke, 2001; Lindhe et al., 2009).
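The gate logic above can be sketched numerically. Assuming independent basic events (a common simplification; see Bedford and Cooke, 2001), an AND gate multiplies probabilities and an OR gate combines them as the complement of joint non-occurrence. The event names and probabilities below are hypothetical.

```python
# Minimal fault tree evaluation under the independence assumption.
def and_gate(*probs: float) -> float:
    """All input events must occur: P = product of P(event_i)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs: float) -> float:
    """At least one input event occurs: P = 1 - prod(1 - P(event_i))."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical top event: "consumer receives unsafe water" occurs if
# the source is contaminated AND treatment fails, OR distribution fails.
p_source_contaminated = 0.05
p_treatment_failure = 0.02
p_distribution_failure = 0.001

p_top = or_gate(
    and_gate(p_source_contaminated, p_treatment_failure),
    p_distribution_failure,
)
print(f"P(top event) = {p_top:.6f}")  # -> P(top event) = 0.001999
```

In a full analysis each basic event would also carry a failure rate and downtime, so that the mean failure rate of the system can be propagated through the same tree.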

Need for quality standards and guidelines for drinking water

Every country should have a policy on drinking-water quality. Effective national programs to control drinking-water quality depend ideally on the existence of adequate legislation, standards and codes. The precise nature of the legislation in each country will depend on national, constitutional and other considerations.
The nature and form of drinking-water standards may vary between countries and regions – no single approach is universally applicable. It is essential in the development and implementation of standards to take into account current and planned legislation relating to the water, health and local government sectors and to assess the capacity of potential regulators in the country, as well as the needs of each country for drinking-water standards.
National and regional standards should be developed from the scientific basis provided by the WHO Guidelines for Drinking-water Quality (WHO, 2004; 2006), adapted to take account of local or national environmental, sociocultural (including dietary) and economic conditions. The guidelines provide further information on the development and implementation of national standards.
The implementation of a successful risk management strategy requires an understanding of the hazards that may affect the quality of water provided to a community. A wide range of chemicals in drinking-water could potentially cause adverse human health effects. The detection of these chemicals in both raw water and in water delivered to consumers is often slow, complex and costly, making it too impractical and expensive to serve as an early warning system. Thus, reliance on water-quality determination alone is insufficient to protect public health.
As it is neither physically nor economically feasible to test for all drinking-water quality parameters, monitoring effort and resources should be carefully planned and directed at significant or key characteristics. A preventive management strategy should be implemented to ensure drinking-water quality. The strategy should combine protection of water sources, control of treatment processes and management of the distribution and handling of water.
A water safety plan should address all aspects of the water supply and should focus on the control of water production, treatment and delivery of drinking-water, up to the point of consumption. The plan provides the basis for a process control methodology to ensure that concentrations of chemicals are acceptable. Such a plan, which is the basis for ensuring water safety, contains three key components:

  • A full system assessment to determine whether an existing or planned drinking-water supply is capable of meeting health-based targets.
  • Identification of measures that will control identified risks and ensure that those health-based targets are met within the system. For each measure, appropriate monitoring procedures should be defined to ensure that deviations from performance criteria are quickly detected.
  • Development of management plans to describe actions to be taken during normal operation or incident conditions and to document the system assessment (including upgrades and improvements), monitoring and communication plans and supporting programs.

Assessment of risk for humans due to fertilizer application

Recalcitrant compounds, such as metals contained in treated or untreated AW, may pose a risk to humans (mainly farmers and their children) as well as to ecosystems when compost/fertilizer is applied to soil. Both direct and indirect exposure to contaminants, e.g. through ingestion of vegetables produced on fertilized soil or of animals fed in these areas, should be taken into consideration. Therefore, the concentrations of each metal in soils, surface water, plant tissue (fruits, vegetables, grains and forage) and animal tissue (fish, beef and dairy products) should be measured (U.S. EPA and CEA, 1999).

The following parameters are to be taken into consideration:

  • characterization of fertilizers produced i.e. composition, patterns of use and application rates
  • characterization of the implementation area i.e. agricultural land use, climate data, soil data, farm size, crop types and plant uptake factors
  • description of receptors i.e. body weight and inhalation rate for humans, as well as exposure pathways and duration

The general framework of the ecological risk assessment is shown in Table 4 and consists of three major phases: 1) problem identification, 2) analysis and 3) risk characterization (U.S. EPA, 1992).

Contaminant fate and transport models are used to determine contaminant concentrations in soil, to estimate risk from the air pathway or from surface water, and to estimate exposure point concentrations in the food chain. The relevant exposure routes for humans are:

  • Direct ingestion of the compost (fertilizer) during application
  • Ingestion of soil amended with compost
  • Inhalation of particles and vapors in the air during and after compost application
  • Ingestion of plants, vegetables and fruits produced on amended soil, as well as of animals fed on these soils
  • Ingestion of fish from streams located adjacent to amended fields

The dose-response assessment determines the most sensitive health effects associated with a compound and attempts to express the relationship between dose and effect in quantitative terms, known as toxicity values or health benchmarks. Generally, health benchmarks are developed by EPA in four types: reference doses (RfDs), reference concentrations (RfCs), cancer slope factors (CSFs) and unit risk factors (URFs).


Table 4: Critical phases of the ecological risk assessment

Phase I

Problem identification

  • Identify stressor characteristics such as type, intensity, duration, frequency, timing, scale
  • Identify the ecosystem potentially at risk
  • Evaluate existing data of ecological effects
  • Select appropriate endpoints, considering ecological relevance and susceptibility to the stressor
  • Develop a conceptual model, working hypothesis regarding how the stressor might affect the ecological components of the ecosystem

Phase II


Characterization of exposure:

  • Characterize the stressor in terms of distribution or pattern of change
  • Characterize the ecosystem
  • Analyze the potential exposure
  • Develop an exposure profile

Characterization of ecological effects:

  • Evaluate the relevant effects data
  • Analyze the ecological response in terms of stressor-response determinations or extrapolations and causal evidence evaluation
  • Develop a stressor-response profile

Phase III

Risk characterization

  • Estimate the risk
  • Integrate the stressor-response and exposure profiles
  • Identify uncertainty in the analyses
  • Describe the risk
  • Summarize the risk assessment
  • Interpret the ecological significance

RfDs and RfCs, used to evaluate non-cancer effects for ingestion and inhalation exposures, respectively, are defined as an estimate of a daily exposure level for the human population, including sensitive subpopulations, that is likely to be without an appreciable risk of deleterious effects during a lifetime (U.S. EPA, 1989). RfDs are expressed in mg of chemical intake per kg body weight per day (mg/kg/d) and RfCs are expressed as mg of chemical per m3 of air (mg/m3). RfCs may be converted into inhalation RfDs in mg/kg/d by multiplying by the inhalation rate and dividing by the body weight.
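The RfC-to-RfD conversion above is a straightforward unit calculation. The default adult values used below (inhalation rate 20 m3/d, body weight 70 kg) are commonly used U.S. EPA assumptions, shown for illustration only; the RfC value is hypothetical.

```python
# Convert an inhalation RfC (mg/m3) to an inhalation RfD (mg/kg/d):
# multiply by the daily inhalation rate and divide by body weight.
def rfc_to_rfd(rfc_mg_per_m3: float,
               inhalation_rate_m3_per_day: float = 20.0,
               body_weight_kg: float = 70.0) -> float:
    """Inhalation RfD in mg/kg/d derived from an RfC in mg/m3."""
    return rfc_mg_per_m3 * inhalation_rate_m3_per_day / body_weight_kg

# Example with a hypothetical RfC of 0.0007 mg/m3:
# 0.0007 * 20 / 70 = 0.0002 mg/kg/d
print(round(rfc_to_rfd(0.0007), 6))  # -> 0.0002
```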

CSFs and URFs may be derived from a number of statistically and/or biologically based models. CSFs and URFs are used to evaluate cancer effects for ingestion and inhalation exposures, respectively. However, they do not represent “safe” exposure levels; rather, they are expressed as an upper-bound slope factor that relates levels of exposure to a probability of effect, or risk. The CSF is expressed in units of (mg/kg/d)-1 and the URF for inhalation exposures in units of (μg/m3)-1.
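Because the slope factor relates exposure linearly to risk, the upper-bound excess lifetime cancer risk follows directly: dose times CSF for ingestion, air concentration times URF for inhalation. The numerical values in the example below are hypothetical, chosen only to show the units.

```python
# Upper-bound excess lifetime cancer risk from slope factors.
def ingestion_cancer_risk(ladd_mg_per_kg_day: float, csf: float) -> float:
    """Risk = lifetime average daily dose (mg/kg/d) x CSF ((mg/kg/d)-1)."""
    return ladd_mg_per_kg_day * csf

def inhalation_cancer_risk(conc_ug_per_m3: float, urf: float) -> float:
    """Risk = lifetime average air concentration (ug/m3) x URF ((ug/m3)-1)."""
    return conc_ug_per_m3 * urf

# Example: a CSF of 1.5 (mg/kg/d)-1 at a dose of 1e-5 mg/kg/d gives an
# excess lifetime cancer risk on the order of 1 in 100,000.
print(ingestion_cancer_risk(1e-5, 1.5))
```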

The health benchmark values used in risk analysis for metals were developed by U.S. EPA and are presented in Table 5.


Table 5: Health benchmark values for metals in fertilizers (U.S. EPA and CEA, 1999)

[Table 5 lists, for each metal considered (including Hg in its elemental, divalent and methyl mercury forms), the oral RfD (mg/kg/d), the RfC (mg/m3), the oral CSF (mg/kg/d)-1 and the inhalation URF (μg/m3)-1. NA = Not available]

The estimated exposure point concentrations in soil, plants and animal products should be combined with toxicity benchmarks and exposure factors, such as exposure duration and ingestion rates, to estimate human health risk.

The risk for all ingestion pathways for a single constituent is assumed to be additive: ingestion exposures are assumed to occur at the same time for the same individuals, and the same health benchmark is applicable to all ingestion exposures. For constituents with non-cancer endpoints, inhalation exposure is additive to ingestion exposure only if the same human health benchmark endpoint is applicable to both pathways.
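For non-cancer endpoints, this additivity is usually expressed through hazard quotients (pathway dose divided by the RfD), summed across the ingestion pathways of a single constituent into a hazard index. The sketch below uses hypothetical doses and a hypothetical oral RfD.

```python
# Hazard index across additive ingestion pathways for one constituent.
def hazard_quotient(dose_mg_per_kg_day: float, rfd_mg_per_kg_day: float) -> float:
    """HQ = dose / RfD; values above 1 indicate potential concern."""
    return dose_mg_per_kg_day / rfd_mg_per_kg_day

# Hypothetical ingestion doses (mg/kg/d) for a single metal.
doses = {
    "soil_ingestion": 2.0e-5,
    "vegetable_ingestion": 6.0e-5,
    "fish_ingestion": 2.0e-5,
}
rfd = 3.0e-4  # hypothetical oral RfD in mg/kg/d

hazard_index = sum(hazard_quotient(d, rfd) for d in doses.values())
print(round(hazard_index, 3))  # -> 0.333
```

Since the hazard index here stays well below 1, the combined ingestion exposure in this hypothetical case would not indicate a concern for that endpoint.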

Similar additions of risk may be considered for different metals in the same product. Among the metals, however, only arsenic is considered a carcinogen by the oral route. All other metals are not carcinogenic by the oral route and do not have common health benchmark endpoints; thus, ingestion exposures to different metal constituents in a single product are not considered additive. No other metal exposures are considered additive because there are no common target organs for the non-cancer human health benchmarks for these metals (U.S. EPA and CEA, 1999).


  • Bedford T. and R.M. Cooke (2001). Probabilistic Risk Analysis: Foundations and Methods. Cambridge University Press, Cambridge, ISBN 0 521 22320 2.
  • Deliverable “Risk map of the pilot municipality in Crete” (March, 2011). LIFE07 ENV/GR/000280 Project “Strategies to improve and protect soil quality from the disposal of olive oil mills’ wastes in the Mediterranean region” (PROSODOL), Action 7, Prepared by Technical University Crete, Hania, Crete, Greece.
  • DOE (1996). Environmental Protection Act 1990 Part IIA Contaminated Land--Draft Statutory Guidance on Contaminated Land, U.K.
  • Ferguson C., D. Darmendrail, K. Freier, B.K. Jensen, J. Jensen, H. Kasamas, A. Urzelai and J. Vegter (editors) (1998). Risk Assessment for Contaminated Sites in Europe. Volume 1. Scientific Basis. LQM Press, Nottingham, ISBN 0 9533090 0 2.
  • Ganoulis J. (2009). Risk Analysis of Water Pollution, Second, Revised and Expanded Edition, WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim, Germany, ISBN: 978-3-527-32173-5.
  • Harris M. and S. Herbert (1994). ICE Design and Practice Guides Contaminated Land Investigation, Assessment and Remediation, Institution of Civil Engineers, U.K.
  • Komnitsas K. and K. Modis (2006). Soil risk assessment of As and Zn contamination in a coal mining region using geostatistics, Sci Total Environ 371, 190-196.
  • Komnitsas K. and K. Modis (2009). Geostatistical risk estimation at waste disposal sites in the presence of hot spots, J Hazard Mater 164, 1185-1190.
  • Komnitsas K., A. Kontopoulos, I. Lazar and M. Cambridge (1998). Risk assessment and proposed remedial actions in coastal tailing disposal sites in Romania, Miner Eng 11, 1179-1190.
  • Lindhe A., L. Rosén, T. Norberg and O. Bergstedt (2009). Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems, Water Research 43, 1641-1653.
  • Modis K., G. Papantonopoulos, K. Komnitsas and K. Papaodysseus (2008). Mapping optimisation based on sampling in earth related and environmental phenomena, Stoch Environ Res Risk Assess 22, 83-93.
  • U.S. Environmental Protection Agency (U.S. EPA) (1989). Interim Procedures for Estimating Risks Associated with Exposures to Mixtures of Chlorinated Dibenzo-p-dioxins and Dibenzofurans (CDDs and CDFs) and 1989 Update. Washington, DC.
  • U.S. Environmental Protection Agency (1992). Framework for Ecological Risk Assessment. EPA 630-R-92-001, Washington DC.
  • U.S. Environmental Protection Agency (U.S. EPA) and Center for Environmental Analysis (CEA) (1999). Estimating Risk from Contaminants Contained in Agricultural Fertilizers, Draft Report, available on-line at http://www.epa.gov/osw/hazard/recycling/fertiliz/risk/report.pdf.
  • WHO (1994). Environmental Health Criteria Document No. 170 Assessing human health risks of chemicals: derivation of guidance values for health-based exposure limits, World Health Organization, Geneva.
  • WHO (1999). Environmental Health Criteria Document No. 210. Principles for the assessment of risks to human health from exposure to chemicals, World Health Organization, Geneva.
  • WHO (2004). Guidelines for Drinking-water Quality, 3rd ed., Volume 1: Recommendations, World Health Organization, Geneva.
  • WHO (2006). Guidelines for Drinking-water Quality, 1st Addendum to the 3rd ed., Volume 1: Recommendations, World Health Organization, Geneva.
  • Xenidis A., N. Papassiopi and K. Komnitsas (2003). Carbonate-rich mining tailings in Laurion: risk assessment and proposed rehabilitation schemes, Adv Environ Res 7, 479-494.