Online water quality sensors produce large amounts of measurement results, typically one reading every 1–15 minutes. This data is of unknown quality and needs to be validated, but it is difficult for operators to manually evaluate the volume of data collected, especially when a network has a large number of sensors.


After data cleansing, the next step is to analyse the information contained in the data. Automated software routines, often referred to as Big Data Analytics, are crucial for the detection and classification of events, such as recurring, operationally induced changes (e.g. pump and valve switching, changing water source, diurnal and seasonal changes, consumption patterns), changes induced by operational problems (e.g. pipe breaks, leaks, treatment upsets), and water quality issues (e.g. accidental or intentional water contamination). With faster data processing software, an operator can evaluate and respond to unusual events and changes more efficiently.
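As a minimal sketch of what such an event-detection routine can look like, the baseline approach below flags readings that deviate from a trailing rolling average by more than a few standard deviations. The window size, threshold, and chlorine-like signal values are illustrative assumptions, not parameters from any specific product; commercial packages typically use more sophisticated methods.

```python
# Baseline event detection: flag a reading as an event if it deviates from
# the trailing rolling mean by more than k rolling standard deviations.
# window and k are illustrative defaults, not values from a real product.
from statistics import mean, stdev

def detect_events(readings, window=12, k=3.0):
    """Return the indices of readings that deviate from the trailing
    rolling mean by more than k rolling standard deviations."""
    events = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu = mean(baseline)
        sigma = stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) > k * sigma:
            events.append(i)
    return events

# Hypothetical example: a stable chlorine residual with one sudden drop.
signal = [0.80, 0.81, 0.79, 0.80, 0.82, 0.80, 0.79, 0.81, 0.80, 0.80,
          0.81, 0.79, 0.80, 0.81, 0.30, 0.80, 0.79]
print(detect_events(signal))  # → [14], the index of the sudden drop
```

A real deployment would also have to suppress alarms during the recurring operational changes mentioned above (pump switching, diurnal patterns), which is where classification on top of raw detection becomes important.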




Data processing software performs data validation and information extraction on quantities of data that are too large to process manually. It also enables real-time operational control and proactive asset management. Furthermore, data processing software can be used not only to evaluate the response from individual sensors but also to recognise patterns across combinations of sensors, as well as across multiple installation sites in the network. This may allow for localisation of events or sources of contaminants.
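The localisation idea can be illustrated with a simple sketch: if the same event is flagged at several monitoring sites, ordering the sites by the time their sensors first alarmed suggests which site is nearest the source, while sites that never alarm are likely upstream of it. The site names, alarm times, and function below are hypothetical, and real localisation would also use the network hydraulic model.

```python
# Hypothetical sketch: combine alarm times from several monitoring sites to
# suggest where a contaminant entered the network. The earliest-alarming
# site is assumed to be closest to the source; site names and times are
# illustrative, not taken from any real deployment.

def rank_sites_by_first_alarm(alarms):
    """alarms: dict mapping site name -> list of alarm times (minutes).
    Returns site names ordered by their earliest alarm; sites with no
    alarms are excluded (the event presumably never reached them)."""
    first = {site: min(times) for site, times in alarms.items() if times}
    return sorted(first, key=first.get)

alarms = {
    "site_A": [],        # no alarms: likely upstream of the source
    "site_B": [42, 57],  # first to alarm: nearest the source
    "site_C": [63],      # alarms later: further downstream
}
print(rank_sites_by_first_alarm(alarms))  # → ['site_B', 'site_C']
```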




Various types of data processing software are available:


  • Software from sensor manufacturers, focused on validation and interpretation of the data from those vendors' sensors

  • Specialised data processing software, available as commercial packages for deployment as a central data processing tool in a utility

  • Third-party software focused on event detection

  • Big Data analytics used by other sectors (e.g. the credit card industry)