Date Published: August 30, 2019
Publisher: Public Library of Science
Author(s): Catherine Leigh, Sevvandi Kandanaarachchi, James M. McGree, Rob J. Hyndman, Omar Alsibai, Kerrie Mengersen, Erin E. Peterson, Weili Duan.
Water-quality monitoring in rivers often focuses on the concentrations of sediments and nutrients, constituents that can smother biota and cause eutrophication. However, the physical and economic constraints of manual sampling prohibit data collection at the frequency required to adequately capture the variation in concentrations through time. Here, we developed models to predict total suspended solids (TSS) and oxidized nitrogen (NOx) concentrations based on high-frequency time series of turbidity, conductivity and river level data from in situ sensors in rivers flowing into the Great Barrier Reef lagoon. We fit generalized-linear mixed-effects models with continuous first-order autoregressive correlation structures to water-quality data collected by manual sampling at two freshwater sites and one estuarine site, and used the fitted models to predict TSS and NOx from the in situ sensor data. These models described the temporal autocorrelation in the data and handled observations collected at irregular frequencies, characteristics typical of water-quality monitoring data. Turbidity proved a useful and generalizable surrogate of TSS, with high predictive ability at both the estuarine and freshwater sites. Turbidity, conductivity and river level served as combined surrogates of NOx. However, the relationship between NOx and the covariates was more complex than that between TSS and turbidity, and consequently the ability to predict NOx was lower and less generalizable across sites than for TSS. Furthermore, prediction intervals tended to increase during events for both the TSS and NOx models, highlighting the need to include measures of uncertainty routinely in water-quality reporting. Our study also highlights that surrogate-based models used to predict sediments and nutrients need to better incorporate temporal components if variance estimates are to be unbiased and model inference meaningful.
The transferability of models across sites, and potentially regions, will become increasingly important as organizations move to automated sensing for water-quality monitoring throughout catchments.
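The core of the approach described above, regressing a constituent (e.g. log-TSS) on a sensor surrogate (e.g. log-turbidity) while accounting for continuous first-order autoregressive (CAR(1)) residual correlation between irregularly spaced samples, can be sketched with generalized least squares. This is a minimal illustration on simulated data, not the authors' code; the covariate values, the fixed correlation parameter `phi`, and the residual variance are all hypothetical (in practice these parameters would be estimated, e.g. by maximum likelihood):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 60

# Irregularly spaced sampling times (hours), mimicking manual grab samples
t = np.sort(rng.uniform(0, 100, n))

# Hypothetical covariate (e.g. log-turbidity) and a true linear relationship
x = rng.normal(0, 1, n)
X = np.column_stack([np.ones(n), x])
beta_true = np.array([1.0, 0.8])

# CAR(1) residual correlation: corr(e_i, e_j) = phi ** |t_i - t_j|,
# which handles irregular sampling intervals directly
phi = 0.9
D = np.abs(t[:, None] - t[None, :])
Sigma = 0.05 * phi**D  # residual covariance (variance 0.05, assumed)

# Simulate temporally correlated residuals via the Cholesky factor
eps = np.linalg.cholesky(Sigma) @ rng.normal(0, 1, n)
y = X @ beta_true + eps

# Generalized least squares with the known correlation structure:
# beta_hat = (X' Sigma^-1 X)^-1 X' Sigma^-1 y
Si = np.linalg.inv(Sigma)
XtSiX = X.T @ Si @ X
beta_hat = np.linalg.solve(XtSiX, X.T @ Si @ y)

# Point prediction at a new surrogate value, with a GLS standard error
# for the fitted mean (the basis of a prediction interval)
x_new = np.array([1.0, 0.5])
pred = x_new @ beta_hat
se = np.sqrt(x_new @ np.linalg.solve(XtSiX, x_new))
```

Ignoring the correlation (ordinary least squares) would leave the slope estimate roughly unbiased but understate its standard error, which is the inference problem the abstract flags for surrogate models that omit temporal components.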
Measuring the concentrations of sediments and nutrients in rivers, and understanding how they change through time, is a major focus of water-quality monitoring given the potential detrimental effects these constituents have on aquatic ecosystems. Such knowledge can help inform the effective management of our land, waterways and oceans, including World Heritage Areas such as the Great Barrier Reef in the Australian tropics [1,2,3]. In regions dominated by highly seasonal, event-driven climates, such as those in the tropics, high-magnitude wet-season flows can transport large quantities of sediments and nutrients from the land downstream in relatively short time frames. The rapidity of change in sediment and nutrient concentrations during high-flow events poses challenges for water-quality monitoring based on discrete manual sampling of water followed by laboratory measurement of concentrations, which is time consuming, costly and typically temporally sparse. Relatively low sampling frequency increases the chances of missing water-quality events; moreover, high flows may preclude the safety conditions required for manual sampling, and sample collection at the frequency required to capture change in concentrations may not always be physically or economically practical. The spatial sparsity of measurements from manual sampling is also problematic. For example, the Great Barrier Reef lagoon stretches over 3000 km of coastline, but the data currently used to validate estimates of sediments and nutrients flowing to the lagoon are collected from just 43 sites. This lack of data limits knowledge and understanding of sediment and nutrient concentrations in both space and time.