Projected changes to rainfall patterns may exacerbate the existing risks posed by flooding. Furthermore, increased surface runoff from agricultural land increases pollution through nutrient losses. Agricultural systems are complex because they are managed at the level of individual fields, and it is impractical to monitor water fluxes in every field. In this respect, modelling provides an inexpensive tool for simulating fluxes. At the field scale, a daily time-step is used routinely; however, it was hypothesised that a finer time-step would identify peak fluxes more accurately. To investigate this, the process-based SPACSYS model, which simulates water fluxes, soil carbon and nitrogen cycling, and plant growth at a daily time-step, was adapted to provide sub-daily simulations. As a case study, the simulated water fluxes were evaluated against a 15-minute measured water flux dataset (April 2013 to February 2016) from a pasture on a monitored grassland research farm, where the measured data were up-scaled to hourly, 6-hourly and daily resolutions. Model performance was analysed: (a) at each of the four data resolutions separately (15-minute measured versus 15-minute simulated, hourly measured versus hourly simulated, etc.); and (b) at the daily resolution only, where the 15-minute, hourly and 6-hourly simulations were each aggregated to the daily scale. Comparison between measured and simulated fluxes at the four resolutions revealed that the hourly simulations gave the smallest misclassification rate for identifying water flux peaks. Furthermore, aggregating either the 15-minute or the hourly simulations to the daily scale increased accuracy, both in the prediction of general trends and in the identification of peak fluxes.
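The up-scaling step described above (15-minute fluxes aggregated to hourly, 6-hourly and daily resolutions) can be sketched as a simple time-bucket aggregation. The code below is an illustrative assumption, not the study's implementation: the function name, the data layout as (timestamp, flux) pairs, and summation as the aggregation rule are all hypothetical.

```python
from datetime import datetime, timedelta

def upscale(series, step_minutes):
    """Aggregate (timestamp, flux) pairs sampled at 15-minute intervals
    into coarser buckets of `step_minutes` by summing within each bucket.
    Summing is an assumed rule for illustration; a mean or other statistic
    could equally apply depending on the flux units."""
    buckets = {}
    for ts, flux in series:
        # Floor each timestamp to the start of its bucket within the day.
        mins = (ts.hour * 60 + ts.minute) // step_minutes * step_minutes
        key = ts.replace(hour=mins // 60, minute=mins % 60,
                         second=0, microsecond=0)
        buckets[key] = buckets.get(key, 0.0) + flux
    return sorted(buckets.items())

# Four 15-minute readings spanning one hour collapse to one hourly value.
t0 = datetime(2013, 4, 1, 9, 0)
readings = [(t0 + timedelta(minutes=15 * i), 0.5) for i in range(4)]
hourly = upscale(readings, 60)   # one bucket at 09:00 with flux 2.0
daily = upscale(readings, 1440)  # one bucket at 00:00 with flux 2.0
```

The same routine serves all three target resolutions by varying `step_minutes` (60, 360, 1440).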
For the latter investigation, the improved identification of extremes resulted in 9 of 11 peak flow events being correctly identified with only 2 false positives, compared with 5 peaks identified and 4 false positives for the standard daily simulations. Increased peak flow detection accuracy has the potential to provide clear field management benefits by reducing nutrient losses to water.
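The event-level comparison above (correctly identified peaks versus false positives) can be sketched as a set comparison between measured and simulated peak-event dates. This is a hedged illustration only: exact date matching as the hit criterion, the function name, and the example dates are assumptions, not the paper's evaluation procedure.

```python
def score_peaks(measured_peaks, simulated_peaks):
    """Compare peak-flow event dates: events present in both series count
    as hits, simulated-only events as false positives, and measured-only
    events as misses. Exact date equality is an assumed matching rule."""
    measured, simulated = set(measured_peaks), set(simulated_peaks)
    hits = len(measured & simulated)
    false_positives = len(simulated - measured)
    misses = len(measured - simulated)
    return hits, false_positives, misses

# Hypothetical event dates (ISO strings stand in for dates).
measured = ["2014-01-03", "2014-02-10", "2015-11-20"]
simulated = ["2014-01-03", "2014-02-10", "2015-12-01"]
hits, fps, misses = score_peaks(measured, simulated)
# 2 hits, 1 false positive, 1 miss
```

Under this scheme, the reported daily-aggregated result would correspond to 9 hits and 2 false positives out of 11 measured events, versus 5 hits and 4 false positives for the standard daily simulations.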
This is the author’s version of a work that was accepted for publication in Agricultural Water Management. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Agricultural Water Management, 255, 2021, DOI: 10.1016/j.agwat.2021.107049.