A simple catchment-scale model simulating diffuse phosphorus (P) loss from agricultural land to water, the Phosphorus Indicators Tool (PIT), has been developed. Previous research has shown that the model performed well in simulating average annual P loss from two catchments, Windermere and Windrush, but it was not known which drivers in the model exerted the greatest control on predicted P delivery to water from agricultural land. The model code uses 108 coefficients to simulate P export from each catchment source individually, via each specified hydrological pathway. A univariate sensitivity analysis was conducted to evaluate which coefficients exerted the greatest control on the model output. Results from the univariate analysis suggest that the model is sensitive to a number of coefficients; importantly, however, not all of the coefficients varied in the sensitivity analysis altered the model output. The PIT model was calibrated by optimizing results from the univariate analysis against observed data in the Windermere catchment, and the simulated results fit the observed data well at the 95% confidence level. This paper describes the methodology developed for the univariate analysis and evaluates the model calibration procedure against observed data from the Windermere catchment.
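The univariate (one-at-a-time) analysis described above can be sketched as follows. This is a minimal illustrative example only: the model function, coefficient names, and perturbation size are hypothetical stand-ins, not the PIT model code or its 108 actual coefficients.

```python
def model_output(coeffs):
    """Toy P-export model: a weighted sum of pathway coefficients.
    This is a hypothetical stand-in for the PIT model equations."""
    return 2.0 * coeffs["surface"] + 0.5 * coeffs["drainflow"] + 0.1 * coeffs["leaching"]

# Baseline coefficient values (hypothetical).
baseline = {"surface": 1.0, "drainflow": 1.0, "leaching": 1.0}
base_out = model_output(baseline)

# Vary each coefficient by +10% in turn, holding all others at baseline,
# and record the relative change in predicted P export.
sensitivity = {}
for name in baseline:
    perturbed = dict(baseline)
    perturbed[name] = baseline[name] * 1.10
    sensitivity[name] = abs(model_output(perturbed) - base_out) / base_out

# Rank coefficients by their influence on the output; coefficients with a
# near-zero index do not alter the model output, mirroring the finding above.
ranked = sorted(sensitivity, key=sensitivity.get, reverse=True)
```

In this toy setup the "surface" coefficient dominates the ranking because it carries the largest weight in the output, which is the kind of ordering a univariate analysis is designed to reveal.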