Field experiments were conducted from 1996 to 2000 near Manhattan, KS, to determine the effects of application timing on atrazine loss in surface water runoff. In addition, the Groundwater Loading Effects of Agricultural Management Systems (GLEAMS) model was run to compare simulated losses with actual losses measured in the field. Atrazine treatments were fall plus preemergence (FALL + PRE), early preplant plus PRE (EPP + PRE), PRE at a low rate (PRE-LOW), and PRE at a full (recommended) rate (PRE-FULL). Ridge-till furrows served as mini-watersheds for the collection of surface water runoff. Water runoff volumes and herbicide concentrations were determined for each runoff event. Across four sampling years, mean atrazine runoff loss was 1.7, 4.3, and 1.7% of the amount applied for FALL + PRE, EPP + PRE, and the mean of the PRE treatments, respectively. Actual average losses from the FALL + PRE and EPP + PRE treatments were somewhat higher than those predicted by GLEAMS. For the PRE treatments, actual average losses were significantly lower than those predicted by GLEAMS, with measured losses in 3 of 4 yr falling below the range shown in the GLEAMS comparison graph. These findings suggest that, in certain parts of the Great Plains, FALL + PRE split applications of atrazine offer acceptably low runoff loss potential; that EPP + PRE is more vulnerable to loss than FALL + PRE; and that the GLEAMS model may overestimate atrazine runoff potential for PRE applications.
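The loss values above are expressed as a percentage of the atrazine applied, computed from per-event runoff volumes and herbicide concentrations. A minimal sketch of that calculation is shown below; the function name, units, and event data are illustrative assumptions, not values or code from the study.

```python
def percent_of_applied(events, applied_g):
    """Seasonal atrazine runoff loss as a percentage of the amount applied.

    events: list of (runoff_volume_L, atrazine_conc_ug_per_L) tuples,
            one per runoff event on the plot.
    applied_g: atrazine applied to the plot, in grams.
    """
    # Mass lost per event = volume x concentration; sum over all events.
    lost_ug = sum(volume_l * conc_ug_per_l for volume_l, conc_ug_per_l in events)
    lost_g = lost_ug / 1e6  # micrograms to grams
    return 100.0 * lost_g / applied_g


# Hypothetical example: three runoff events on a plot that received 50 g of atrazine.
events = [(1200.0, 350.0), (800.0, 120.0), (2500.0, 40.0)]
print(f"Loss: {percent_of_applied(events, applied_g=50.0):.2f}% of applied")
# -> Loss: 1.23% of applied
```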