This study examines the analytical methods used to test drinking water for atrazine and the seasonal variation of atrazine in drinking water. Samples from 117 counties throughout Kentucky from January 2000 to December 2008 were analyzed. EPA Methods 507 and 508.1 were compared using the Mann-Whitney U test; median values from the two methods were similar (p = .7421). To examine seasonal variation, data from each year and from the entire period were analyzed using one-way ANOVA, and pairwise multiple comparisons were made for years with significant differences. All years except 2001, 2005, 2006, and 2007 showed significantly different atrazine concentrations between seasons. The Seasonal Kendall Test for Trend was used to identify trends in atrazine concentration over time. Yearly means ranged from 0.000043 mg/L (± 0.000011 mg/L) to 0.000995 mg/L (± 0.000510 mg/L). The highest levels were observed during spring in most years. A significant (p = .000092) decreasing trend of -7.6 × 10^-6 mg/L/year was found, and decreasing trends were also present in all five regions of the state during this period. This study illustrates the need for changes in current sampling methodology so that effective assessments of the public's exposure to atrazine in drinking water can be conducted.
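The two hypothesis tests named above can be sketched as follows. This is a minimal illustration using `scipy.stats`, not the study's analysis pipeline; the concentration arrays are hypothetical placeholders, not the Kentucky measurements.

```python
# Sketch of the abstract's two tests: a Mann-Whitney U comparison of the
# two analytical methods, and a one-way ANOVA across seasons.
# All concentration values (mg/L) below are illustrative placeholders.
from scipy.stats import mannwhitneyu, f_oneway

# Atrazine results as reported by each analytical method (hypothetical).
method_507  = [0.00004, 0.00009, 0.00012, 0.00031, 0.00007, 0.00015]
method_5081 = [0.00005, 0.00008, 0.00014, 0.00028, 0.00006, 0.00016]

# Mann-Whitney U: nonparametric test of whether the two methods'
# result distributions differ; a large p suggests comparable medians.
u_stat, p_methods = mannwhitneyu(method_507, method_5081,
                                 alternative="two-sided")

# One-way ANOVA across the four seasons within a single year
# (spring set highest, mirroring the pattern the abstract reports).
winter = [0.00004, 0.00005, 0.00006]
spring = [0.00060, 0.00090, 0.00075]
summer = [0.00020, 0.00025, 0.00018]
fall   = [0.00008, 0.00010, 0.00009]
f_stat, p_seasons = f_oneway(winter, spring, summer, fall)

print(f"Mann-Whitney U p = {p_methods:.3f}")
print(f"Seasonal ANOVA p = {p_seasons:.6f}")
```

With data like these, the method comparison yields a non-significant p while the seasonal ANOVA yields a significant one, matching the qualitative pattern described in the abstract.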