Previous studies of historical risk have used either nominal or real data to calculate risk measures for agricultural prices and income, but the consequences of choosing nominal versus real data have not been evaluated. This study uses theoretical variance approximation relationships to compare variances computed from detrended real and nominal time series. The relationships between the variances are derived and then applied to quarterly U.S. farm milk prices for 1960-72, 1973-80, and 1981-90. Contrary to common intuitive arguments, the results indicate that the variance of a real series can exceed the variance of the corresponding nominal series. While definitive conclusions are not possible, several reasons for preferring nominal data in risk analysis are given.
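
As a plausible illustration of the kind of approximation relationship the study refers to, consider a real series defined as the nominal series divided by a deflator, R_t = N_t / D_t. A first-order (delta-method) expansion about the means then gives an approximate variance for the real series; the symbols N, D, R and this particular expansion are assumptions for exposition, not expressions reproduced from the paper:

\[
\operatorname{Var}(R) \;\approx\;
\frac{\bar{N}^{2}}{\bar{D}^{2}}
\left[
\frac{\operatorname{Var}(N)}{\bar{N}^{2}}
+ \frac{\operatorname{Var}(D)}{\bar{D}^{2}}
- \frac{2\,\operatorname{Cov}(N,D)}{\bar{N}\,\bar{D}}
\right].
\]

Under this sketch, the real-series variance can exceed the nominal-series variance whenever the deflator's own variability is large relative to its covariance with the nominal series, which is consistent with the finding summarized above.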