This paper provides a quasi-likelihood (minimum-contrast-type) method for the parameter estimation of random fields in the frequency domain based on higher-order information. The estimation technique uses spectral densities of general kth order and allows for possible long-range dependence in the random fields. To avoid bias due to edge effects, data tapering is incorporated into the method. The proposed minimum contrast functional is linear in the kth-order periodogram, so kernel estimation of the spectral densities is not needed. Furthermore, no discretization is required when continuously observed random fields are estimated. Consistency and asymptotic normality of the resulting estimators are established. Illustrative applications of the method to problems in mathematical finance and signal detection are given.
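To convey the flavour of the approach, the sketch below illustrates a generic Whittle-type minimum contrast fit in the simplest second-order (k = 2) case for a one-dimensional record, using a tapered periodogram and a contrast that is linear in the periodogram; the taper, the hypothetical parametric model `f_theta`, and the contrast form are illustrative assumptions and are not the paper's kth-order functional or its continuous-observation setting.

```python
# A minimal illustrative sketch (not the paper's exact functional): a Whittle-type
# minimum contrast fit of a second-order (k = 2) spectral density to a tapered
# periodogram of a one-dimensional record.  The taper, the parametric model
# f_theta, and the contrast below are placeholders chosen for illustration.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
n = 2048
x = rng.standard_normal(n)                      # stand-in for an observed record
taper = np.hanning(n)                           # data taper to reduce edge-effect bias
xt = taper * (x - x.mean())

# Tapered periodogram at Fourier frequencies (normalised by the taper's energy)
freqs = np.fft.rfftfreq(n)[1:]                  # drop the zero frequency
I = np.abs(np.fft.rfft(xt)[1:]) ** 2 / (2 * np.pi * np.sum(taper ** 2))

def f_theta(lam, theta):
    """Hypothetical parametric spectral density with a pole at the origin,
    mimicking long-range dependence for theta in (0, 1)."""
    return np.abs(2 * np.sin(np.pi * lam)) ** (-theta) / (2 * np.pi)

def contrast(theta):
    """Whittle-type contrast: linear in the periodogram I, so no kernel
    smoothing of a spectral density estimate is required."""
    f = f_theta(freqs, theta)
    return np.mean(np.log(f) + I / f)

est = minimize_scalar(contrast, bounds=(0.01, 0.99), method="bounded")
print("estimated long-memory parameter:", est.x)
```

The contrast's linearity in the periodogram is what makes smoothing unnecessary: averaging over frequencies plays the role that kernel estimation of the spectral density would otherwise play.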