Simulating the distributions of financial assets is an important task for financial institutions. When risk measures are evaluated on a simulated distribution instead of the model-implied distribution, the resulting measurement errors need to be analyzed. For distribution-invariant risk measures that are continuous on compacts, we employ the theory of large deviations to study the probability of large errors. If the approximate risk measurements are based on the empirical distribution of independent samples, the rate function equals the minimal relative entropy under a risk measure constraint. We solve this minimization problem explicitly for shortfall risk and average value at risk.
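
As an orientation for the reader, the rate-function statement can be sketched schematically as follows; the notation ($\hat{\mu}_n$ for the empirical distribution of $n$ independent samples, $\mu$ for the model-implied distribution, $\rho$ for the distribution-invariant risk measure, $H(\cdot\,\|\,\cdot)$ for relative entropy) and the one-sided form of the error event are illustrative assumptions rather than taken verbatim from the paper:
\[
  \mathbb{P}\bigl(\rho(\hat{\mu}_n) \ge x\bigr) \approx e^{-n\, I(x)},
  \qquad
  I(x) = \inf\bigl\{\, H(\nu \,\|\, \mu) \;:\; \rho(\nu) \ge x \,\bigr\},
\]
so that, heuristically, the probability of a large simulation error decays exponentially in the sample size, with the rate given by the minimal relative entropy under the risk measure constraint.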