A rationale is proposed for choosing the scaling constant used to approximate the normal distribution with a logistic distribution: the constant is selected to minimize the Kullback-Leibler (KL) information, that is, the expected amount of information available in a sample to distinguish between the two competing distributions via a likelihood ratio (LR) test, assuming one of them is true. The new constant, 1.749, computed assuming the normal distribution is true, yields an approximation with a better fit in the tails of the distribution than the minimax constant of 1.702 that is widely used in item response theory (IRT). The minimax constant is, by definition, marginally better with respect to the overall maximum error. It is argued that the KL constant is more statistically appropriate for use in IRT.
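The following is a minimal numerical sketch, not the paper's own code, illustrating the two criteria described above: it recovers the KL-minimizing scaling constant (expected to be close to 1.749 when the normal is taken as true) and the minimax constant (expected to be close to 1.702), and compares their maximum CDF errors. Function names and numerical settings are illustrative assumptions.

```python
# Illustrative sketch: recover the KL and minimax scaling constants numerically.
import numpy as np
from scipy import integrate, optimize, stats


def kl_normal_vs_logistic(k: float) -> float:
    """KL information D(phi || f_k), where phi is the standard normal density and
    f_k(x) = k * exp(-k*x) / (1 + exp(-k*x))**2 is the logistic density whose CDF
    is 1 / (1 + exp(-k*x))."""
    def integrand(x):
        log_q = np.log(k) - k * x - 2.0 * np.logaddexp(0.0, -k * x)
        return stats.norm.pdf(x) * (stats.norm.logpdf(x) - log_q)
    val, _ = integrate.quad(integrand, -np.inf, np.inf)
    return val


def max_cdf_error(k: float) -> float:
    """Maximum absolute difference between the normal CDF and the scaled logistic CDF."""
    x = np.linspace(-8.0, 8.0, 20001)
    return np.max(np.abs(stats.norm.cdf(x) - 1.0 / (1.0 + np.exp(-k * x))))


# Constant minimizing the KL information, with the normal taken as the true distribution.
kl_res = optimize.minimize_scalar(kl_normal_vs_logistic, bounds=(1.5, 2.0), method="bounded")
# Constant minimizing the maximum CDF error (the minimax criterion).
mm_res = optimize.minimize_scalar(max_cdf_error, bounds=(1.5, 2.0), method="bounded")

print(f"KL constant      : {kl_res.x:.3f}")   # expected ~1.749
print(f"minimax constant : {mm_res.x:.3f}")   # expected ~1.702
print(f"max CDF error at 1.702: {max_cdf_error(1.702):.5f}")
print(f"max CDF error at 1.749: {max_cdf_error(1.749):.5f}")
```

Under this sketch, the minimax constant gives the smaller maximum CDF error, while the KL constant trades a slightly larger maximum error for a closer fit in the tails, consistent with the comparison summarized above.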