The entropy score of an observed outcome that has been given a probability forecast p is defined to be −log p. If p is derived from a probability model and there is a background model for which the same outcome has probability π, then the log ratio log(p/π) is the probability gain, and its expected value is the information gain, for that outcome. These concepts are closely related to the likelihood of the model and to its entropy rate. The relationships between them are explored in the case in which the outcomes in question are the occurrence or nonoccurrence of events in a stochastic point process. It is shown that, in this context, the mean information gain per unit time, based on forecasts made at arbitrary discrete time intervals, is bounded above by the entropy rate of the point process. Two examples illustrate how the information gain may be related to realizations with a range of values of ‘predictability’.
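For concreteness, the quantities above may be summarized as follows (a sketch restating the abstract's definitions; the symbols S, G, I, Ī, and H are notational conveniences introduced here, not taken from the paper):

\begin{align*}
  S(p)  &= -\log p                                   && \text{(entropy score of the observed outcome)} \\
  G     &= \log\frac{p}{\pi}                         && \text{(probability gain over the background model)} \\
  I     &= \mathbb{E}\!\left[\log\frac{p}{\pi}\right] && \text{(information gain: expected probability gain)} \\
  \bar{I} &\le H                                     && \text{(mean information gain per unit time bounded by the entropy rate } H\text{)}
\end{align*}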