Published online by Cambridge University Press: 01 April 2022
A model of a Markov process is presented in which observing a system's present state bears asymmetrically on inferences about the system's past and its future. A likelihood inference about the system's past state, based on observing its present state, is justified no matter what the model's parameter values happen to be. By contrast, a probability inference about the system's future state, based on observing its present state, requires further information about those parameter values.
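The asymmetry described above can be loosely illustrated with a toy two-state Markov chain. This is a simplified sketch of my own, not the paper's actual model: assume the system switches states with probability u and stays put with probability 1 - u per time step, with u < 1/2. The comparative (likelihood) question about the past gets the same verdict whatever u is, whereas the quantitative (probability) forecast of the future is a number that depends on u.

```python
def transition_matrix(u):
    """Toy symmetric two-state chain (an illustrative assumption,
    not the paper's model): switch states with probability u,
    stay with probability 1 - u, per time step."""
    return [[1 - u, u],
            [u, 1 - u]]

def best_past_state(present, u):
    """Backward likelihood inference: compare P(present | past = i)
    across the candidate past states i and pick the maximum."""
    T = transition_matrix(u)
    likelihoods = [T[i][present] for i in range(2)]
    return max(range(2), key=lambda i: likelihoods[i])

def future_distribution(present, u):
    """Forward probability inference: P(future = j | present)."""
    T = transition_matrix(u)
    return T[present]

# For every switch probability u < 1/2, the backward comparison yields
# the same verdict: the past state that best explains the observation
# matches the present one.  The forward forecast, in contrast, is a
# probability whose value changes with u.
for u in (0.05, 0.2, 0.4):
    assert best_past_state(0, u) == 0
    print(u, future_distribution(0, u))
```

In this sketch, the ordinal ranking of past hypotheses is robust across parameter values, while any point prediction of the future requires knowing u itself, which is one way to gloss the contrast the abstract draws.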
I am grateful to Martin Barrett, Ellery Eells, Malcolm Forster, and an anonymous referee for valuable comments.