Lameness is a major problem in dairy production. It is commonly detected by assigning locomotion scores to cows under farm conditions, whereas raters are often trained and assessed for reliability and agreement using video recordings. The aim of this study was to evaluate the intra- and inter-rater reliability and agreement of experienced and inexperienced raters for locomotion scoring performed live and from video, and to quantify the influence of the rater and the method of observation (live or video) on the probability of classifying a cow as lame. Using a five-level locomotion score, cows were scored twice live and twice from video by three experienced and two inexperienced raters over three weeks, with different cows scored each week. Intra- and inter-rater reliability (expressed as weighted kappa, kw) and agreement (expressed as percentage of agreement, PA) were determined for live/live, live/video and video/video comparisons. A logistic regression was fitted to estimate the influence of the rater and the method of observation on the probability of classifying a cow as lame. Experienced raters had higher intra-rater reliability and agreement for the video/video comparison than for the live/live and live/video comparisons. Inexperienced raters, in contrast, showed no differences in intra- and inter-rater reliability and agreement among the live/live, live/video and video/video comparisons. The logistic regression indicated that the rater accounted for the main effect, whereas the method of observation (live or from video) had only a minor effect on the probability of classifying a cow as lame (locomotion score ≥ 3). In conclusion, under the present experimental conditions, experienced raters performed better than inexperienced raters when locomotion scoring was done from video. Because the method of observation had no important influence on the probability of classifying a cow as lame, video observation appears to be an acceptable method for locomotion scoring and lameness assessment in dairy cows.
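
As a minimal sketch of the reliability and agreement metrics named above, the following Python code computes a weighted kappa (kw) and the percentage of agreement (PA) for paired five-level locomotion scores. The linear weighting scheme and the example scores are illustrative assumptions, since the abstract does not specify the weighting used.

```python
# Sketch of the reliability (kw) and agreement (PA) metrics; the linear
# weighting and the example scores are illustrative assumptions.
import numpy as np
from sklearn.metrics import cohen_kappa_score

def reliability_metrics(scores_a, scores_b, weights="linear"):
    """Return (weighted kappa, percentage of agreement) for paired scores."""
    a, b = np.asarray(scores_a), np.asarray(scores_b)
    kw = cohen_kappa_score(a, b, weights=weights)  # chance-corrected reliability
    pa = 100.0 * np.mean(a == b)                   # exact agreement, in percent
    return kw, pa

# Hypothetical intra-rater example: the same eight cows scored twice.
kw, pa = reliability_metrics([1, 2, 3, 3, 4, 2, 5, 1],
                             [1, 2, 3, 4, 4, 2, 5, 2])
print(f"kw = {kw:.2f}, PA = {pa:.0f}%")
```

The same function applies to inter-rater comparisons (two raters, one session) and to the live/video comparison (one rater, two methods), which is how the live/live, live/video and video/video pairings above can all be evaluated with a single metric.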
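
The logistic regression step can be sketched in the same spirit. The long-format layout and the column names (score, rater, method, lame) are hypothetical, and the toy data exist only to make the example run; "lame" is coded as 1 when the locomotion score is ≥ 3, matching the threshold used in the study.

```python
# Sketch of the logistic regression on rater and observation method;
# column names and data are hypothetical assumptions for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "score":  [1, 3, 4, 2, 3, 5, 2, 4, 1, 3, 2, 4],
    "rater":  ["A", "A", "B", "B", "C", "C", "A", "B", "C", "A", "B", "C"],
    "method": ["live", "video"] * 6,
})
df["lame"] = (df["score"] >= 3).astype(int)  # lame = locomotion score >= 3

# Probability of classifying a cow as lame, modelled from rater and method.
model = smf.logit("lame ~ C(rater) + C(method)", data=df).fit(disp=0)
print(model.summary())
```

Comparing the magnitude of the rater coefficients with the method coefficient in such a model is one way to express the study's finding that the rater, not the observation method, drives the classification probability.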