Simulation of the equilibrium delay distribution for a system with finite-source input, constant service time, and s > 1 servers showed that there exist values of t for which the probability of a delay greater than t is a non-monotone function of the input intensity. Since repeated efforts to remove this counter-intuitive result by debugging failed, a simple case was attacked by analytical and numerical methods, and the same anomaly was revealed. The method of analysis is described, and an explanation of the result is offered.
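
As a concrete illustration of the setting, the following is a minimal discrete-event sketch of such a system, assuming exponentially distributed source "think" times (a standard finite-source model); the function name sim_delay_tail and all parameter values are illustrative, and the sketch shows only how P(delay > t) can be estimated across input intensities, not how to reproduce the reported anomaly at these particular parameters.

```python
import heapq
import random

def sim_delay_tail(n_sources, s, D, lam, t,
                   n_customers=200_000, n_warmup=20_000, seed=1):
    """Estimate P(delay > t) for a finite-source, s-server FIFO queue
    with constant service time D.  Each source "thinks" for an
    Exp(lam) time after its previous request completes, then submits
    its next request (assumed model; all parameters illustrative)."""
    rng = random.Random(seed)
    # (next arrival time, source id) for sources currently thinking
    arrivals = [(rng.expovariate(lam), i) for i in range(n_sources)]
    heapq.heapify(arrivals)
    free = [0.0] * s                   # server free times; already a valid min-heap
    exceed = counted = 0
    for n in range(n_warmup + n_customers):
        a, i = heapq.heappop(arrivals)  # earliest pending arrival
        f = heapq.heappop(free)         # earliest-available server
        start = max(a, f)               # FCFS multi-server waiting recursion
        if n >= n_warmup:               # discard the initial transient
            counted += 1
            if start - a > t:
                exceed += 1
        done = start + D                # constant (deterministic) service
        heapq.heappush(free, done)
        # the source begins thinking again only after its request completes
        heapq.heappush(arrivals, (done + rng.expovariate(lam), i))
    return exceed / counted

if __name__ == "__main__":
    # Sweep the input intensity lam and inspect the tail probability;
    # for suitable t the estimate need not grow monotonically in lam.
    for lam in (0.05, 0.1, 0.2, 0.4, 0.8):
        p = sim_delay_tail(n_sources=10, s=2, D=1.0, lam=lam, t=1.5)
        print(f"lam = {lam:4.2f}   P(delay > 1.5) ~ {p:.4f}")
```

Processing arrivals in chronological order and assigning each to the earliest-free server realizes the standard FCFS multi-server waiting-time recursion, which is what makes this short event loop sufficient; no explicit queue data structure is needed to obtain the delay of each customer.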