Disparity-tuned cells in primary visual cortex (V1) are thought
to play a significant role in the processing of stereoscopic
depth. The disparity-specific responses of these neurons have
previously been described by an energy model based on local,
feedforward interactions. This model fails to predict the response
to binocularly anticorrelated stimuli, in which the images presented
to the left and right eyes have opposite contrasts. The original energy
model predicts that anticorrelation should invert the disparity tuning
curve (phase difference π), with no change in the amplitude of the
response.
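As a minimal numerical illustration of this prediction, the sketch
below implements a standard energy-model complex cell responding to
1-D random-dot stimuli (the Gabor filter parameters, stimulus
statistics, and quadrature-pair construction are illustrative
assumptions, not values taken from any fitted neuron):

```python
import numpy as np

x = np.linspace(-2.0, 2.0, 401)   # 1-D retinal coordinate, arbitrary units
dx = x[1] - x[0]

def gabor(x, phase, sigma=0.4, freq=1.5):
    # Gabor receptive field: Gaussian envelope times sinusoidal carrier.
    return np.exp(-x**2 / (2 * sigma**2)) * np.cos(2 * np.pi * freq * x + phase)

def energy_model(left, right):
    # Standard energy model: a binocular simple cell sums identical
    # left- and right-eye filter outputs; squaring and summing over a
    # quadrature pair gives the complex-cell response.
    resp = 0.0
    for phase in (0.0, np.pi / 2):
        f = gabor(x, phase)
        s = dx * (f @ left + f @ right)   # binocular simple cell
        resp += s ** 2                    # energy computation
    return resp

rng = np.random.default_rng(0)
disparities = np.linspace(-1.0, 1.0, 21)
tuning_corr = np.zeros_like(disparities)
tuning_anti = np.zeros_like(disparities)

for i, d in enumerate(disparities):
    for _ in range(500):                           # average over dot patterns
        dots = rng.standard_normal(len(x))         # random-dot image
        right = np.roll(dots, int(round(d / dx)))  # disparity = shift
        tuning_corr[i] += energy_model(dots, right)
        tuning_anti[i] += energy_model(dots, -right)  # anticorrelated
```

All of the disparity dependence sits in the binocular cross term
2*(f@L)*(f@R); inverting one eye's contrast negates that term, so
tuning_anti equals tuning_corr inverted about its baseline, with
unchanged amplitude (a phase difference of π).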
Experimentally, the amplitude tends to be reduced with
anticorrelated stimuli, and a spread of phase differences is
observed, although phase differences near π are the most
common. These experimental observations could reflect
a modulation of the V1 signals by feedback from higher visual
areas (because anticorrelated stimuli create a weaker or
nonexistent stereoscopic depth sensation). This hypothesis could
explain the effects on amplitude, but the spread of phase
differences is harder to understand. Here, we demonstrate that
changes in both amplitude and phase can be explained by a
straightforward modification of the energy model that involves
only local processing. Input from each eye first passes through
a monocular simple cell that incorporates a threshold; the
thresholded monocular signals are then combined at a binocular
simple cell that feeds into the energy computation. Since this
local feedforward model can explain
the responses of complex cells to both correlated and
anticorrelated stimuli, there is no need to invoke any influence
of global stereoscopic matching.
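Continuing the sketch above (reusing gabor, x, and dx), the proposed
modification can be written as an output threshold on each monocular
simple cell before binocular combination; the threshold level theta
is a hypothetical free parameter, not a value from the paper:

```python
def modified_energy_model(left, right, theta=0.05):
    # Modified model (sketch): each eye's input is first passed
    # through a monocular simple cell with an output threshold;
    # the thresholded monocular signals are then combined at a
    # binocular simple cell whose squared output, summed over a
    # quadrature pair, gives the complex-cell response.
    resp = 0.0
    for phase in (0.0, np.pi / 2):
        f = gabor(x, phase)
        mono_left = max(dx * (f @ left) - theta, 0.0)   # thresholded
        mono_right = max(dx * (f @ right) - theta, 0.0) # monocular cells
        resp += (mono_left + mono_right) ** 2           # energy computation
    return resp
```

Because the rectified monocular signals can never go negative,
anticorrelation no longer simply flips the sign of the binocular
cross term. Running the random-dot simulation above with
modified_energy_model in place of energy_model shows the qualitative
effect; the size of the amplitude reduction, and any deviation of
the phase difference from π, depend on the assumed threshold and
filter parameters.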