How should we think about the ways search engines can go wrong? Following the publication of Safiya Noble's Algorithms of Oppression (Noble, 2018), a view has emerged on which racist, sexist, and other problematic results should be understood as indicative of algorithmic bias. In this paper, I offer an alternative perspective on these results, building on Noble's suggestion that search engines are complicit in a racial contract (Mills, 1997). I argue that racist and sexist results should instead be thought of as part of the workings of the social system of white ignorance. Along the way, I argue that we should treat search engines not as sources of testimony but as information-classification systems, and I make a preliminary case for the importance of the social epistemology of technology.