Access to vast datasets of visual and textual materials has become significantly easier, yet how to leverage this readily available data to support creative design activities remains a challenge. In the idea-generation phase, visual analogy is considered an effective strategy for stimulating designers to create innovative ideas. Designers can extract useful information from vague and incomplete conceptual visual representations, or stimuli, to arrive at potential visual analogies. In this paper, a computational framework is proposed to search for and retrieve visual stimulation cues, with the potential to help designers generate more creative ideas by avoiding visual fixation. The research problems include detecting visual similarities between visual representations from different categories and quantifying those similarities so that they can serve as a distance metric for visual stimuli search and retrieval. A deep neural network model is developed to learn a latent space that can reveal visual relationships between multiple categories of sketches. In addition, a top-cluster-detection-based method is proposed to quantify visual similarity from the magnitude of cluster overlap in the latent space and thereby rank categories effectively. The QuickDraw sketch dataset serves as the backend for evaluating the functionality of the proposed framework. Beyond visual stimuli retrieval, this research opens up new opportunities for utilizing widely available visual data as creative material to benefit design-by-analogy.
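To make the retrieval idea concrete, the following is a minimal, hypothetical sketch (not the paper's implementation) of ranking candidate sketch categories against a query category by a cluster-overlap similarity computed in a learned latent space. The embeddings, the `cluster_overlap_similarity` function, and the specific overlap measure are illustrative assumptions; random vectors stand in for encoder outputs here.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_overlap_similarity(emb_a, emb_b, n_clusters=10, top_k=3, seed=0):
    """Assumed overlap measure: score in [0, 1] given by the fraction of
    samples from both categories that fall into their most-shared clusters."""
    X = np.vstack([emb_a, emb_b])
    labels = KMeans(n_clusters=n_clusters, random_state=seed, n_init=10).fit_predict(X)
    la, lb = labels[: len(emb_a)], labels[len(emb_a):]
    # Per-cluster occupancy (as a fraction) for each category.
    ca = np.bincount(la, minlength=n_clusters) / len(la)
    cb = np.bincount(lb, minlength=n_clusters) / len(lb)
    shared = np.minimum(ca, cb)             # overlap contributed by each cluster
    top = np.argsort(shared)[::-1][:top_k]  # the top shared ("detected") clusters
    return float(shared[top].sum())

# Usage: rank hypothetical candidate categories by similarity to a query category.
rng = np.random.default_rng(0)
query = rng.normal(0.0, 1.0, size=(200, 32))           # stand-in for encoded sketches
candidates = {
    "motorbike": rng.normal(0.3, 1.0, size=(200, 32)),  # nearby in latent space
    "cloud": rng.normal(3.0, 1.0, size=(200, 32)),      # far away in latent space
}
ranked = sorted(candidates,
                key=lambda c: cluster_overlap_similarity(query, candidates[c]),
                reverse=True)
print(ranked)  # categories with greater latent overlap rank first
```

In this toy setup, categories whose embeddings occupy the same latent clusters as the query receive higher scores, which is one plausible way such a ranking could drive stimuli retrieval across categories.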