Each year, people spend less time reading and more time viewing images, which are proliferating online. Images from platforms such as Google and Wikipedia are downloaded by millions every day, and millions more interact through social media platforms, such as Instagram and TikTok, that are built primarily around exchanging visual content. In parallel, news agencies and digital advertisers are increasingly capturing attention online through the use of images, which people process more quickly, more implicitly, and more memorably than text.
Deep neural networks (DNNs) are increasingly proposed as models of human vision, bolstered by their impressive performance on image classification and object recognition tasks. Yet the extent to which DNNs capture fundamental aspects of human vision, such as color perception, remains unclear. Here, we develop novel experiments for evaluating the perceptual coherence of color embeddings in DNNs, and we assess how well these algorithms predict human color similarity judgments collected via an online survey.
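The evaluation described above can be sketched in minimal form: compare a model's pairwise similarities over color stimuli against human similarity judgments via rank correlation. Everything here is a stand-in, since the abstract does not specify the network, stimuli, or survey scale; the random embeddings and judgments below are purely illustrative.

```python
import numpy as np

# Hypothetical embeddings for 6 color stimuli (rows), e.g. activations
# from a DNN's penultimate layer; stand-in random vectors here.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(6, 8))

# Hypothetical human similarity ratings for the same 15 stimulus pairs
# (upper triangle of the 6x6 pair matrix), on an arbitrary scale.
human = rng.uniform(size=15)

def pairwise_cosine(X):
    # Cosine similarity for every unordered pair of rows.
    X = X / np.linalg.norm(X, axis=1, keepdims=True)
    sim = X @ X.T
    iu = np.triu_indices(len(X), k=1)
    return sim[iu]

def spearman(a, b):
    # Spearman rho = Pearson correlation of the ranks (no-ties case).
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return np.corrcoef(rank(a), rank(b))[0, 1]

model_sim = pairwise_cosine(embeddings)
rho = spearman(model_sim, human)
```

A higher rho would indicate that the model's color embedding geometry tracks human judgments; with random stand-ins, rho is near zero by construction.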
The embodied cognition paradigm has stimulated ongoing debate about whether sensory data, including color, contributes to the semantic structure of abstract concepts. Recent uses of linguistic data in the study of embodied cognition have focused on textual corpora, which largely precludes the direct analysis of sensory information. Here, we develop an automated approach to multimodal content analysis that detects associations between words based on the color distributions of their Google Image search results.
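One way to sketch this kind of analysis: represent each word by a color histogram over its image results, and score word pairs by the overlap of those histograms. The hue samples below are synthetic stand-ins (the actual pipeline would extract colors from downloaded images), and the overlap measure is an illustrative choice, not necessarily the one used in the study.

```python
import numpy as np

# Synthetic stand-ins for the hue content (values in [0, 1)) of each
# word's image search results; real data would come from image pixels.
rng = np.random.default_rng(1)
images = {
    "banana": rng.normal(0.15, 0.03, 500) % 1.0,  # yellowish hues
    "lemon":  rng.normal(0.16, 0.03, 500) % 1.0,  # yellowish hues
    "ocean":  rng.normal(0.55, 0.03, 500) % 1.0,  # bluish hues
}

def hue_histogram(hues, bins=20):
    # Normalized hue distribution for one word's images.
    h, _ = np.histogram(hues, bins=bins, range=(0.0, 1.0))
    return h / h.sum()

def association(w1, w2):
    # Histogram intersection: 1 minus half the L1 distance between the
    # two hue distributions; higher means more similar color profiles.
    p, q = hue_histogram(images[w1]), hue_histogram(images[w2])
    return 1.0 - 0.5 * np.abs(p - q).sum()
```

With these stand-ins, `association("banana", "lemon")` comes out far higher than `association("banana", "ocean")`, mirroring the intuition that color overlap can signal semantic relatedness.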