CompCog: Generating Object Percepts in Peripheral Vision During Naturalistic Attention Tasks

Project Abstract/Summary

Imagine doing some everyday activity, such as walking down a street to meet a friend at a restaurant. As you do this, your eyes are constantly moving—darting from your phone to street signs to the people and cars around you. Yet, despite what should be a jittery visual mess, you perceive a relatively smooth and stable visual world. For decades, scientists have theorized about how the brain creates this illusion of visual stability—recent advances in AI suggest an answer that may finally help crack this human perceptual code.

The latest AI-powered vision-language models are becoming very good at recognizing visual objects and understanding scenes, even imagining what should be there and filling in missing parts much like the human brain does. This raises the question of whether these models make sense of the visual world in the same way humans do. This project puts these new AI models to the test by having them generate plausible visual scenes, in real time, using only the few input samples that correspond to where a person looked while freely viewing the scene. This person is then shown either the real or the AI-generated scene and asked whether it is the scene they just viewed. By identifying generated scenes that people cannot distinguish from real ones, cognitive scientists learn more about the semantic variability that objects viewed in peripheral vision might take while still seeming plausible (i.e., stable), and computational neuroscientists learn more about the algorithms underlying human visual stability and scene perception. This knowledge may also contribute to the development of smarter AI systems capable of perceiving the world more like humans do, thereby potentially improving performance in time-critical applications such as self-driving cars and enhancing user experiences in augmented reality contexts.

This award reflects NSF’s statutory mission and has been deemed worthy of support through evaluation using the Foundation’s intellectual merit and broader impacts review criteria.

Principal Investigator

Gregory Zelinsky – SUNY at Stony Brook, Stony Brook, NY

Funders

National Science Foundation

Funding Amount

$283,832.00

Project Start Date

04/01/2025

Project End Date

03/31/2028

Will the project remain active for the next two years?

The project has more than two years remaining

Source: National Science Foundation

Updated: April 2025
