Understanding how organisms integrate environmental and social information to coordinate their actions is a fundamental scientific question. Collective animal behavior, exemplified by the synchronized movements of bird flocks and fish schools, is a striking case of coordination driven by perceptual interactions: individuals adjust their own movements based on environmental cues, including the actions of their fellow group members. However, conventional models of collective behavior often assume that individuals have access to information that is not readily available to them, such as the precise positions of their nearest neighbors.
To address this challenge, we propose a cohesive framework for studying perceptual interactions that integrates generalized models with virtual reality experiments. Our study emphasizes the individual's sensory experience, focusing on vision, which provides a direct means for organisms to gather environmental information through the projection of their surroundings onto the visual field. We introduce a simple, adaptable model that bridges vision and locomotion, grounded in empirical observations and in the intrinsic symmetries of the system. This model offers a fresh perspective on how visual and other sensory cues shape individual behavior, opening new avenues for exploring how the appearance of objects in the environment influences collective dynamics.
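To make the mapping from vision to locomotion concrete, the sketch below outlines one minimal way such a model can be written down. It is an illustration under simplifying assumptions (a 2D setting, circular agents, a discretized binary visual field, and placeholder coupling functions weighted by the lowest angular Fourier modes), not the specific model developed in this work.

```python
import numpy as np

def visual_field(positions, radii, headings, focal_idx, n_bins=360):
    """Binary egocentric visual field of one agent: 1 in viewing directions
    occluded by another agent's body, 0 otherwise (2D, circular agents)."""
    phi = np.linspace(-np.pi, np.pi, n_bins, endpoint=False)  # egocentric angles
    V = np.zeros(n_bins)
    x_focal = positions[focal_idx]
    for j, x_j in enumerate(positions):
        if j == focal_idx:
            continue
        rel = x_j - x_focal
        dist = np.linalg.norm(rel)
        # bearing of neighbour j, measured relative to the focal agent's heading
        bearing = np.arctan2(rel[1], rel[0]) - headings[focal_idx]
        half_width = np.arcsin(min(1.0, radii[j] / dist))  # angular half-size of j
        delta = (phi - bearing + np.pi) % (2 * np.pi) - np.pi  # wrapped angle difference
        V[np.abs(delta) < half_width] = 1.0
    return phi, V

def vision_to_motion(phi, V, g_speed=1.0, g_turn=1.0):
    """Map the visual field to a speed change and a turning rate by weighting it
    with cos(phi) and sin(phi) -- a simple choice that respects the front/back
    and left/right symmetries of the problem (the gains are placeholders)."""
    dphi = phi[1] - phi[0]
    # front/back imbalance of visual occupation -> accelerate or brake
    dv = g_speed * np.sum(np.cos(phi) * V) * dphi
    # left/right imbalance of visual occupation -> turn
    dtheta = g_turn * np.sum(np.sin(phi) * V) * dphi
    return dv, dtheta
```

With `positions` given as an (N, 2) array, these two functions give an agent's instantaneous response to what it sees; the signs and magnitudes of the gains determine, for instance, whether visual occupation ahead produces attraction or avoidance.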
Leveraging virtual reality, which allows individuals to interact with dynamic computer-generated environments, we illustrate how this technology can be used to build open, automated experimental systems. Such systems can precisely control the visual experience of individuals within complex environments, shedding light on the intricate relationship between perception and action in living organisms.
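As an illustration of what such an automated system involves, the following sketch shows a generic closed-loop update cycle; the `tracker`, `renderer`, and `stimulus_model` interfaces are hypothetical stand-ins rather than components of any particular VR platform.

```python
import time

def closed_loop_vr(tracker, renderer, stimulus_model, rate_hz=120):
    """Generic closed loop: read the animal's pose, update the virtual scene
    in response, and render the new stimulus from the animal's viewpoint.
    The three objects passed in are hypothetical interfaces."""
    dt = 1.0 / rate_hz
    while True:
        t_start = time.perf_counter()
        pose = tracker.get_pose()                # current position and orientation
        scene = stimulus_model.update(pose, dt)  # e.g. move virtual conspecifics
        renderer.draw(scene, viewpoint=pose)     # perspective-correct rendering
        # hold the loop at a fixed rate so that feedback latency stays predictable
        time.sleep(max(0.0, dt - (time.perf_counter() - t_start)))
```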