Probabilistic modeling reveals coordinated social interaction states and their multisensory bases (21745)

Sarah Stednitz 1, Andrew Lesak 2, Philip Washbourne 2, Luca Mazzucato 2, Ethan Scott 1
  1. University of Melbourne, South Yarra, VIC, Australia
  2. Institute of Neuroscience, University of Oregon, Eugene, OR, USA

Social behavior across animal species ranges from simple pairwise interactions to thousands of individuals coordinating goal-directed movements. Regardless of the scale, these interactions are governed by the interplay between multimodal sensory information and the internal state of each animal. Here, we investigate how animals use multiple sensory modalities to guide social behavior in the highly social zebrafish (Danio rerio) and uncover the complex features of pairwise interactions early in development. To identify distinct behaviors and understand how they vary over time, we developed a new hidden Markov model with constrained linear-model emissions to automatically classify states of coordinated interaction, using the movements of one animal to predict those of another. We discovered that social behaviors alternate between two interaction states within a single experimental session, distinguished by unique movements and timescales. Long-range interactions, akin to shoaling, rely on vision, while mechanosensation underlies rapid synchronized movements and parallel swimming, precursors of schooling. Altogether, we observe spontaneous interactions in pairs of fish, develop novel hidden Markov modeling to reveal two fundamental interaction modes, and identify the sensory systems involved in each. Our modeling approach to pairwise social interactions has broad applicability to a wide variety of naturalistic behaviors and species and solves the challenge of detecting transient couplings between quasi-periodic time series.
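The core of the approach is a hidden Markov model whose emission model uses one fish's movement to predict the other's. As a rough illustration only, and not the authors' implementation, the sketch below runs a forward pass for a two-state HMM with linear-Gaussian emissions on simulated speed traces; the feature choice (per-frame speed), the variable names (x_fish1, x_fish2), the gaussian_pdf helper, and all parameter values are invented placeholders, and the specific constrained emission form used in the study is not reproduced here.

import numpy as np

# Illustrative sketch (not the published model): a 2-state HMM whose emissions
# are linear-Gaussian predictions of fish 2's movement from fish 1's movement.
# All data and parameter values are placeholders chosen for demonstration.

rng = np.random.default_rng(0)

# Hypothetical movement features: per-frame speed of each fish.
T = 500
x_fish1 = np.abs(rng.normal(1.0, 0.3, size=T))          # predictor: fish 1 speed
x_fish2 = 0.8 * x_fish1 + rng.normal(0.0, 0.2, size=T)  # response: fish 2 speed

# K = 2 interaction states; state k has its own linear emission model
#   x_fish2[t] ~ Normal(a_k * x_fish1[t] + b_k, sigma_k^2).
K = 2
a = np.array([0.9, 0.1])        # slope per state (strong vs. weak coupling)
b = np.array([0.0, 0.8])        # intercept per state
sigma = np.array([0.2, 0.5])    # emission noise per state
pi0 = np.array([0.5, 0.5])      # initial state distribution
P = np.array([[0.95, 0.05],     # transition matrix (sticky, persistent states)
              [0.05, 0.95]])

def gaussian_pdf(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

# Per-frame emission likelihoods under each state's linear model, shape (T, K).
lik = np.stack([gaussian_pdf(x_fish2, a[k] * x_fish1 + b[k], sigma[k])
                for k in range(K)], axis=1)

# Forward algorithm (with per-step normalization) for filtered state probabilities.
alpha = np.zeros((T, K))
alpha[0] = pi0 * lik[0]
alpha[0] /= alpha[0].sum()
for t in range(1, T):
    alpha[t] = (alpha[t - 1] @ P) * lik[t]
    alpha[t] /= alpha[t].sum()

# alpha[t, k] approximates P(state_t = k | data up to frame t): a per-frame
# readout of which interaction mode best explains fish 2's movement given fish 1's.
print(alpha[:5])

In this toy setup, state assignments flip when the linear coupling between the two speed traces changes, which is the general idea behind segmenting a session into distinct interaction states.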