Playbacks are a common technique for studying animal vocalizations: stimuli, usually recorded calls, are experimentally presented to animals to build an understanding of their physiological and cognitive abilities. With current playback tools, however, biologists have limited ability to manipulate vocalizations in ways that would establish or change their meaning, which constrains the exploratory power of these experiments. Senior AI research scientist Jen-Yu Liu is exploring whether AI models can be trained to generate new vocalizations tailored to a particular research question or task.
Jen-Yu has been working on generating calls for a number of species, including chiffchaffs (Phylloscopus collybita) and humpback whales (Megaptera novaeangliae). A new project is now allowing us to test this work through interactive playback experiments with captive zebra finches (Taeniopygia guttata), in partnership with research scientist Dr. Logan James at McGill University. Providing researchers with the ability to control various aspects of the vocalization production process will greatly expand the exploratory and explanatory power of bioacoustics research, and is an important step on our Roadmap to Decode.

Given that we may not understand the meaning of the novel vocalizations generated by the model, there are important ethical considerations related to potentially interfering with animals and their cultures. For this reason, we are beginning this research only with captive populations and working exclusively with scientists who follow strict ethical protocols.