Artificial intelligence helps scientists spy on chimp behavior in the wild

Chimpanzees in West Africa have a clever trick to get at the tasty kernels inside oil palm nuts. They carefully select a flat rock to act as an anvil and place a nut on top. Then, using another stone as a hammer, they pound away until the nut’s hard exterior cracks with a crunch.

Until now, scientists eager to learn more about this tool use could spend weeks combing through hours of raw footage to find the relevant recordings. But a new artificial intelligence system out today can do the grunt work for them, automatically finding and identifying the right clips in footage captured from the wild.

If the system can be used on videos of other primates and behaviors such as scratching or sleeping, “that could be pretty exciting,” says Sara Beery, a conservation technology researcher at the California Institute of Technology who was not involved with the study. “[This will] really help speed up data processing and reduce costs.”

To conduct the research, scientists focused on two behaviors: nut cracking and drumming. Those actions are important for understanding primate learning and communication, says co-author Susana Carvalho, a primatologist at the University of Oxford’s Primate Models for Behavioral Evolution Lab. But they have another bonus: they’re noisy. That meant the researchers could use sound, in addition to visuals, to train their computer model.

For nut cracking, the team used about 40 hours of archival video recordings from the Bossou forest in Guinea. For drumming, they used about 10 hours of camera trap footage from Cantanhez National Park in Guinea-Bissau. The drumming clips were of a particularly raucous behavior called “buttress drumming,” in which chimps pound on protruding tree roots with their hands or feet; researchers think drumming patterns might vary between groups.

First, researchers trained a neural network by hand-coding the audio and visual footage. Researchers drew boxes around each chimpanzee and labeled its activities, teaching the network what to look for and what to ignore. Then, they quizzed their program by showing it a video clip it hadn’t seen before. Sometimes, though, the chimps would wander behind a tree or end up “literally on top of each other,” says Max Bain, a machine learning researcher at Oxford who led the study. So they also trained the model to recognize nut cracking and buttress drumming from the sound alone.
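To make the hand-coding step concrete, annotations like these are commonly stored as a bounding box plus a behavior label per chimp per frame. Here is a minimal sketch; the field names and box convention are hypothetical, as the study's actual annotation format is not described in this article:

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    """One hand-coded label: a box around a chimp and the behavior shown."""
    frame: int                               # frame index within the clip
    box: tuple[float, float, float, float]   # (x, y, width, height) in pixels
    behavior: str                            # e.g. "nut_cracking", "buttress_drumming"

# Example label: a chimp cracking nuts in frame 120 of a clip
label = Annotation(frame=120, box=(312.0, 188.0, 96.0, 140.0), behavior="nut_cracking")
print(label.behavior)
```

Thousands of such labels, paired with the matching audio, are what let the network learn which sights and sounds signal each behavior.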

Once the scientists were done training, it was time to put their model to the test. When they showed the network nut cracking and buttress drumming footage it had never seen before, it was able to correctly identify both behaviors with 77% and 86% precision, respectively, the researchers report today in Science Advances. To showcase how their framework could be applied in research, they also used it to analyze how long the chimps spent cracking nuts and drumming, and how often males and females engaged in each activity.
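Precision here measures what fraction of the clips the model flagged as a behavior actually showed it. A minimal sketch of that calculation, using illustrative counts rather than the study's data:

```python
def precision(true_positives: int, false_positives: int) -> float:
    """Fraction of flagged clips that truly show the behavior."""
    return true_positives / (true_positives + false_positives)

# Illustrative only: if the model flagged 100 clips as nut cracking
# and 77 of them really showed it, precision would be 0.77 (77%).
print(precision(77, 23))
```

A 77% precision thus means roughly three of every four clips the model flags as nut cracking are correct, which still saves enormous amounts of manual review time.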

“I was really impressed by how this was truly a collaboration [between machine learning and primate behavior researchers],” Beery says. “They were very thoughtful about the needs for this model and tested it appropriately.”

One of the most exciting things about the new finding is its potential to speed up the time it takes to find footage, Carvalho says. During her own Ph.D. research, she says, she spent up to 150 hours manually searching 40 hours of footage for interesting behaviors. Scientists have amassed thousands of hours of archival footage over decades, but “it would take decades for someone to code [that] footage manually.”

Now that the new system has been tested, Carvalho hopes it can be trained to track other species and other behaviors—and even individual chimps to see how their skills develop over an entire lifetime. Analysis of footage over time could also help conservationists understand how things such as climate change and habitat loss are changing primates’ behaviors. “The possibilities,” she says, “are endless.”

source: sciencemag.org