The DeepWild project uses deep learning approaches to study animal behaviour. Video coding is a gold standard across animal behaviour research, allowing researchers to extract rich behavioural datasets and to validate their reliability. At the Wild Minds lab we have access to very large video archives from which we can explore animal behaviour as it occurs in natural habitats. In practice, however, these videos are only useful if data can be extracted from them efficiently. Manually locating the right footage in tens of thousands of hours is very time-consuming, as is the manual coding of animal behaviour, which requires extensive training to become reliable. Computer coding of animal behaviour makes locating and analysing data much faster and more reliable.

While there has been substantial progress, until recently machine learning tools called deep neural networks could only track behaviour in ‘clean’, predictable environments (stable and free of visual noise). In practice this made them nearly impossible to use on fieldwork-based data: wild primates live in visually noisy, unpredictable environments – in our videos we move, they move, the light changes, and forests do tend to be full of, well, trees and lots of other visual ‘noise’!

In 2021 we were able to train a network model using DeepLabCut to locate and track the body movements of wild great apes in handheld video footage. We are now developing several tools that will help us (and others) to locate animals and behaviours of interest in video databases, and then to track the movement of each individual. We’re planning to use these to explore ape gestural communication as part of our Great Ape Dictionary project, but we’ll also be able to look at tool use, locomotion, and much, much more.

We’re committed to making these tools and trained networks open source, so stay tuned for more news, including ways you can get involved and help us!