29th November 2023

Learn about BCT’s cutting-edge audio-identification work in this interview with our Research Manager, Dr Lia Gilmour. Lia talks about the incredible AI-powered acoustic survey tool – BCT's SCS – that her team is developing. She explains how this system will remove barriers to surveying, improve bat conservation and increase impact on policy.

You just won a big grant from AWS to develop BCT’s SCS (Sound Classification System). What is that system?

AI and audio-identification of bats: Lia Gilmour interview part 1

Dr Lia Gilmour is BCT's Research Manager and oversees the development of BCT’s SCS.

BCT’s SCS is an end-to-end audio-identification system, which basically means the whole survey process is done through it.

So, people can book equipment, upload files, get bat sounds identified and generate reports all in one place.

It is artificial intelligence (AI) that does the identification, though human users will verify the identifications. And humans also train the AI.

Why should BCT be at the forefront of audio-identification using AI?

Bat population monitoring is a grounded, real-world application of this cutting-edge science.

BCT produces the official statistics on bat species, which are used by government and inform conservation efforts. So, we need the best data. As well as building on the monitoring data BCT has going back to the 1990s, this technology will help us monitor a wider range of species.

Currently, BCT is at the forefront of this kind of research because we invest resources to shape its development. Overall, it’s vital that we're leading the conversation on how to monitor bats and develop these new techniques that are the future of monitoring.

This system lets us do bigger, better bat surveys. Why is this important?

We urgently need to upscale bat surveys, because bat species are ecological indicators. So as well as informing conservation and policy, our survey data lets us track ecosystem health. It’s increasingly important to do this in the face of global changes.

However, conventional bat surveys are time intensive. They need laborious field surveys and expert sound identification skills. Plus, although passive monitoring means we can generate survey data without using a lot of human time, we still need to process that data, which can be a huge task.

Using this grant, and the AI and cloud technologies that come with it, means we can now process that data with unprecedented efficiency and accuracy.

BCT’s SCS was also designed to bring underrepresented demographics into bat surveying. How does that work?


Elliot Bastos from BCT's Woodland Hope sets up an AudioMoth. These detectors create a lot of data, which AI can help to process.

Unlike in traditional surveys, SCS users don’t need to watch bats in real time, so the system is accessible to those who can’t take part in physical surveys. Instead, the sensor can be put on a balcony or in a garden to collect audio data. It also means more people in towns and cities can take part, for example as part of the NightWatch and BBatS surveys. Identification is also ‘gamified’ to appeal to younger demographics.

Users get bespoke reports about bat species in their area, which helps connect people to nature; we know this is good for health and wellbeing. And some users can get specialised training to verify identifications, which will increase the number of people who can do this work. Once they have that training, they can then help fine-tune the algorithm. And this will expand EchoHub (an open-source library of bat sounds). In turn, good-quality EchoHub calls from known species can be used to train the algorithm.

All of this will create gold standard datasets. And give people useful, transferable skills.

Can you explain how AI is used for bat audio-identification?

Traditional analysis of bat calls is manual. First, we convert a bat call into a picture called a spectrogram. Someone then analyses that picture to see what characteristics it has. A long sweeping tail down through the frequencies, followed by a section that's more constant, means it’s a pipistrelle.
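As a rough sketch of that conversion step, the snippet below builds a synthetic downward sweep as a stand-in for a real recording and turns it into a spectrogram with SciPy. The sample rate and frequencies are illustrative assumptions, not BCT's actual pipeline.

```python
import numpy as np
from scipy.signal import chirp, spectrogram

# Synthesise a pipistrelle-like call: a downward frequency sweep
# (roughly 70 kHz falling to 45 kHz over 5 ms), sampled at 384 kHz,
# a rate typical of full-spectrum bat detectors (illustrative values).
fs = 384_000
t = np.linspace(0, 0.005, int(fs * 0.005), endpoint=False)
call = chirp(t, f0=70_000, f1=45_000, t1=0.005, method="linear")

# Convert the call into a spectrogram: a picture of frequency
# content over time - the image a human analyst would inspect.
freqs, times, Sxx = spectrogram(call, fs=fs, nperseg=256, noverlap=192)

# The frequency with the most energy in each time slice traces the
# call's shape: for this sweep it falls over the call's duration.
peak_freqs = freqs[np.argmax(Sxx, axis=0)]
print(peak_freqs[0], peak_freqs[-1])  # starts high, ends lower
```

Plotting `Sxx` with time on the x-axis and frequency on the y-axis gives the sort of picture a surveyor (or a classifier) examines.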

You get your ID, but it’s time consuming, specialist work. And importantly, we don’t know what we miss because of our human preconceptions.

Auto-identification systems are software that looks at those bat calls and provides an identification, rather than a human doing it. When I started bat work, they weren't that accurate, but that's changing. The ID systems that we research and use at BCT use artificial intelligence based on neural networks, which are inspired by the brain and which we train with known species' calls.

Your AI is based on brains?

Yes. There are a lot of different kinds of neural networks; we use a convolutional neural network (CNN). CNNs are great at getting features out of data. They work especially well with images and are used in things like facial recognition.
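As a toy illustration of what "getting features out of data" means, here is the core convolution operation in plain NumPy. This is not BatDetect2's architecture; it just shows how a small kernel slid across a spectrogram-like image responds strongly wherever a particular pattern (here, a constant-frequency ridge) appears.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D convolution: slide the kernel over the image and
    sum elementwise products - the core operation of a CNN layer."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

# A toy "spectrogram": a bright horizontal ridge on row 4, standing in
# for a constant-frequency component of a call.
spec = np.zeros((8, 8))
spec[4, :] = 1.0

# A kernel that responds to horizontal ridges: positive centre row,
# negative rows above and below.
ridge_kernel = np.array([[-1., -1., -1.],
                         [ 2.,  2.,  2.],
                         [-1., -1., -1.]])

features = conv2d(spec, ridge_kernel)
print(features.max())  # strongest response where the kernel sits on the ridge: 6.0
```

A real CNN learns many such kernels from data rather than having them hand-designed, and stacks layers of them so later layers respond to whole call shapes rather than simple ridges.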

My team have already done significant training of the AI and are now collecting a bit more data for certain species, to improve the classifier accuracy at identifying those species. Our system will incorporate ‘BatDetect2’, developed in collaboration with Kate Jones at UCL and Oisin Mac Aodha at Edinburgh University, who are at the forefront of AI automated systems in ecology and acoustics research.


A spectrogram of a pipistrelle's call - in traditional analysis, a human identifies a bat from this sort of image. In auto-identification systems, it is software that provides an ID, though human input will always be needed to verify and train these systems.

How do you train AI?

We train AI with datasets of bat calls, so it can recognise different calls in different parts of the picture. The more data we give it, the better it will understand the huge variation that exists, even within species, and the more its identifications improve.

In the future, AI should be able to learn more than humans could about what's going on in the training data set.

However, there will always be a need for an expert human eye!

How much quicker is AI powered identification than manual ID?

So much quicker! To give an idea, the data going through our system from this year alone would take a person a decade to analyse manually. But we have results in a matter of months.

Why is auto-ID so useful for bats?

A young woman with dark hair is holding a very small bat in a gloved hand. The woman has a headtorch on and is smiling.

Lia Gilmour: "It’s vital that we're leading the conversation on how to monitor bats and develop these new techniques that are the future of monitoring." This photo was taken in 2012 during Dr Lia Gilmour's MSc project * Photo credit: Bob Cornes.

What’s crucial to understand is that in some animals it's easy to say, ‘this sound means that animal’. But bats can be flexible. So, they may have a characteristic call type, but within that there's a huge range. And that range is dependent on the habitat, region and even what other bat species are present.

Plus, there's lots of untapped information in bat calls. We hope that new technology will give us access to ‘big bat data’ we can’t currently use.

Now, I must stress that it's always going to need human oversight. Humans need to make sure that the AI is trained properly and to validate results. And, importantly, to make sure that AI is used in the right way.

Could AI ever totally replace traditional surveying?

Certain species may always need some targeted surveys. For example, Passive Acoustic Monitoring isn’t always possible for some species that have cryptic calls (calls which have a similar acoustic structure to other species’ calls).

But even then, AI can give a baseline before more intensive surveying. And I feel that there will be nuances in cryptic calls that we can’t pick up with the human eye.

What’s the next thing in AI?

Currently, humans train the AI. But at some point, there may be communication back from the AI - generative AI.

Generative AI has a conversation back with you. So, you give it information and then it suggests stuff to look for. That's further down the line, but it’s quite exciting.

I can’t wait to explore how that could influence conservation and biodiversity monitoring in the future.

Read the second part of our interview with Lia - Bats, sound and science: Lia Gilmour interview part 2.

Want to know more about AI and BCT's work? Check out this article in Charity Times: How AI is transforming the Bat Conservation Trust.

* This photo was taken in 2012, during Dr Lia Gilmour's MSc project. Since the COVID-19 pandemic, experts advise everyone who comes into close contact with bats to wear a mask to prevent human-to-bat transmission of SARS-CoV-2.