Using AI to save coral reefs, an interview with Ben Williams

Listen to the marine biologist discuss how AI can speed up the search for ways to preserve and restore coral reefs

By Google Arts & Culture

Meet Ben Williams

Meet the AI expert and marine biologist Ben Williams

His research using technology to restore coral reef ecosystems has led to the exciting release of SurfPerch, an innovative new AI model developed in collaboration with Google researchers and enriched by contributions from citizen scientists on the Calling in Our Corals platform.

A healthy coral reef in Sulawesi, Indonesia by Timothy Lamont

Ben Williams, Marine biologist, PhD

Can you tell us a bit about what you developed?

“Analysing the audio is a manual process, and scientists haven’t been able to keep up. To overcome this challenge, I worked with Google DeepMind to develop SurfPerch, an AI model trained to listen to coral reef audio.”

Ben deploying a hydrophone by Ocean Culture Life

Ben Williams, Marine biologist, PhD

How is Calling in Our Corals helping these efforts?

"Visitors to Google Arts & Culture’s Calling in Our Corals platform listened to over 400 hours of reef audio from around the world. This brought thousands of eyes and ears to data that would take bioacousticians months to analyse if working flat out."

A damaged coral reef in Sulawesi, Indonesia by The Ocean Agency

Damaged reef sound

This is what a damaged reef sounds like

No fish can be heard. We can only hear the crackling of shrimp.

A clownfish in an anemone by Tim Lamont

Healthy reef sound

This is what a healthy reef sounds like

We can hear the purring of damselfish and the characteristic scraping sound of parrotfish feeding, as well as sounds from other species that scientists are still working to identify.

Ben analysing a reef audio waveform

Ben Williams, Marine biologist, PhD

What makes SurfPerch special?

“Previous AI-led approaches typically require days or weeks of labelling training data to develop a model. SurfPerch can be fine-tuned to new datasets in seconds using a standard personal laptop, making it more flexible and much quicker than previous approaches.”
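
To give a rough sense of the rapid fine-tuning Ben describes, here is a minimal sketch that fits a small classifier on top of frozen audio embeddings. The data, dimensions and labels below are illustrative placeholders, not the actual SurfPerch pipeline; the point is that adapting a pretrained embedding model to a new task in this way takes seconds on an ordinary laptop.

    # Minimal sketch: "fine-tuning" as fitting a small classifier on frozen embeddings.
    # Synthetic arrays stand in for clips that have already been passed through a
    # pretrained audio embedding model; shapes and labels are illustrative only.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    train_embeddings = rng.normal(size=(500, 1280))   # 500 labelled clips, 1280-dim embeddings
    train_labels = rng.integers(0, 2, size=500)       # e.g. 1 = target fish call, 0 = background

    # Fitting a linear classifier on frozen embeddings takes seconds on a laptop,
    # which is the kind of rapid adaptation described in the quote above.
    classifier = LogisticRegression(max_iter=1000)
    classifier.fit(train_embeddings, train_labels)

    # Score a new batch of (pre-embedded) survey clips for the target sound.
    survey_embeddings = rng.normal(size=(20, 1280))
    print(classifier.predict_proba(survey_embeddings)[:, 1])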

Hydrophone recording coral reef by Tim Lamont

Ben Williams, Marine biologist, PhD

What was the model trained on?

“A new dataset called ReefSet, consisting of 57K sounds from coral reefs. While training the model on fish sounds we also made an exciting discovery: by mixing in our much larger library of bird recordings, we greatly improved the model’s performance.”

A diver swims over a damaged coral reef in Sulawesi, Indonesia by The Ocean Agency

Ben Williams, Marine biologist, PhD

Why is it so important to monitor reef habitats now?

"We’ve lost over half the world's coral reefs in the last 70 years, and this loss is accelerating. We urgently need monitoring data to track where this is happening."

Communities in Indonesia work with Mars Sustainable Solutions to restore coral using Reef Stars by The Ocean Agency

Detailed monitoring helps scientists know where to restore

This is restoration in progress: metal frames give coral a structure to latch onto so it can grow more easily.

A restored reef, 3 years after installing Reef Stars by The Ocean Agency

Regrowing coral reefs improves underwater ecosystems

Two years after installing the metal structures, fish now have a place to live. This approach to restoring reefs is known as the MARRS method.

Ben Williams analysing the audio waveform of a coral reef

Ben Williams, Marine biologist, PhD

What are the main difficulties for manual listening?

“Traditionally, marine bioacousticians have had to rely on manual listening. SurfPerch offers a powerful tool to overcome this barrier, detecting every occurrence of a target sound at lightning speed and with high accuracy.”
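
As a rough picture of what automated detection over long recordings can look like, the sketch below scans a waveform in fixed-length windows and flags windows whose score exceeds a threshold. The window length, hop, threshold and scoring function are illustrative assumptions, not SurfPerch’s actual settings.

    # Illustrative sketch: scan a long recording in fixed-length windows and flag
    # windows whose score exceeds a threshold. All parameters are placeholders.
    import numpy as np

    def detect_events(waveform, sample_rate, score_window,
                      window_s=5.0, hop_s=2.5, threshold=0.8):
        """Return start times (in seconds) of windows scored above the threshold."""
        window = int(window_s * sample_rate)
        hop = int(hop_s * sample_rate)
        hits = []
        for start in range(0, max(len(waveform) - window, 0) + 1, hop):
            clip = waveform[start:start + window]
            # score_window: any model mapping a clip to a probability in [0, 1]
            if score_window(clip) >= threshold:
                hits.append(start / sample_rate)
        return hits

    # Example with a dummy scorer on an hour of silence (placeholder data).
    dummy_audio = np.zeros(32_000 * 3600, dtype=np.float32)
    print(detect_events(dummy_audio, 32_000, score_window=lambda clip: 0.0))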

A diver surveys a healthy coral reef in Sulawesi, Indonesia by The Ocean Agency

Ben Williams, Marine biologist, PhD

How does SurfPerch impact research today?

"We are already working with partners using this model on a range of important marine conservation challenges. We’re hoping SurfPerch will represent a big step towards making acoustic monitoring of reef habitats mainstream." 

Calling in Our Corals by Google Arts & Culture Lab

Ben Williams, Marine biologist, PhD

How can people support your research?

“Anyone can help us start finding new sounds by listening to data we’ve shared on the Calling in Our Corals platform. By listening to some of this audio for us, users will be helping to discover new sounds that we can then build detectors for using SurfPerch.”

Calling in Our Corals by Google Arts & Culture Lab

Ben Williams, Marine biologist, PhD

Where can researchers access the AI model?

"For experienced users, SurfPerch is available for download on Kaggle Models. However, perhaps the best place to start is the tutorial we put together using example data from the first round of Calling in Our Corals." 
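
For readers who want to try the released model, here is a rough sketch of pulling it down from Kaggle Models. The model handle below is a placeholder, and the expected audio input (sample rate, clip length, output heads) should be taken from the official model card and tutorial rather than from this sketch.

    # Rough sketch of downloading a model from Kaggle Models and loading it locally.
    # "owner/model/framework/variation" is a placeholder handle; consult the
    # SurfPerch model card on Kaggle for the real handle and usage details.
    import kagglehub
    import tensorflow as tf

    model_dir = kagglehub.model_download("owner/model/framework/variation")  # placeholder
    model = tf.saved_model.load(model_dir)

    # Inspect the available signatures; the expected audio input format is
    # documented in the model card and the tutorial notebook.
    print(list(model.signatures.keys()))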

Credits: Story

Photos by Tim Lamont, The Ocean Agency & Ocean Culture Life
