Acoustic diagnosis in the gardens of Poseidon: Researchers have used machine learning to teach a computer system to recognize the health of a coral reef from the ‘music’ its inhabitants produce. The method could develop into an important tool for monitoring the world’s threatened reefs: it has already enabled scientists to document the progress of reef restoration projects.
Coral reefs are fascinating natural wonders, and they are of far-reaching importance for the complex web of relationships in the marine environment. But the world’s reefs are severely threatened: climate change and other man-made pressures are taking their toll. Where the lush splendor of underwater gardens once unfolded, deserts are spreading in many places. Conservationists and scientists are trying to counteract this trend with protective measures and rehabilitation projects. For these efforts, diagnosing the state of health of a reef is of great importance.
In addition to analyzing visible signs, conclusions can also be drawn from acoustic clues, report the researchers led by Ben Williams from the University of Exeter. Healthy coral reefs are characterized by a complex soundscape: the concert is made up of signal sounds from fish, the crackling of crabs, and other sound elements from the inhabitants of the underwater world. It has also been shown that the sound of a healthy reef is attractive to marine life. In damaged reefs, on the other hand, it grows quieter and quieter until finally an eerie silence sets in.
Difficult diagnoses in the reef
“A major difficulty in diagnosing reef health to date has been that visual and acoustic surveys typically rely on labor-intensive methods,” says Williams. “Visual surveys are also limited by the fact that many reef inhabitants hide or are only active at night. As for acoustic cues, the complexity of reef noise makes it difficult to determine the state of a reef from individual recordings.” The researchers therefore explored the extent to which artificial intelligence can automate reef diagnosis based on sound recordings.
The team turned to so-called machine learning: computer systems that can capture patterns in example data that are linked to certain factors. During the learning phase, the algorithms build a statistical model from this training data; the resulting system can then be applied to new information. Williams and his colleagues trained their algorithm with sound recordings from healthy and damaged reef sites in Indonesia, whose respective conditions were precisely known from previous studies.
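To make this workflow concrete, here is a minimal sketch of this kind of supervised audio classification in Python, using the open-source libraries librosa and scikit-learn. The file paths, acoustic features and random-forest classifier are illustrative assumptions made for this sketch, not the specific pipeline used in the published study.

```python
# Minimal sketch of supervised reef-sound classification.
# NOTE: paths, features, and the classifier are illustrative assumptions,
# not the pipeline of Williams et al.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

def extract_features(path):
    """Summarize one recording as a fixed-length feature vector."""
    y, sr = librosa.load(path, sr=None, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)        # timbral summary
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)  # spectral "brightness"
    rms = librosa.feature.rms(y=y)                            # overall loudness
    # Average each feature over time to get one vector per recording.
    return np.concatenate([mfcc.mean(axis=1),
                           centroid.mean(axis=1),
                           rms.mean(axis=1)])

# Labelled examples: recordings from sites whose condition is known
# from earlier surveys (paths and labels are placeholders).
labelled = [
    ("recordings/healthy_site_01.wav", 1),   # 1 = healthy
    ("recordings/degraded_site_01.wav", 0),  # 0 = degraded
    # ... many more labelled recordings
]

X = np.array([extract_features(path) for path, _ in labelled])
y = np.array([label for _, label in labelled])

# Hold out part of the data to check how well the model generalizes.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))

# The trained model can then score new, previously unheard recordings:
# model.predict(extract_features("recordings/new_site.wav").reshape(1, -1))
```

Averaging the features over time keeps the sketch simple; a real pipeline would typically work with many short windows per recording and validate across independent sites.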
AI with analytical hearing
As the scientists report, the computer system did indeed develop into an expert in the analysis and diagnosis of “reef music”: after the “training”, it was able to assess the state of health of the respective underwater landscape with great accuracy from new recordings of soundscapes. “The results show that our system can also recognize patterns that are imperceptible to the human ear. In this way, the AI can tell us more quickly and accurately how a particular reef is doing,” says Williams.
In addition to assessing current status, the potential of the AI method lies above all in monitoring how damaged or recovering reefs develop over time. The team was able to demonstrate this by analyzing recordings from the Mars Coral Reef Restoration Project, which is restoring badly damaged reefs in Indonesia. The comparisons reflected the progress made in the course of reef rehabilitation.
“This is a really exciting development. Sound recording devices and artificial intelligence could be deployed around the world to monitor the condition of reefs and find out whether attempts to protect and restore them are working,” says co-author Tim Lamont of Lancaster University. “In many cases, it is easier and cheaper to attach an underwater hydrophone to a reef and leave it there than for experienced divers to repeatedly visit the reef to examine it – especially in remote locations,” the researcher adds, explaining the potential of the AI method.
Source: University of Exeter, journal article: Ecological Indicators, doi: 10.1016/j.ecolind.2022.108986