Using AI to Study Hawaiian Bird Songs
By Lichen Forster
Spectrogram images provided by Ann Tanimoto-Johnson
Artificial Intelligence (AI) has become ubiquitous in the past couple of years. Anyone can ask AI to make somewhat unsettling art or write a poem about tangerines and spaceships, and AI is now a feature at the top of every page you Google.
At the Listening Observatory for Hawaiian Ecosystems (LOHE) bioacoustics lab at the University of Hawaiʻi at Hilo, biology professor Patrick Hart and his team are also using AI: in their case, to identify and catalog Hawaiian bird songs.
The LOHE lab, whose acronym comes from ‘lohe’, the Hawaiian word for ‘to perceive with the ear’, was started in 2014. Its goal, which Hart says will be “ongoing forever”, is to catalog Hawaiian bird songs.
“Hawaiian birds seem to be more variable in what they say to each other than most birds anywhere else in the world,” Hart said.
A few years ago, the LOHE lab teamed up with the Cornell Lab of Ornithology, which uses the BirdNET algorithm to catalog bird sounds. They fed the algorithm their years of data, hoping it would let them analyze future recordings far faster than they could by hand.
Currently, the LOHE lab has waterproof recorders out on all the Hawaiian islands, mostly capturing forest birds, though the team also studies seabirds and bats. The recordings are then analyzed at the lab (Wentworth Building, room 19); before AI, researchers spent hours classifying each bird on a recording. An accurate AI would let the team analyze the same data orders of magnitude faster.
Cornell has made the BirdNET algorithm accessible to the public through the app Merlin Bird ID. Users can record a bird song, and the app searches its catalog of sounds and gives its best estimate of the species behind the recording. Hart says Merlin is the best app of its kind for bird watchers; however, it is more accurate in North America and Europe than in Hawaiʻi.
“The Cornell Lab made some good progress [...] we worked with them a lot, but the model wasn’t really working well for Hawaiian birds,” Hart said.
Hart’s and others’ research has found that Hawaiian bird song is far more varied than bird song in other parts of the world. Depending on the species, songs can change over miles (ʻamakihi), over hundreds of meters (ʻapapane), or from bird to bird (ʻōmaʻo). For example, from one kīpuka (a pocket of vegetation in a lava field) on Saddle Road to the next, the ʻapapane’s song changes. From his work, Hart has learned that birds born in one kīpuka learn the “bird dialect” of that area, and that birds arriving in that kīpuka will sing the same way there, even if they come from a kīpuka with a different dialect.
Hart thinks these variations make it more difficult for AI to categorize Hawaiian bird song accurately. While Merlin is the best birding app available to the public, LOHE Acoustics Bioinformatics Specialist Amanda Navine has spent the past few years working with Google on a different algorithm: Perch. Perch was developed by Tom Denton, who also worked on BirdNET, and it seems to catalog Hawaiian bird songs with greater accuracy.
Navine told Ke Kalahea that she’s excited about a soon-to-be-published study on Perch that finds its results match those of traditional monitoring. For the same areas, Perch counts the same number and abundance of birds that human surveys found years ago, lending credibility to the AI. Perch is not likely to become available on the app store, said Navine, as its use lies more in research.
Lab manager Ann Tanimoto-Johnson gets students involved with the work by training them to train Perch. She starts them on the Raven software, which predates Perch and displays visual “spectrograms” of the songs; as they learn Raven, students also learn to identify a range of bird songs. Once Tanimoto-Johnson feels they can accurately identify the songs, she moves them over to Perch, where they perform “validations.”
For these validations, the AI spits out five-second chunks of 50 different songs that it believes belong to a certain species. For each sound bite, it provides the probability that the clip is that species. The student then listens to each one and tells the AI whether it was right.
Tanimoto-Johnson says that depending on the student’s level of interest, they can turn their work in the lab into a directed studies class or a project that answers specific research questions.
Analyzing data at this scale means researchers can estimate population sizes, categorize the amount of bird song at each location over time, and establish long-term trends. Those findings can then be used to address problems like population decline and to justify further research into particular areas or species.