Artificial intelligence (AI) is reshaping the environmental sector with its ability to solve problems, make decisions, and recognize patterns. It is particularly valuable in wildlife acoustic monitoring, where vast amounts of recorded sound can be analyzed, often as spectrograms using computer-vision techniques, for detection and interpretation. Yet despite the growing use of AI in wildlife ecology, its future in acoustic wildlife monitoring remains uncertain.
An article published in Nature follows how Indian researchers leverage AI and machine-learning (ML) tools to observe and conserve species and ecosystems. Alongside the best use cases, it notes the challenges of applying AI and of building high-quality training datasets.
According to Seema Lokhandwala, who is part of the Elephant Acoustics Project, recognizing an elephant's calls can reveal whether the animal is in conflict. The algorithms help her untangle overlapping sounds and decipher infrasonic calls that are inaudible to humans. They also distinguish trumpet calls made when elephants interact with their mahouts from those made with other elephants.
Decoding the message in one of these calls can take a researcher up to 30 minutes; Lokhandwala says AI can work through a hundred calls in five minutes.
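As an illustration of the kind of preprocessing such a pipeline might involve, the sketch below flags recordings that carry energy in the infrasonic band (below roughly 20 Hz), where many elephant rumbles sit. It is a minimal, hypothetical example using numpy and scipy, not the Elephant Acoustics Project's actual method; the 20 Hz cutoff, the energy-ratio threshold, and the file name are assumptions.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import welch

def infrasound_ratio(path, cutoff_hz=20.0):
    """Return the fraction of spectral power below `cutoff_hz` in a WAV file."""
    rate, samples = wavfile.read(path)
    if samples.ndim > 1:                      # mix stereo down to mono
        samples = samples.mean(axis=1)
    samples = samples.astype(np.float64)
    freqs, power = welch(samples, fs=rate, nperseg=min(len(samples), 1 << 16))
    total = power.sum()
    if total == 0:
        return 0.0
    return power[freqs < cutoff_hz].sum() / total

# Hypothetical usage: flag clips where more than 10% of the power is infrasonic.
if __name__ == "__main__":
    ratio = infrasound_ratio("elephant_clip.wav")   # placeholder filename
    if ratio > 0.10:                                 # assumed threshold
        print(f"Possible infrasonic rumble (ratio={ratio:.2f})")
```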
Five years ago, V. V. Robin, an ecologist at the Indian Institute of Science Education and Research (IISER) Tirupati, initiated a project to understand why birds found in some Western Ghats habitats did not appear in others. Analyzing a year's worth of avian sound recordings took him two years; with AI, he estimates, the study could have been completed in one.
According to Devi Shankar Suman, an entomologist at the Zoological Survey of India, his team analyzes thousands of samples acoustically. Their recent study deciphered the buzz of mosquitoes to determine species, sex, and feeding status.
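One common acoustic cue in this kind of work is the wing-beat fundamental frequency of the buzz. The sketch below estimates that frequency from a short clip and sorts it into illustrative bins; the search band, the 450 Hz split, and the file name are placeholders for illustration, not values from Suman's study.

```python
import numpy as np
from scipy.io import wavfile

def wingbeat_frequency(path, band=(100.0, 1000.0)):
    """Estimate the dominant frequency (Hz) of a mosquito buzz within `band`."""
    rate, samples = wavfile.read(path)
    if samples.ndim > 1:                              # mix stereo down to mono
        samples = samples.mean(axis=1)
    samples = samples.astype(np.float64) - samples.mean()
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return freqs[mask][np.argmax(spectrum[mask])]

# Hypothetical usage with made-up bins; real species/sex calls would come
# from a model trained on labelled recordings.
f0 = wingbeat_frequency("mosquito_clip.wav")          # placeholder filename
label = "higher-frequency buzz" if f0 > 450 else "lower-frequency buzz"
print(f"Dominant frequency: {f0:.0f} Hz ({label})")
```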
At the 86-acre campus of Bengaluru's Srishti Manipal Institute of Art, Design and Technology, 20 recorders pick up sounds. Each recorder collects 144 minutes of audio daily, building a reference dataset that can point to changes in the landscape and biodiversity. The AI models are trained on the collective acoustic signatures of organisms (biophony), of the environment (geophony), and of humans (anthrophony), says Gururaja K V, a batrachologist at the institute.
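To get a sense of scale, 20 recorders at 144 minutes a day amounts to 48 hours of new audio daily, roughly 17,500 hours a year. The sketch below shows one simple way such soundscape recordings are often summarized, splitting spectral energy into an anthrophony band and a biophony band in the spirit of the Normalized Difference Soundscape Index (NDSI). The band edges follow a rough soundscape-ecology convention and are assumptions here; this is not the institute's actual training pipeline.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import welch

# Band edges follow a common NDSI convention (an assumption, not the institute's setup):
ANTHROPHONY_BAND = (1000.0, 2000.0)   # Hz, dominated by human-made sound
BIOPHONY_BAND = (2000.0, 11000.0)     # Hz, dominated by biological sound

def band_power(freqs, power, band):
    lo, hi = band
    return power[(freqs >= lo) & (freqs < hi)].sum()

def ndsi(path):
    """Return (biophony - anthrophony) / (biophony + anthrophony) for a WAV file."""
    rate, samples = wavfile.read(path)
    if samples.ndim > 1:
        samples = samples.mean(axis=1)
    freqs, power = welch(samples.astype(np.float64), fs=rate, nperseg=4096)
    bio = band_power(freqs, power, BIOPHONY_BAND)
    anthro = band_power(freqs, power, ANTHROPHONY_BAND)
    return (bio - anthro) / (bio + anthro) if (bio + anthro) > 0 else 0.0

# Placeholder filename; values near +1 indicate mostly biophony, near -1 mostly anthrophony.
print(ndsi("campus_recorder_01.wav"))
```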
These AI-driven methods can provide continuous, real-time surveillance without human intervention; real-time AI analysis could, for instance, alert authorities to activities such as illegal tree cutting.
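Such an alerting step could, in principle, be as simple as thresholding classifier output. The sketch below is entirely hypothetical: the `classify_clip` stub, the "chainsaw" label, the confidence threshold, and the notification hook are stand-ins, not anything described in the article.

```python
CONFIDENCE_THRESHOLD = 0.8   # assumed cutoff for raising an alert

def classify_clip(path):
    """Stand-in for a real acoustic classifier; should return {label: confidence}."""
    raise NotImplementedError("plug in an actual model here")

def notify_authorities(message):
    """Stand-in for an SMS/email/webhook notification."""
    print("ALERT:", message)

def monitor(clip_queue):
    """Poll a queue of newly recorded clips and alert on suspicious sounds."""
    while True:
        path = clip_queue.get()               # blocks until a new clip arrives
        scores = classify_clip(path)
        if scores.get("chainsaw", 0.0) >= CONFIDENCE_THRESHOLD:
            notify_authorities(f"Possible illegal tree cutting near recorder: {path}")
```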
Vijay Ramesh, a postdoctoral research fellow at the K. Lisa Yang Center for Conservation Bioacoustics at the Cornell Lab of Ornithology in the United States, notes that, compared with birds, audio data on other taxa such as amphibians and insects remain limited.
In Robin's opinion, birds from India must be better represented in AI-enabled platforms such as BirdNET, which identifies birds by their sounds; the algorithm currently covers mostly North American and European birds.
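For readers who want to try this themselves, the sketch below runs a recording through birdnetlib, a community Python wrapper around BirdNET-Analyzer. The file name and coordinates are placeholders (roughly the Western Ghats), and the exact API may differ between library versions.

```python
from datetime import datetime

from birdnetlib import Recording
from birdnetlib.analyzer import Analyzer

# Load the BirdNET analyzer model.
analyzer = Analyzer()

# Placeholder file and coordinates; date and location let BirdNET narrow its
# species list to what is plausible for that place and season.
recording = Recording(
    analyzer,
    "western_ghats_dawn_chorus.wav",
    lat=10.1,
    lon=77.0,
    date=datetime(2024, 1, 15),
    min_conf=0.25,
)
recording.analyze()

# Each detection includes the species name, time window, and confidence score.
for det in recording.detections:
    print(det["common_name"], det["start_time"], det["confidence"])
```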