Berkeley, CA – A team of researchers from UC Berkeley recently achieved a major milestone in deciphering sperm whale vocalizations using artificial intelligence. Their findings reveal intricate, previously unrecognized structure within whale communication, bringing us closer to understanding the full depth of whale language.
The breakthrough centers on identifying distinguishable “vowel” sounds in sperm whale click vocalizations. UC Berkeley linguist Gašper Beguš built a deep neural network model named fiwGAN that successfully extracted and replicated vowel-like acoustic properties from sperm whale recordings.
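The article does not include code, but the core acoustic idea it describes — that a click’s vowel-like quality lives in its dominant spectral resonances — can be illustrated with a short, self-contained Python sketch. The signal and frequencies below are invented for illustration and are not taken from the whale data; fiwGAN itself is a generative adversarial network trained on real recordings, not this simple spectral analysis.

```python
import numpy as np

def spectral_peaks(signal, sr, n_peaks=2):
    """Return the strongest spectral peak frequencies (Hz) of a signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    # Local maxima: bins larger than both neighbours.
    peaks = [i for i in range(1, len(spectrum) - 1)
             if spectrum[i] > spectrum[i - 1] and spectrum[i] > spectrum[i + 1]]
    # Rank local maxima by magnitude, keep the strongest n_peaks.
    peaks.sort(key=lambda i: spectrum[i], reverse=True)
    return sorted(freqs[i] for i in peaks[:n_peaks])

# Hypothetical stand-in for one whale click: a short damped pulse with
# two resonances, loosely analogous to vowel formants.
sr = 16_000
t = np.arange(0, 0.02, 1.0 / sr)
click = np.exp(-t * 300) * (np.sin(2 * np.pi * 800 * t)
                            + 0.6 * np.sin(2 * np.pi * 2400 * t))

print(spectral_peaks(click, sr))  # the two dominant resonances (~800 and ~2400 Hz)
```

Comparing which resonances recur across many clicks is the kind of fine-grained pattern a model like fiwGAN can learn at scale.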
“We found that sperm whales don’t just vocalize continuously in a way that might be compared to static noise, but actually form these distinct, recurring pronunciations that functionally mimic vowels,” explained Beguš.
AI Mimics Whale Vocal Signature
The fiwGAN model was able to generate new synthetic whale click vocalizations nearly identical to those of a real whale. This suggests these “vowel” sounds are integral components that whales purposefully use to identify themselves vocally.
“Being able to artificially reconstruct part of a sperm whale’s vocal signature hints at an underlying language where individual whales can effectively name themselves,” said senior author Dr. David Gruber. “Mimicking their ability to say their own names opens up a new realm of possibilities for understanding the foundations of whale communication.”
Two Vowel Sounds Detected
Analysis of a dataset of nearly 4,000 real-world whale click recordings revealed two distinct vowel sounds used repeatedly by whales in the Pacific and Atlantic oceans. Researchers dubbed these two coda vowels the “a-vowel” and “i-vowel” for their acoustic similarities to human vowels.
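The article does not say how the two vowel classes were separated in the data, but the general approach — unsupervised grouping of acoustic measurements into recurring classes — can be sketched with a minimal two-cluster k-means in Python. All values below are invented, and `two_means` is a hypothetical helper, not part of the study’s code:

```python
import numpy as np

def two_means(values, iters=20):
    """Minimal 1-D k-means (k=2): split measurements into two clusters."""
    lo, hi = float(min(values)), float(max(values))
    centers = np.array([lo, hi])          # initialize at the extremes
    values = np.asarray(values, dtype=float)
    for _ in range(iters):
        # Assign each value to its nearest center, then recompute centers.
        labels = np.abs(values[:, None] - centers[None, :]).argmin(axis=1)
        for k in (0, 1):
            if np.any(labels == k):
                centers[k] = values[labels == k].mean()
    return labels, centers

# Hypothetical dominant-frequency measurements (Hz) from two recurring
# click types — invented for illustration, not from the study.
rng = np.random.default_rng(0)
freqs = np.concatenate([rng.normal(900, 50, 30), rng.normal(2300, 80, 30)])

labels, centers = two_means(freqs)
print(sorted(centers))  # two cluster centers, near 900 and 2300
```

Run on thousands of recordings instead of sixty synthetic values, this kind of clustering is what lets an algorithm surface recurring classes that human listeners might miss.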
The finding suggests that sperm whales’ echolocation clicks and other vocalizations carry a deeper linguistic complexity than previously thought. It could indicate that they possess advanced language capabilities found in few mammals other than humans.
AI Critical to Quantifying Whale Dialects
Past difficulties in quantifying differences between whale groups’ dialects stemmed from reliance on human ears alone to parse subtle distinctions in vocalizations. AI removes this barrier by detecting minute patterns across thousands of whale recordings simultaneously.
“Now with further machine learning work, we can objectively break down any notions of hierarchies when it comes to labeling intelligence levels across species,” said UC Berkeley ethologist Dr. Lori Marino. “Sperm whales may have complex means of communication rivaling primates.”
Researchers next plan to investigate whether vowel sounds help whales distinguish regional sub-species and family groups. The fiwGAN model will continue to be refined with more whale vocalization data, with the eventual aim of translating full exchanges. Uncovering the inner workings of whale language could profoundly expand our grasp of interspecies communication and cognition.