Google is developing an AI model called DolphinGemma to help researchers understand dolphin communication. The model was trained on acoustic data from Atlantic spotted dolphins collected by the Wild Dolphin Project (WDP), which has been studying these dolphins for 40 years.
Meet DolphinGemma, an AI helping us dive deeper into the world of dolphin communication. 🐬
— Google DeepMind (@GoogleDeepMind) April 14, 2025
DolphinGemma can generate dolphin-like sounds, including clicks, whistles, and burst pulses.
Google AI is bringing us one step closer to interspecies communication. Introducing DolphinGemma, a new large language model to help scientists study how dolphins communicate. Dive into this project from @GoogleDeepMind, @dolphinproject and @GeorgiaTech 🐬
— News from Google (@NewsFromGoogle) April 14, 2025
The model’s ability to predict the next sound in a sequence could help researchers identify patterns and potential meanings in dolphin vocalizations. This process would take much longer if done manually.
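The next-sound prediction the article describes is the same idea behind any autoregressive language model. As a toy illustration only (not DolphinGemma's actual architecture, which is a ~400M-parameter neural network), the sketch below treats a recording as a sequence of discrete sound tokens and predicts the next token from bigram frequencies; the token labels are hypothetical:

```python
from collections import Counter, defaultdict

def train_bigrams(sequence):
    """Count how often each token follows each other token."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(sequence, sequence[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, token):
    """Return the most likely next token after `token`, or None if unseen."""
    followers = counts.get(token)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Hypothetical tokenized recording: clicks (C), whistles (W), burst pulses (B)
recording = ["C", "W", "C", "W", "B", "C", "W", "C", "C", "W"]
model = train_bigrams(recording)
print(predict_next(model, "C"))  # → W, the most frequent follower of "C"
```

A real model replaces the bigram table with a neural network over spectrogram features, but the research payoff is the same: sequences the model predicts well reveal recurring structure worth investigating.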
The research team is also exploring two-way communication with dolphins using a technology called CHAT (Cetacean Hearing Augmentation Telemetry). Developed by Georgia Tech, CHAT is a device worn by divers that can recognize dolphin sounds and play synthetic ones back.
Decoding dolphin vocalizations with AI
Introducing DolphinGemma, an LLM fine-tuned on many years of dolphin sound data 🐬 to help advance scientific discovery. We collaborated with @dolphinproject to train a model that learns vocal patterns to predict what sound they might make next. It’s small enough (~400M params)…
— Sundar Pichai (@sundarpichai) April 14, 2025
Researchers plan to present dolphins with novel, AI-generated vocalizations associated with specific objects and observe the animals’ reactions. Google Pixel phones play a crucial role in this research. The phones can record and analyze dolphin sounds in real-time, reducing the need for custom hardware and lowering costs.
This is wild.
Google just built an AI model that might help us talk to dolphins.
It’s called DolphinGemma.
And they used a Google Pixel to listen and analyze. 🤯
— Min Choi (@minchoi) April 14, 2025
The upcoming Pixel 9 will be able to run deep learning models and template matching simultaneously, further enhancing the ability to understand and replicate dolphin sounds. While the ultimate goal is to understand how dolphin language works, some experts caution that it’s still unclear whether dolphins have a language in the same sense as humans. They also note that individual dolphin populations may have their own vocalization differences.
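Template matching, one of the two techniques named above, can be sketched in a few lines. This is a minimal, generic version of the technique (an assumption about the approach, not the CHAT system's actual code): slide a known reference whistle over an incoming signal and flag offsets where the normalized cross-correlation peaks. The signal and whistle here are synthetic:

```python
import numpy as np

def match_template(signal, template, threshold=0.9):
    """Return sample offsets where `template` strongly matches `signal`."""
    t = (template - template.mean()) / (template.std() + 1e-12)
    n = len(t)
    hits = []
    for start in range(len(signal) - n + 1):
        window = signal[start:start + n]
        w = (window - window.mean()) / (window.std() + 1e-12)
        score = float(np.dot(w, t)) / n  # normalized correlation in [-1, 1]
        if score >= threshold:
            hits.append(start)
    return hits

# Synthetic example: embed a short "whistle" tone inside background noise.
rng = np.random.default_rng(0)
whistle = np.sin(np.linspace(0, 20 * np.pi, 200))
signal = rng.normal(0, 0.1, 1000)
signal[300:500] += whistle
print(match_template(signal, whistle))  # offsets clustered near sample 300
```

Template matching like this is cheap and interpretable but only finds sounds it already knows, which is why pairing it with a learned model on the same phone is useful.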
Google plans to share DolphinGemma as an open model this summer, allowing researchers worldwide to study other dolphin species and contribute to a deeper understanding of these intelligent marine mammals. The collaboration between WDP, Georgia Tech, and Google is opening up new possibilities in the quest to bridge the communication gap between humans and dolphins.
Image Credits: Photo by Leon Overweel on Unsplash
April Isaacs is a news contributor for DevX.com. She is a long-term, self-proclaimed nerd. She loves all things tech and computers and still has her first Dreamcast system, lovingly named Joni after Joni Mitchell.