Scientists have made a groundbreaking advance in understanding humpback whale communication. Using advanced AI and interdisciplinary collaboration, they have moved beyond mere listening toward the potential for two-way dialogue, revealing a complex linguistic system previously thought unique to humans.
A breakthrough for scientists after they had a 20-minute conversation with a humpback whale.
>> Beneath the ocean surface, an ancient
conversation hums in rapid clicks.
Messages passed between giants with
brains six times our own. Using advanced
AI, scientists are listening like never
before. And what they have uncovered is
something no one would have guessed in a
million years.
>> This is a little of how their conversation went.
What they are finding reveals an entirely new way to understand communication between intelligent life forms on Earth. Since the dawn of ocean exploration, the vast, mysterious sea has held its secrets from us. For centuries, sailors returning from long journeys told incredible stories of what some believed to be mythical sea serpents and, more believably, the haunting, melodic calls of whales. These sounds were just part of the sea's marvels: beautiful, but barely explored.
These marine animals have long been admired for their size, but we never truly believed they had a mind that could match their might. To man, they were just resources: blubber for lamps and bones for corsets. Until one unexpected event, the call of the deep. In the 1970s, the world opened its eyes to a new environmental awakening, and it was there and then that a quiet discovery would change everything. Scientists who had been studying marine life for ages took an interest in humpback whales and soon set out to sea to record the vocalizations of these whales.
And what they found was astonishing. The moans and groans of these animals weren't random; they were intricate songs that evolved over time and could last for hours. The recordings were compiled into an album called Songs of the Humpback Whale. This scientific documentation caught the public's attention and tickled the imagination of many, and that one album helped launch a global movement that pulled these magnificent creatures back from the brink of extinction. As scientists went further into bioacoustics, the study of animal sounds, they were left with more questions than answers. Early researchers in the field used basic recording tools and spent hours carefully listening and analyzing, all in an effort to understand how whales communicate. They couldn't quite place what the sounds were. There were theories that the whale songs were a simple mating call, a call for help in times of danger, or just a general form of communication.
No one truly knew. For decades, we could
only guess, held back by the sheer
volume of data and the limits of our own
senses. All we could hear were the
sounds. But we didn't have the tools to
understand them. And this is where a guy
named Dr. David Gruber came in with a
bright idea. Dr. David Gruber, an
American marine biologist and National
Geographic researcher, believed that to
understand whales, we didn't need the
old tools used back then, but something
new and technologically advanced. He
looked to a different field for inspiration, one far from the sea and focused on space: the Search for Extraterrestrial Intelligence, or SETI. The idea behind SETI is to listen for signals from space in the hope of one day hearing a message from another civilization.
Dr. Gruber thought, what if we've been
looking for alien life in all the wrong
places? He thought an alien intelligence
could be right here on our own planet in
the form of a whale, the humpback whale.
This crazy, almost sci-fi-like notion was the beginning of Project CETI, the Cetacean Translation Initiative. Not only did Dr. Gruber
bring this insane idea to life, but he
also brought together an extraordinary
team consisting of marine biologists,
computer scientists, linguists, and
roboticists. All joined by a shared
vision. Their aim wasn't just to
document sounds. They were planning to
do the impossible.
Their plan was to create a two-way
dialogue with another species.
How was this group of people planning to
listen to voices from a world so
different from ours? To truly connect
with whales, they needed more than just
ears. They needed a way in. The dream team.
The team Dr. David Gruber built for Project CETI was unlike any scientific group you've ever seen. It was a fusion of brilliant minds from all kinds of fields who at first might not seem to have much in common, but who were all united by the same audacious goal: to crack the code of whale communication. This unconventional alliance was exactly what the project needed. Understanding the voices of the whales was a complicated task that no single discipline could have solved alone. It's no surprise that these chosen experts, masters in their own fields, were picked. This collaboration was the engine of Project CETI,
leading to something incredible. At the
heart of the project were the people who
had spent their lives in the water with
these animals. People like Shane Gero, for example, a marine biologist who spent over a decade studying a single family of sperm whales near the Caribbean island of Dominica. This wasn't just a job for him; he was as dedicated to the work as he was to loving these magnificent creatures. He even knew the whales by name and watched them grow up. This long-term, intimate fieldwork became the bedrock of Project CETI, and he was a great asset to the team. Who better to understand the workings of whales than Shane? Apart from Shane Gero's contributions, another asset the old research teams never had was natural language processing, thanks to AI. This is where what once seemed impossible becomes a modern marvel. Before Project CETI became a thing, the idea of translating whale communication was a figment of researchers' imagination because of the limitations of the technology at the time. However, a major breakthrough was on the horizon, not just in marine biology, but also in technology.
Around 2019, the team realized the technology had arrived to build an underwater recording studio, the first that could finally make translating the language of sperm whales possible. And just like the dream of reaching the moon, this realization ignited the project and made the impossible suddenly feel within reach. The island of
Dominica became the heart of this
project. It was the perfect place for
this kind of work with its volcanic
landscape and waters that get incredibly
deep just off the coast. Its unique
geography allows sperm whales to swim
close to land, making them easier to
study than in most other places. Not only that, but the island also has a stable, healthy whale population, with many of the same families returning year after year. It was also the island where Shane Gero's decades of work began. With this newfound possibility, Project CETI officially launched in
2020. The initiative received a massive
boost from the TED Audacious project,
which provided $33 million in funding to
get it off the ground and rolling. The
funding allowed them to build an
incredible team and begin their first major task: to create what they call a 20 km by 20 km underwater listening and recording studio off the coast of Dominica. This setup would allow the AI not only to hear the whales, but also to understand who was speaking, to whom, and in what social situation.
This was the technological bridge they
needed to cross to go from listening to
truly understanding.
The two groups, the field scientists and the tech experts, worked together in a way that had never been done before. The biologists provided the deep knowledge and context, while the computer scientists brought the tools and the computing power to sort the data, and the two sides worked in close partnership. The hands-on research guided the technology, and the technology helped them uncover new discoveries.
This collaboration was Project CETI's greatest strength, allowing them to tackle the problem from all angles. What made them different from past efforts to understand animal language was how they combined knowledge from different fields and worked as one team. With this incredible team in place, they were ready to tackle the next big hurdle. How do you collect sound in the deep without asking a whale to wear a mic? The team had a bold answer to that question: the ocean itself, and the right technology. The main tool, the part of the plan that would seal it all together, would be the sensors.
The team designed and deployed arrays of hydrophones, a fancy word for underwater microphones, placed on the ocean floor. They do exactly what a normal microphone would do, but a million times better. They not only pick up the sounds of these creatures; they are listening stations arranged in a grid-like pattern, constantly recording everything, 24/7.
It works like a room with microphones on every wall, able to catch any sound at any time, even one as small as a pin dropping. The hydrophones do the same thing, just on the Project CETI team's much larger scale, and they can detect exactly where under the ocean any sound comes from. By using this network of hydrophones, they could pinpoint a whale's clicks and figure out which particular whale was making a specific sound. This was a game-changer for the expedition. It wasn't enough to just hear a whale; they also needed to know which whale was talking.
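For a rough sense of how a grid of hydrophones can pinpoint a sound, here is a minimal Python sketch of time-difference-of-arrival localization, the general acoustics idea such arrays rely on. The hydrophone positions, sound speed, and brute-force grid search are illustrative assumptions, not Project CETI's actual pipeline.

```python
import numpy as np

SOUND_SPEED = 1500.0  # rough speed of sound in seawater, m/s

# Four hypothetical hydrophones on the sea floor (x, y in metres)
hydrophones = np.array([[0.0, 0.0], [2000.0, 0.0],
                        [0.0, 2000.0], [2000.0, 2000.0]])

def arrival_times(source):
    """Travel time from a click at `source` to each hydrophone."""
    return np.linalg.norm(hydrophones - source, axis=1) / SOUND_SPEED

def localize(measured, step=25.0):
    """Brute-force grid search for the point whose predicted arrival-time
    differences best match the measured ones. Differences are taken
    relative to hydrophone 0, so the unknown emission time cancels out."""
    meas_delta = measured - measured[0]
    best, best_err = None, np.inf
    for x in np.arange(0.0, 2000.0 + step, step):
        for y in np.arange(0.0, 2000.0 + step, step):
            pred = arrival_times(np.array([x, y]))
            err = np.sum(((pred - pred[0]) - meas_delta) ** 2)
            if err < best_err:
                best, best_err = (x, y), err
    return best

true_source = np.array([700.0, 1200.0])
print(localize(arrival_times(true_source)))  # recovers ~(700, 1200)
```

The same arithmetic run in reverse is what lets a fixed grid of listening stations attribute each click to a position, and from there to an individual whale.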
To get an even clearer picture of the whales, the team had another brilliant idea: sending in a robotic fleet. Since it was impossible to get a whale to wear a mic, they could send out autonomous drones and robotic systems to get close to the whales. These robots were designed to place non-invasive suction-cup tags onto the whales' backs. The tags were packed with sensors that recorded not only audio, but also the whale's movement at any depth, its heart rate, and its social interactions with other whales.
This was another important piece of the data needed to understand what the whales' sounds meant. The hydrophone network could tell them what was said and by whom, while the tags provided the context: whether the whale was making a sound while hunting for food or while socializing with its family. This behavioral data gave the AI the vital clues it needed to understand the reasons behind the clicks.
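To picture how those two streams might be joined, here is a hedged sketch that pairs each detected click with the nearest-in-time tag reading from the same whale. The record types and every field name are hypothetical, not CETI's real schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ClickEvent:              # from the hydrophone array: who said what
    time_s: float
    whale_id: str
    coda_type: str

@dataclass
class TagReading:              # from a suction-cup tag: what the whale was doing
    time_s: float
    whale_id: str
    depth_m: float
    heart_rate_bpm: float
    behavior: str              # e.g. "foraging" or "socializing"

def nearest_context(click: ClickEvent, readings: list,
                    window_s: float = 30.0) -> Optional[TagReading]:
    """Attach the tag reading closest in time to a click, so every
    vocalization carries its behavioral context."""
    candidates = [r for r in readings
                  if r.whale_id == click.whale_id
                  and abs(r.time_s - click.time_s) <= window_s]
    if not candidates:
        return None
    return min(candidates, key=lambda r: abs(r.time_s - click.time_s))

# Usage: a click made during a deep dive gets its context attached
click = ClickEvent(time_s=105.0, whale_id="unit-F11", coda_type="1+1+3")
tags = [TagReading(100.0, "unit-F11", 820.0, 12.0, "foraging")]
print(nearest_context(click, tags).behavior)   # -> foraging
```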
All of this technology was about collecting the incredible amount of data their research needed. The contrast with the research of over 40 years ago is stark: while previous efforts might have collected a few hours or even a few days of whale sounds, Project CETI was collecting millions, even billions, of vocalizations.
In doing all this, they were building the largest library of whale sounds ever created. This monumental scale was essential because AI models, especially machine learning models, need huge data sets to find meaningful patterns. They learn by example, and the more examples they have, the smarter they get. This is why Project CETI was so different from all previous efforts, which had only tried to find a few key phrases; this one was centered on capturing a complete language. The data streamed in, a constant flow of clicks, whistles, and songs from the deep. But on its own, it was just noise: raw data without meaning, a vast ocean of information waiting to be decoded.
The microphones and sensors had listened
and the tags had recorded, but the
language remained a mystery. It was all
leading to one crucial step. The real
magic happened when this ocean of sound
was fed into the machine learning
models, transforming it from mere noise
into the building blocks of a language.
The AI breakthrough.
To the human ear, the deep ocean is only
a confusing jumble of clicks, grunts,
whistles, and waves. Trying to find a
particular pattern from all the noise is
nearly impossible. The first big challenge for the Project CETI team was to turn all the raw audio they had collected into something the AI could actually interpret. To do this, they used a process that turns sound into a picture, called a spectrogram. With it, they could take something as formless as raw underwater sound and turn it into a visual chart, letting the AI effectively see the whale sounds. The AI's first task was to sort
through these images and identify the codas, the specific sequences of clicks sperm whales use to communicate, while ignoring all the background noise. This initial process of turning an invisible sound into a visible pattern was the crucial first step.
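As a concrete illustration of that sound-to-picture step, here is a minimal Python sketch that builds a spectrogram from a synthetic click train using scipy. The signal is made up for the example; the real preprocessing is surely far more elaborate.

```python
import numpy as np
from scipy import signal

fs = 48_000                                   # sample rate, Hz
t = np.arange(0, 2.0, 1 / fs)
audio = 0.01 * np.random.randn(t.size)        # background ocean noise
for click_time in (0.30, 0.45, 0.60, 1.10):   # a toy four-click "coda"
    i = int(click_time * fs)
    audio[i:i + 200] += np.sin(2 * np.pi * 8000 * t[:200])

# Short-time Fourier transform: time on one axis, frequency on the
# other, brightness = energy. Clicks appear as sharp vertical streaks.
freqs, times, sxx = signal.spectrogram(audio, fs=fs, nperseg=1024)
print(sxx.shape)   # (frequency bins, time frames): an image a model can "see"
```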
This is where the power of machine learning came in, acting as a kind of modern-day Rosetta Stone. The team then went a step further, using advanced AI techniques including deep learning and neural networks. These were brought in because they are designed to learn the way humans do: by recognizing patterns in large amounts of information. In this case, the AI wasn't given a list of rules about how whales communicate. Instead, it was trained on millions of whale codas, the short bursts of clicks whales use to communicate, and by studying these, the system slowly started to find patterns on its own. Over time, it began to learn which sounds often appeared together, how they changed in different situations, and what they might mean, just as we learn a new language by listening and observing. Soon enough,
these models began to notice tiny differences in the clicks that were completely imperceptible to a human listener. They could detect subtle variations in rhythm, tempo, and other details that our brains simply couldn't process.
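One simple way to see the idea of pattern-finding without rules is to describe each coda by its inter-click intervals and let an unsupervised algorithm group similar rhythms on its own. The sketch below does this with toy data and basic k-means; CETI's actual models are deep networks trained on vastly more data.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Each row: the four inter-click intervals of a five-click coda (seconds).
# Two toy rhythm types: evenly spaced clicks vs. a "short-short-long" feel.
regular = rng.normal([0.20, 0.20, 0.20, 0.20], 0.01, size=(50, 4))
swung   = rng.normal([0.12, 0.12, 0.40, 0.40], 0.01, size=(50, 4))
codas = np.vstack([regular, swung])

# No rules are given: k-means simply groups codas with similar timing.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(codas)
print(labels[:5], labels[-5:])   # the two rhythm types separate cleanly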
Just like a toddler learning a language, the computer was doing the same thing, only much, much faster, and with a data set a single human could never analyze in a lifetime. The sheer scale of the data, millions of vocalizations gathered over many years, was what gave the AI the power to find these hidden patterns. Without this huge data set, the AI would have been as lost as a human biologist. This new technology and the AI's hard work led to the core discovery about the sperm whales' language. And as expected, it was even more complex than anyone had ever imagined.
These sounds weren't just a simple set of messages. After a massive collation of the data, it was discovered that whales use a complex system that works a bit like a phonetic alphabet. Of course, they don't have letters, but they do combine different variables, like rhythm, tempo, and what the scientists call ornamentation, extra little clicks at the beginning or end of a coda. The way these elements are combined creates an enormous number of unique words or messages. For example, a slow, steady rhythm might mean one thing, while speeding up that rhythm or adding an extra click at the end could completely change the meaning. This discovery shattered the idea that only humans could have a complex, structured linguistic system. It showed that whales, too, were using a system with many different parts that could be rearranged to create an intricate vocabulary.
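The power of such a system is combinatorial: a handful of independent features multiply into a large inventory of distinct coda types. A back-of-the-envelope sketch, with made-up feature counts chosen purely for illustration:

```python
from itertools import product

# Hypothetical feature inventories (counts are illustrative, not measured)
rhythms  = [f"rhythm_{i}" for i in range(18)]          # click-spacing patterns
tempos   = [f"tempo_{i}" for i in range(5)]            # overall duration classes
rubatos  = ["steady", "accelerating", "decelerating"]  # gradual timing drift
ornament = ["plain", "ornamented"]                     # extra click at the edge

coda_types = list(product(rhythms, tempos, rubatos, ornament))
print(len(coda_types))   # 18 * 5 * 3 * 2 = 540 distinct combinations
```

This is the same trick human speech uses: a small set of building blocks, recombined, yields a vocabulary far larger than the parts list.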
This was the moment everyone had been hoping for. For the first time, we had a window into the actual syntax of their communication. We could see the building blocks of their language, but we still didn't know what it all meant. What were they actually saying? This is where things got a lot more interesting.
Unveiling the whales' worldview. For a long time, scientists thought that if whales spoke a language, it would be simple and direct. In all their years of work, they had expected a basic set of commands, a survival language. But what the AI revealed was much more unexpected and interesting. The patterns the computer found weren't tied to simple actions, but to social context. It was the big reveal, and the moment everyone realized the old theories were wrong. Whales didn't sing only because they had to; they did so to communicate with each other in ways that were complex and sometimes personal, reflecting their closeness to one another and their social lives. It was
less of a simple code and more of a conversation. As the AI continued to analyze the data, it uncovered an even more astonishing level of detail. The team found that within the clicks there were subtle variations that acted much like vowels do in human language. They discovered that the whales use a sophisticated system of what's called rubato, a fine-grained change in rhythm: very subtle variations in the timing of clicks within a coda, the sequence of clicks that is the basic unit of communication for these creatures. The scientists also learned that a whale's coda might have a specific rhythm, and that a slight pause or a quickening of tempo could completely change its meaning. On top of that, the ornamentations, too, were important for conveying different messages.
It was as if the basic words were being
molded and shaped to express different
emotions or intentions. This showed a
level of linguistic sophistication that
truly challenged the idea that this kind
of communication was unique to humans.
The AI also showed that this language
was deeply tied to their social
identity. The research revealed that
different whale clans or families use
distinct dialects to identify
themselves. The same way people from
different parts of a country might have
different accents, these whale families
have their own unique way of speaking.
When the AI analyzed the clicks, it
could tell which family was talking just
by the subtle differences in their
speech patterns. This meant that the
language wasn't just a universal code
for all sperm whales. It was a unique
and dynamic living part of their culture
passed down through generations.
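If clans really do carry distinct "accents," then a classifier trained on coda features should be able to recover the clan from the sound alone. Here is a toy sketch of that idea on synthetic data; the feature profiles and clan labels are illustrative assumptions, not real measurements.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Synthetic inter-click-interval profiles for two hypothetical clans
clan_a = rng.normal([0.20, 0.20, 0.20, 0.40], 0.02, size=(200, 4))
clan_b = rng.normal([0.25, 0.15, 0.25, 0.35], 0.02, size=(200, 4))
X = np.vstack([clan_a, clan_b])
y = np.array([0] * 200 + [1] * 200)    # 0 = clan A, 1 = clan B

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print(f"clan identified from codas alone: {clf.score(X_te, y_te):.0%} accuracy")
```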
A single word could have multiple meanings depending on the rhythm, the context of the conversation, and the specific dialect it was spoken in. With the help of AI and other advanced technology, scientists have been able not just to identify the language of these whales, but to give humanity a window into a culture no one knew existed. Now that the communication patterns of different whales can be understood, new studies are underway to build a better picture of the social connections, identity, and shared history of these whales. This changes our goal from a simple translation of words to knowing the ins and outs of marine life and what it holds. But what could this new development mean for the future?
The goal of two-way communication. For decades, we've only been able to listen to the whales. Now, thanks to the AI, we finally have a way to understand the structure of their language. This new understanding opens up a thrilling and somewhat scary next phase of Project CETI: moving from listening to talking. The team is no longer content with just decoding; they're aiming to communicate. Their goal is to build a real-time communication system, what some people are calling an underwater chatbot. This isn't about teaching whales English; it's about using the patterns and structures the AI discovered to create messages in the whales' own language, with a scientist using a specially designed speaker to play a specific sequence of clicks and hoping for a response. The team is starting with simple interactions, like greeting a whale or asking a question about a particular object. It's an ambitious dream: a true two-way conversation between two very different species.
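In the simplest terms, such a system is a play-and-listen loop. The sketch below mocks that loop in Python; play_through_speaker and record_hydrophone are hypothetical stand-ins for real hardware, not any actual CETI interface.

```python
import time

def play_through_speaker(coda: str) -> None:    # hypothetical hardware call
    print(f"playing: {coda}")

def record_hydrophone(listen_s: float) -> str:  # hypothetical hardware call
    time.sleep(0.1)        # stand-in for actually listening
    return "1+1+3"         # pretend a whale echoed the coda back

def exchange(coda: str, listen_s: float = 30.0, max_turns: int = 3) -> int:
    """Play a coda, listen for a reply, and keep taking turns for as
    long as something matching the same coda comes back."""
    for turn in range(max_turns):
        play_through_speaker(coda)
        reply = record_hydrophone(listen_s)
        if reply != coda:
            return turn                 # no match: the exchange is over
        print(f"turn {turn + 1}: whale answered with {reply!r}")
    return max_turns

exchange("1+1+3")   # in this mock, three matched turns
```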
This audacious goal also brings up some deep, thought-provoking questions about the ethics of interspecies dialogue. Should we be talking to them? And what part might we play in disturbing the balance of marine life? When you start to build
a bridge between two worlds, you have to
think about what might cross that
bridge. From there, the question of whether it is right to introduce our technology and our language into their world starts to come up. Some scientists worry that a human-initiated dialogue could disrupt the whales' natural social structure or even change their language. It's a huge moral question that the Project CETI team takes very seriously. They're not just trying to achieve a technical goal; they're trying to do it in a way that respects the whales and their culture. They want the conversation to happen on the whales' terms, in their own language,
and in their own time. The team had a
specific recent success story that shows
this new future is already beginning. In 2023, the team had what they call the first intentional human-whale interaction using the language they had decoded. A diver with a special underwater speaker played a specific sperm whale coda, and the response was a huge moment for the project: a whale in the area answered with the exact same coda. It was a direct back-and-forth exchange, not just a random coincidence. The team couldn't believe it. This wasn't just a recording; this was a conversation. It was a single small exchange, but its significance was monumental.
It proved that their understanding of
the language was accurate enough to be
used for basic communication. This event
wasn't just a scientific breakthrough.
It was a positive sign that we might one
day be able to talk to other minds on
our planet. This realization changes
everything. We're on the verge of
something that was once only found in
science fiction. We're moving from
observing to participating, from
listening to interacting. Communicating
with another species is a monumental
step, but the implications extend far
beyond the whales themselves.
This technology could rewrite our place
in the universe. Implications for
science and humanity.
The work of Project CETI is forging a new frontier for biology. For a long time, marine biology and animal cognition were limited by our ability to observe and interpret. We could watch whales, but we couldn't understand what they were saying or thinking. Now, with AI as the main translator, there are unlimited opportunities waiting out there, with more to explore and uncover. It would no longer be just about studying their bodies or behaviors; uncovering the structure of their lives, the communities they form, and their intelligence, once a humanly impossible task, is now within reach. This research is completely changing how we see animal communication, revealing a level of complexity and social meaning that no one could have imagined. Some believe that many wonders of the world might now be explained, and that some of the biggest questions in science could finally have answers. The data from Project CETI can not only help us understand sperm whales, but also revolutionize our understanding of the very nature of intelligence itself.
Perhaps the most interesting and mind-bending implication of this research is its connection to space. The project's name, CETI, is a deliberate nod to SETI, and the similarities are undeniable. For decades, scientists have been scanning the cosmos for radio signals, hoping to find a sign of life. But what if they've been looking for the wrong kind of message? With the success of Project CETI, the whales give us a blueprint for extraterrestrial contact. The methods and algorithms the team developed to find patterns in clicks and decode whale language could be the exact tools we would use if we ever encountered an alien civilization. If an alien species communicates in a way that is completely foreign to us, we now have a proven strategy for how to approach it. The principles of using AI to find structure in a complex, unknown signal are universal.
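A taste of what "universal" means here: information-theoretic checks, like the Zipf-style rank-frequency analysis long applied to animal communication, work on any stream of symbols, whale codas or an unknown signal from space alike. A toy sketch on made-up data:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(2)
# A toy "message": 5,000 symbols drawn from a 20-symbol alphabet
# with a skewed, language-like frequency distribution.
weights = np.arange(20, 0, -1, dtype=float)
symbols = rng.choice(20, size=5000, p=weights / weights.sum())

counts = np.array(sorted(Counter(symbols.tolist()).values(), reverse=True),
                  dtype=float)
ranks = np.arange(1, counts.size + 1)

# Zipf's law: in human languages, log frequency falls roughly linearly
# with log rank (slope near -1). The test needs no knowledge of what the
# symbols mean, which is exactly the point for an unknown signal.
slope = np.polyfit(np.log(ranks), np.log(counts), 1)[0]
print(f"rank-frequency slope: {slope:.2f}")
```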
This work is not just about
communicating with whales, but also
about preparing for the possibility of
communicating with anyone anywhere in
the universe. It seems like a far-fetched thought, but a journey to the stars unlike any other may have begun with curiosity about the ocean.
Finally, the work of Project CETI brings us back to Earth in the most powerful way. A fuller understanding of whale language fosters a new level of empathy and conservation. For
centuries, whales have been seen as a
species to be harvested, then a species
to be saved, but always from a distance.
Once we know them as speakers of a language, we are forced to confront their intelligence and personhood in a way that is hard to ignore. When we hear
their communication, their concerns
about their families, their culture, and
their world, they are no longer just
animals in the ocean. They are beings
with thoughts, feelings, and a history.
This new connection strengthens the case
for their protection and the health of
our oceans. Knowing that these creatures
have a rich, complex social life makes
their survival all the more important.
It gives us a personal emotional reason
to care about the health of the marine
environment they call home. The discovery might seem complete, but it's only the beginning. The work of Project CETI gives us a new way to see the world and a new role to play in it. So, what's next for Project CETI?
The team at Project CETI isn't slowing down: there are already plans in place to expand their research to other whale species, like orcas, and to refine their AI with new algorithms. This effort is part of a global movement in which scientists are using AI to study animal communication, showing us a world of complex language we never knew existed. To round it up, their work is about more than just science. It's about building a bridge between our species and theirs, for a shared future on a shared planet. The whales have a lot to tell us about the health of the ocean, and we are finally learning how to listen.
So, what are your thoughts on this
groundbreaking research? Do you think
talking to whales is a huge step in the
right direction for humanity? Or do you
have reservations about what it might
mean? Let us know what you think in the
comments below. Don't forget to like,
subscribe, and hit the notification bell
on our channel so you never miss any of
our uploads. If you enjoyed this video,
click on the next video on your screen