Advancements in technology, particularly machine learning, are revolutionizing our ability to understand and potentially communicate with animals, moving the dream of interspecies dialogue from science fiction to a tangible possibility.
What are you going to say
to your favorite animal
when you can communicate with it?
This is one of the questions
that researchers all over
the world are tackling
in the fascinating world of animal
communication.
Today,
With advancing technology,
we're closer than ever before
to decoding the language
of some of the world's
most fascinating species.
This is how machine learning
is opening the window
into the world of animal
communication.
Hi, I’m Danielle Dufault
and you're watching Animalogic.
Today we're exploring
the near magical science
of animal language.
Talking to animals,
and more importantly, understanding
their reply is something
that has fascinated us
for pretty much forever.
Just think of all the ancient books
and religions that have stories
about people talking to animals
like geese, snakes and ravens.
But it's only in the past few years
that we've developed the tools
to listen to animals
in meaningful ways.
Today, with machine learning
unlocking new methods
of interpreting and analyzing
animal language, the dream of becoming
real-life Doctor Dolittles
seems to be not a matter of if,
but when,
and that when is
getting closer by the day.
There are ongoing
experiments to understand
the language of whales, dolphins,
dogs, prairie dogs,
elephants, and even cuttlefish.
And one day
we might be able to have casual chats
with some of those animals.
Questions like these really keep
feeding our interest
in nature and science.
We all love to learn
more about the world around us,
which is why I'm
super excited
to open this Curiosity Box.
The Curiosity Box is a subscription service
that sends you
awesome science-themed
toys and T-shirts.
Oh nice,
as well as books,
experiments, and collectibles.
It's a subscription box for thinkers.
Cool.
Every box is unique
and full of amazing items
you can't find anywhere else.
Let's see what I got.
These are two optical illusions
in one t-shirt.
The Bezold effect makes it look like
the letters are different colors,
but they aren't.
And the Stroop effect
makes your brain work extra
hard to name those colors.
It's confusing
in the most delightful way possible.
There's also this
helix-shaped jigsaw puzzle
which tells you
the history of life on planet Earth.
And let's build this in one second.
And check out these slide rule chopsticks!
They come with an inbuilt
mechanical calculator.
They'll make figuring out the tip
easier than ever.
There's also Puzzle Me Twice,
an exclusive book with 70 puzzles
that most people get wrong.
And of course, a prism.
Isaac Newton's
favorite rainbow maker.
No, really,
we owe so much of what we know
about light to this little tool.
These are all awesome,
and I'm already excited
to get my next box.
And if you love science,
I hope you get yours too.
Scan this QR code
or click
the link in the description
and use promo code Anima30
for 30% off
your first month
of your subscription,
and use code Bodydeck
to get this awesome
deck of cards
that gives you
a series of cross-sections
of the human body.
It's super fun to shuffle
and even more fun to study.
If you get a yearly subscription,
you get these denary dice.
You know how regular dice have
six faces?
Well, this set has dice
with an increasing number of faces
from 1 to 10.
I know I'll use them for role
playing games.
So scan the QR code
and use the promo code
to connect with science
and spark your curiosity.
Of course,
these are not the first studies
on animal communication.
From signing gorillas
to talking parrots
and button-pushing puppies,
we have thrown everything at the wall,
and nothing has stuck.
But in science, just as in life,
you often learn more from failure
than from success.
So let's take a quick look back
at some of the most interesting ways
we've tried and failed
to communicate
with our fellow Earthlings.
Some of the earliest studies
were done on our closest relatives,
the great apes.
This was the 1960s,
and instead of trying to understand
how animals
talk to each other,
scientists tried to teach them
how to speak like us.
This was the case
with Viki the Chimp.
Researchers tried to teach her English
by raising her as a human
and giving her speech therapy.
But after three years,
they realized that chimpanzees,
quite simply, can't make
the noises we make when we speak.
It's kind of like us
trying to communicate with dolphins
by squeaking like them.
Wait, has anyone tried that?
If there's any dolphins in the
audience, I just want to say:
Since chatting with apes
was out of the question,
the strategy changed to teaching them
sign language.
Washoe,
Nim Chimpsky,
and Koko the Gorilla
became simian celebrities
for learning signs.
Koko was said to know
over 1000 signs,
but the signs didn't have grammar.
And there were questions
about how much she knew
and how much her trainers
projected onto her body language
and signs.
There have been other experiments
with different degrees of success,
but in general, we can say
that apes can learn signs,
but they don't create signs.
They don't have an organized way
of combining signs
to say different things,
and they almost never ask questions.
So it's always a one-sided
form of communication.
Talk about a bad date.
The next subject would have to be
an animal that communicated vocally
and in more complex ways than apes.
Dolphins.
In 1964, scientists
partially flooded a house
with ocean water.
The idea was for researchers
to move in with a dolphin
and teach them English.
But what started out as a wacky
sitcom premise
quickly turned into a horror story.
When Peter the Dolphin
became aroused and disruptive
during English class,
the scientist would handle it
to relieve the dolphin
and get him to cooperate.
And in the end, after
the well deserved media backlash,
very little science was done
and the dolphin didn't
learn English.
But a few years later,
one of the most important studies
in animal communication was done
also on dolphins.
You don't hear about it much
because it isn't as saucy
as the dolphin house.
But thanks to this study,
the foundations of the field
of animal communication got laid,
and thankfully,
the dolphins didn't.
The researcher
was Louis Herman.
He wanted to speak to dolphins
in a language
that they would understand.
Instead of English words,
he created electronic sounds
the dolphins could easily tell apart.
Then he taught the dolphins
what the sounds meant.
The genius of it was
keeping it simple.
Instead of focusing on how many words
a dolphin could learn,
he focused on what it could do
with a small vocabulary
of about 15 words.
What he found
is that dolphins have syntax.
The order of words makes a difference.
For example, human-ball-fetch
means something different than
fetch-ball-human.
It seems super simple to us,
but it's hard as hell for animals
to understand.
Those studies on dolphins
sparked interest in studying
animal communication
on its own terms.
Hundreds of studies
have been done since then.
Researchers learned over the decades
that prairie dogs' calls
convey specific information,
like the size and color of a predator.
Vervet monkeys have different
sounds to alert their friends
of different predators too.
Elephants and whales use low frequency
signals to communicate
with other members of their group
over vast distances.
And bees
shake their booty in particular ways
to tell each other about the direction
and distance of flowers.
So we know
animals communicate.
The question now is
can we learn to communicate with them?
Could I have told prairie dogs
that I was a friend?
It all sounds like a
massive cacophony to us,
but there's a lot of
communication happening.
We just can't understand it
because we don't have the hardware
or the software to do it.
So here's where new evolving
technologies come in.
We can now use
neural networks to go through
massive amounts of data
and find patterns.
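To make that concrete, here's a minimal sketch of the pattern-finding idea: summarize each recorded clip with simple acoustic features and cluster them to surface recurring call types. The folder name, the feature choice (MFCCs), and k-means are illustrative assumptions, not any one project's actual pipeline.

```python
# A minimal sketch: cluster animal-sound clips by simple spectral features,
# so recurring clusters hint at distinct call "types". MFCCs and k-means
# are illustrative stand-ins for the richer models researchers really use.
from pathlib import Path

import librosa
import numpy as np
from sklearn.cluster import KMeans

def embed_clip(path, sr=22050, n_mfcc=20):
    """Load one clip and summarize it as its mean MFCC vector."""
    y, _ = librosa.load(path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)

clips = sorted(Path("recordings").glob("*.wav"))  # hypothetical folder
X = np.stack([embed_clip(p) for p in clips])

labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(X)
for path, label in zip(clips, labels):
    print(label, path.name)  # clips sharing a label sound alike
```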
It sounds good in theory,
but in practice
we have the huge issue
that we still don't
have that much data.
So there are new organizations
trying to compile that data
and run it through
specialized programs.
One of these groups
is Project CETI,
a scientific
and conservation organization
of over 50 scientists
all over the world.
Their goal is to understand
the rapid fire bursts of clicks
that sperm whales use to communicate.
These bursts are called codas,
and they seem to form whole sentences
and even show regional accents.
So for the past few years,
they've been collecting thousands
of hours of sperm whale audio
to get some insights
into their language.
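For a feel of what those insights can start from, here's a toy sketch: a coda is just the timing of its clicks, so one simple representation used in the coda literature is the vector of inter-click intervals normalized by the coda's duration. The click times below are made up.

```python
# A coda is a rhythm of clicks, so one simple feature is the set of
# inter-click intervals (ICIs), normalized so overall tempo doesn't matter.
import numpy as np

def coda_features(click_times):
    """Turn click timestamps (in seconds) into duration-normalized ICIs."""
    t = np.asarray(click_times, dtype=float)
    icis = np.diff(t)              # gaps between successive clicks
    return icis / (t[-1] - t[0])   # same rhythm -> same features

# Two hypothetical codas: the same rhythm at two different tempos
print(coda_features([0.00, 0.20, 0.40, 0.45, 0.50]))  # [0.4 0.4 0.1 0.1]
print(coda_features([0.00, 0.40, 0.80, 0.90, 1.00]))  # [0.4 0.4 0.1 0.1]
```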
It's such an ambitious project
that we asked the team leading
those efforts to tell us how it works.
These are the scientists
who might figure out
how to talk to whales.
So we couldn't be more excited
to have them on the show.
Hi, I'm Gasper Begus.
I'm a linguistics
lead at Project Ceti,
and I'm an associate professor
at UC Berkeley.
I'm David Gruber.
I'm a marine biologist
and a distinguished professor
of biology
at the City University of New York.
So as a linguist,
I like to study human languages.
But, in the past couple of
years, linguistics
has been really opening up,
and we realize
we have so many tools to offer.
And in my opinion,
sperm whales are, for many reasons,
one of the best species
to start looking into.
They're incredibly smart.
They have very complex
social structures,
social relationships,
like, we have grandmas
swimming with other whales.
They take care
of each other's offspring,
and they exchange dialogues.
These clicks,
on the surface,
look really alien,
but if you look into them deeply,
you can see
a lot of linguistic traits.
And that's how I've been
trying to open up
my linguistics work
from humans to non-humans.
And it has been really exciting.
This isn't a project
you can do
by sticking a microphone in the water
and then running it
through an advanced machine
learning system.
Every step of the process involves
a great deal of problem
solving and collaboration.
When the project started,
putting together
an interdisciplinary team
was really important,
because I don't have all the answers.
I noticed that I needed someone
to make the cameras,
someone to make the robots,
and someone to do the analysis.
So for this project,
I think one of the key things
was working with linguists,
working with people like Gasper,
and there have been
very few projects
where marine biologists
and linguists were working together
up until this project.
So this has been a real joy and
learning experience for both of us.
The first step of their work
was collecting all the data
with as much detail as possible,
and often in multiple ways.
It's great to record
the codas of a whale,
but it's even better
if at the same time
you have the footage of what it's
doing when making the calls
or heart sensors
to see what the whale is feeling
when making those calls.
But that leaves you
with a massive amount of data
that's impossible for human brains
to analyze.
As a human, you have a lot of biases,
and when you're faced
with something like sperm whales,
it's really fascinating,
because you have to step out of
those biases and really try
to find things
that are meaningful
to this other species,
which,
you know, has a very different life.
And so you have to get out
of your human biases.
In a sense, we use advanced machine
learning to help us with that,
but machines are just a tool here.
Machine learning
is just a tool in that approach,
where you can use this other tool
to help you understand, to help you
gain insights.
Advanced machine
learning is also better at seeing
patterns in different scales of time.
It's kind of strange
to think about it,
but different animals might experience
the passage of time
differently than we do.
It's like if a different species
was studying us,
but we sounded like this to them.
So by looking
more deeply
into the timing of the codas,
the team is finding extra details
that show a rich language.
Zoom in on those clicks,
and you see way richer structure
than we knew
when we started doing this.
And I think that's really important,
because we're showing
that this creature, this
species,
not only has complex lives
and complex social relationships
and is really interesting,
it also has a communication system
that can carry a ton of information.
And we don't know yet exactly where
that information is.
But there is a potential
that I've basically never seen
in any other species,
and I've worked with other animals
before.
That's fascinating.
And one of the best things
is that it's only one of the many
ongoing projects.
One of the ones closest to my heart
is Zoolingua’s prairie dog project.
Con Slobodchikoff,
from Northern
Arizona University, and his team
recorded the vocalizations of Gunnison's
prairie dogs.
He observed that
the prairie dogs made different alarm calls
for different animals.
So he ran the recordings
through speech recognition software
to see what the calls meant.
What they found
was that on top of specific
calls for their main predators,
hawks, coyotes,
and humans,
the prairie dogs also had descriptive adjectives,
so they could communicate information
like "big black hawk."
When there was a recurrent predator,
like a particularly crafty coyote,
a word for it would emerge.
I wonder what they called me
when I was trying to film them.
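As a rough illustration of the supervised side of that idea, here's a sketch: given alarm calls labeled by predator, train a classifier on simple spectral features and check whether it beats chance. The file layout, features, and model are assumptions for illustration, not Slobodchikoff's actual software.

```python
# Sketch: can a classifier recover the predator label from an alarm call?
# Accuracy well above chance would suggest the calls encode the predator.
import librosa
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def features(path):
    """Summarize a call as the mean and spread of its MFCCs."""
    y, sr = librosa.load(path, sr=22050)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

def evaluate(dataset):
    """dataset: list of (wav_path, label) pairs, e.g. ('hawk_01.wav', 'hawk')."""
    X = np.stack([features(path) for path, _ in dataset])
    y = [label for _, label in dataset]
    model = RandomForestClassifier(random_state=0)
    return cross_val_score(model, X, y, cv=5).mean()

# With a real labeled collection (many calls per predator type):
# print(evaluate(my_labeled_calls))
```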
One of the cool things
about this technology is that
it can be applied
to different species.
Zoolingua, for example,
is applying their findings
from prairie dogs to lab dogs.
Dogs are our closest friends.
They evolved from wolves
to be our companions.
They can eat our food,
and a big part of their success
is their ability
to communicate with us.
But most of this communication happens
through body language.
So their mission is to take
the combination of barks
and other sounds,
together with dogs'
facial expressions and actions,
and figure out what they mean.
This makes sense
because few animals communicate
with their version of words.
If you've ever had a dog,
you know that a look can often
tell you more than 100 barks.
Imagine being able
to talk to your pup.
I'd love to talk to my cat, Nebs.
She's an extremely talkative cat
and she makes a whole range of sounds
that all mean
something pretty different.
So maybe this would help me
be able to talk in turn with her,
other than meowing back
and forth forever.
Hey, cutie.
Meowing at
me from the top of the stairs.
Yeah.
You like being up top,
don't you?
Programs like the Earth Species Project
are trying to make
that happen.
Earth Species Project is a nonprofit
trying to decode animal language.
They created a program called
NatureLM-audio and trained it
with over 25 million
animal sounds.
The program is still in development,
but the early results
have been positive.
And as of 2025,
it's being tested on a wide
range of animals like crows,
finches and belugas.
Back to cetaceans.
Dolphins are always one
of the main subjects of study
when it comes to animal communication
and intelligence.
And there's a lot of amazing research
about how they communicate
with each other and how it can help us
in conservation.
Just like whales, dolphins
communicate vocally,
and to understand
what they're saying to each other,
we need a lot of data.
Wild dolphins,
as you can
imagine, are hard to follow.
But some researchers
are using the latest technology
to get their clicks on tape
and figure out their meaning.
We asked one of the researchers
who's trying to get a bit closer
to understanding dolphins,
how they're doing it.
My name is Laela Sayigh
and I work at the Woods Hole
Oceanographic Institution
in Massachusetts in the USA.
We can record them
with these contact hydrophones
that we actually put
right on their melon.
And then we also put digital
recording tags on their back
before we release them.
These are all noninvasive
technologies.
And the tags,
if they're bothering them,
they can just jump up a few times
and knock them off if they want to.
Both of those recording
methods
give us some really high-quality
data from known individual dolphins,
which is kind of the holy grail
for studying dolphin communication.
Those technologies
are always evolving.
The next generation of recorders
has sensors that can detect depth
and record the dolphins' movements,
so we can see what types of whistles
they do
when they're doing
specific activities.
Dolphins have signature whistles,
which are kind of like their names.
When a dolphin does
its signature whistle,
it's basically announcing:
Flipper is here!
But they also have shared
whistles that they use
to communicate with each other.
We call those calls
non-signature whistles.
The issue is that to
start understanding them
you have to first know
all the signature whistles
in an area
to be able to tell them apart.
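One generic way to build such a catalog, sketched below, is to extract each whistle's pitch contour and compare contours with dynamic time warping, which tolerates whistles of different durations. This is a classic technique in the literature, not necessarily this team's exact pipeline, and the frequency range and file names are assumptions.

```python
# Sketch: compare signature whistles by their frequency contours using
# dynamic time warping (DTW), so contours of different lengths can match.
import librosa
import numpy as np

def pitch_contour(path, fmin=2000, fmax=20000, sr=48000):
    """Extract a log-frequency pitch track for one whistle recording."""
    y, sr = librosa.load(path, sr=sr)
    f0 = librosa.yin(y, fmin=fmin, fmax=fmax, sr=sr)
    return np.log(f0)

def whistle_distance(path_a, path_b):
    a, b = pitch_contour(path_a), pitch_contour(path_b)
    # DTW aligns the two contours in time before summing differences.
    D, _ = librosa.sequence.dtw(X=a.reshape(1, -1), Y=b.reshape(1, -1))
    return D[-1, -1] / (D.shape[0] + D.shape[1])  # length-normalized cost

# Low distances should group recordings of the same dolphin's signature:
# print(whistle_distance("whistles/a.wav", "whistles/b.wav"))
```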
So what have we learned
so far about the
non signature whistles.
So there's two different types
that we have done
some playback experiments with.
We decided to focus on these two
that we have kind of
the most data for.
And so we've done
some experiments
that are suggestive
at least of one of these types
maybe being some kind of alarm
call or danger type signal,
because we've seen most animals
avoiding the stimulus
when they hear it.
And then we have another one
that's a little bit more complicated,
because that one,
in playbacks,
is a little trickier.
We've seen animals
making that type of whistle
in a situation where they seem
to be kind of surprised.
So they are hearing
something unexpected
and they make that whistle.
So we have kind of speculated
that it's sort of like a "what is that,
what am I hearing" kind of thing.
We jokingly referred to it
as the WTF whistle
on the boat.
That's right.
Dolphins seem to have a whistle
for: RUUUN!
or I guess, SWIIM!
And another one for: what the heck?
And the more we research
them, the more types of calls
we'll find.
And finally,
we could never forget
about the elephants.
These majestic beauties
communicate with very low rumblings
well below what we can hear.
To us it's just vibration.
But to elephants it's
a whole conversation.
The Elephant Listening Project
compiled over
300,000 hours of
elephant rumbles,
which are low frequency calls
that our ears can't hear.
One of their researchers,
Doctor Michael Pardo,
wanted to figure out
if elephants had names.
They knew
elephants were more reactive
after hearing certain specific rumbles,
which they hypothesized
were their names.
Based on that,
they compiled close to 500 calls
from over 100 elephants.
They knew who had made the call
and who seemed to be the intended
recipient of the call.
Then they ran the
rumblings through a machine
learning model.
It successfully predicted
who was being called about
25% of the time.
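That 25% only means something relative to chance, so the analysis needs a baseline. Here's a hedged sketch of one standard way to get one: train the same model on shuffled receiver labels and compare. The features and model are stand-ins, not Dr. Pardo's actual code.

```python
# Sketch: is the model's accuracy better than chance? Compare it against
# the same model trained on shuffled (meaningless) receiver labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def mean_accuracy(X, y):
    """Cross-validated accuracy of a rumble -> receiver classifier."""
    model = RandomForestClassifier(random_state=0)
    return cross_val_score(model, X, np.asarray(y), cv=5).mean()

def chance_level(X, y, n_shuffles=20):
    """Accuracy the same model reaches when labels are randomized."""
    return float(np.mean(
        [mean_accuracy(X, rng.permutation(y)) for _ in range(n_shuffles)]
    ))

# With real acoustic features X and receiver IDs y:
# real, chance = mean_accuracy(X, y), chance_level(X, y)
# "25% correct" matters when chance is much lower (1 / many receivers).
```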
Finally, they played the calls
they had confirmed to be an elephant's
name,
and the elephants who were called
responded strongly and vocally
to hearing those calls.
So basically it went like this.
Victor!
And Victor the elephant replied,
What?
You can learn about these elephants
by checking out Doctor Pardo's
YouTube channel,
linked in the description.
Okay.
Using technology to understand animals
looks very promising and exciting.
I'm sure there will be amazing
discoveries within the next few years.
But what about the animals
that don't communicate vocally?
Can we use technology to understand
animal body language and visual cues?
I mean, take us for example.
We communicate with our bodies
all the time.
Cuttlefish are one of the last animals
you would think about
when making a list of chatty animals.
But if you look closely,
you can see them signaling each other
and changing colors.
We think they're communicating,
but we don't know what they're saying.
It's still such a poorly
understood system
that we went straight
to the leading expert
in cuttlefish sign language
to ask how they're solving
this problem.
So I'm Sophie Cohen-Bodenas.
Currently I'm
a researcher in sensory
neuroscience at Washington
University in Saint Louis.
So what is really fascinating
about cuttlefish is that
not only do they use this ability
of doing dynamic skin patterning
for camouflaging,
but they also do it for communicating,
which means that it's versatile.
So sometimes they will do, like,
a white patterning to
blend in,
but also sometimes they will do black
spots on the skin to communicate.
Doctor Cohen-Bodenas
noticed that the cuttlefish
were moving their arms
in particular ways.
They weren't random moves.
They seemed to be saying something.
So she started recording them
and eventually
classified four recurrent movements.
The next part of the research
was seeing
what happened
when you played the signals back
to the cuttlefish.
We wanted to see,
putting them, like,
right in front of the screen,
how they would react
when we presented the signs.
And what was really fascinating
is that they were very curious about it.
They would come by themselves
in front of the screen
and they would display the signs.
And what was also very interesting
is that they didn't just
display the sign as mimicking,
like, for example,
seeing "up" and doing "up" back or something
like that, which would have been very interesting,
because we would have said, oh,
it might be mimicking signals.
But it was a bit different,
because when they would see a sign,
they would respond with a different type of sign.
But the ocean is dark,
and marine animals often
have to rely on their sense of touch
or mechanoreception
instead of their eyes.
So what would happen
if two cuttlefish
wanted to communicate at night,
or in murky water,
or simply when they couldn't
see each other?
So what we did
is that we put in a hydrophone,
which is just like a microphone,
but for underwater, to record the trace
emitted by the arm wave,
and from that we could make a spectrogram.
And then we did
the playback experiment,
which means that we played
that trace back to the cuttlefish.
And why this experiment is interesting
is that it cancels the visual aspect,
because you just display,
like, the vibration in the water
without them seeing one another.
And it was really interesting
for us too, because
the cuttlefish
indeed got really interested.
They would tend to come
to this hydrophone
to display the signs.
We interpret this first result
as preliminary proof
that they could also perceive it via
mechanoreception.
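The recording step she describes maps onto a few lines of standard signal processing: digitize the hydrophone trace and turn it into a spectrogram you can inspect and play back. The sample rate and the synthetic stand-in signal below are assumptions.

```python
# Sketch: turn a hydrophone trace into a spectrogram. A real analysis
# would load the recorded trace; here a decaying tone stands in for it.
import numpy as np
from scipy.signal import spectrogram

fs = 48000                                   # assumed sample rate (Hz)
t = np.arange(fs) / fs                       # one second of samples
trace = np.sin(2 * np.pi * 300 * t) * np.exp(-3 * t)  # stand-in "arm wave"

f, seg_times, Sxx = spectrogram(trace, fs=fs, nperseg=1024)
log_power = 10 * np.log10(Sxx + 1e-12)       # dB scale reveals faint energy
print(f.shape, seg_times.shape, log_power.shape)
```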
All of this data proves
that communication is happening,
but we don't know what the signs mean.
This is where technology comes in.
It's really, like,
complicated for a human to say
exactly in which
behavioral context
which sign means what.
So the idea is to have
a colony of cuttlefish,
let them display the signs
along with the coloring patterns
in many different behavioral contexts,
collect a lot of data,
and then have an algorithm
that will find the structure for us.
In order to start to
dream,
we'd have to have an algorithm
that is going to say:
okay, so
two black spots, plus orange,
plus "roll," is most often correlated with,
I say just as an example, eating.
And a zebra
pattern plus
a "crown" plus a white square
is more often
related to an aversive display.
And this is something that, for
a human, is very hard to decipher.
But a powerful algorithm would possibly
be able to do this for us.
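A first, very simple version of that algorithm is just a contingency table plus a statistical test: count how often each sign combination co-occurs with each behavioral context, then ask whether the association beats chance. The labels below are invented placeholders echoing her example.

```python
# Sketch: do sign combinations track behavioral contexts more than chance?
# Build a signs-by-context table and run a chi-squared test on it.
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical annotations: one row per observed display.
observations = pd.DataFrame({
    "signs":   ["black_spots+orange+roll", "zebra+crown+white_square",
                "black_spots+orange+roll", "zebra+crown+white_square"],
    "context": ["eating", "aversive", "eating", "aversive"],
})

table = pd.crosstab(observations["signs"], observations["context"])
chi2, p_value, dof, _ = chi2_contingency(table)
print(table)
print(f"chi2={chi2:.2f}, p={p_value:.3f}")  # small p: signs predict context
```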
Super rad.
It's like getting an insight
into the minds of aliens,
which cuttlefish,
despite some internet
rumors, are 100% not.
Although they kind of look like
they could be from Pandora.
It's hard not to be optimistic
about all of this research,
but before we start thinking
about full conversations
with the animals in our fish bowls,
there's still a lot of work
to be done.
I think there is a misconception
that people think you just throw
AI at any animal,
and out comes some translation.
I think, because it's working
so well on humans,
people are able to take that leap
in their head.
But now that we're really deep
in this project,
we see that that is not the case.
We can't just take ChatGPT
and throw it at whales,
and out comes a translation.
It's a significant climb
that we're undertaking.
In all of these cases,
machine learning is more of a tool
than a universal translator,
and trained professionals are still
the most important
part of the equation.
I think that it's really important
to actually
watch the animals and document
sort of how they respond to sounds
and how they're using sounds.
I think that just
putting a lot of sounds
into some kind of AI
and having it discern
whether there are sort of patterns
or things like that, could be valuable
in potentially
helping to generate hypotheses
that we could test.
And yes, through that combination,
we will learn a lot
about animal communication
with the help
of advanced machine learning.
But there's something we always have
to keep in mind.
We have to do it with
the welfare of the animals in mind.
I think the key thing
is always keeping the whale at the center
and asking that question:
is this study,
is this finding, in service
of the whale?
And just keeping that
at the core heart of the project.
If we do that
and start to learn more about
the inner lives of whales,
dolphins, cuttlefish
and other animals,
we might have to change
not only how we see ourselves
in relation to the natural world,
but also how we apply our laws.
If we're finding
all these things about
animals and language, then in a sense
we're crossing this last frontier
that the legal system
has used as kind of a barrier
to keep rights only for humans.
So if we're finding a species
that can communicate with codas
that are so informative,
so powerful, right?
That can transmit so much information.
I think we need to pause
and ask ourselves, well, you know,
what does that mean
for the legal system?
In other words, there might be a point
where our understanding of animals
becomes so deep
that we might give them some rights
that are currently seen
as exclusively human rights.
But if the animals are
so smart and their consciousness
is so deep and meaningful,
why wouldn't they deserve
to have some rights?
As technology advances,
it's a question
we might need to answer
sooner than we think.
You know, I'm a linguist.
I started with linguistics,
with old languages.
I studied really old languages.
And those languages allow
you to time travel because, you know,
you have access to these people
who lived 3000 years ago.
And listening to the whales is like
traveling in different
intelligences, right?
You have this amazing species.
You can imagine their lives.
You know, when they sleep,
they float, like, vertically, heads up.
And you're just, like,
imagining,
this beautiful intelligence.
And you realize,
you know, maybe
our position as humanity
is not as high
as we used to think.
I think it keeps you grounded.
It keeps you closer to it,
and it can help you really reimagine,
maybe what
the position of humanity is.
So, yes, technology can help us
learn about animal communication
and support conservation projects,
but we have to keep the welfare
of animals first
and not prioritize profits.
We also have to make sure
that we listen to them
and really try to understand them,
not just hear what
we want them to say.
Many of the things animals
will say will require us
to adjust our behavior massively,
from our shopping habits
to our eating preferences,
to even where we choose to live.
So here's hoping that
when animals talk to us,
their calls don't fall on deaf ears.
So what should we talk about next?
Please let me know in the comments
and don't forget to subscribe
for new episodes every week.
Thanks for watching.
See ya and take care.