0:00 What are you going to say
0:01 to your favorite animal
0:02 when you can communicate with it?
0:05 This is one of the questions
0:07 that researchers all over
0:08 the world are tackling
0:10 in the fascinating world of animal
0:12 communication.
0:14 Today,
0:15 with advancing technology,
0:17 we're closer than ever before
0:19 to decoding the language
0:20 of some of the world's
0:21 most fascinating species.
0:24 This is how machine learning
0:26 is opening the window
0:27 into the world of animal
0:29 communication.
0:37 Hi, I’m Danielle Dufault
0:39 and you're watching Animalogic.
0:41 Today we're exploring
0:42 the near magical science
0:43 of animal language.
0:45 Talking to animals,
0:46 and more importantly, understanding
0:49 their reply is something
0:50 that has fascinated us
0:52 for pretty much ever.
0:54 Just think of all the ancient books
0:56 and religions that have stories
0:58 about people talking to animals
1:00 like geese, snakes and ravens.
1:03 But it's only in the past few years
1:05 that we've developed the tools
1:07 to listen to animals
1:08 in meaningful ways.
1:10 Today, with machine learning
1:12 unlocking new methods
1:13 of interpreting and analyzing
1:15 animal language, the dream of becoming
1:18 real life Doctor Doolittles
1:20 seems to be not a matter of if,
1:22 but when,
1:24 and that when is
1:25 getting closer by the day.
1:28 There are ongoing
1:29 experiments to understand
1:30 the language of whales, dolphins,
1:33 dogs, prairie dogs,
1:36 elephants, and even cuttlefish.
1:39 And one day
1:40 we might be able to have casual chats
1:42 with some of those animals.
1:44 It's questions like that that really keep
1:46 feeding our interest
1:48 in nature and science.
1:50 We all love to learn
1:51 more about the world around us,
1:53 which is why I'm
1:54 super excited
1:55 to open this Curiosity Box.
1:58 The Curiosity Box is a subscription service
2:00 that sends
2:00 you awesome science
2:02 themed toys and T-shirts.
2:03 Oh nice,
2:04 as well as books,
2:05 experiments, and collectibles.
2:08 It's a subscription box for thinkers.
2:10 Cool.
2:11 Every box is unique
2:13 and full of amazing items
2:14 you can't find anywhere else.
2:17 Let's see what I got.
2:18 These are two optical illusions
2:20 in one t-shirt.
2:22 The Bezold effect makes it look like
2:23 the letters are different colors,
2:25 but they aren't.
2:26 And the Stroop effect
2:27 makes your brain work extra
2:29 hard to name those colors.
2:30 It's confusing
2:31 in the most delightful way possible.
2:34 There's also this
2:35 helix-shaped jigsaw puzzle
2:37 which tells you
2:37 the history of life on planet Earth.
2:40 And let's build this in one second.
2:43 And check out these slide rule chopsticks!
2:45 They come with an inbuilt
2:47 mechanical calculator.
2:48 They'll make figuring out the tip
2:50 easier than ever.
2:51 The Puzzle
2:52 Me Twice book,
2:53 an exclusive book with 70 puzzles
2:56 most people usually get wrong.
2:58 And of course, a prism.
3:00 Isaac Newton's
3:01 favorite rainbow maker.
3:03 No, really,
3:04 we owe so much of what we know
3:05 about light to this little tool.
3:08 These are all awesome,
3:09 and I'm already excited
3:10 to get my next box.
3:12 And if you love science,
3:13 I hope you get yours too.
3:15 Scan this QR code
3:16 or click
3:17 the link in the description
3:18 and use promo code Anima30
3:20 for 30% off
3:21 your first month
3:23 of your subscription,
3:24 and use Code Bodydeck
3:26 to get this awesome
3:27 deck of cards
3:28 that gives you
3:28 a series of cross-sections
3:30 in the human body.
3:31 It's super fun to shuffle
3:33 and even more fun to study.
3:35 If you get a yearly subscription,
3:37 you get these denary dice.
3:38 You know how regular dice have
3:40 six faces?
3:41 Well, this set has dice
3:42 with an increasing number of faces
3:44 from 1 to 10.
3:46 I know I'll use them for role
3:47 playing games.
3:48 So scan the QR code
3:50 and use the promo code
3:51 to connect with science
3:52 and spark your curiosity.
3:55 Of course,
3:56 these are not the first studies
3:57 on animal communication.
3:59 From signing gorillas
4:00 to talking parrots
4:02 and button-pushing puppies,
4:04 we have thrown everything at the wall,
4:06 and nothing has stuck.
4:07 But in science, just as in life,
4:10 you often learn more from failure
4:12 than from success.
4:13 So let's take a quick look back
4:15 at some of the most interesting ways
4:17 we've tried and failed
4:19 to communicate
4:19 with our fellow Earthlings.
4:22 Some of the earliest studies
4:23 were done on our closest relatives,
4:25 the great apes.
4:27 This was the 1960s,
4:29 and instead of trying to understand
4:31 how animals
4:32 talk to each other,
4:34 scientists tried to teach them
4:35 how to speak like us.
4:37 This was the case
4:38 with Viki the Chimp.
4:40 Researchers tried to teach her English
4:42 by raising her as a human
4:44 and giving her speech therapy.
4:46 But after three years,
4:47 they realized that chimpanzees,
4:49 quite simply, can't make
4:51 the noises we make when we speak.
4:54 It's kind of like us
4:55 trying to communicate with dolphins
4:56 by squeaking like them.
4:58 Wait, has anyone tried that?
5:00 If there's any dolphins in the
5:01 audience, I just want to say:
5:14 Since chatting with apes
5:15 was out of the question,
5:17 the strategy changed to teaching them
5:19 sign language.
5:21 Washoe,
5:22 Nim Chimpsky,
5:23 and Koko the Gorilla
5:25 became simian celebrities
5:26 for learning signs.
5:28 Koko was said to know
5:30 over 1000 signs,
5:32 but the signs didn't have grammar.
5:34 And there were questions
5:35 about how much she knew
5:37 and how much her trainers
5:38 projected into her body
5:40 language and signs.
5:42 There's been other experiments
5:44 with different degrees of success,
5:46 but in general, we can say
5:48 that apes can learn signs,
5:50 but they don't create signs.
5:52 They don't have an organized way
5:54 of combining signs
5:55 to say different things,
5:57 and they almost never ask questions.
6:01 So it's always a one-sided
6:02 level of communication.
6:04 Talk about a bad date.
6:06 The next subject would have to be
6:08 an animal that communicated vocally
6:10 and in more complex ways than apes.
6:13 Dolphins.
6:14 In 1964, scientists
6:17 partially flooded a house
6:18 with ocean water.
6:20 The idea was for researchers
6:22 to move in with a dolphin
6:23 and teach them English.
6:25 But what started out as a wacky
6:27 sitcom premise
6:28 quickly turned into a horror story.
6:32 When Peter the Dolphin
6:33 became aroused and disruptive
6:34 during English class,
6:36 the scientist would handle it
6:39 to relieve the dolphin
6:41 and get him to cooperate.
6:43 And in the end, after
6:44 the well deserved media backlash,
6:47 very little science was done
6:49 and the dolphin didn't
6:50 learn English.
6:52 But a few years later,
6:54 one of the most important studies
6:56 in animal communication was done
6:58 also on dolphins.
7:00 You don't hear about it much
7:01 because it isn't as saucy
7:03 as the dolphin house.
7:05 But thanks to this study
7:06 in the field of animal communication,
7:09 the foundations got laid
7:11 and thankfully
7:12 dolphins didn't.
7:14 The researcher
7:15 was Louis Herman.
7:17 He wanted to speak to dolphins
7:19 in a language
7:19 that they would understand.
7:21 Instead of English words,
7:23 he created electric sounds
7:25 the dolphins could easily tell apart.
7:28 Then he taught the dolphins
7:29 what the sounds meant.
7:31 The genius of it was
7:33 keeping it simple.
7:34 Instead of focusing on how many words
7:37 a dolphin could learn,
7:38 he focused on what it could do
7:40 with a small vocabulary
7:42 of about 15 words.
7:44 What he found
7:45 is that dolphins have syntax.
7:48 The order of words makes a difference.
7:50 For example, human-ball-fetch
7:53 means something different than
7:55 fetch-ball-human.
7:57 It seems super simple to us,
7:58 but it's hard as hell for animals
8:00 to understand.
8:02 Those studies on dolphins
8:03 sparked interest in learning
8:05 about animal communication
8:07 on their own terms.
8:09 Hundreds of studies
8:10 have been done since then.
8:12 Researchers learned over the decades
8:14 that prairie dog calls
8:16 convey specific information,
8:18 like the size and color of a predator.
8:20 Vervet monkeys have different
8:22 sounds to alert their friends
8:24 of different predators too.
8:26 Elephants and whales use low frequency
8:28 signals to communicate
8:29 with other members of their group
8:31 over vast distances.
8:33 And bees
8:34 shake their booty in particular ways
8:37 to tell each other about the direction
8:38 and distance of flowers.
8:41 So we know
8:42 animals communicate.
8:44 The question now is
8:45 can we learn to communicate with them?
8:48 Could I have told prairie dogs
8:50 that I was a friend?
8:55 It all sounds like a
8:57 massive cacophony to us,
8:58 but there's a lot of
8:59 communication happening.
9:01 We just can't understand it
9:03 because we don't have the hardware
9:05 or the software to do it.
9:07 So here's where new evolving
9:09 technologies come in.
9:11 We can now use
9:12 neural networks to go through
9:14 massive amounts of data
9:15 and find patterns.
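To make the pattern-finding idea concrete, here is a toy sketch of it. Everything below is invented for illustration (it is not any real project's pipeline): each "vocalization" is reduced to two made-up feature numbers, and a minimal two-cluster k-means loop groups similar calls together — the same basic idea as letting a model discover recurring call types in a large audio corpus.

```python
import numpy as np

# Toy stand-in for "massive amounts of data": each row is a vocalization
# reduced to two made-up summary features (say, mean pitch and duration).
rng = np.random.default_rng(0)
calls = np.vstack([
    rng.normal([1.0, 1.0], 0.1, size=(20, 2)),  # one pretend call type
    rng.normal([5.0, 5.0], 0.1, size=(20, 2)),  # another pretend call type
])

def two_means(points, iters=20):
    """Minimal 2-cluster k-means: repeatedly assign each point to its
    nearest centroid, then move each centroid to the mean of its points."""
    # Seed centroids with the first point and the point farthest from it.
    first = points[0]
    far = points[np.linalg.norm(points - first, axis=1).argmax()]
    centroids = np.stack([first, far])
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(2):
            centroids[j] = points[labels == j].mean(axis=0)
    return labels

labels = two_means(calls)
# Calls of the same pretend type should land in the same cluster.
print(labels)
```

Real systems use far richer features and models, but the core move — letting an algorithm sort a mountain of calls into recurring types without human labels — is the same.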
9:17 It sounds good in theory,
9:19 but in practice
9:20 we have the huge issue
9:22 that we still don't
9:23 have that much data.
9:25 So there are new organizations
9:27 trying to compile that data
9:29 and run it through
9:30 specialized programs.
9:32 One of these groups
9:33 is Project CETI,
9:35 a scientific
9:36 and conservation organization
9:38 of over 50 scientists
9:40 all over the world.
9:42 Their goal is to understand
9:43 the rapid fire bursts of clicks
9:46 that sperm whales use to communicate.
9:49 These bursts are called codas
9:51 and seem to make whole sentences
9:53 and even regional accents.
9:56 So for the past few years,
9:59 they've been collecting thousands
10:00 of hours of sperm whale audio
10:03 to get some insights
10:04 into their language.
10:06 It's such an ambitious project
10:08 that we asked the team leading
10:10 those efforts to tell us how it works.
10:12 These are the scientists
10:13 who might figure out
10:14 how to talk to whales.
10:16 So we couldn't be more excited
10:18 to have them on the show.
10:26 Hi, I'm Gasper Begus.
10:28 I'm a linguistics
10:29 lead at Project Ceti,
10:31 and I'm an associate professor
10:33 at UC Berkeley.
10:35 I'm David Gruber.
10:35 I'm a marine biologist
10:37 and a distinguished professor
10:38 of biology
10:39 at the City University of New York.
10:41 So as a linguist,
10:42 I like to study human languages.
10:44 But, in the past couple of
10:46 years, linguistics
10:47 has been really opening up,
10:49 and we realize
10:50 we have so many tools to offer.
10:52 And in my opinion,
10:54 sperm whales are for many reasons,
10:56 one of the best species
10:58 to start looking in.
10:59 They're incredibly smart.
11:01 They have very complex
11:02 social structures,
11:04 social relationships,
11:05 like grandmas
11:07 swimming with other whales.
11:09 They are
11:10 taking care of each other's offspring
11:12 and they engage in dialogs.
11:16 These clicks
11:18 on the surface
11:19 look really alien.
11:21 But if you look into them deeply,
11:23 you can see
11:24 a lot of linguistic traits.
11:26 And that's how I was
11:29 trying to open up
11:30 my linguistics side
11:31 from humans to non-humans.
11:34 And it has been really exciting.
11:36 This isn't a project
11:38 you can do
11:38 by sticking a microphone in the water
11:40 and then running it
11:41 through an advanced machine
11:42 learning system.
11:44 Every step of the process involves
11:46 a great deal of problem
11:48 solving and collaboration.
11:50 When the project started
11:51 putting together
11:52 an interdisciplinary team,
11:53 this was really important
11:54 because like,
11:55 I don't have all the answers,
11:56 but I noticed that I needed someone
11:58 to make the cameras
11:59 and someone to make the robots
12:00 and someone to do the analysis.
12:02 So for this project,
12:04 I think one of the key things
12:06 was working with linguists.
12:09 I'm working with people like Gasper,
12:11 and there have been
12:13 very few projects
12:14 where marine biologists
12:15 and linguists were working together
12:17 up until this project.
12:18 So this has been like a real joy and,
12:22 learning experience for both of us.
12:24 The first step of their work
12:26 was collecting all the data
12:27 with as much detail as possible,
12:30 and often in multiple ways.
12:33 It's great to record
12:34 the codas of a whale,
12:35 but it's even better
12:37 if at the same time
12:38 you have the footage of what it's
12:39 doing when making the calls
12:42 or heart sensors
12:43 to see what the whale is feeling
12:45 when making those calls.
12:47 But that leaves you
12:48 with a massive amount of data
12:50 that's impossible for human brains
12:53 to analyze.
12:54 As a human, you have a lot of biases.
12:56 And when you're faced
12:57 with something like sperm whales.
12:59 It's really fascinating
13:01 because you have to step out of
13:02 those biases and really try
13:04 to find things
13:06 that are meaningful
13:07 to this other species,
13:08 which,
13:09 you know, has very different lives.
13:11 And so you have to get out
13:13 of your human biases.
13:15 In a sense we used advanced machine
13:17 learning to help us.
13:19 The machines are just a tool here.
13:21 Machine learning
13:22 is just a tool in that approach,
13:23 where you can use this other tool
13:26 to help you understand, to help you
13:28 gain insights.
13:29 Advanced machine
13:30 learning is also better at seeing
13:32 patterns in different scales of time.
13:35 It's kind of strange
13:36 to think about it,
13:37 but different animals might experience
13:39 the passage of time
13:40 differently than we do.
13:42 It's like if a different species
13:44 was studying us,
13:46 but we sounded like this to them.
13:51 So by looking
13:52 more deeply
13:52 into the timing of the codas,
13:54 the team is finding extra details
13:57 that show a rich language.
13:59 Zoom in on those clicks and
14:01 you see way richer structure
14:04 than, you know,
14:04 when we started doing that.
14:06 And I think that's really important
14:09 because we're showing
14:11 that this creature, this,
14:13 this species
14:15 not only has complex lives
14:17 and complex social relationships
14:20 and are really interesting,
14:22 they also have a communication system
14:23 that can carry a ton of information.
14:26 And we don't know yet exactly where
14:28 that information is.
14:30 But there is a potential
14:31 that I've never seen
14:32 in basically any other species.
14:34 And I've worked with other animals
14:37 before.
14:37 That's fascinating.
14:39 And one of the best things
14:41 is that it's only one of the many
14:42 ongoing projects,
14:44 one of the ones closest to my heart
14:46 is Zoolingua’s prairie dog project.
14:49 Con Slobodchikoff,
14:51 from Northern
14:52 Arizona University, and his team
14:54 recorded the vocalizations of Gunnison
14:57 prairie dogs.
14:58 They observed that
14:59 they made different alarm calls
15:01 for different animals.
15:03 So he ran the recordings
15:04 through speech recognition software
15:06 to see what the calls meant.
15:09 What they found
15:10 was that on top of specific
15:12 calls for their main predators:
15:14 hawks, coyotes
15:15 and humans,
15:17 they also had descriptive adjectives
15:20 so they could communicate information
15:22 like big black hawk.
15:25 When there was a recurrent predator,
15:27 like a particularly crafty coyote,
15:29 a word would emerge.
15:32 I wonder what they called me
15:33 when I was trying to film them.
15:35 One of the cool things
15:36 about this technology is that
15:38 it can be applied
15:39 to different species.
15:41 Zoolingua, for example,
15:43 is applying their findings
15:44 from prairie dogs to lab dogs.
15:47 Dogs are our closest friends.
15:50 They evolved from wolves
15:52 to be our companions.
15:54 They can eat our food,
15:55 and a big part of their success
15:57 is their ability
15:58 to communicate with us.
16:00 But most of this communication happens
16:02 through body language.
16:04 So their mission is to figure out
16:06 the combination of barks
16:08 and other sounds
16:09 with the dog's
16:10 facial expressions and actions,
16:12 and figure out what they mean.
16:14 This makes sense
16:16 because few animals communicate
16:17 with their version of words.
16:20 If you've ever had a dog,
16:21 you know that a look can often
16:23 tell you more than 100 barks.
16:26 Imagine being able
16:27 to talk to your pup.
16:29 I'd love to talk to my cat, Nebs.
16:31 She's an extremely talkative cat
16:33 and she makes a whole range of sounds
16:36 that all mean
16:36 something pretty different.
16:38 So maybe this would help me
16:40 be able to talk in turn with her.
16:42 Other than meowing back
16:43 and forth forever.
16:45 Hey, cutie.
16:48 Meowing at
16:48 me from the top of the stairs.
16:51 Yeah.
16:52 You like being up top,
16:53 don't you?
16:55 Programs like the Earth Species Project
16:57 are trying to make
16:58 that happen.
16:59 Earth Species Project is a nonprofit
17:02 trying to decode animal language.
17:05 They created a program called
17:06 Nature LM audio and trained it
17:09 with over 25 million
17:11 animal sounds.
17:13 The program is still in development,
17:15 but the early results
17:16 have been positive.
17:18 And as of 2025,
17:20 it's being tested on a wide
17:22 range of animals like crows,
17:24 finches and belugas.
17:27 Back to cetaceans.
17:28 Dolphins are always one
17:30 of the main subjects of study
17:32 when it comes to animal communication
17:34 and intelligence.
17:36 And there's a lot of amazing research
17:37 about how they communicate
17:39 with each other and how it can help us
17:41 in conservation.
17:43 Just like whales, dolphins
17:45 communicate vocally
17:47 and to understand
17:48 what they're saying to each other,
17:49 we need a lot of data.
17:52 Wild dolphins,
17:53 as you can
17:54 imagine, are hard to follow.
17:56 But some researchers
17:57 are using the latest technology
17:59 to get their clicks on tape
18:01 and figure out their meaning.
18:03 We asked one of the researchers
18:05 who's trying to get a bit closer
18:07 to understanding dolphins,
18:09 how they're doing it.
18:11 My name is Laela Sayigh
18:12 and I work at the Woods Hole
18:14 Oceanographic Institution
18:15 in Massachusetts in the USA.
18:18 We can record them
18:18 with these contact hydrophones
18:20 that we actually put
18:20 right on their melon.
18:21 And then we also put digital
18:23 recording tags on their back
18:24 before we release them.
18:26 These are all noninvasive
18:27 technologies.
18:28 And the tags,
18:29 if they're bothering them,
18:30 they can just jump up a few times
18:32 and knock them off if they want to.
18:33 And so
18:34 both of those recording
18:36 methods
18:36 give us some really high quality
18:38 data from known individual dolphins,
18:40 which is kind of the holy grail
18:42 for studying dolphin communication.
18:44 Those technologies
18:45 are always evolving.
18:47 The next generation of recorders
18:49 has sensors that can detect depth
18:52 and record the dolphins movements,
18:54 so we can see what types of whistles
18:56 they do
18:56 when they're doing
18:57 specific activities.
18:59 Dolphins have signature whistles,
19:02 which are kind of like their names.
19:04 When a dolphin does
19:05 its signature call,
19:06 it's basically announcing:
19:08 Flipper is here!
19:09 But they also have shared
19:11 whistles that they use
19:12 to communicate with each other.
19:14 We call those calls
19:15 non-signature whistles.
19:17 The issue is that to
19:19 start understanding them
19:20 you have to first know
19:22 all the signature whistles
19:23 in an area
19:24 to be able to tell them apart.
19:26 So what have we learned
19:28 so far about the
19:29 non-signature whistles?
19:31 So there's two different types
19:33 that we have done
19:35 some playback experiments with.
19:37 We decided to focus on these two
19:38 that we have kind of
19:39 the most data for.
19:40 And so we've done
19:42 some experiments
19:42 that are suggestive
19:44 at least of one of these types
19:45 maybe being some kind of alarm
19:47 call or danger type signal,
19:50 because we've seen most animals
19:51 avoiding the stimulus
19:53 when they hear it.
19:54 And then we have another one
19:56 that's a little bit more complicated
19:58 because for that one,
19:59 we're not doing playbacks.
20:00 That one is a little trickier.
20:02 We've seen animals
20:03 making that type of whistle
20:05 in a situation where they seem
20:06 to be kind of surprised.
20:08 So they are hearing
20:10 something unexpected
20:11 and they make that whistle.
20:13 So we have kind of speculated
20:15 that it's sort of like a what is that,
20:17 what am I hearing kind of thing?
20:20 We jokingly referred to it
20:21 as the WTF whistle
20:23 on the boat.
20:25 That's right.
20:26 Dolphins seem to have a whistle
20:28 for: RUUUN!
20:30 or I guess, SWIIM!
20:32 And another one for: what the heck?
20:35 And the more we researched
20:36 them, the more types of calls
20:38 we will find.
20:40 And finally,
20:41 we would never forget
20:42 about the elephants.
20:43 These majestic beauties
20:45 communicate with very low rumblings
20:47 well below what we can hear.
20:49 To us it's just vibration.
20:52 But to elephants it's
20:53 a whole conversation.
20:55 The Elephant Listening Project
20:57 compiled over
20:58 300,000 hours of
21:00 elephant rumbles,
21:01 which are low frequency calls
21:03 that our ears can't hear.
21:05 One of their researchers,
21:07 Doctor Michael Pardo,
21:08 wanted to figure out
21:09 if elephants had names.
21:12 They knew
21:12 elephants were more reactive
21:14 after hearing some specific rumbles,
21:17 which they hypothesized
21:18 was their name.
21:20 Based on that,
21:21 they compiled close to 500 calls
21:23 from over 100 elephants.
21:26 They knew who had made the call
21:28 and who seemed to be the intended
21:30 recipient of the call.
21:32 Then they ran the
21:33 rumblings through a machine
21:35 learning model.
21:36 It successfully predicted
21:38 who was being called about
21:39 25% of the time.
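The shape of that experiment — predict the intended recipient from call features, then check whether the model beats chance — can be sketched in miniature. To be clear, this is not the study's actual model or data: the feature vectors, recipient IDs, and nearest-centroid classifier below are all invented stand-ins.

```python
import numpy as np

# Toy version of the elephant-name idea: each "rumble" is a small feature
# vector labeled with the ID of the elephant it was addressed to.
# Everything here is synthetic; the real study used rich acoustic features.
rng = np.random.default_rng(1)
n_recipients = 4
X_parts, y_parts = [], []
for rid in range(n_recipients):
    signature = rng.normal(0, 5, size=3)  # pretend per-recipient "name" cue
    X_parts.append(signature + rng.normal(0, 1, size=(30, 3)))
    y_parts.append(np.full(30, rid))
X, y = np.vstack(X_parts), np.concatenate(y_parts)

# Hold out a test set and fit a nearest-centroid classifier on the rest.
idx = rng.permutation(len(X))
train, test = idx[:90], idx[90:]
centroids = np.stack([X[train][y[train] == r].mean(axis=0)
                      for r in range(n_recipients)])
dists = np.linalg.norm(X[test][:, None] - centroids[None], axis=2)
accuracy = (dists.argmin(axis=1) == y[test]).mean()

# The interesting comparison is against chance (1 / number of recipients):
# even modest above-chance accuracy is evidence the calls carry recipient info.
print(f"accuracy {accuracy:.2f} vs chance {1 / n_recipients:.2f}")
```

That comparison against chance is why the study's roughly 25% figure was meaningful: with over 100 possible recipients, random guessing would do far worse.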
21:41 Finally, they played the calls
21:43 they had confirmed to be an elephant's
21:45 name.
21:46 And the elephants who were called
21:47 responded strongly and vocally
21:50 to hearing those calls.
21:52 So basically it went like this.
21:54 Victor!
21:56 And Victor the elephant replied,
21:58 What?
22:00 You can learn about these elephants
22:02 by checking out Doctor Pardo's
22:04 YouTube channel,
22:05 linked in the description.
22:07 Okay.
22:08 Using technology to understand animals
22:11 looks very promising and exciting.
22:14 I'm sure there will be amazing
22:15 discoveries within the next few years.
22:18 But what about the animals
22:20 that don't communicate vocally?
22:22 Can we use technology to understand
22:24 animal body language and visual cues?
22:28 I mean, take us for example.
22:30 We communicate with our bodies
22:32 all the time.
22:33 Cuttlefish are one of the last animals
22:35 you would think about
22:36 when making a list of chatty animals.
22:39 But if you look closely,
22:40 you can see them signaling each other
22:42 and changing colors.
22:44 We think they're communicating,
22:46 but we don't know what.
22:48 It's still such a poorly
22:49 understood system
22:51 that we went straight
22:52 to the leading expert
22:53 in cuttlefish sign language
22:54 to ask how they're solving
22:56 this problem.
22:58 So I'm Sophie Cohen-Bodenas
22:59 Currently I'm
23:00 a researcher in sensory
23:01 neuroscience at Washington
23:03 University in Saint Louis.
23:05 So what is really fascinating
23:06 about cuttlefish is that
23:08 not only do they use this ability
23:10 of doing dynamic skin patterning
23:11 for camouflaging,
23:13 but they also do it for communicating,
23:15 which means that it's versatile.
23:17 So sometimes they will do like
23:19 a white patterning to
23:21 blend in with their surroundings,
23:22 but also sometimes they will do black
23:23 spots on the skin to communicate.
23:26 Doctor Cohen-Bodenas
23:27 noticed that the cuttlefish
23:29 were moving their arms
23:30 in particular ways.
23:32 They weren't random moves.
23:34 They seem to be saying something.
23:36 So she started recording them
23:37 and eventually
23:38 classified four recurrent movements.
23:42 The next part of the research
23:43 was seeing
23:44 what happened
23:44 when you played the signals back
23:46 to the cuttlefish.
23:48 We wanted to see,
23:49 like, putting them
23:51 right in front of the screen,
23:52 we wanted to see
23:54 how they would react
23:55 when the signs were presented.
23:56 And what was really fascinating
23:58 is that they were very curious about it.
24:00 They would come by themselves
24:02 in front of the screen
24:04 and they would display the signs.
24:06 And what was also very interesting
24:08 is that they wouldn't just
24:09 display the same sign back,
24:11 as mimicking,
24:15 which would have been very interesting
24:17 because we would have said, oh,
24:19 it might be mimicking signals.
24:20 But it was a bit different,
24:22 because when they would see the sign,
24:24 they would respond with a different type of sign.
24:27 But the ocean is dark,
24:29 and marine animals often
24:30 have to rely on their sense of touch
24:33 or mechanoreception
24:34 instead of their eyes.
24:36 So what would happen
24:37 if two cuttlefish
24:38 wanted to communicate at night,
24:40 or in murky water,
24:42 or simply when they couldn't
24:43 see each other?
24:44 So what we did
24:45 is that we put the hydrophone,
24:47 which is just like a microphone.
24:49 But for underwater to record the trace
24:52 emitted by the wave,
24:53 we could do spectrogram.
24:55 And then we did
24:56 the playback experiment,
24:57 which means that we playback test
24:59 that trace to the cuttlefish.
25:01 And why this experiment is interesting
25:03 is that it canceled the visual aspect
25:06 because you just display
25:07 like the vibration in the water
25:09 without them seeing one another.
25:11 And so it was really interesting
25:12 also for us because,
25:15 the cuttlefish
25:16 indeed got really interested.
25:17 They would tend to come
25:18 to this hydrophone
25:19 to display the sign.
25:20 We interpret this first result
25:23 as a preliminary proof
25:25 that they could perceive it also via,
25:28 mechanoreception.
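The record-then-analyze step she describes — a hydrophone trace turned into a spectrogram — can be sketched from scratch. The signal below is entirely synthetic (a made-up 50 Hz burst in noise; real cuttlefish arm-wave traces would look different), and the short-time FFT is a minimal hand-rolled version of what a spectrogram tool computes.

```python
import numpy as np

# Made-up stand-in for the hydrophone "trace": a short tone burst in noise.
fs = 1000                          # sample rate in Hz (illustrative)
t = np.arange(0, 1.0, 1 / fs)
trace = 0.1 * np.random.default_rng(2).normal(size=t.size)
burst = (t > 0.4) & (t < 0.6)
trace[burst] += np.sin(2 * np.pi * 50 * t[burst])   # 50 Hz burst mid-signal

def spectrogram(x, win=128, hop=64):
    """Minimal short-time FFT: slice the signal into overlapping windowed
    frames and take the magnitude spectrum of each frame."""
    frames = [x[i:i + win] * np.hanning(win)
              for i in range(0, len(x) - win + 1, hop)]
    return np.abs(np.fft.rfft(frames, axis=1))      # shape: (frames, freq bins)

S = spectrogram(trace)
bin_50 = round(50 * 128 / fs)      # frequency bin index nearest 50 Hz
# The burst should show up as high energy near that bin
# in the middle frames of the spectrogram.
print(S.shape)
```

A plot of `S` (time on one axis, frequency on the other) is the picture researchers actually read; playing the recorded trace back, as in the experiment, just reverses this pipeline.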
25:29 All of this data proves
25:31 that communication is happening,
25:34 but we don't know what the signs mean.
25:36 This is where technology comes in.
25:38 It's really, like,
25:39 complicated for a human to say
25:42 exactly in which
25:45 behavioral context
25:46 which sign means what.
25:48 So the idea is to have
25:51 a colony of cuttlefish,
25:53 and enable them
25:55 to display the signs
25:56 along with the coloring patterns
25:59 in many different behavioral contexts.
26:01 Collect a lot of data
26:02 and then have an algorithm
26:05 that will find the structure for us.
26:07 In order to start to
26:10 dream,
26:11 we'd have to have an algorithm
26:13 that is going to say,
26:14 okay, so
26:16 two black spots plus orange
26:18 plus raw is most often correlated with,
26:21 I'd say just as an example, eating.
26:23 And a zebra
26:25 pattern plus
26:27 a crown plus a white square
26:30 is more often
26:31 related to an aversive display.
26:34 And this is something that for
26:36 a human is very hard to decipher.
26:39 But a powerful algorithm would be able
26:42 to possibly do this for us.
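The bookkeeping behind that idea — count which display combinations co-occur with which behavioral contexts, then report the strongest association — can be sketched very simply. The signs, patterns, and contexts below are invented placeholders echoing her hypothetical examples, not real observations, and a real system would use statistical models rather than raw counts.

```python
from collections import Counter

# Hypothetical observation log: each entry pairs the display components
# seen on a cuttlefish with the behavioral context it occurred in.
observations = [
    ({"two_black_spots", "orange"}, "eating"),
    ({"two_black_spots", "orange"}, "eating"),
    ({"two_black_spots", "orange"}, "exploring"),
    ({"zebra", "crown", "white_square"}, "aversive"),
    ({"zebra", "crown", "white_square"}, "aversive"),
    ({"zebra", "crown"}, "aversive"),
]

# Count how often each display combination appears in each context...
counts = Counter((frozenset(signs), ctx) for signs, ctx in observations)

# ...then, for each combination, pick the context it co-occurs with most.
def best_context(signs):
    matches = {ctx: n for (s, ctx), n in counts.items()
               if s == frozenset(signs)}
    return max(matches, key=matches.get) if matches else None

print(best_context({"two_black_spots", "orange"}))       # "eating"
print(best_context({"zebra", "crown", "white_square"}))  # "aversive"
```

With enough logged observations, this kind of tally is exactly the structure an algorithm could surface for researchers to then test behaviorally.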
26:45 Super rad.
26:46 It's like getting an insight
26:47 into the minds of aliens,
26:49 which cuttlefish,
26:51 despite some internet
26:52 rumors, are 100% not.
26:55 Although they kind of look like
26:56 they could be from Pandora.
26:58 It's hard not to be optimistic
27:00 about all of this research,
27:02 but before we start thinking
27:04 about full conversations
27:05 with the animals in our fish bowls,
27:07 there's still a lot of work
27:09 to be done.
27:10 I think there is a misconception
27:12 that people think you just throw
27:15 AI at any animal,
27:17 and out comes some translation.
27:19 I think because it's working
27:21 so well on humans.
27:23 People are able to take that leap
27:24 in their head.
27:25 But now that we're really deep
27:27 in this project,
27:28 we see that is not the case.
27:29 We can't just take ChatGPT
27:32 and throw it at whales,
27:33 and out comes a translation.
27:34 And it's, it's a significant climb
27:37 that we're undertaking.
27:39 In all of these cases,
27:40 machine learning is more of a tool
27:42 than a universal translator,
27:44 and trained professionals are still
27:46 the most important
27:47 part of the equation.
27:49 I think that it's really important
27:51 to actually
27:52 watch the animals and document
27:54 sort of how they respond to sounds
27:56 and how they're using sounds.
27:57 I think that just
27:58 putting a lot of sounds
27:59 into some kind of AI
28:01 and having it discern
28:02 whether there are sort of patterns
28:04 or things like that, that could be valuable
28:05 in potentially
28:07 helping to generate hypotheses
28:09 that we could test.
28:10 And yes, through a combination,
28:12 we will learn a lot
28:13 about animal communication
28:15 with the help
28:15 of advanced machine learning.
28:17 But there's something we always have
28:19 to keep in mind.
28:20 We have to do it with
28:21 the welfare of the animals in mind.
28:24 I think the key thing
28:25 is always keeping the whale at the center
28:27 and asking that question:
28:29 is this study,
28:31 is this finding, in service
28:33 of the whale?
28:34 And just keeping that
28:35 at the core, the heart of the project.
28:37 If we do that
28:38 and start to learn more about
28:40 the inner lives of whales,
28:41 dolphins, cuttlefish
28:43 and other animals,
28:45 we might have to change
28:46 not only how we see ourselves
28:47 in relation to the natural world,
28:50 but also how we apply our laws.
28:52 If we're finding
28:53 all these things about
28:55 animals and languages, in a sense,
28:57 this is the last frontier
28:59 that the legal system
29:00 has used as kind of a barrier
29:02 to keep rights only for humans.
29:05 So if we're finding a species
29:06 that can communicate with codas
29:08 that are so informative,
29:10 that can be so powerful, right?
29:12 That can transmit so much information.
29:14 I think we need to pause
29:15 and ask ourselves, well, you know,
29:17 what does that mean
29:17 for the legal system?
29:18 In other words, there might be a point
29:20 where our understanding of animals
29:22 becomes so deep
29:24 that we might give them some rights
29:25 that are currently seen
29:27 as exclusively human rights.
29:29 But if the animals are
29:30 so smart and their consciousness
29:32 is so deep and meaningful,
29:34 why wouldn't they deserve
29:36 to have some rights?
29:38 As technology advances,
29:40 it's a question
29:41 we might need to answer
29:42 sooner than we think.
29:44 You know, I'm a linguist.
29:45 I started with linguistics,
29:46 with old languages.
29:48 I studied really old languages.
29:50 And those languages allow
29:51 you to time travel because, you know,
29:54 you have access to these people
29:55 who lived 3000 years ago.
29:57 And listening to the whales is like
30:00 traveling in different
30:02 intelligences, right?
30:03 You have this amazing species.
30:05 You can imagine their lives.
30:07 You know, when they sleep,
30:09 they float vertically, heads up.
30:11 And you're just, like,
30:12 imagining,
30:14 this beautiful intelligence.
30:16 And you realize,
30:18 you know, maybe
30:19 our position as humanity
30:21 is, is not as high
30:23 as we used to think.
30:24 I think it keeps you grounded.
30:26 It keeps you closer to it,
30:28 and it can help you really reimagine,
30:31 maybe what
30:32 the position of humanity is.
30:34 So, yes, technology can help us
30:36 learn about animal communication
30:38 and support conservation projects,
30:41 but we have to keep the welfare
30:43 of animals first
30:44 and not prioritize profits.
30:47 We also have to make sure
30:48 that we listen to them
30:50 and really try to understand them,
30:53 not just hear what
30:54 we want them to say.
30:56 Many of the things animals
30:57 will say will require us
30:59 to adjust our behavior massively,
31:02 from our shopping habits
31:04 to our eating preferences,
31:05 to even where we choose to live.
31:08 So here's hoping that
31:09 when animals talk to us,
31:11 their calls don't fall on deaf ears.
31:14 So what should we talk about next?
31:16 Please let me know in the comments
31:18 and don't forget to subscribe
31:19 for new episodes every week.
31:21 Thanks for watching.
31:22 See ya and take care.