0:02 A breakthrough for scientists after they
0:04 had a 20-minute conversation with a
0:05 humpback whale.
0:08 >> Beneath the ocean surface, an ancient
0:11 conversation hums in rapid clicks.
0:13 Messages passed between giants with
0:17 brains six times our own. Using advanced
0:20 AI, scientists are listening like never
0:23 before. And what they have uncovered is
0:25 something no one would have guessed in a
0:26 million years.
0:28 >> This is a bit of how their conversation went.
0:35 What they are finding is revealing an
0:37 entirely new way to understand
0:40 communication between intelligent life
0:43 forms on Earth. Since the dawn of ocean
0:46 exploration, the vast, mysterious ocean
0:49 has held its secrets from us. For
0:52 centuries, sailors who returned from
0:54 long journeys told incredible stories of
0:57 what some people believe to be mythical
1:00 sea serpents and, more believably, the
1:04 haunting melodic calls of whales. These
1:06 sounds were just part of the sea's
1:10 marvels, beautiful but barely explored.
1:12 These marine animals have been a subject
1:14 of admiration because of their size. But
1:17 we never truly believed they had a mind
1:20 that could match their might. For man,
1:22 they were just resources to get blubber
1:26 for lamps and bones for corsets. Until
1:29 one unexpected event, the call of the
1:33 deep. In the 1970s, the world opened its
1:36 eyes on the wave of a new
1:39 environmental awakening. It was there
1:42 and then that a quiet discovery would
1:45 change everything. Scientists who had
1:47 been studying marine life for ages took
1:50 an interest in humpback whales and soon
1:52 set out to sea to record the
1:55 vocalizations of these whales. And what
1:58 they found was astonishing. The moans
2:01 and groans of these animals weren't just
2:04 random. They were intricate sounds that
2:07 evolved over time, lasting several
2:10 hours. The recordings were saved and
2:12 then compiled into an album called Songs
2:15 of the Humpback Whale. This scientific
2:17 documentation was the start of what
2:19 would get the attention of the public
2:22 and also tickled the imagination of
2:25 many. This one album helped launch a
2:27 global movement that saved these
2:29 magnificent creatures from the brink of
2:32 extinction. As scientists went further
2:34 into bioacoustics, which is the study
2:36 of animal sounds, they were left with
2:39 more questions than answers. Early
2:42 researchers in the field used basic
2:44 recording tools and spent hours
2:46 carefully listening and analyzing, all
2:49 in an effort to understand how whales
2:51 communicate. They couldn't quite place
2:53 what the sounds were. There were
2:55 theories of the whale songs being a
2:58 simple mating call, a call for help in
3:01 time of danger, or just a form of communication.
3:06 No one truly knew. For decades, we could
3:08 only guess, held back by the sheer
3:11 volume of data and the limits of our own
3:14 senses. All we could hear were the
3:17 sounds. But we didn't have the tools to
3:19 understand them. And this is where a guy
3:22 named Dr. David Gruber came in with a
3:25 bright idea. Dr. David Gruber, an
3:27 American marine biologist and National
3:30 Geographic researcher, believed that to
3:32 understand whales, we didn't need the
3:35 old tools used back then, but something
3:38 new and technologically advanced. He
3:39 looked to a different field for
3:42 inspiration, one that was far from the
3:45 sea, but focused on space: the
3:47 search for extraterrestrial intelligence
3:51 or SETI. The idea behind SETI is to
3:53 listen for signals from space in hopes
3:55 of one day hearing a message from
3:57 another civilization.
4:00 Dr. Gruber thought, what if we've been
4:03 looking for alien life in all the wrong
4:06 places? He thought an alien intelligence
4:09 could be right here on our own planet in
4:13 the form of a whale, the humpback whale.
4:16 This crazy, almost sci-fi-like notion was
4:19 the beginning of Project CETI,
4:22 which stands for the Cetacean Translation
4:24 Initiative. Not only did Dr. Gruber
4:27 bring this insane idea to life, but he
4:29 also brought together an extraordinary
4:32 team consisting of marine biologists,
4:35 computer scientists, linguists, and
4:38 roboticists. All joined by a shared
4:41 vision. Their aim wasn't just to
4:44 document sounds. They were planning to
4:46 do the impossible.
4:48 Their plan was to create a two-way
4:50 dialogue with another species.
4:53 How was this group of people planning to
4:55 listen to voices from a world so
4:58 different from ours? To truly connect
5:01 with whales, they needed more than just
5:04 ears. They needed a way in. The dream team.
5:09 The team Dr. David Gruber built for
5:13 Project CETI was unlike any scientific
5:16 group you've ever seen. It was a fusion
5:18 of brilliant minds from all kinds of
5:20 fields who at first might not seem to
5:23 have much in common, but were all united
5:25 by the same audacious goal: to crack the
5:27 code of whale communication.
5:30 This unconventional alliance was exactly
5:33 what the project needed. Understanding
5:34 the voices of the whales was a
5:36 complicated task that no single
5:38 discipline would have been able to
5:40 solve. It's no surprise that these
5:42 experts, each a master in their
5:45 own field, were picked. This
5:47 collaboration was the engine of Project CETI,
5:52 leading to something incredible. At the
5:54 heart of the project were the people who
5:56 had spent their lives in the water with
5:58 these animals. People like Shane Gero,
6:01 for example, who is a marine biologist,
6:04 spent over a decade studying a single
6:06 family of sperm whales near the
6:09 Caribbean island of Dominica. This
6:12 wasn't just a job for him. He was as
6:15 dedicated to his work as he was devoted to
6:18 these magnificent creatures. He even
6:20 knew the whales by name and watched them
6:22 grow up.
6:24 This long-term intimate fieldwork became
6:28 the bedrock of Project CETI.
6:31 He was a great asset to the team. Who
6:33 better to understand the workings of
6:36 whales than Shane? Apart from Shane
6:39 Gero's contributions, another asset to
6:41 the team that wasn't present in the old
6:44 research team was natural language
6:47 processing, thanks to AI. This is where
6:50 what once seemed impossible
6:54 becomes a modern marvel. Before Project
6:56 CETI became a thing, the idea of
6:59 translating whale communication was a
7:01 figment of researchers' imagination
7:03 because of the limitations of technology
7:06 at the time. However, a major
7:08 breakthrough was on the horizon, not
7:11 just in marine biology, but also in technology.
7:15 Around 2019, the team realized they
7:17 could get the technology
7:20 needed to build an underwater recording
7:23 studio, the first
7:25 of its kind that might finally
7:27 translate the language of sperm whales.
7:29 And just like the thought of landing on
7:32 the moon, this realization ignited the
7:35 project and made the impossible suddenly
7:37 feel within reach. The island of
7:40 Dominica became the heart of this
7:43 project. It was the perfect place for
7:45 this kind of work with its volcanic
7:48 landscape and waters that get incredibly
7:51 deep just off the coast. Its unique
7:53 geography allows sperm whales to swim
7:56 close to land, making them easier to
7:59 study than in most other places. Not
8:01 only that, but the island also has a
8:03 stable, healthy whale
8:05 population, with many of the same
8:07 families returning to the island year
8:10 after year. It was also the same island
8:12 Shane Gero's decades of work started
8:16 on. With this newfound possibility,
8:19 Project CETI officially launched in
8:22 2020. The initiative received a massive
8:25 boost from the TED Audacious project,
8:28 which provided $33 million in funding to
8:31 get it off the ground. The
8:33 funding allowed them to build an
8:35 incredible team and also begin their
8:38 first major task to create what they
8:42 call a 20 km by 20 km underwater
8:45 listening and recording studio off the
8:47 coast of Dominica. This setup would
8:50 allow the AI to not only hear the
8:52 whales, but also to understand who was
8:56 speaking, to whom, and in what social situation.
8:59 This was the technological bridge they
9:02 needed to cross to go from listening to
9:04 truly understanding.
9:06 The two groups, the field scientists and
9:09 the tech experts, worked together in a
9:12 way that had never been done before. The
9:14 biologists provided the deep knowledge
9:16 and context, while the computer
9:18 scientists brought the tools and power
9:21 to sort the data. They worked together
9:24 in a close partnership. The hands-on
9:27 research guided the technology and the
9:29 technology helped them uncover new discoveries.
9:33 This collaboration was Project CETI's
9:35 greatest strength, allowing them to
9:39 tackle the problem from all angles. What
9:41 made them different from past efforts to
9:43 understand animal language was how they
9:46 combined knowledge from different fields
9:48 and worked as one team. With this
9:51 incredible team in place, they were
9:54 ready to tackle the next big hurdle.
9:57 How do you collect sound in the deep
10:00 without asking a whale to wear a mic?
10:03 The team had a bold answer to everyone's
10:06 question: the ocean and the right technology.
10:09 The main tool, the part of the plan that
10:11 would seal it all together, would be the sensors.
10:15 These sensors are designed and deployed
10:17 as arrays of hydrophones, which is a
10:20 fancy word for underwater microphones.
10:23 They are placed on the ocean floor. They
10:25 do exactly what a normal microphone
10:28 would do, but a million times better.
10:31 They not only amplify the sounds from
10:33 these creatures, but they are also
10:36 listening stations placed in a grid-like
10:39 pattern to constantly record everything 24/7.
10:41 24/7.
10:43 It works like how a room filled with
10:46 microphones on the wall would be able to
10:49 catch any sound any time, even as little
10:52 as a pin dropping. The same goes for the
10:55 hydrophones, which the Project CETI team
10:58 deployed on a large scale and which can
11:00 detect exactly where any sound under the
11:03 ocean comes from. By using this network of
11:05 hydrophones, they could pinpoint a
11:07 whale's clicks and figure out which
11:09 particular whale was making a specific
11:12 sound. This was a game-changer for the
11:15 expedition. It wasn't enough to just
11:17 hear a whale. They also needed to know
11:20 which whale was talking. To get an even
11:23 clearer picture of the whale, the team
11:25 had another brilliant idea: sending in
11:28 a robotic fleet. Since it was impossible
11:30 to get a whale to wear a mic, they could
11:33 send out autonomous drones and robotic
11:35 systems to get close to the whales.
11:37 These robots were designed to place
11:41 non-invasive suction cup tags onto the
11:43 whales' backs. These tags were packed
11:46 with sensors that not only recorded the
11:49 audio, but also the whale's movement at
11:52 any depth, its heart rate, and its
11:55 social interactions with other whales.
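Pairing a detected click with what the whale was doing at that moment is, at its core, a timestamp lookup: find the latest tag sample at or before the click. A minimal sketch, with behaviour labels and times invented purely for illustration:

```python
import bisect

# Hypothetical tag samples, sorted by time: (seconds, behaviour)
tag_log = [
    (0.0, "surfacing"),
    (120.0, "diving"),
    (600.0, "foraging"),
    (1500.0, "socializing"),
]

def context_for(click_time):
    """Behaviour recorded by the tag when a click was detected:
    the latest tag sample at or before click_time."""
    times = [t for t, _ in tag_log]
    i = bisect.bisect_right(times, click_time) - 1
    return tag_log[max(i, 0)][1]
```

Joining the hydrophone stream (what was said, and by whom) with the tag stream (what the whale was doing) is what turns raw sound into labelled behaviour.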
11:57 This was another important piece of data
11:59 needed to understand what the sounds
12:03 made by the whales meant. The hydrophone
12:05 network could tell them what was said
12:08 and by whom, while the tags provided the
12:10 context, which could determine whether
12:12 the whale was making a sound while
12:14 hunting for food or whether it was
12:17 socializing with its family. This
12:20 behavioral data gave the AI the vital
12:22 clues it needed to understand the
12:24 reasons behind the clicks. All of this
12:26 technology was about collecting an
12:28 incredible amount of data needed for
12:31 their research. The contrast with the
12:33 research from over
12:36 40 years ago is stark. While
12:38 previous research might have collected a
12:41 few hours or even a few days of whale
12:43 sounds, Project CETI was collecting
12:45 millions, even billions, of vocalizations.
12:49 While doing all this, they were
12:52 building the largest library of whale
12:55 sounds ever created by man. This
12:58 monumental scale was essential because
13:00 AI models, especially machine learning
13:03 models, need huge data sets to find
13:06 meaningful patterns. They learn by
13:08 example. And the more examples they
13:11 have, the smarter they get. This is why
13:14 Project CETI was so different from all
13:16 previous efforts, which only focused on
13:19 trying to find a few key phrases; this
13:21 effort is instead centered on
13:24 capturing a complete language. The data
13:27 streamed in as a constant flow of clicks,
13:30 whistles, and songs from the deep. But
13:34 on its own, it was just noise. It was
13:37 raw data without meaning. A vast ocean
13:39 of information waiting to be decoded.
13:42 The microphones and sensors had listened
13:44 and the tags had recorded, but the
13:48 language remained a mystery. It was all
13:51 leading to one crucial step. The real
13:53 magic happened when this ocean of sound
13:55 was fed into the machine learning
13:58 models, transforming it from mere noise
14:01 into the building blocks of a language.
14:03 The AI breakthrough.
14:06 To the human ear, the deep ocean is only
14:09 a confusing jumble of clicks, grunts,
14:12 whistles, and waves. Trying to find a
14:14 particular pattern from all the noise is
14:17 nearly impossible. The first big
14:20 challenge for the Project CETI team was to
14:22 turn all the raw audio they collected
14:24 into something the AI could actually
14:27 interpret. In order to do this, they had
14:30 to use a process that turned sound into
14:33 a picture called a spectrogram. With
14:35 this, they could take something as raw
14:37 as the ocean's sound waves and turn them
14:39 into a visual chart. This let them
14:42 monitor all
14:44 the data they had collected and let the
14:48 AI visually see the whale
14:51 sounds. The AI's first task was to sort
14:53 through these images and identify the
14:56 codas, which are the specific sequences
14:58 of clicks sperm whales use to
15:00 communicate while ignoring all the
15:03 background noise. This initial process
15:05 of turning an invisible sound into a
15:08 visible pattern was the crucial first
15:10 step. This is where the power of machine
15:13 learning came in, acting as a kind of
15:16 modern-day Rosetta Stone. The team of
15:19 scientists and researchers then took it a
15:21 step further, using advanced AI
15:23 technologies, including something called
15:26 deep learning and neural networks. These
15:29 advanced technologies were incorporated
15:31 as they are designed to learn the way
15:34 humans do by recognizing patterns in
15:37 large amounts of information. In this
15:39 case, the AI wasn't given a list of
15:41 rules about how whales communicate.
15:44 Instead, it was trained using millions
15:46 of whale codas, which are the short
15:48 bursts of sound whales use to
15:51 communicate. By studying these, the
15:53 system slowly started to find patterns
15:57 on its own. Over time, it began to
15:59 understand which sounds often appeared
16:01 together, how they changed in different
16:04 situations, and what they might mean,
16:07 just like how we learn a new language by
16:10 listening and observing. Soon enough,
16:12 these technologies began to notice tiny
16:14 differences in the clicks that were
16:16 completely unnoticeable to a human
16:18 listener. It could detect subtle
16:22 variations in rhythm, tempo, and other
16:24 details that our brains simply couldn't process.
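The "learning by example" described here can be pictured with a toy version of unsupervised clustering: represent each coda by its normalised click spacing and let a tiny two-cluster k-means find the recurring shapes without any labels. Everything below, from the synthetic codas to the two-cluster choice, is an illustrative assumption, not Project CETI's actual pipeline.

```python
def icis(clicks):
    """Inter-click intervals of one coda, normalised so that codas with
    the same shape but different overall speed look alike."""
    gaps = [b - a for a, b in zip(clicks, clicks[1:])]
    total = sum(gaps)
    return [g / total for g in gaps]

def two_means(vectors, iters=10):
    """Minimal 2-cluster k-means over equal-length vectors."""
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    # deterministic start: the first vector, and the vector farthest from it
    centres = [vectors[0], max(vectors, key=lambda v: d2(v, vectors[0]))]
    for _ in range(iters):
        groups = ([], [])
        for v in vectors:
            groups[0 if d2(v, centres[0]) <= d2(v, centres[1]) else 1].append(v)
        new_centres = []
        for group, old in zip(groups, centres):
            if group:
                new_centres.append([sum(col) / len(group) for col in zip(*group)])
            else:
                new_centres.append(old)
        centres = new_centres
    return groups
```

Fed two kinds of coda, one evenly spaced and one with a long opening gap, the clustering separates them with no labels given, which is the essence of finding patterns "on its own".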
16:28 Just like how a toddler or a young child
16:30 learns a language, the computer was
16:33 doing the same thing, only much, much
16:36 faster, and with a data set that a
16:38 single human could never analyze in a
16:41 lifetime. The sheer scale of the data,
16:44 millions of vocalizations gathered
16:46 over many years, was what gave the AI
16:50 the power to find these hidden patterns.
16:53 Without this huge data set, the AI would
16:56 have been as lost as a human biologist.
16:59 This new technology and the AI's hard
17:01 work led to the core
17:04 discovery about the sperm whales' language.
17:06 And as expected, it was even more
17:09 complex than anyone had ever imagined.
17:11 These sounds weren't just a simple set
17:14 of messages. After a massive collation
17:16 of this data, it was discovered that
17:19 whales use a complex system that works a
17:22 bit like a phonetic alphabet. Of course,
17:24 they don't have letters, but it doesn't
17:26 take away the fact that they use
17:30 different variables like rhythm, tempo,
17:31 and what the scientists call
17:34 ornamentation, which are extra little
17:37 clicks at the beginning or end of a coda.
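The three ingredients just named, rhythm, tempo, and ornamentation, can be made concrete as a tiny feature extractor over click timestamps. The cutoff of four clicks as the "un-ornamented" base length is an arbitrary illustration, not the researchers' definition:

```python
def coda_features(clicks, base_clicks=4):
    """Describe one coda along the axes named above: tempo (overall
    duration), rhythm (normalised click spacing), and ornamentation
    (extra clicks beyond an assumed base length)."""
    gaps = [b - a for a, b in zip(clicks, clicks[1:])]
    duration = clicks[-1] - clicks[0]
    return {
        "tempo": duration,
        "rhythm": tuple(round(g / duration, 2) for g in gaps),
        "ornamented": len(clicks) > base_clicks,
    }
```

Speeding a coda up changes its tempo but leaves its rhythm untouched, which mirrors the idea that the same basic pattern can be said faster or slower without losing its shape.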
17:38 The way these different elements are
17:41 combined creates an insane amount of
17:45 unique words or messages. For example, a
17:47 slow, steady rhythm might mean one
17:50 thing, while speeding up that rhythm or
17:52 adding an extra click at the end could
17:54 completely change the meaning. This
17:56 discovery completely shattered the idea
17:59 that only humans could have a complex
18:02 structured linguistic system. It showed
18:04 that whales were also using a system
18:07 with a lot of different parts that could
18:09 be rearranged to create an intricate vocabulary.
18:13 This was the moment everyone had
18:16 been hoping for. For the first time, we
18:18 had a window into the actual syntax of
18:21 their communication. We could see the
18:23 building blocks of their language, but
18:26 we still didn't know what it all meant.
18:28 What were they actually saying? This is
18:31 where things got a lot more interesting.
18:34 Unveiling the whales' worldview. For a
18:37 long time, scientists thought if whales
18:39 spoke a certain language, it would be
18:42 simple and direct. In all their years of
18:45 work and research, they expected a basic
18:48 set of commands or even a survival
18:51 language. But what the AI revealed was
18:53 something much more unexpected and
18:55 interesting. The patterns the computer
18:58 found weren't tied to simple actions,
19:01 but to social context. It was the big
19:04 reveal. And the moment everyone realized
19:07 all the theories they had were wrong.
19:09 Whales didn't sing songs just
19:12 because they had to; they did so to
19:14 communicate with each other in ways that
19:17 were complex and sometimes personal,
19:19 reflecting their closeness to one another
19:22 as well as their social lives. It was
19:24 less of a simple code and more of a
19:27 conversation. As the AI continued to
19:30 analyze the data, it uncovered an even
19:32 more astonishing level of detail. The
19:35 team found that within the clicks, there
19:37 were subtle variations that acted much
19:40 like vowels do in human language. They
19:41 discovered that the whales use a
19:43 sophisticated system of what's called
19:46 rubato, which is a fine-grained change
19:49 in rhythm: very subtle
19:51 variations in the timing of clicks
19:54 within a coda, the
19:56 sequence of clicks that is the basic
19:58 unit of communication for these
20:01 creatures. The scientists were also
20:03 able to learn that a whale's coda might
20:06 have a specific rhythm, and that a slight
20:08 pause or a quickening of tempo could
20:11 completely change its meaning. On top of
20:14 that, the ornamentations were also
20:17 important to convey different messages.
20:19 It was as if the basic words were being
20:22 molded and shaped to express different
20:24 emotions or intentions. This showed a
20:27 level of linguistic sophistication that
20:29 truly challenged the idea that this kind
20:32 of communication was unique to humans.
20:35 The AI also showed that this language
20:37 was deeply tied to their social
20:39 identity. The research revealed that
20:42 different whale clans or families use
20:44 distinct dialects to identify
20:46 themselves. The same way people from
20:48 different parts of a country might have
20:51 different accents, these whale families
20:53 have their own unique way of speaking.
20:56 When the AI analyzed the clicks, it
20:58 could tell which family was talking just
21:00 by the subtle differences in their
21:03 speech patterns. This meant that the
21:05 language wasn't just a universal code
21:08 for all sperm whales. It was a unique
21:11 and dynamic living part of their culture
21:14 passed down through generations.
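Telling clans apart by their "accent" can be sketched as a nearest-centroid check: compare a coda's rhythm against each clan's characteristic rhythm and pick the closest. The clan labels and prototype rhythms below are invented for illustration, not real dialect data:

```python
def normalised_gaps(clicks):
    """A coda's rhythm: its click-spacing pattern with overall speed
    scaled out."""
    gaps = [b - a for a, b in zip(clicks, clicks[1:])]
    total = sum(gaps)
    return [g / total for g in gaps]

# Hypothetical clan "dialects": one characteristic rhythm per clan.
CLAN_RHYTHMS = {
    "clan_A": [0.25, 0.25, 0.25, 0.25],  # evenly spaced clicks
    "clan_B": [0.40, 0.20, 0.20, 0.20],  # long opening gap
}

def guess_clan(clicks):
    """Pick the clan whose prototype rhythm is closest to this coda."""
    rhythm = normalised_gaps(clicks)
    def dist(proto):
        return sum((a - b) ** 2 for a, b in zip(rhythm, proto))
    return min(CLAN_RHYTHMS, key=lambda clan: dist(CLAN_RHYTHMS[clan]))
```

Even with a little timing jitter, a coda lands closest to its own clan's prototype, which is how subtle, consistent differences in speech patterns can identify a family.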
21:16 A single word could have multiple
21:18 meanings depending on the rhythm and the
21:21 context of the conversation and the
21:24 specific dialect it was spoken in. With
21:26 the help of AI and other advanced
21:29 technology, scientists have been able to
21:31 not just identify the language of these
21:34 whales, but also give humanity a
21:37 window into a culture no one knew existed.
21:39 Now that we can understand the
21:41 communication patterns of different
21:44 whales, studies have been
21:46 launched to get a better idea of
21:49 the social connections, identity, and
21:52 shared history of these whales. This
21:54 realization changes our understanding
21:57 from a simple translation of words to
21:59 knowing the ins and outs of marine life
22:02 and what it holds. But what could this
22:05 new development mean for the future?
22:08 The goal of two-way communication. For
22:10 decades, we've only been able to listen
22:15 to the whales. Now, thanks to the AI, we
22:17 finally have a way to understand the
22:19 structure of their language. This new
22:22 understanding opens up a thrilling and
22:25 somewhat scary next phase of Project
22:29 CETI. Moving from listening to talking,
22:32 the team is no longer content with just
22:35 decoding. They're aiming to communicate.
22:37 Their goal is to build a real-time
22:40 communication system, what some people
22:43 are calling an underwater chatbot. This
22:46 isn't about teaching whales English, but
22:48 about using the patterns and
22:50 structures the AI discovered to create
22:53 messages in their own language. It's
22:55 just like how a scientist would use a
22:58 specially designed speaker to play a
23:01 specific sequence of clicks, hoping for
23:04 a response. The team is starting with
23:06 simple interactions like greeting a
23:09 whale or maybe asking a question about a
23:11 particular object. It's an ambitious
23:14 dream, a true two-way conversation
23:17 between two very different species. This
23:20 audacious goal also brings up some deep
23:22 thought-provoking questions about the
23:26 ethics of interspecies dialogue. Should
23:28 we be talking to them? And what part do
23:31 we play in disturbing the balance
23:34 of marine life? When you start to build
23:36 a bridge between two worlds, you have to
23:38 think about what might cross that
23:41 bridge. From there on, the question and
23:43 thought of whether it is right to
23:45 introduce our technology and our
23:47 language into their world starts to come
23:51 up. Some scientists worry that a human
23:53 initiated dialogue could disrupt their
23:56 natural social structure or even change
23:59 their language. It's a huge moral
24:02 question that the Project CETI team
24:05 takes very seriously. They're not just
24:08 trying to achieve a technical goal.
24:09 They're trying to do it in a way that
24:12 respects the whales and their culture.
24:14 They want the conversation to happen on
24:17 the whales' terms, in their own language,
24:20 and in their own time. The team had a
24:23 specific recent success story that shows
24:26 this new future is already beginning. In
24:29 2023, the team had what they call the
24:31 first intentional human-whale
24:33 interaction using the language they had
24:37 decoded. A diver using a special
24:39 underwater speaker played a specific
24:43 sperm whale coda. The response was a huge
24:46 moment for the project. A whale in the
24:49 area responded with the exact same coda.
24:52 It was a direct back and forth exchange,
24:55 not just a random coincidence. The team
24:58 couldn't believe it. This wasn't just a
25:01 recording. This was a conversation. It
25:04 was a single small exchange, but its
25:07 significance was monumental.
25:08 It proved that their understanding of
25:10 the language was accurate enough to be
25:13 used for basic communication. This event
25:16 wasn't just a scientific breakthrough.
25:18 It was a positive sign that we might one
25:21 day be able to talk to other minds on
25:24 our planet. This realization changes
25:26 everything. We're on the verge of
25:28 something that was once only found in
25:30 science fiction. We're moving from
25:33 observing to participating, from
25:36 listening to interacting. Communicating
25:38 with another species is a monumental
25:41 step, but the implications extend far
25:44 beyond the whales themselves.
25:47 This technology could rewrite our place
25:50 in the universe. Implications for
25:52 science and humanity.
25:56 The work of Project CETI is forging a new
25:59 frontier for biology. For a long time,
26:02 marine biology and animal cognition were
26:05 limited by our ability to observe and
26:08 interpret. We could watch whales, but we
26:09 couldn't understand what they were
26:13 saying or thinking. Now, with AI as the
26:15 main translator, there are unlimited
26:17 opportunities waiting out there with
26:20 more to explore and uncover. It would no
26:22 longer be just about studying their bodies or
26:25 behaviors. The chances of uncovering
26:27 the structures of their lives, the
26:29 communities they make up, and their
26:31 intelligence, once humanly
26:35 impossible, are now high. This
26:38 research is completely changing how we
26:40 see animal communication, revealing a
26:43 level of complexity and social value
26:46 that no one could have imagined. With
26:48 this, some people believe that so many
26:50 wonders of the world might now be
26:53 explained, and some of the biggest
26:56 questions in science could possibly have
26:58 an answer. All the data from
27:01 Project CETI can not only help us
27:03 understand sperm whales, but also
27:06 revolutionize our understanding of the
27:09 very nature of intelligence itself.
27:11 Perhaps the most interesting and
27:13 mind-bending implication of this research
27:16 is its connection to space. The
27:19 project's name, CETI, is a
27:23 deliberate nod to SETI; the
27:26 similarities are undeniable. For decades,
27:28 scientists have been scanning the cosmos
27:31 for radio signals, hoping to find a sign
27:33 of life. But what if they've been
27:36 looking for the wrong kind of message?
27:39 With the success of Project CETI, the
27:41 whales give us a blueprint for
27:43 extraterrestrial contact. The methods
27:46 and algorithms they developed to find
27:48 patterns in clicks and decode whale
27:51 language could be the exact tools we
27:53 would use if we ever encounter an alien
27:56 civilization. If an alien species
27:58 communicates in a way that is completely
28:01 foreign to us, we now have a proven
28:04 strategy for how to approach it. The
28:06 principles of using AI to find structure
28:09 in a complex unknown signal are universal.
28:12 This work is not just about
28:15 communicating with whales, but also
28:17 about preparing for the possibility of
28:20 communicating with anyone anywhere in
28:22 the universe. It seems like a
28:25 far-fetched thought, but this journey to
28:28 space, unlike any other, may have
28:31 begun with curiosity about the ocean.
28:35 Finally, the work of Project CETI brings
28:37 us back to Earth in the most powerful
28:39 way. With a deeper understanding of
28:42 whale language, a new level of empathy
28:45 and conservation is fostered. For
28:47 centuries, whales have been seen as a
28:50 species to be harvested, then a species
28:54 to be saved, but always from a distance.
28:56 By knowing that they are a vital part of
28:58 research, we are forced to confront
29:00 their intelligence and personhood in a
29:03 way that is hard to ignore. When we hear
29:05 their communication, their concerns
29:08 about their families, their culture, and
29:10 their world, they are no longer just
29:13 animals in the ocean. They are beings
29:16 with thoughts, feelings, and a history.
29:18 This new connection strengthens the case
29:21 for their protection and the health of
29:24 our oceans. Knowing that these creatures
29:26 have a rich, complex social life makes
29:29 their survival all the more important.
29:32 It gives us a personal emotional reason
29:34 to care about the health of the marine
29:37 environment that they call home. The
29:40 story of the whales might seem complete,
29:42 but it's only the beginning. The work of
29:45 Project CETI gives us a new way to see the
29:48 world and a new role to play in it. So,
29:52 what's next for Project CETI?
29:54 The team at Project CETI isn't slowing
29:57 down as there are already plans in place
30:00 to expand their research to other whale
30:03 species like orcas and refine their AI
30:06 with new algorithms. This effort is part
30:08 of a global movement where scientists
30:10 are using AI to study animal
30:13 communication, showing us a world of
30:16 complex language we never knew existed.
30:19 To round it up, their work is about more
30:22 than just science. It's also about
30:24 building a bridge between our species
30:26 and theirs for a shared future on a
30:29 shared planet. The whales have a lot to
30:31 tell us about the health of the ocean,
30:33 and we are finally learning how to listen.
30:37 So, what are your thoughts on this
30:40 groundbreaking research? Do you think
30:43 talking to whales is a huge step in the
30:45 right direction for humanity? Or do you
30:47 have reservations about what it might
30:50 mean? Let us know what you think in the
30:53 comments below. Don't forget to like,
30:56 subscribe, and hit the notification bell
30:58 on our channel so you never miss any of
31:02 our uploads. If you enjoyed this video,
31:04 click on the next video on your screen