0:02 Welcome to the Huberman Lab podcast,
0:03 where we discuss science and
0:11 I'm Andrew Huberman and I'm a professor
0:13 of neurobiology and ophthalmology at
0:15 Stanford School of Medicine. My guest
0:18 today is Dr. Poppy Crum. Dr. Poppy Crum is a
0:21 neuroscientist, a professor at Stanford,
0:23 and the former chief scientist at Dolby
0:25 Laboratories. Her work focuses on how
0:26 technology can accelerate
0:28 neuroplasticity and learning and
0:30 generally enrich our life experience.
0:31 You've no doubt heard about and perhaps
0:34 use wearables and sleep technologies
0:35 that can monitor your sleep, tell you
0:36 how much slow-wave sleep you're getting,
0:38 how much REM sleep, and technologies
0:40 that can control the temperature of your
0:41 sleep environment and your room
0:44 environment. Well, you can soon expect
0:46 wearables and hearable technologies to
0:48 be part of your life. Hearable
0:49 technologies are, as the name suggests,
0:51 technologies that can hear your voice
0:53 and the voice of other people and deduce
0:55 what is going to be best for your
0:56 immediate health and your states of
0:58 mind. Believe it or not, these
0:59 technologies will understand your brain
1:01 states, your goals, and it will make
1:03 changes to your home and working and
1:05 other environments so that you can focus
1:07 better, relax more thoroughly, and
1:08 connect with other people on a deeper
1:10 level. As Poppy explains, all of this
1:12 might seem kind of space age and maybe
1:14 even a little aversive or scary now. But
1:16 she explains how it will vastly improve
1:19 life for both kids and adults and indeed
1:20 increase human-to-human empathy. During
1:22 today's episode, you'll realize that
1:24 Poppy is a true out-of-the-box thinker
1:26 and scientist. She has a really unique
1:27 story. She discovered she has perfect
1:29 pitch at a young age. She explains what
1:31 that is and how that shaped her
1:33 worldview and her work. Poppy also
1:35 graciously built a zero-cost step-by-step
1:37 protocol for all of you. It allows you
1:40 to build a custom AI tool to improve at
1:42 any skill you want and to build better
1:44 health protocols and routines. I should
1:45 point out that you don't need to know
1:47 how to program in order to use this tool
1:49 that she's built. Anyone can use it and
1:51 as you'll see, it's extremely useful. We
1:52 provide a link to it in the show note
1:54 captions. Today's conversation is unlike
1:56 any that we've previously had on the
1:58 podcast. It's a true glimpse into the
2:00 future and it also points you to new
2:02 tools that you can use now to improve
2:04 your life. Before we begin, I'd like to
2:06 emphasize that this podcast is separate
2:07 from my teaching and research roles at
2:09 Stanford. It is however part of my
2:11 desire and effort to bring zero-cost-to-
2:13 consumer information about science and
2:15 science related tools to the general
2:16 public. In keeping with that theme,
2:19 today's episode does include sponsors.
2:21 And now for my conversation with Dr.
2:24 Poppy Crum. Dr. Poppy Crum, welcome.
2:26 >> Thanks, Andy. It's great to be here.
2:28 >> Great to see you again. We should let
2:29 people know now we were graduate
2:31 students together, but that's not why
2:33 you're here. You're here because you do
2:35 incredibly original work. You've worked
2:36 in so many different domains of
2:39 technology, neuroscience, etc. Today I
2:41 want to talk about a lot of things, but
2:42 I want to start off by talking about
2:44 neuroplasticity. This incredible ability
2:46 of our nervous systems to change in
2:49 response to experience. I know how I
2:51 think about neuroplasticity, but I want
2:52 to know how you think about
2:54 neuroplasticity. In particular, I want
2:56 to know, do you think our brains are
2:58 much more plastic than most of us
3:00 believe? Like, can we change much more
3:01 than we think? and we just haven't
3:04 accessed the ways to do that. Or do you
3:06 think that our brains are pretty fixed
3:08 and in order to make progress as a
3:10 species, we're gonna have to, I don't
3:12 know, create robots or something to to
3:13 do the work that we're not able to do
3:15 because our brains are fixed. Let's
3:19 start off by just getting your take on
3:21 what neuroplasticity is and what you
3:24 think the limits on it are.
3:26 >> I do think we're much more plastic
3:28 than we talk about or realize in our
3:30 daily lives. And just to your point
3:32 about creating robots: the more we
3:33 create robots, there's neuroplasticity
3:36 that comes with
3:38 using robots as humans, when we use
3:40 them in partnerships or as you know
3:43 tools to accelerate our capabilities. So
3:46 neuroplasticity, where I
3:50 resonate with it a lot, and this is what I've
3:54 done a lot of in my career, is thinking
3:57 about building and developing
3:59 technologies with an understanding
4:02 of how they shape our brain. Everything
4:04 we engage with in our daily lives,
4:06 whether it's the statistics of our
4:09 environments and our contexts or the
4:12 technologies we use on a daily basis are
4:14 shaping our brains in ways through
4:16 neuroplasticity. Um, some more than
4:19 others. Some we know as we age are very
4:21 dependent on how attentive and engaged
4:23 we are as opposed to passively just
4:27 consuming and changing.
4:31 But we are in a place where everyone I
4:33 believe needs to be thinking more about
4:34 how the technologies they're using,
4:36 especially in the age of AI and
4:39 immersive technologies, how they are
4:42 shaping, you know, or architecting our
4:45 brains as we move forward. You go to any
4:47 neuroscience 101 medical school textbook
4:48 and there's something you'll you'll see
4:50 a few pages on something called the
4:53 homunculus. Now, what is the homunculus?
4:55 It's a data representation, but it'll
4:57 be this sort of funny-looking creature
4:59 when you see it. But that picture of
5:02 this sort of distorted human that you're
5:05 looking at is really just um a data
5:08 representation of how many cells in your
5:13 brain are helping or coding and
5:15 representing information for your sense
5:19 of touch, right? And that that image
5:20 though and this is where things get kind
5:23 of funny. That image comes from Wilder
5:25 Penfield back in the 40s. He would
5:28 record from the somatosensory
5:31 cells of patients just before they
5:33 were to have you know surgery for
5:35 epilepsy and such. And you know since we
5:37 don't have pain receptors in our cortex
5:39 he could have this awake human and be
5:41 able to touch different parts of their
5:43 brain and ask them you know to report
5:45 what sensation they felt on their
5:48 bodies. And so he mapped that part of
5:50 their their cortex and then that that's
5:52 how we ended up with the homunculus and
5:53 you'll see you know it'll have bigger
5:56 lips. It'll have you know smaller parts
5:58 of your back in the areas where you just
6:00 don't have the same sensitivities.
6:03 Well fast forward to today when you look
6:04 at that homunculus one of the things I
6:06 always will ask people to think about is
6:09 you know what's wrong with this image?
6:12 You know, this is an image from 1940
6:15 that is still in every textbook. And you
6:17 know, any Stanford student will look at
6:19 it and they'll immediately say, "Well,
6:20 the thumb should be bigger because we do
6:22 this all day long and I've got more
6:24 sensitivity in my fingers because I'm
6:26 always typing on my mobile device."
6:28 Which is absolutely true. Or maybe
6:30 they'll say something like, "Well, the
6:33 ankles are the same size, and we
6:34 drive cars now a lot more than we did in
6:37 the 40s." or maybe if I live different
6:39 part of the world I drive on one side
6:41 versus the other and in in a few years
6:43 you know we probably won't be driving
6:45 and those resources get optimized
6:49 elsewhere. So what the homunculus is, is
6:51 it's a representation of how our brain
6:53 has allocated resources to help us be
6:56 successful and those resources are the
6:59 limited cells we have that support
7:02 whatever we need to flourish in our
7:04 world. And the the beauty of that is
7:07 when you develop expertise, you develop
7:10 more support, more resources go to
7:13 helping you do that thing. But they also
7:15 get more specific. They develop more
7:18 specificity. So that, you know, I might
7:21 suddenly have a lot more cells in my
7:23 brain devoted to helping me. Say, you know,
7:27 I'm a violinist: for my left hand,
7:28 in my right hemisphere, in my somatosensory
7:30 cortex, I'm going to have a lot more
7:32 cells that are helping me, you know, feel
7:35 my fingers and the tips of
7:37 everything so that I can, you know, be
7:40 fluid and more virtuosic. But that
7:43 means I have more cells but they're more
7:45 specified they're giving me more
7:47 sensitivity they're giving me more data
7:49 that's differentiated and that's what my
7:50 brain needs and that's what my brain's
7:53 responding to. And so when we think
7:55 about that, you know, my practice as a
7:58 musician versus my practice playing
8:00 video games, all of these things
8:03 influence our brain and influence
8:07 our plasticity. Now, where things
8:09 get kind of interesting to me and sort
8:12 of my obsession on that side is every
8:14 time we engage with a technology, it's
8:16 going to shape our brain, right? It's
8:19 both, you know, our environments, but
8:21 our environments are changing. Those are
8:23 shaping who we are. You know, I think
8:25 you can look at um people's hearing
8:27 thresholds and predict what city they
8:29 live in.
>> Yeah, absolutely. Yes.
8:32 >> Can you just briefly explain hearing
8:34 thresholds and why that would be? I
8:35 mean, I was visiting the city of Chicago
8:37 a couple years ago. Beautiful city.
8:40 Yeah. Amazing food. Love the people.
8:42 >> Very loud city.
8:44 >> Wide downtown streets. Not a ton of trees
8:47 >> compared to what I'm used to.
8:49 >> And I was like, "Wow, it's really loud
8:52 here." And I grew up in the suburbs. Got
8:54 out as quickly as I could. Don't like
8:56 the suburbs. Sorry. Suburb dwellers not
8:59 for me. Um I like the wilderness and I
9:03 like cities. Um, but you're telling me
9:05 that you can actually predict people's
9:09 hearing thresholds for loudness simply
9:10 based on where they were raised or where
9:11 they currently live.
9:14 >> In part, it can be both, right? Because
9:16 cities have sonic imprints, types of
9:19 noise, things that are very, you know,
9:21 very loud cities, but also what's
9:23 creating that noise, right? That's often
9:25 unique: the inputs, the types of
9:28 vehicles, the density of people,
9:31 and even the
9:33 construction in those environments. It
9:36 is changing what noise exists. That's
9:38 shaping, you know, people's hearing
9:40 thresholds at the lowest level. It's
9:42 also shaping their sensitivities. If
9:43 you're used to hearing, you know,
9:45 certain animals in your environment and
9:48 they come with, you know, a response you
9:50 should be heightened to,
9:52 you're going to develop increased
9:54 sensitivity to that, right? Whereas, if
9:56 it's really abnormal. You know, I
9:58 hear chickens. I have a neighbor who has
10:00 chickens in the city, but roosters, too.
10:02 >> Yes. Yes.
10:04 >> I grew up near a rooster. I can still
10:06 hear that rooster.
10:06 >> Yeah.
10:08 >> Those those sounds are embedded deeply
10:11 in my mind. There's the semantic context
10:13 and then just the sort of spectrum,
10:14 right? And the intensity of that
10:16 spectrum. And meaning when I say
10:17 spectrum, I mean the different
10:20 frequency, amplitudes and and what that
10:20 shaping is like.
10:22 >> High pitch, low pitch, the same.
10:25 >> Yeah. Yeah. And that affects how your
10:27 neural system is changing, even at the
10:30 lowest level of what, you know,
10:33 your ear, your brain, your cochlea
10:37 is getting exposed to. But then also
10:39 So that would be the
10:42 lower level, you know, what sort of
10:44 noise damage might exist, what exposures.
10:46 But then also there's the
10:48 amplification, you know, coming from
10:50 your higher-level areas that are helping
10:52 you know that these frequencies are
10:55 more important in your context, in your
10:58 environment. There is a funny one, this
11:00 is kind of funny: there was a film
11:01 called, I think, The Sound of Silence,
11:03 and it starred, I love Peter Sarsgaard,
11:05 he was one of the actors in it. And
11:07 it was sort of meant to be a bit
11:09 fantastical or is that a word? Is that
11:12 the right word?
11:15 But in fact, the filmmakers
11:18 had interviewed me, talked to me a lot,
11:21 to inform the sort of main
11:22 character and the way he behaved, because
11:24 I have absolute pitch and there were
11:25 certain things that they were trying to
11:28 emulate in this film. He
11:30 ends up being this person who tunes
11:32 people's lives. He'll walk into their
11:34 environments and be like, "Oh, you know,
11:36 things are going badly at work or your
11:38 relationships, because, you
11:40 know, you've got this tritone:
11:43 your water heater is making this,
11:45 you know, pitch and your teapot is at this."
11:47 >> Oh my god, this would go over so well in
11:48 LA. People would pay millions of dollars
11:49 in Los Angeles.
11:50 >> Totally funny.
11:51 >> Do you do this for people?
11:52 >> Um, no.
11:53 >> Okay. Okay.
11:56 >> I will tell you, I will walk into
11:58 hotel rooms and immediately, if I hear
12:00 something, I've moved. And so, you
12:01 know, that is, I
12:02 >> because you have perfect pitch. Could
12:04 you define perfect pitch? Does that mean
12:05 that you can always hit a note perfectly
12:06 with your voice?
12:08 >> There is no such thing as perfect pitch.
12:10 There's absolute pitch. And so, think:
12:15 the idea is that, like, A
12:18 would equal 440 hertz, right? But
12:20 that's a standard that we use in modern
12:23 times, and, you know, what A
12:27 is has actually changed throughout
12:28 our lives, with aesthetics, with what
12:30 people liked, with the tools we used to
12:32 create music. And, you know, in the Baroque
12:35 era, A was 415 hertz, and that
12:37 >> You hit that.
12:40 >> Awesome. And, in any case, so that's
12:42 why it's absolute. Because, you know,
12:46 guess what? As my basilar membrane
12:49 gets more rigid as I age, or my
12:51 temporal processing slows down, my
12:53 brain's going to still think I'm
12:55 singing 440 Hz, but it might
12:55 not be.
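To make the pitch-standard point concrete: in twelve-tone equal temperament, every note's frequency derives from a single reference pitch, so shifting the reference (A4 = 440 Hz today, roughly 415 Hz in the Baroque era) shifts every note. A minimal sketch, illustrative and not from the episode:

```python
# Toy illustration: all note frequencies derive from one reference A4.
# Changing the reference pitch convention moves every note, which is why
# "absolute" pitch is anchored to a standard rather than being "perfect".

def note_frequency(semitones_from_a4: int, reference_a4: float = 440.0) -> float:
    """Frequency of the note `semitones_from_a4` half steps away from A4."""
    return reference_a4 * 2 ** (semitones_from_a4 / 12)

print(note_frequency(0))          # A4 at modern pitch -> 440.0
print(note_frequency(3))          # C5 -> ~523.25 Hz
print(note_frequency(0, 415.0))   # A4 at a Baroque-style reference -> 415.0
```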
12:57 >> The basilar membrane is a portion of the
13:00 internal ear that converts sound
13:02 waves into electrical signals, right?
13:03 Yeah. Okay, fair enough. Well,
13:06 >> I'm talking to an auditory physiologist
13:08 here. I teach auditory physiology,
13:09 but I want to just make sure, because
13:12 I'm sitting across from an expert.
13:13 >> I'd like to take a quick break and
13:15 acknowledge one of our sponsors, David.
13:17 David makes a protein bar unlike any
13:21 other. It has 28 g of protein, only 150
14:23 calories, and zero grams of sugar. That's
13:26 right, 28 g of protein, and 75% of its
13:29 calories come from protein. This is 50%
13:30 higher than the next closest protein
13:32 bar. David protein bars also taste
13:35 amazing. Even the texture is amazing. My
13:36 favorite bar is the chocolate chip
13:38 cookie dough. But then again, I also
13:39 like the new chocolate peanut butter
13:41 flavor and the chocolate brownie
13:42 flavored. Basically, I like all the
13:44 flavors a lot. They're all incredibly
13:46 delicious. In fact, the toughest
13:47 challenge is knowing which ones to eat
13:49 on which days and how many times per
13:51 day. I limit myself to two per day, but
13:53 I absolutely love them. With David, I'm
13:55 able to get 28 grams of protein in the
13:57 calories of a snack, which makes it easy
13:59 to hit my protein goals of 1 gram of
14:01 protein per pound of body weight per
14:03 day. And it allows me to do so without
14:06 ingesting too many calories. I'll eat a
14:07 David protein bar most afternoons as a
14:09 snack, and I always keep one with me
14:11 when I'm out of the house or traveling.
14:13 They're incredibly delicious, and given
14:15 that they have 28 grams of protein,
14:17 they're really satisfying for having
14:19 just 150 calories. If you'd like to try
14:20 David, you can go to davidprotein.com/huberman.
14:26 Again, that's davidprotein.com/huberman.
14:28 Today's episode is also brought to us by
14:30 Helix Sleep. Helix Sleep makes
14:31 mattresses and pillows that are
14:34 customized to your unique sleep needs.
14:35 Now, I've spoken many times before on
14:37 this and other podcasts about the fact
14:38 that getting a great night's sleep is
14:40 the foundation of mental health,
14:42 physical health, and performance. Now,
14:43 the mattress you sleep on makes a huge
14:45 difference in the quality of sleep that
14:47 you get each night. how soft it is or
14:48 how firm it is all play into your
14:50 comfort and need to be tailored to your
14:52 unique sleep needs. If you go to the
14:54 Helix website, you can take a brief
14:55 two-minute quiz and it will ask you
14:57 questions such as, "Do you sleep on your
14:58 back, your side, or your stomach? Do you
15:00 tend to run hot or cold during the
15:02 night?" Things of that sort. Maybe you
15:03 know the answers to those questions,
15:05 maybe you don't. Either way, Helix will
15:07 match you to the ideal mattress for you.
15:08 For me, that turned out to be the Dusk
15:10 mattress. I started sleeping on a Dusk
15:12 mattress about 3 and a half years ago,
15:14 and it's been far and away the best
15:16 sleep that I've ever had. If you'd like
15:17 to try Helix Sleep, you can go to
15:20 helixsleep.com/huberman,
15:22 take that 2-minute sleep quiz, and Helix
15:23 will match you to a mattress that's
15:25 customized to you. Right now, Helix is
15:27 giving up to 27% off all mattress
15:28 orders. Again, that's helixsleep.com/huberman
15:33 to get up to 27% off.
15:36 >> Okay, so our brains are customized to
15:38 our experience. Yeah.
15:41 >> Especially our childhood experience, but
15:42 also our adult experience.
15:43 >> Yes.
15:45 >> You mentioned the homunculus, this
15:47 representation of the body surface. And
15:48 you said something that I just have to
15:50 pick up on and ask some questions about,
15:52 which is that um
15:54 >> this hypothetical Stanford student could
15:56 be any student anywhere says, "What?
15:59 Wait, nowadays, uh, we spend a lot of
16:00 time writing with our thumbs and
16:02 thinking as we write with our thumbs and
16:04 emoting, right? I mean, when we text
16:06 with our thumbs, we're sometimes
16:08 involved in an emotional exchange.
16:08 >> Yeah.
16:11 >> My question is this.
16:15 The last 15 years or so have represented
16:18 an unprecedented time of new technology
16:20 integration, right? I mean, the smartphone,
16:28 um, texting. And when I text, I realize
16:30 that I'm hearing a voice in my head as I
16:32 text,
16:35 which is my voice. Because if I'm
16:38 texting outward, I'm sending a text.
16:42 But then I'm also internalizing the
16:44 voice of the person writing to me if I
16:45 know them.
16:48 >> But it's coming through filtered by my
16:51 brain. Right. So it's like I'm not
16:52 trying to micro dissect something here
16:55 for the sake of micro dissection but the
16:57 conversation that we have by text it's
17:00 all happening in our own head but there
17:03 are two or more players group text was
17:04 too complicated to even consider right
17:08 now but what is that transformation
17:10 really about previously I would write
17:12 you a letter would send you a letter I'd
17:14 write you an email I'd send you an email
17:17 and so the process was really slowed now
17:18 you can be in a conversation with
17:20 somebody that's fast back and forth,
17:22 >> right? Some people can type fast. You
17:24 can email fast, but nothing like what
17:26 you can do with text, right? I can even
17:28 know when you're thinking because it's
17:31 dot dot dot or you're writing, right?
17:35 And so is it possible that we've now
17:37 allocated an entire region of the
17:39 homunculus, or of some other region of
17:41 the cortex or
17:46 brain, to conversation that prior to 2010
17:49 or so the brain just was not involved in
17:51 conversations of any sort. In other
17:52 words, we now have the integration of
17:56 writing with thumbs. That's new.
17:59 hearing our own voice, hearing the
18:01 hypothetical voice of the other person
18:04 at the other end and doing that all at
18:06 rapid speed. Are we talking about like a
18:09 new brain area or are we talking about
18:12 using old brain areas and just trying to
18:14 find and push the overlap in the Venn
18:16 diagram? Because I remember all of this
18:17 happening very quickly and very
18:19 seamlessly. I remember like texting
18:22 showed up and it was like, "All right,
18:24 well, it's a little slow, a little
18:26 clunky." Pretty soon it was autofill.
18:28 Pretty soon it was learning us. Now we
18:30 can do voice recognition. And it's it's
18:32 it you know people picked this up very
18:35 fast. So the question is are we taking
18:38 old brain areas and combining them in
18:40 new ways or is it possible that we're
18:42 actually changing the way that our brain
18:44 works fundamentally in order to be able
18:46 to carry out something as what seems to
18:50 be nowadays trivial but as uh as basic
18:52 to everyday life as texting. What's
18:54 going on in our brain?
18:56 >> We aren't developing new resources.
18:59 We've got the same cells, or, I mean, there's
19:01 neurogenesis of course, but it's how
19:03 those are getting allocated and you know
19:06 just one quick comment from what we
19:08 said before: when we talk about the
19:10 homunculus, the homunculus is an example of
19:12 a map in the brain a cortical map and
19:14 maps are important in the brain because
19:16 they you know allow cells that need to
19:18 interact to give us specificity to make
19:21 us fast to have you know tight reaction
19:23 times and things you know because you
19:26 got shorter distance and you know things
19:27 that belong together. Also there's a lot
19:29 of malleability in terms of, you know, what
19:31 those cells respond to potentially
19:32 dependent on our inputs. So the
19:34 homunculus might be one map but there
19:36 are maps all over our brain and those
19:38 maps still have a lot of cross input. So
19:40 what you're talking about is are you
19:43 having areas where we didn't used to
19:46 allocate and differentiate the
19:49 specificity of what those cells were
19:52 doing that are now quite related to the
19:54 different ways my brain is having to
19:56 interpret a text message and the
19:58 subtlety and the nuance of that, that
20:01 actually now I get faster at. I have
20:04 faster reaction times I also have faster
20:06 interpretations. So am I allocating
20:08 cells that used to do something else to
20:10 allow me to have that? Probably. But I'm
20:12 also building, you know, where like
20:14 think about me as a multisensory object
20:17 that has, you know, I have to integrate
20:19 information across sight, sound, smell
20:22 to form a holistic, you know, object
20:24 experience. That same sort of, you know,
20:27 integration and and pattern is happening
20:29 now when we communicate in ways that it
20:31 didn't used to. So what does that mean?
20:32 It means there's a lot more
20:34 repeatability, a lot faster pattern
20:37 matching, a lot more integration that is
20:38 allowing us to go faster.
20:40 >> I completely agree. I feel like there's
20:42 an entire generation of people who grew
20:44 up with smartphones,
20:46 >> uh, for which it's just part of life. I
20:48 think one of the most impactful
20:50 statements I ever heard in this kind of
20:52 general domain was I gave a talk down at
20:54 Santa Clara University one evening to
20:56 some students.
20:58 >> Um, and I made a comment about putting
21:00 the phone away and how much easier it is
21:02 to focus when you put the phone away and
21:03 how much better life is when you take
21:05 space from your smartphone and all of
21:08 this kind of thing. And afterwards, this
21:09 young guy came up to me. He's probably
21:11 in his early 20s and he said, "Listen,
21:14 you don't get it at all." I said, "What do
21:15 you mean?" And he said, "You adopted this
21:18 technology into your life and after your
21:20 brain had developed." He said, "When,"
21:22 he's speaking for himself. He said,
21:24 "When my phone runs out of charge, I
21:27 feel the life drain out of my body and
21:31 it is unbearable
21:34 or nearly unbearable until that phone
21:36 pops back on."
21:38 And then I feel life returned to my
21:41 body. And it's because I can communicate
21:42 with my friends again. I don't feel
21:45 alone. I don't feel cut off from the
21:47 rest of the world. And I was thinking to
21:49 myself, wow. Like his statements really
21:51 stuck with me because I realized that
21:54 his brain, as he was pointing out, is
21:56 indeed fundamentally different than mine
21:57 in terms of social context,
22:00 communication, feelings of safety, and
22:02 on and on. And I don't think he's alone.
22:04 I think for some people it might not be
22:06 quite as extreme,
22:11 >> but for many of us um to see that dot
22:14 dot dot in the midst of a conversation
22:16 where we really want the answer to
22:19 something um or it's an emotionally
22:22 charged conversation can be uh a very
22:24 intense human experience.
22:26 >> That's interesting. So we've we've sped
22:29 up the rate that we transfer information
22:31 between one another. But even about
22:32 trivial things, it doesn't have to be an
22:34 argument or like is it, you know, stage
22:36 four cancer or is it benign, right? Like
22:38 these are those are extreme conditions,
22:39 right? Are they alive? Are they dead?
22:41 You know, did they find him or her or
22:43 did they not? You know, those are
22:45 extreme cases. But there's just the
22:48 everyday life of um and I noticed this
22:51 like if I go um up the coast sometimes
22:53 or I'll go to Big Su and I I will
22:54 intentionally have time away from my
22:56 phone. It takes about an hour or two
22:58 or maybe even a half day to really drop
23:00 into the local environment where you're
23:02 not looking for stimulation coming in
23:04 through the smartphone. And I don't
23:06 think I'm unusual in that regard either.
23:08 So I guess the question is, do you think
23:15 the technology is good, bad, neutral, or
23:17 are you agnostic as to how the
23:19 technologies are shaping our brain?
23:21 >> It goes in lots of different directions.
23:23 Um, one thing I did want to say though
23:25 with what with smartphones specifically
23:28 and sort of everything, you know, in in
23:30 audio, you know, that our ability to
23:35 have, you know, carry our lifetime of
23:37 music and content with us, has been
23:39 because of, you know, huge advances in
23:42 the last 25, 30 years, and maybe
23:45 even slightly more, around compression
23:47 algorithms that have enabled us to have
23:50 really effective, what we call perceptual
23:52 compression, lossy perceptual algorithms,
23:54 things like MP3, and, you know,
23:57 my past work with companies like Dolby.
23:59 But whenever you're talking about what's
24:02 the goal of content compression
24:05 algorithms, it's to translate the
24:07 entirety of the experience, the entirety
24:09 of a signal in, you know, with with a
24:11 lot of the information removed, right?
24:13 But in intelligent ways. When you look
24:16 at the way someone is communicating with
24:20 acronyms and the shorthand that the next
24:22 generations use to communicate, it is
24:24 such a rich communication. Even though
24:27 they might just say LOL, I mean, it's
24:29 like or they might you you know, it's
24:32 it's it's actually a lossy compression
24:34 that's triggering a huge cognitive
24:35 experience, right?
24:37 >> Can you explain lossy for people who
24:39 might not be familiar with it? Lossy
24:41 means that in your encoding and decoding
24:43 of that information, there is actually
24:45 information that's lost when you decode
24:47 it. But hopefully that information is
24:49 not impacting the perceptual experience.
24:52 Imagine I have, you know, a song and I
24:54 want to represent that song. To make my
24:57 file smaller, I
24:58 could take out every other, you know,
25:01 500 milliseconds of it, and it
25:04 would sound really horrible, right? or I
25:05 could be a lot more intelligent and
25:07 instead basically, you know, if you look
25:09 at early models like MP3, they're
25:10 kind of like
25:12 computational models of the brain. They
25:14 stop, you know, they might stop at like
25:17 the auditory nerve, but they're trying
25:20 to put a model of how our brain would
25:22 deal with sound, what we would hear,
25:23 what we wouldn't. If this sound's
25:25 present, and it's present at the same
25:27 time as this sound, then this sound
25:29 wouldn't be heard, but this sound would
25:31 be. So we don't need to spend any of our
25:33 our bits coding this sound. Instead, we
25:35 just need to code this one. And so it
25:37 becomes an intelligent way for the model
25:39 and the algorithm of deciding what
25:40 information needs to be represented and
25:43 what doesn't, to create
25:46 the best perceptual experience,
25:49 perceptual meaning what we get to,
25:51 you know, take home.
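The masking logic described here can be sketched in a few lines. This is a cartoon of the decision rule, with made-up bandwidth and level thresholds, not the actual MP3 or Dolby psychoacoustic model:

```python
# Toy sketch of perceptual masking: if a quiet tone sits close in frequency
# to a much louder one, a lossy coder can skip it and spend no bits on it.
# The 200 Hz bandwidth and 20 dB depth below are illustrative assumptions.

def audible_components(components, mask_bandwidth_hz=200.0, mask_depth_db=20.0):
    """components: list of (frequency_hz, level_db). Return those worth coding."""
    keep = []
    for freq, level in components:
        masked = any(
            other_level - level > mask_depth_db            # much louder neighbor
            and abs(other_freq - freq) < mask_bandwidth_hz  # close in frequency
            for other_freq, other_level in components
        )
        if not masked:
            keep.append((freq, level))
    return keep

tones = [(1000, 80), (1100, 50), (3000, 55)]   # (Hz, dB) test signal
print(audible_components(tones))  # the quiet 1100 Hz tone is masked -> dropped
```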
25:54 I think one of the things that's important,
25:56 and why, whenever I used to have to teach
25:59 some of, you know, what it means to
26:01 represent a rich experience with minimal
26:05 data, with minimal information:
26:08 some of the acronyms that exist in
26:11 mobile texting, they've taken on a
26:14 very rich internal life.
26:15 >> yeah well those are simplistic ones but
26:17 I think people can have communication
26:19 now that we can't understand entirely.
26:21 >> This
26:21 is because you have a 10-year-old
26:22 daughter. Does she does she have
26:24 communication by acronym that to you is
26:25 cryptic?
26:27 >> Sometimes. But I have to figure it out
26:29 then. But yes. But the point is,
26:31 that is an example of a lossy
26:33 compression algorithm that actually has
26:35 a much richer perceptual experience,
26:38 right? And it often needs context, but
26:40 it's still, you know, you're using few
26:43 bits of information to try to represent
26:46 a much richer feeling in a much richer
26:48 state, right? And you know, if you look
26:49 at different people, they're going to
26:52 have, you know, bigger physiological
26:54 experience dependent on, you know, how
26:55 how they've grown up with that kind of context.
26:57 >> It sounds to me,
26:58 >> Yeah,
27:01 >> uh I don't want to um project here, but
27:04 it sounds to me like you see the great
27:06 opportunity of data compression.
27:07 Like let's just stay with the use of
27:10 acronyms in texting. That's a that's a
27:12 vast data compression compared to the
27:15 kind of speech and direct exchange that
27:18 people uh engaged in 30 years ago. So
27:21 there's less data being exchanged. Um
27:23 but the experience is just as rich if
27:25 not more rich is what you're saying,
27:27 which implies to me that you look at it
27:31 as generally neutral to benevolent.
27:32 Like it's good.
27:32 >> It's just different.
27:34 >> I'm coming up on 50 in a couple months.
27:36 as opposed to somebody saying, "Well,
27:38 you know, when I was younger, we'd write
27:40 our boyfriend or girlfriend a letter.
27:43 Uh, you know, I would
27:45 actually write out a birthday card. I
27:48 would go have a face-to-face
27:50 conversation." And you got this younger
27:52 generation that are saying, "Yeah,
27:54 whatever." You know, this is like what
27:56 we heard about, I used to trudge to
27:57 school in the snow kind of thing. It's
27:59 like, well, we have heated school buses
28:02 now and we've got driverless
28:05 cars. So um I think this is important
28:07 and useful for people of all ages to
28:11 hear that the richness of an experience
28:14 can be maintained even though the there
28:16 are data or some elements of the
28:18 exchange are being completely removed.
28:20 >> Absolutely. But it's maintained because
28:22 of the neural connections that are built
28:25 in those individuals, right, and that
28:27 generation.
28:29 >> I always think of, okay, the nervous
28:31 system likes to code along a continuum,
28:33 like yum, yuck, or meh. Do you think that a
28:35 technology is kind of neutral like yeah
28:36 you lose some things you gain some
28:39 things or do you think like this is bad
28:41 these days we hear a lot of AI fear
28:43 we'll talk about that um or you hear
28:45 also people who are super excited about
28:48 what AI can do what smartphones can do I
28:51 mean some people uh like my sister and
28:53 her daughter love smartphones because
28:55 they can communicate it gives a feeling
28:56 of safety at a distance like quick
28:58 communications are easier. It's hard to
29:01 sit down and write a letter. Um
29:02 she's going off to college soon. So the
29:03 question is like how often will you be
29:05 in touch? It raises expectations about
29:08 frequency of contact, but it
29:10 reduces expectations of depth,
29:11 >> because you can do like a hey was
29:12 thinking about you this morning and that
29:15 can feel like a lot but a letter if I
29:17 sent a letter home, you know, during
29:18 college, to my own family, like, hey, was thinking
29:21 about you this morning, love Andrew, they'd
29:23 be like, okay, like, I don't know how that
29:25 would be, like, well, that didn't take long,
29:28 right? So I think it's a
29:29 seesaw, you know.
29:31 >> you get more frequency and then it comes
29:33 with different levels of you know
29:35 expectation.
>> You know, my daughter's at
29:36 camp right now and we were only allowed
29:38 to write letters for two weeks.
29:39 >> Handwritten letters.
29:40 >> Handwritten letters. How did she get
29:43 over that? It's happening, I mean.
29:44 >> We'd lost our home in a flood years
29:47 ago. And one of the only things I
29:51 saved out of the flood is this,
29:52 >> And I just brought these back,
29:54 because I got them for my brother:
29:56 they're this communication
29:58 between one of my ancestors, you know,
29:59 during the Civil War. Like, they were
30:02 courting, and that was all saved, these
30:04 letters back and forth between the women,
30:06 and, you know, with
30:09 these it's like 1865. And
30:09 >> you have those letters?
30:12 >> I do. I do. I had them in my
30:14 computer bag until I flew up here. And,
30:16 you know, they were on parchment, and
30:18 even though they went through a flood,
30:20 they, you know, they didn't run, they
30:23 stayed. And it's this very different era of
30:25 communication, and it's wonderful to have
30:27 that preserved, because that doesn't
30:31 translate right through without
30:34 that history. In any case, I am a
30:37 huge advocate for integration of
30:38 technology but it's for me the world is
30:41 data, and I do think that way.
30:44 You know, I look at
30:45 the way my daughter behaves. I'm like,
30:48 okay, well, what data is coming in? Why
30:50 did she, you know, respond that way?
30:52 And, you know, there's this an example I
30:54 I can give. But, you know, we
30:55 were talking about neuroplasticity. It's
30:57 like we are the creatures of sort of
31:00 three things. One is, you know, our
31:03 sensory systems and how they've evolved,
31:05 be it from, you know, the intrinsic
31:08 noise in our
31:11 sensory receptors or the external signal.
31:13 You know, my brain is going to have
31:15 access to about the same amount of
31:17 information as someone with hearing loss
31:19 if I'm in a very noisy environment, and
31:20 so suddenly you've, you know,
31:22 compromised the data I have
31:25 access to. And then also our sort of
31:27 experientially established priors, right,
31:29 our priors being, if you think about
31:32 the brain as sort of a Bayesian model,
31:33 things aren't always deterministic for
31:35 us like they are for some creatures; our
31:37 brains are having to take data and make
31:39 decisions about it and respond.
31:40 >> Bayesian, we should just explain for people.
31:43 Deterministic would be input A leads to
31:44 output B, yeah.
31:47 >> Bayesian is, it depends on the statistics
31:48 of what's happening externally and
31:49 internally, yeah.
31:51 >> these are probabilistic models like
31:53 there's a likelihood of A
31:56 >> becoming B or there's a likelihood of A
31:58 driving B but there's also a probability
32:00 that A will drive C, D or F.
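A toy numeric version of the distinction being drawn here, with illustrative numbers only: Bayes' rule weighs a prior belief against how likely the evidence is under each hypothesis.

```python
# Toy Bayesian update (made-up numbers, not from the episode): the brain as
# described here doesn't map input A to output B deterministically; it weighs
# a prior against new evidence to form a posterior belief.

def posterior(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Bayes' rule for a binary hypothesis given one piece of evidence."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

# Prior belief that a sound in a city is a rooster: low. A crow-like call is
# far likelier if it really is a rooster, so the belief jumps after hearing it.
print(posterior(prior=0.05, p_evidence_if_true=0.9, p_evidence_if_false=0.05))
# -> ~0.49: one ambiguous observation moves a weak prior a long way
```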
32:02 >> Absolutely. And, you know, frankly, we
32:03 should get into, I mean, some of the
32:05 things that make us the most effective
32:07 in our environments and just in
32:10 interacting in the world is how fast and
32:11 effective we are with dealing with those
32:14 probabilistic you know situations. Those
32:16 things where your brain, it's like,
32:19 probabilistic inference is a great
32:21 indicator of success in an environment.
32:23 And you know, be it a work environment,
32:25 be it just, you know, walking down the
32:28 street. And how do we deal
32:30 with this like data that doesn't just
32:32 tell us we have to go right or left, but
32:33 there's a lot of different inputs and
32:34 it's our sort of situational
32:37 intelligence in the world. And
32:38 we can break that down into a lot of
32:40 different ways. In any case, we are the
32:42 products of our, you know, our sensory
32:44 systems, our experience, our priors,
32:47 which are the statistics and data
32:49 we've had up until that moment that our
32:51 brain's using to weight how it's going to
32:53 behave in the decisions it makes, but
32:55 also then our expectations, the context
32:56 of that, you know, that have shaped
32:58 where we are. And so there's this funny
32:59 story like my daughter when she was two
33:01 and a half, we're in the planetarium at
33:03 the Smithsonian and we're watching, I
33:05 think, one typical film you might watch
33:07 in a planetarium. We start in LA, zoom
33:09 out on our way to the sun, and we pass
33:11 that sort of, you know, quintessential
33:14 NASA image of the Earth, and it's
33:15 totally dark and silent. And my
33:17 daughter, as loud as she possibly could,
33:20 yells, "Minions." And I'm like, "What's
33:22 going on?"
33:24 I'm like, "Oh, yes, of course." Her
33:27 experientially established prior of that
33:30 image is coming from the Universal logo.
33:32 And you know, she never, you know, that
33:34 says Universal.
33:37 It was totally valid, but it was this
33:40 very uh you know honest and true part of
33:43 what it is to be human. Like each of us
33:46 is experiencing very different you know
33:48 having very different experiences of the
33:50 same physical information and we need to
33:54 recognize that but it is driven by our
33:56 exposures and our priors and our sensory
33:59 systems. It's sort of that trifecta and
34:01 our expectations of the moment. And once
34:04 you unpack that, you really start to rep
34:06 and and appreciate the influence of
34:10 technology. Now I am a huge advocate for
34:12 technology improving us as humans, but
34:14 also improving the data we have to make
34:17 better decisions and the sort of
34:20 insights that drive us. At the same
34:22 time, I think sometimes we're pennywise
34:25 pound foolish with how we use technology
34:27 and the quick things that make us faster
34:30 can also make us dumber and take away
34:33 our cognitive capabilities. And you know
34:35 where you'll end up with those that are
34:37 using the technologies might be to to
34:40 you know to write papers all the time
34:42 are maybe well and we we we can talk
34:43 about that more are putting themselves
34:46 in a place where they are going to be
34:48 compromised trying to do anything
34:50 without that technology and also in
34:52 terms of their their learning of that
34:55 data that information. And so you start
34:56 even ending up with bigger
34:58 differentiations and cognitive
35:01 capabilities by whether how you use a
35:03 tool a a technology tool to make you
35:06 better or faster or not. One of my sort
35:08 of things I've always done is teach at
35:10 Stanford that thus we also have that in common.
35:12 common.
35:14 >> I need to sit in on one of your lectures
35:15 >> and you know but my my class there has
35:17 been is called neuroplasticity and video
35:20 gaming and um I'm a neurohysiologist but
35:22 I'm I'm really a technologist. I like
35:23 buildings. I like you know innovation
35:27 across many domains and while that class
35:30 says video gaming it's really more well
35:32 video games are powerful in the sense
35:34 that there's this sort of closed loop
35:35 environment you give feedback you get
35:37 data on your performance but you get to
35:39 control that and know what you randomize
35:41 how you build and what our aim is in
35:44 that class is to build technology and
35:46 games with an understanding of the
35:49 neural circuits you're impacting and how
35:51 you want to what you want to train I'll
35:54 have um students that are musicians.
35:55 I'll have students that are computer
35:57 scientists. I'll have students that are,
35:59 you know, some of Samford's top
36:01 athletes. I've had a number of their top
36:04 athletes go through my my course and um
36:06 it's always focused on okay, there's
36:07 some aspect of human performance I want
36:10 to dissect and I want to really amplify
36:12 the sensitivity or the the access to
36:15 that type of learning in a closed loop
36:17 way. Just for anyone that isn't familiar
36:20 with the role or the history of gaming
36:22 in the neuroscience space, you know,
36:24 there's been some great papers in the
36:26 past. Um, take a gamer versus a
36:28 non-gamer just to start with someone
36:31 self-identified. A typical gamer
36:33 actually has what we would call more
36:35 sensitive, and this is your domain, so you
36:37 can counter me on this anytime, but, you
36:39 know, contrast sensitivity functions. And
36:41 a contrast sensitivity function is,
36:45 you know, the ability to see edges and
36:48 differentiation in a visual
36:52 landscape. Okay, they can see faster,
36:54 and, you know, they're more
36:56 sensitive to that sort of differentiation
37:00 than someone who says, I'm not a video
37:02 game player, or self-identifies that way.
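For concreteness, the standard measure underlying a contrast sensitivity function is Michelson contrast; sensitivity at a given spatial frequency is usually reported as one over the lowest contrast a viewer can detect there. A minimal sketch (the thresholds themselves come from an experiment, not this code):

```python
import numpy as np

# Michelson contrast of a luminance pattern: (Lmax - Lmin) / (Lmax + Lmin).
# A contrast sensitivity function plots 1 / detection-threshold-contrast
# across spatial frequencies; this only computes the contrast measure itself.

def michelson_contrast(luminance: np.ndarray) -> float:
    lmax, lmin = luminance.max(), luminance.min()
    return (lmax - lmin) / (lmax + lmin)

x = np.linspace(0, 2 * np.pi * 5, 500)      # 5 cycles of a sinusoidal grating
faint_grating = 100 + 2 * np.sin(x)         # mean luminance 100, amplitude 2
print(michelson_contrast(faint_grating))    # -> ~0.02, i.e. a 2% contrast grating
```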
37:05 >> because they've trained it
37:06 >> like like a first person shooter game
37:08 which I've played occasionally in an
37:10 arcade or something like that. Uh I
37:11 didn't play a lot of video games growing
37:15 up. I don't these days either but um
37:17 yeah a lot of it is based on contrast
37:19 sensitivity knowing are is that a friend
37:21 or foe are you supposed to shoot them or
37:22 not? Yeah. you have to make these
37:25 decisions very fast. Yeah. Um like right
37:27 on the threshold of what you would
37:29 call reflexive, like no thinking
37:31 involved, but it's just
37:33 rapid iteration and decision-making, and
37:35 then the rules will switch. Yeah.
37:37 >> Right. Like suddenly you're supposed to
37:40 uh turn other other things into targets
37:42 and other things into into
37:44 >> You're spot on, because then you take
37:46 someone who self-identified as a
37:48 non-gamer, make them play 40 hours of
37:50 Call of Duty and now their contrast
37:52 sensitivity looks like a video game
37:54 player and it persists. You know, go
37:56 back, measure them a year later, but you
37:58 know, 40 hours of playing Call of Duty
37:59 and I see the world differently, not
38:01 just in my video game. I actually have
38:04 foundational shifts in how I experience
38:05 the world that give me greater
38:07 sensitivity to my situational awareness,
38:09 my situational intelligence, in real life.
38:09 >> Yeah. Yeah.
38:11 >> Yeah. Because that's a low-level
38:12 processing capability. I love
38:15 intersecting those when you can. But
38:17 what's even I think more interesting is
38:19 you also, and this
38:23 was a great study by Alex Pouget and
38:26 Daphne Bavelier, where it's not just
38:28 the contrast sensitivity. Let's go
38:29 to that next level where we were talking
38:31 about Bayesian, like probabilistic,
38:32 decisions where things aren't
38:36 deterministic. And a video game
38:38 player, and I can train this, they're
38:40 going to make the same decisions as a
38:43 non-video-game player in those, you know,
38:46 probabilistic, inferential
38:48 situations, but they're going to do it a
38:51 lot faster. And so that edge, that
38:53 ability to get access to that
38:55 information is phenomenal, I think. And
38:57 and and when you can tap into that, that
38:59 becomes a very powerful thing. So like
39:01 probabilistic inference goes up when
39:02 I've, you know, played 40 hours of Call
39:05 of Duty. But then what I like to do is
39:07 take it and say, okay, here's, you know,
39:09 a training environment. You know, I had
39:12 a couple of Stanford's top
39:14 soccer players in my course
39:17 this year, and our focus was,
39:20 okay, what data do you not have and how
39:22 can we build a closed loop environment
39:25 and make it something so that you're
39:28 gaining better neurological access to
39:31 your performance based on data like my
39:34 acceleration, my velocity, not at the
39:36 end of my, you know, two-hour practice,
39:38 but in real time and getting auditory
39:40 feedback, so that I am actually
39:43 tapping into more neural training. So,
39:46 we had sensors, you know, on
39:47 their calves that were measuring
39:51 acceleration and velocity and able to
39:54 give us feedback in real time as they
39:56 were doing, you know, a sort of
39:58 somewhat gamified training. I don't
40:00 want to use gamified, it's so overused,
40:04 but let's say it felt like a fun
40:06 environment, but it's also based on
40:08 computation of that acceleration data
40:10 and what their targets were. It's
40:12 feeding them different sonic cues so
40:14 that they're building um they're
40:17 building that resolution. When I say
40:20 resolution, what I mean is, especially
40:21 as a novice, I can't tell the difference
40:23 between whether I've accelerated
40:25 successfully or not. But if you give me
40:27 more gradation in the feedback that I
40:30 get, with that sort of closed-loop
40:33 behavior, my neural
40:35 representation of that is going to start
40:37 differentiating more. So with that,
40:38 that's where the auditory feedback comes in. So
40:41 they're getting that in real time, and
40:42 you build that kind of closed-loop
40:45 environment that helps, you
40:48 know, create greater resolution in the
40:50 brain and greater sensitivity to differentiation.
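One way to picture the closed loop described here: map a live acceleration reading onto a graded pitch so the athlete hears, in real time, how close they are to a target. The target value, thresholds, and mapping below are hypothetical stand-ins, not the actual course project:

```python
# Sketch of graded real-time sonification: finer gradation in the feedback
# lets a novice hear differences in acceleration they couldn't otherwise feel.

TARGET_ACCEL = 4.0   # m/s^2, made-up training target

def feedback_pitch_hz(accel: float, base_hz: float = 220.0) -> float:
    """Higher acceleration relative to target -> higher cue pitch,
    with a continuous sweep rather than a pass/fail beep."""
    ratio = max(0.0, min(accel / TARGET_ACCEL, 2.0))   # clamp to [0, 2]
    return base_hz * (1.0 + ratio)                     # 220-660 Hz sweep

for accel in (1.0, 3.0, 4.0, 5.0):
    print(f"{accel:.1f} m/s^2 -> cue at {feedback_pitch_hz(accel):.0f} Hz")
```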
40:53 >> I'd love for you to share the story
40:55 about your daughter improving her
40:56 swimming stroke, right? because she's
40:59 not a D1 athlete yet. Maybe she will be
41:02 someday, but she's a swimmer, right? And
41:04 in the past, if you wanted to get better
41:05 at swimming, you needed a swimming
41:07 coach. And if you wanted to get really
41:09 good at swimming, you'd have to find a
41:10 really good swimming coach and you'd
41:12 have to work with them repeatedly. Uh,
41:13 you took a slightly different direction
41:14 that really points to just how
41:17 beneficial and inexpensive this
41:18 technology can potentially be or
41:20 relatively inexpensive.
41:21 >> First, I'll say this. Number one is
41:23 having good swimming coaches.
41:24 >> Okay, sure. I'm not trying to do away
41:27 with swimming coaches. Parents who are
41:30 data-centric and really like
41:32 building technologies can sometimes
41:34 be red-herring distractions,
41:35 but hopefully not.
41:36 >> Okay. All right. Well, yes,
41:38 >> that's one of them.
41:40 >> Let's keep the swimming coaches happy.
41:42 >> Yeah. So, for example, like you go and
41:44 train with elite athletes and um if you
41:46 go to a lot of swimming camps
41:48 or training programs, it's
41:50 always about, you know, working with
41:52 cameras, and, you know,
41:55 they're recording you. They're,
41:56 you know, assessing your strokes. But
41:59 the point is, I mean, you can use,
42:02 and I did this, you know, knowing the
42:05 things that the coaches know, or frankly
42:06 you can go online and learn some of
42:10 those things that matter to different
42:13 strokes. You can use, you know,
42:16 Perplexity Labs, use Replit, use some of these,
42:18 >> these are online resources.
42:20 >> Yeah. Yeah. And you can build quickly
42:22 build a computer vision app that is
42:24 giving you data analytics on your
42:26 strokes and in real time.
42:27 >> So how's that work? You you're taking
42:29 the phone underwater analyzing the stroke.
42:29 stroke.
42:31 >> In this case I'm using a mobile phone, so
42:33 I'm doing everything above water, you know.
42:34 >> Okay. So you're you're filming if you
42:36 could walk us through this. So you film
42:38 your daughter doing freestyle stroke for
42:40 right or breast stroke or butterfly.
42:42 There's a lot of core things that you
42:43 know maybe you want to care about
42:45 backstroke and freestyle. What's the you
42:48 know and I am not a I was we used to run
42:51 like I know you're a good runner but I
42:53 am a runner I'm a rock climber less a
42:55 swimmer but um you know things like the
42:57 roll or how high they're coming above
42:58 the water what's your you know what
43:00 what's your velocity on a you know you
43:02 can get actually very sophisticated once
43:04 you have the data right and you know
43:06 what's your velocity on entrance how
43:08 much you know where how far in front of
43:11 your your head is your arm coming in how
43:16 you know what is um maybe There's again
43:18 maybe there are things that you you know
43:20 are obvious which is you want to know
43:23 you know how consistent are your strokes
43:25 and your cadence across you know the
43:27 pool. Um so you don't just have your
43:30 speed you suddenly have access to what I
43:32 would call and and you'll hear me use
43:35 this a lot better resolution but also a
43:37 lot more analytics that can give you
43:40 insight. Now, important thing here is,
43:41 you know, my 10-year-old is not going to
43:43 resp I'm not going to go tell my
43:45 10-year-old that she needs to change her
43:48 her velocity on this head or stroke, but
43:50 it gives me information that I can at
43:54 least understand and help her know how
43:56 something is going and how consistent
43:57 she is on certain things that her
43:59 coaches have told her to do.
44:01 And, you know, what I love about
44:05 the idea is, look, this isn't just for the
44:08 ease of getting access to the type of
44:10 data and information that you couldn't get
44:13 previously. And, I mean, I do code in a
44:14 lot of areas but you don't have to do
44:16 that anymore to build these apps in fact
44:18 you shouldn't you should leverage you
44:20 know AI for development of these types
44:21 of tools
44:23 >> You tell AI to write code so that
44:25 it would analyze, you know, trajectory
44:27 jumping into the pool, how that could be
44:29 improved if the goal is to swim faster.
44:31 >> You'd use AI to build an app that
44:32 would allow you to do that, so that you
44:35 would then have access to whatever
44:37 data it is that you want. Yeah.
44:38 So in that case you're trying to do
44:40 better stroke analytics and
44:42 understand things as you move forward.
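As a sketch of how such an app could be assembled (not the app she built): off-the-shelf pose estimation on phone video can track a wrist, and counting wrist-height oscillations gives stroke cadence and its consistency across the pool. Assumes OpenCV, MediaPipe, and NumPy are installed; the clip name and frame rate are hypothetical:

```python
import cv2                      # pip install opencv-python mediapipe numpy
import numpy as np
import mediapipe as mp

def wrist_heights(video_path: str) -> np.ndarray:
    """Track the right wrist's vertical position, frame by frame."""
    pose = mp.solutions.pose.Pose()
    cap = cv2.VideoCapture(video_path)
    ys = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.pose_landmarks:
            wrist = result.pose_landmarks.landmark[mp.solutions.pose.PoseLandmark.RIGHT_WRIST]
            ys.append(wrist.y)          # normalized image coordinate
    cap.release()
    return np.array(ys)

def stroke_stats(ys: np.ndarray, fps: float = 30.0):
    """Cadence (strokes/min) and consistency from wrist-height oscillations."""
    ys = ys - ys.mean()
    crossings = np.flatnonzero((ys[:-1] > 0) & (ys[1:] <= 0))  # ~one per stroke cycle
    intervals = np.diff(crossings) / fps                       # seconds per stroke
    cadence = 60.0 / intervals.mean()
    consistency = intervals.std() / intervals.mean()           # lower = steadier
    return cadence, consistency

print(stroke_stats(wrist_heights("lap.mp4")))                  # hypothetical clip
```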
44:44 You could do the same thing for
44:47 running, for gait. You could, you
44:49 know in a work environment you can
44:51 understand a lot more about where
44:52 vulnerabilities are where weaknesses
44:54 are. There are sort of two different
44:56 places where I see this type of um AI
44:58 acceleration and tool building really
45:00 having major impact. It's on sort of
45:02 democratizing data, analytics and
45:04 information that would normally be
45:06 reserved for the elite to everyone
45:09 that's really engaged and that has a
45:10 huge impact on improving performance
45:13 because that kind of data is really you
45:16 know useful in understanding um
45:19 learning. It also has applications for,
45:20 you know, when you're in a work
45:22 environment and you're trying to better
45:24 understand um success in that
45:26 environment ac in in some process or
45:28 skill of, you know, what you're doing.
45:32 You can gain different analytics
45:34 than you otherwise would, in ways that
45:37 become much more successful, but
45:41 also give you new data to think about
45:43 with regard to what I would call a
45:45 digital twin. And when I use the word
45:46 digital twin, the goal of a digital twin
45:49 is not to digitize and represent a
45:53 physical system in its entirety. It's to
45:56 use different interoperable, meaning
45:58 data sets coming from different sources,
46:01 to gain insights, you know, digitized data
46:03 of a physical system or a physical
46:05 environment or physical world, be it a
46:07 hospital, be it airplanes, be it my body,
46:10 be it my fish tank, to give me insights
46:13 that are, you know, continuous and in real
46:15 time, that I otherwise wouldn't be able
46:17 to gain access to.
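A minimal sketch of that definition, using her fish-tank example: fuse readings from different sources into one running model of a physical system and surface a continuous insight no single stream gives you. The sensor names and thresholds are illustrative, not a real product:

```python
from dataclasses import dataclass, field

# Toy digital twin: interoperable data sets (here, two sensors) feed one model
# that yields a real-time insight neither stream provides on its own.

@dataclass
class FishTankTwin:
    readings: dict = field(default_factory=dict)   # latest value per source

    def ingest(self, source: str, value: float) -> None:
        self.readings[source] = value

    def insight(self) -> str:
        temp = self.readings.get("thermometer_c")
        ph = self.readings.get("ph_probe")
        if temp is not None and ph is not None and (temp > 28 or ph < 6.5):
            return "stress conditions: act now"
        return "nominal"

twin = FishTankTwin()
twin.ingest("thermometer_c", 29.1)   # one data source
twin.ingest("ph_probe", 7.0)         # another, interoperable with the first
print(twin.insight())                # -> "stress conditions: act now"
```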
46:19 >> we've known for a long time that there
46:20 are things that we can do to improve our
46:22 sleep, and that includes things that we
46:24 can take, things like magnesium
46:26 threonate, theanine, chamomile extract,
46:29 and glycine, along with lesser-known
46:31 things like saffron and valerian root.
46:32 These are all clinically supported
46:34 ingredients that can help you fall
46:36 asleep, stay asleep, and wake up feeling
46:38 more refreshed. I'm excited to share
46:40 that our longtime sponsor AG1 just
46:43 created a new product called AGZ, a
46:44 nightly drink designed to help you get
46:46 better sleep and have you wake up
46:48 feeling super refreshed. Over the past
46:49 few years, I've worked with the team at
46:52 AG1 to help create this new AGZ formula.
46:53 It has the best sleep supporting
46:56 compounds in exactly the right ratios in
46:58 one easy to drink mix. This removes all
47:00 the complexity of trying to forage the
47:02 vast landscape of supplements focused on
47:04 sleep and figuring out the right dosages
47:07 and which ones to take for you. AGZ is,
47:08 to my knowledge, the most comprehensive
47:10 sleep supplement on the market. I take
47:12 it 30 to 60 minutes before sleep. It's
47:14 delicious, by the way, and it
47:16 dramatically increases both the quality
47:17 and the depth of my sleep. I know that
47:19 both from my subjective experience of my
47:21 sleep and because I track my sleep. I'm
47:23 excited for everyone to try this new AGZ
47:25 formulation and to enjoy the benefits of
47:28 better sleep. AGZ is available in
47:29 chocolate, chocolate mint, and mixed
47:31 berry flavors. And as I mentioned
47:33 before, they're all extremely delicious.
47:35 My favorite of the three has to be, I
47:36 think, chocolate mint, but I really like
47:39 them all. If you'd like to try AGZ, go
47:41 to drinkagz.com/huberman
47:43 to get a special offer. Again, that's drinkagz.com/huberman.
47:48 Today's episode is also brought to us by
47:50 Rora. Rora makes what I believe are
47:52 the best water filters on the market.
47:54 It's an unfortunate reality, but tap
47:56 water often contains contaminants that
47:58 negatively impact our health. In fact, a
48:00 2020 study by the Environmental Working
48:03 Group estimated that more than 200
48:05 million Americans are exposed to PFAS
48:06 chemicals, also known as forever
48:08 chemicals, through drinking of tap
48:10 water. These forever chemicals are
48:12 linked to serious health issues such as
48:14 hormone disruption, gut microbiome
48:16 disruption, fertility issues, and many
48:18 other health problems. The Environmental
48:21 Working Group has also shown that over
48:23 122 million Americans drink tap water
48:25 with high levels of chemicals known to
48:27 cause cancer. It's for all these reasons
48:29 that I'm thrilled to have Rora as a
48:31 sponsor of this podcast. Rora makes
48:32 what I believe are the best water
48:34 filters on the market. I've been using
48:36 the Rora countertop system for almost
48:38 a year now. Rora's filtration
48:40 technology removes harmful substances,
48:42 including endocrine disruptors and
48:44 disinfection byproducts while preserving
48:46 beneficial minerals like magnesium and
48:48 calcium. It requires no installation or
48:50 plumbing. It's built from medical grade
48:51 stainless steel, and its sleek design
48:53 fits beautifully on your countertop. In
48:55 fact, I consider it a welcome addition
48:57 to my kitchen. It looks great and the
48:59 water is delicious. If you'd like to try
49:02 Rora, you can go to rora.com/huberman
49:04 and get an exclusive discount. Again,
49:08 that's r-o-r-a.com/huberman.
49:09 We will definitely talk more about
49:11 digital twins. But what I'm hearing
49:14 is that it can be very, to use nerd speak,
49:17 domain specific. I mean, the
49:18 lowest level example I can think of,
49:20 which would actually be very useful to
49:23 me, would be a digital twin of my
49:26 refrigerator that would place an order
49:29 for the things that I need, not for the
49:31 things I don't need. Eliminate
49:34 the need for a shopping list. It
49:35 would just keep track of, like, hey,
49:36 you usually run out of strawberries on
49:37 this day and this day. It would just
49:38 keep track of it in the background and
49:40 the stuff would just arrive. It would
49:41 just be there. And eliminate what
49:43 seemed like, well, gosh, isn't
49:44 going to the store nice? Yeah, this
49:46 morning I walked to the corner store,
49:48 bought some produce. I had the time to
49:49 do that, the eight minutes to do
49:53 that. But really I would like the
49:54 fridge to be stocked with the things
49:57 that I like and need, and I could hire
49:58 someone to do that, but that's
50:00 expensive. This could be done trivially,
50:01 and probably will be done trivially
50:02 soon, and I don't necessarily need to
50:04 even build an app into my phone.
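A refrigerator twin along those lines could be as simple as learning each item's restock interval from past purchases and ordering a day ahead. A minimal sketch, with invented purchase data and a stubbed-out order call standing in for a real grocer's API:

```python
from datetime import date, timedelta

# Hypothetical purchase history: dates strawberries were restocked.
purchases = {
    "strawberries": [date(2025, 5, 1), date(2025, 5, 6), date(2025, 5, 11)],
}

def predicted_runout(item: str) -> date:
    """Estimate the next run-out date from the average restock interval."""
    dates = sorted(purchases[item])
    gaps = [(b - a).days for a, b in zip(dates, dates[1:])]
    avg_gap = sum(gaps) / len(gaps)
    return dates[-1] + timedelta(days=round(avg_gap))

def place_order(item: str) -> None:
    # Stub: a real twin would call a grocery-delivery API here.
    print(f"ordering {item}")

today = date(2025, 5, 15)
for item in purchases:
    # Reorder one day before the predicted run-out.
    if today >= predicted_runout(item) - timedelta(days=1):
        place_order(item)
```

Everything happens in the background, which is exactly the appeal being described: no app to open, no list to keep.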
50:06 >> So, I like to think in terms of kind of
50:10 lowest level, but highly useful
50:12 >> and easily available now
50:15 >> type technologies. There are a couple of
50:18 areas like when it comes to students
50:20 learning information. We've heard
50:22 about AI
50:24 generally as, like, this really bad thing,
50:25 like, oh, they're just going to use AI
50:27 to write essays and things like that.
50:29 But there's a use of AI for learning. I
50:31 know this cuz I'm still learning. I
50:32 teach and learn all the time for the
50:37 podcast. I've been using AI to
50:41 take large volumes of text from papers.
50:44 And this isn't an AI hallucinating. It
50:46 just takes large volumes of text
50:48 verbatim from the papers.
50:49 >> Yes,
50:51 >> I've read those papers literally printed
50:53 them out, taken notes, etc. And then
50:56 I've been using AI to design tests for
50:58 me of what's in those papers because I
51:00 learned, about eight
51:03 months ago, when researching a podcast on
51:04 how to study and learn best, that the data
51:05 all point to the fact that when we self-test,
51:08 >> Yes. Especially when we self-test away
51:10 from the material, like when we're
51:12 thinking, oh yeah, what is
51:14 the cascade of hormones driving the
51:17 cortisol negative feedback loop, when I
51:18 have to think about that on a walk.
51:19 >> Yes.
51:21 >> As opposed to just looking it up. It's
51:22 the self-testing that is
51:24 really most impactful for memory, because
51:26 most of memory is anti-forgetting. This
51:28 is kind of one way to think about it.
51:31 So, what I've been doing is having AI
51:33 build tests for me and having it ask me
51:36 questions like: what is the
51:39 signal between the
51:41 pituitary and the adrenals that
51:43 drives the release of cortisol, and
51:45 what layer of the adrenals does cortisol
51:45 come from?
51:46 >> And I love that
51:49 >> And so I'm sure that the
51:50 information it's drawing from is
51:52 accurate, at least to the best of
51:54 science and medicine's knowledge now.
51:56 >> And it's just testing me and it's
51:57 learning. This is what's so incredible
51:59 about AI and I don't consider myself
52:01 like extreme on AI technology at all.
52:04 It's learning where I'm weak and where
52:05 I'm strong at remembering things because
52:07 I'm asking it where am I weak and where
52:09 am I strong, and it'll say, oh,
52:11 naming here, and these third-
52:13 order conceptual links here, need a
52:14 little bit of work. And I go, test me on
52:16 it, and it starts testing me on it. It's
52:20 amazing like I'm blown away that the
52:22 technology can do this and I'm not
52:23 building apps with AI or anything. I'm
52:25 just using it to try and learn better.
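A workflow like the one described here can be reproduced in a few lines against a chat-completion API. A minimal sketch, assuming the openai Python package with an API key in the environment; the model name, file name, and prompt wording are placeholders:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Text copied verbatim from papers you have already read and annotated.
paper_text = open("paper.txt").read()

# Ask for questions only, no answers, so the retrieval effort
# (the part that actually drives memory) stays with the human.
response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; any capable chat model works
    messages=[
        {"role": "system", "content": (
            "You write short-answer quiz questions strictly from the text "
            "provided. Track which question types the user misses and bias "
            "future questions toward those weaknesses."
        )},
        {"role": "user", "content": f"Make 5 self-test questions:\n{paper_text}"},
    ],
)
print(response.choices[0].message.content)
```

Run in a continuing chat session, the same setup can be asked "where am I weak and where am I strong?" and steered to test the weak spots, which is the loop being described.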
52:26 Whether you're building apps or you're
52:28 building a tool, you're using it
52:31 as a tool that's helping you optimize
52:33 your cognition and find your weaknesses,
52:35 but also give you feedback on your
52:38 performance and accelerate your
52:40 learning, right? That's
52:42 the goal, but you're still putting in
52:44 the effort to learn. And I think even
52:47 the ways that I'm using it, too,
52:48 with computer vision, with mobile
52:51 devices, AI is a huge opportunity and
52:54 tool. Using the cameras and the
52:56 data that you've collected to
52:58 have much more sophisticated input is
53:01 huge. But in both of those cases,
53:03 you're shaping cognition. You're
53:06 using data to enrich what you can
53:09 know. And AI is just
53:12 incredibly powerful and a great
53:15 opportunity in those spaces.
53:19 The place where I think it matters,
53:21 and I sort of separate it into literally
53:23 just two categories, maybe that's too
53:25 simplistic, and this
53:27 is true for any tool, not just AI: am
53:29 I using the tool, am I using the
53:31 technology, in a way to make me smarter,
53:34 to let me have more
53:36 information and make me more effective,
53:39 cognitively more effective, gain
53:41 different insights? Or am I using it to
53:44 replace a cognitive skill I've done
53:46 before, to be faster? And it doesn't mean
53:48 you don't want to do those things. I mean,
53:51 GPS in our car is a perfect example of a
53:52 place where we're replacing a cognitive
53:54 tool to make me faster and
53:56 more effective. And frankly,
53:58 you take away your GPS and in a city you
54:00 drive around and we're not very
54:01 good. And
54:02 >> I remember paper maps. I remember the
54:04 early studies of the hippocampus were
54:06 based on London taxi drivers that had
54:08 mental maps of the city.
54:08 >> Absolutely.
54:12 >> And with all due respect to
54:16 London taxi drivers, up until GPS,
54:18 those mental maps are not necessary anymore.
54:20 >> No. And I mean, they had more gray matter
54:22 in their hippocampus, and we know that,
54:24 and you look at them today and they
54:26 don't have to have that, because the
54:28 people in their back seats have more
54:30 data, have more information, have eyes
54:32 from the sky. I mean, satellite data is
54:34 so huge in our success in the future, and
54:37 it can anticipate the things
54:40 that locally you can't. And so it's been
54:44 replaced. But it still means when you
54:47 lose that data, don't expect
54:48 yourself to have the same spatial
54:50 navigation of that environment without
54:51 it, right?
54:54 >> I love your two categories, right?
54:55 You're either using it to make you
54:56 cognitively better or you're using it to
54:58 speed you up. But you have to be, here's
54:59 where I think
55:01 >> cognitively or physically.
55:02 But you're still trying to gain insight
55:04 and data and information that's making
55:06 me a more effective human.
55:08 >> Right. And I think that the place
55:10 where people are concerned,
55:10 >> Yes.
55:14 >> including myself, is when we use these
55:17 technologies that eliminate steps,
55:18 make things faster,
55:19 >> Yeah.
55:22 >> But we fill in the additional time or
55:25 mental space with things that are
55:28 neutral to detrimental.
55:29 It's sort of like saying, "Okay, I can
55:32 get all the nutrients I need from a
55:34 drink that's 8 ounces." This is not
55:35 true. But then the question is like, how
55:37 do I make up the rest of my calories,
55:38 right? Am I making it up with
55:41 nutritious food too? Let's just
55:43 say that keeps me at a neutral health
55:46 status. Or am I eating stuff, because
55:48 I need calories, that's not necessarily
55:50 making me gain weight, but is bringing in a
55:52 bunch of bad stuff with those calories?
55:55 Or, in the mental version of this,
55:57 things are sped up, but people are
55:59 filling the space with things that are
56:01 making them dumber in some cases. There
56:04 was a recent paper from MIT that,
56:07 actually, it
56:09 is very much what I spend a lot of my
56:11 time talking about and thinking
56:12 about.
56:13 >> yeah could you describe that study?
56:15 >> The upshot of the paper, first, was that
56:18 there's a lot less mental
56:20 process, or cognitive process, that goes
56:22 on for people when they use LLMs to
56:24 write papers, and they don't
56:26 have the same transfer and they don't
56:28 really learn the information. Surprise,
56:30 surprise. So, just to briefly
56:31 describe the study, even though it got a
56:33 lot of popular press: it's
56:36 MIT students writing papers using AI
56:38 versus writing papers the old-fashioned
56:39 way, where you think and write.
56:40 >> So there were three different
56:41 categories. People who had to write the
56:44 papers using
56:46 their brain only, and that would
56:49 be case one. Case two would be, I get to
56:51 use search engines, which would be sort
56:52 of a middle ground. Again, these are
56:55 rough categories. And then a third
56:58 would be, I use LLMs to write my paper.
57:00 And they're looking at sort of
57:03 what kind of transfer happened.
57:05 They were measuring
57:07 neural response. So they were using EEG
57:09 to look at neural patterns across
57:11 the brain to understand how much neural
57:13 engagement happened during the writing
57:16 of the papers and during the whole
57:18 process, and then what they could do with
57:19 that, what they knew about that
57:21 information, down the road. It's a
57:22 really nice paper, so I don't
57:24 want to diminish it in any way by
57:28 summarizing it. But what I think is a
57:31 really important upshot of that paper,
57:33 and also just how we talk about it, that
57:36 I liked, was this. I talk a lot about
57:38 cognitive load, always. And you can
57:39 measure cognitive load in the diameter
57:41 of your pupil and body posture and how
57:42 people are thinking. It's really, how
57:45 hard is my brain working right now to
57:47 solve a problem, or just in my context?
57:48 And there are a lot of different cues we
57:50 give off as humans that tell us when
57:52 we're under different states of load,
57:55 cognitively, whether we are aware of
57:58 it or not. And there's something called
58:00 cognitive load theory that breaks down
58:03 sort of what happens when our brains are
58:06 under states of load. And
58:08 that load can come from three
58:09 different places. It might be coming
58:12 from what you would call
58:15 intrinsic information, and
58:16 this is all during learning: the
58:19 intrinsic cognitive load would
58:22 be from the difficulty of
58:24 the material I'm trying to understand.
58:26 Really, some things are easy
58:28 to learn, some things are a lot harder,
58:32 and that's intrinsic load. Extraneous
58:33 load would be the load that comes from
58:36 how the information is presented. Is
58:38 it poorly taught? Is it poorly organized?
58:40 Or also the environment: if I'm
58:41 trying to learn something auditorily
58:42 and it's noisy, that's introducing
58:45 extraneous cognitive load, right?
58:47 It's not the information itself,
58:48 but everything else
58:50 happening with that data. And then the
58:52 third is germane cognitive load. And
58:55 that's the load that is used in my brain
58:58 to build mental schemas, to
59:00 organize that information, to really
59:03 develop a representation of what that
59:06 information is that I'm taking in. And
59:08 that germane cognitive load, that's
59:11 the work, right? And if you don't
59:12 have germane cognitive load, you don't
59:15 have learning, really. And what they found
59:16 is basically the germane cognitive load
59:19 is what gets impacted most by using LLMs.
59:21 Which is, I mean, a very
59:23 obvious thing, like, that's
59:25 >> Meaning you don't engage quite as high
59:27 levels of germane cognitive load.
59:30 >> Using LLMs, you're not engaging the
59:33 mental effort to build cognitive schemas,
59:36 to build neural schemas, the
59:37 mental representation of the
59:39 information, so that you can interact with
59:41 it later and you have
59:44 access to it later. And this is really
59:46 important, because without that you won't
59:47 be as intelligent on that topic, that's
59:49 for sure, down the road. Let me give two
59:51 examples. I have a doctor, I have a lawyer,
59:54 and both of them use LLMs extensively,
59:56 for searches, say, or for building
59:58 information. In one case it's for
60:00 aggregation of patient data, and in another case it's for, you know, history
60:02 of case files. And that is the GPS that's
60:04 happening in those spaces, because
60:06 those are the tools that are quickly
60:07 adopted. Where you have someone that
60:11 maybe came from a different
60:14 world, has learned that information, has
60:16 gone and worked with data in a different
60:18 way, their representation of that
60:20 information is going to be better at
60:21 extrapolation, it's going to be better at
60:23 generalization, it's going to be better
60:24 at seeing patterns that would
60:26 exist. The brain that has done everything
60:29 through LLMs is going to be in a place
60:31 where they will get the answer for that
60:36 relevant task using the tools they
60:40 have. But you don't have the same level of
60:44 richness and depth of information or
60:48 generalization or extrapolation for
60:50 those topics as someone that has learned
60:52 in a different way. There's a
60:53 generational
60:54 difference in understanding, not because
60:57 they don't have the same information,
60:59 but there is an acknowledgement that
61:01 there's a gap, even though we're getting
61:03 to the same place just as fast. And that's
61:05 because of the learning that's happened.
61:08 >> The germane cognitive load.
61:09 >> Absolutely. The cognitive load, like,
61:11 you've got to do the work. Your brain
61:12 has to. And you know what was beautiful
61:14 about your descriptions, Andy, is when you
61:17 were talking about how you were using it,
61:19 which I love, to test yourself and
61:22 find your weak vulnerabilities.
61:24 And actually, in the paper from
61:26 MIT, which again, these are things
61:27 that are somewhat obvious, but we just
61:29 have to name them, I think we have to talk
61:30 about them more, is that people with higher
61:32 competency on the topic used the tools in
61:35 ways that still engaged more germane
61:36 cognitive load but helped accelerate
61:38 their learning. Where
61:41 is the biggest vulnerability and gap?
61:42 It's especially in areas and
61:44 topics where you're trying to
61:46 learn a new domain fast, or you're under
61:49 pressure, and you're not putting in the
61:50 domain effort, or you're not using the
61:52 tools that you have access to that AI
61:54 can enable.
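As a toy illustration of the three-part decomposition discussed above (my own sketch, not the MIT paper's model; cognitive load theory is qualitative, and these numbers are invented), one can treat total load as the sum of intrinsic, extraneous, and germane components and flag the failure mode where germane load collapses:

```python
from dataclasses import dataclass

@dataclass
class CognitiveLoad:
    intrinsic: float    # difficulty inherent to the material
    extraneous: float   # load from poor presentation or a noisy environment
    germane: float      # effortful schema-building, where learning happens

    def total(self) -> float:
        return self.intrinsic + self.extraneous + self.germane

    def learning_is_happening(self) -> bool:
        # Per the discussion: no germane load, no durable learning.
        return self.germane > 0

# Writing with an LLM doing the thinking: fast, but germane load collapses.
llm_only = CognitiveLoad(intrinsic=0.2, extraneous=0.1, germane=0.0)
# Self-testing with AI-generated quizzes: the tool lowers extraneous load
# while keeping the schema-building effort on the human.
ai_quizzed = CognitiveLoad(intrinsic=0.8, extraneous=0.2, germane=0.9)

print(llm_only.learning_is_happening())    # False
print(ai_quizzed.learning_is_happening())  # True
```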
61:56 >> You're not using them to amplify your
61:58 cognitive gain, but instead
62:01 to deliver something faster, more rapid,
62:05 and then walking away from it. I'm going
62:07 to try and present two parallel
62:10 scenarios
62:11 in order to go further into this
62:14 question of how to use AI to our best
62:16 advantage, to enrich our brains as
62:18 opposed to diminish our brains.
62:20 >> Mhm.
62:20 >> So I could imagine a world, because we
62:24 already live in it, where there's this
62:26 notion of slow food. Like, you cook your
62:29 food, you get great ingredients from the
62:31 farmers market, like a peach that
62:33 quote unquote really tastes like a peach,
62:35 this kind of thing. You make
62:38 your own food. You cook it and you
62:40 taste it. It's just delicious. And
62:42 I can also imagine a world where you
62:45 order a peach pie online, it shows up,
62:47 and you take a slice and you eat it. And
62:49 you could take two different generations
62:51 of people, maybe people that are
62:52 currently now 50 or older and people
62:55 that are 15 or younger, and the older
62:59 generation would say, "Oh, isn't the
63:00 peach pie that you made so much
63:01 better? Like, these peaches are amazing."
63:03 And I could imagine a real scenario
63:05 where the younger person, 15 to 30 let's
63:08 say, would say, like, I don't know, I
63:11 actually really like the other pie. I
63:12 like it just as well. And the older
63:15 generation is like, what are
63:17 you talking about? Like, this is how it's
63:19 done.
63:20 What's different? Well, sure, experience
63:23 is different, etc. But from a neural
63:26 standpoint, from a neuroscience
63:27 standpoint,
63:29 it very well could be that it tastes
63:31 equally good to the two of them, it just
63:33 differs based on their experience.
63:35 Meaning that the person isn't lying.
63:38 It's not like this kid
63:41 isn't as fine-tuned to taste. It's that
63:44 their neurons acclimated to what
63:46 sweetness is, and what contrast between
63:48 sweet and saltiness is, and what a peach
63:50 should taste like, cuz damn it, they had
63:52 peach gummies and that tastes like a
63:53 peach, you know. And so we can be
63:56 disparaging of what we would
63:58 call the lower level or diminished
64:01 sensory input.
64:03 >> Yeah.
64:03 >> But it depends a lot on what
64:05 those neural circuits were weaned on.
64:07 >> Couple of comments. I love the peach pie
64:10 example. Making bread is another example
64:13 of that. In the 90s, everyone I knew,
64:16 when they graduated from high school, got
64:18 a bread maker that was shaped like a box
64:21 and, you know, created this
64:23 >> like loaf of bread with a giant, you
64:25 know, rod through it. And it
64:27 was the graduation gift for many years.
64:30 >> And, you know, you don't see those
64:33 anymore. And if you even look
64:35 at what happened with the
64:37 millennial generation
64:39 in the last five years, especially during
64:41 the pandemic: suddenly breadmaking,
64:42 sourdough, that became a thing. What's
64:44 the difference? You know, you've got
64:46 bread. It's warm. With
64:48 the bread maker, it's fresh, and it is
64:51 not at all desired relative to bread
64:53 that takes a long period of time, and is
64:57 tactile in the process and the
64:59 making of it, and, you know, is clearly
65:01 much more onerous than the other in its
65:04 process of development. I think the key
65:07 part is in the appreciation of
65:10 the bread. The process is part of it,
65:12 and that process is development of sort
65:15 of the germane knowledge and the
65:16 commitment and connection to that
65:18 humanness of development. But also the
65:21 tactile commitment, the work that went
65:24 into it, is really appreciated, in the
65:26 same way that that peach pie, for one,
65:30 comes with that whole time series of
65:34 data that wasn't just about my taste but
65:37 was also smell, also physical, also visual,
65:41 and saw the process evolve, and
65:44 built a different prior going into that
65:47 experience. And that is, I think, part of
65:51 the richness of human experience. Will it be
65:54 part of the richness of how humans
65:56 interact with AI? Absolutely. Or interact
65:59 with robots? Absolutely. So it's, what are
66:02 the relationships we're building, and how
66:04 integrated are
66:06 these tools, these, you know, companions,
66:09 whatever they may be, in our existence?
66:11 They will shape us in different ways. What I
66:15 am particularly, I guess, bullish on and
66:18 excited for is the robot that optimizes
66:21 my health, my comfort, my intent in my
66:24 environment, be it in the
66:27 cabin of a car, be it in my
66:29 rooms, my spaces.
66:31 >> So what would that look like?
66:32 Could you give me the lowest level
66:34 example? Like, would it be an
66:37 assistant that helps you travel today
66:39 when you head back to the Bay Area?
66:41 What is this non-physical
66:44 robot?
66:44 >> And I think we already have some of
66:45 these. Like, it's the point where HVAC
66:48 systems actually get sexy, right? Not
66:50 sexy in that sense, but they're actually
66:51 really interesting, because they are the
66:53 heart of, you know,
66:54 >> HVAC systems,
66:55 >> heating, ventilation,
66:58 AC,
66:59 >> but you think about a thermostat.
67:01 A thermostat right now, an AI thermostat,
67:03 is optimizing for you,
67:05 optimizing for my behavior. It's
67:07 trying to save me resources, trying to
67:09 save me money. But it doesn't know
67:11 if I'm hot or cold. It doesn't know, to
67:12 your point, my intent, what I'm
67:15 trying to do at that moment. And
67:17 this, you know, speaks more to a lot
67:19 of the things you've studied in the
67:20 past. It doesn't know what my
67:23 optimal state is for my goal in that
67:27 moment in time,
67:28 >> but it can, very easily, frankly. You
67:30 know, it can talk to me, but it can also
67:32 know the state of my body right now
67:35 and what is going on. You know, if
67:36 it's 1:00 a.m. and I really need to work
67:38 on a paper,
67:39 >> you know, my house should not get
67:41 cold, but it also should be very, it
67:44 should
67:44 >> For me it shouldn't. I know for some
67:46 people it should.
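As a thought experiment, not a description of any real product, a state-aware setpoint policy like the one being described might look like this sketch, where the inferred occupant state rather than the clock alone drives the temperature; the state labels and numbers are invented:

```python
# Hypothetical sketch: a setpoint policy driven by inferred occupant
# state. All states and offsets are invented for illustration.
BASE_SETPOINT_C = 21.0

def setpoint(state: str, hour: int) -> float:
    """Return a target temperature given an inferred occupant state."""
    if state == "focused_work":
        # Keep the space from drifting cold during late-night deep work.
        return BASE_SETPOINT_C + (1.5 if hour >= 22 or hour < 5 else 0.5)
    if state == "winding_down":
        # Cooler environments tend to support sleep onset.
        return BASE_SETPOINT_C - 2.0
    if state == "stressed":
        return BASE_SETPOINT_C - 0.5
    return BASE_SETPOINT_C

# The 1:00 a.m. paper-writing case from the conversation.
print(setpoint("focused_work", hour=1))  # 22.5
```

The interesting design question, as the exchange right above makes clear, is that the mapping itself is personal: the same state might call for opposite adjustments for different people.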
67:47 >> Yeah. My Eight Sleep mattress, which
67:48 I love, love, love. And yes, they're a
67:51 podcast sponsor, but I would use one
67:52 anyway. It knows what temperature
67:55 adjustments need to be made,
67:57 >> right,
67:57 >> across the course of the night. I put in
67:59 what I think is best, but it's
68:01 updating all the time now, because it has
68:03 dynamically
68:05 updating sensors. I'm getting close to
68:08 two hours of REM sleep a night, which is
68:10 outrageously good for me.
68:13 >> Much more deep sleep. And that's a
68:15 little micro environment. You're talking
68:16 about integrating that into an entire
68:19 home environment.
68:20 >> Home, vehicle. Yes. Because it needs to
68:22 treat me as a dynamic time series. It
68:24 needs to understand the context of
68:26 everything that's driving my state
68:28 internally. There's everything that's
68:30 driving my state in my local
68:31 environment, meaning my home or my car.
68:33 And then there's what's driving my state
68:35 externally, from my
68:38 external environment. And we're in a
68:41 place where those things are rarely
68:43 treated as interacting together
68:45 for the optimization and the
68:47 dynamic interactions that happen.
68:50 But we can know these things. We can
68:52 know so much about the human state from
68:54 non-contact sensors.
68:55 >> Yeah. And we're right at the point where
68:56 the sensors can start to feed
68:58 information to AI to be able to deliver.
69:00 What, effectively, again, a lower-level
69:02 example would be, like, the
69:04 dynamically cooling mattress or
69:05 dynamically heating mattress. Like, I
69:07 discovered through the AI that my
69:09 mattress was applying that, and I was
69:11 told that heating your sleep environment
69:14 toward the end of the night
69:16 >> yes
69:16 >> increases your REM sleep dramatically,
69:18 whereas cooling it at the beginning of
69:19 the night increases your deep sleep. That has
69:21 been immensely beneficial for me, to be
69:23 able to shorten my total sleep need,
69:25 which is something that for me is
69:26 awesome, because I like sleep a lot, but
69:29 I don't want to need to sleep so much in
69:32 order to feel great. Well, you want
69:34 to have your own choice about how you
69:36 sleep. Yeah. Given the data, it's
69:37 helping you have that.
69:38 >> Sometimes I have six hours, sometimes I
69:40 have eight hours, this kind of thing.
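The cool-early, warm-late pattern described here reduces to a simple schedule. A toy sketch, with invented offsets that are placeholders rather than recommendations:

```python
# Hypothetical sketch of the cool-early, warm-late idea described above.
# Offsets are in degrees C relative to a personal baseline; the numbers
# are invented placeholders, not recommendations.
def bed_temperature_offset(hours_asleep: float) -> float:
    if hours_asleep < 3:
        return -2.0   # cooler early in the night, when deep sleep dominates
    if hours_asleep < 6:
        return 0.0    # neutral through the middle of the night
    return 1.5        # warmer toward morning, when REM dominates

for h in (1, 4, 7):
    print(h, bed_temperature_offset(h))
```

A real system, as described, would not run a fixed schedule at all: it would adjust the offsets night by night from measured sleep stages.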
69:43 >> Here's where I get stuck, and I've
69:47 been wanting to have a conversation
69:48 about this with someone, ideally a
69:50 neuroscientist who's been interested in
69:52 building technologies for a very long
69:54 time. So, I feel like this moment is a
69:57 moment I've been waiting for for a very
69:58 long time, which is the following. I'm
70:01 hoping you can solve this for all of us,
70:03 Poppy.
70:04 >> We're talking about sleep, and we know a
70:06 lot about sleep. You got slow wave
70:08 sleep, deep sleep, growth hormone
70:09 release at the beginning of the night.
70:10 You have less metabolic need then. Then
70:12 you have rapid eye movement sleep, which
70:14 consolidates learning from the previous
70:15 day. It removes the emotional load of
70:17 previous-day experiences. We can make
70:19 temperature adjustments. You do all
70:20 these things. Avoid caffeine too late in
70:21 the day. Lots of things to optimize
70:23 these known states that occupy this
70:26 thing that we call sleep. And AI and
70:28 technology, I would say, is doing a
70:30 really great job, as is pharmacology,
70:34 to try and enhance sleep. Sleep's
70:36 getting better. We're getting better at
70:37 sleeping despite more forces
70:41 potentially disrupting our sleep,
70:42 >> like smartphones and noise and city
70:44 noise, etc. Okay,
70:46 >> here's the big problem in my mind:
70:49 we have very little understanding
70:51 of, or even names for, different awake
70:54 states. We have names for the goal, like,
70:58 I want to be able to work. Okay, what's
71:01 work? What kind of work? I want to
71:04 write a chapter of a book. What kind of
71:06 book? A non-fiction book based on what?
71:08 We talk about alpha
71:10 waves, beta waves, theta waves, but I feel like,
71:12 as neuroscientists, we have done a
71:15 pretty poor job as a field of defining
71:18 different states of wakefulness. And so
71:20 the technology, AI and other
71:23 technologies, they
71:27 don't know what to shoot for. They
71:29 don't know what to help us optimize for.
71:30 Whereas with slow wave sleep and REM
71:32 sleep, we've got it. I ask questions
71:34 of myself all the time, like, is my brain,
71:36 and what it requires in the first three
71:38 hours of the day, anything like what my
71:40 brain requires in the last three hours
71:42 of the day, if I want to work in each one
71:45 of those three-hour compartments? Like,
71:47 and so I think we don't really
71:49 understand
71:51 what to try and adjust to. So here's
71:54 my question. Do you think AI could help
71:57 us understand the different states that
71:59 our brain and body go through during the
72:01 daytime?
72:03 Give us some understanding of what those
72:05 are, in terms of body temperature, focus
72:07 ability, etc. And then help us optimize
72:10 for those the same way that we optimize
72:11 for sleep. Because whether it's a
72:13 conversation with your therapist,
72:14 whether or not it's a podcast, whether
72:16 or not it's playing with your kids,
72:18 whether or not it's Netflix and chill,
72:20 whatever it is, the goal, and what
72:22 people have spent so much time, energy,
72:24 money, etc. on, and whether or not they're
72:26 drinking alcohol, caffeine, taking Ritalin
72:28 or Adderall, or running, or what, like,
72:32 humans have spent their entire
72:34 existence trying to build technologies
72:37 to get better at doing the things that
72:39 they need to do. And yet we still don't
72:41 really understand waking states. So can
72:42 AI
72:44 teach it to us? Can AI teach us a
72:47 goal that we don't even know we have?
72:49 >> Can AI teach it to us? I would say AI is
72:52 part of the story. But before we get AI,
72:54 we need better, more complete data. Not just on me,
72:59 right? So maybe I am very focused right
73:02 now. But, and my belief, and this is
73:04 my perspective, is: imagine I'm very
73:08 focused right now. I need to know the
73:09 context of my environment that's driving
73:12 that. Like, what's in that
73:14 environment? Is it internal focus that's
73:16 gotten me there? What is my
73:18 environment? What is that external
73:20 environment? So, understanding my
73:23 awake state, for me, is very dependent on
73:27 the data and interactions that happen
73:30 from these different environments. Let
73:31 me give an example. Like, if I'm in my
73:33 home, or say I'm in a vehicle,
73:35 all right, and you are measuring
73:38 information about me, and you know I'm
73:39 under stress, or you know I'm
73:42 experiencing joy, or heightened
73:45 attention right now. There are different
73:48 states you may want to
73:50 have my home or my system react to
73:55 and mitigate. Well, like if you get sleepy
73:57 in a self-driving, in a smart vehicle,
74:00 >> it will make adjustments
74:01 >> potentially. It will make adjustments,
74:03 but not necessarily right for you.
74:05 That's an important part: optimizing
74:07 for you, personalization, in how a system
74:09 responds. And, you know, it can make
74:11 adjustments. Any home, an HVAC system, or
74:14 the internal state of a vehicle, is going
74:16 to adjust, you know, sound, background
74:18 sound, music. It's going to adjust
74:21 whatever it can: haptic
74:23 feedback, temperature, lighting,
74:26 any number of things, the position
74:29 of your chair, the dynamics
74:32 of what's in your space. All of these
74:34 different systems, in my home or my
74:37 other,
74:39 what, my vehicle, or some other
74:42 system, can react, right? But the
74:44 important thing is, how you react is
74:47 going to shift me. And the goal is to
74:50 not just measure me but to
74:54 actually intersect with my state and
74:57 move it in some direction, right?
74:59 >> Yeah, I always think of devices as good
75:01 at measurement or modification.
75:04 >> Right.
75:05 >> Measurement or modification. Measurement
75:07 is critical, and that's, yeah,
75:09 but measurement not just of me but also
75:13 of my environment, and understanding
75:15 of the external environment. This is
75:17 where things like Earth observation
75:19 and understanding, you know, we're
75:21 getting to a place where we're getting
75:23 really good image-
75:26 quality data from the satellites
75:29 that are going in the sky at much
75:30 lower
75:34 altitudes, so that you now have
75:37 faster reaction times between
75:40 technologies and the information they
75:43 have, to understand and be dynamic with
75:45 them. Right?
75:46 >> Can you give me an example
75:47 where that impacts everyday life? Are we
75:48 talking about, like, weather analysis?
75:50 >> Sure. Weather predictions, car
75:52 environments, you know, things happening.
75:54 >> And what about traffic? Why haven't they
75:55 solved traffic yet, given all the
75:57 knowledge of object flow and how
76:00 to optimize for object flow? And we've
76:02 got satellites that can basically look
76:03 at traffic, and, I mean, open up
76:06 roads dynamically, like change the number of
76:08 lanes. Why isn't that happening?
76:10 >> The traffic problem gets resolved when
76:12 you have autonomous vehicles, in ways
76:14 that don't have the human side
76:17 of things.
76:18 >> That gets resolved.
76:19 >> It does. Like,
76:19 >> autonomous vehicles.
76:20 >> With only autonomous vehicles, you
76:22 probably don't have traffic in the
76:24 ways that you do with
76:25 >> goodness. That's reason alone.
76:27 >> That's reason alone to shift to
76:29 autonomous vehicles.
76:30 >> It is that injection from the
76:32 human system that, you know, is
76:35 interrupting all the models. I think, the
76:37 world right now, we think about wearables
76:38 a lot. Wearables track us. You have
76:40 smart mattresses, which are wonderful
76:42 for understanding. There's so much
76:44 you learn from a smart
76:47 mattress, and ways of both measuring
76:50 as well as intervening to optimize your
76:53 sleep, which is the beauty. And it's
76:55 this nice, incredible period of time
76:58 where you can measure so many things.
77:01 But, you know, in our home, so I used
77:03 the example of a thermostat, right? It
77:05 is pretty, you know, frankly, dumb
77:07 about what my goals are or what I'm
77:09 trying to do at that moment in time. But
77:12 it doesn't have to be. And there are,
77:13 you know, there's a company, Passive
77:15 Logic. I love them. They actually
77:16 have, I think, some of the smartest
77:19 digital twin HVAC systems. But, you know,
77:21 their sensors measure things like sound.
77:23 They measure carbon dioxide, your
77:26 CO2 levels. Like, when
77:28 we breathe, we give off CO2, you know.
77:31 So imagine, you know, there's a dynamic
77:34 mixture of acetone, isoprene, and carbon
77:38 dioxide that's constantly exchanging
77:41 when I get stressed,
77:43 or when I'm feeling happiness
77:46 or suspense, in my state. And
77:51 that dynamic sort of cocktail mixture
77:54 that's in my breath is both an indicator
77:57 of my state, but it's also something
77:59 that, you know, the spaces
78:02 around me have more
78:04 information to contribute about how I'm
78:06 feeling, and can also be part of that
78:08 solution, in ways where I don't have
78:10 to have things on my body, right? So, I
78:12 have sensors now that can measure CO2.
78:14 You can watch my TED talk. I have given
78:16 examples. We brought people in, when I
78:18 was at Dolby, and had them
78:21 watching Free Solo, you know, the Alex
78:23 Honnold movie where they're climbing El Cap,
78:25 >> stressful.
78:25 >> So, carbon dioxide's heavier than air. So
78:27 we could measure carbon
78:28 dioxide from, you know, just tubes on
78:31 the ground, and you could get the
78:32 real-time differential of CO2 in there.
78:34 And
78:35 >> were they scared throughout?
78:36 >> No. Well, but, I mean, I like to say
78:39 we broadcast how we're feeling, right?
78:41 And we do that wherever we are. And in
78:44 this, you could look at the time
78:46 series of carbon dioxide levels and be
78:48 able to know what was
78:51 happening in the film, in the movie,
78:53 without actually having it annotated.
78:55 You could tell where he summited, where
78:56 he had to abandon his climb, where he
78:58 hurt his ankle.
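For a sense of how that kind of analysis can work, here is a minimal sketch, my own construction rather than the Dolby pipeline, that flags moments of collective arousal as peaks in the rate of change of a room CO2 trace; the file name and thresholds are hypothetical:

```python
import numpy as np
from scipy.signal import find_peaks

# Hypothetical: one CO2 reading (ppm) per second from a theater sensor.
co2_ppm = np.loadtxt("theater_co2.csv")

# The rate of change of CO2 is a crude proxy for collective breathing
# shifts in the audience.
dco2 = np.gradient(co2_ppm)
z = (dco2 - dco2.mean()) / dco2.std()

# Flag strong spikes at least 60 s apart as candidate "scene events".
peaks, _ = find_peaks(z, height=2.5, distance=60)
for t in peaks:
    print(f"candidate arousal event at {t // 60}m{t % 60:02d}s")
```

Lining those timestamps up against the film is what lets you recover, without annotation, where the summit or the injury happened.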
78:59 >> Absolutely. There's another study, I
79:01 forget who the authors are, and they've
79:03 got different audiences
79:04 watching The Hunger Games,
79:06 different days, different people. You can
79:08 tell exactly where Katniss's dress
79:10 catches on fire. And, you know, it's
79:13 like we really are, sort of, you know, it's
79:15 like digital exhaust of how we're
79:17 feeling. And our
79:18 thermals, we radiate the things
79:21 we're feeling. I'm very bullish on
79:23 the power of our eye in
79:27 representing our cognitive load, our
79:28 stressors,
79:29 >> our, okay.
79:30 >> Our eye. Yes. Like the diameter.
79:32 >> Our eye.
79:32 >> Our
79:33 >> Yeah. Our eye. Sorry. Literally
79:35 our eyes. Our pupil size.
79:37 >> Yes. Yes. Yes. You know, back when I
79:39 was a physiologist,
79:41 I worked with a lot of species on,
79:43 you know, understanding information
79:45 processing internally in cells, but also
79:48 I would very often use
79:50 pupillometry as an indicator of
79:52 perceptual engagement and experience.
79:54 perceptual engagement and experience. >> Yeah. Bigger pupil mean more arousal
79:57 >> Yeah. Bigger pupil mean more arousal higher levels of alertness.
79:58 higher levels of alertness. >> Yeah. more arousal, cognitive load or
80:02 >> Yeah. more arousal, cognitive load or you know obviously lighting changes but
80:04 you know obviously lighting changes but the the thing that's changing from you
80:06 the the thing that's changing from you know
80:07 know >> 20 years ago 15 years ago it was very
80:10 >> 20 years ago 15 years ago it was very expensive to track the kind of
80:12 expensive to track the kind of resolution and data to you know leverage
80:14 resolution and data to you know leverage all of those autonomic nervous system
80:17 all of those autonomic nervous system you know deterministic responses because
80:19 you know deterministic responses because those ones are deterministic and not
80:20 those ones are deterministic and not probabilistic right those are the ones
80:22 probabilistic right those are the ones that it's like the body's reacting even
80:24 that it's like the body's reacting even if the brain doesn't say anything about
80:26 if the brain doesn't say anything about >> detection and uh but Today we can do
80:29 >> detection and uh but Today we can do that with I mean do it well we can do it
80:31 that with I mean do it well we can do it right now with a you know open source
80:34 right now with a you know open source software on our laptops or our mobile
80:36 software on our laptops or our mobile devices right and every pair of smart
80:38 devices right and every pair of smart glasses will be tracking this
80:40 glasses will be tracking this information when we wear them uh so it
80:42 information when we wear them uh so it is becomes a channel of data and you
80:45 is becomes a channel of data and you know you it may be an ambiguous
80:47 know you it may be an ambiguous signature in the sense that there's you
80:48 signature in the sense that there's you know changes in lighting there's changes
80:50 know changes in lighting there's changes am I aroused or am I
80:52 am I aroused or am I >> those can be adjusted for right like if
80:53 >> those can be adjusted for right like if you you can you can literally take a
80:55 you you can you can literally take a measurement wear eyeglasses that are
80:57 measurement wear eyeglasses that are measuring pupil size.
80:59 measuring pupil size. >> Um, the eyeglasses could have a sensor
81:01 >> Um, the eyeglasses could have a sensor that detects levels of illumination in
81:03 that detects levels of illumination in the room
81:04 the room >> at the level of my eyes.
81:05 >> at the level of my eyes. >> Um, it could measure how dynamic that is
81:07 >> Um, it could measure how dynamic that is and we just make that the denominator in
81:09 and we just make that the denominator in a fraction, right? And then we just look
81:10 a fraction, right? And then we just look at changes in pupil size as the
81:12 at changes in pupil size as the numerator in that fraction, right? Um,
81:15 numerator in that fraction, right? Um, more or less you just have to have other
81:16 more or less you just have to have other sensors.
81:16 sensors. >> All you need to do is cancel. So as as
81:18 >> All you need to do is cancel. So as as you walk from a shadowed area to a
81:20 you walk from a shadowed area to a brighter area, sure the pupil size
81:22 brighter area, sure the pupil size changes, but then you can adjust for
81:24 changes, but then you can adjust for that change, right? just like normalize
81:26 that change, right? just like normalize for that and you end up with an index of
81:29 for that and you end up with an index of arousal,
81:30 arousal, >> right?
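The fraction being described can be written out directly. A minimal sketch, assuming a simple log-linear model of how pupils respond to light; the constants are placeholders for illustration, not a validated pupillometry formula.

```python
# Minimal sketch of the fraction described above: divide out the
# lighting-driven part of pupil size so that what is left tracks arousal.
# The log-linear light model and constants are rough placeholders, not a
# validated pupillometry formula.
import math

def arousal_index(pupil_mm, lux, base_pupil_mm=4.0, base_lux=250.0):
    """Pupil size relative to what the current light level alone predicts."""
    expected = base_pupil_mm - 0.5 * math.log10(lux / base_lux)  # light-only prediction
    return pupil_mm / expected  # > 1.0 suggests dilation beyond lighting

print(round(arousal_index(4.1, 200.0), 2))    # shaded and calm   -> ~1.01
print(round(arousal_index(3.2, 10000.0), 2))  # sunny and calm    -> ~1.00
print(round(arousal_index(4.0, 10000.0), 2))  # sunny and aroused -> ~1.25
```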
81:30 >> Which is amazing. You could also use the index of illumination as a useful measure in its own right, compared to your vitamin D levels, or to whether you need more illumination in order to get more arousal. It could tell you all of this. It could literally say, hey, take a five-minute walk outside after work and you'll get your photon requirement for the day. That kind of thing, not just measuring steps. All of this is possible now.
81:58 >> I just don't know why it's not being integrated into single devices more quickly.
82:03 >> Because you'd love to also know that person's blood sugar, instead of drawing their blood and sending it down to the lab with the resident who's been up for 13 hours, because that's the standard in the field, and who's making mistakes on a chart. I think at some point we're just going to go, I can't believe we used to do it that way. It's crazy.
82:21 >> Yeah. A lot of it is the consumer devices, and just the computation we can do, whether it's from cameras or exhaled air or other data in our environments, that tells us about our physical state in some of the situations you're talking about. Why isn't it happening? A lot of the reasons are simply that the regulatory process is antiquated and not keeping up with the acceleration of innovation. Getting things through the FDA, even for devices deemed to be in the same ballpark and supposed to move fast, the regulatory costs and processes are really high. You end up many years down the road from when the capability, the data, and the technology should have arrived in a hospital, or in a place where you actually have that kind of appreciation for the data and its use. The consumer-grade devices for tracking our biological processes are on par with, and in many cases have surpassed, the medical-grade devices. But then they have to label what they do and what they're tracking in a way that is consumer-grade, that doesn't make medical claims, to be able to keep moving forward in those spaces. There's no question that's a big part of what holds back the availability of a lot of these devices and capabilities.
83:58 I'd like to take a quick break and acknowledge one of our sponsors, Function. Last year, I became a Function member after searching for the most comprehensive approach to lab testing. Function provides over 100 advanced lab tests that give you a key snapshot of your entire bodily health. This snapshot offers you insights on your heart health, hormone health, immune functioning, nutrient levels, and much more. They've also recently added tests for toxins such as BPA exposure from harmful plastics, and tests for PFASs, or forever chemicals. Function not only provides testing of over 100 biomarkers key to your physical and mental health, but it also analyzes these results and provides insights from top doctors who are expert in the relevant areas. For example, in one of my first tests with Function, I learned that I had elevated levels of mercury in my blood. Function not only helped me detect that, but offered insights into how best to reduce my mercury levels, which included limiting my tuna consumption. I'd been eating a lot of tuna, while also making an effort to eat more leafy greens and supplementing with NAC, N-acetylcysteine, both of which can support glutathione production and detoxification. And I should say, by taking a second Function test, that approach worked. Comprehensive blood testing is vitally important. There are so many things related to your mental and physical health that can only be detected in a blood test. The problem is, blood testing has always been very expensive and complicated. In contrast, I've been super impressed by Function's simplicity and, at the level of cost, it is very affordable. As a consequence, I decided to join their scientific advisory board, and I'm thrilled that they're sponsoring the podcast. If you'd like to try Function, you can go to functionhealth.com/huberman. Function currently has a waitlist of over 250,000 people, but they're offering early access to Huberman podcast listeners. Again, that's functionhealth.com/huberman to get early access to Function.
85:46 Okay, so I agree that we need more data, and that there are a lot of different sensors out there that can measure blood glucose and sleep and temperature and breathing and all sorts of things, which raises the question: are we going to need tons of sensors? Are we going to be wrapped in sensors as clothing? Are we going to be wearing 12 watches? What's this going to look like?
86:11 >> I'm an advocate for fewer things on, not having all this stuff on our bodies. There's so much we can get out of the computer vision side, from the cameras in our spaces and how they're supporting us in our rooms, and from the sensors in our... I brought up HVAC systems earlier. So now you've effectively got a digital twin, with sensors that are tracking my metabolic rate just in my space. They're tracking carbon dioxide. They're tracking sound. You're getting context because of that. You're getting intelligence. And now you're able to start having more information about what's happening in my environment. The same is true in my vehicle. You can tell whether I'm stressed or how I'm feeling just by the posture I have sitting in my car, right? And you need AI for this; this is AI interpretation of data. But what's driving that posture might also be coming from an understanding of what else is happening in that environment. So suddenly it's contextual intelligence, an AI-driven understanding of what's happening in that space, that's driving the read on my state. I keep leaning to the side because I'm talking; the way I move and sit is a proxy for what's actually happening inside me. And then you've also got data coming from my environment: what's happening if I'm driving a car, what's happening in my home, in the weather, in threats that might be outside, in noise that's happening outside the space, things that give context so the systems we have can be more intelligent. So I'm a huge believer that we aren't anywhere until we have integration of those systems between the body, the local environment, and the external environment. And we're finally at a place where AI can help us start integrating that data.
88:15 In terms of wearables, though, obviously some of the big companies... the watch we have on our wrist has a lot of information that is very relevant to our bodies. And the devices we put in our ears. You may not realize it, but from a dime-sized patch in your concha we can know heart rate, pulse, blood oxygen level. And because of the electrical signature your eye produces when it moves back and forth, we can know what you're looking at just from measuring that signature, the electrooculogram, in your ear. In the ear we can also measure EEG. You can get eye movements out of electrooculograms, but you can also get attention; you can know what people are attending to based on signatures in their ear. So our earbuds become sort of a window to our state, and you've got a number of companies working on that right now.
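For a sense of how little signal processing the eye-movement part can take: because the eye is an electrical dipole, a horizontal gaze shift shows up as a step change in the voltage between two electrodes. A minimal sketch of saccade detection on an EOG trace; the threshold and sample signal are illustrative, and a real in-ear EOG would need filtering and per-user calibration.

```python
# Minimal sketch of reading eye movements from an electrooculogram (EOG):
# horizontal gaze shifts appear as step changes in the voltage between two
# electrodes. The step threshold is an illustrative guess.
def detect_saccades(eog_uv, step_uv=40.0):
    """Label samples where the signal steps sharply left or right."""
    moves = []
    for i in range(1, len(eog_uv)):
        delta = eog_uv[i] - eog_uv[i - 1]
        if delta > step_uv:
            moves.append((i, "right"))
        elif delta < -step_uv:
            moves.append((i, "left"))
    return moves

# Steady gaze, a saccade right, hold, then a saccade back left.
signal = [0, 1, -2, 0, 85, 84, 86, 85, 3, 1]
print(detect_saccades(signal))  # [(4, 'right'), (8, 'left')]
```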
89:19 So, do we need to wear lots of different sensors? No. Do we need the sensors and the data we have, whether on our bodies or off our bodies, to be able to work together, to not be proprietary to just one company but able to integrate with other companies? That becomes really important. You need integrative systems, so that the data they hold can interact with the systems that surround you, that surround my spaces, or the mattress I'm sleeping on. Right?
89:49 >> Because you've had a lot of specialty of design come from different developers, and that's partly been a product, again, of the FDA and the regulatory pathways. Because of the cost of development, it tends to move companies toward specialization unless they're very large.
90:10 >> But where we're at today, we're getting to a point where you're going to start seeing a lot of this data get integrated, I think. And by all means, hopefully we're not going to be wearing a lot of things on our bodies. I sure as heck won't. The more we put on our bodies, the more it affects our gait; it has ramifications in so many different ways. When I got here, I was talking to some of the people that work with you, and they asked, "Well, what wearables do you wear?" And I actually don't wear many at all. I have worn rings, I've worn watches at different times, but for me the importance is the point at which I get insights. I'm a big believer in as little on my body as possible when it comes to wearables. One interesting company that I think is worth mentioning is Pison. They've got a form factor that's like a Timex watch, they're partnered with Timex, but they're measuring... are you familiar with Pison?
91:06 >> No.
91:06 >> Okay. So, they're measuring psychomotor vigilance. It's like an ENG, electroneurography, and they're trying to understand fatigue and neural attentiveness in a way that is continuous and useful for, say, high-risk operations or training, be it in sport or elsewhere. What I like about it is that it's actually trying to get at a higher-level cognitive state from the biometrics you're measuring. And that to me is a really exciting direction: when you're actually measuring something you could use to make a decision about how I engage in my work, or my training, or my life, based on data about my cognitive state and how effective I'm going to be. And then I can start associating that data with the other data, to make better decisions and have better insights at a certain point in time. And that becomes really your digital twin.
92:11 >> It's interesting; earlier you said you don't like the word gamification. But one thing that I think has really been effective in the sleep space is this notion of a sleep score, where people aspire to get a high sleep score. And if they don't, they don't see that as a disparagement of them, but rather as a sign that they need to adjust their behavior. So it's not, oh, I'm a terrible sleeper and I'll never be a good sleeper. It gives them something to aspire to on a night-by-night basis.
92:42 >> Yes.
92:42 >> And I feel like that's been pretty effective. When I say gamification, I don't necessarily mean competitive with others; I mean encouraging of oneself, right? So I could imagine this showing up in other domains too, for wakeful states. Like, I had very few highly distracted work bouts, or something like that. I'd love to know at the end of my day that I had three really solid work bouts of an hour each, at least. That would feel good, like that was a day well spent, even if I didn't accomplish everything I wanted to in its entirety. I put in some really good, solid work.
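A score like that is straightforward to compute once something, whatever the sensor, labels each minute as focused or distracted. A minimal sketch; the 60-minute bout length and 5-minute grace period are illustrative choices, not a validated protocol.

```python
# Minimal sketch of the "solid work bouts" score floated above: given a
# per-minute focused/distracted label (however it is sensed), count runs of
# sustained focus, forgiving short lapses. Bout and grace lengths are
# illustrative choices.
def count_work_bouts(focused, bout_min=60, grace_min=5):
    """Count stretches of >= bout_min focused minutes, allowing brief lapses."""
    bouts, run, lapse = 0, 0, 0
    for minute_focused in focused:
        if minute_focused:
            run, lapse = run + 1, 0
        else:
            lapse += 1
            if lapse > grace_min:      # the bout is broken
                bouts += run >= bout_min
                run = 0
    bouts += run >= bout_min           # close out a bout still running at day's end
    return bouts

# An 8-hour day: two 90-minute focused blocks separated by a long lunch.
day = [True] * 90 + [False] * 60 + [True] * 90 + [False] * 240
print(count_work_bouts(day))  # 2
```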
93:26 Right now, it's all very subjective. We know that gamification of steps was very effective as public messaging: 10,000 steps a day. We now know you want to get somewhere exceeding 7,000 as a threshold. But if you think about it, we could just as easily have said, hey, you want to walk at a reasonable pace for you for 30 minutes per day. Somehow the counting-steps thing was more effective, because people I know who are not fanatical about exercise at all will tell me, I make sure I get my 11,000 steps per day. People tell me this, and I'm like, oh, okay. So apparently it's a meaningful thing for people. So I think quantification of performance creates this aspirational state.
94:05 >> Mhm.
94:08 >> So I think that can be very useful data.
94:13 >> And understanding the quantification you're working toward is really important. Those are effectively summary statistics, and maybe they're good on some level to aim for. If it means that people move more, I'm all for it, right? If I didn't move as much before, and I didn't get up and do something, and this is making me do it, that's awesome, that's great. But it's also great when, through something like a computer vision app, I can understand it's not just 10,000 steps; maybe there's a small battery of things I'm trying to perform against that are helping shape me neurally, with the feedback and the targets I'm getting, so there's more nuance toward achieving the goal I'm aiming for. Which is what I'm all about from a neuroplasticity perspective. So I just don't like the word gamification; I believe everything, or at least all training, can be fun and gamified in some ways. Again, my life has been predominantly in industry, but I love teaching, and I've always been at Stanford. There, what I try to do is figure out how to use technology and merge it with the human system in a way that helps optimize learning and training, from a sort of neural-circuit-first perspective: how do we think about the neural system and use this more enjoyable, understandable target to engage with it?
95:45 One of my favorite examples, though: there was a period right around 2018 to 2020, into the pandemic. For their final project my students can build whatever they want, and over the years they've had to do projects where they build brain-computer interfaces, projects in VR, AR projects, projects that use any sort of sensor-driven input device; that's all part of what they develop. And around 2018 to 2020, I started to see that almost every project had a wellness component to it, which I loved. It was a very notable shift in the student body, and maybe you've seen that too. But I still think about one of my favorite games to this day. It was a VR game where I'm in a morgue. I wake up, I've got to solve an escape room, and there are zombies climbing out of the morgue, getting closer, and there are people breathing on my neck, and everything. And it's a wellness app. Go figure.
97:02 It was their idea of, look, this is what I feel like. The game is also measuring my breath and heart rate, and I've got to keep those biological signatures down. Everything about the zombies while I'm solving my escape-room problems: they're going to get closer to me if my breath rate goes up, if my heart rate goes up. I've got to keep...
97:23 >> So, it was about stress control, basically.
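The control loop in a game like that can be very small. A minimal sketch of the mapping the students described, where arousal above a resting baseline sets how fast the zombies close in; the baselines and gain are invented for illustration, since the project's actual tuning isn't known.

```python
# Minimal sketch of the biofeedback loop in the students' zombie escape room:
# the calmer your breathing and heart rate, the slower the zombies close in.
# Resting baselines and the gain are illustrative, not the project's values.
def zombie_speed(heart_bpm, breath_rpm, rest_bpm=60.0, rest_rpm=12.0, gain=0.05):
    """Approach speed in m/s: zero when fully calm, faster as arousal rises."""
    stress = max(0.0, heart_bpm - rest_bpm) + 2.0 * max(0.0, breath_rpm - rest_rpm)
    return gain * stress

distance_m = 10.0
for hr, br in [(95, 20), (85, 16), (70, 13), (62, 12)]:  # player calming down
    distance_m -= zombie_speed(hr, br)                    # one update per second
    print(f"hr={hr} br={br} zombie at {distance_m:.2f} m")
```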
97:25 >> Exactly. Yes. But it was in that environment, and it made real for them how they felt. And you can do it in much simpler ways, but I'm a huge fan of using the right quantification to develop the right habits, the right skills, the right acuity or resolution in a domain, or in an area where we might not be able to break things into the pieces we need, because my brain actually needs to learn to understand that sophistication.
97:55 >> Yeah. It's clear to me that in the health space, giving people information that scares them is great for getting them to not do things, but it's very difficult to scare people into doing the right things. You need to incentivize people to do the right things by making it engaging and fun and quantifiable. I like the example of the zombie game. Okay. So, fortunately, we won't have to wear dozens of sensors; they'll be more integrated over time.
98:26 >> I'm happy to walk through a cheat sheet later for building out something like a computer vision app, for quantifying some of these more personalized, domain-related things that people might want to do, if...
98:39 >> That would be awesome. Yeah. And then we can post a link to it in the show note captions, because I think the example you gave, of creating an app that can analyze swimming performance, running gait, or focused work bouts, is really intriguing to a lot of people. But, at least for me, there's a gap between hearing about it, thinking it's really cool, and knowing how to implement it. So I would certainly appreciate it, and I know the audience would too. That's very generous of you. Thank you.
99:06 >> Yes, absolutely. And we're in an era where all you hear about is AI and AI tools, and there are tools that absolutely accelerate our capabilities as humans. But, you know, we gave the examples earlier talking about some of the LLMs. I was at a film premiere, sitting next to a few students who happened to be from Berkeley, computer science and engineering double majors, and when one of them learned what I talk about and care about, he said, you know, I'm really worried: my peers can't start a paper without ChatGPT. It was a truth, but it was also a concern, so they understand the implications of what's happening.
99:56 On one level, we're in an era of agents everywhere, and I think Reid has said, and a number of people have said, that we'll be using AI agents for everything at work in the next five years. Some of those agents we need to use; they will accelerate capability, and they will accelerate short-term revenue. But they also will diminish workforce cognitive skill. And as a user of agents in any environment, or as an owner of a company employing agents, you have to think hard about the near-term and long-term ramifications. It doesn't mean you don't use agents where you need to, but without the germane cognitive load there is a different dependence down the road. You also have to think about how you keep the humans working with you engaged in developing their cognitive skills, their germane cognition, their mental schemas, so they can support your systems down the road.
101:11 >> Let's talk more about digital twins.
101:14 >> Sure.
101:14 >> I don't think this concept has really landed squarely in people's minds as a specific thing. I think people hear AI and they know what AI is, more or less. They hear about a smartphone and they obviously know what a smartphone is; everyone uses one, it seems. But what is a digital twin? I think when people hear the word twin, they think it's a twin of us. Earlier you pointed out that's not necessarily the case. It can be a useful tool for some area of our life, but it's not a replica of us. Correct?
101:47 >> Not at all, in the ways that I think are most relevant. Maybe there are some side cases where you could think about that. So, two things to think about first. One: when I talk about digital twins to companies, I like to frame it around how it's being used, the immediacy of the data from the digital twin. Let's go back 50 years. An example of a digital twin that we still use: air traffic control. When an air traffic controller sits down and looks at a screen, they're not looking at a spreadsheet. They're looking at a digitization of information about physical objects that is meant to give them fast reaction times and help them understand the landscape as effectively as possible. We would call that situational awareness: I've got to take in data about the environment around me, and I've got to be able to act on it as rapidly as possible, to make the right decisions that mitigate anything determined to be a problem or a risk. That's what you're trying to engage a human system with. The visualization of that data is important, or it doesn't have to be visualization, the interpretation of it. It's not the raw data; it's how that data is represented. You want the key, most salient information, in this case about planes, to be actionable by that human, or even by an autonomous system, right?
103:17 human or even autonomous system right >> could you give me an example where in
103:19 >> could you give me an example where in like a more typical home environment
103:20 like a more typical home environment >> we're both into uh reefing and um you
103:25 >> we're both into uh reefing and um you know I built a aquacultured reef in my
103:27 know I built a aquacultured reef in my kitchen partly because I have a a child
103:29 kitchen partly because I have a a child and I wanted her to understand I I love
103:32 and I wanted her to understand I I love I I of it myself. So don't get that
103:34 I I of it myself. So don't get that wrong. It wasn't just all but to
103:36 wrong. It wasn't just all but to understand sort of the fragility of the
103:38 understand sort of the fragility of the ecosystems that happen in the ocean and
103:41 ecosystems that happen in the ocean and things we need to to worry about, care
103:42 things we need to to worry about, care about and and and all. And um you know
103:46 about and and and all. And um you know initially when I started and maybe you
103:49 initially when I started and maybe you know this was is not something you
103:51 know this was is not something you encountered, but when you build aqua a
103:54 encountered, but when you build aqua a reef or a reef tank and and do saltwater
103:56 reef or a reef tank and and do saltwater fish, you're uh a couple things. you're
103:59 fish, you're uh a couple things. you're doing chemical measurements by hand
104:02 doing chemical measurements by hand usually um you know weekly bi-weekly uh
104:06 Usually weekly or bi-weekly, and there are about ten different chemicals you're measuring. I would have my daughter doing that so she would do the science part of it. You're trying to stay within the ranges and tolerances you have, and you're also observing this ecosystem and looking for problems. By the time you see a problem, you're reacting to that problem, and I can tell you it was very unsuccessful. There's lots of error and noise in human measurements, and you don't have the right resolution of measurements. By resolution I mean that measuring every other day, every few days, is not enough to track a problem. You also have the issue of being reactive instead of proactive: you're not sensing things, so the point at which something is visible to you is probably too late to do anything about it.
104:56 So if you look at my reef tank right now, I have a number of digital sensors in it, and I have dashboards. I can track a huge chemical assay in real time, so I can go back and look at the data and understand it. I can see: oh, there was a water change there. I can tell what's happening by looking at the data. And the spectrum of the lights is on a cycle that's representative of the environment the corals you're aquaculturing come from, what their deterministic systems are looking for. So you've built this ecosystem such that when I look at my dashboards, I have a digital twin of that system. My tank is very stable. I can look at the data and understand that an important event happened somewhere that could have been mitigated, or understand that something's wrong, quickly, before it even shows up.
105:58 >> It's amazing. For people who aren't into reefing: multiple people in my life are soon to have kids, and most everybody nowadays has a camera on their kid's sleeping environment, so that if the kid wakes up in the middle of the night, they can see it and hear it. So, camera and microphone: do you think we either have now, or will soon have, AI tools that help us better understand the health status of infants? Parents learn intuitively over time, based on diaper changes, cries, frequency of illnesses, all sorts of things, how well their kids are doing before the kids can communicate it. Do you think AI can help parents be better parents by giving real-time feedback on the health information of their kids? Not just whether they're awake or asleep, or whether they're in some sort of trouble, but really helping us adjust our care of our young. What's more important for our species than supporting the growth of our next generation?
107:05 >> No, absolutely, but I'd go even further on the biological side. Think about digital twins; I'll get to babies in a moment. If you've ever bought a plane ticket, which any of us have today, that's a very sophisticated digital twin. Not the air traffic controllers looking at planes, but the pricing models, and the data going into driving that price in real time. You might be trying to buy a ticket, go back a half hour later, and it's double, or it's gone up on you. That's because it's using constant data from environments, from things happening in the world, from geopolitical issues, and that's driving the price. That is very much an AI-driven digital twin driving the value of that ticket. So there are places where we already use digital twins; that would be an example of something that's affecting our lives that we don't think about as a digital twin, but it is one.
108:10 >> And then think about a different example, where you've got a whole sandbox model. The NFL might have a digital twin of every player that's in the NFL, right? They know the data; they're tracking that information. They know how people are going to perform many times over. What do they care about? They want to anticipate whether someone might be at high risk for an injury so that they can mitigate it.
108:30 >> They're using those kinds of data.
108:31 >> Absolutely. Yeah.
108:32 >> Interesting. I think the word "twin" is the misleading part. I feel like that nomenclature needs to be replaced soon, because people hear "twin" and they think of a duplicate of yourself.
108:43 >> Yes.
108:43 >> I feel like these are...
108:45 >> Well, it's a duplicate of relevant data and information about yourself. But what's the purpose in emulating myself wholesale? It's to emulate key aspects. So imagine me as a physical system. I'm going to digitize some of that data, and the point is how I interact with that data to make intelligent insights and feedback loops, in the digital environment, about how that physical system is going to behave.
109:16 >> Okay. So it's a digital representative.
109:19 >> Yes.
109:19 >> More than a digital twin. I'm not trying to...
109:22 >> There are many digital twins in any digital twin. You already live with lots of them; the world's nomenclature would say these are digital twins, but I like "digital representative." It's informing some aspect of decision-making through many feedback loops, and I'm digitizing different things. And in that situational awareness model, can I give a quick example? I can digitize an environment. I can digitize the space we're in right now. Would that be a digital twin? In situational awareness, first there's the state: what are the sensor limitations, the acuity of the data I've actually brought in? That's like perception, same as with our sensory systems. Then there's comprehension: okay, that's a table, that's a chair, that's a person. Now I'm in the semantic units of relevance that the digitization takes. Then there's the insight: what's happening in that environment, and what do I do with that? That's where things get interesting, and where I think a lot of the future of AI products lies, because then it's the feedback loops on that input and that data. And it becomes interesting and important when you start having multiple layers of relevant data interacting, which can give you the right insights about what's happening and what to anticipate in that space. But that's all about our situational awareness and intelligence in that environment.
110:58 >> Yeah, I can see where these technologies could take us. I think for the general public right now, AI is super scary, because we hear most about AI developing its own forms of intelligence that turn on us. I think people are gradually getting on board with the idea that AI can be very useful. We have digital representatives already out there for us in these different domains.
111:24 >> Absolutely.
111:25 >> And I think being able to customize them for our unique challenges and our unique goals is really what's most exciting to me.
111:33 >> I love that, because what I was trying to say is exactly what you said. Look, they are out there, and these are effectively digital twins. Every company you're interacting with on social media has, effectively, a digital twin of you in some place. It's not there to emulate your body; it's there to emulate your behaviors. Or you're using tools that have digital twins for things you do in your daily life. So the question is: how do we harness that for our success, for individual success, for understanding and agency over what it can mean for you? If the NFL is using it for a player, you can use it as an athlete, meaning an athlete at any level, and it's that digitization of information that can feed you. For my baby, you can better understand a great deal about what is successful, or what isn't successful, about them; not that your baby is always "successful," I don't want to say that, but what is maybe not working well for them.
112:35 But I would say the exciting places for digital twins really come once you start integrating data from different places that tell us about the success of our systems, anchored to actual successes. You used the example of your mattress and sleep, or the one I liked: "I had three very good, focused work sessions." You may have used different words, Andy, but the idea is, okay, you've had those; it's when you can correlate them with other systems and other outputs that it becomes powerful. That's the way a digital representative, or a digital twin, becomes more useful: thinking about the resolution of the data and where the data is coming from, meaning whether it's biometric data or environmental data, or the context of what else was happening during those work sessions, and how that becomes something I don't have to think about, but that AI can use to help me understand where I'm successful and what else drove that success, or drove that state. Because it's not just my success; it's intelligence. I like to call it situational intelligence, as the overarching goal we want to have, and that involves my body and systems having situational awareness. But it's really a lot of integration of data, and AI is very powerful for optimizing that and giving us insights. It doesn't just have to make systems behave; it can give us insight into how effectively we can act in those environments.
114:15 >> Yeah, I think of AI as being able to see what we can't see.
114:18 >> Yes.
114:18 >> So, for instance, if I had some sort of AI representative that paid attention to my work environment and to my ability to focus as I'm trying to do focused work, and it turned out, obviously I'm making this up, that every time my air conditioner clicked over to silent or back on, it would break my focus for the next ten minutes...
114:45 >> Yes.
114:46 >> And I wasn't aware of that. By the way, for people listening, this is entirely plausible, because so many of our states of mind are triggered by cues we're just fundamentally unaware of. Or that it's always at the 35-minute mark that my eyes start having to reread words or lines, because somehow my attention is drifting. Or that it's paragraphs longer than a certain length. It's a near-infinite space for us to explore on our own, but for AI to explore it, it's straightforward. And so it can see through our cognitive blind spots and our functional blind spots. And I think of where people pay a lot of money right now to get information to get around their blind spots: when you have a pain and you don't know what it is, you go to this thing called a doctor. Or when you have a problem and you don't know how to sort it out, you might talk to a therapist, right? People pay a lot of money for that. I'm not saying AI should replace all of that, but I do think AI can see things that we can't see.
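[A sketch of how simple that kind of blind-spot hunting can be once the events are logged. The focus scores and HVAC timestamps below are invented to mirror his hypothetical.]

```python
# Correlate an event log (air-conditioner mode changes) with focus samples.
from statistics import mean

# (minute, focus score 0-1) sampled during a work session (invented data)
focus = [(0, .9), (5, .9), (10, .4), (15, .5), (20, .9),
         (25, .9), (30, .4), (35, .5), (40, .9)]
hvac_clicks = [8, 28]   # minutes when the air conditioner switched modes

def split_by_events(samples, events, window=10):
    """Split focus samples into those within `window` min after an event vs. not."""
    near, far = [], []
    for t, score in samples:
        hit = any(0 <= t - e <= window for e in events)
        (near if hit else far).append(score)
    return near, far

near, far = split_by_events(focus, hvac_clicks)
print(f"mean focus within 10 min of a click: {mean(near):.2f}")   # ~0.45
print(f"mean focus otherwise:                {mean(far):.2f}")    # ~0.90
```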
115:51 >> Two examples to your point, and I love the reading one. There's a point at which you're experiencing fatigue, and ideally, much like the fish tank, you don't want to be reactive; you want to be proactive, to mitigate it. You could stop, or your devices could have that integration of data and respond, giving you feedback when your mental acuity, your vigilance, or just your effectiveness has waned. But also on the level of health: we know AI is huge for identifying a lot of different pathologies out of data that we as humans are just not that good at discerning. With our voice, in the last 10 years we've become much more aware of the different pathologies that can be discerned from AI assessments of our speech: not what we say, but how we say it.
116:48 and not what we say, but how we say it. >> Yeah, there's a lab up in University of
116:49 >> Yeah, there's a lab up in University of Washington, um I think it's Sam Golden's
116:51 Washington, um I think it's Sam Golden's lab who um
116:54 lab who um uh working on some really impressive
116:56 uh working on some really impressive algorithms to analyze speech patterns as
117:00 algorithms to analyze speech patterns as a way to predict suicidality.
117:02 >> Oh, interesting.
117:04 >> And to great success, where people don't realize that they're drifting in that direction. And phones can potentially warn people...
117:12 >> Warn them themselves, right? That they're drifting in a particular direction. People who have cycles of depression or mania can know whether or not they're drifting into one. That can be extremely useful. They can decide who else gets that information. And it's all based on tonality at different times of day, stuff that even in a close, close relationship with a therapist over many years might not be detectable if the person becomes reclusive or something of that sort.
117:43 >> Absolutely. I mean, neurodegeneration shows up in short assessments of how people speak. They've definitely been able to show potential likelihood of psychosis, and that's from syntactic completion and how people read paragraphs. Neurodegenerative conditions like Alzheimer's show up in speech through linguistic cues, sometimes 10 years before a typical clinical symptom would show up and be identified. And what I think is important for people to realize is that it's not someone saying "I don't remember." It's nothing like that. It's not the cues that you think are actually relevant. It's more like an individual says something, something... like that. What I just did, which was purposely stutter: I started a word again, what we might call a stutter in how we're speaking. Sometimes it's the duration of the spaces between starting one sentence and the next. These are things that as humans we've adapted to not pick up on, because picking up on them would make us ineffective in communication, but an algorithm can do it very well.
119:00 Diabetes and heart disease both show up in the voice. Diabetes shows up because you can pick up on dehydration in the voice. Again, I'm a sound person at heart, in my past, and if you look at the spectrum of the sound, you're going to see changes: there are very consistent things in a voice that show up with dehydration in the spectral salience. With heart disease, you get a sort of flutter that shows up. It's a proxy for things happening inside your body, cardiovascular issues, but you're going to see them as modulatory fluctuations in certain frequency bands. And again, we don't walk around as a partner or a spouse or a child caretaking our parents listening for the 4 kHz modulation, but an algorithm can. All of these are places where you can identify something and potentially mitigate it proactively, before there's a problem. And especially with neurodegeneration, we're really just getting to a place where there are pharmacological opportunities to slow something down, and you want to find it as quickly as possible. So you want to have that input so that you can do something about it.
120:21 input so that you can do something about it. You asked me about the babies, you
120:23 it. You asked me about the babies, you know, like before we
120:26 know, like before we the type of coughs we have tell us a lot
120:29 the type of coughs we have tell us a lot about different pathologies. So for a
120:32 about different pathologies. So for a baby their cry their you know if I'm
120:35 baby their cry their you know if I'm thinking you asked me about a digital
120:36 thinking you asked me about a digital tomb where would I be most interested in
120:38 tomb where would I be most interested in using that information if I had you know
120:40 using that information if I had you know children or I mean I do have a child but
120:42 children or I mean I do have a child but from you know in the sort of lowest
120:46 from you know in the sort of lowest touch most opportunity it's to identify
120:50 touch most opportunity it's to identify potential you know pathologies or issues
120:52 potential you know pathologies or issues early based on you know the the natural
120:55 early based on you know the the natural sounds and the natural utterances and
120:58 sounds and the natural utterances and call you know that are happening to
120:59 call you know that are happening to understand if there is something that
121:01 understand if there is something that you know there's a way it could be
121:03 you know there's a way it could be helped. It could be you know need you
121:04 helped. It could be you know need you could proactively
121:06 could proactively um make something much better.
121:08 >> Let's talk about you.
121:10 >> Oh boy.
121:10 >> And how you got into all of this stuff, because you're highly unusual in the neuroscience space. I recall when we were graduate students, you were working on auditory perception and physiology, and then years later, now, you're involved in AI and neuroplasticity; you were at Dolby. What is, to you, the most interesting question driving all of this? What guides your choices about what to work on?
121:39 >> The human-technology intersection, and perception, is my core. I say perception, but the world is data, and how our brains take in the data we consume, to optimize how we experience the world, is what I care about across all of what I've spent my time doing. And for me, technology is such a huge part of that. I like to innovate, I like to build things, but I also like to think about how we improve human performance. Core to improving human performance is understanding how we're different, not just how we're similar: the nuances of how our brains are shaped and how they're influenced. That's why I've spent so much time on neuroplasticity. It sits at the intersection of everything: how are we changing, how do we harness that, and how do we make it something we have agency over? Whether that's through the technologies we build and innovate, down to "I want to feel better, I want to be successful; I don't want that to be something left to surprise me." Right?
122:42 >> So you asked me how I got there. I was a violinist back in the day; I'm still a violinist, and music is a part of my life. I was studying violin and engineering when I was in undergrad, and I think we alluded to the fact that I have absolute pitch. Absolute pitch, for anyone who doesn't know, is not anything that means I always sing in tune. What it means is that I hear sound the way people see color. And I can't really turn it off; I can kind of push it back.
123:21 >> Wait, sorry. Don't we all hear sound like we see? I mean, I hear sounds and I see colors. Could you clarify what you mean?
123:26 >> Okay. When you walk down the street, your brain is going, "That's red, that's black, that's blue, that's green." My brain is going, "That's an A, that's a B, that's a G, that's an F."
123:34 >> I see. You're categorizing.
123:36 >> There's a categorical perception about it. And because of the nature of my exposure to sound over my life, I also know what frequency it is. I can say: that's 350 Hz, or that's 400 Hz, or that's 442 Hz. And it has different applications. I can transcribe a jazz solo when I listen to it; that's a great party trick. But it's not necessarily a good thing for a musician. You know as well as I do that categorical perception is something we all have in different forms, usually for speech and language, for the units of vowels or phonetic units. Especially with vowels, you can hear many different versions of an "e" and still hear it as an "e"; that's what we call categorical perception. My brain does the same thing for a set of frequencies, hearing them as an A. And that can be good at times, but when you're actually a musician, there's a lot more subtlety that goes into how you play with other people, what key you're in, the details. If you ask me to sing Happy Birthday, I'm always going to sing it in the key of G if I'm left to my own devices, and I will get you there somehow if we start somewhere else.
124:57 start somewhere else. M so what happened to me when I was in music school when I
125:00 to me when I was in music school when I was in conservatory and also engineering
125:01 was in conservatory and also engineering school is um I was taking two things
125:04 school is um I was taking two things happened. I knew that I had to override
125:06 happened. I knew that I had to override my brain because it was not allowing me
125:09 my brain because it was not allowing me the subtlety I wanted to play my shots
125:14 the subtlety I wanted to play my shots or play my chamber music in the ways
125:15 or play my chamber music in the ways that were that I was having to work too
125:19 that were that I was having to work too hard to override what you know these
125:22 hard to override what you know these these sort of categories of sounds I was
125:24 these sort of categories of sounds I was hearing. So I started playing early
125:27 hearing. So I started playing early music. Early music, Baroque music for
125:29 music. Early music, Baroque music for anyone. I I said I think I said earlier
125:31 anyone. I I said I think I said earlier A has is a social construct. Today we
125:34 A has is a social construct. Today we typically as a set as a standard A is
125:37 typically as a set as a standard A is 440 hertz. Um if you go back to like the
125:40 440 hertz. Um if you go back to like the 1700s, A was uh 415 hertz in the Baroque
125:45 1700s, A was uh 415 hertz in the Baroque era and 415 hertz is effectively a G
125:49 era and 415 hertz is effectively a G sharp. So it's the difference between H
125:51 sharp. So it's the difference between H and H. Okay. And um what would happen to
125:56 and H. Okay. And um what would happen to me when I was trying to override this is
125:58 me when I was trying to override this is I was playing in an early music ensemble
126:00 I was playing in an early music ensemble and I would tune my violin up and I
126:02 and I would tune my violin up and I would see a on the page and I'd hear
126:04 would see a on the page and I'd hear G#arp in my brain and it was completely
126:08 G#arp in my brain and it was completely it it was it was I was terrible. I was
126:10 it it was it was I was terrible. I was like always it was really hard for my
126:12 like always it was really hard for my brain to override and uh I mean wind
126:15 brain to override and uh I mean wind brass and wind players do this all the
126:17 brass and wind players do this all the time. It's like transposition and they
126:19 time. It's like transposition and they modulate to the key that they're in and
126:21 modulate to the key that they're in and they doesn't their brains have evolved,
126:23 they doesn't their brains have evolved, you know, through their training and
126:26 you know, through their training and neuroplasticity to be able to not have
126:29 neuroplasticity to be able to not have the same sort of experience I had.
126:31 the same sort of experience I had. Anyhow,
126:33 Anyhow, long story long, I uh was also taking a
126:36 long story long, I uh was also taking a neuroscience course. This neuroscience
126:38 neuroscience course. This neuroscience course, we were reading papers about
126:40 course, we were reading papers about sort of different mapmaking and
126:41 sort of different mapmaking and neuroplasticity. And I read this paper
126:44 neuroplasticity. And I read this paper by a professor at Stanford named Eric
126:47 by a professor at Stanford named Eric Kudson. And Eric Kudson did these
126:49 Kudson. And Eric Kudson did these amazing well he did a lot of seminal
126:52 amazing well he did a lot of seminal work for how we understand the auditory
126:54 work for how we understand the auditory pathways as well as how we form
126:56 pathways as well as how we form multiensory objects and and the way the
126:58 multiensory objects and and the way the brain integrates um you know cells data
127:02 brain integrates um you know cells data from across our modalities meaning you
127:04 from across our modalities meaning you know sight and sound. Um but in this
127:07 know sight and sound. Um but in this paper what he was doing was he had
127:10 paper what he was doing was he had identified cells in the brain that
127:12 identified cells in the brain that optimally responded their receptive
127:14 optimally responded their receptive fields. You know receptive field being
127:15 fields. You know receptive field being that sort of like in all of that giant
127:18 that sort of like in all of that giant data set of the world it's that you know
127:20 data set of the world it's that you know it's the the set of data that optimally
127:24 it's the the set of data that optimally causes that cell to respond. And for
127:27 causes that cell to respond. And for these cells, they cared about a
127:29 these cells, they cared about a particular location in auditory and
127:31 particular location in auditory and visual space, which you know, frankly,
127:33 visual space, which you know, frankly, for mammals, we don't have the same sort
127:35 for mammals, we don't have the same sort of like cells because we can move our
127:37 of like cells because we can move our eyes back and forth in our sockets
127:39 eyes back and forth in our sockets unlike owls. And he studied owls. And
127:41 unlike owls. And he studied owls. And owls have a very hardwired map of
127:44 owls have a very hardwired map of auditory visual space.
127:45 >> On the other hand, if I hear a click off to my right, I turn my head to the right.
127:48 >> You turn your head; it triggers a different, vestibular-ocular response that moves all of that, yes. But in this case, he had these beautiful hardwired maps of auditory-visual space. And then he would rear and raise these owls with prism glasses that effectively shifted their visual world by 15 degrees. And then, and this is key to driving neuroplasticity, he would put them in important, not stress exactly, but let's say situations where they had to do something critical to their survival or their well-being. So they would hunt and feed and do things like that with this 15-degree shift. And consequently, he saw the auditory neurons' dendrites realign to the now 15-degree visually shifted cells. The realization that they developed a secondary map, aligned with the 15-degree shift of the prism glasses, alongside their original map, was super interesting for understanding how our brains integrate data, feedback, and neuroplasticity.
128:57 So I go back to my Baroque violin, where I'm always out of tune, and I'm tuning up my Baroque violin, and I realize I had developed absolute pitch at A415. I had developed a secondary absolute pitch map. And then I would go play Shostakovich right after, at A440, and I had that map too. I have nothing in between, but I could modulate between the two. And that's the point at which I said: I think my brain is a little weird, and I just did something that I need to go better understand. So that's how I ended up here as a neuroscientist.
129:33 >> I know Eric's work really well. Our labs were next door; our offices were next door. He's retired now, but...
129:40 >> He knows; I told him the story.
129:42 >> He's wonderful. I think one of my favorite things about those studies, which I think people will find interesting, is that if an animal, human or owl, has a displacement in the world, something's different, something changes, and you need to adjust to it. It could be new information coming to you that you need to learn in order to perform your sport correctly, or to perform well in class, or an emotionally challenging situation that you need to adjust to. All of that can happen, but it happens much, much faster if your life depends on it.
130:22 >> Yes.
130:22 >> And we kind of intuitively know this, but one of my favorite things about his work is where he said: okay, these owls can adjust to the prism shift, their maps in the brain can change, but the maps sure as heck form much faster if you say, "Hey, in order to eat," in other words, in order to survive, "these maps have to change." And I like that study so much because we hear all the time that it takes 29 days to form a new habit, or 50 days, or whatever it is. Actually, you can form a new habit as quickly as is necessary to form that new habit. And so the limits on neuroplasticity are really set by how critical it is.
131:02 >> Yeah.
131:02 >> And of course, there are limits at the other end, too: if you put a gun to my head right now and said, "Okay, remap your auditory world," I can't do that quickly. But it's a reminder to me, and thank you for bringing up Eric's work, that neuroplasticity is always in reach. If the incentives are high enough, we can do it. Yeah.
131:28 can do it. Yeah. >> And so I think with AI it's going to be
131:31 >> And so I think with AI it's going to be very interesting or with technology
131:32 very interesting or with technology generally. You know our ability to form
131:35 generally. You know our ability to form these new maps of experience at least
131:37 these new maps of experience at least with smartphones has been pretty
131:39 with smartphones has been pretty gradual. I really see 2010 as kind of
131:41 gradual. I really see 2010 as kind of the beginning of the smartphone and then
131:43 the beginning of the smartphone and then now by 2025
131:45 now by 2025 >> we're in a place where most everyone
131:47 >> we're in a place where most everyone young and old has integrated this new
131:49 young and old has integrated this new technology. I think AI is coming at us
131:51 technology. I think AI is coming at us very fast and it's not unclear what form
131:53 very fast and it's not unclear what form it's coming at us and and where and as
131:55 it's coming at us and and where and as you said it's already here. And I think
131:57 you said it's already here. And I think um we will adapt
131:59 um we will adapt >> for sure. We'll form the necessary maps.
132:01 >> for sure. We'll form the necessary maps. I think uh being very conscious of which
132:03 I think uh being very conscious of which maps we're are changing is so key. I
132:06 maps we're are changing is so key. I mean I think we're still doing a lot of
132:07 mean I think we're still doing a lot of cleanup
132:08 cleanup >> of the kind of detrimental aspects of
132:11 >> of the kind of detrimental aspects of smartphones. Short wavelength light late
132:13 smartphones. Short wavelength light late at night.
132:14 at night. >> Um you know being in contact with so
132:16 >> Um you know being in contact with so many people all the time maybe not so
132:18 many people all the time maybe not so good. I mean, I think what scares
132:20 good. I mean, I think what scares people, certainly me, is the idea that,
132:22 people, certainly me, is the idea that, you know, we're going to be doing a lot
132:23 you know, we're going to be doing a lot of error correction over the next 30
132:25 of error correction over the next 30 years because we're going so fast with
132:26 years because we're going so fast with technology because maps can change
132:29 technology because maps can change really, really fast.
132:29 >> Well, they do change. Sam Altman,
132:32 um, I saw him
132:35 say this, and I actually thought it was a
132:36 really good description. It's like, you
132:38 know, Gen X, or, you know, there's a
132:40 group that is using AI as a tool that's
132:44 sort of novel, interesting. Then, you
132:46 know, you've got millennials, who
132:48 are using it as, you know, a search
132:52 algorithm, and maybe that's even Gen X,
132:54 but, you know, it's a little more
132:55 deeply integrated. But then you go
132:57 to younger generations, and
132:59 it's an operating system, and it already
133:01 is, and that has major changes in neural
133:05 structure, not just, you know, for maps,
133:07 but also for neural processes for how we
133:10 deal with information, how we learn. Uh,
133:13 you know, the idea that we are very
133:15 plastic under pressure? Absolutely. And
133:18 that's where it gets interesting to talk
133:20 about different species, too. I mean,
133:22 we were talking about owls, and that was
133:23 under pressure. But, you know, what is
133:26 successful human performance in training
133:28 and all of these things? It's to make
133:29 those probabilistic situations more
133:32 deterministic. Right? That's when you,
133:34 if you're training as an athlete,
133:36 you're really trying to not have to
133:38 think and to have the fastest reaction
133:40 time to very complex behaviors given
133:43 complex stimuli, complex situations and
133:46 contexts. You know, that
133:47 situational awareness or physical
133:49 behavior in those environments, you
133:51 want that as fast as possible, with as
133:53 little cognitive, you know, load as
133:55 possible. And, you know, it's like, that
133:57 execution is critical. You love looking
133:59 execution is critical. You love looking across species. So do I. and looking for
134:02 across species. So do I. and looking for these ways where you know we we are a
134:06 these ways where you know we we are a brain is changing or you've got a
134:08 brain is changing or you've got a species that can do something that is
134:11 species that can do something that is absolutely not what you would predict or
134:14 absolutely not what you would predict or it's incredible in its you know how it
134:17 it's incredible in its you know how it can evade a predator how it can find a
134:20 can evade a predator how it can find a target you find a a mate and you know
134:23 target you find a a mate and you know it's doing things that are critical to
134:25 it's doing things that are critical to it being able to survive much as you
134:27 it being able to survive much as you said like I if I make it something that
134:30 said like I if I make it something that is absolutely necessary for success.
134:34 is absolutely necessary for success. It's going to do it. You know, one of my
134:36 It's going to do it. You know, one of my favorite examples is a particular moth
134:39 favorite examples is a particular moth that bats predate on um echolocating
134:41 that bats predate on um echolocating bats and and you know, frankly,
134:42 bats and and you know, frankly, echolocating bats are sort of nature's
134:44 echolocating bats are sort of nature's engineered amazing predatory species.
134:47 engineered amazing predatory species. You know, their their brains when you
134:49 You know, their their brains when you look at them, you know, are are just
134:51 look at them, you know, are are just incredible. They have huge amounts of
134:53 incredible. They have huge amounts of their their brain just dedicated to
134:56 their their brain just dedicated to what's called a FM constant frequency FM
135:00 what's called a FM constant frequency FM sort of sweep that some of the bats you
135:02 sort of sweep that some of the bats you know elicit a call that's sort of likeoo
135:06 know elicit a call that's sort of likeoo but really high
135:07 but really high >> so we so we can't hear it what does that
135:10 >> so we so we can't hear it what does that do for them
135:10 >> It's doing two things. One: that constant
135:13 frequency portion is allowing them to
135:15 sort of track the Doppler of a moving
135:16 object. And they're even,
135:20 uh, I mean, it's so clever and
135:22 sophisticated: they're changing,
135:25 subtly, what
135:28 frequencies they elicit the call at, so
135:30 that it always comes back in the same
135:32 frequency range, because that's where
135:33 their heightened sensitivity is.
135:35 >> So they're
135:37 modifying their vocal cords to make sure
135:39 that the call comes back in the same
135:41 range, and then they're tracking how much
135:42 they've had to modify the
135:45 call.
135:45 >> Just so that people are on board: yeah,
135:47 bats echolocate. They're sending out
135:50 sound, and they can measure distance, and
135:51 they can essentially
135:54 >> see in their mind's eye. They can sense
135:56 distance. They can sense the speed of
135:58 objects. They can sense the shape of objects
136:00 by virtue of sounds being sent out and
136:02 coming back. Absolutely.
136:03 >> And they're shaping those sounds
136:04 going out differently so that they can
136:06 look at multiple objects simultaneously.
136:07 >> But also, they're shaping the sounds
136:09 they send out so that whatever comes
136:12 back is in their optimal neural
136:15 range, so that they don't have to go
136:16 through more neuroplasticity. They
136:18 already have circuits that are
136:20 really dedicated to these certain
136:21 frequency ranges. And so they send it
136:24 out, and then they're keeping track of
136:25 the deltas. They're keeping track of how
136:27 much they've had to change it, and that's
136:29 what, you know, tells them the
136:30 speed. So that constant frequency is a
136:32 lot like, you know, the ambulance sound
136:33 going by. That's the compression of
136:35 sound waves that you hear when
136:39 things move past you at speed. That's
136:40 the Doppler effect. And then it also
136:42 usually has a really fast FM,
136:45 frequency-modulated, sweep, and that lets
136:47 me take kind of an imprint, you know.
136:49 So one's telling me the speed of the
136:51 object, and another one's telling me sort of
136:53 what the surface structure looks like,
136:56 right? That FM sweep lets me get, you
136:59 know, a sonic imprint of what's there, so
137:01 I can tell topography. I can tell if
137:03 there's, you know, a moth on a hard
137:06 surface, right?
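To make the Doppler arithmetic concrete, here is a minimal Python sketch of the compensation strategy described above: the bat shifts its emitted frequency so the echo lands back in the narrow band where its hearing is sharpest, and the size of that shift encodes the target's closing speed. This is an illustration, not anyone's published model; the 83 kHz reference (roughly the range reported for horseshoe bats) and the 4 m/s closing speed are assumed values.

```python
# Minimal sketch of Doppler-shift compensation. All numbers are illustrative.

C_SOUND = 343.0  # speed of sound in air, m/s, at roughly room temperature

def echo_frequency(f_emit: float, closing_speed: float) -> float:
    """Frequency of the echo from a target approached at closing_speed (m/s)."""
    return f_emit * (C_SOUND + closing_speed) / (C_SOUND - closing_speed)

def compensated_emission(f_ref: float, closing_speed: float) -> float:
    """Emission frequency that makes the echo land exactly on the reference f_ref."""
    return f_ref * (C_SOUND - closing_speed) / (C_SOUND + closing_speed)

def speed_from_compensation(f_ref: float, f_emit: float) -> float:
    """Invert the compensation: recover closing speed from the 'delta' the bat tracked."""
    ratio = f_ref / f_emit
    return C_SOUND * (ratio - 1.0) / (ratio + 1.0)

if __name__ == "__main__":
    f_ref = 83_000.0  # Hz; assumed "sweet spot" of heightened sensitivity
    v = 4.0           # m/s; assumed closing speed toward a moth
    f_emit = compensated_emission(f_ref, v)
    print(f"emit {f_emit:.0f} Hz -> echo returns at {echo_frequency(f_emit, v):.0f} Hz")
    print(f"speed read off the delta: {speed_from_compensation(f_ref, f_emit):.2f} m/s")
```

Round-tripping compensated_emission through echo_frequency returns exactly f_ref, which is the point of the strategy: the echo always lands on the dedicated circuitry, and the adjustment itself carries the speed information.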
137:09 So, what's beautiful about
137:11 other species is, you've got a little
137:14 moth, and you've got nature's predatory
137:18 marvel, and about 80% of the time, that
137:20 moth gets away.
137:21 >> How?
137:22 >> Multiple things. I call it almost an
137:24 acoustic arms race that's happening
137:25 between the two, and there's a lot of
137:28 acoustic subterfuge on the moth's part,
137:30 you know, but there are also beautiful
137:32 deterministic responses that they have.
137:36 And, um, so first: deterministic
137:38 behaviors, again, be it an athlete, be it,
137:41 you know, effectiveness, being fast, quick
137:44 in making good decisions that get you
137:46 the right answer, are always important.
137:48 So, you know, moths have just a few
137:50 neurons. When that echolocating bat is
137:52 flying, you know, at a certain point,
137:54 when those neurons start firing,
137:56 the moths will start
137:58 flying in more of a random pattern.
137:59 You'll see the same thing with seals
138:00 when there are great white sharks
138:02 around, right? It's decreasing the
138:04 probability that, you know, it's easy
138:06 for them to continue to track you. So
138:08 they'll fly in a random pattern, and
138:09 then, when their neurons saturate, when
138:12 those calls get close enough, the moth will drop to
138:15 the ground, with the idea that, yeah,
138:18 assuming we don't live in cities, in a
138:20 natural world the ground is, you know,
138:23 weeds and grass; it's a difficult environment
138:26 for an echolocating bat to locate you in,
138:29 right? So that is just a deterministic
138:31 behavior that will happen regardless. But
138:34 then the interesting part is, their body
138:36 effectively acts as a set of reflectors,
138:40 so that the bat may put out its call, and
138:42 the moth deflects, you know, the energy of
138:46 the call away from its body. So you're
138:48 deflecting it away from critical
138:50 areas. And, you know, this is all
138:53 happening, and the
138:57 changes in the physical body are
139:00 interesting, but then it's the behavioral
139:02 differences that are really key, right?
139:05 It's how fast that moth reacts. If it
139:07 had to question, you know, if it were
139:10 cognitively responsive instead of being
139:12 deterministic in its behavior, it
139:14 wouldn't escape, right? But it gets away.
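As a sketch of how deterministic that escape policy is, here is a toy two-threshold model of the moth behavior just described: a faint call triggers erratic flight, a near-saturating call triggers the drop. The threshold values are invented for illustration; the point is that nothing resembling a decision is computed.

```python
import random

# Toy model of the moth's hard-wired, two-stage escape policy. Thresholds are
# illustrative; call_intensity stands in for how strongly the bat-call-tuned
# neurons are firing, from 0 (silence) to 1 (saturation).

DETECT_THRESHOLD = 0.2    # call faintly audible: a bat is hunting nearby
SATURATE_THRESHOLD = 0.8  # neurons near saturation: the bat is closing in

def moth_response(call_intensity: float) -> str:
    if call_intensity >= SATURATE_THRESHOLD:
        return "drop to the ground"  # last-ditch, fully deterministic response
    if call_intensity >= DETECT_THRESHOLD:
        # an unpredictable flight path makes the moth harder to track
        return random.choice(["veer left", "veer right", "loop", "spiral"])
    return "keep flying straight"

for intensity in (0.05, 0.4, 0.95):
    print(f"intensity {intensity:.2f} -> {moth_response(intensity)}")
```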
139:17 >> Yeah, I've never thought about bats
139:19 and moths. I was about to say
139:23 I never got the
139:24 insect bug, no pun
139:27 intended,
139:29 because, um, I don't think of things in
139:34 the auditory domain. I think of things
139:36 in the visual domain. And some insects
139:38 are very visual. But, um, it's
139:42 good for me to think about that. You
139:43 know, one of my favorite people,
139:45 although I never met him, was Oliver
139:46 Sacks, the neurologist and writer.
139:48 And he claimed to have spent a lot of
139:51 time imagining, just sitting in a chair
139:53 and trying to imagine what life would be
139:55 like as a bat, as a way to enhance his
139:58 clinical abilities with patients
140:00 suffering from different neurologic
140:02 disorders.
140:03 >> Huh. So when he would interact with
140:04 somebody with Parkinson's, or with severe
140:07 autism, or with locked-in syndrome, or any
140:10 number of different deficits of
140:12 the nervous system, he
140:16 felt that he could go into their mind
140:19 a bit to understand what their
140:21 experience was like. He could empathize
140:23 with them, and that would make him more
140:24 effective at treating them. And he
140:26 certainly was very effective at laying
140:28 out their experience in ways
140:31 that brought about a lot of compassion
140:33 and understanding. Like, he never
140:34 presented a neural condition in a way
140:37 that made you feel sorry for the person.
140:40 It was always the opposite.
140:42 >> Um, and I should point out, not trying to
140:44 be politically correct here, but when I
140:46 say autistic, I mean the patients he
140:47 worked with were severely autistic, to
140:49 the point of, you know, never being able
140:51 to take care of themselves.
140:53 We're not talking about being along a
140:54 spectrum. We're talking about the far
140:55 end of the spectrum: needing
140:59 assisted living their entire lives, and
141:00 being, from a sensory
141:04 standpoint, extremely sensitive, couldn't
141:05 go out in public, that kind of thing.
141:06 We're not talking about people who
141:08 are functioning with autism. So, um,
141:12 apparently thinking in the auditory
141:14 domain was useful for him. So I should
141:16 probably do that. So I have one final
141:18 question for you, which is
141:22 really two questions. First question:
141:25 why did you sing to spiders? And second,
141:29 what does that tell us about
141:31 spiderwebs? Because I confess I know the
141:34 answers to these questions, but I was
141:36 absolutely blown away to learn what
141:37 spiderwebs are actually for. Um, and
141:42 you singing to spiders reveals what
141:45 they're for. So why did you sing to
141:46 spiders?
141:47 >> Two things. And, um, you can watch me sing
141:49 to a spider in a TED talk I gave a few
141:52 years ago.
141:53 >> We'll put it here.
141:53 >> Okay. And, um, so maybe
141:58 this comes back to the fact that I have absolute pitch,
142:00 so I know what frequencies I'm singing.
142:02 But I also recognize, by having absolute
142:04 pitch, that my brain is just a little
142:06 different. Again, you asked me what
142:07 threads drive me. It's always been: we
142:10 do experience the world differently. And
142:12 I believe that our success, everyone's
142:14 success, and the success of our growth as
142:17 humans, is partly dependent on how we
142:19 use technology to help, you know, improve
142:22 and optimize each of us, with, you know,
142:24 the different variables we need. Right?
142:27 So different species and how they
142:29 respond to sound is very interesting to
142:31 me. And as much as, I know, Andy, you
142:36 look at how different species respond to
142:39 color and to information in the world, be
142:41 it cuttlefish or such, I have jellyfish,
142:43 too, and I can see how, you know,
142:46 their pulsing rates change, with their
142:47 photoreceptors, you know,
142:49 under different light colors. It's very
142:51 obvious when some of them, you know,
142:53 are under
142:54 stress versus when they're in a more
142:57 calming state. And so it's like:
142:58 understanding the stimuli in our world
143:00 that shape us, those changes, is a huge
143:03 part of being human, in my perspective.
143:05 In this case, it happens to be an orb
143:07 spider, the one I sing to. And when I
143:10 hit about 880 hertz, you will see the
143:12 spider kind of dance. But this
143:16 particular species, and not all spiders
143:18 will do this, is predated on by
143:20 echolocating bats and birds, so it makes
143:23 sense that, you know, it tunes its
143:25 web accordingly. And the orb weavers
143:28 are all over California.
143:29 They show up a lot around
143:31 Thanksgiving, October,
143:33 November, for anyone that's out
143:35 here on the west coast. Um,
143:37 they're not bad spiders. They are
143:38 not spiders you need to get rid of.
143:40 They're totally happy spiders, unlike some,
143:42 you know, that maybe you should worry
143:44 about more. Anyhow, they tune their webs
143:47 to resonate like a violin. And, you
143:51 know, you'll see it: as I hit a certain
143:52 frequency, it'll effectively tell me
143:55 to go away. And it's a
144:00 pretty interesting sort of deterministic
144:02 response. Other insects do different
144:04 things. The one kind of funny thing from
144:08 that was, when my daughter was, I think at
144:11 the time, about two and a half or
144:13 three, she kind of adopted
144:16 asking me, when we would see spiders,
144:18 if it was the kind we should
144:20 sing to or the kind we shouldn't touch.
144:24 >> Those were the two classes.
144:25 >> So, uh, amazing. So, if I understand
144:29 correctly, these orb spiders use their
144:31 web
144:32 >> Yes.
144:33 >> more or less as an instrument to detect
144:35 certain sound frequencies in their
144:36 environment.
144:37 >> Resonances, absolutely.
144:38 >> So that they can respond appropriately,
144:40 yeah,
144:40 either by raising their legs to protect
144:43 themselves, or to attack, or whatever it
144:45 is. So the spiderweb is a
144:47 functional thing, not just for catching
144:49 prey. It's a detection device also.
144:52 And we know that because when prey are
144:53 caught in a spiderweb, they wiggle,
144:55 and then the spider goes over to it and
144:57 wraps it and eats it. But, um, the
144:59 idea that it would be tuned to
145:01 particular frequencies is really wild.
145:04 >> Yeah. Not just any vibration, right? You
145:06 know, there's the idea that, with any
145:08 vibration, I know I've got, you know,
145:09 food somewhere, I should go to that food
145:11 source. But instead, it's: if
145:13 I experience a threat or something,
145:15 I'm going to behave. And that is a more
145:18 selective, you know, response that I've
145:20 tuned it towards.
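For anyone who wants the signal-processing view of the singing experiment, here is a rough Python sketch (using NumPy) that estimates the dominant frequency of a one-second tone and asks whether it falls near the web's resonance. The ~880 Hz figure comes from the conversation; the 30 Hz tolerance band is a made-up parameter.

```python
import numpy as np

# Rough sketch: estimate the dominant frequency of a recorded tone and check
# whether it sits near an assumed web resonance. Tolerance is hypothetical.

SAMPLE_RATE = 44_100    # Hz
WEB_RESONANCE = 880.0   # Hz (an A5, per the anecdote above)
TOLERANCE = 30.0        # Hz; assumed width of the response band

def dominant_frequency(signal: np.ndarray) -> float:
    """Return the frequency of the largest FFT peak in the signal."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / SAMPLE_RATE)
    return float(freqs[np.argmax(spectrum)])

def web_responds(signal: np.ndarray) -> bool:
    return abs(dominant_frequency(signal) - WEB_RESONANCE) <= TOLERANCE

if __name__ == "__main__":
    t = np.arange(SAMPLE_RATE) / SAMPLE_RATE   # one second of audio
    sung_a5 = np.sin(2 * np.pi * 880.0 * t)    # on-resonance note
    sung_c5 = np.sin(2 * np.pi * 523.3 * t)    # off-resonance note
    print("A5 triggers response:", web_responds(sung_a5))  # True
    print("C5 triggers response:", web_responds(sung_c5))  # False
```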
145:21 >> It's so interesting, because if I just
145:23 transfer it to the visual domain, it's
145:25 like, yeah, of course. Like, if an
145:27 animal, including us, sees something
145:29 like a looming object coming at us,
145:31 >> Yeah.
145:31 >> uh, closer to dark, our
145:34 immediate response is to either freeze
145:35 or flee. Like, that's just what we do.
145:37 The looming response is one of the most
145:38 fundamental responses, but that's in the
145:40 visual domain. So the fact that there
145:41 would be auditory cues that would bring
145:43 about, as you said, the sort of
145:45 deterministic responses seems very real.
145:47 I feel like the wail of
145:50 somebody in pain
145:51 >> Yes.
145:52 >> evokes a certain response. Um,
145:55 yesterday there was a lot of noise
145:56 outside my window at night, and there
145:58 was a moment where I couldn't tell: were
145:59 these shouts of glee or shouts of
146:02 fear?
146:04 >> And, like, I couldn't tell. And then I heard
146:06 this kind of high-pitched
146:10 fluttering that came after the scream,
146:12 and I realized these were kids playing
146:14 in the alley outside my house. And
146:16 I went and looked, and I was like, oh yeah,
146:17 they're definitely playing. But I
146:19 knew even before I went and looked, based
146:21 on the kind of flutter of sound
146:24 that came after the shriek.
146:28 It was like, and then, well,
146:30 I can't reproduce the sound at
146:32 that high frequency.
146:33 >> That's, that's, um,
146:35 >> so the idea that this would be true all
146:37 the time is super interesting. We
146:40 just don't tend to focus on our
146:41 hearing, unless of course somebody's
146:43 blind, in which case they have to rely on
146:44 it much more.
146:45 >> So, two interesting things to go with
146:47 that. So, like crickets, for example:
146:49 crickets have bimodal neurons that
146:52 have sort of peaks in two different
146:54 frequency ranges for the same neuron.
146:55 And each frequency range will elicit a
146:58 completely different behavior.
147:00 So you've got a peak at 6k, and
147:02 you've got a peak at 40k, and,
147:06 this is the same neuron, the cricket hears
147:07 6k from a speaker, and it runs over to it,
147:10 because that's got to be a mate, or, you
147:12 know, something like that. And it hears 40k, and it
147:14 runs away. And, you know, it's very
147:16 predictable behavior.
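A toy model makes the bimodal-tuning idea concrete: one sensitivity lobe near 6 kHz drives approach, one near 40 kHz drives escape, and the louder lobe wins. The Gaussian lobe shapes, widths, and response threshold below are invented for illustration.

```python
import math

# Toy model of a two-peaked ("bimodal") auditory tuning curve: a lobe near
# 6 kHz (the range of cricket song, approach) and a lobe near 40 kHz (the
# range of bat calls, flee). All parameter values are illustrative.

def lobe(freq_hz: float, center_hz: float, width_hz: float) -> float:
    """Gaussian-shaped sensitivity of one tuning lobe, ranging 0..1."""
    return math.exp(-((freq_hz - center_hz) / width_hz) ** 2)

def cricket_behavior(freq_hz: float) -> str:
    approach_drive = lobe(freq_hz, 6_000, 2_000)   # conspecific-song lobe
    escape_drive = lobe(freq_hz, 40_000, 8_000)    # bat-call lobe
    if max(approach_drive, escape_drive) < 0.1:
        return "no response"
    return "approach speaker" if approach_drive > escape_drive else "run away"

for f in (6_000, 40_000, 20_000):
    print(f"{f} Hz -> {cricket_behavior(f)}")
```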
147:19 Uh, I spent a good period of time
147:21 working with a non-human
147:23 primate species, marmosets. Marmosets
147:26 are very interesting when you get to a
147:27 more, you know, a more
147:30 sophisticated neural system. Um,
147:32 marmosets are very social.
147:36 You know, it's critical to their
147:37 happiness. If you ever see a single
147:38 marmoset in a zoo or something, that's
147:40 a very unhappy animal. But
147:43 they're native to the Amazon. You know,
147:46 New World monkeys, native to Brazil and
147:48 the Amazon. But they're arboreal. They
147:49 live in trees, and they're very social.
147:52 So those things can, you know, be in
147:54 conflict with each other, because you're,
147:56 you know, in dense foliage, but yet you
147:59 need to communicate. So they've evolved
148:01 very interesting systems to be able to,
148:04 you know, achieve what they needed to.
148:06 For one, um, if you ever see a
148:10 marmoset, they're very stoic, unlike
148:11 macaque monkeys, which, you know, often have a
148:14 lot of visual, you know, expression of how
148:16 they're feeling. Marmosets always look
148:18 about the same. Um, but their
148:22 vocalizations are almost like birdsong,
148:24 and they're very rich in the information
148:27 that they're, you know, communicating.
148:29 They also have a pheromonal system.
148:32 Like, you know, um, you can
148:34 have a dominant female in the colony who
148:38 may not be visible, because you have to have ways
148:39 of communicating: when one sense is
148:41 compromised, the other senses sort of
148:43 rise up to help assure that
148:46 what that, you know, that
148:49 species or system needs is going to,
148:51 you know, thrive. And in the case of
148:54 marmosets, you can have the dominant
148:55 female effectively cause the ovulation,
148:58 like the biology, of all the
149:00 other females to change. And you can take a female
149:03 and put her in the same proximity,
149:06 but now as part of a different group, and
149:10 her biology will change. I mean, the
149:12 pheromonal
149:13 interactions that happen are very powerful, because
149:15 those are things that can travel even
149:17 when I can't see you. One thing, when I
149:20 was working with them, you know, that I
149:21 thought was remarkable, and I never published it, I like
149:25 writing patents more than publishing
149:26 papers. But these things are real,
149:28 because I was studying pupillometry,
149:31 understanding the power of, you know,
149:33 their saccades. I could know what they
149:34 were hearing based on their eye
149:36 movements, right? So, if I play their calls:
149:38 marmosets have, you know, calls, and some of
149:40 their calls are really antiphonal.
149:42 They're to see: hey, are you out there?
149:45 Am I alone? Who else is around?
149:46 >> Texting, for humans. Yeah.
149:48 >> Yeah. And sometimes it's light, or
149:49 sometimes it might be like, oh, you know,
149:51 be careful, there's, you know,
149:54 somebody, you know, around that we've got to
149:56 watch out for, maybe there's a leopard on
149:57 the ground or something, right?
149:59 And then sometimes it's like: you're in
150:01 my face, get out of here now, right? And
150:04 those are three different things, and I
150:06 can play them, and I can tell you,
150:08 without hearing it, exactly
150:09 what's being heard. In the case of the
150:11 antiphonal "hey, are you out there?", you see,
150:12 like, the eye will just start
150:14 scanning back and forth, right? Because
150:16 that's the right movement. I'm looking
150:18 for: where's this coming from?
150:19 >> Yeah. They paired the right eye movement
150:20 with the right sound.
150:21 >> Exactly. In the case of, you know, look,
150:24 there's something to
150:26 be scared of, threatened by, you're going to
150:27 see dilation, and you're also going to
150:29 see some scanning, but it's not as slow.
150:32 It's a lot faster, because there's a
150:33 threat to me. You know, my
150:35 autonomic system and my cognitive system
150:37 are reacting differently. And in
150:40 the case of "you're in my face," it's
150:41 going to be, you know, even
150:44 without seeing you, if I hear another,
150:47 you know, sort of aggressive sound, I'm
150:49 going to react. You
150:50 know, I'm not scanning anywhere, but
150:53 my dilation is going to be fast, and, you
150:55 know, I'm also going to be much
150:57 more on top of things.
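Here is a schematic sketch of that pupillometry logic: inferring which call type was heard from eye behavior alone. The feature names and cutoff values are hypothetical; only the three response patterns (antiphonal, alarm, aggressive) come from the conversation.

```python
from dataclasses import dataclass

# Schematic classifier mapping eye behavior to the call type heard. Feature
# names and all numeric cutoffs are invented; the three patterns mirror the
# description above.

@dataclass
class EyeMeasurement:
    scan_rate_hz: float      # how fast gaze sweeps back and forth
    dilation_speed: float    # pupil dilation rate, arbitrary 0..1 units

def infer_call_type(eye: EyeMeasurement) -> str:
    if eye.dilation_speed > 0.8 and eye.scan_rate_hz < 0.5:
        return "aggressive call: fast dilation, no scanning"
    if eye.dilation_speed > 0.4 and eye.scan_rate_hz > 2.0:
        return "alarm call: dilation plus rapid scanning"
    if eye.scan_rate_hz > 0.5:
        return "antiphonal call: slow scanning, searching for the caller"
    return "no distinctive response"

for sample in (EyeMeasurement(1.0, 0.1),   # leisurely scanning
               EyeMeasurement(3.0, 0.6),   # fast scanning plus dilation
               EyeMeasurement(0.2, 0.9)):  # strong dilation, gaze fixed
    print(infer_call_type(sample))
```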
150:59 But we do this as, you know, humans, too, right? And
151:02 it's like, you walk into a
151:04 business meeting, you walk into a
151:05 conference room, and, you know, it's these
151:07 subtle cues. We can't,
151:10 we don't always suppress them. We show them
151:12 whether we think we do or we don't. But,
151:14 you know, when you look at species like
151:15 that, it's very much like, okay, you know,
151:18 there's a lot of, you know,
151:20 sophistication in how their bodies
151:22 are helping them be successful, even in a
151:24 world or an environment that has a lot
151:27 of things that could, maybe, you know,
151:29 come after them.
151:29 >> So interesting to think
151:32 about that in terms of our own human
151:35 behavior and what we're optimizing for,
151:38 especially as all these technologies
151:39 come on board, and are sure to come on
151:41 board even more quickly. Um, Poppy, thank
151:45 you so much for coming here today to
151:47 educate us about what you've done,
151:50 what's here now, and what's to come. We
151:52 covered a lot of different territory,
151:54 and I'm glad we did, because you
151:57 have expertise in a lot of areas, and I
151:59 love that you are constantly thinking
152:00 about technology development. And, you
152:03 know, I drew a little diagram for myself
152:05 that I'll just describe for people,
152:07 because, if I understood correctly, one
152:10 of the reasons you got into neuroscience
152:12 and research at all is this
152:16 interface between inputs and us. And what
152:20 sits in between those two things is this
152:23 incredible feature of our nervous
152:24 systems, which is neuroplasticity,
152:26 or what I sometimes like to refer to as
152:29 self-directed plasticity, because, unlike
152:31 other species,
152:33 we can decide what we want to change and
152:35 make the effort to adopt a second
152:38 map of the auditory world or visual
152:41 world, or take on a new set
152:43 of learnings in any domain. And we can do
152:45 it if we put our mind to it; if the
152:47 incentives are high enough, we can do it.
152:49 And at the same time, neuroplasticity is
152:51 always occurring based on the things
152:53 we're bombarded with, like new technology. So
152:55 we have to be aware of how we are
152:57 changing, and we need to intervene at
153:00 times and leverage those things for
153:02 our health. So, thank you so much for
153:05 doing the work that you do, thank you
153:06 for coming here to educate us on it,
153:07 and, um, keep us posted. We'll provide
153:10 links to you singing to spiders and
153:12 all the rest. My mind's blown. Thank
153:15 you so much.
153:16 >> Thank you, Andy. Great to be here.
153:18 >> Thank you for joining me for today's
153:19 discussion with Dr. Poppyrum. To learn
153:22 more about her work and to find links to
153:24 the various resources we discussed,
153:25 please see the show note captions. If
153:27 you're learning from and/or enjoying
153:28 this podcast, please subscribe to our
153:30 YouTube channel. That's a terrific
153:32 zero-cost way to support us. In addition,
153:34 please follow the podcast by clicking
153:36 the follow button on both Spotify and
153:38 Apple. And on both Spotify and Apple,
153:40 you can leave us up to a five-star
153:41 review. And you can now leave us
153:43 comments at both Spotify and Apple.
153:45 Please also check out the sponsors
153:47 mentioned at the beginning and
153:48 throughout today's episode. That's the
153:50 best way to support this podcast. If you
153:52 have questions for me, or comments about
153:54 the podcast, or guests or topics that
153:55 you'd like me to consider for the
153:57 Huberman Lab podcast, please put those
153:59 in the comment section on YouTube. I do
154:01 read all the comments. For those of you
154:02 that haven't heard, I have a new book
154:04 coming out. It's my very first book.
154:06 It's entitled Protocols: An Operating
154:08 Manual for the Human Body. This is a
154:10 book that I've been working on for more
154:11 than five years, and that's based on more
154:13 than 30 years of research and
154:15 experience. And it covers protocols for
154:17 everything from sleep, to exercise, to
154:20 stress control, to protocols related to
154:22 focus and motivation. And of course, I
154:25 provide the scientific substantiation
154:27 for the protocols that are included. The
154:29 book is now available by pre-sale at
154:31 protocolsbook.com.
154:33 There you can find links to various
154:35 vendors. You can pick the one that you
154:36 like best. Again, the book is called
154:38 Protocols: An Operating Manual for the
154:41 Human Body. And if you're not already
154:43 following me on social media, I am
154:44 Huberman Lab on all social media
154:46 platforms. So that's Instagram, X,
154:48 Threads, Facebook, and LinkedIn. And on
154:51 all those platforms, I discuss science
154:53 and science-related tools, some of which
154:55 overlaps with the content of the
154:56 Huberman Lab podcast, but much of which
154:58 is distinct from the information on the
154:59 Huberman Lab podcast. Again, it's
155:01 Huberman Lab on all social media
155:03 platforms. And if you haven't already
155:05 subscribed to our Neural Network
155:06 Newsletter, the Neural Network
155:08 Newsletter is a zero-cost monthly
155:09 newsletter that includes podcast
155:11 summaries as well as what we call
155:12 protocols, in the form of one-to-three-page
155:15 PDFs that cover everything from how to
155:17 optimize your sleep, how to optimize
155:18 dopamine, to deliberate cold exposure. We
155:21 have a foundational fitness protocol
155:22 that covers cardiovascular training and
155:24 resistance training. All of that is
155:26 available completely zero-cost. You
155:28 simply go to hubermanlab.com, go to the
155:30 menu tab in the top right corner, scroll
155:32 down to newsletter, and enter your
155:34 email. And I should emphasize that we do
155:35 not share your email with anybody. Thank
155:38 you once again for joining me for
155:39 today's discussion with Dr. Poppyrum.
155:41 And last, but certainly not least, thank
155:43 you for your interest in science.