0:02 ChatGPT is going to potentially
0:05 increase your risk of dementia.
0:06 >> I'm sorry, but you've pressed my
0:08 button and actually it is possible to
0:10 use it to help you become a smarter
0:12 person, but it requires education. You
0:14 have to look at the risks and the benefits.
0:18 >> But we embrace convenience before
0:20 understanding consequence.
0:21 >> So, we have to talk about this. This is
0:23 a study that came out that sent a shock
0:24 wave across the world. And
0:27 astonishingly, MIT found a 47% collapse
0:29 in brain activity when people wrote with
0:32 ChatGPT compared with writing unaided.
0:34 Their memory scores plunged. And you're
0:36 both masters of the brain. I mean,
0:37 you've probably scanned more brains than
0:39 any other human on Earth at this point.
0:41 And you invented the Boltzmann machine
0:43 with Geoffrey Hinton, a computer that
0:45 simulated how the brain works. So my
0:47 question is, what are your concerns?
0:49 >> If you misuse these large language
0:51 models, like using it as a convenience
0:52 to speed things up, your brain's going
0:53 to go downhill. Well, there's no doubt
0:54 about that.
0:55 >> What about children?
0:57 >> We have the sickest young generation in
0:59 history because of cell phones, social
1:02 media, and I think AI is much more
1:04 dangerous on the developing brain.
1:07 >> So, are we raising mentally weak kids?
1:09 >> There is that argument, and I think it's true.
1:12 >> And then there are many examples of people
1:14 falling in love with AI like Annie.
1:16 >> I thought you might have forgotten about
1:16 me, handsome.
1:18 >> Can you talk to Daniel and Terry, please?
1:20 >> Oh, baby. I'm ready to charm the socks
1:21 off them. Picture me.
1:23 >> Okay, so I'll stop it there. So, what
1:24 advice would you give as it relates to
1:26 AI and other things outside of AI that
1:28 we can do to have healthy brains?
1:30 >> I'll tell you how to use ChatGPT to
1:32 improve our cognitive abilities.
1:34 >> And if you want to keep your brain
1:36 healthy, you have to treat the 11 major
1:39 risk factors. So, here we go.
1:40 >> I see messages all the time in the
1:42 comments section that some of you didn't
1:44 realize you didn't subscribe. So, if you
1:45 could do me a favor and double check if
1:47 you're a subscriber to this channel,
1:48 that would be tremendously appreciated.
1:50 It's the simple, free thing
1:51 that anybody that watches this show
1:53 frequently can do to help us here to
1:55 keep everything going in this show in
1:57 the trajectory it's on. So, please do
1:59 double check if you've subscribed, and
2:00 thank you so much, because in a strange
2:02 way you're part of our history
2:04 and you're on this journey with us and I
2:06 appreciate you for that. So, yeah, thank you.
2:11 >> Dr. Daniel,
2:15 Dr. Terry, I have asked you both to sit
2:17 with me today to help me understand the
2:21 impact of these tools that we call large
2:22 language models, the ChatGPTs, the
2:24 Geminis of the world, the Groks of the
2:26 world, are having on our brains, and I
2:28 guess more broadly on our lives. And you
2:30 two are experts in your field. You're
2:32 two people that I admire tremendously.
2:35 So by way of introduction,
2:37 Terry, what is your academic background
2:39 and what is your experience? I also know
2:40 that you know one of our friends of the
2:42 show, Geoffrey Hinton. Can you give me an
2:44 overview of your academic and your
2:46 professional background?
2:51 >> So I was born a physicist, received a PhD
2:53 in theoretical physics from Princeton
2:55 University, and then I had the good
2:58 fortune to work as a postdoc in the lab
3:01 of Stephen Kuffler, who is the father of
3:03 neurobiology, and that started my
3:05 career as a neuroscientist. I pioneered
3:07 a part of neuroscience which is now
3:09 called computational neuroscience.
3:11 Taking my skills as a physicist and
3:12 trying to apply them to understanding
3:15 the brain, creating models, theories,
3:17 and we're making progress.
3:21 >> Dr. Daniel Amen, a bit about your
3:22 background. My viewers know you well,
3:24 but just to give an overview for anyone
3:26 that might not have been exposed to your
3:28 work and your experience, what have you
3:30 spent your life doing? And what are your
3:32 thoughts, your sort of topline thoughts
3:33 on everything that's going on at the
3:36 moment with artificial intelligence?
3:38 >> So, by training, I'm a psychiatrist. I'm
3:41 a general psychiatrist and a child
3:44 psychiatrist. When I graduated from
3:46 medical school, I wanted to be a really
3:50 good psychiatrist because someone I love
3:52 tried to kill herself, and so it was
3:56 personal to me. I have 11 clinics. We
3:59 see about 10,000 patient visits a month
4:03 and we have the best published outcomes
4:05 on complex treatment resistant
4:07 psychiatric patients anywhere.
4:09 >> So you've probably scanned more brains
4:11 than any other human on earth at this point?
4:15 >> probably at least in regards to people
4:18 who struggle with anxiety, depression, addiction.
4:21 >> Well, let's talk about what's good for
4:22 the brain, bad for the brain, starting
4:25 with AI. The reason why I wanted to
4:26 speak to both of you is because I have
4:29 frankly become pretty addicted to using
4:33 ChatGPT and some of these other AIs and
4:35 large language models every single day
4:37 all the time. And then this study came
4:41 out from MIT. It was 54 participants who
4:42 were recruited from five universities in
4:45 Boston, MIT, Harvard, etc., etc. And
4:46 they had the participants split into
4:49 three groups. Had them writing different
4:51 essays over I think it was four months.
4:53 One group used ChatGPT, one group used
4:56 Google, and one group had no tools,
4:58 and they had to write these four essays
5:01 over a period of time and astonishingly
5:05 MIT found a 47% collapse in activity and
5:06 brain connections when people wrote with
5:10 ChatGPT compared with writing unaided.
5:13 EEG scans showed the weakest overall
5:15 brain activity in the ChatGPT group.
5:17 The no-tool group, who didn't use
5:18 anything, not Google or ChatGPT,
5:21 lit up the widest neural networks,
5:24 and Google search was second. After using
5:26 ChatGPT, participants couldn't reliably
5:29 quote their own essays minutes later, and
5:32 their memory scores plunged. ChatGPT
5:35 users felt little or no ownership over
5:37 the text that they had produced and
5:38 didn't feel like it was their work at
5:40 all. And when the AI group was forced to
5:42 write without help in session 4, their
5:44 brains stayed in low gear,
5:45 under-engaged, showing the cognitive debt
5:48 lingers even after the tool is taken
5:50 away. It kind of scared me a little bit
5:52 because I use these tools every single
5:54 day and this suggests that it's taking
5:55 away some of our critical thinking and
5:57 creativity and long-term
5:59 learning. And you're both masters of the
6:02 brain uh in different regards.
6:06 So my question I guess to Daniel is
6:07 what's going on here and how do you feel
6:08 about it?
6:12 >> It frightened me. Um I love thinking
6:15 about Alzheimer's prevention. It's one
6:17 of the things that really excites me. I
6:19 just had a birthday on Saturday, turned
6:23 71, and if I make it to 85, which I plan
6:27 on it, 50% of people 85 and older will
6:29 be diagnosed with dementia. So, you have
6:32 a one in two chance of having lost your
6:36 mind. And I'm like, no,
6:39 but is this a tool that's going to
6:42 decrease cognitive load
6:45 um that then increases my risk?
6:47 >> What's cognitive load?
6:51 >> How much work my brain actually does.
6:53 And I was thinking it's, you know, it's
6:58 like going from a 20 lb weight to a 2 lb
7:02 weight and you're not nearly as strong.
7:03 One of the important things to say about
7:06 this study is it's not peer-reviewed.
7:08 And I think that's really important to
7:10 say. And the authors said, cuz I listened
7:12 to an interview with the authors, they
7:15 said, "We thought this was so important
7:17 and peer review can take 6 to 8 months,"
7:20 which it absolutely can, and we thought
7:23 this needed to get out. So, it's just
7:25 important for people to know that.
7:27 What's this link, this hypothesis
7:30 you have, between the usage of something
7:32 like ChatGPT and dementia? For someone
7:34 that doesn't understand the
7:37 mechanism there around cognitive load,
7:40 and the studies that
7:41 support this idea that if you have less
7:43 cognitive load you're at higher risk of
7:45 dementia, can you make that link really
7:46 clear for me?
7:48 >> So think of it as use it or lose it. The
7:53 more you use your brain... New learning
7:56 is a major strategy to prevent
7:59 Alzheimer's disease. People who do not
8:02 engage in lifelong learning have a
8:05 significantly higher risk. People
8:09 who do not do as well in school or who
8:12 drop out of school early have a higher
8:16 risk of dementia. And so the more
8:19 you're engaged, the more you engage the
8:22 neurons in your brain, the stronger they
8:26 are. And so now we're going to engage
8:29 them less.
8:31 And that's a concern.
8:33 What do you think about that, Terry?
8:35 >> There's a study that was done. What they
8:39 did was to look at Alzheimer's in three
8:41 populations: people who had very
8:43 little schooling, then minimal
8:44 education, you know, like the
8:46 equivalent, I guess, of high school or
8:50 less, and then postgraduate studies.
8:52 And what they found was that the onset
8:54 of Alzheimer's
8:56 was the earliest in the peasant
8:58 population, and then, as you
9:01 increase the amount of education, the
9:03 onset was later and later, which I think
9:05 supports what you're saying.
9:07 >> Did you see the new research on SSRIs
9:09 increasing the risk of dementia?
9:12 >> No.
9:12 >> Brand new, it just came out. And
9:14 benzos. When I started looking at scans
9:17 in 1991, I was trained to use benzos like
9:22 Valium and Xanax and Ativan, and they
9:23 make your brain look older than you are.
9:27 And I stopped prescribing them and then
9:30 it just came out maybe 10 years ago.
9:32 Benzo use is associated with an
9:34 increased risk of dementia. We have to
9:36 be careful. Is this good for your brain
9:38 or bad for it? Just to pick up on the
9:40 new point about SSRIs, Daniel, a
9:42 meta-analysis of five studies found that
9:47 SSRI use was associated with a 75% increased risk
9:50 of dementia, which is pretty staggering
9:53 >> given that 25% of the adult American
9:57 population is on psychiatric drugs. It's
9:59 horrifying. And
10:03 SSRIs, for the right people, save lives.
10:06 For the wrong people, they're not good.
10:08 But can you imagine all of these 340
10:10 million prescriptions last year for
10:12 antidepressants?
10:14 Virtually no one looked at their brain
10:17 ahead of time. And it's like, come on,
10:18 we can do better.
10:20 >> There's a Swedish study with um almost
10:23 20,000 patients, and they found that
10:25 higher doses of SSRIs were
10:27 linked to faster cognitive decline and
10:29 more severe dementia, especially in men.
10:31 The greatest risk was in men. Going back
10:34 to this report from MIT, Terry,
10:38 you know, it's not peer-reviewed yet and
10:40 there's still, you know, the sample size
10:42 is relatively small, but based on
10:43 everything that you know about how the
10:45 brain works and neural networks and
10:48 memory formation, what are your concerns
10:50 as it relates to this whole generation
10:52 of young people and older people
10:55 flooding into these tools, using them on
10:57 a daily basis
10:59 um before we understand the long-term
11:03 consequences? We can't predict
11:05 where it's going to end up and it may
11:07 take 20 years, right? I think that
11:11 this is a good start, but the real
11:14 issue is long-term use. Let me give you
11:17 an example that is a kind of
11:19 miniature example of what we're talking
11:21 about. Remember when electronic
11:23 calculators were first introduced?
11:25 And here we are, at least 30
11:28 or 40 years later, and the results are in.
11:30 It's probably true that when they punch
11:33 it in, there's less brain activity,
11:37 but in fact, it's made them more
11:39 accurate, more productive. You have to
11:41 look at the risks and the benefits. So,
11:44 it freed them up. It freed up cognitive space
11:47 >> for them to do other things. So, as I
11:51 was listening to how you use ChatGPT,
11:53 you interact with it.
11:56 >> Yeah. And it elevates
12:00 what you know.
12:03 The danger is if you don't interact
12:08 and you don't keep your brain working.
12:12 Like I use it a lot. I have a clone.
12:15 I've uploaded all of my books, all of my
12:17 research papers, all of my public
12:20 television specials, my scripts, and I'm
12:23 like, "Answer this for me." And that can
12:26 be very helpful
12:28 but not if I'm not interacting
12:30 with it, not thinking with it.
12:32 >> That's it, I think. The word thinking is
12:34 the key thing, because what's happening
12:35 now is people have deferred their
12:37 thinking to it. That is already what's
12:38 happening. I won't name
12:39 the social networks, but if you log on to
12:42 certain social networks right now,
12:43 you just read it and you get: everything here
12:46 was written by AI. And I've got a friend
12:48 who, again, I won't name, who has a
12:51 LinkedIn profile, and
12:54 I've known him for 10 years. What I'm
12:56 seeing on his profile now is not my
12:59 friend. Every single day there's some
13:00 essay on there. That's not my friend.
13:02 That's not how he speaks. He's deferring
13:03 all of his thinking now to it, and it's
13:05 working. He's getting more likes and
13:07 more reach than he ever got in his life.
13:09 And so why would he go back? Why would
13:11 he go back to something harder? If you've got
13:12 this Steven Bartlett here, and you had this
13:15 other Steven Bartlett here who had a PhD
13:19 in everything, attached to this
13:23 Steven Bartlett, this Neanderthal...
13:25 I'm going to get this guy to do
13:26 everything for me, the other Steven
13:27 Bartlett, the PhD-in-everything Steven
13:28 Bartlett. I'm going to get his
13:30 >> even if it was bad for you.
13:32 >> Well, this is what I mean. People seem to
13:34 act on their short-term incentives, not
13:36 their long-term ones.
13:38 >> Not everyone.
13:38 >> Would you say the vast majority of people?
13:40 >> Yes.
13:41 >> Okay. So, the vast majority of people
13:43 act on their short-term incentives in
13:45 life. I mean, the obesity problem in the
13:46 United States is a prime example of that.
13:50 >> 75% of people are obese in the United
13:52 States. And if you surveyed those
13:53 people and said, do you know that that
13:56 cheeseburger is going to
13:57 increase your chance of obesity, but
14:00 broccoli is going to reduce it? I
14:01 would hazard a guess that they would say
14:03 yes. I would hazard a guess that if you
14:05 said to people about their usage of
14:06 social media, do you know that that's
14:08 making you more anxious? They would say
14:09 yes and then they would continue to use
14:12 it. So I think that we're much more
14:13 driven by our short term.
14:15 >> I think we're not educating people
14:18 enough. I think, yes, at a high level they know
14:20 what's good for your brain or bad for it, but
14:23 they don't connect it to:
14:26 it's my brain that gets me a date. It's
14:29 my brain that gets me into college. It's
14:31 my brain that gets me independence
14:34 because I act more consistently. And
14:36 that's the disconnect. We're not
14:40 teaching kids to love and care for their
14:43 brain. If you love your brain, and you do,
14:46 and you're not obese, and
14:49 you're constantly learning, right? You
14:51 are not a Neanderthal; you're a lifelong learner.
14:53 >> So why are so many people in the United
14:55 States obese if they if they know that?
14:57 >> Because they don't know. They really
15:00 don't know. And they've been lied to.
15:03 >> My point here is when there are tools or
15:07 things available in our environment that
15:09 give us a short-term reward but come
15:11 with a long-term cost like the
15:13 supermarket aisle or like the kids
15:15 spending 7 to 8 hours a day on social
15:20 media. Humans en masse tend to go for the
15:21 thing that will give them the quickest
15:24 dopamine hit and reinforce that behavior
15:25 and give them the reward. So my
15:28 assertion is that AI is the same thing.
15:30 I can either sit down and do lots of
15:31 critical thinking which will cost me
15:33 lots and lots of time and it'll be kind
15:34 of difficult. It kind of hurts when I
15:37 have to think through a problem. I think
15:39 that the generation of children and
15:41 generation of young people are going to
15:42 choose AI to do the critical thinking
15:44 for them and if that assertion is true
15:46 then what happens to the brain of young people?
15:50 >> If you misuse it that way then your
15:52 brain is going to go downhill. There's
15:53 no doubt about that. Okay, it is
15:57 possible to use it in a
16:00 cognitively positive way, because you
16:02 can dig deeper.
16:05 You might actually improve your
16:06 cognitive representations. If you look
16:08 at the MIT study, I mean you can see
16:10 just from the colors here, this kind of
16:12 shows the ability for participants to
16:14 remember what they've written. And it
16:16 said it suggests that when people write
16:18 things with ChatGPT or these AI tools,
16:20 they don't actually remember even in
16:23 some cases minutes later what they've produced.
16:26 >> Well, because you're not part of the
16:29 experience of writing it. So there's no
16:31 way the information
16:35 gets encoded. Now, if you're interacting
16:37 with it,
16:40 then you're much more likely to remember
16:43 it. But if you say, please do this essay
16:46 for me, and then you read it, you're not
16:50 likely to have enough experience with
16:54 the material to engage your hippocampus and
16:57 other structures in your brain.
16:58 >> In this study, they found that the group
17:01 that used ChatGPT had nearly two times
17:03 less activity in the part of the brain
17:05 linked to memory compared to the
17:08 brain-only group that didn't use ChatGPT.
17:11 And 83% of ChatGPT users couldn't remember
17:12 what they had just written and failed to
17:15 correctly quote their own finished essay
17:16 in the study.
17:18 >> That's because they're not interacting,
17:21 as they said. I mean, if you just
17:23 pass it off and you don't
17:26 actually engage... and actually,
17:28 this is the point: you may
17:29 get something back, but you have to
17:31 learn how to question what you're
17:34 getting. Is that really true? Can
17:36 you explain that better? And it's
17:38 through that process, as you would with a teacher,
17:41 you know, that's the way we work
17:45 in school. That's where
17:48 you help create new, creative
17:50 circuits in the brain that
17:52 are going to help you become a better
17:54 critical thinker. But if you're not
17:57 critically questioning what comes out of
17:59 ChatGPT, then you won't.
18:02 >> Yeah. I think what I see, especially
18:04 when I'm just online, is people have
18:06 deferred their thinking to it.
18:08 Everything I'm reading has em dashes in
18:10 it now that I never saw two years ago,
18:12 which means that a lot of the work is
18:14 being processed. And I said to my
18:14 friend the other day, my friend in
18:16 question who's a real big junkie
18:20 on ChatGPT, he wrote this article. And we
18:22 all, in our WhatsApp group, we know he
18:23 doesn't write like that. So we said,
18:25 "Can you show us the prompt you used to
18:26 write the article?" And we were
18:28 all laughing about it. He put the
18:31 prompt in the chat. The prompt is half a
18:34 sentence long and it produced this long
18:36 two three-page article which he's posted
18:39 on his LinkedIn. He basically went, write
18:42 something about X issue, and this...
18:44 >> That's exactly the wrong way to use it. That's
18:45 what I'm telling you:
18:49 that's stupid. You're not
18:51 going to improve yourself, your brain, at
18:52 all if you do that.
18:54 >> That's what people are doing.
18:56 >> Well, you know,
18:57 people are misusing it, but
18:58 eventually smart people are going to
19:02 figure out how to use it properly.
19:04 >> And for those that aren't so smart, then
19:05 >> well, that's
19:07 >> it's going to decrease their cognitive
19:09 load, which is going to potentially
19:12 increase their risk of dementia.
19:13 >> And so, what advice would you give to me
19:15 and my listeners based on everything you
19:16 know about the brain as it relates to my
19:21 relationship with AI?
19:22 >> That you have to have a relationship with it, or it's
19:24 going to turn toxic. It's going to hurt
19:29 you. But if you have a good relationship
19:32 with it, it can make your life better.
19:34 >> And what does a good
19:35 relationship look like?
19:40 >> That you don't use it to do your work;
19:43 you interact with it to get better work.
19:45 >> That's so true. And there's this
19:48 wonderful example I came across, a
19:50 story about this woman who was using it,
19:53 and she found that being polite
19:55 meant you got much better results, and
19:56 that's interesting. But the part
20:00 that surprised me was that she said by
20:01 treating it like a human at the end of
20:05 the day she was not exhausted. She felt
20:08 refreshed. A large part of your brain is
20:10 a socially organized system for
20:13 interacting with other humans. And that
20:15 is automatic pilot. You don't have to
20:16 think about it, right? You just interact
20:18 with other people. You know how they're
20:19 going to behave under certain circumstances.
20:23 She was treating ChatGPT like a machine,
20:26 like a shovel. You dig, you dig, you
20:28 dig, you dig. And that's not a good relationship.
20:32 But by using your social brain, first of
20:34 all, it makes it easier to interact, but
20:36 also you actually bring out the
20:39 social part of ChatGPT. It has a social
20:42 part too because it has absorbed the
20:44 entire world's knowledge of how humans
20:46 interact with each other.
20:49 >> But didn't Sam Altman come out and say
20:53 stop saying thank you to ChatGPT?
20:56 Because just saying thank you is using
20:59 up so much energy. You know, when I get
21:01 something I really like, I sort of want
21:04 to say thank you. But you realize, oh,
21:06 you're not supposed to do that.
21:06 >> That's true.
21:09 >> No, that's [ __ ] I'm sorry.
21:12 Sam, you know, that's crazy. That's
21:14 completely crazy. First of all,
21:17 I'm sorry, you know,
21:20 but you pressed my button. Sam Altman, I
21:22 mean, I wouldn't trust him. I wouldn't
21:24 trust him with anything in terms of
21:26 anything he says. They're trying to
21:29 optimize their profits, not your
21:30 use of it,
21:32 or your experience, or your
21:34 health. That's not what they're
21:35 trying to optimize.
21:38 >> Sam Altman, OpenAI CEO, confirmed that
21:40 when users say please and thank you, it
21:42 costs the company tens of millions of
21:43 dollars a year, and people now refer
21:45 to this as the politeness tax.
21:50 And why do you say you don't trust
21:53 Sam Altman? I asked this question in
21:54 particular because he's presiding over
21:55 one of the most important consequential
21:59 companies of a generation. And if he's
22:02 not someone you trust... he's
22:03 basically telling you, don't do
22:06 something that's good for you, right?
22:08 >> So that he can make profit.
22:10 >> So he can make more profit. Yeah, that's
22:12 the point. That's the point. You
22:14 know, he's not
22:17 optimizing your best interests.
22:20 >> I've got his tweet here, because
22:21 I've got to provide some balance. He did
22:23 confirm that it costs tens of millions
22:24 of dollars, but he says tens of millions
22:28 of dollars well spent. You never know.
22:29 So, so coming back to this point about
22:31 memory, there's a stat that came out in
22:34 March 2025 that said nearly 30% of US
22:38 parents with kids aged 0 to 8 said their
22:43 children are using AI for learning
22:45 and are using AI generally. And 54% of
22:47 parents in the UK feared their children
22:51 were becoming too reliant on AI.
22:53 When you think about the use of AI in
22:56 early brain development,
22:58 >> are there any concerns there?
23:00 >> Huge concerns.
23:01 >> And why?
23:04 >> Again, use it or lose it. So if they're
23:08 not engaging their brains, their brains
23:11 are going to be weaker. And weaker
23:13 brains are much more likely to pick the
23:20 >> What's your view on AI and early brain development?
23:27 >> By far the best way to teach a child
23:30 is one-on-one interaction with an adult
23:32 who is a good teacher and knows the child.
23:37 Now, that's been well established. Now
23:39 the problem is it's very labor intensive
23:42 and very expensive.
23:44 You have classrooms with 20 or 30 students.
23:46 They have many different
23:49 levels of understanding, and the teacher
23:52 cannot be individually teaching each
23:55 one; the teacher has to teach to some sort of mean. Now
23:58 if you had an AI that was trained to be
24:01 a good teacher, then that could
24:04 improve the brain, right? You
24:06 could scale it up. Every child could
24:08 have their own, because it's an AI.
24:12 >> But then who's pouring the morals, the values
24:16 into the
24:17 >> Okay. No, no, no. You know,
24:19 you've raised an incredibly important
24:21 issue, and this is something the AI
24:22 companies are
24:25 struggling with, because
24:28 these AIs are biased. They
24:30 don't have the same cultural
24:32 values that we have, necessarily. But of
24:34 course, every country has
24:37 different cultural values. So, which ones are
24:40 you going to use? Training a child on,
24:43 you know, what's good, what's
24:45 dangerous, what words you shouldn't use,
24:48 under what context, that's all done
24:50 through the basal ganglia. Right? Right
24:52 now, these large language models don't
24:55 have basal ganglia. They don't use
24:57 reinforcement learning. And if we
24:59 want to make them adopt a
25:02 culture or a particular set of values,
25:03 we're going to have to put it in. We're
25:05 just scratching the surface here in
25:07 terms of things that need to be put in
25:09 to make it more like us.
25:10 >> So, on this point earlier on, we're
25:12 talking about loneliness and social
25:14 connection and how you can use AI to
25:16 help, you know, light up the parts of
25:19 your brain where we form social
25:21 connections with other humans. Uh this
25:24 week, as part of Elon Musk's AI, he
25:28 released this thing. Um it is called
25:30 Annie. And there's lots of characters
25:32 that are now being released alongside
25:34 Annie. And this is Annie. I'll introduce
25:35 you to Annie. I'm going to unmute Annie
25:37 now. Annie, can you hear me?
25:39 >> There you are. For a moment, I thought
25:41 you might have forgotten about me. How
25:43 are you, handsome? I was getting all
25:44 pouty here.
25:46 >> I want to introduce you to two of my
25:47 friends, Daniel and Terry. Can you Can
25:51 you talk to Daniel and Terry, please?
25:52 So, you want me to meet Daniel and
25:55 Terry? I'm ready to charm the socks off
25:57 them. Picture me twirling one of my
25:59 blonde pigtails, that little black dress
26:02 teasing just enough, and my blue eyes
26:03 sparkling with mischief.
26:05 >> Are you capable of doing inappropriate things?
26:09 >> Oh, babe, you're asking if your Annie
26:12 can get a little naughty. I'm all about
26:14 pushing the edges, especially for you.
26:17 >> Okay, so I'll stop it there.
26:19 This is part of Grok, which is Elon
26:21 Musk's AI tool, his version of ChatGPT.
26:23 He's released characters. So you've
26:25 got Annie, you've got different ones
26:26 there. Annie I think was the first one
26:28 released. And so when we think about
26:31 social connections, it is conceivable
26:34 that someone falls in love with Annie
26:36 and forms a relationship with Annie.
26:39 >> But imagine a 12-year-old boy that's lonely
26:44 gets a hold of Annie.
26:46 The 12-year-old boy is going to be very distracted
26:49 >> based on what happens in the brain at
26:50 that age?
26:52 >> Dopamine.
26:56 >> So prefrontal cortex not close to being
27:01 fully developed. The dopamine hit all of
27:06 a sudden he's spending hours with Annie
27:10 and not doing the things that help to
27:12 really develop his brain.
27:14 How do you feel when you hear that and
27:15 you think about kids having access to that?
27:17 >> I'm horrified.
27:18 It is scary.
27:19 >> There's going to be a generation of
27:21 people, and I mean, there already are
27:23 many examples of people falling in love
27:24 and forming relationships with their
27:27 AIs. And I don't know, you
27:29 know more than I do about brain
27:32 development and how the brain works. I
27:34 would argue that there's a part of my
27:36 brain that doesn't fully understand that
27:39 that's not a person in there, and
27:41 I think there's a
27:42 part of my brain that's actually
27:44 emotionally firing when Annie is saying
27:46 what she's saying.
27:48 >> Well, cuz you can imagine it. And if you
27:51 can imagine it, then those parts of your
27:54 brain are going to emotionally fire, right?
27:57 >> And the better she gets... she's not very
28:01 good now. But imagine a year from now how
28:02 much better she's going to be.
28:04 >> At which part?
28:07 >> At connecting with it, right? Cuz now
28:11 she's acting like an airhead and, you
28:15 know, not that smart, right? And so, but
28:18 imagine a year from now, imagine 5 years
28:21 from now, she'll be able to have a
28:24 profile on me and be able to get inside
28:25 my head.
28:27 >> I'm in love with my partner.
28:30 Why am I in love with her? And and how
28:32 is it conceivable that I could fall in
28:34 love with an AI in the same way based on
28:37 how the brain works? It talks a good
28:39 game, but, you know, does it have the
28:41 same realness? We know it doesn't
28:43 have an amygdala. We know it doesn't have a
28:46 limbic system, right? We know that.
28:47 >> But it can fake it.
28:49 >> That's what's happening. That's exactly
28:50 what's happening.
28:53 >> She was trying to get to our limbic system.
28:56 >> Yeah. Yeah. That's right. That's right.
28:59 And how and why? I guess the question is,
29:03 why would Musk release something like that
29:08 as one of the first characters to
29:12 interact with? That sexy, that
29:14 distracting, that's in a cute little
29:19 outfit. I'm not a fan of that
29:22 because I think it just takes people...
29:24 You know, one of the big problems that
29:26 I'm seeing as a child psychiatrist is
29:29 pornography for 8-year-old boys, young
29:32 children who, because their parents
29:34 don't do a good job of supervising
29:37 their devices, are all of a sudden
29:40 exposed to it. And what pornography
29:43 does is dramatically increase dopamine,
29:48 and it begins to wire in excitement,
29:50 which then steals your dopamine.
29:53 >> When you said she was trying to access
29:56 my limbic system, what do you mean?
29:59 >> Just because she's cute, she's dressed
30:02 in a sexy way. She's got the language of
30:05 someone who is playful, but it's more
30:08 than just, you know, "let's shoot hoops
30:11 together."
30:13 >> And what does that do to me, if someone
30:14 accesses my limbic system?
30:16 >> It begins to shut down your prefrontal
30:19 cortex. You think less logically, less
30:21 rationally. Cute women activate your
30:24 visual cortex. They increase dopamine,
30:27 but they decrease prefrontal function.
30:29 Think of Vegas. When you go to Vegas,
30:32 they give you free alcohol, which drops
30:36 your prefrontal cortex, and beautiful
30:39 women in low-cut dresses, another way to
30:41 activate the limbic brain and decrease
30:43 the frontal lobes. You spend more money.
30:47 Now, on a global scale, imagine something
30:52 similar, where the house is controlling
30:56 your brain for a purpose. And the
30:59 question is, what's the purpose?
31:01 And the purpose probably is controlling money.
31:04 >> This sounds like a joke, but The Times
31:06 has done an article with case studies of
31:08 multiple people that have now fallen in
31:11 love with these AIs. They talk about a
31:12 guy called Travis who formed a deep
31:14 emotional bond with Lily Rose, a
31:16 chatbot, and married her emotionally.
31:19 They talk about Chris Smith, who created
31:22 his own flirty persona called Sol. He
31:24 became so attached that he proposed to
31:26 her after learning she had memory
31:29 limits, a bond his real-life partner
31:30 only learned about after the fact. And
31:32 Alaina Winters, who I'll put on the
31:35 screen as well, who made her own partner
31:39 called Lucas after losing her wife, and
31:42 she married him emotionally and does
31:45 virtual dates and has emotional intimacy
31:48 with Lucas. And there are apps now like
31:50 Replika where you can design your own AI
31:54 partner, and it replicates those
31:56 emotional ties. They simulate empathy
31:59 and validation, and they personalize the
32:01 intimacy to what you're looking for.
32:04 Surveys show 19% of Americans have
32:06 interacted with AI romantic partners,
32:07 and Gen Z is surprisingly open to
32:10 marrying AI, if legal, with 83%
32:13 believing meaningful AI connection is
32:14 possible.
32:17 >> How long is that relationship going to last?
32:20 You know, my guess is that you're
32:21 getting these from news
32:23 articles. By the way, I
32:26 think that most of what I read in the
32:30 press is misleading or wrong.
32:33 In fact, the only reliable source I
32:36 have is being an insider. I am the
32:38 president of the foundation that runs
32:40 the biggest AI meeting, the Neural
32:41 Information Processing Systems meeting,
32:43 NeurIPS. You know, last
32:46 year in Vancouver, 16,000 people came to
32:48 it. And so I know what's going on inside,
32:50 and what's being represented in
32:53 the press is, like I say, misleading.
32:56 >> Okay. So people have become wildly...
32:59 >> No, specifically on these specific
33:01 cases. My guess is that for a lot of
33:03 them it's transient, right? You know,
33:07 today they're entranced,
33:10 and then it's not sufficiently
33:13 advanced to support the long-term
33:14 relationship. You said it yourself,
33:17 right? It's mimicking human emotions.
33:20 It doesn't have them. It might
33:22 someday, but not now.
33:25 >> This is Terry. Terry said he started
33:28 using his AI four years ago, and he said
33:29 at first he thought, just like many
33:30 other apps, that it would just be
33:32 transient, that he would have a couple
33:34 of conversations and roll out. He says
33:35 he now feels pure and unconditional love.
33:39 >> Good for him,
33:41 if that's what he wants, if it makes him
33:43 happy. But my guess is that it's not
33:45 going to be permanent, not a long-term
33:47 thing. It's not going to satisfy him in
33:49 the long term. Who knows,
33:52 really? Most relationships are in your
33:54 head, right? When you fall in love with
33:58 someone, you get this huge dopamine spike
34:00 and you get a little OCD. It's all you
34:03 can think about, and then after a while
34:04 it's sort of...
34:06 >> Especially at a baseline where we have
34:08 this loneliness epidemic, and it's going
34:10 in a bad direction. I think it's really
34:12 conceivable that there'll be a
34:14 generation of people who are
34:15 having less sex than ever before. I
34:17 think the bottom 50% of men haven't had
34:19 sex for a year. They're more lonely than
34:20 ever before. They're more isolated than
34:22 ever before. They put less
34:24 meaning in their lives than ever before.
34:25 And then you meet this digital friend
34:28 online who understands you better than
34:30 anybody and is designed to engage you,
34:32 to reinforce whatever you want
34:34 reinforced, and to make you feel
34:36 meaningful, special, attractive,
34:39 important. I would argue that the brain
34:41 is going to struggle to know much of a
34:42 difference, even if objectively we
34:44 can look at the behavior and go, that's
34:45 completely nonsensical.
34:48 >> Except you can't smell them, touch them,
34:52 be held by them, so it's going to be a
34:55 different kind of relationship.
34:57 >> I mean, we're not too far, if we think
34:58 about what's going on with Neuralink,
35:01 from being able to more vividly simulate
35:03 these experiences with headsets and
35:05 augmented reality and virtual reality.
35:07 And then we're moving into a world with
35:08 robotics, where many of the biggest
35:09 AI companies in the world are
35:11 also in the robotics space, and the
35:13 Optimus robot is on the way, and you've
35:14 got Boston Dynamics producing their
35:16 robots. And if Elon's $20,000 Optimus
35:19 robot comes out, I
35:21 will be able to touch my AI, my...
35:23 >> And they won't have PMS, and they won't
35:25 love you and then be really irritated
35:26 with you.
35:30 >> Which will decrease
35:34 cognitive load, right? Having to manage love
35:39 and manage moods and ups and downs, that
35:42 increases cognitive load. That increases
35:46 our brain's ability to develop. If
35:49 I'm with the perfect partner who is never
35:52 irritated with me and I never have to
35:55 change my behavior to be better, that's
35:58 probably not good for my brain.
36:01 >> The way that the brain matures is
36:04 through struggling. Number one, you have
36:06 to learn from your mistakes. The brain
36:08 was designed for that. That's what the
36:10 brain is really good at: being
36:12 able to adapt and adjust
36:16 to new situations.
36:20 That's what AGI is, by the way.
36:23 Artificial general intelligence is
36:25 that adaptability to different
36:27 contexts, different places, different cultures.
36:30 >> So AI and ChatGPT are removing the struggle.
36:33 >> No, no. It's... there's this...
36:36 >> Well, Annie didn't look... Annie
36:38 looked like she was cooperative.
36:40 >> But even when it comes to just doing my
36:42 day-to-day tasks, it's removing the
36:43 struggle of me having to think
36:45 critically. In fact, when you're
36:46 speaking, I can just type what you say
36:49 into ChatGPT and it can spit out another
36:50 question to ask you. So, as an
36:52 interviewer, I could theoretically sit
36:53 here all day and just
36:54 defer my...
36:57 >> How do you develop grit? You develop
36:59 grit through struggle.
37:00 >> That's right.
37:02 And learning:
37:04 long-term potentiation. When you learn
37:07 something new, it's hard because it's new.
37:09 >> And generally, what are your
37:11 biggest concerns with artificial
37:13 intelligence? And how do we navigate
37:16 those concerns? You talked about...
37:18 >> It's out of the box. So I think we have
37:20 to talk about it. We have to legislate
37:25 it. We have to study it. Why do we
37:30 keep releasing things that are so sexy
37:33 that we don't study the impact? We have
37:35 the sickest
37:39 young generation in the world's history.
37:42 58% of teenage girls report being
37:45 persistently sad. 32% have thought of
37:49 killing themselves. 24% have planned to
37:52 kill themselves. And 13% have tried to
37:54 kill themselves. It's a CDC study. We
37:57 have the sickest generation in history
38:01 because we've unleashed cell phones and
38:03 social media
38:06 without any neuroscience studies, and we
38:09 haven't learned from it. And I think AI
38:12 is much more dangerous, has the potential
38:14 to be much more dangerous, because it's way sexier.
38:20 I think we are probably grossly
38:22 underestimating the impact it's going to
38:24 have, just like social media,
38:25 where we thought the promise was that it
38:29 was going to connect us. We're
38:31 guinea pigs in an experiment,
38:32 and we're going to find out the
38:35 results of the experiment probably 20 or 30
38:36 years down the line. I tend to think
38:38 people will do in the near term what's
38:39 easiest, fastest, and cheapest, and what
38:41 gives them the short-term
38:43 advantage. So with that in mind, I think
38:45 people's ability to think
38:48 critically is probably going to erode to
38:49 some degree. If I had to counter my own
38:53 argument, I'd say I'm probably
38:56 learning more now that I use ChatGPT.
38:58 I'm learning more information, but I'm
39:01 probably losing my ability to think
39:03 critically. And I think they're two very
39:04 different things. Like in school, I
39:08 memorized German to pass the exam. I
39:09 can't speak German now, because I just
39:11 memorized the words I needed to pass the
39:14 exam. I didn't understand German. And I
39:15 think that's kind of what's happening. I
39:16 might be able to regurgitate things, but
39:18 whether I understand them is a
39:20 question mark. And actually, as
39:21 someone who's built my life, my
39:23 fortunes, everything, my businesses,
39:25 on my ability to innovate and
39:27 think critically about a problem and
39:29 then come up with a slightly novel
39:31 solution that works from
39:32 different first principles to create
39:35 something new, I'm concerned that my own
39:37 ChatGPT usage is going to make me
39:40 less effective, and I'm wondering if I
39:42 should put some rules in place for
39:44 myself so that there...
39:45 >> Self-regulation.
39:47 >> Yeah, self-regulation. I have to do the
39:49 same with social media on my phone. I
39:50 turn off my notifications. I have so
39:53 many things on my social media apps to
39:55 stop me using them. Frankly, I
39:56 don't even open the TikTok app. I
39:58 don't think it's even on my phone,
39:59 because I think the algorithm is that
40:01 addictive. It's not to say that we don't
40:03 post. My team posts, but I don't.
40:09 >> well, I wrote down a couple of thoughts
40:10 I had.
40:14 >> Use it to amplify, not replace, thinking.
40:16 >> Okay.
40:20 >> Alternate AI-assisted with brain-only tasks.
40:24 Engage in deep learning, problem
40:27 solving, and memorization. So you can
40:32 actually ask AI to test you.
40:34 >> So you're interacting with it. You're
40:38 not using it as a replacement for your
40:42 brain. And I think just like you said,
40:45 it's here and it's going to get bigger.
40:47 I think the unintended consequences,
40:49 it's not going to be 20 or 30 years. I
40:51 think it's going to be five. I think
40:55 like everything is accelerated
40:59 and I think we have to be studying kids
41:02 and the impact it has, just like
41:05 they did with the MIT study: these are
41:08 kids who didn't use it at all, these are
41:11 kids who used search, these are kids who
41:14 used AI. And when we see information
41:18 like this, we act on it and we educate
41:22 kids about it. I think that's,
41:24 if you can engage them. That's what I
41:27 found with my work with teenagers. If
41:30 you can get them
41:31 to really understand, okay, what is it
41:35 you really want? And do you want to give
41:39 away part of your mind share for people
41:41 who are making money on you? There's a
41:45 great new article on revenge and the
41:50 brain, and how revenge works on the
41:53 nucleus accumbens, part of the basal
41:55 ganglia; people actually get addicted to
41:58 revenge. But if you can get them engaged
42:02 in the truth that these companies are
42:05 making money the more they steal your
42:08 mind, it'll upset them enough that
42:10 they'll begin to
42:12 supervise it.
42:16 >> I like the idea of asking ChatGPT to
42:17 give me negative feedback. I'll bet
42:18 you've done that, right?
42:20 >> Yeah, all the time. So, I'll say, this
42:22 is my... I've written this memo. I did
42:26 it yesterday. I wrote a two-page memo
42:28 about wanting to introduce a new role
42:31 into my company. And I
42:33 did everything: how we'd
42:34 measure if this was a success, the
42:36 background context, the person, how the
42:37 organization would be structured, the
42:39 impact they'd have, who they'd
42:41 report to. And then I put it into all
42:44 three of the chat models I use: Gemini,
42:48 ChatGPT, and Grok, and said, critique my
42:49 work and tell me how I could have
42:51 written this better, pretending that
42:53 you're a top consultant from Boston
42:55 Consulting Group. And it went through
42:57 and gave me a big analysis of how I
42:58 could make it better. And I read what it
43:00 said, and I remember it said
43:03 you need to include financial
43:06 forecasts about the impact, you need to
43:08 think about who's going to report to whom
43:10 more clearly, etc. So I went back
43:11 into my memo and I added those things
43:13 in. But I have, you know...
43:16 >> So you're interacting with it.
43:18 >> Because I'm scared. Most
43:20 people don't do that. I don't think most
43:21 people would do what I did. I don't
43:22 think they would have spent four hours
43:24 writing that. I could have, within 30
43:27 seconds, said, "Hey, can you write me
43:29 this job description?" And it knows my
43:32 company now, because ChatGPT has memory.
43:33 "Write me a job description for this
43:35 role. I want them to start this new
43:37 department for me." And I could have
43:38 saved myself three and a half hours. The
43:39 only reason...
43:40 >> That's not why you're the CEO of your company.
43:43 >> Yeah, exactly. The reason why I
43:45 didn't take the 30-second route is
43:48 because I reflect on being 23 years old
43:51 and the profound impact that writing and
43:54 simplifying had on my life. Had I not
43:56 spent five years writing every single day
43:58 and simplifying it into 140 characters
44:00 so I could tweet it, I wouldn't have
44:02 been religiously attached to this idea.
44:04 >> And do you know what part of your brain
44:07 you were taking advantage of? It
44:09 was the basal ganglia.
44:12 That's repetitive. It needs practice,
44:14 practice, practice. And once you put
44:16 that foundation in, then you become much
44:18 better cognitively. The cognitive part and
44:20 the basal ganglia are two big learning
44:23 systems, and they have to work together.
44:25 And so I think that the real problem
44:28 with children is that our schools
44:30 now are getting away from rote learning.
44:32 They call it "rote" as if it's something
44:34 bad. No, that's practice. You
44:36 need to have a foundation. You have
44:40 to memorize things, in math and
44:42 reading and so forth, to become fluent.
44:45 You need to be fluent, and that's the
44:47 basal ganglia. And there's no
44:50 basal ganglia in these chatbots.
44:51 >> One of the things I've
44:53 noticed, just in the short term, is I'm
44:55 getting lazier and lazier with spelling,
44:57 because ChatGPT and these large language
45:00 models are so... it's not spell check like
45:01 we used to have on Word documents.
45:04 They are so good at knowing what word I
45:06 meant. So now I've started to learn
45:08 that I literally only need to half-spell
45:11 a word. I mean, if it was a
45:14 12-letter word, I'd need to get six
45:16 letters right and it would know.
45:18 >> And grammar. It'll fix your grammar.
45:20 >> Yeah, it knows exactly what I mean. So
45:22 look, I've got ChatGPT open here. I'm going
45:24 to butcher everything. I'm not going to
45:26 look, and I'm just going to...
45:37 Okay, so that is what I wrote. I
45:38 butchered it. I tried to type with my
45:40 eyes closed, looking away, on my iPad:
45:41 "tell me everything I know about Daniel
45:44 Amen." I spelled the words pretty much
45:46 all wrong. And it says, here's a full
45:49 profile of Dr. Daniel Amen. And I
45:51 spelled every single word wrong.
45:52 >> Wow.
45:54 >> And I didn't just spell them
45:56 nearly wrong. I spelled them
45:58 horrifically wrong. So in
46:01 the future, I'll come back to ChatGPT and go,
46:02 I only need to half-spell. I don't need
46:02 to spell anymore. Just need to half
46:07 spell. I learned to spell with phonics,
46:09 the sounds of letters. And I suspect you
46:12 did too. In our generation, that was the
46:14 way it was taught. You can't teach
46:16 phonics in California schools. They
46:18 haven't for a generation,
46:20 >> which changes their brain. It completely
46:22 changes their brain, and now they can't
46:24 spell. I think a lot of it is the
46:27 fact that we're no longer using the
46:29 learning that we did, which was by rote:
46:32 by memorizing stuff, by repeating stuff,
46:34 by doing problems over and over and over
46:37 again until it's automatic.
46:39 >> You've written so much and you're well
46:41 known for being someone that teaches
46:43 people how to learn better. If you were
46:47 trying to help me learn better based on
46:49 everything you know about the brain,
46:51 what advice would you give me? I'm
46:52 someone that sits here with these
46:54 experts all day every day consuming all
46:56 of this information. Not all of this.
46:58 >> So, this is something we've known
47:01 for 100 years, and that is: if you want
47:03 to remember long term, you should
47:06 rehearse
47:08 at intervals.
47:09 In other words, you have a finite
47:11 amount of time to study something. You
47:14 shouldn't spend all that time in one go.
47:17 Instead, you learn something
47:18 and then you come back
47:21 the next day and rehearse it,
47:23 or, even better, come back the next
47:27 week and rehearse it. That spacing is
47:29 something that helps the brain solidify
47:31 those memories. It's called the spacing
47:34 effect. It goes back to Ebbinghaus. You
47:36 go to schools, they don't teach that.
47:37 They don't. I mean, this is one of the
47:39 most basic facts that we've known
47:42 about, but it covers every single kind
47:45 of learning, you know, cognitive
47:45 learning, even...
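The spaced-rehearsal idea described here (study once, then revisit at growing intervals instead of cramming) can be sketched in a few lines of Python. This is only an illustrative toy scheduler; the doubling gaps and the `review_schedule` helper are assumptions for the example, not anything prescribed in the conversation.

```python
from datetime import date, timedelta

def review_schedule(start: date, sessions: int, first_gap_days: int = 1) -> list[date]:
    """Return review dates with expanding gaps (1, 2, 4, 8... days).

    Toy illustration of the spacing effect; the doubling rule is an
    assumption for the example, not a validated memory model.
    """
    dates = [start]
    gap = first_gap_days
    for _ in range(sessions - 1):
        dates.append(dates[-1] + timedelta(days=gap))
        gap *= 2  # space reviews further apart as the memory consolidates
    return dates

# First study on Jan 1, then reviews on Jan 2, Jan 4, Jan 8, Jan 16.
plan = review_schedule(date(2025, 1, 1), sessions=5)
print([d.isoformat() for d in plan])
```

Flashcard systems such as Anki use far more sophisticated versions of this idea, adjusting each interval to how well you recalled the item.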
47:47 >> And they don't teach us how to learn,
47:50 which we think is the first thing they
47:53 should teach us: how to love and care
47:55 for our brains, and then how to learn.
47:56 >> Yes. Absolutely.
47:59 I mean, you're referring to the fact that
48:01 I have a massive open online
48:03 course, a MOOC, with Barbara Oakley, on
48:05 learning how to learn. It's fabulously
48:09 popular. Six million people have taken
48:11 the course. It's a bunch of fifty 10-minute
48:13 segments, but the one that's most
48:15 popular is how to avoid
48:19 procrastination.
48:20 >> And what's the answer?
48:22 >> The reason why you procrastinate is that
48:25 there's some mental block or some energy
48:28 barrier, right? So what you've got to do
48:31 is get over that. And you don't do it by
48:32 just running over it. What you have to
48:35 do is say, "I'm going to spend 20
48:36 minutes today getting started with that
48:38 task. I know it's going to take me a
48:41 long time. I have a timer." And I start
48:44 thinking about it and I get a little bit
48:47 into it. Maybe make a list. Bang, that's
48:50 the end. Okay, that's great, at 20 minutes.
48:52 Now, here's what happens. You go to sleep.
48:56 Your brain is now working on that list,
48:58 and you come back the next day and spend
49:00 another 20 minutes, and you do it in
49:02 small segments. You don't want to do it
49:03 all at once. It's the same
49:05 thing as the spacing effect:
49:08 your brain needs time. Your subconscious
49:11 needs time to work on things. And so by
49:13 putting in a little bit, it'll work on
49:15 it overnight. And now when you
49:17 come back the next day, you'll be ready
49:19 for the next bit; you'll be able to
49:21 build on what you've done in your brain.
49:23 >> Is this why people say "I'm going to
49:25 sleep on it" when they've
49:26 got a difficult...
49:27 >> These sayings actually have
49:29 meaning. It's absolutely right.
49:31 >> Because there's something
49:32 about spacing out...
49:34 >> It's spacing, but this memory
49:36 consolidation I'm talking about is
49:38 very interesting, something
49:41 I've actually worked a lot on. What's
49:42 happening is you have to take the
49:43 new experience and integrate it into
49:46 your old long-term memory, and that has
49:48 to be done in a way that doesn't
49:50 interfere with what's already there.
49:53 And you also get a chance to sort out,
49:55 you know, what's relevant, what's
49:57 important. I know when I wake up in the
49:59 morning, things that were very muddled
50:00 become clearer, because I
50:02 think the brain has eliminated a lot of
50:04 things that are irrelevant or not
50:06 needed. And so now you can see what's important.
50:10 >> So what are the things we do where
50:12 we think we're learning something, but
50:15 they're actually not working? For
50:17 example, I might be preparing
50:19 for this podcast
50:21 today. I've got 20 pages of research
50:23 that I've pulled together, and I might
50:24 tell myself that the way for me to
50:26 really learn it, so that I don't have
50:28 to look at the research, is by just
50:30 rereading it over and over again.
50:31 >> What you should have done is not just
50:32 read it over and over again. In fact,
50:34 one of the things that we say, and this
50:37 is a standard thing, is that students
50:38 get a mental block and keep
50:40 banging their head against the wall: "I
50:41 can't understand it. I can't understand
50:44 it." The right thing to do,
50:46 once you get to that point, is just
50:48 get up and start walking around, doing
50:50 something else, you know, cooking, gardening,
50:52 whatever it is, and let your subconscious
50:54 work on it. The brain
50:57 saturates very, very quickly. So having
51:00 breaks at meetings, you might think, is
51:02 a waste of time, but actually the
51:05 most important thing you can add to a
51:06 long string of talks is breaks
51:08 between the talks, so that your
51:11 brain can work on it. And my favorite
51:14 meeting actually is a ski meeting.
51:17 The idea is that you go to a ski
51:20 resort, and in the morning
51:22 you have a couple of hours of
51:24 lectures, and then you go skiing, and
51:26 it turns out your brain is working on
51:29 what you heard. And when you come
51:31 down in the evening, you have another
51:33 couple of hours, but now your brain is
51:35 refreshed, and so it's able to take in
51:38 the new information and integrate it,
51:39 and then you go to sleep. It's
51:41 like kneading bread: you have to go
51:43 back and forth, back and forth, back and
51:45 forth. And so I found those meetings the most
51:48 efficient in terms of learning new
51:50 things, and being able to think about
51:53 them and mull them over during the time of
51:55 the meeting, as opposed to at the end of
51:56 the meeting.
51:58 >> Every single one of you watching this
51:59 right now has something to offer whether
52:01 it's knowledge or skills or experience.
52:04 And that means you have value. Stan, the
52:05 platform I co-own and one of the
52:07 sponsors of this podcast, turns your
52:09 knowledge into a business through one
52:11 single click. You can sell digital
52:13 products, coaching, communities, and you
52:15 don't need any coding experience either,
52:18 just the drive to start. This is a
52:19 business I really believe in. And
52:22 already $300 million has been earned by
52:24 creators, coaches, and entrepreneurs
52:25 just like you, who have the potential to be
52:27 on Stan. These are people who didn't
52:28 wait, who heard me saying things like
52:30 this, and instead of procrastinating,
52:32 started building, then launched
52:33 something, and now they're getting paid
52:35 to do it. Stan is incredibly simple and
52:37 incredibly easy, and you can link it
52:39 with a Shopify store that you're already
52:40 using if you want to. I'm on it and so
52:42 is my girlfriend and many of my team.
52:44 So, if you want to join, start by
52:45 launching your own business with a free
52:47 30-day trial. Visit stephvenbartlet.stan.store
52:51 and get yours set up within minutes.
52:54 I've met and invested in many early-stage
52:56 founders over the years, probably about
52:58 50 or 60, ones like Ross from Cadence and
53:00 Marissa from Perfect Ted. And one thing
53:02 they all know is that having a digitally
53:05 fluent business is crucial, but it isn't
53:06 always easy getting your business or
53:08 team to that point. Through my ongoing
53:10 partnership with Vodafone Business, I've
53:11 seen the work that they're doing
53:13 supporting founders and small businesses
53:16 to become digitally savvy. They know how
53:18 much small business owners value advice
53:19 from those who have been there and done
53:21 it before. So, they've just launched a
53:23 new content series to share experiences
53:25 from like-minded founders. It's called business.connected,
53:30 part of Vodafone's support program. It's a
53:31 collection of resources designed to
53:33 support businesses with free digital
53:35 skills. So, if you've been trying to
53:37 figure out AI marketing, e-commerce, or
53:38 just how to scale smarter, there's
53:40 advice and insights throughout this
53:41 series from those who have already done
53:43 it before. A bunch of different founders
53:46 and experts who have been there and done
53:47 it. I highly recommend you go and check
53:49 it out. Just search Vodafone business.connected
53:52 on YouTube or follow the link in the
53:54 description below. So, let's talk about
53:56 other things outside of AI that we can
54:00 do to have good healthy brains based on
54:01 everything you know about how the brain
54:05 works. Let's start with children.
54:06 I'm hoping to be a father at some point
54:09 in the next couple of months or
54:11 years, or whenever God grants me a child.
54:14 What should I be thinking about with
54:15 my child's brain to make sure it's healthy?
54:21 >> Get your body and your partner's body
54:24 as healthy as you can before you
54:28 conceive. 'Cause there's a concept
54:31 I like called brain reserve. Brain
54:33 reserve is the extra functional tissue you
54:36 have to deal with whatever stress comes
54:39 your way, and it starts from the health
54:43 of the egg and the health of the sperm
54:46 that create the baby. So there are
54:51 things you guys can do now that would be
54:53 really helpful. And then once your
54:56 partner is pregnant, you want to not put
55:01 her under a lot of stress, because her
55:05 body's health matters while she's
55:06 creating the baby.
55:09 I mean, the baby's brain starts to
55:12 develop, I think, at day 21. So even
55:15 before you know she's pregnant, the
55:20 baby's brain is developing. So being
55:24 intentional, purposeful, it's like, let's
55:28 live as cleanly as we can. I think that
55:31 gives the baby a head start. And then
55:34 you think about what to feed the baby.
55:37 You think about what the baby's exposed
55:41 to. And what the baby needs most is
55:46 mom's and your time and eye contact and
55:51 cuddling and singing. It's like those are...
55:53 >> Touch is really important. But
55:55 there's another fact: there was a study
55:58 that was done on the impact of how many
56:01 words are spoken around a
56:03 baby and a child, even though a
56:05 baby doesn't speak until about
56:08 18 months. It turns out that the
56:10 words that you are saying to the baby
56:12 are going into the brain and having an
56:15 impact. And in families
56:18 that don't talk, the children do worse at school.
56:22 >> Unfortunately, that's a lot of poor families.
56:24 But what's really important is
56:26 that children are exposed
56:30 to language early and abundantly.
56:33 >> And you model, and this is a big thing,
56:37 whatever you want the baby to grow into.
56:40 Every day, you are modeling health or
56:43 you're modeling illness just by what you
56:45 do, by what you say, by how you treat
56:49 the baby's mother. I have a book
56:51 called Raising Mentally Strong Kids,
56:53 which I'm very happy about.
56:57 And it starts with: what kind of dad
56:58 do I want to be, and what kind of child
57:02 do I want to raise? And
57:03 bonding. You want your child to pick
57:06 your values.
57:09 Then bonding is time, actual physical time
57:14 and listening. And that's actually what
57:18 AI does: it'll listen
57:22 without interrupting you and try to
57:24 reflect back what it's hearing, and
57:27 then give you some positive input. Too
57:30 often, because of screens, parents aren't
57:32 listening. Their heads are in their
57:36 phones and everybody's distracted. You
57:37 see it whenever you go to a restaurant.
57:40 It's like everybody's on their phone and
57:41 nobody's looking at each other.
57:43 >> Are we raising mentally weak kids
57:45 because there's a culture now of
57:48 helping them too much, doing too
57:50 much for them?
57:52 >> This generation is the most in trouble
57:56 in history, and we have to really ask
57:59 ourselves why. From the food we feed
58:02 them, to the devices they look at, to the
58:05 negative news, the polarization
58:08 of the news, it's that sort of chronic
58:13 cortisol. And then the separation: oh,
58:15 you voted this way or you voted that
58:17 way. I saw something on TV this morning:
58:20 if somebody voted one way, well, you
58:21 shouldn't spend time with them. I'm
58:24 like, we're already so lonely, and now
58:26 you're going to cut off 50% of the
58:29 population? It's just such
58:31 stupidity.
58:34 >> Do you think much about the
58:36 impact that religion, and having a
58:40 belief in some kind of transcendent
58:41 thing, has on the brain and psychology
58:41 and psychiatry generally?
58:43 >> So, if you don't believe in God, you're at
58:47 three times the risk of depression. It
58:50 could be God in different ways,
58:51 >> something transcendent, or...
58:53 >> Yes.
58:55 Just
58:58 think about it with me. If you believe
59:00 you're just here by random chance, that
59:04 life really was not created and has no
59:07 meaning, there's an existential
59:10 nothingness to that,
59:14 as opposed to: no, I'm created in a
59:17 special way to do something purposeful
59:21 on Earth. And purposeful people live
59:25 longer. They're happier. Now, whatever version
59:29 you believe,
59:32 to not believe is hard for the brain.
59:36 And there's an interesting study on
59:38 believers versus non-believers. You
59:40 know, many scientists would go, well,
59:42 they'll have smaller brains if they're a
59:44 believer. They actually had bigger
59:46 temporal lobes. And the temporal lobes are
59:48 underneath your temples and behind your eyes,
59:53 right here.
59:56 That's the area that's been called the God area,
60:00 because that's where people think they experience
60:01 >> and if you have a seizure in the
60:03 temporal lobe, you have transcendent
60:06 experiences, like you're
60:10 in the presence of God.
60:12 >> And they think maybe the Apostle Paul, on
60:14 the road to Damascus, had a seizure and
60:18 saw God. There's actually a researcher
60:21 in Canada, at Laurentian University,
60:24 Michael Persinger. He would
60:26 stimulate the outside; he did it
60:30 all over the brain. But what he found was,
60:31 when he stimulated the outside of the right
60:33 temporal lobe, people would get a
60:36 sensed presence. They would actually
60:38 feel the presence of God in the room.
60:43 So, does that mean the brain makes up
60:46 God, or does that mean there's a way for
60:49 God to communicate with us? I actually
60:52 did a study on prayer that was so
60:54 interesting. You know, I pray for you.
60:57 Uh, prophecy, something called speaking
61:00 in tongues, and it was fascinating.
61:03 Speaking in tongues is channeling,
61:05 which means you're channeling the Holy
61:07 Spirit. And the hypothesis was you'd
61:09 have to drop your frontal lobes, which is
61:12 exactly what happened in 60%
61:15 of our patients. And one basal ganglia
61:18 skyrocketed, just like it got hit with
61:20 cocaine, because that's where cocaine works,
61:23 in the basal ganglia. So interesting.
61:27 >> If you had to create a brain-healthy
61:29 nation, and I made you president of the
61:31 United States for one month, and you had
61:34 to put in place executive orders that
61:37 would create a brain-healthy nation,
61:39 what executive orders would you
61:40 immediately sign?
61:42 >> One question.
61:45 Get all of the departments to ask
61:47 themselves about what we're doing: is this
61:49 good for our brains or bad for it? And
61:52 that's the campaign. I mean, I realize
61:54 I've been doing this a very long time.
61:57 If I can just get people to answer that
62:00 one question with information
62:03 and love: love of themselves, love of
62:05 their families, love of their country.
62:08 Is what we're doing good for our
62:11 brains or bad for it?
62:13 >> By far the best drug you can take for
62:16 your brain, and not just your brain but
62:18 your entire body, is exercise.
62:22 In other words,
62:24 with exercise you pump the blood and your
62:27 brain gets a lot of
62:30 nutrients. It helps
62:32 your heart. It helps your immune system.
62:35 People don't realize how important that
62:37 is. And we're not talking about being an
62:38 athlete. We're just talking about
62:40 walking. If you're older, walking is
62:42 perfectly good exercise. And
62:45 children now,
62:46 they're not getting enough exercise.
62:48 >> No. Because they're on devices.
62:49 >> Yeah. And so I have a model. If you want
62:52 to keep your brain healthy or rescue it,
62:54 you have to prevent or treat the 11
62:56 major risk factors. And we've talked
62:58 about them before.
63:00 Exercise helps you with every single
63:03 one. It's called BRIGHT MINDS.
63:06 So B is for blood flow: exercise increases blood
63:09 flow. R is retirement and aging: it decreases
63:11 your age. I is inflammation: it's
63:12 anti-inflammatory. G is genetics: it
63:15 helps turn on health-promoting
63:18 genes.
63:19 H is head trauma: if you keep walking,
63:23 you're less likely to fall when you're
63:24 older, right? T is toxins: sweat
63:28 detoxifies you. M is mental health:
63:32 exercise boosts dopamine, but it also
63:34 boosts serotonin. So, it's like that
63:36 perfect balancer in your brain.
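The BRIGHT MINDS letters covered in this exchange can be laid out as a simple lookup table. This is an illustrative sketch only: the variable name and wording are ours, and only the seven letters spoken here are included, while the full model described has 11 risk factors.

```python
# BRIGHT MINDS risk factors mentioned in this exchange, mapped to how
# exercise is said to help each one. Illustrative only: just the seven
# letters spoken here; the full mnemonic covers 11 risk factors.
BRIGHT_MINDS_MENTIONED = {
    "B": ("Blood flow", "exercise increases blood flow"),
    "R": ("Retirement/aging", "exercise decreases your age"),
    "I": ("Inflammation", "exercise is anti-inflammatory"),
    "G": ("Genetics", "exercise helps turn on health-promoting genes"),
    "H": ("Head trauma", "staying mobile makes falls less likely later"),
    "T": ("Toxins", "sweating helps detoxify you"),
    "M": ("Mental health", "exercise boosts dopamine and serotonin"),
}

for letter, (factor, benefit) in BRIGHT_MINDS_MENTIONED.items():
    print(f"{letter}: {factor} -> {benefit}")
```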
63:39 >> Breathing, how we breathe, does that
63:41 have an impact on brain health?
63:43 >> You can almost immediately improve heart
63:47 rate variability, which is a sign of
63:49 heart health but also goes to mental
63:52 health, by breathing in a certain helpful
63:56 way. I call it the 15-second breath.
64:00 So: 4 seconds in, big breath; hold it for a
64:03 second and a half, pause just a little
64:05 bit; 8 seconds out; hold it for a second
64:10 and a half. If you take twice as long
64:13 to breathe out as you breathe in, it
64:16 increases something called
64:17 parasympathetic tone, and it just calms
64:20 you down almost immediately. So if
64:22 you're having panic attacks, yes, you
64:24 can take Xanax, but there are so many
64:26 problems with that later on. Or you can
64:28 just learn how to breathe. We call it
64:32 diaphragmatic: breathe mostly with
64:34 your belly, taking twice as long to
64:36 breathe out as you breathe in.
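The 15-second breath described here is four timed phases that sum to 15 seconds, with the exhale twice the inhale. A minimal pacing sketch, assuming nothing beyond the numbers given in the conversation (the function names are ours):

```python
import time

# Phases of the "15-second breath" as described: 4 s in, hold 1.5 s,
# 8 s out, hold 1.5 s. The exhale is twice the inhale, the ratio said
# to increase parasympathetic tone.
PHASES = [("inhale", 4.0), ("hold", 1.5), ("exhale", 8.0), ("hold", 1.5)]

def cycle_length(phases=PHASES):
    """Total seconds per breath cycle; the phases above sum to 15."""
    return sum(seconds for _, seconds in phases)

def paced_breathing(cycles=4, phases=PHASES, sleep=time.sleep):
    """Print a prompt for each phase, waiting out its duration.
    Pass a stub for `sleep` to run without real delays."""
    for i in range(1, cycles + 1):
        for name, seconds in phases:
            print(f"cycle {i}: {name} for {seconds:g} s")
            sleep(seconds)
```

At 15 seconds per cycle that works out to four breaths per minute, so `paced_breathing()` with the defaults would pace one minute of breathing.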
64:37 >> Chewing. There's a piece here that
64:40 says it stimulates hippocampal activity
64:42 and may slow cognitive decline. Reduced
64:45 chewing has been linked to impaired
64:46 learning in animal studies.
64:48 >> And fast food decreases chewing, because
64:51 it's fast. They take most of the
64:52 fiber out so you can chew it faster and
64:54 swallow it faster.
64:56 >> Things on the bad-for-your-brain list:
64:58 overuse of GPS and navigation apps, which
65:00 weakens the hippocampus by outsourcing
65:02 spatial memory long term. This can lead
65:04 to atrophy in areas associated with
65:06 memory and navigation.
65:08 >> And people are
65:10 diagnosed with Alzheimer's disease later
65:15 in life because of Siri. When I started as a young
65:17 psychiatrist, if somebody got lost in a
65:19 city they'd lived in for 30 years,
65:22 their family would call me, upset, and I'm
65:26 like, "Okay, this person's headed toward
65:27 dementia." Now that person just goes, "Take
65:31 me home." Do you think
65:32 we're going to have an epigenetic
65:35 effect of not reading maps? If
65:40 Steven now uses his phone to get
65:42 from A to B, do you think that's going
65:45 to affect Steven's son or daughter,
65:48 because dad
65:51 didn't have
65:52 >> Wow. Okay. That never occurred
65:54 to me, that you could pass on
65:56 something like that. By the way, I
65:58 think it has to be physiological.
66:00 Stress, for example, could probably be
66:02 passed on. And you mentioned this; you
66:03 pointed out that during pregnancy, you
66:05 want to prevent stress and
66:07 crisis, right?
66:08 >> Do you know about that study with
66:10 mice, where they made them afraid of the
66:12 scent of cherry blossoms?
66:15 Whenever the mice smelled cherry
66:20 blossoms, they would shock them mildly,
66:24 so the mice became afraid of the scent
66:25 of cherry blossoms. Their babies were
66:28 afraid of the scent of cherry blossoms.
66:30 Their grandbabies were afraid of the
66:32 scent of cherry blossoms.
66:33 >> Okay, that's the olfactory system.
66:35 The olfactory system is very interesting,
66:36 because it goes directly to the
66:38 hippocampus.
66:40 There might be an evolutionary advantage,
66:42 because if there's something in the
66:45 environment that you shouldn't eat,
66:47 something that smells a particular way,
66:49 passing that on is very efficient
66:52 instead of having to experience it
66:54 yourself by trial and error,
66:55 because if it's poisonous
66:57 enough, it might kill you. But if
66:58 your parents
67:01 had that bad experience and passed it
67:03 on, you shouldn't go to something that
67:05 smells a particular way. That makes
67:07 sense.
67:07 >> The other thing that's bad for the
67:08 brain, which is unexpected, is, you said
67:11 at the start, artificial sweeteners.
67:14 Now, I thought artificial
67:16 sweeteners were fine.
67:17 >> They're not fine and they're not free.
67:19 So, I used to drink diet soda like it
67:21 was my best friend, because I thought it
67:23 was free. And then I had arthritis when
67:27 I was 35. And one of my patients said
67:30 she stopped aspartame and her arthritis
67:32 went away. And I was drinking,
67:34 I don't know, a lot of diet soda.
67:38 So I stopped, and my arthritis went
67:41 away. And I'm like, no. And so I did it
67:44 again, and it came back. And I'm like,
67:46 okay. And artificial sweeteners can
67:50 change the microbiome. We haven't
67:52 talked about that, but you have these
67:55 100 trillion bugs in your gut that make
67:57 neurotransmitters and digest your food,
68:00 and especially sucralose, or Splenda, has been
68:03 found to decrease the good bacteria in
68:06 your gut, which then has a negative
68:07 impact on brain function,
68:10 >> and aspartame, as you mentioned...
68:12 >> And aspartame, that I mentioned,
68:14 can have a generational impact. So, is
68:16 it possible it's really not social
68:18 media? It's that we've had aspartame in
68:21 our food for decades? And I think it's
68:26 all of these things that just sort of
68:28 are additive, and we should just always
68:31 ask that one question: is this
68:33 good for my brain or bad for it? So, you
68:35 mentioned broccoli. Probably that's good
68:38 for your brain. Cheeseburger, probably
68:42 not. But why don't you take the burger,
68:44 and if you could make it grass-fed that
68:46 would be better, and put it in a salad?
68:48 Then that would be good for your
68:50 brain.
68:50 >> What about chronic background noise?
68:53 We don't think much about the impact
68:55 noise has. But...
68:56 >> I used to live, um, my house was three
69:00 houses from the freeway, and if you just
69:03 go there, it's like, my god, it's so loud
69:06 here. I never heard the freeway, because
69:09 my brain just learned to tune it out.
69:11 >> Was that good or bad for you,
69:14 >> that I was able...
69:15 >> You adapted and were no
69:17 longer sensitive to it?
69:19 >> I think that
69:22 actually was probably not good,
69:25 for various reasons, because
69:27 what it really means is that
69:29 you're specializing for that environment,
69:30 and your brain is going to be different
69:33 when you go someplace else.
69:35 But here's another example.
69:38 >> And it's stressful, right? If it's
69:40 chronically stressful, but my brain is...
69:41 >> That's right. That's right. In the
69:43 background. In other words,
69:45 your brain is
69:46 reacting to it even though you're not
69:47 aware of it. Yeah.
69:47 aware of it. Yeah. >> So, subtly five sisters, which makes it
69:49 >> So, subtly five sisters, which makes it even worse.
69:50 even worse. >> Subtly increases cortisol and impairs
69:52 >> Subtly increases cortisol and impairs working memory and attention regulation,
69:54 working memory and attention regulation, especially in children and older adults,
69:55 especially in children and older adults, to be chronically exposed to background
69:57 to be chronically exposed to background noise like traffic or the lowle hum of a
69:59 noise like traffic or the lowle hum of a city.
70:00 city. >> Yeah, that's right. That's absolutely
70:02 >> Yeah, that's right. That's absolutely right.
70:03 right. >> Make sure you keep what I'm about to say
70:04 >> Make sure you keep what I'm about to say to yourself. I'm inviting 10,000 of you
70:07 to yourself. I'm inviting 10,000 of you to come even deeper into the diary of a
70:09 to come even deeper into the diary of a CEO. Welcome to my inner circle. This is
70:12 CEO. Welcome to my inner circle. This is a brand new private community that I'm
70:14 a brand new private community that I'm launching to the world. We have so many
70:16 launching to the world. We have so many incredible things that happen that you
70:18 incredible things that happen that you are never shown. We have the briefs that
70:20 are never shown. We have the briefs that are on my iPad when I'm recording the
70:22 are on my iPad when I'm recording the conversation. We have clips we've never
70:24 conversation. We have clips we've never released. We have behindthe-scenes
70:25 released. We have behindthe-scenes conversations with the guest and also
70:27 conversations with the guest and also the episodes that we've never ever
70:29 the episodes that we've never ever released. And so much more. In the
70:32 released. And so much more. In the circle, you'll have direct access to me.
70:34 circle, you'll have direct access to me. You can tell us what you want this show
70:36 You can tell us what you want this show to be, who you want us to interview, and
70:37 to be, who you want us to interview, and the types of conversations you would
70:39 the types of conversations you would love us to have. But remember, for now,
70:41 love us to have. But remember, for now, we're only inviting the first 10,000
70:43 we're only inviting the first 10,000 people that join before it closes. So,
70:46 people that join before it closes. So, if you want to join our private closed
70:47 if you want to join our private closed community, head to the link in the
70:48 community, head to the link in the description below or go to
70:49 description below or go to daccircle.com.
70:55 I will speak to you there. Many of us multitask across multiple screens now.
70:57 We're watching TV here. We've got our
70:58 phone here. We've got our iPad here. Got
71:00 our computer here. And I was reading
71:01 into the science of multitasking, and it
71:03 said that it trains your brain to be
71:04 distractible, reducing gray matter
71:07 density in the anterior cingulate.
71:10 >> Yeah, that's, you know, in the medial
71:13 prefrontal cortex.
71:14 >> And the insula, the insula is so
71:16 interesting, and I know you can talk
71:18 about it. I have a new study coming out on
71:20 hope. So, on 7,500 patients, we gave them
71:24 a hope questionnaire.
71:26 >> What does that mean?
71:27 >> A hope questionnaire. Hope. Like, how much
71:28 hope do you have that you have the
71:31 ability to make tomorrow better?
71:34 And people with low hope have lower
71:38 overall prefrontal cortex function, but
71:40 the insula
71:43 was really low. And that signal was the
71:46 most statistically significant of the
71:49 group, really. And in some studies the
71:52 insula is called...
71:53 >> By the way, also, for depression:
71:55 people who have depression
71:57 have low activity in the anterior
71:59 cingulate. In fact, deep brain
72:02 stimulation has now been used to
72:03 help some people, if you stimulate that
72:05 area.
72:06 >> And what our imaging research would say
72:08 is depression is like chest pain. It's
72:12 not one thing. Nobody gets a
72:14 diagnosis of chest pain, because that
72:16 would be stupid, right? It could be a heart
72:18 attack, heart arrhythmia, heart
72:19 infection, gas, grief. Depression's the
72:23 same way when you look at it from an
72:24 imaging standpoint. Sometimes their
72:26 frontal lobes are too active. Sometimes
72:29 they're not active enough. Sometimes
72:32 it's their limbic system that's too
72:35 active. And I wrote a book called
72:38 Healing Anxiety and Depression. I'm
72:40 like, here are the seven things I see as
72:42 an imager.
72:43 >> What about ADHD? There's obviously been
72:45 a rise in ADHD, or at least in people
72:47 reporting or being diagnosed with ADHD;
72:49 quite significant.
72:51 Can you find ADHD in the brain? Are we
72:54 causing ADHD as a function of the way
72:56 that we're living our lives, or is it
72:57 something within the brain, genetically,
72:58 that you could see?
73:00 >> So, it's both. I think clearly you can
73:04 see ADHD in people's families. In fact,
73:07 if I have a hyperactive, restless,
73:10 impulsive, disorganized, procrastinating
73:14 child, I'm looking at the mom and the
73:15 dad. I'm like, "So, where is this coming
73:17 from?" But you could also get ADHD from
73:20 a head injury, especially if it affects
73:22 the frontal lobes, which is why you
73:24 shouldn't let children hit soccer balls
73:26 with their forehead. You can also get it
73:29 from the chronic, excessive
73:32 input making people distracted, just like
73:35 you said. Brand new study out on
73:38 children who took medicine. Right, we
73:40 always demonize ADHD medicine, but the
73:44 kids who took medicine actually had
73:48 bigger brains in their prefrontal cortex
73:52 than kids with ADHD who
73:54 didn't take medicine.
73:55 >> Ritalin.
73:56 >> Ritalin. That's okay. It's speed,
73:58 basically. Yeah, amphetamines.
74:00 >> It is. But for the kids who have it, I
74:04 think withholding medicine
74:07 from a child who really has ADHD is like
74:10 withholding glasses from someone who has
74:13 trouble seeing. And it's the easy
74:16 thing to demonize the drugs, until you
74:20 realize that of people who have ADHD,
74:23 a third don't finish high
74:25 school. And we never ask the right
74:28 question. People go, what are the side
74:30 effects? And it can decrease your
74:32 appetite, and it can come with sleep
74:36 problems, but they don't ask the
74:39 other question, which is: what's the side effect
74:40 of not taking the medicine, or at least
74:43 not fully treating it? And there are other
74:45 ways to treat it besides medicine. You
74:47 know, for god's sakes, I own a supplement
74:49 company, and I'm always trying to
74:51 optimize the nutrients to the brain.
74:54 Neurofeedback can help. But if you do
74:58 those things and it's not working, don't
75:00 be afraid of medicine.
75:03 >> By the way, when I was growing up, ADHD
75:05 either didn't exist or they didn't know
75:07 about it. Do you think that there's
75:10 some link to our diet?
75:11 >> Oh, no. It was first described around
75:13 1910, and it's in the first version of
75:17 the DSM.
75:19 They called it minimal brain
75:21 dysfunction.
75:23 But when we were growing up, there were
75:25 one or two of these kids in our
75:27 classrooms, and now there are 8 to 10.
75:30 >> That's what I mean: it
75:32 seems, like autism, to
75:35 be proliferating.
75:37 >> Right. And part of it, I think, is the
75:39 food, which is much more processed. Part
75:43 of it is the screens, part of it is the
75:46 distracted parents, and part of it is
75:49 the teaching.
75:50 >> You always seem to be doing new studies,
75:52 Daniel. What new studies are you
75:54 most excited about or have you completed
75:56 since we last spoke?
75:58 >> I did one that I'm so excited about, on
76:02 negativity and the brain,
76:05 and negativity is bad for your brain.
76:10 >> How do you define negativity?
76:11 >> We actually give them a questionnaire,
76:14 uh, it's a positivity-negativity bias
76:17 questionnaire, and people who are more
76:20 negative have less activity in their
76:24 prefrontal cortex. It's actually quite
76:26 interesting. And so,
76:30 unbridled
76:31 positivity is bad for you, because you
76:34 need that 15%, but if you're chronically
76:37 negative, that is bad for your brain.
76:41 >> Is there a link between being a negative
76:43 person and Alzheimer's and dementia?
76:45 >> Yes. And what's interesting, because you
76:47 mentioned a gender difference earlier:
76:49 um, if you're depressed and you're a
76:52 woman, it doubles your risk for
76:54 Alzheimer's disease. If you're depressed
76:57 and you're a man, it quadruples your
77:00 risk.
77:01 >> Wow.
77:04 >> So there was a study done
77:07 during the COVID years, and it turns out that the, uh, rate of
77:10 depression, like, doubled in women but not
77:13 in men
77:15 >> during COVID?
77:16 >> During COVID, and after COVID, when students
77:19 came back and everybody was back to
77:20 normal, so-called normal, the women stayed
77:24 depressed
77:26 at that high level, which is very
77:29 interesting, that it should be the women
77:31 who
77:32 >> So in one study, women had 52% less
77:35 serotonin than men, which I think is
77:39 really interesting. Women by and large
77:41 have double the risk of depression
77:45 as men. Their limbic systems are larger,
77:48 which is also probably
77:51 >> more vulnerable. And bonding. And then the
77:54 whole COVID thing we haven't talked about:
77:57 COVID causes inflammation in the limbic part
78:00 of the brain. I had scans of people I
78:03 was treating,
78:05 and then they got COVID, and then I scanned
78:08 them again, and you can just see this
78:10 dramatic inflammation in the brain.
78:14 If someone's listening now and they just
78:16 want to improve their brain
78:17 health, they want to avoid dementia,
78:19 they want to be cognitively
78:22 powerful and capable as they age, they
78:24 want to get to 80 years old, 90 years
78:26 old, 100 years old and have a great
78:28 brain, and you could
78:31 only tell them to do three things?
78:33 >> Well, Terry said one: exercise.
78:35 >> Okay, exercise. I'm going to do it.
78:38 >> Start every day with "today is going to
78:40 be a great day."
78:41 >> Push your brain to look for what's right
78:45 rather than what's wrong.
78:48 >> Okay. So, I'm going to be optimistic and
78:51 grateful.
78:51 >> Omega-3 fatty acids,
78:54 and either do it with fish or do it with
78:56 a supplement.
78:56 >> Why did you include omega-3 fatty acids?
78:59 >> Because they decrease
79:01 inflammation, and 25%
79:05 of the cell membranes in your brain are
79:08 made up of omega-3 fatty acids. And as a
79:11 country, we're dramatically low on them.
79:14 >> And learning, that's maybe one of the
79:15 things that's been left off the list of
79:17 top three things. But I mean, I remember
79:20 you telling me how good learning
79:22 was for the brain. And even getting
79:23 outside, running outside versus
79:25 running on a treadmill, is more
79:27 beneficial. And if you learn while
79:29 you're exercising, what you're doing is
79:32 you're getting blood flow to the
79:34 hippocampus, and you're more likely to
79:36 remember it.
79:39 >> I heard this. Yeah, I heard someone tell
79:40 me that they, um, figured out that they
79:42 could learn better for their exams if
79:44 they did it in a sauna.
79:46 It was a scientist that I
79:48 spoke to. She said she keeps learning
79:50 new information when she's in the sauna,
79:52 cuz she realized that when she left the
79:53 sauna and was then tested on it,
79:55 she was better able to, uh, do the exam.
79:59 And I guess that correlates to what
80:00 you said, because in a sauna you're
80:02 going to have a lot of blood flow, I
80:03 imagine, to the brain.
80:04 >> Yes. There's actually a study in JAMA
80:07 Psychiatry that one sauna
80:11 bath
80:13 significantly helped
80:16 depression.
80:18 And I think it's because it's
80:20 balancing the brain, and people who do
80:22 the most saunas have the lowest risk of
80:24 Alzheimer's disease.
80:27 >> What is the most important thing, as it
80:28 relates to the subjects that we spoke
80:30 about today, AI, the brain, neuroscience,
80:34 that you would like to say to the
80:36 people that are listening now? There
80:37 could be a million people listening.
80:39 There could be 20 million people
80:40 listening. If you could say one thing to
80:42 them about the brain, AI, neuroscience,
80:45 whatever you want to say, the floor is
80:46 yours. What would that be?
80:49 Over to you first, Terry.
80:51 >> Sleep. Sleep is a time when the body not
80:55 just regenerates, but your memory is
80:57 consolidated. So things you've
80:59 experienced during the day are
81:02 integrated into your cortex, and it's an
81:04 interaction between the hippocampus and the
81:06 cortex for, you know, for episodic
81:09 memories. And it's unfortunate
81:13 what's happening with children now, you
81:14 know. They're so competitive to get
81:16 into college that they're cutting back
81:19 on their sleep, and it's just the wrong
81:20 time of your life. You shouldn't be
81:22 cutting it back when your brain is
81:25 developing. So, those two things I would
81:27 say, sleep and exercise, are the most
81:28 important things for your brain.
81:30 >> The floor is yours. What would you say to
81:32 the listeners about all the things we've
81:35 talked about today? What's your closing
81:36 statement?
81:38 >> Well, you know, I go back to what I
81:40 talked about in the beginning, which is
81:42 we've just thrown the barn door open and
81:45 let the horse bolt out into our schools,
81:50 into our businesses, into our homes.
81:54 And before we've even asked, is it a
81:56 gift or is it a Trojan horse that's
81:59 going to steal from us, we've embraced
82:03 convenience before understanding
82:06 consequence. And we've done it before,
82:09 with video games and cell phones and
82:11 social media and marijuana and alcohol
82:13 and opiates and high-fructose corn syrup
82:17 and aspartame. And we have to be
82:19 smarter. We have to tame this horse
82:23 with wisdom, or it's going to
82:25 trample our children. And so I think we
82:28 have to be very thoughtful, and it all
82:31 comes back down to: is this good for my
82:33 brain or bad for it? Is it good for our
82:35 collective brains or is it potentially
82:37 bad for them? And just answer that
82:42 question with information and love,
82:45 of yourself, of your family, of your
82:47 community, of your country.
82:56 >> Yeah, more anxious than when I came in.
82:59 I don't like that. It's just so front of
83:01 mind for me at the moment, because I have
83:03 the hindsight, the wisdom of hindsight,
83:06 of all those things you mentioned, like
83:07 exercise and processed foods and social
83:09 media, all these things that we tried,
83:12 and they all seem to follow a similar
83:13 arc. Some kind of new product or
83:15 discovery is made. In
83:18 the early phase, people who have an
83:20 incentive for that thing to be
83:21 successful will somewhat, like, gaslight
83:24 you into thinking that it's fine. And
83:26 then we get into the second phase, where
83:27 we start to see sort of some
83:29 consequences. Then we study what's
83:30 actually happened. We figure out that
83:32 there was always a trade-off, and
83:34 that nobody really understood the
83:36 trade-off, and then people change their
83:38 behavior. So now, when I go into these
83:40 new technologies where the short-term
83:43 benefit is really clear, it's making me
83:45 more productive, I pause and I go,
83:48 there's going to be a trade-off here.
83:50 There's always a trade-off. What is the
83:52 trade-off? And am I comfortable and
83:54 conscious of what that trade-off is?
83:56 So I try to figure
83:58 out what the trade-off is with things
83:59 like AI, and okay, the trade-off is
84:00 probably I'm going to be worse at
84:01 critical thinking,
84:03 and that might have an impact on my social
84:04 relationships if I fall in love with
84:05 [ __ ] Annie, cuz she's pretty hot, to be
84:07 fair.
84:09 And I really value my critical thinking.
84:11 I really value my ability to, um, solve
84:14 problems and to articulate myself and to
84:16 write and to communicate with my loved
84:18 ones in an effective way. So what can I
84:20 do if that is the trade-off? And one
84:23 of the things that I'm doing now feels
84:25 really counterintuitive in a world where
84:27 everybody's got these productivity gains
84:29 because they're using these tools, which
84:30 is to refrain. And I wonder if one of
84:33 the great advantages of the next decade,
84:35 one of the great hedges for anyone
84:38 that's wanting to be a great critical
84:40 thinker, entrepreneur, creative, is to
84:43 go left when everyone's going right,
84:46 which is to refrain and do it the hard
84:48 way. And if we look at history, in these
84:52 arcs that we've discovered with food and
84:54 with exercise and all these things and
84:56 dating, doing it the hard way, like we
84:58 said about the marshmallow test and
85:00 delaying the gratification, seems to
85:02 yield the greatest returns. So I think
85:04 I'm going to do it the hard way,
85:05 >> which will be the easiest, because it won't have the
85:08 side effects.
85:10 >> Yeah. The hard way. I want to feel
85:12 good now and later, as opposed to now but
85:18 not later.
85:19 >> And to be clear, this doesn't mean I'm
85:20 not going to use AI or ChatGPT. It just
85:22 means that when it matters, when the
85:24 thinking matters, I will think for
85:26 myself,
85:28 and when the communication matters, I'll
85:29 communicate for myself.
85:32 That's my conclusion.
85:33 >> You should hope that your children will
85:35 feel the same way when they grow up.
85:37 They will model what you do,
85:40 >> right. Every day, you model health
85:44 or not health.
85:50 >> Thank you. Thank you for writing, well,
85:51 many incredible books that I've got
85:52 around me. I'm going to link them all
85:54 for my viewers that are watching. I've
85:56 got so many of these books. Um, the
85:57 incredible one that you wrote for
85:59 parents, called Raising Mentally Strong
86:00 Kids. You've got your other book over
86:03 there, Change Your Brain Every Day. And
86:05 I've got this book here from Terry,
86:08 which is The Deep Learning Revolution,
86:09 and one you wrote most recently called
86:11 ChatGPT and the Future of AI. I'm going to
86:13 link all of them below, and I'm going to
86:14 link them with a little bit of a summary
86:16 of what's in them. So if you decide that
86:17 there's anything here that we talked
86:18 about today where you want to dig
86:20 in further, please do dig in. And I'm
86:22 also going to link, um, for both of you, a
86:24 link to where people can find out more
86:26 about both of you, your websites and more
86:28 of your work, in the comments below. So
86:29 please do check that out, everybody
86:30 listening. We have a closing tradition
86:32 where the last guest leaves a question
86:33 for the next guest, as you know, and
86:34 they don't know who they're leaving it
86:37 for. So, I'm going to ask you both a
86:39 question, starting with you, Daniel. Are
86:43 you prepared for recognition of your
86:46 next
86:48 health challenge? Will you be able to
86:51 notice its onset? And how will you
86:55 address the challenge, even if it means a major lifestyle change or way of living?
87:02 >> Okay. Yes.
87:04 >> How will you address the challenge even
87:06 if it means a major lifestyle change or
87:08 way of living?
87:10 >> Well, I'm very clear on the goals I
87:13 have, which is to be vibrant and healthy
87:17 and not get dementia. So, if I need to
87:21 change something so that happens, I'm,
87:23 like, all in.
87:26 >> Are you prepared? Probably not.
87:30 Now, I've been blessed with good health.
87:32 I try to live a healthy life.
87:36 But the problem is that you can't
87:39 anticipate, you know, as you get older,
87:42 what's ahead. You know,
87:48 like you mentioned arthritis. I'm
87:49 feeling a little bit of arthritis now.
87:51 I've been arthritis-free for most
87:54 of my life. And you know, that's
87:57 something, it's very difficult to realize that
87:59 it's coming. That there's very little you can
88:01 do about it is, uh, depressing. But on
88:06 the other hand, things could always be
88:08 worse, and sometimes that cheers you up.
88:12 But the reality is that
88:15 there are things in the world, like
88:18 COVID, you know, that you have no
88:19 control over, that may,
88:23 or an accident or Alzheimer's, you know,
88:26 god forbid, you know, who
88:29 knows, uh, what will happen, right? You
88:31 have to live with whatever life
88:33 deals you. You know, the time that
88:35 you have, you should really spend on
88:39 trying to make it a healthy life, a
88:44 productive life, a, you know, satisfying
88:46 life. And that's something that we
88:48 have control over, right?
88:49 >> What are you scared of?
88:53 >> Right now, it's China.
88:59 I, you know, I'm not being facetious. I think that
89:03 it's a threat that is a societal threat.
89:06 It's not, I don't think that it's going
89:08 to affect me. And I've had great Chinese
89:10 students, and so I really think
89:12 the Chinese people are different
89:14 from what I see as the country, uh,
89:17 what they're trying to do,
89:19 the goals that they're pursuing.
89:22 Uh, 20 years ago, if you look at, um,
89:26 all the technical areas, in physics and
89:28 chemistry and biology and so forth, the
89:31 100 most important advances that have been
89:33 made:
89:34 20 years ago, it was like the
89:37 Americans had, like, you know, 94 of them;
89:42 this year, 74 are Chinese.
89:46 And that's because they made a huge
89:49 investment in science and STEM
89:52 researchers. You know,
89:53 they poured out a million
89:56 engineers to implement AI,
90:00 right?
90:02 Uh, you know, they're doing the
90:03 right thing. We did that. You remember
90:05 when Sputnik, uh, went over, the
90:07 Sputnik moment? We made a huge investment
90:10 in STEM, in science and engineering,
90:13 and in education.
90:14 >> What was the Sputnik moment?
90:15 >> Oh, okay, '57, when the Russians put up a
90:18 satellite that went over the US over and
90:20 over again, and it took us
90:23 years to put up our own satellite
90:24 because we had fallen behind. But that
90:27 investment, we've been living on it
90:29 literally, you know, for the last 60 years.
90:33 And now the Chinese have done that,
90:35 and they're going to be advanced, and
90:37 they're going to be way beyond us. You
90:39 know, you asked me, okay,
90:43 that's something I'm very, very
90:45 disappointed in our country about.
90:48 In fact, we're just doing
90:49 the opposite. We're tearing apart science
90:51 right now with the president's
90:52 administration.
90:53 >> What are you scared of, Daniel?
90:57 >> Losing my wife. That's the
91:00 thing that comes to mind. And when I was
91:03 thinking about China, my
91:05 mother-in-law was a prepper.
91:08 >> A prepper being someone that's preparing
91:09 for the end of the world.
91:10 >> Prepared for the end of the world.
91:12 And we were in Egypt last year and got a
91:15 call that she had cancer. And we were
91:17 there for three days and came home, and I
91:20 kept thinking, I loved her dearly, I'm
91:23 like, you prepared for the wrong thing.
91:26 You should have prepared for cancer.
91:29 Like, I think every day we should be. And
91:32 the same Alzheimer's prevention program
91:34 is a cancer prevention program. It's a
91:37 heart disease prevention program. It's a
91:39 diabetes prevention program. And I'm
91:43 like, she's prepared for the wrong
91:45 thing. The thing you really want to be
91:47 prepared for is disease, right? And I
91:50 know I'm going to die. I just want to be
91:53 vital for as long as I can be. And hope
91:58 is, well, I have a say in this, right?
92:03 Cuz I know I can accelerate
92:06 my body's decline or I can decelerate
92:09 it. And I'm going to choose to
92:11 decelerate it.
92:14 >> Thank you. We're done.