0:24 All right, let's deconstruct rationality.
0:27 We did a thorough deconstruction of
0:30 science and materialism in my series,
0:32 Deconstructing the Myth of Science. This
0:34 is going to be an analog series where
0:35 we're going to be focusing on
0:37 rationality and its limits. This will be
0:39 a three-part series with the deepest
0:40 insights coming in part two and part
0:43 three. Make sure you stick around for
0:45 those remaining parts. I stress that because the average view time for one of my videos is about 20 to 30 minutes. If you're only going to watch 20 or 30 minutes of one of my videos, you might as well not watch at all, because you're not really going to learn anything. It might even harm you more than it helps you. So really, make sure you stick to the end, because there's a lot of juicy stuff
1:12 here. This is original intellectual
1:13 work. I've been working on this for a
1:15 long, long time. You're not going to find this kind of information anywhere else. There are hundreds of critical insights throughout this whole series. It's very dense. I've been preparing
1:25 it for months. There's enough
1:27 information here for a dense book or a
1:29 PhD thesis. I mean, this is really like
1:32 a philosophy PhD wrapped into three
1:34 episodes. So, it's going to take about
1:36 10 hours to explain this whole topic.
1:38 So, please be patient. The only thing
1:39 I can promise you is that it's going to
1:40 be worth it because it's going to change
1:42 how you understand how the mind works,
1:44 how you understand science, how you
1:46 understand academia, a lot of stuff like this.
1:49 Now, you've probably intuited that
1:50 there's something wrong with highly
1:53 rational people, but it's not so easy to
1:55 put your finger on exactly what's wrong
1:58 with them because usually such people
2:00 are pretty intelligent. They have a high
2:02 IQ and so on and they're well educated
2:04 in the formal sense of having gone to
2:06 university and having some credentials
2:09 and degrees and maybe even a PhD. But
2:11 still, these people don't understand
2:14 reality fundamentally. Why is that?
2:17 Well, this series is going to be a
2:18 complete technical explanation of how
2:21 and why rationalism limits your mind and
2:22 prevents you from making sense of
2:25 reality. And it explains why rationalism
2:27 cannot reach ultimate truth and ultimate understanding.
2:32 Rationalism is a subtle paradigm. It's
2:33 an invisible paradigm. You're not
2:37 conscious of it even though you hold it.
2:40 And it grounds your sense of reality.
2:42 So this series is going to help you move from rationalism, from rationality, into postrationality.
2:49 This is a conceptual framework for
2:52 advanced transhuman mystical cognition.
2:53 This is a critical topic for developing
2:55 higher stages of cognitive development
2:58 which is what I teach. It's particularly
3:00 important for people in the STEM fields
3:03 and in academia.
3:06 So this series is not for soft new age
3:09 people. Those people need to learn basic
3:12 rationality. This series is for people who have learned and gone deep into rationality, the scientists and so forth, but have gotten stuck in it, and it's about how to get them beyond that, into the postrational domain.
3:26 Now, a lot of credit here needs to be given to David Chapman and his website, metarationality.com. Very important website. It's like a
3:35 whole book basically. It's like an
3:37 online book. I encourage you to go to
3:39 metarationality.com and read the whole
3:40 thing. It's going to take you weeks to
3:42 read it all. There's a lot of important
3:44 information in there. I'm going to be
3:46 quoting and um citing a lot of things
3:48 that he said here, especially in the
3:49 first part. And then in part two and
3:51 part three, we're going to be expanding and going even beyond the excellent work David Chapman has done. So, a lot of credit to him, and I'll be quoting him throughout this episode over and over again. We're sort of starting with his foundation and then building on top of that to go even beyond. Now, a
4:11 few warnings here before we really get
4:13 into the meat of it. Warning number one
4:14 is that this is advanced philosophy
4:17 which nobody really understands.
4:18 Academics, scientists do not understand
4:19 what I'm going to be talking about,
4:22 especially in part two and part three.
4:24 Um, it took me over 20 years of
4:27 philosophy work to understand what I'm
4:29 going to be sharing with you over this
4:31 series. This requires crazy levels of
4:34 open-mindedness. And this is not any
4:35 kind of belief system that I want you to
4:37 adopt. These are insights that I want
4:39 you to contemplate for yourself. And
4:41 only if you contemplate it deeply and
4:44 you see the truth of it, then
4:46 you should hold these ideas. Otherwise,
4:48 don't adopt this as a belief system.
4:51 The other warning is that do not be
4:53 misled by the term deconstruction. You
4:55 might think deconstruction is something
4:57 that's coming from postmodernism.
4:59 Uh, and while postmodernism has valid
5:01 insights and some of these ideas that
5:02 I'm going to be sharing do overlap with
5:04 postmodernism, what I'm talking about
5:06 here goes way, way, way beyond postmodernism.
5:09 So, this is not just a rehashing of
5:12 stuff from, you know, silly postmodern
5:15 philosophy. We're going way beyond that.
5:17 And the other warning is that spiritual
5:19 things will be mentioned here. So, be
5:20 careful. People who are in the
5:23 rationalist paradigm very deeply,
5:25 they have a spiritual allergy. So, watch
5:26 out for that spiritual allergy. I might
5:29 be using the word God here and there. I
5:31 might be using other mystical terms here
5:33 and there. Just be careful that you don't get turned off by that.
5:38 remember, keep an open mind. Keep an
5:40 open mind.
5:42 And the other warning is that a key
5:44 mistake that many rationalists make is
5:46 that any critique of rationality is
5:47 interpreted as a promotion of
5:50 irrationality or woo.
5:52 After all, anything that isn't rational
5:53 must be irrational, right? Those are the
5:57 only two options. Or it's like the
5:58 only reason you would want to oppose
6:00 rationality is because you have some
6:01 irrational thing that you want to
6:03 promote. Some fantasy, some delusion,
6:04 some falsehood,
6:08 some cult ideology.
6:09 But notice that this is question
6:12 begging, right? We need to explore and
6:16 think about what is rationality really
6:18 and are there flaws within it? Are there
6:20 problems? You can't just assume that
6:22 there aren't any.
6:24 So, this brings us to the pre-trans
6:27 fallacy that Ken Wilber talks about.
6:30 There's pre-rational, there's rational,
6:32 and then there's postrational. And so,
6:33 the mistake that the people at the
6:35 rational stage make is that they think
6:37 that any criticism of the rational must
6:40 come from the pre-rational. Something
6:42 like a flat-earther criticizing, you
6:43 know, science. That would be an example
6:46 of a pre-rational critique of the rational.
6:49 That's obviously not what we're promoting here.
6:54 there's a distinction you need to make
6:56 in your mind that you can criticize
6:58 things from above or from below. A
7:00 criticism of rationality from below would be maybe from a religious fundamentalist or from a flat-earther or something like that, somebody who hasn't mastered rationality,
7:13 somebody who hasn't actually understood
7:15 science, the methods of science. But
7:17 then once you do master science and you learn the methods of science, the kind of bog-standard stuff they teach you in academia, then you start to wonder,
7:25 well, okay, well, is there anything
7:27 beyond that? Is there any kind of cognition, any way to use the mind beyond that? And it turns out that there
7:32 is. And then that would open you up to
7:33 the postrational. And then from the
7:36 post-rational, you can criticize the
7:39 rational. And that is not irrational.
7:40 And that is not because you want to
7:42 promote some kind of fantasy or
7:44 delusion, but because there are
7:47 legitimate limits to the rational stage.
7:49 And these are stages of cognition that
7:51 we're talking about.
7:53 Another warning is that for those of you
7:57 who are into spirituality and new age
8:00 type people, and especially if you didn't go to academia and didn't study hard sciences, then you actually suffer from being too much in the pre-rational. You haven't actually mastered the rational stage of cognitive development. That's something you need
8:13 to work on. And that's not what this
8:15 series is about. This series is more
8:17 advanced. This is about transcending the
8:19 rational, which is different than
8:20 learning the rational. And it's
8:23 important that you learn to be rational
8:25 because otherwise you will be irrational
8:26 and that's not good. That's going to
8:28 lead to problems in your understanding
8:30 of reality and in your life. And a lot
8:33 of new age spiritual woo type people do
8:36 suffer from irrationality. That's not a made-up thing. That's real, and they would benefit from just
8:42 learning rationality. But I'm not going
8:44 to teach you rationality here. I'm going
8:47 to teach you how to go beyond that.
8:48 So just remember that developing
8:50 rationality is an important thing. And
8:52 that's also how you know that what I'm talking about here is coming from a higher stage rather than a lower one: because I'm not here to
8:59 demonize rationality.
9:01 I'm not here to tell you not to learn
9:04 science or not to learn rationality. You
9:05 should learn those things. Those are
9:08 important. And then that's how you know
9:10 that I'm trying to teach you something a little bit beyond that, because otherwise I'd just be demonizing it and strawmanning it, which I'm not going to be doing.
9:19 Now immediately we should tackle an
9:20 objection here which is something like
9:22 this: "But Leo, aren't you contradicting yourself already? Because here you are using rationality to try to deconstruct rationality. Doesn't that prove rationality's validity?" After all, isn't
9:35 this whole conversation you using logic
9:36 and making various kinds of arguments
9:38 and points and isn't that rationality?
9:42 So, how can rationality be wrong?
9:43 If rationality is wrong, then your
9:45 critique of rationality must also be
9:47 wrong, right? This is not a contradiction. It might
9:52 seem this way on the surface, but it's
9:55 not. Notice that rationality, if we
9:57 think of it as a kind of cognitive system, is able to reflect on itself and to
10:04 find faults within itself. This is not a
10:06 mistake or a contradiction. This is
10:08 essential. This is essential. A system's ability to reflect on itself, to go meta on itself, is the essence of what development and reaching higher stages of cognition is all about.
10:20 Notice that rationality is capable of
10:23 self-awareness of its own limits.
10:26 To turn rationality into a system that
10:28 doesn't self-reflect, that doesn't
10:30 question itself and doesn't admit any
10:34 limits to itself, that itself is to turn
10:37 it into a dogma. And that itself is a
10:41 kind of rational irrationality.
10:43 And notice that rationality can be turned into a dogma. People really do this.
10:52 So, it's not at all a contradiction
10:57 that people who claim to be highly
11:00 rational are actually irrational. That
11:02 might sound like, well, then that's not possible, right? If someone is acting rational, then they're rational. They can't be irrational. But no, they can.
11:12 And in fact, you can get so wrapped up
11:14 in acting rational that you don't see
11:17 your own irrationality, because that requires something more than rationality to see the irrationality in how you apply your rational ideas and your rational belief system, which we might call rationalism. This requires consciousness and self-reflection and something that goes beyond rationality itself.
11:37 After all, why would you assume that
11:39 rationality has no limits? It would be irrational to simply assume that.
11:46 So it actually is possible for
11:49 rationality to become irrational. That's
11:51 not a contradiction. Or rather it is a
11:53 contradiction. But to be able to see that contradiction within yourself is actually the whole point of this series: to help you to see those contradictions, because just because you are rational, or you think of yourself as rational, doesn't mean that your worldview doesn't have contradictions within it. It certainly does. Certainly does.
12:15 So here's a key framing question for
12:17 this entire investigation that we're
12:20 doing. It's this: how does rationality become self-deception? If you just contemplate that one question for years,
12:32 you will derive all the insights that I
12:33 will be sharing with you throughout this
12:36 whole series. That's really
12:38 where all of this is coming from. That's
12:40 the source.
12:42 So, here's the core insight for you: rationality has limits. Once you
12:47 suspect that that's true, then you can
12:49 ask questions like, well, what are the
12:51 limits of rationality?
12:54 How is rationality a trap? How is
12:57 rationality misused and abused? How does rationality get metaphysics and epistemology wrong? How does rationality
13:04 limit intelligence?
13:06 How is rationality an obstacle towards
13:09 the highest understanding of reality?
13:11 What is rationality?
13:14 What is reason?
13:16 And what is reason's relationship to truth?
13:22 You can just turn off this whole series
13:24 right here and just contemplate these
13:27 questions for years and you will get all
13:28 of the information that I would be
13:31 teaching you. Which is how this is
13:32 different from any kind of ideology or
13:34 belief system. Right? I want you to
13:36 contemplate these questions deeply for
13:37 yourself and to have these insights for
13:39 yourself. But because it takes so long,
13:42 it took me 20 years to understand this
13:44 stuff. Really, it's too much to expect people to just do it casually. This is very serious philosophy. So it really helps to have someone who's gone through this whole process, who can point out the things you should be contemplating and the various traps. Some of these things just take a lot of creativity just to have certain kinds of insights.
14:05 You wouldn't even think of them on your
14:07 own in 20 years if somebody didn't
14:09 point them out to you. And sometimes
14:10 it's just lucky that I have one of these
14:12 insights. You know, across 20 years I
14:13 have a few of these insights and I
14:15 compile them all together. And then
14:17 that's what this is.
14:20 That's the value of it.
14:21 If you work a lot with rational systems,
14:23 you will notice that rational methods often fail, leading to wrong conclusions, delusions, wrong sensemaking, failure to understand reality, and failure to predict reality. And rational fields start to stagnate and get stuck in themselves.
14:40 There's a key four-part distinction
14:42 we're going to be making here, which is
14:43 we're going to want to distinguish
14:46 between reason, reasonableness, rationality, and rationalism.
14:54 Credit to David Chapman for this
14:56 important distinction.
14:59 So, I'm not going to fully define these
15:00 right now for you. In about an hour or two, when we get towards the last part of this episode, I'm going to give you some definitions of these.
15:09 But for now just contemplate what might
15:10 be the difference between reason,
15:12 reasonableness, rationality, and rationalism.
15:18 For our purposes right now as we kind of
15:20 bootstrap this investigation, let's
15:22 focus on just the difference between
15:25 rationality and rationalism.
15:27 Rationality is some kind of function
15:29 that your mind is performing. Let's just
15:34 start there. And then rationalism is
15:36 what is that? Well, that would be to
15:38 turn rationality into some kind of ism
15:41 or some kind of system. So you can also
15:43 have the parallel here between you can
15:44 have science and then you can have scientism.
15:46 scientism.
15:47 What's the difference between science
15:50 and scientism? We have an analogous difference between rationality and
15:54 rationalism. So a lot of what this whole
15:56 episode is about is deconstructing rationalism. And then rationality itself? Well, we're
16:04 going to maintain that. Of course, we're
16:07 not going to stop using rationality.
16:09 We're going to keep using it. But as we talk about the limits of rationalism, we're also going to learn about the limits of rationality as well.
16:18 So my thesis for this whole series is
16:21 that rationalism is a self-deception and
16:23 not the highest form of intelligence nor
16:25 cognitive development.
16:27 But to understand what that really
16:29 means, we need to define in a lot of
16:32 depth what is rationalism. So let's go
16:34 into that right now. I have a list for
16:38 you. It's worth our time to really set this up and to think about what rationalism is, to really define it deeply, because that's what we're deconstructing.
16:50 And see, the trick with rationalism is that
16:53 you probably don't consciously think of
16:56 yourself as a rationalist. You probably
16:58 don't identify that way. But
17:01 nevertheless, you probably subscribe, if not to the entire paradigm that I'll be describing here in a second, then to important parts of it. And it's not the case that you need to subscribe to the whole thing to be limited by it. Whichever parts of it you subscribe to will limit your mind's ability to cognize and to understand reality. All right. So let's
17:24 define what is rationalism.
17:25 Rationalism is a worldview. First of
17:27 all, it's an implicit epistemic
17:30 paradigm. And by implicit, what we mean
17:32 is that most people who hold it, they're
17:33 not conscious that they're holding it,
17:35 but they do hold it. And nevertheless, it limits them.
17:40 It's also a faith in the power of reason
17:42 to solve all problems, all sensemaking, all truth-seeking.
17:48 It's, quote, "any belief system that makes exaggerated claims about the power of rationality, usually involving a formal guarantee of correctness." End quote.
17:57 These quotes here are all from David
17:58 Chapman. So, I'm not going to keep repeating his name. If I'm quoting something and I'm not telling you where I'm quoting it from in this episode, it's all coming from David Chapman and his website, metarationality.com.
18:11 He also says rationalism means that, quote, "rationality is all there is to thinking and acting well. It is sufficient for all purposes, and there's nothing else that you need."
18:22 This implies that there's nothing
18:24 better. There's no better system.
18:25 There's no better epistemology than rationality.
18:31 Rationalism also, quote, "specifies some ultimate criterion according to which thinking or acting could be judged to be correct or optimal." End quote.
18:40 And, quote, "thinking in accordance with this criterion leads to true beliefs." And it also claims that rationality yields maximally effective action. End quote.
18:51 Rationalism is trying to think about
18:54 reality in formal ways as a means of
18:55 solving the problem of distinguishing
18:58 truth from falsehood.
19:01 So this is the key issue that we deal with when we're talking about the mind, epistemology, philosophy, science, anything. Ultimately, we can boil it all down to: how do you distinguish truth from falsehood? That's the really tricky question. How do you do that?
19:22 And rationalism's answer is that, well, you do that by becoming more and more formal. You look for formal ways. And what do we mean by formal ways? We're going to be using this term, formalism. What does that mean? Well, see,
19:39 just think about it. Before there was an official, codified scientific method, which has only existed for about 500 years, let's say since the European scientific revolution, how did human beings make sense of reality? Well, this was the pre-rational era of human civilization, and the majority of human civilization was pre-rational. This was before the official scientific method. I mean, people still did science, but it wasn't formalized.
20:10 To formalize something is to lay it down into a sort of law, or rules, or an official method, and then to teach it through institutions and culture and all this kind of stuff. So, you know, the ancient Egyptians:
20:24 we can ask the question like did the
20:26 ancient Egyptians have science? Well, in
20:29 a certain sense obviously yes because
20:30 they built some incredible stuff. You
20:32 can't build the pyramids without some
20:35 kind of advanced understanding of
20:37 nature and reality and how to carve
20:39 rocks. I mean, you could call all that
20:41 science. On the other hand, it wasn't
20:43 like a formal scientific method the way
20:44 that we have now. So, in a certain
20:46 sense, they didn't have science, but
20:47 nevertheless, they were able to
20:50 understand reality. And also notice
20:52 that, you know, it was mixed in with a lot of superstition. It was mixed in with mystical ideas and religious
21:00 ideas and all sorts of false ideas as
21:04 well. So what is the reasoning behind
21:06 formalizing scientific method? Well,
21:10 it's supposed to be that we're becoming more rigorous. And the reason for becoming more rigorous is that there's a lot of sloppy stuff. If you're
21:19 using your mind in a sloppy way, if
21:21 you're doing science in a sloppy way,
21:23 then you're going to get a lot of truth
21:25 mixed with a lot of falsehood as well.
21:26 You're going to get superstition, you're
21:28 going to get woo, you're going to get
21:29 falsehood, and you're going to get
21:31 delusions and self-deceptions, which is
21:34 all common in the pre-rational stages of human cognition. So then, as you formalize, as you use rigorous method, you do double-blind placebo-controlled studies; that's an example of formalization. You don't imagine that the ancient Egyptians did any kind of double-blind placebo-controlled studies about their religious beliefs, right? They just believed stuff.
21:59 And so they probably believed a lot of false stuff, even if they believed some true stuff as well. So
22:07 the idea is that by formalizing our studies, being very rigorous, having people proofread our work, writing formal research papers, citing all of our sources, by doing this kind of formal method, we're eliminating the untruth and separating the truth from the untruth. And in this way we're creating the distinction between science and pseudoscience. So we might say,
22:33 as modernists, that the ancient Egyptians had a lot of pseudoscience but also some legitimate science mixed in. It's just a mixed, complex bag. But we don't like having pseudoscience and science mixed together. We want
22:45 something more pure, more true. We want
22:47 to get rid of all the falsehood and the
22:49 delusion. So we formalize. Another way to think about formalism is to think about a concept like, for example, intelligence.
22:56 You can have a folk notion of intelligence, like, oh, that person's really smart, she's really intelligent.
23:00 We can just kind of talk about it
23:03 informally that way and we're capturing
23:06 some reality when we say that. But on
23:07 the other hand, like what is
23:09 intelligence? It's this amorphous,
23:12 nebulous, fuzzy concept. How do we
23:15 define it in a rigorous way? So a scientist comes along and says, well, it's no good to just talk about intelligence using our intuition of who's smart and who isn't. We need to start measuring this stuff. So, first of all, let's define what intelligence is, and we
23:33 can say that, well, intelligence is your ability to solve math problems quickly. That might be one measure. Another measure is your verbal ability, your reading comprehension, and we could design some kind of test to measure that. And maybe it's your ability to solve geometric puzzles, rotating objects, like on an IQ test. So,
23:51 we can come up with, like, a four-point definition of what intelligence is, then design a test to measure each of those, give these tests to people, and we get an IQ test. We can give someone a number, and based on that it seems like, okay, see, we've formalized this fuzzy notion called intelligence into something a little bit more rigorous.
24:14 But then the question becomes, well, is
24:16 that though really what intelligence is?
24:19 Or is that an artificial construct of
24:21 the human mind? And maybe it doesn't
24:23 really capture what intelligence is.
24:24 Maybe intelligence is something beyond
24:27 just what this IQ test is capturing. But
24:29 then of course the limitation becomes: if we start to take our construct too seriously, without being aware that it's a construct, then we can get trapped into thinking that we understand intelligence when all we really understand is our own construct.
24:45 So this is formalism.
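To make the formalization move concrete, here's a minimal sketch (my own illustration with hypothetical weights, not anything from the video or from actual psychometrics): the fuzzy notion of intelligence operationalized as a weighted composite of the four subtest scores described above.

```python
# "Formalizing" intelligence: reduce a fuzzy folk notion to a number
# by picking subtests and weights. Every choice here is a construct.

def composite_iq(math, verbal, reading, spatial):
    # The four measures from the example: math speed, verbal ability,
    # reading comprehension, geometric/spatial puzzles.
    weights = {"math": 0.25, "verbal": 0.25, "reading": 0.25, "spatial": 0.25}
    scores = {"math": math, "verbal": verbal, "reading": reading, "spatial": spatial}
    return sum(weights[k] * scores[k] for k in weights)

print(composite_iq(110, 120, 100, 130))  # 115.0
```

The construct produces a tidy number, but notice that every choice in it, which subtests and which weights, was made by us; the number measures the construct, not necessarily intelligence itself.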
24:48 Rationalism is all about formalism as a
24:49 solution to this problem of
24:52 distinguishing truth from falsehood.
24:54 Rationalism also attempts to create laws of thinking, a method for right thinking. Like, wouldn't that be ideal? Because, you know, the human mind is so full of bullshit.
25:08 Most of the time it's easy to bullshit with the human mind. And so how
25:13 do we learn to think properly? What are
25:15 the laws of thinking?
25:16 Wouldn't it be ideal if we could just
25:18 come up with some kind of formula for
25:21 how to get to the truth? So rationalism
25:24 wants to do that.
25:28 It wants to find those laws of thinking.
25:30 But maybe those laws don't exist. We
25:32 have to be open to that possibility as well.
25:38 Also, maybe you create these laws, but
25:40 then those laws actually limit your
25:42 ability to think. We have to also
25:44 consider that possibility.
25:48 So, this is all very tricky stuff. Rationalism is also normative. Normative
25:52 means that others ought to follow it.
25:54 Everyone ought to be rationalist
25:57 according to the rationalist. That's the
25:59 proper way to be, because if you're not a rationalist, then according to the rationalist you're some sort of deluded new age person, or some soft-brained, fuzzy, intuitive person who believes in a bunch of woo.
26:12 Rationalism is the assumption that
26:15 anything that is true can be arrived at
26:17 by rationality and if rationality can't
26:21 arrive at it then it cannot be true.
26:23 Rationalism is the attempt to create an
26:25 objective replacement for intuition
26:28 because intuition is subjective, fuzzy,
26:31 and unreliable.
26:33 Rationalism is the attempt to make human sensemaking as rigorous and objective as mathematics.
26:42 Mathematics is sort of the gold standard for the rationalist or the scientist, because the truths of mathematics can be elegantly written in just simple equations, and then those
26:53 simple equations and then those
26:55 equations can be checked. We can even
26:57 build a computer program to check for
26:59 the validity of our equations and our
27:01 logical proofs. And so, wouldn't that be
27:03 ideal if we could just have a computer
27:04 program that can just check any
27:06 statement or any belief system or any
27:09 worldview for its correctness? And in
27:12 this way, we can eliminate any kind of
27:13 delusions from our understanding of reality.
27:18 But it's an open question whether that's
27:21 even possible though.
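As a sketch of what such a mechanical checker looks like in the one domain where it actually works, here's a brute-force tautology checker for propositional logic (my own toy example, not from the video):

```python
from itertools import product

def is_tautology(formula, variables):
    """Check a propositional formula by enumerating every truth assignment.

    formula: a function from {variable_name: bool} to bool.
    """
    for values in product([True, False], repeat=len(variables)):
        env = dict(zip(variables, values))
        if not formula(env):
            return False  # found a falsifying assignment
    return True

# Modus ponens as a formula: ((p -> q) and p) -> q
modus_ponens = lambda e: not ((not e["p"] or e["q"]) and e["p"]) or e["q"]

print(is_tautology(modus_ponens, ["p", "q"]))  # True
```

This works because propositional validity is decidable by finite enumeration; the open question gestured at above is whether anything like it can be extended to arbitrary statements, belief systems, and worldviews, where no such general procedure is available.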
27:23 Rationalism is an inclination towards
27:24 systemic theoretic approaches to
27:26 understanding the world. It's the
27:29 attempt to make all of reality explicit,
27:32 formalized, and quantified.
27:34 Formalism is assumed to be more true
27:36 than any other kind of knowing. If it
27:38 isn't formal, it isn't serious and it
27:40 isn't real. If you can't formalize it,
27:43 it's not real at all. If it can't be
27:45 quantified, it isn't real.
27:51 Rationalism is the notion that all
27:53 informal human notions of reality can
27:55 simply be formalized with enough work
27:57 and that nothing important or truthful
28:00 would be lost if we did so. It's the
28:02 attempt to boil all of reality down to a
28:05 set of logical true false propositions
28:08 and statements. It's sensemaking and
28:10 understanding turned into propositional
28:13 knowledge. And in a sense that's what
28:15 formal science is trying to do. That's
28:18 what the whole game of academia is. You
28:20 go into academia and your job is to
28:25 contribute to mankind's knowledge base.
28:26 And what do we mean by propositional
28:28 knowledge? Well, propositional knowledge
28:31 is like we might think of it as
28:33 statements that are true or false about
28:36 the world. For example,
28:39 the earth is round. True or false? True.
28:41 Okay, that's a, you know, that's a
28:43 propositional statement. And then, you
28:44 know, snow is white. Is that true or
28:46 false? Well, it's true. Okay. And then
28:48 the moon is made of cheese. Is that true
28:49 or false? No, it's false. Okay. And then
28:52 so you could think of science in a
28:54 simplistic manner as just a list of all
28:57 the true propositions about the world.
28:58 And then a scientist's job or an
29:01 academic's job is to, you know, research
29:04 more and more of the world and to
29:06 contribute to this giant database of
29:09 true and false statements about the
29:12 world. And that if we just collect and
29:14 grow and expand this database of
29:16 propositional knowledge that eventually
29:17 we'll just understand the whole
29:19 universe. And that that's all that there
29:21 is to science. That's all that there is
29:23 to reason. That's all there is to using
29:25 the mind. That's all there is to truth.
29:27 Truth just is a database of everything
29:29 that's true about the world.
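The "giant database of true propositions" picture described above can be caricatured in a few lines of code. This is a deliberately naive sketch; the statements and the `check` helper are illustrative, not any real knowledge base:

```python
# Caricature of the rationalist picture of knowledge: truth as a lookup
# table mapping statements to True/False values.
knowledge_base = {
    "the earth is round": True,
    "snow is white": True,
    "the moon is made of cheese": False,
}

def check(statement: str):
    """Return the stored truth value, or None if the statement is unknown."""
    return knowledge_base.get(statement)

# On this picture, "doing science" is just adding more entries.
knowledge_base["water boils at 100 C at sea level"] = True

print(check("snow is white"))               # True
print(check("the moon is made of cheese"))  # False
```

The obvious limitation, which the rest of this series presses on, is that everything interesting happens in deciding what goes into the table in the first place.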
29:40 Rationalism is the treating of complex
29:42 phenomena as analyzable and reducible to simpler parts.
29:49 You know, if you go to university
29:51 freshman logic classes, they will teach
29:55 you how to reduce
29:56 questions about the world and about
30:01 nature into logical forms. They even
30:02 have a special
30:04 notation. It's called first-order logic.
30:06 There's second-order logic and so forth.
30:07 But it teaches you how to
30:09 reduce everything into syllogistic forms.
30:13 By syllogistic form I mean stuff like,
30:18 you know, premise one: Socrates is a man.
30:20 Premise two:
30:22 all men are mortal. And the
30:26 conclusion: Socrates,
30:29 therefore, must be mortal. This is
30:31 freshman first-order logic class kind of stuff.
30:35 And rationalism is the notion that you
30:38 can reduce sensemaking to
30:40 that kind of thing.
30:42 It's attempts to understand reality
30:45 through logical syllogisms.
30:47 The beauty of logical syllogisms is that
30:49 they're deductive. A syllogism
30:52 always contains premises, and as long as
30:55 the premises are true and
30:57 the logic is sound, there are
31:00 actual rules, modus ponens, modus
31:02 tollens, all those laws of logic, and if
31:04 you follow those deductive laws properly,
31:06 that guarantees that you get the
31:07 correct answer. Your answer must be right
31:11 at the end of all that logicking.
31:13 And that's very appealing, because
31:14 ideally we would want our understanding
31:16 of the universe to be as rigorous and
31:18 solid as mathematics
31:22 and first-order logic.
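The Socrates syllogism really can be mechanized, which is exactly what makes deduction so appealing. Here is a minimal forward-chaining sketch, my own illustration; note it flattens the universal premise "all men are mortal" into a single if-then rule rather than using real first-order quantifiers:

```python
# Minimal modus ponens: from a fact P and a rule "P implies Q", derive Q.
facts = {"Socrates is a man"}
rules = [("Socrates is a man", "Socrates is mortal")]  # (premise, conclusion)

def forward_chain(facts, rules):
    """Repeatedly apply modus ponens until no new facts can be derived."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in rules:
            if premise in derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(forward_chain(facts, rules))  # includes "Socrates is mortal"
```

As long as the premises are true and the rules are sound, every derived fact is guaranteed; the question raised here is whether reality's premises can ever be fully captured this way.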
31:25 So rationalism attempts to do that.
31:26 Rationalism is the belief that
31:28 everything can be proven and that things
31:30 should only be taken as true if
31:33 they are proven.
31:35 It's the demand for rigorous, objective proof.
31:41 Uh and this leads to this kind of
31:45 attitude, this kind of like skeptic
31:47 uh conservative scientific attitude of
31:49 like, well, extraordinary claims require
31:51 extraordinary evidence. If you're going
31:52 to claim something, you need to back it
31:53 up with sources. You got to cite your
31:55 sources. You got to cite your studies
31:57 and all that kind of stuff. So, so
31:59 again, you can see how we're formalizing
32:02 this whole process of sensemaking.
32:08 It's the attempt to systematize knowledge.
32:12 It's the attempt for absolute certainty
32:14 in knowledge.
32:16 However, there's an interesting little
32:18 paradox and kind of contradiction here
32:20 because at the same time, rationalism
32:23 wants to invent a system for absolute
32:26 certainty in our knowledge. On the
32:29 other hand, it also wants to deny that absolutes exist.
32:36 You know, most academics and scientists,
32:37 if you tell them that something is
32:39 absolutely true, they'll be very
32:41 skeptical of that, because of the
32:44 pre-rational stages, where people would
32:46 claim absolute truth in God or absolute
32:50 truth in Jesus or whatever,
32:51 scientists have become very skeptical of
32:52 that. Where's the evidence, you
32:54 know? Where's the evidence that
32:57 Noah's ark actually existed? Where is the
32:58 evidence that miracles are real?
32:59 You can't claim these things as
33:01 absolutes without the evidence.
33:03 In general, there's a very
33:04 skeptical attitude towards any kind of absolutes,
33:08 to the point where it's
33:11 denied that absolutes are even possible.
33:13 Uh many in the scientific field just
33:14 believe that there are no such thing as absolutes.
33:21 Rationalism is quote, "If something is
33:22 mathematically true, you can be
33:24 absolutely sure of it because it's a
33:26 mathematical proof and unarguably
33:29 correct." End quote. And that's the
33:31 ideal that we're aiming for.
33:33 It's the idea of
33:35 the objective correctness of reason and proof.
33:38 It's the notion that you can just
33:42 guarantee truth if you follow certain
33:44 correct procedures.
33:46 It's the attempt to rid all subjectivity
33:48 from understanding and from truth
33:51 seeking. The problem of epistemology is
33:55 solved by eliminating subjectivity.
33:57 See, that was the real problem in the
33:59 pre-rational era is that, you know, you
34:02 had all this subjectivity. People
34:03 believed in different kinds of gods and
34:06 different kinds of mystical woo because,
34:07 you know, different cultures have their
34:09 own subjective biases and so forth. And
34:11 nobody was conscious of their biases and
34:13 they were just living out their biases.
34:15 And so, you know, to develop a formal
34:16 science, we need to get away from all
34:20 that. We need to objectivize all
34:23 that subjective stuff. And we do that
34:25 by, you know,
34:28 by the methods of academia.
34:30 This is done ideally by inventing some
34:32 kind of mechanical process for
34:35 distinguishing truth from falsehood.
34:37 Rationalism is the idea that truth and
34:38 falsehood can be distinguished
34:41 mechanically without intelligence.
34:43 This is critical.
34:46 Mechanically without intelligence, a
34:48 machine that finds the truth. That's the
34:51 rationalist ideal in this. There is no
34:54 fuzzy, soft human judgment involved.
35:01 It's just a machine.
35:02 Ideally, we would have an AI that would
35:05 give us all the answers, right?
35:06 And right now, even today, you know,
35:08 people are still under under this idea
35:10 that you can just invent an AI that's
35:13 going to tell you the truth,
35:14 right? Isn't that what's going to happen
35:16 in a decade is we just have an AI and
35:18 you just ask the AI anything and the AI
35:19 tells you what's true and then you just
35:21 believe it and that's it. That's all
35:22 there is to epistemology. Leo,
35:24 epistemology has been solved. All we
35:26 need is just a better AI that just has
35:30 more CPU power, GPU power that can just
35:32 crunch more numbers and it'll solve all
35:34 these philosophical fuzzy problems that
35:36 you've been talking about. Actualized will
35:38 become irrelevant because we'll just
35:39 have an AI that tells us all the answers.
35:46 Rationalism assumes that that's possible.
35:50 Rationalism assumes that the problem of
35:52 self-deception can be solved by just
35:53 being hyperrational, rigorous, formal,
35:56 and mechanical.
36:00 This uh this famous adage of, you know,
36:03 quote, shut up and calculate.
36:05 Uh about a hundred years ago when they
36:06 were developing quantum mechanics, you
36:08 know, the fathers of quantum mechanics,
36:10 uh they were arguing about, you know,
36:11 well, what does it really mean? Is it a
36:13 particle, really? You know, is it a
36:15 particle? Is it a wave? Is it here? Is
36:17 it there? What what does quantum
36:18 mechanics really mean? They were trying
36:19 to figure out the interpretations and
36:21 and one of the physicists
36:22 just was this kind of extreme
36:25 rationalist type and he just said, "You
36:26 guys are arguing about all this
36:28 philosophy. Just shut up and calculate.
36:30 Shut up and calculate and then that will
36:33 be enough to understand reality.
36:36 Or will it?"
36:39 Now look, sometimes it is appropriate to
36:40 just shut up and calculate. Sometimes
36:42 you don't have enough information to do
36:44 your philosophy on. And so maybe it's
36:46 right to shut up and calculate. Imagine
36:49 if you applied that attitude towards
36:51 understanding everything that I teach
36:53 for example with actualize.org. How far
36:55 would that get you?
36:57 How deeply would you understand reality
36:59 with that approach?
37:01 See, a rationalist believes that there's
37:02 nothing more to reality than just calculations.
37:08 If reality is just a computer,
37:11 if that's all it is, then all we need to
37:12 do is apply that computational approach
37:14 and eventually that's going to get us
37:18 the closest to the truth.
37:21 But is reality just a computer?
37:28 Uh, rationalism is the firm belief that
37:31 this is the right way to go.
37:35 We need to be harder, more rigorous,
37:37 rid the sciences of all the fluff, all
37:41 the woo, all the human sentiment.
37:42 And when a rationalist is confronted
37:44 with epistemic and metaphysical
37:46 problems, he just doubles down on the
37:49 rigor. Technical rigor is the solution.
37:50 We're just not being rigorous enough.
37:53 That's our problem.
37:55 Rationalism is a faith that there exists
37:58 this master method that quote guarantees
38:00 a correct algorithm for rational thought
38:03 and action end quote.
38:05 It's the default worldview of people in
38:07 the technical work and technical fields
38:10 like STEM, programming, business, hard
38:13 science, academia, western medicine.
38:16 Rationalism is a kind of group think and
38:19 a kind of culture that permeates these fields.
38:28 And academics swim in it like fish in water.
38:33 It's an overestimation of pure reason to
38:34 solve problems and to understand
38:36 reality. It's an overconfidence in
38:38 logical frameworks. It's treating
38:41 reality like a computer system. Reality
38:42 is just an algorithm. It's just an
38:43 equation. All we got to do is just find
38:45 a few more physics equations and then
38:47 we'll solve reality
38:51 because everything's just computation.
38:52 Reality is just math equations after
38:57 all. Right? That's all it is.
38:59 Rationalism is truth seeking reduced to
39:01 scientific studies, lab experiments, and
39:02 research papers. That's how you do truth
39:05 seeking. You don't sit around and do
39:07 philosophy. You do research studies and
39:09 then you get your colleagues to confirm
39:11 your research studies and then you
39:13 publish in papers. I mean, you
39:16 publish a paper in journals, you
39:19 know, credible journals that have high,
39:24 rigorous standards for proof and
39:26 editorial controls and stuff
39:30 like that, rigorous peer review.
39:31 Rationalism is the treating of
39:33 rationality as an absolute. It turns
39:36 reason into God.
39:38 That's a poetic way of describing what
39:40 it is. It's the notion that reason is
39:42 the highest intelligence.
39:44 It's the explicit or implicit notion that
39:46 reason should be king over mind and
39:49 reality. Rationalism is the pretense
39:50 that you are a rational being who makes
39:53 sense of reality rationally and acts rationally.
39:57 It's the notion that you can build up a
39:59 rock-solid system of understanding from
40:02 rock-solid first principles. This is
40:05 sort of the Cartesian
40:09 project that René Descartes attempted,
40:11 which sort of launched the
40:13 scientific revolution in Europe. This
40:15 idea that you sit down and you
40:17 question everything down to rock-bottom first
40:18 principles, and then from that you
40:21 build up your system of science based
40:23 on that, and then it
40:25 becomes indubitable. You can't doubt it
40:26 because, you know, it's built on
40:30 solid first-order logic.
40:32 That's just deductive.
40:34 This rationalism is a kind of autistic
40:36 nerd approach to explaining and
40:38 understanding reality. It's a set of
40:41 ideals. It's a vibe. It's an aesthetic.
40:43 It's a culture.
40:46 It's this vibe of facts don't care about
40:48 your feelings. Hard science, logic, and
40:50 math are what's real, and everything
40:52 else is just soft human sentiment.
40:55 Rationality is just facts and logic and
40:56 science and everything else is just pseudoscience.
40:59 It's a dismissive attitude towards the
41:01 soft sciences because they're inferior
41:03 to the hard sciences
41:05 because the closer you get to atomic
41:08 physics, the more real, since what's real is
41:10 atomic physics. And the higher
41:12 up you go, you know, chemistry, biology,
41:14 sociology, politics, the further
41:17 you get away from atomic physics,
41:20 the less real it gets.
41:23 That's what the rationalist believes.
41:25 It's the attempt to apply the methods of
41:27 hard science to the soft sciences and to
41:30 the social domain. So the rationalist
41:33 doesn't just believe that, you know,
41:35 atomic physics is what's ultimately
41:38 real. But then that attitude is applied
41:40 to the social domain as well. For
41:42 example, Marxism is supposed to be a scientific
41:48 approach to economics and to politics.
41:52 Whereas before that, people had their,
41:53 you know, political ideas and so forth. When
41:55 Karl Marx came along, the whole appeal of
41:56 Marxism was that it was supposed to be
41:59 scientific. You know, scientifically,
42:02 mankind is advancing from
42:05 the feudal stages to the capitalist
42:07 stages to the socialist stages, and then
42:09 eventually we're going to get to
42:10 the communist stages, and this is,
42:12 you know, all pre-ordained and set in
42:13 stone, kind of thing. This is all
42:15 scientific. And then people actually
42:17 start to believe that Marxism is
42:19 not just a political theory, it's science,
42:25 which makes it more true and then makes
42:28 people believe in it even harder and
42:30 then of course what you get in practice
42:35 is not what Marxism predicted
42:36 and then you start to wonder well wait a
42:38 minute where's the disconnect Marxism
42:39 was supposed to be this rational
42:42 scientific system and then we get all
42:44 this crazy chaos from it. Why is that
42:46 the case? See, this is an example of
42:50 rationalism failing when it
42:52 collides with the, you
42:54 know, the messiness and the chaos of the
42:55 real world, especially the social domain.
43:00 Rationalism is this attitude of owning
43:02 people with facts and logic.
43:04 Reason must rule the passions, reason
43:06 above emotions and woo, the belief that
43:08 many of the world's problems can be
43:09 solved if only people were just a little
43:11 bit more rational.
43:13 The conviction that the success of math
43:15 and science proves that rationalism must
43:18 be correct because after all we landed a
43:20 man on the moon. Therefore, science and
43:23 logic are true. So what's like what is
43:26 there to do philosophy about?
43:27 We know how to get to the truth. It's
43:29 just this the kind of stuff that got us
43:31 to the moon.
43:33 Let's just keep doing more of that and
43:34 then eventually we'll just understand
43:35 everything there is to understand about
43:37 the world.
43:38 It's the attempt to explain everything
43:41 with physics and with evolution.
43:43 Rationalists love to invoke
43:45 evolution, evolutionary theory. You
43:47 know, why do men cheat so much? Well,
43:49 evolution. Why do women cheat so much?
43:52 Well, evolution. Why this? Why that?
43:53 What are men attracted to? What are
43:54 women attracted to? Well, it's just evolution.
43:58 These kinds of explanations. You've read
43:59 this kind of stuff. You've heard this
44:00 kind of stuff. You've heard these kind
44:03 of arguments being made. And a lot of it
44:05 is this kind of bro science kind of
44:07 stuff, right? So there's an element to
44:09 this whole rationalism project which is
44:12 like hardcore serious academics with
44:14 PhDs. That's like the really rigorous
44:15 stuff. But then it kind of
44:18 devolves as it permeates. This is
44:19 a culture, right? It permeates through
44:21 society and through culture. So it's not
44:24 just academics who are doing this. Bros
44:26 on Reddit, you know, your keyboard
44:28 warriors on Reddit who are, you know, in
44:30 the science subreddits, you know,
44:31 arguing about science versus religion
44:33 and atheism and theism and this kind of
44:36 stuff, right? These Reddit bros are also
44:38 creating their own kind of, you know,
44:41 bro science, you know, uh, fitness
44:43 influencers and weightlifters and so
44:45 forth. You know, they are also invoking
44:47 this kind of thing. They're not just
44:49 weightlifting. They're doing
44:51 weightlifting in a scientific way, they
44:53 think. You know, they're
44:55 measuring the muscle size,
44:56 the muscle mass, and they're measuring
44:58 the chemistry and all this kind of
44:59 stuff. And then they're invoking
45:01 this kind of scientific rationalist language.
45:04 And then they bring that into their
45:06 debates about mysticism and about
45:08 religion and about spirituality. They
45:10 bring all that in. Right? This entire
45:12 paradigm gets brought in. See,
45:14 rationalism is like a lens through which
45:16 you make sense of the whole world.
45:17 That's why it's so important. That's why
45:19 we're deconstructing it: because it
45:21 doesn't just influence how you do
45:23 science. This isn't just a topic for scientists.
45:30 Because when we start discussing and
45:32 arguing about what reality is, how it
45:34 really works,
45:37 you're going to make appeals. Religious
45:38 people will make appeals to their, you
45:39 know, to their Bibles and so forth. And
45:42 then science, science geeks and so forth
45:43 will make appeals to science and
45:45 evolutionary biology and physics and
45:48 quantum mechanics or whatever else to
45:51 rationalize and justify their views. And
45:52 this becomes very dangerous.
46:01 Core to rationalism is this idea of
46:04 reductionism. Reductionism is the notion
46:05 that you can reduce everything to
46:07 particles, physics, equations,
46:10 computation or mechanics.
46:12 Everything complex and fuzzy can
46:14 ultimately be reduced down to hard
46:15 physical systems like atoms and
46:18 equations. This includes all of ethics,
46:20 aesthetics, art, religion, philosophy,
46:22 sensemaking, spirituality. Even all of
46:25 it can just be reduced down to neurons
46:28 in the brain
46:30 which are just atoms ultimately. It's
46:34 just all chemistry, right? Or is it?
46:37 Well, the rationalist believes it is.
46:40 Rationalism is generally unholistic.
46:43 Its approach to reality is to study
46:45 very technical, specialized,
46:49 local aspects of reality, understanding
46:51 those and then just stitching those
46:52 together. And this is sort of the
46:54 academic approach. You know, you have
46:55 your you have your physics department,
46:56 you have your chemistry department, you
46:58 have your biology department, even in
46:59 your physics department. It's subdivided
47:02 into 20 different subfields of physics.
47:03 And then, you know, you just specialize
47:05 in your little narrow field. And you got
47:06 to do that because otherwise, you know,
47:08 you're not gonna have enough mental
47:10 bandwidth to understand all the crazy
47:11 technical detail of each one of those
47:13 subfields. So you just specialize in
47:14 your narrow little field. You make that
47:16 your own niche and then you devote the
47:18 rest of your life just studying that.
47:19 You're humble. You're not too arrogant.
47:21 You don't try to master every field
47:23 because that's impossible. And so you're
47:25 just very humble. This is kind of the
47:26 rationalist ethos. You're humble. You
47:28 focus on your little specialty. And
47:29 everybody does that. And together we
47:31 will collect this database of knowledge
47:32 and understanding. And then eventually
47:33 that's how we'll understand the whole universe.
47:45 Because,
47:48 see, reductionism in itself is unholistic,
47:50 because reductionism believes that
47:51 reality is just nothing more than the
47:53 sum of its parts. In the end, everything
47:54 is just atoms. So if you just know the
47:56 position of every atom in the universe,
47:57 you're going to know everything in the
47:58 universe, and that's all there is to
48:00 truth seeking. That's the fundamental attitude.
48:05 It attempts to fit complex realities
48:07 into simplified models. So a lot of
48:10 rationalism is all about modeling stuff,
48:11 and they model and model and model,
48:13 and eventually they start to take
48:17 their models as reality itself.
48:18 Rationalism is the belief that
48:20 rationality, logic and science are not
48:21 relative. They're not subjective and
48:23 they're not biased. This is like
48:26 objective truth. You know, humans have
48:28 mathematics and logic and if we meet
48:29 aliens in the future, they will also
48:31 have the exact same mathematics and
48:33 logic. That's the idea because this is
48:36 universal, right? Math and science are universal.
48:39 It's not that each culture has its
48:42 own math; there's not Asian math and then
48:45 African math and then European math.
48:52 Rationalism is also this kind of masculine,
48:57 thinker-personality-type approach.
48:59 On the Myers-Briggs you have your
49:00 thinkers and you have your feelers. Well,
49:02 rationalism forgets all about the
49:03 feeling. You can't
49:04 understand reality through feeling; it's
49:06 just all thinking, and it's all this
49:07 masculine approach. And that's not a bias,
49:11 that's just objectively how reality is.
49:12 It's the subordination of the right
49:15 hemisphere to the left hemisphere.
49:17 It's this overly left-brained, autistic
49:19 view of reality.
49:21 It's performative formality, rigor, and
49:23 objectivity. It's this fantasy that
49:24 you're being very strict and rigorous
49:26 and literal and objective and factual
49:29 and scientific when often times you
49:38 So you see hardcore scientists subscribe
49:41 to this kind of paradigm, but also
49:43 people who are not scientists
49:49 subscribe to it too.
49:52 it's this paradigm that by by being this
49:55 science nerd that you're actually being
49:58 very rigorous and that you're doing the
50:01 best thing possible for having the
50:04 cleanest, purest, truest epistemology.
50:05 It's this idea that you can solve the
50:08 problems of epistemology just by being
50:11 ultra scientific.
50:13 A great example of this kind of
50:15 performative formalism is Bayesian reasoning.
50:19 Anyone who talks about Bayesian reasoning,
50:23 well, I'm not going to explain it here, but
50:26 it's really nonsense. Pure nonsense.
50:28 But these people who take
50:30 Bayesian reasoning seriously, and
50:31 who think that this is how you solve
50:33 epistemology, through the
50:35 application of Bayesian reasoning,
50:38 this is an example of this kind of
50:43 performative formality or rigor.
50:45 See, you can act as though you're using
50:47 your mind in a rigorous way and think
50:48 that you're actually getting to the
50:51 truth, but you actually aren't. This is
50:54 the trap.
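For readers who haven't met it, "Bayesian reasoning" in its textbook form is just Bayes' rule used to update a degree of belief. A minimal sketch, with numbers invented purely for illustration:

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Posterior P(H|E) via Bayes' rule."""
    # Total probability of seeing the evidence at all.
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1 - prior))
    return p_evidence_given_h * prior / p_evidence

# Prior belief 0.5; the evidence is twice as likely if the hypothesis is true.
posterior = bayes_update(0.5, 0.8, 0.4)
print(round(posterior, 3))  # 0.667
```

The formula itself is uncontroversial arithmetic; the criticism above is aimed at treating this update rule as a complete solution to epistemology, since the priors and likelihoods still have to come from somewhere.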
50:55 Rationalism is the belief that your
50:57 worldview is logical and consistent when
50:59 it isn't.
51:01 It's the attempt to model humans as
51:04 rational agents. You know, a lot of uh
51:06 economic models try to do this and then
51:08 they get all sorts of wacky results
51:11 because humans are not just simple
51:13 rational agents the way that you assume.
51:15 It's the notion that philosophy and
51:17 metaphysics are unnecessary, or that they're
51:21 nonsense, or that they're impossible.
51:22 Philosophy is nothing more than mental
51:25 games and linguistic confusion. You're
51:26 not actually solving any kind of
51:29 problems by doing philosophy. The only
51:30 problems you can solve are by doing science.
51:34 Philosophy doesn't actually give you any
51:37 kind of answers or understanding because
51:40 it's all just so subjective and fuzzy.
51:41 But science, you know, gives you hard
51:45 answers, gives you facts.
51:46 It's the attitude that science does not need philosophy.
51:52 It's the belief that reality is made out
51:55 of crisp, definite objective categories,
51:56 thinking in rigid categories and
51:58 dualities. It's the assumption that
51:59 there is a set of concrete objective
52:02 facts about an external world which is
52:05 uh mind independent. There just is an
52:07 objective world out there that's
52:10 independent of our minds and all we got
52:11 to do is just get the fuzzy stuff out of
52:13 the way to understand that world and
52:17 then we're good. Epistemology is solved.
52:19 Rationalism is the elevation of the
52:21 objective over the subjective. It takes
52:23 objectivity for granted as more real
52:25 than subjectivity.
52:27 The objective stuff is what's real. The
52:28 subjective stuff that's just human
52:31 sentiment doesn't matter. Subjectivity
52:34 is just the airy fairy mystical woo
52:37 which is corrupting science that we got
52:39 to get rid of.
52:41 Rationalism places rational concepts,
52:43 models, abstractions above first person
52:46 subjective experience. It even denies
52:48 that first person subjective experience exists.
52:52 It even denies the reality of consciousness.
52:56 You know, famously, Daniel Dennett,
52:57 who I would characterize as a
53:00 rationalist philosopher,
53:04 died recently. Daniel Dennett wrote a
53:05 whole book
53:07 trying to argue that consciousness
53:09 is just an illusion and that all of it
53:11 ultimately boils down to just atoms.
53:18 Uh rationalism believes that first
53:20 person experience is unreliable and not
53:22 proof of anything.
53:24 It requires that all truth be
53:27 third-person verifiable. Somebody else
53:28 has to verify the truth for you.
53:36 Rationalism doesn't believe anything
53:38 unless it is verified by peer-reviewed
53:40 studies and research papers. A
53:42 rationalist will only believe in God if
53:44 there's a peer-reviewed research paper
53:47 on God.
53:49 It's preoccupied with justifying and
53:50 demonstrating truth as part of a
53:53 scientific consensus. Right?
53:55 In a sense, the rationalist does not
53:59 have a notion of personal truth.
54:02 There's only truth as part of a collective,
54:10 not really a truth at all. A personal
54:11 truth is just some subjective stuff that
54:13 you made up, you know, to please
54:17 yourself. Whereas the real truth is all
54:21 uh third person verified.
54:22 Rationalism is this kind of game of
54:24 formal academic rigor. The standards of academia.
54:27 It believes that everything true is
54:30 accessible to the scientific method. It
54:33 takes the scientific method as dogma, and it
54:36 denies any limits to the scientific method.
54:39 Scientific method is able to grasp
54:40 everything that's true and everything
54:42 that's real.
54:44 Rationalism is also often pragmatism.
54:46 It's this pragmatic notion of truth.
54:48 This sort of Richard Rory notion of
54:51 truth. Um truth just is what is
54:55 effective. It is utility and value.
54:59 It confuses truth with survival.
55:07 So this is sort of a little corollary to
55:10 the pragmatic thing: it's
55:12 the belief that science doesn't really
55:14 discover truth or prove anything. It's
55:16 just useful. So see the rationalist
55:17 wants to have it both ways. On the one
55:19 hand the rationalist just wants to say
55:21 that science is the truth and everything
55:23 else is just nonsense. On the other
55:25 hand, he also wants to say when you
55:26 start to question him about truth, about
55:29 whether science is really truth, then
55:30 you're going to back him into a corner,
55:31 and he's going to say, "Well, wait, wait
55:33 a minute. Actually, you know, um,
55:34 science doesn't really even care about
55:36 the truth. Science isn't about truth.
55:43 So, it kind of tries to play it both ways.
55:47 Rationalism is a notion that moral
55:48 questions can be quantified and
55:50 calculated. So there's this attempt to
55:52 even apply rationality to moral
55:54 questions, which is something that, for
55:58 example, Sam Harris likes to do.
56:06 It tries to rationalize morality, I guess
56:11 you could say.
56:13 Rationalism is the denial that
56:15 rationality has limits. It's skepticism
56:17 towards beliefs not grounded in rational argument.
56:20 It's belief in a sharp distinction
56:23 between science and pseudoscience, and
56:26 therefore an allergy to pseudoscience.
56:27 Right? So the rationalist believes that
56:29 there is such a thing as science and
56:31 then everything else that's not science
56:34 is by definition pseudoscience and that
56:36 there's a sharp distinction between
56:38 those two things.
56:40 Rationalism is a close-mindedness to
56:42 anything mystical. It's an attitude of
56:44 ridicule and mockery for anything
56:46 mystical, woo, traditional, religious,
56:49 or new age. It's a stubborn refusal to
56:51 take anything mystical seriously. It's
56:53 the assumption that anyone
56:55 talking about mystical things is just
56:57 sloppy and delusional, not using their
56:59 mind properly, not following the proper
57:02 procedures of thinking.
57:05 Uh because in reality, nothing can be
57:07 mystical, right? There is no mystery to
57:10 the universe. It's just mechanics.
57:12 And therefore, it's the impulse to
57:14 demystify everything.
57:20 And of course, therefore, it's a
57:21 rejection and denial of mystical
57:24 experience. Mystical experience means
57:25 nothing because it's just
57:28 hallucinations. Soft, fuzzy human
57:30 sentiment, wishful thinking. That's all
57:32 that it is. There's no truth in it.
57:34 There's nothing deep about it. It's not
57:36 better than science. You can't
57:37 understand things through mystical
57:39 experiences because you can't verify
57:41 them. You know, you had some
57:43 hallucination. Okay, but so what? You
57:44 saw a vision of Jesus. Okay, but so
57:46 what? That doesn't mean it's real. It
57:47 doesn't mean that it's true. It doesn't
57:49 mean that, you know, other people see it
57:51 too. It can't be reproduced in a
57:53 laboratory. Can't be reproduced in a
57:54 double-blind, placebo-controlled study.
57:58 Therefore, it's not real.
58:00 So, rationalism rejects experience that
58:02 cannot be formalized.
58:03 So, it wants experience to be formalized.
58:08 A thing can only be real if it can be
58:10 made explicit and repeatably testable in
58:13 a controlled laboratory environment.
58:15 Another way to think about rationalism
58:17 is that it's an attempt to deny God with reason.
58:21 It's an extreme skepticism applied to
58:24 everything immaterial and woo. But
58:26 notice this is a very biased application
58:29 of skepticism because it never applies
58:31 that same level of skepticism towards
58:34 materialism, scientism, rationalism itself.
58:39 If a thing doesn't fit rationalism, then
58:41 it is assumed to be irrational and false.
58:47 It is a kind of hyper secularization.
58:49 Everything must be secularized, right?
58:51 We need to move away from all the old
58:53 superstitious religious dogma of you
58:56 know the medieval era and move into a
58:58 modern, technologically advanced, you know, techno-utopia
59:03 uh which is free of any mystical woo.
59:11 It's like Leo why why call it God? Don't
59:14 call it God, call it nothing.
59:16 Because the God thing, it's like, uh,
59:18 it's icky. There's an allergy to God.
59:20 There's an allergy to anything mystical,
59:23 anything religious.
59:25 Rationalism dismisses intuition,
59:26 tradition, lived experience, emotions,
59:29 qualia, even consciousness. It often
59:31 ignores contextual, emotional, embodied
59:33 forms of intelligence. Intelligence is
59:36 just the kind of uh
59:38 technical intelligence,
59:40 the kind of intelligence that's measured
59:41 by IQ. That's real intelligence.
59:42 Everything else is not really intelligence.
59:50 Rationalism is the belief that
59:51 statements that can't be subjected to
59:53 the scientific method
59:56 are not just false, they're not even
59:59 false, they're just meaningless.
60:01 For example, a statement like God is love, that's not true, and it's not even
60:04 false. It's just meaningless,
60:07 soft-brained, word-salad nonsense. It
60:09 means nothing.
60:17 Everything is a system to be analyzed. Analysis is the method. Through
60:20 analysis, we can understand all of
60:21 reality.
60:23 Rationalism treats consciousness as an
60:25 algorithm.
60:27 Rationalism, you can think of it as a
60:29 fantasy of control over reality or
60:31 infinity. It's an illusion of certainty
60:33 and security in a very chaotic,
60:35 epistemically chaotic world.
60:38 It's a craving for guaranteed solutions.
60:41 That's what science offers, guaranteed
60:43 answers. And you don't need to rely on
60:45 yourself. See, the rationalist has this
60:47 allergy of relying on their own
60:49 intuitions or their own convictions
60:51 about the truth. It's like the scientist
60:53 doesn't want to believe their own
60:56 beliefs. In a sense, what they want is
60:58 they want an experiment to tell
61:00 them what is true, to give them the
61:02 guaranteed answer so that they don't
61:04 bullshit themselves.
61:10 Rationalism tries to create rigorous ways of thinking as a safeguard against
61:12 delusion without realizing that these
61:14 limits will limit consciousness and
61:16 limit intelligence and create their own
61:17 delusion.
61:19 See, that little wrinkle never
61:21 occurs to the rationalist.
61:24 Rationalism is a psychological aversion
61:26 to ambiguity, fuzziness, relativity,
61:28 construction,
61:30 psychology, and metaphysics. It's the
61:33 assumption that making understanding
61:35 more and more technical makes it more
61:37 truthful.
61:39 For example,
61:47 you know, in universities, people will write 200-page proofs and, you know,
61:50 PhD theses on 1 plus 1 equals 2. They
61:55 can't just accept that 1 plus 1 equals
61:57 2. They can't intuit that. No, no, no,
61:59 no. That's too fuzzy. That's too
62:01 subjective. We need a 200-page proof
62:03 proving that 1 plus 1 equals 2. Then we
62:06 can be sure. It's this kind of attitude.
62:08 And that might sound ridiculous, but you
62:11 know, very intelligent people have
62:13 devoted their whole lives just to
62:15 write 200 pages of proofs on why 1 plus
62:17 1 equals 2. And their proofs fail.
62:20 See, [laughter]
62:21 it's so hard to technically prove that 1
62:23 plus 1 equals 2, that, you know, when
62:25 Bertrand Russell and Gottlob Frege
62:27 tried to do this, you know, 100 years
62:29 ago, they tried to, you know, Bertrand
62:31 Russell wrote a thousand-page logic book
62:34 just to prove that 1 plus 1 equals 2
62:35 in his logic, and still it failed. That's
62:39 how difficult that is. But that's the
62:41 attempt that they're trying to do.
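As an aside, here is how that formalization project looks today in a proof assistant. This is a sketch in Lean (the theorem name is my own), where the statement closes in one line only because the heavy foundational work Russell attempted by hand is packed into the library's construction of the natural numbers:

```lean
-- Both sides reduce to the same numeral, so reflexivity closes the goal.
-- The real work (defining Nat, addition, and numerals) lives in the library.
theorem one_plus_one_eq_two : 1 + 1 = 2 := rfl
```

In *Principia Mathematica*, the corresponding proposition famously arrives only hundreds of pages into the work.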
62:43 Uh, it's this horror and terror at
62:46 insanity. The rationalist really
62:47 struggles with insanity because all of
62:50 rationalism is about maintaining sanity,
62:51 and they're not aware of that.
62:53 Rationalists consider something like,
62:55 you know, a philosophy like solipsism.
62:56 They would consider that absurd. You
62:58 know, that's absurd.
63:01 Uh, rationalists might be obsessed with
63:02 IQ and formal measures of intelligence
63:04 like that. In a sense, you can think of
63:06 rationalism as apologia for scientism,
63:08 materialism, atheism, pragmatism, and
63:10 reductionism. It's a set of
63:13 rationalizations explaining why those
63:15 things are true and the correct way of
63:17 understanding the world. It's
63:18 rationality as dogma, the analog of
63:21 scientism. It's close-mindedness to
63:23 ideas which do not fit conventional
63:25 notions of rationality.
63:28 It's the arrogance of rationality. You
63:30 notice that rationality can be quite
63:32 arrogant.
63:34 It is rationality which is not
63:37 self-aware. It is science which is not
63:39 self-aware.
63:41 Ultimately, it's a stage of cognitive
63:42 development:
63:44 Spiral Dynamics stage Orange, Green, and Yellow,
63:47 these are really the rational stages. On the
63:50 Susan Cook-Greuter
63:51 model, the nine stages of ego
63:53 development, it's the Achiever and the
63:55 Expert stage, and even stages beyond that.
64:04 And ultimately, we could call it, you know, I informally kind of call it to myself:
64:06 rationalism is logical woo.
64:13 So, okay, that's what rationalism is. We've defined it to death. Um, now this
64:16 is, granted, kind of an exaggerated
64:19 stereotype, right? I've taken
64:20 rationalism as a definition to its
64:22 ultimate extreme. This would be like a
64:25 100 out of 100.
64:28 Few people are this cartoonishly
64:30 rationalist.
64:32 Usually, you're not going to be 100 out
64:34 of 100. You're going to be 40 out of
64:35 100, maybe 30, maybe 80 out of 100, you
64:38 know, depending on, you know, where your
64:41 sensibilities lie. But you'd be
64:43 surprised. You'd be surprised how many
64:44 people are quite cartoonishly
64:47 rationalist, especially when you start
64:49 to argue with them, right? See, usually
64:51 when a human being is just going through
64:53 normal life, they're not thinking about
64:54 these kinds of deep existential topics,
64:57 then their rationalism doesn't really
64:59 come out because it doesn't have any
65:00 reason to express itself.
65:03 But if you take like a hardcore
65:05 academic physicist from MIT and you
65:08 sit him down and you start to drill him
65:10 on these philosophical issues, like, well,
65:13 what is real? Is the mind real? Is
65:15 matter real? Are numbers real? You know,
65:18 are emotions real? You start to, you
65:19 know, drill him down on this, and then
65:21 you start to talk to him. Well, what
65:22 is truth? How do we distinguish
65:24 truth from falsehood? And all this kind
65:25 of stuff. You start to do that, you're
65:28 going to see his rationalism come out,
65:30 and it's going to look quite silly and
65:32 cartoonish.
65:34 Um, but of course
65:37 that's going to make him look bad. So,
65:38 he's going to try to save face and he's
65:40 going to try to soften it up because he
65:42 doesn't want to come off as just this
65:44 like logical robot. And, you know, most
65:46 people are not logical robots.
65:49 Um, but then again, I mean, again, you'd
65:51 be surprised at how many are. Here's a
65:53 list of examples in case you think I'm
65:54 just like straw-manning and
65:55 exaggerating. Here's a list of examples
65:57 of people who fall to some degree into
66:01 this kind of paradigm. Uh,
66:04 we're going to start with the best one,
66:05 which is uh LessWrong.com, which is
66:07 founded by Eliezer Yudkowsky. So this guy
66:10 is like the poster boy of rationalism.
66:16 Um, he developed this whole website, LessWrong.com, and this whole community of
66:19 people who are trying to solve the
66:21 problem of epistemology, but they're
66:23 trying to solve it purely in the
66:24 rationalist paradigm. You know, it's
66:26 just about how to think more logically,
66:28 how to be aware of your cognitive
66:29 biases, this kind of stuff. And look, that's
66:31 good: it's teaching people how to be
66:33 rational as opposed to pre-rational.
66:36 That's great, but then they don't realize
66:38 there's something beyond that. I mean,
66:41 they call themselves Less Wrong, and
66:42 that's right, they are less wrong, but
66:46 less wrong is still wrong. Ultimately, at
66:48 the end of the day, it's still wrong,
66:49 right? So ultimately, in this series, what
66:53 we're trying to describe is what's wrong
66:54 with LessWrong.com. And there is
66:56 something wrong with it; it's just uh
66:57 difficult to put your finger on it. Other
67:00 examples of people like this are, for
67:02 example, Sam Harris, Richard Dawkins,
67:04 Daniel Dennett, Michael Shermer, Lawrence
67:06 Krauss, Max Tegmark, Joscha Bach,
67:10 uh Neil deGrasse Tyson to some degree,
67:12 Tim Maudlin, Eric and Bret Weinstein to
67:15 some degree, Peter Boghossian, these kinds
67:18 of characters. You can find interviews
67:19 with these people online. If you watch a
67:21 lot of science videos and philosophy
67:23 videos on YouTube, you'll find these
67:25 people doing podcasts and doing
67:26 interviews and doing debates, this kind
67:28 of stuff, sharing their worldview.
67:30 You're going to see all of it in some
67:32 sense is kind of rooted in rationalism
67:35 to various degrees. So, I'm not saying
67:36 they're all 100 out of 100, these cartoon
67:38 characters, but to various degrees, you
67:40 know. Like Richard Dawkins is a great
67:42 example of a rationalist. You know,
67:44 he's pretty high up there as a
67:46 cartoonish rationalist. Um,
67:50 who else? Professor Dave McQue. This
67:53 channel called Decoding the Gurus.
67:56 Uh, Curt Jaimungal's channel. I have an
67:59 interview, I have like a 10-hour
67:59 interview. If you haven't seen it, go
68:01 watch it on Curt Jaimungal's channel,
68:02 called Theories of Everything. So Curt
68:04 Jaimungal interviews a lot of these kind
68:06 of academic, scientific type of people,
68:07 physicists and so forth, and they try to
68:10 understand reality, but you can see he
68:12 and the people he's interviewing,
68:13 they're never going to understand
68:14 reality. He does interview some
68:16 spiritual people, and that's fine. Like,
68:18 he interviewed Rupert Spira, uh, whoever
68:20 else, Frank Yang and so forth. So, you
68:23 know, he's open-minded to mystical kind
68:25 of stuff, but still, you can see that
68:28 fundamentally these people will never
68:29 understand reality at the levels that
68:31 I'm trying to teach it at because
68:32 they're stuck in this academic
68:34 rationalist paradigm.
68:36 Um,
68:42 this kind of uh cultural phenomenon of debunking, these debunker-
68:44 type of people, these are rationalists,
68:45 usually. Another example is this AI Doom
68:49 Debates channel, run by this guy called
68:51 Liron Shapira,
68:56 tricky name. Um,
68:58 he's a follower of the LessWrong
69:01 community and uh Yudkowsky. So he's a
69:05 good example of a kind of rationalist.
69:07 Um,
69:09 yeah, this whole AI doom debate. Oh my
69:12 god, it's just rationalism
69:15 crap. Um, new atheism, atheist debunkers,
69:19 all the atheists on YouTube, Matt
69:21 Dillahunty, Alex O'Connor, there's a
69:23 whole slew of these kind of,
69:25 you know, CosmicSkeptic, how do they
69:28 call themselves? Yeah, there's
69:29 all these YouTube uh atheist channels.
69:31 So, all of them are stuck in this
69:32 rationalist paradigm.
69:34 Uh, what I call these kind of fake
69:36 skeptics. People who apply skepticism to
69:39 all the mystical stuff and religious
69:40 stuff and all the new age stuff, but
69:43 they don't apply skepticism to their own
69:45 paradigm. These are the fake skeptics.
69:47 Um, the Reddit science bros, online
69:50 science bro culture. But it goes beyond
69:53 this. It goes even into stuff like, this
69:55 is a culture, remember, this is kind of
69:56 a subculture. It goes into areas like,
69:59 you know, Wikipedia editors will
70:01 subscribe to rationalism, that rational
70:03 standards will be used to, you know,
70:05 edit Wikipedia.
70:07 Journalists, researchers, Silicon Valley
70:10 tech bros,
70:12 um, they're generally going to subscribe
70:14 to this kind of worldview.
70:16 Uh, other examples of rationalism
70:18 include economic reductionism, various
70:20 kinds of schools of economics,
70:22 uh, logical positivism, strict
70:25 empiricism, logicism, behaviorism,
70:28 realism,
70:34 a lot of Western analytical philosophy: Bertrand Russell, David Hilbert, Gottlob
70:36 Frege. Uh, it also creeps its way into
70:40 Western medicine.
70:46 Uh, those who dismiss anything spiritual, paranormal, supernatural, mystical by
70:48 demanding proof. Uh, this James Randi
70:52 prize situation, this is pure
70:54 rationalism. Uh, evolutionary psychology.
70:58 Stephen Wolfram:
71:00 he's trying to model the whole universe
71:01 as just a set of calculations and these
71:03 little, you know, um, cellular
71:06 automata.
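As a toy illustration of the kind of model being described, here's a minimal sketch (the function name and grid size are my own, not Wolfram's) of an elementary cellular automaton, Rule 110, where a simple local update rule generates surprisingly complex global patterns:

```python
# Hypothetical minimal sketch of Wolfram-style computation: an elementary
# cellular automaton. The rule number's binary digits encode the update table.
RULE = 110

def step(cells):
    """Apply one synchronous update; edges wrap around."""
    n = len(cells)
    return [
        # Pack (left, center, right) into a 3-bit index, look up that bit of RULE.
        (RULE >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start from a single live cell and evolve a few generations.
cells = [0] * 31
cells[15] = 1
for _ in range(5):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

Each cell looks only at itself and its two neighbors, yet Rule 110 is known to be Turing-complete, which is why this family of models gets treated as a candidate substrate for physics.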
71:08 Um,
71:10 and just in general, academia, STEM.
71:14 We're living in a kind of society
71:16 that is kind of now dominated by
71:19 technology.
71:21 Technology is this like supreme power.
71:25 It creates the wealthiest
71:27 companies. All the tech companies are
71:29 now the biggest, wealthiest, most
71:31 powerful companies. They have a lot of
71:33 political influence. Of course, they
71:34 also influence academia, academia
71:37 influences them. So, there's a lot of
71:38 sort of incestuous cross-pollination
71:40 between politics, big tech, academia.
71:44 This is all going back and forth. And um,
71:47 the ideology behind all of that,
71:50 um, is infested with rationalism. Here's
71:54 an example of peak rationalism. It's
71:57 called the Church-Turing thesis. I'm
71:59 going to quote it here from Wikipedia.
72:01 It says something like this. Quote: "Any
72:04 method of reasoning other than
72:05 mathematical logic will lead you to
72:07 holding contradictory beliefs. From any
72:10 two contradictory beliefs, you can
72:11 deduce all falsehoods. Since we don't
72:14 believe all false things, our brains
72:16 must run on logic." End quote.
72:21 That's the Church-Turing thesis.
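The middle step of that quote, that from two contradictory beliefs you can deduce anything, is the classical principle of explosion (ex falso quodlibet). A minimal sketch in Lean (the theorem name is my own):

```lean
-- Ex falso quodlibet: from P and ¬P together, any proposition Q follows.
theorem explosion (P Q : Prop) (hp : P) (hnp : ¬P) : Q :=
  absurd hp hnp
```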
72:24 Uh, another example of rationalism is
72:28 something like this: here's a list of
72:30 all the cognitive biases. There's like
72:33 dozens of them, 50 of them or so. Just
72:35 study this list, learn this list, apply
72:38 this to your own thinking process, and
72:39 this will solve the problem of
72:41 self-deception.
72:43 Another
72:45 example of rationalism is, uh,
72:49 some kind of narrowly designed clinical
72:51 study
72:52 that shows that, for example, red wine
72:55 increases longevity, is good for health.
72:58 Now, technically this study has been done
73:00 properly. It followed all the proper, you
73:03 know, procedures
73:05 of how to conduct a study. But the study
73:08 is measuring such narrow variables,
73:10 right? It's not capturing the totality of
73:12 human health, only a couple of, you know,
73:15 three variables that they measured. You
73:17 know, they selected, you know, maybe
73:18 longevity, and they selected some, you
73:20 know, some protein, uh, biomarker, and then
73:24 they selected some, you know,
73:25 inflammation biomarker, and then they
73:27 measure those things and they say, well,
73:28 red wine seems to improve all these
73:30 three variables,
73:31 therefore red wine, you know, is
73:33 healthy for you. And then they publish
73:34 this study, and then, you
73:37 know, in the Nature journal, and it's a
73:38 well-respected journal. And then
73:40 everybody, you know, all the journalists
73:42 go out and say, "Oh, start drinking red
73:44 wine because it's good for you," this
73:46 kind of thing. But it never really
73:47 occurs to these kinds of people that
73:49 this is a very narrow study. It doesn't
73:51 truly capture the holism of what
73:53 human health entails, nor the diversity
73:56 of human genetics, how, you know, red
73:58 wine is going to impact different
73:59 genetic groups differently, and so forth.
74:00 All these complexities are completely
74:02 reduced down to some simplistic
74:04 thing. But it seems like it's rational,
74:07 right? It seems like this was a solid
74:09 scientific study.
74:12 But it's possible to design very narrow
74:14 scientific experiments and studies that
74:16 are going to be very misleading.
74:22 But nobody cares about that, because the researchers that were working on this
74:23 study, you know, to them, they said,
74:25 "Well, Leo, but our study was just
74:27 so technical and so limited. We didn't
74:29 come out there and say that everyone
74:30 should start drinking red wine. All we
74:32 did, we just, you know, technically
74:34 all we said is that it increases and
74:36 improves these three biomarkers, and
74:37 that's all that it is."
74:40 See, they want to have it both ways.
74:42 They want to say that. But then human
74:45 beings, when human beings are living,
74:47 right? They're not living by narrow,
74:49 nerdy, technical, you know, three-
74:51 biomarker nonsense. They need to
74:54 take scientific facts and so forth, and
74:56 they need to distill it down into
74:58 high-level
75:00 concepts and understanding that then
75:02 informs how human beings actually live
75:03 their life, what they eat, how they
75:05 exercise, how they think about the
75:07 world, how they do religion or don't, and
75:09 so forth, right?
75:12 Humans can't live on technical science research studies. And that is what the human is doing all the time: it's living, first and foremost. It's living. And even the scientists who are doing these narrow studies are, first and foremost, living, surviving; only then are they doing science.
75:31 And this can get lost on them, right? Because of the unholistic nature of rationalism, it's like, "Well, but Leo, that doesn't matter. That's beyond the scope of science. Who cares? Yeah, of course you're right, of course you can't just live by scientific studies. But Leo, I'm just a humble researcher, I'm just a humble chemist, and all I did was this little simple study. So what?"
75:51 See, it's a refusal to take epistemic responsibility.
76:01 Because no scientist really takes epistemic responsibility. All they take responsibility for is the little narrow thesis that they're working on, and nothing else. But when you do that, you can't make sense of the entirety of reality. And then the scientists will say, "Well, Leo, that's because you can't. That's too much. That's too much to ask for."
76:19 Another example of rationalism is a robot. Imagine a robot that gets stuck in a logical loop when it's told a specific set of words, which causes it to self-destruct because it can't think outside of its own logic,
76:40 because the robot just operates. It's sort of the idea behind Isaac Asimov's Three Laws of Robotics: a robot shall never hurt a person, a robot shall never do this, a robot shall always do that. It's got these three rules, and then you get these science-fiction scenarios: what happens if you put a robot in a situation where those rules start to conflict and contradict each other? What's the robot going to do? The robot is caught in its own logic, and it doesn't know what to do, because the robot is not truly intelligent. It's just following these three rules, and those three rules do not define what intelligence is.
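The deadlock described here can be sketched as a toy rule-based agent (the rule set, the scenario, and all names below are invented for illustration, not from Asimov): when its fixed rules conflict, every available action violates some rule, and the agent simply has no answer, because nothing in its rules tells it how to step outside them.

```python
def evaluate(action, situation):
    """Return the set of rules this action would violate in this situation."""
    violated = set()
    # Law 1 (simplified): a robot may not harm a human.
    if situation["harms_human"][action]:
        violated.add("law1")
    # Law 2 (simplified): a robot must obey the order it was given.
    if action != situation["ordered_action"]:
        violated.add("law2")
    return violated

def choose_action(actions, situation):
    """Pick any action that violates no rule; None means total deadlock."""
    for action in actions:
        if not evaluate(action, situation):
            return action
    return None  # every option breaks some rule; the rules give no way out

# A human orders the robot to do something harmful, so the rules conflict:
# obeying ("push") harms a human, refusing ("wait") disobeys the order.
situation = {"ordered_action": "push",
             "harms_human": {"push": True, "wait": False}}
print(choose_action(["push", "wait"], situation))  # prints None
```

The point of the sketch is that the deadlock is not a bug in any one rule; it is a property of rule-following itself, which is exactly the limit the transcript is pointing at.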
77:20 Another example of rationalism is academic philosophy. If you go into academic philosophy, how do they do philosophy? Their notion of how to do philosophy properly is: you're going to go read famous dead philosophers. You're going to read their texts in their original languages, and then you're going to dissect them into technical little arguments. What did he say in this sentence, and what did he say in that sentence? You're going to look for contradictions, or for little minor points that they made, and you're going to make your own arguments and justify them based on logic and textual evidence. You're going to cite Descartes and you're going to cite Kant, and you're going to assemble together this kind of argument. And by doing this, you are doing real philosophy, and you're doing truth-seeking, and you're really understanding something about the world.
78:07 It's important to understand that rationalism is a paradigm held by an ego, by a self, by a mind. This paradigm is doing survival. It itself is trying to survive. It fights tooth and nail to defend itself. This paradigm is not self-aware of its own survival. It is unconscious and mechanical, which limits its intelligence and its self-reflective capacity.
78:32 But it is intelligent enough to make strong technical arguments, which is what makes it so self-deceptive and so difficult to deconstruct.
78:42 Because if you're going to argue with a rationalist, like you're going to try to argue with a Sam Harris, or an Eliezer Yudkowsky, or a Ben Shapiro, you're going to argue with these people: they're very good at making arguments. They're very logical. They will present facts and evidence, and this is very convincing to a lot of onlookers. You look at these debates and you could be very convinced by a debate. Watch a debate between Sam Harris and Jordan Peterson, and you're going to see that Sam Harris makes very powerful arguments against Jordan Peterson.
79:18 But what's so funny is that ultimately Sam Harris is wrong in his conclusion. His ultimate conclusion is wrong, but his arguments are correct. How can that be? How do we explain that?
79:31 Have you ever been in an argument with such a person? They're highly rational, highly educated, very scientific, more educated than you. But ultimately, they're still making bad arguments, and they're still wrong at the end of the day. So our challenge is: how do we explain this phenomenon?
79:48 Technically, what makes this kind of paradigm and epistemology and worldview wrong? Right? Because they are making these mistakes and they are coming out wrong. And look, I'm not demonstrating why Sam Harris is wrong; we don't have time here to go into the depth of that. You just have to kind of take me at my word. [laughter] But trust me, he's wrong on very important things, but he's very good at making arguments. So yeah, how do we explain that? That's what we're trying to explain.
80:20 [snorts] Here's a quote by John Dolan that explains the gist of the problem. He says, quote, "These, briefly, are the key elements of the stereotype: Logic cripples and constrains. It forces one into narrow and mechanical modes of thought that cut one off from the vast range of superior thoughts, feelings, and perception. Logic is an enemy of wit and humor. Logic makes us dull and pedantic. Logic presupposes a simple-minded, black-and-white, yes/no conception of the world. But logic misses the point of half the things we ordinarily say and cannot match the insight of the humblest person's common sense." End quote.
81:09 That's the gist of our problem. And by the way, when I say that Sam Harris is ultimately wrong against someone like Jordan Peterson, don't get me wrong. I'm not saying Jordan Peterson makes great arguments. He makes terrible arguments a lot of the time, awful, awful arguments. And yet he is correct on some very fundamental things. So it's very ironic. How could it be that Jordan Peterson makes such terrible arguments but is correct on such fundamental things, while Sam Harris makes such great arguments but is incorrect on such fundamental things? See, that's the really interesting thing here.
81:46 And the problem with rationalists is that they don't just apply their worldview to technical domains like physics, logic, and computer science. They try to apply it to social issues, to human behavior, to social science, to politics, to economics, to morality, to ethics, to religion, to history, to business, to philosophy, and to spirituality.
82:04 But if rationalism fails to formalize even something like mathematics, which it does fail to do, which is what Bertrand Russell failed to do, it will certainly fail when applied to politics, to morality, to business, to economics, and to philosophy.
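[Editor's note: the standard formal result behind the claim that the program of formalizing mathematics failed is Gödel's first incompleteness theorem, which can be stated informally as follows.]

```latex
% Gödel's first incompleteness theorem (informal statement):
\text{If } F \text{ is a consistent, effectively axiomatized theory that interprets basic arithmetic,}\\
\text{then there exists a sentence } G_F \text{ such that }
F \nvdash G_F \quad\text{and}\quad F \nvdash \lnot G_F .
```

That is, any such system leaves some sentence in its own language undecided, and (by the second incompleteness theorem) cannot even prove its own consistency, which is the precise sense in which a formal system cannot certify its own scope from within.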
82:23 Rationalism is a kind of mind virus that infects academia and spreads outward through society and culture, into business, into politics. Even culturally it's dominant, because of the success of big tech, of science, of computer science. See, the success of science, and especially computer science, in the last 50 years has created such a huge illusion that rationality must be true, because how else can we explain the success of computer science and all the technology that we have?
82:59 And that turns out to be a very powerful illusion, difficult to deconstruct, difficult even to see what the alternative to that is.
83:05 Conservatives like Jordan Peterson love to whine about postmodernism corrupting academia, and it does to some degree. But no one out there is really articulating how rationalism corrupts academia, how modernism corrupts academia. And so, in a nutshell, this is my formal explanation of that issue, of what's wrong with the LessWrong.com, Sam Harris type of worldview.
83:33 Kant famously said, after reading Hume, that Hume awoke him from his dogmatic slumber. And so with this series, I hope to awaken rationalists and academics and scientists from their dogmatic slumber.
83:46 scientists from their dogmatic slumber. All right. Now, let's get into some
83:48 All right. Now, let's get into some really deep stuff. So,
83:51 really deep stuff. So, let's go back to that four-part
83:53 let's go back to that four-part distinction that we made between uh, you
83:55 distinction that we made between uh, you know,
83:56 know, reason, reasonleness, rationality, and
83:59 reason, reasonleness, rationality, and rationalism. So, let me define these for
84:02 rationalism. So, let me define these for you, and I'm going to be quoting David
84:03 you, and I'm going to be quoting David Chapman here. So, reasonleness is
84:06 Chapman here. So, reasonleness is defined as, quote, thinking and acting
84:08 defined as, quote, thinking and acting in ways that make sense and are likely
84:09 in ways that make sense and are likely to work, but are not formally rational.
84:12 to work, but are not formally rational. End quote.
84:14 End quote. Rationality is defined as quote formal,
84:17 Rationality is defined as quote formal, systemic, explicit, technical, abstract,
84:19 systemic, explicit, technical, abstract, rigorous methods for making uh for
84:22 rigorous methods for making uh for thinking and acting
84:24 thinking and acting end quote. It's a kind of a rules-based
84:29 end quote. It's a kind of a rules-based way of sensemaking.
84:37 That's David Chapman's definition of rationality. This is very tricky, because actually defining what rationality is is in itself such a hornet's nest of a problem that I'm not even going to define it for you. I want you to contemplate that for yourself. I'm just going to say that that's his definition. I have a slightly different formulation of it in my own mind, but I'm not going to say it yet, because I want you to contemplate it for yourself as a homework assignment.
85:01 Because it's such a slippery thing. What is rationality? Contemplate. Don't just take my answers. Don't take David Chapman's definitions, don't ask AI, and don't look it up on Wikipedia. I want you to contemplate it. This is something you must contemplate.
85:20 And now we're going to define rationalism as simply an ideology that makes exaggerated claims about the power of rationality and that "aims for definite proof of rationality's universal efficacy." That's quoting David Chapman. And then David Chapman defines metarationality. What is that? Metarationality is, quote, "informal reasoning about how to best use reasonable, rational, and metarational methods together in particular contexts." End quote.
85:54 So see, if rationalism is this kind of paradigm and worldview, and rationality is this kind of formal, systematic thinking that we're doing, then metarationality is not the application of rationality to rationality. This is an important distinction, because if you think it's that, then you're going to be question-begging, right? Because the more fundamental question here is: what is rationality, and what are its limits? To understand the limits of rationality, you have to go beyond the rational. You can't just apply the same system to itself and hope to be able to understand the system's limits. To understand the limits of a system, the system has to go outside and beyond itself. And that's what metarationality is.
86:41 So metarationality is not a more rigorous, more formal way of doing rationality. Metarationality is starting, in a fuzzy, intuitive way, to tease apart different parts of rationality and rationalism, exploring their limits and how to best apply them.
87:03 So what we're arguing here is that rationalism is bad, it's a self-deception, and it must be discarded. And rationality is important, but it is still limited. It must be learned but then transcended, and then you're going to use it as a tool. Rationality is a tool. You can apply it in certain cases, but then you're going to have the meta-awareness to understand: okay, I can apply rationality here and here, but I'm not going to apply it in that situation, because it's not going to work there, and I'm not going to apply it over there, because it's too limited over there. See?
87:41 So, now let's start to get deep into why rationalism is wrong. There are two main categories of wrongness: epistemology and ontology. The epistemology is actually more fascinating, and in a certain sense deeper; there's more content to it. So I'm leaving that for parts two and three. Here we're going to focus on the ontology. So let's start with ontology.
88:04 Rationalism has an unconscious ontology. Understanding this ontology requires a lot of work, a lot of time, creative intelligence, and intuition. It took me 20 years to fully understand what this ontology entails. So what is the ontology of rationalism? And by the way, what is ontology? Ontology is the branch of philosophy that deals with questions of being: What is there? What is there in the universe? What's real? That's ontology. And epistemology is: how do we know what's real?
88:38 So the ontology of rationalism is as follows. There exists an objective material world that is distinct from you, the self. That's crucial. All of science hinges on this: the world is distinct from the person who is investigating the world.
89:04 The next item is that objectivity is more fundamental than subjectivity. So objectivity is what's actually real. To understand what's real, we need to get out of subjectivity.
89:13 Another piece of this ontology is that there exist distinct objects. That's just a given. There just are cats and dogs and men and women and trees and atoms in the world. That's just what the world is made out of: these discrete, distinct things.
89:32 The next piece is that everything has a crisp, objective definition. We can define what is a cat and what is a dog, and they're fundamentally different from each other. We can define what is a man and what is a woman, and they're fundamentally different from each other. They're mutually exclusive.
89:48 Everything has a fixed identity. By identity, we're saying something very deep. You can wonder: what is identity? That's a whole video unto itself. What is identity? You almost can't ask a more profound question than that. But in this case, what I'm talking about when I say identity is, like, a cat. What is a cat?
90:12 That's a much more profound question than you would think, more profound than science understands. What is a cat, and what is a dog, and what is the difference between a cat and a dog? What makes a cat different from a dog at the identity level? Well, science and rationalism just assume that there are these identities. A cat just is a cat, and a dog just is a dog.
90:37 Another way to say it is that within rationalism, there is no ontological relativity. Things just are what they are. What a thing is, a thing's identity, doesn't depend on perspective. It doesn't depend on who's asking or how you're asking. It just is what it is. A cat is a cat. It doesn't matter who's looking at it or how you're looking at it. It just is a cat.
91:03 Another piece of the ontology is that reality and objects are mind-independent. How you think about objects doesn't change them.
91:12 Another piece of the ontology is that mind can be distinguished from matter. There's mind over here: that's the subjective, fuzzy stuff; that's the woo; that's the illusion; that's the not-real stuff. And then there's matter, and matter is what's real. Mind is just matter with various illusions in it, somehow. We don't know the exact nature of these illusions, but mind must obviously just be an illusion stemming from atoms, right? Atoms turn into cells, turn into neurons, and then neurons somehow create this illusion of mind, and that's all there is to it. And obviously what's real is the atoms and the neurons, and what's unreal is the mind and the ideas and the fuzzy human stuff.
91:58 Another piece of this ontology is that all higher-order things can be reduced down to atomic facts. So consciousness just is atoms. That's all that it is. There are not two things in the universe, atoms and then consciousness. There are just atoms, and consciousness is just some sort of complex combination of atoms. That's all that that is.
92:20 Another piece of the ontology is that qualities are reducible. You can reduce qualities to numbers, to equations, to atoms. What do we mean by qualities? Red, or the image of a unicorn in your mind. Imagine a unicorn. See, that's a quality. You have a qualitative experience of a unicorn in your mind. So to a scientist, to a reductionist, to a rationalist, that unicorn you have in your mind is nothing more than numbers and atoms. It's just some kind of equation. That's all that it is: a very complex equation, a computation.
93:13 Another piece of the ontology is that subjectivity is not real and the self is not real. What is the self? There's no such thing as a self. There are just atoms and stuff, right? There's a universe. There's a world. The self is just some human made-up stuff. It's not real.
93:37 What is science studying? What does a scientist believe he's studying? Does he believe he's studying the self when he's studying gravity? When he's studying chemistry, when he's doing math, does he believe he's studying the self? No. He believes he's studying an external world outside of himself.
93:59 The next piece of the ontology is that reality is nothing more than a machine. And since it's nothing more than a machine, mechanical methods are all that we need to understand it.
94:08 Another piece of the ontology is that reality is dumb. Reality can't be intelligent. Nothing is inherently intelligent.
94:18 Another piece of the ontology is that reality is inherently not mystical. There's nothing mystical. Everything can be demystified if you do your thinking process properly. If you use rationality properly, everything will get demystified into non-mystical stuff at the end of the day.
94:39 Another piece of the ontology is that consciousness and mind are not fundamental. These are second-order, perhaps even third-order epiphenomena, not fundamental to what reality is. Another piece of the ontology is that there exist others who can corroborate what is real and what is true.
95:09 Another piece of the ontology is that reality is finite. Another one is that reality is not a unity. Another one is that absolutes do not exist. And another one is that there is no higher intelligence than the human.
95:25 Now, a few of these pieces of the ontology you might question, even as a rationalist. You might say, "Well, Leo, I'm a rationalist. I even identify as a rationalist, but I don't believe the universe is necessarily finite. Maybe it's infinite. I'm open to that possibility. And you said I believe it's not a unity, but maybe it is a unity. Why are you saying that? And I believe that there might be aliens who are more intelligent than humans, or AI might be more intelligent than humans. I can buy that. So why are you saying these things?"
95:54 Well, the problem is that you don't really understand what I'm talking about. When I say infinite or finite, you don't understand what that means. And when I say unity, you don't understand what that means. And when I say absolute, you don't understand what that means. And when I say something more intelligent than the human, you don't understand the consequences of these ideas, right? You understand them in a superficial sense.
96:14 So a scientist might say, "Well, Leo, as a scientist, I'm open to the universe being infinite. Science is compatible with a finite universe or an infinite universe. That's an open question. It's okay. We're not dogmatic about that."
96:30 No, actually, you are. You don't understand that all of science is finitude, right? Your entire understanding of the universe is a finite one.
96:43 When I say infinite, you say, "Well, the universe might be infinite in three dimensions. XYZ coordinates might go on forever." That's not what I mean. When I say infinite, I truly mean absolutely infinite. I don't just mean in three spatial dimensions, or even in the time dimension. I mean absolutely infinite. You can't fathom what I'm saying, and you don't understand that all of science is finite. You're under the illusion that science can deal with infinity. It can't. That's because you don't understand what infinity is.
97:14 Likewise, when I say one of your ontological assumptions is that reality is not a unity, you might say, "Well, Leo, I believe it could be a unity." Again, you don't understand what I mean by unity. When I say unity, that means reality is so one that there doesn't even exist an other who you can ask whether your science is true or real.
97:39 Right? See, as a scientist, you think that you can go and ask somebody to corroborate. You can ask an other, whether it's a human or an alien or just an unconscious scientific apparatus; you can ask that thing to tell you what truth is. You can use it to corroborate, because you think it's other than you. You think you can gather a collection of other people, other minds, and collaborate together to come to a consensus on what the truth is, right? That's because you don't understand that it's a unity.
98:11 You don't understand that the scientific mind, when it's doing analysis, is always subdividing reality. You're trying to use subdivision to understand unity. And you don't see that you can't understand unity through subdivision, because the very act of your understanding is already subdividing the unity, breaking your ability to understand it.
98:34 When you say that there are higher intelligences than the human, whether it's an AI or an alien, you can say that, but that doesn't mean you comprehend the ramifications of what you're supposedly admitting to. See,
98:54 one of the problems with rationalism is that the rationalist truly does not comprehend what higher intelligence beyond the human is. Because all of rationality and all of rationalism is only just human, a very narrow space within consciousness, within mind space. If we think of everything as just mind space, it's a very small, very low form of mind space, and everything that you consider rational and reasonable is within that mind space.
99:26 Anything beyond that mind space, from a rationalist point of view, is going to seem irrational and therefore false. That doesn't mean it is irrational and false. That just means it's outside of your mind space. It's literally outside of your domain of sanity.
99:47 The rationalist doesn't understand that there exists higher intelligence and higher consciousness than the human, so high that it literally transcends human sanity. And this is what mysticism is. Mysticism is not the belief in religious dogma and superstition and nonsense. I mean, there is that kind of mysticism, and that's bullshit, but that's not what we're talking about. That's not interesting. That's too easy. What we're talking about is a mind that's completely beyond the human mind.
100:18 That's what truly higher intelligence means. But when the human mind tries to understand that mind, it's not going to seem rational. You see, this is the fundamental problem.
100:30 This is why we need to deconstruct rationality: not so that you can believe premodern nonsense and superstition, but so that your mind can expand into a higher form of intelligence that is beyond your own sanity and beyond anything the human race considers reasonable and rational.
100:58 See, but all of these things are lost on a rationalist.
101:05 But of course, rationalism, like I said, is a clever, sneaky paradigm that wants to maintain itself. So it's always going to try to come up with rationalizations against everything that I'm saying here, right? So the rationalist is going to have smart arguments. He's going to make various kinds of arguments and sort of pseudo-concessions to things that I'm saying, in order to save face.
101:23 But fundamentally, what the rationalist is not realizing is that this entire paradigm is such a small little island within this giant ocean of consciousness, and that most of the advanced stuff that's way beyond your current consciousness is so far beyond that it's not going to seem rational to you, and it's not going to seem scientific. And that's not a mistake. That's not a bug; it's a feature that just shows you the limits of your current paradigm, right? But for you to be able to understand those higher things, you're going to have to deconstruct this paradigm, which is what we're doing.
102:04 But you're so attached to your paradigm, with your sense of self, that you don't even understand that your entire worldview, all of science, is just a construct of the self to maintain your sanity. You don't even understand that. So because of that, you're going to be fighting for your life and for your sanity. And so, literally, the things that I'm trying to teach you here will take you beyond your own sanity, which is why you're going to reject them.
102:39 So back to the ontology. That's the whole ontology of rationalism; I gave it to you. It's also the ontology of materialism, of scientism. There's a lot of overlap between these things. So first of all, you should notice that all of those points I gave you about the ontology, all of those are assumptions. They're just assumptions. None of them have ever been proven.
102:59 Most of them are unconscious. The people who hold these assumptions don't even understand that they hold them. They don't understand the consequences of these assumptions. It takes a lot of philosophical work just to become conscious of the assumptions, which is a great example of why philosophy is so valuable and so necessary for science. This is why science cannot do without philosophy: because science is incapable of deeply questioning its own assumptions. Philosophy is necessary for that. See, philosophy is, in a sense, meta-science. It is a form of meta-rationality.
103:34 You see, because to question the assumptions of rationalism and rationality, and to try to transcend them and see if there's anything beyond them, you would need something more general, less specific than a rigorous formal system, because a rigorous formal system has certain assumptions baked into its very rigorous nature, right? Its laws. Any system of laws you create will have assumptions about the validity of why those laws were selected. If you want to start to question why those laws were selected, why those rules are being followed, you need to go outside of that system. The system can't do that about itself. This is the self-reflection problem.
104:18 And I'd like you to notice that these assumptions come, for the most part, from conformity. So here we're connecting with my episode, The Psychology of Conformity. All of rationalism is just conformity. That's all that it is.
104:34 But it's so sophisticated and pseudo-intelligent. I don't want to call it intelligent; I'm going to call it pseudo-intelligent, because in my mind it's not intelligent. But to most people it seems intelligent. You know, an MIT academic seems intelligent. He's not, but he seems intelligent to the average person. When you see a Sam Harris arguing with a Jordan Peterson, it seems intelligent. It's not. What they're doing is not intelligent by my standards. But of course, there are many degrees of intelligence here, right? So by human standards it seems intelligent, just because you don't know anything higher.
105:19 Now, back to this issue of what's wrong with rationalism. Rationalism might work if the universe were a finite, deterministic, clockwork machine made of objective, definite, crisp, mind-independent parts. But this ontology is fundamentally wrong.
105:33 Which is why rationalists tend not to want to discuss metaphysics and ontology, because if they did, they would realize how bad their ontology is. See, reality is an absolute unity. This has profound and subtle implications for science, for rationality, for understanding and sensemaking.
105:54 What does it mean to say that it's an absolute unity? It means that no distinction has a fixed, objective reality. What I'm saying is that, fundamentally, there is literally not a difference between a cat and a dog. But science assumes there is. Rationality assumes there is. There is not, fundamentally, a difference between science and pseudoscience. But rationalism assumes there is. There is not even a difference between rationality and irrationality. But rationality assumes that there is. All of science is premised on there being these fixed distinctions.
106:32 See, science doesn't even understand that the building block of the universe is not atoms or equations or computation. It's distinctions. Scientists and rationalists don't even have enough consciousness to understand that. That's because they're living in a distinct world. They're living in a world of duality. They don't really understand the implications and ramifications of non-duality. What I'm discussing here is non-duality. Non-duality is just another synonym for unity. Absolute unity.
107:09 It's because ontologically we have an absolute unity that nothing is objectively distinct from anything else in a fixed, absolute way. All the distinctions are morphing, changing, fluid, and that's the true nature of the world. Not a machine. See, a machine has fixed parts. Another way to put it is that absolute unity means that there are no parts to reality. There are no fixed parts. All the parts that you think are separate from each other, these are pseudo-separations, pseudo-parts. They are parts only as they are conceived by a mind, and there is always the mind doing the conceiving. This is absolutely fundamental. Reality just is a mind that is conceiving of parts and distinctions. And all of this is lost on rationalism.
108:08 So this brings us, and now we're going to go deep into David Chapman's work, to the problem of nebulosity, as he calls it. Nebulosity. It's a bit of a pompous word. Nebulous means cloudlike: fuzzy, abstract, ambiguous, amorphous.
108:26 So, what is the problem of nebulosity? He defines it like this: all of science and sensemaking is plagued by categorization and definition problems. Rationality depends upon definitions to work. But what is a definition, anyway? It's a very deep subject. Rationality assumes that definitions exist for objects. This is another way of saying that it assumes identities: fixed identities, fixed categories, fixed distinctions. Rationality assumes mind-independent definitions. But what if definitions are not mind-independent? What if the mind is defining the object and thereby creating the object, and that's what an object is? How and where do you draw boundaries between things in order to make truth claims? See, this is the problem. If you want to have a formal logical system that deals with propositions, with statements, you need to be able to crisply define what you're talking about. For example, if you want to say that snow is white, well, you need a crisp definition of snow. What is snow?
109:32 snow. What is snow? And what is white? And while we're at
109:34 And what is white? And while we're at it, what is is [laughter]
109:36 it, what is is [laughter] it all depends on the meaning of what is
109:38 it all depends on the meaning of what is is,
109:40 is, right? So um but seriously what is
109:44 right? So um but seriously what is white? How do you define white
109:46 white? How do you define white scientifically?
109:48 scientifically? See a scientist will come in here and
109:49 See a scientist will come in here and say well Leo white is just like you know
109:51 say well Leo white is just like you know it's a wavelength of light and it's it
109:53 it's a wavelength of light and it's it you know it's the 425
109:56 you know it's the 425 nanometer range of light or whatever.
109:58 nanometer range of light or whatever. It's a mixture of multiple lights. I
110:00 It's a mixture of multiple lights. I don't know what it I don't know how they
110:01 don't know what it I don't know how they define white but that's what it is they
110:05 define white but that's what it is they want to say.
110:07 want to say. But, you know, it gets very very hairy.
110:10 But, you know, it gets very very hairy. And this is the this is this is a trick
110:12 And this is the this is this is a trick because snow, you know, is snow really
110:15 because snow, you know, is snow really white? What about the little blue tint
110:17 white? What about the little blue tint it has? Or it's more clear, transparent?
110:19 it has? Or it's more clear, transparent? How do you define these things? How do
110:20 How do you define these things? How do you define what snow is? Does ice count
110:22 you define what snow is? Does ice count as snow? Is an icicle snow? Does it have
110:25 as snow? Is an icicle snow? Does it have to be falling from the sky if it's on
110:27 to be falling from the sky if it's on the ground? But what if dirt is mixed
110:28 the ground? But what if dirt is mixed into it? Is that snow still? At what
110:31 into it? Is that snow still? At what point does it stop being snow? If it's
110:33 point does it stop being snow? If it's melting, is it snow? Now it's water. Is
110:36 melting, is it snow? Now it's water. Is it still white? Is water snow? See, it
110:39 it still white? Is water snow? See, it gets very confusing. This is the problem
110:40 gets very confusing. This is the problem of nebulosity that David Chapman is
110:43 of nebulosity that David Chapman is talking about.
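The "what is white?" problem can be made concrete with a toy sketch of my own (not from Chapman or this talk): any operational definition of "white" smuggles in cutoffs that nature does not supply. The function name, thresholds, and pixel values below are all invented for illustration.

```python
# Hypothetical illustration: define "white" as an RGB pixel whose channels are
# all bright and nearly equal. The brightness and balance cutoffs are human
# choices, not facts found in the world.

def is_white(r, g, b, min_brightness=0.85, max_spread=0.10):
    """Classify a pixel (channels in 0..1) as 'white' under chosen cutoffs."""
    bright_enough = min(r, g, b) >= min_brightness
    balanced_enough = (max(r, g, b) - min(r, g, b)) <= max_spread
    return bright_enough and balanced_enough

# A pixel from a photo of snow, with the slight blue tint snow really has:
snow_pixel = (0.88, 0.91, 0.97)

print(is_white(*snow_pixel))                   # True under these cutoffs
print(is_white(*snow_pixel, max_spread=0.05))  # False if "balanced" is tightened
```

The same snow pixel is white or not white depending on a parameter nobody can justify from physics alone, which is exactly the nebulosity point.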
110:45 So for science to work, it needs to be able to identify, categorize, and distinguish objects. The more crisply it can do so, the better it works. But there's a problem. Many, many problems. We have to pace ourselves here; I'm getting ahead of myself. Before we get into that, let's cover some other examples.

111:10 Science assumes that there just is such a thing as a planet. But what is a planet, really? How do you define it? It assumes that there is such a thing as a particle. But what is a particle? How do you define that? Or that there just is such a thing as an object. But what is an object? Or such a thing as a human. How do you define what a human is? Is a Neanderthal a human? Scientifically, when did the first human appear?

111:45 Now we get into even more complex things, various kinds of abstractions. Rationalism assumes that there just is such a thing as being rational. But what does it mean to be rational? How do you define that in a rigorous way? Or it assumes there is just such a thing as right and wrong, good and bad. How do you define those concepts? Or such a thing as science and pseudoscience. How do you define those? What counts as pseudoscience?

112:18 See, the truth is that none of these things are ever fully known. It's all just assumed as simple and given. And it works on the surface. It works if you're doing basic science. If you want to shoot a projectile, you can successfully shoot a projectile without getting into these complex ontological issues. If you just want to do some chemistry and invent a new drug, you can do that without getting into this complex ontology. But when you start to do really deep science, when you get down to the quantum level and the real questions of what anything is at all, or out to the very macro scale (the big bang, where anything came from at all), or into questions like what is life, what is consciousness, what is a mind, what is the self, then these simplistic assumptions about discrete objects working together in a mechanical fashion all break down. None of it holds, because these were just assumptions. That's all they ever were.

113:25 There are no definite boundaries between physical objects. Even at the atomic level, there's no objective criterion for which atoms are part of which objects. Nowhere in the universe does it say, "Here is a cloud, and this atom is part of this cloud, whereas that other atom is not." The picking out of that cloud from the sky: the self is doing that. The mind is doing that.

114:00 So this breaks the distinction that science makes between the self who is studying the universe and the universe itself, because of course the self and the universe are one. You are the universe studying itself. But science doesn't grapple with the entanglements and paradoxes this creates.

114:26 How many clouds are in the sky today? There is no objective answer to that question, because what determines what a cloud is? Your mind does. This is not a physical fact you find in the world. How many clouds are there? How many rocks are there on the moon? Simplistically, under the materialist paradigm, you think there is an answer to that question. But actually there isn't, because what counts as a rock? Is a pebble a rock? If you have a rock with a crack in it, but it's not cracked all the way in half, is that two rocks or one? How do we count that? Is a grain of sand technically a rock?

115:11 How about a car? Is a car a single object, or how many objects are in a car? Who gets to say? See, quantity itself is relative and subjective: how many of anything you think there are is something your mind decides. How many objects are there in the universe? Is there an objective answer to that question? Does God know the answer? In a certain sense, no, because the very notion of quantity, of quantifying things, is something that your own mind is doing and projecting onto the world. This is where the distinction between reality and mind breaks down. And eventually we're going to say that there is no such thing as a mind-independent object. All objects are inside the mind. All objects are mind. An object is mind making a distinction within itself.
116:01 So human perception is too hairy to give rationalism the clean, distinct objects that it needs in order to construct itself. Let's go through a couple of deep examples of nebulosity so you start to see some of its profound implications, because it can seem trivial. You might say, "Well, Leo, how many rocks are there on the moon? Who really cares? Is that really germane to the entire edifice of science? Does it change anything about science? Or how many clouds are in the sky? Does that really change how we do science and what science means?" And the answer is yes, it does. But you probably don't yet see the implications. So let's look at some more examples.

116:38 Consider an ordinary apple. Most rationalists, scientists, and materialists would consider an apple a well-defined object that simply exists independent of the mind: something obvious, given, nothing mystical about it, just objective. An apple's just a fact, right? But when we're thinking about apples, talking about apples, trying to understand apples, what apple am I talking about? An apple of today, one of our giant juicy apples, or an apple from 5,000 years ago? Have you ever seen an apple from 5,000 years ago? It's like a little crab apple. It hardly even resembles a modern apple, and it probably barely tastes like one. Today's apples are gigantic monstrosities that have been selectively bred over thousands of years to be juicy and colorful. So in what sense are they the same thing?

117:38 See, when we're doing science, we're not just talking about specific objects in the world, this apple, that pear, that banana. We're also talking about abstract categories. Apple. How do you even understand what I mean by "apple"? Apple is an abstract category of thing. It doesn't refer to that specific apple over there; it's a term for an entire class. But what is a class? Is a class just a collection of atoms? Can we reduce a class to nothing but atoms, calculations, and equations? But you see, "atom" is itself a class.

118:32 For the human mind, when we're talking about understanding and sensemaking, which is the primary concern here, understanding reality means understanding not just individual objects but also abstractions and classes. How do you define what these abstractions and classes actually are? How do you define what an apple is? For example, if I genetically modify an apple with 5% pear DNA, is it still an apple? How do you classify it? If you want to make truth claims about apples and pears and their differences, you'd better be able to crisply define what an apple is and what a pear is, so you can say, "It's true that apples are this way, it's true that pears are that way, and what's true of a pear is not true of an apple." That's basically what all of science depends upon. Do you see how fundamental this is? If you can't make these crisp categories, then you can't make truth statements about different objects in the universe.
119:35 Look at how interesting this gets. If an apple is growing on a tree, is it still
119:37 apple is growing on a tree, is it still an apple?
119:40 an apple? Where does the apple begin and the tree
119:43 Where does the apple begin and the tree end?
119:46 end? Is it valid to say that an apple is a
119:48 Is it valid to say that an apple is a tree?
119:51 tree? What's the difference? And I'm talking
119:52 What's the difference? And I'm talking about the fruit. So the apple fruit, is
119:54 about the fruit. So the apple fruit, is the apple fruit a tree or not? What's
119:57 the apple fruit a tree or not? What's the difference between an apple fruit
119:58 the difference between an apple fruit and the tree?
120:07 The things that are true of the apple tree, do those apply to the apple fruit
120:09 tree, do those apply to the apple fruit and vice versa?
120:13 and vice versa? If I take an apple and I put it into the
120:14 If I take an apple and I put it into the fire, is it still an apple? At what
120:17 fire, is it still an apple? At what point does it stop being an apple?
120:20 point does it stop being an apple? If you eat the apple and the apple is
120:22 If you eat the apple and the apple is inside your gut, is that still an apple?
120:26 inside your gut, is that still an apple? At what point does the apple stop being
120:27 At what point does the apple stop being an apple and it becomes you?
120:39 If I cut an apple in half with a knife, just one clean slice in half, how many
120:42 just one clean slice in half, how many objects do we have? Two or still one
120:46 objects do we have? Two or still one apple. Who gets to say?
120:49 apple. Who gets to say? And what about the little bits, the
120:52 And what about the little bits, the little microscopic bits of juice and
120:54 little microscopic bits of juice and apple particles that are on my knife
120:56 apple particles that are on my knife after I've sliced it? Because you never
120:59 after I've sliced it? Because you never slice it perfectly. You slice it in
121:00 slice it perfectly. You slice it in half, you got two big halves. What about
121:02 half, you got two big halves. What about the residue on the knife? Does that
121:04 the residue on the knife? Does that count as the apple?
121:06 count as the apple? So, do we have two apples, two pieces,
121:09 So, do we have two apples, two pieces, or do we have three? And do we count
121:11 or do we have three? And do we count this as one piece? Or do we count well,
121:13 this as one piece? Or do we count well, how do we count this particle sludge
121:16 how do we count this particle sludge that's left on the knife? How do we
121:18 that's left on the knife? How do we count that? How do we quantify that?
121:27 [snorts] Now, we might, a scientist might want to
121:29 Now, we might, a scientist might want to say, well, what determines whether an
121:32 say, well, what determines whether an object is one or two is the physical
121:34 object is one or two is the physical connection. If they're physically
121:35 connection. If they're physically connected, that means that they're one.
121:37 connected, that means that they're one. And if they're not, that means that it's
121:38 And if they're not, that means that it's two. But first of all, why should
121:42 two. But first of all, why should physical connection be what determines
121:45 physical connection be what determines what is an object? Why can't you have an
121:46 what is an object? Why can't you have an object that is physically disconnected?
121:54 After all, if we're talking about a cloud, we say that there's let's say
121:56 cloud, we say that there's let's say let's say there's one very kind of
121:58 let's say there's one very kind of clear, prominent, perfectly clear sky,
122:00 clear, prominent, perfectly clear sky, but just one clear prominent cloud in
122:02 but just one clear prominent cloud in the sky. and we say, "Oh, look,
122:03 the sky. and we say, "Oh, look, obviously there's one cloud up there
122:05 obviously there's one cloud up there today." Okay, fine. But
122:09 today." Okay, fine. But is the cloud physically connected? No.
122:13 is the cloud physically connected? No. No atoms at all if you go to the very
122:16 No atoms at all if you go to the very micro level, no atoms ever are
122:19 micro level, no atoms ever are physically, you know, touching each
122:21 physically, you know, touching each other.
122:23 other. And after all, what is physical anyways?
122:25 And after all, what is physical anyways? How do you even define what a physical
122:27 How do you even define what a physical connection is? See, we just assume that
122:30 connection is? See, we just assume that there are just these things called
122:31 there are just these things called physical connections. Not really. Not
122:33 physical connections. Not really. Not when you go down to the atomic level.
122:35 when you go down to the atomic level. And if the rationalists want to say that
122:36 And if the rationalists want to say that the atomic level, which they want to say
122:38 the atomic level, which they want to say is ultimately what is the truth, the
122:41 is ultimately what is the truth, the ultimate truth is the atomic level.
122:42 ultimate truth is the atomic level. Well, at the atomic level, there is no
122:44 Well, at the atomic level, there is no such thing as a physical connection. And
122:45 such thing as a physical connection. And yet, rationalism depends on physical
122:47 yet, rationalism depends on physical connections to define what objects are
122:49 connections to define what objects are and then to make truth claims about
122:51 and then to make truth claims about them.
122:58 And then you know what actually determines what a physical connection
122:59 determines what a physical connection is?
123:01 is? H
123:03 H if I glue two things together, does that
123:05 if I glue two things together, does that count as a physical connection?
123:09 count as a physical connection? But there are many objects that you
123:10 But there are many objects that you would say that's one object that are
123:11 would say that's one object that are glued together.
123:14 glued together. For example, you would say, well, that's
123:15 For example, you would say, well, that's one car, but the car is glued out of a
123:18 one car, but the car is glued out of a bunch of parts.
123:26 If there's oil between, you know, parts inside of an engine, does that mean it's
123:28 inside of an engine, does that mean it's one engine or multiple parts? What is
123:29 one engine or multiple parts? What is it?
123:34 If I take an apple and I compress it in a in a press into a 5% sized cube of
123:37 a in a press into a 5% sized cube of itself,
123:39 itself, is that still an apple?
123:43 is that still an apple? You might say, "Well, Leo, but so what?
123:44 You might say, "Well, Leo, but so what? Like you're you're saying these things,
123:46 Like you're you're saying these things, but what does it have to do with
123:47 but what does it have to do with anything?" Well, you see, because
123:51 anything?" Well, you see, because when you're doing science and you're
123:53 when you're doing science and you're doing your experiments and you're going
123:54 doing your experiments and you're going to be making knowledge, propositional
123:57 to be making knowledge, propositional knowledge claims about all these things,
123:59 knowledge claims about all these things, you're going to be using terms. You're
124:00 you're going to be using terms. You're going to say something like snow is
124:02 going to say something like snow is white in a in the most simple sense.
124:04 white in a in the most simple sense. Snow is white and apples are are such
124:07 Snow is white and apples are are such and such have such and such properties
124:10 and such have such and such properties and uh you know, apples are different
124:12 and uh you know, apples are different from pears in such and such a way. But
124:14 from pears in such and such a way. But to what degree are those things actually
124:16 to what degree are those things actually true? They're only true to the extent
124:18 true? They're only true to the extent that you have you know what you're
124:21 that you have you know what you're talking about, right? You have to define
124:22 talking about, right? You have to define apple and snow and white and pear and
124:25 apple and snow and white and pear and difference and sameness and and all this
124:28 difference and sameness and and all this kind of stuff. And so the problem that
124:30 kind of stuff. And so the problem that happens for for logical people is that
124:34 happens for for logical people is that our macro reality that we're dealing
124:37 our macro reality that we're dealing with every day that we're surviving in
124:38 with every day that we're surviving in that we're trying to understand and make
124:39 that we're trying to understand and make sense of is so chaotic and nebulous
124:44 sense of is so chaotic and nebulous and amorphous and fluid that it's very
124:48 and amorphous and fluid that it's very difficult to make definitive statements
124:50 difficult to make definitive statements about anything.
124:57 And this makes sensemaking very difficult.
124:59 difficult. If you haven't noticed, sensemaking is
125:01 If you haven't noticed, sensemaking is very very difficult to do correctly.
125:04 very very difficult to do correctly. And this is one of the reasons is
125:06 And this is one of the reasons is because what rationalism has to do to
125:09 because what rationalism has to do to simplify all this complexity down, it
125:11 simplify all this complexity down, it has to simplify it down into
125:15 has to simplify it down into little models, little abstractions and
125:17 little models, little abstractions and so forth. So you say that oh there's
125:19 so forth. So you say that oh there's apples and pairs, but in what sense are
125:20 apples and pairs, but in what sense are there really apples and pairs or are
125:22 there really apples and pairs or are these just simplifications of your mind?
125:25 these just simplifications of your mind? Once you make these simplifications, now
125:27 Once you make these simplifications, now you're dealing with lowresolution models
125:29 you're dealing with lowresolution models of the actual thing. And then you start
125:32 of the actual thing. And then you start to combine them and do logic on them.
125:34 to combine them and do logic on them. And then you get false conclusions.
125:38 And then you get false conclusions. And you're also when you're distilling
125:39 And you're also when you're distilling this stuff down and simplifying it,
125:40 this stuff down and simplifying it, you're making all sorts of simplifying
125:42 you're making all sorts of simplifying assumptions.
125:45 assumptions. For example, you're assuming that a that
125:47 For example, you're assuming that a that an apple is mind independent. You're
125:49 an apple is mind independent. You're assuming that quantity is mind
125:50 assuming that quantity is mind independent.
125:57 Is an apple really an objective reality or does it only exist inside your mind?
126:01 or does it only exist inside your mind? And how would you know the difference?
126:07 Here's another example of nebulosity. I'm going to give you a couple of these deep examples, and I want you to really contemplate this, to contend with the ontological complexity of what it means to understand reality. Because usually human beings are making sense of reality in such superficial or narrow ways, even scientists, that they're never contending with the real complexity of what we're doing here when we're trying to understand the world. For example, a simple question might be: how long is an elephant? That's kind of a scientific question. But then the question becomes, what counts as an elephant? Does a mammoth count as an elephant?
126:52 It's tricky. But let's say we grant you that. Okay, this thing here is an elephant; we'll just stipulate that it's an elephant by definition. So whatever's in front of us is an elephant. But then how long is it? Well, how do you measure it? What do you do with its trunk and its tail? Do you stretch them out? How much do you stretch them out when you're measuring it?
127:12 And at what temperature are you measuring this elephant? Because depending on whether you're measuring it in the hot jungle or the cold tundra, it's going to be a different length. And are you measuring in the morning or in the evening? You're taller in the mornings and shorter in the evenings, and so it is with elephants, I assume. Do you include the hairs on its nose and tail in your measurement? Do you count the dirt and oil on its hair? Do you include that in the measurement?
127:50 Is the oil and dirt that's caked onto the elephant's ass part of the elephant? Let's say I want to weigh the elephant and find out its mass. Does the oil of the elephant count? What do we do about the food in its stomach? Should we consider it part of the elephant or not? What about the dirt on its tail that's caked into its hair? What if one hair is hanging off the elephant's tail, detached from the elephant's body, but still holding on because it's caked with dirt? The elephant swims around in the mud, so it's just caked on. Do we include that hair in the mass and the length of the elephant?
128:29 And of course we have to know the elephant's velocity, because the length of the elephant literally depends on how fast it's moving through spacetime, according to Einstein's special relativity. So if you don't know the elephant's velocity and your own velocity relative to the elephant, and you don't get your reference frames right, then you're not going to really know the length of the elephant. Its length is relative to its velocity.
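To make the relativistic point concrete: this is a minimal sketch (an editor's illustration, not from the video) of special relativity's standard length-contraction formula, L = L0 · sqrt(1 − v²/c²). The 0.6c elephant is, of course, hypothetical.

```python
import math

C = 299_792_458.0  # speed of light in m/s (exact, by the SI definition of the meter)

def contracted_length(proper_length_m: float, v_m_s: float) -> float:
    """Length measured by an observer moving at speed v relative to the object:
    L = L0 * sqrt(1 - v^2 / c^2)."""
    return proper_length_m * math.sqrt(1.0 - (v_m_s / C) ** 2)

at_rest = contracted_length(8.246, 0.0)      # exactly 8.246 m
walking = contracted_length(8.246, 2.0)      # differs from 8.246 m by far less than an atom's width
extreme = contracted_length(8.246, 0.6 * C)  # hypothetical 0.6c elephant: 8.246 * 0.8 m
```

At everyday speeds the correction is unmeasurably small, so in practice the trunk-and-tail ambiguities dominate the uncertainty, which is the speaker's point.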
128:51 Is the elephant dead or alive? Because the measurement is going to be different depending on whether it's dead or alive. And then how do we know if it's dead? When does an elephant count as dead versus alive?
129:02 And if finally we measure this elephant somehow and we say that it's 8.246 meters long, how long does that truth last? The elephant is that long for what, a minute, an hour, a day? And then it's different. And then what is a meter anyway? How do we even know what a meter is? How do we even measure what a meter is?
129:33 So this gets us into the really deep philosophical issues of science and empiricism. Empiricism just sort of assumes that we can go out into the world and measure stuff and see stuff, and it's just there for us, just a given. But when you actually start to get deeper and deeper into the complexities of it, you run into all these problems. And serious scientists doing serious scientific work are always running into these kinds of problems. And then the question is, how do you navigate it with your mind? You're going to say, "Well, man, all we do is study the facts and the logic, and it's just objective." But no, because there's so much interpretive work that your mind is doing to make sense of all this stuff. Just to understand how to measure an elephant properly already requires a lot of very intuitive, higher-order kinds of thinking that cannot simply be boiled down to atomic facts. You're making decisions. You're making executive judgments about doing it this way versus doing it that way. Interpretive judgments as well.
130:40 Now, you might say, "Well, Leo, ultimately, so what? It sounds like you're just nitpicking, and these are just little technical problems of imperfect measurement. Who cares whether you measure the elephant this way or that way? Ultimately, the elephant has some kind of length. Whether you can measure it by pulling on its tail or not, who cares? In the end, it's just an object that exists."
131:04 Or is it? What if objects themselves are relative? What if there is no such thing as an elephant, even that particular one right in front of you? What if, in some sense, you're projecting that elephant onto whatever is there?
131:26 Here are some more examples of nebulosity. Are fruits healthy to eat? That might be a scientific question you'd ask, right? Well, what counts as a fruit? And which fruits? Because you're generalizing: the category of fruit is an extremely general, abstract category. How do you determine what fits in there? Do berries count as fruits? Is a rotten fruit a fruit? Is a poisonous fruit a fruit?
131:55 So, in what sense can you say that it's true that it's healthy to eat fruits?
132:02 See, simplistically, some nutrition scientists might say it is true that it's healthy to eat fruits, and false that eating McDonald's is healthy for you. But based on what are you making those true and false claims? There's so much oversimplification happening there, because they're not thinking of all the edge cases. They're not thinking about what happens if the fruit is rotten, or if the fruit is poisonous. Now, the scientists will say, "Well, Leo, of course I can't think of all of these different edge cases. That would take too long, and it's too messy. Of course we're oversimplifying stuff. All of us scientists know this, Leo. You're not telling us anything new. We already know everything you're saying." No, you don't. You don't understand how problematic this becomes, because you are making truth claims and you are making sense of reality using these gross simplifications,
133:04 but then you're not accounting, in a holistic manner, for how this is distorting everything that you understand about the world.
133:15 Even when all you're saying is, "Well, I'm just a humble nutritionist; all I do is report these nutrition studies, and of course they're not totally holistic, but this is the best we can do," see what you're already doing when you make that kind of claim. This is the key to David Chapman's point: you're not being strictly formal when you do that.
133:38 See, the fuzziness in your thinking is absolutely critical, because what we're trying to demonstrate here with all these examples is that strict formalism just doesn't work. It can't work. Not just because it's difficult or technically time-consuming, but because the ontology of reality literally doesn't allow for it. It makes formalism impossible. Reality just isn't such that you can reduce it and simplify it down in the ways being assumed by science and rationality.
134:29 In order for you to survive and to understand the world, you need to make use of fuzzy, intuitive thinking, understanding, and sensemaking, which is prior to and transcendent of formal science and formal rationality, what we would call rationalism. You couldn't operate otherwise. Your mind literally couldn't operate otherwise, because you need to be able to make judgment calls. For example, when you ask, "What counts as a fruit? Does a rotten fruit count as a fruit in terms of healthy eating?" you would say no, it doesn't count. But what are you using to understand that it doesn't count, that it shouldn't be counted as part of that category? You're using fuzzy, intuitive kinds of sensemaking and understanding which cannot be codified and formalized, because there are way too many exceptions for you to formalize them all and put them into rules, or to make any kind of mathematical model of them, or a system of rules for how to think properly about these different categories. That's the issue.
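One well-known attempt to soften this is fuzzy logic, where category membership comes in degrees rather than yes/no. The sketch below is purely illustrative (the predicate and the breakpoints are invented for this example); notice that the hand-picked numbers are themselves exactly the kind of intuitive judgment call that cannot be formalized away, which is the point being made here.

```python
# Crisp rule: a yes/no cutoff must be stipulated somewhere.
def crisp_counts_as_healthy_fruit(rot_fraction: float) -> bool:
    return rot_fraction < 0.5  # why 0.5? pure stipulation

# Fuzzy-set-style rule: membership is a degree in [0, 1] instead of a boolean.
def fuzzy_counts_as_healthy_fruit(rot_fraction: float) -> float:
    if rot_fraction <= 0.1:  # up to 10% blemish: fully counts
        return 1.0
    if rot_fraction >= 0.6:  # past 60% rot: doesn't count at all
        return 0.0
    # Linear ramp between two hand-chosen breakpoints. The intuition
    # hasn't been eliminated, just relocated into these numbers.
    return (0.6 - rot_fraction) / 0.5
```

The fuzzy version handles borderline cases more gracefully, but the breakpoints 0.1 and 0.6 are still stipulations made by informal judgment.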
135:34 We're trying to demonstrate that formalism is actually a mistake, and that it actually makes sensemaking worse, not better. Not always; sometimes formalism is necessary and good. We're not saying to abandon all formalism, but we're trying to show its limitations. And we're trying to debunk this paradigm, this worldview, that says that just increasing formalism is the solution to all of these epistemic problems. It isn't. It can't be.
136:07 Why not? Because you can't even define what formalism is. You can't even define what rational is. You can't define what science is. You can't define what an apple is. You can't define what a cat is. You can't define what a woman is. You can't define what a man is. You can't define what a human is. You can't define what an elephant is.
136:30 You can't define what an atom is. You can't define what red is. You literally can't define anything, because you actually don't know what anything is.
136:43 This is so fundamental that science completely overlooks it.
136:52 And this ties in with that assumption about reality being finite. See, all of rationalism assumes that reality is ontologically finite, because finitude has to do with definition. Here's the connection.
137:11 For rationalism to work, it needs crisp definitions. But the only things that can be crisply defined are finite things. You can't, by definition, define an infinite thing. And if reality as a whole is an infinite thing, then reality as a whole is literally undefined. That's not a mistake or a lack of knowing; that's what it is. It's undefined.
137:32 So if reality is truly infinite (and this is what I mean when I say you don't understand the consequences of infinity), which also means it's a unity, then you can't define it. Because in order to define something, you need to go outside and beyond it; that's what a definition is. To define something, you need something other than it to define it in terms of, right? Like, what is a cat? A cat is something that has legs and parts and fur and this and that. See, you're defining it with other things.
138:07 But you can't do that with infinity. You can't define infinity, because infinity is so total that you can't go outside of it to define it with anything else.
138:23 So this is where the ontology destroys rationalism. Rationalism can't work because reality is infinite. Now, you might say, "Well, how do you know it's infinite?" I can't demonstrate that to you here. And part of the problem of rationalism is that a rationalist will expect me to demonstrate it through some kind of formal proof. Of course, that's the whole catch-22: I can't formally prove to you that reality is infinite. The whole point of deconstructing rationality is so that you can realize the infinity of it for yourself. That's beyond the scope of this episode; you just have to take my word for it for now. It requires awakening.
139:06 In a sense, you can think of the entire paradigm of materialism, scientism, and rationalism as just a denial of infinity. That's all it is, in a sense. Because what's happening is that the scientific mind is just doing analysis. Analysis is subdividing reality, all the time, without being construct-aware enough to see that it's subdividing reality. So it's subdividing reality and then confusing those subdivisions with being out there, when actually they are in here.
139:31 There is this idea that you can separate the subjective from the objective, remove the subjective, and just study the objective. And you can't do that if what we're talking about is infinity, because within infinity the subjective and the objective are in a unity with each other. And if you don't realize that, then you're going to get everything wrong.
140:00 I sort of went off on a tangent there; back to this example of nebulosity. We were talking about which fruits are healthy to eat.
140:11 Healthy for whom? For humans, for cats, for flies? For which types of humans? Fruits could be healthy for some of them and unhealthy for others; maybe there are animals for whom fruits are unhealthy. But when we just say "fruits are healthy to eat," and we say that that's true, as some nutritionists do, see, there are all sorts of hidden assumptions: well, we're talking about humans, not cats or elephants or other creatures, or aliens.
140:38 And now the scientists would say, "Well, yeah, Leo, that's obvious. We're just simplifying." Yeah, but the point is that your mind is still able to make sense of these simplifications. And this is absolutely essential for making sense of reality: you're always operating with these kinds of simplifications, abstractions, fuzzy, nebulous concepts, none of which are ever fully defined. Ever.
140:59 You cannot define what a cat is. You cannot define what anything is. And yet your mind still understands it, still works with it somehow.
141:13 That's because the essence of mind is an informal process. And you need to understand and appreciate that and see the power of it: the power lies in its informality. That's why it's powerful. When you formalize it, you lose its power. You actually lose its robustness. It's robust because it's informal, because it's nebulous. This is essential to proper use of the mind.
141:39 Why are we talking about all this? Because this is actually a way for us to overcome some of the deepest self-deceptions of the mind, which are committed by the scientific mind. Not by the pre-rational mind, but by the rational mind; by the modern mind, not the premodern mind. The deceptions of the premodern mind are too easy. They're uninteresting.
142:07 Let's say an apple is healthy. But are the seeds inside the apple healthy? Are they? They contain cyanide.
142:18 Are fruits sprayed with pesticides considered healthy? What about GMO fruits? How do you know if those are healthy? How much fruit? Maybe up to a certain point fruits are healthy, but beyond that point they're not healthy anymore. So in what sense can you even say it? Look, you might say, "Well, we can qualify. We can say that specific kinds of fruits, like apples, are healthy for human beings. That statement is true." Well, maybe it's not true when you consider the apple seeds. And maybe if you consider eating a ton of apples, it's no longer true. You'll say, "Okay, Leo, let's qualify further. Let's say apples are healthy for human beings as long as you don't eat the apple seeds and you don't eat too much, keeping it under a certain amount." But even that's not going to work, because you could ask, "Well, what about people with a certain genetic disorder? Is it healthy for them?" And you can say, "Okay, we can qualify further: apples are healthy for these kinds of people who don't have this kind of genetic disorder," and so on. Okay, but what if we cook the apples and overcook them? Does that count? So now we have to qualify: raw apples are healthy, but not overcooked ones.
143:32 See, and this goes on forever. There's no end to this.
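The regress above can be sketched in code. This is an illustrative sketch only, not anything from the source; all the field names and qualifiers are invented. The point it shows: every counterexample forces one more conjunct onto the rule, and the conjunction never closes.

```python
# Illustrative sketch of the qualification regress: each counterexample
# forces one more conjunct onto the definition of "healthy", forever.
# All field names and thresholds here are made up for illustration.

def is_healthy(fruit, qualifiers):
    """A fruit counts as healthy only if it passes every qualifier so far."""
    return all(q(fruit) for q in qualifiers)

qualifiers = [
    lambda f: f["kind"] == "apple",   # "apples are healthy"
    lambda f: not f["seeds_eaten"],   # ...unless you eat the seeds (cyanide)
    lambda f: f["count"] <= 3,        # ...unless you eat too many
    lambda f: not f["overcooked"],    # ...unless they're overcooked
    # ...pesticides, GMO, genetic disorders, and on without end
]

raw_apple = {"kind": "apple", "seeds_eaten": False, "count": 1,
             "overcooked": False}
print(is_healthy(raw_apple, qualifiers))  # True, until the next counterexample
```

The design mirrors the argument: `is_healthy` is only ever provisionally correct, because the list of qualifiers has no principled stopping point.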
143:35 Is ketchup a fruit? Is orange juice a fruit? Is refined sugar from apples a fruit? Is a nut a fruit? Is a plastic apple a fruit?
143:48 Now, you might say, "Well, obviously, Leo, a plastic apple is not a fruit. That's just silly." Yeah. But see, if
143:57 we're in a room together and I say, "Hand me that apple over there," and you look over there and it's a plastic apple. It's obviously plastic.
144:05 You're not deceived. You grab it, you hand it to me. No questions, no concerns. It just works. You understand
144:12 exactly what I'm talking about, even though that apple wasn't a real apple. It was just a plastic apple, but you still understood what I told you. That's
144:22 because your mind is so advanced, your understanding is so fucking advanced, and you have no comprehension of how advanced your own understanding is, that
144:30 it doesn't matter, right? You're not like a logical computer or robot, where you tell the robot,
144:36 "Hand me that apple," and the robot looks over and says, "No, that's not an apple. That's a plastic thing."
144:43 See, that's because if the robot were programmed with these kinds of logical rules, it wouldn't have enough sophistication and intelligence to really be able to interpret what you're saying.
144:57 Now, you might say, "Well, Leo, but AI are really smart these days. They're able to understand all this intuitive stuff." Exactly. The only
145:03 reason AI is so intelligent, notice this, is because these computer scientists have figured out a way to create this intuitive, fuzzy,
145:13 nebulous black box, which they have no idea how it actually works.
145:18 They just create it somehow through some sort of mechanical process. But then this thing
145:23 somehow works in a very fuzzy, almost mystical way. They have no idea how it actually works. And then it's able to understand
145:31 the difference between an apple and a plastic apple and all these sorts of nuances and so forth, which could never be captured in any kind of formal
145:38 system. Notice that you could not create an AI using formal logical rules.
145:52 And so, in order to reach the kind of level of intelligence that humans have, you had to abandon the logical rules for creating AI, and you had to go with a
145:57 fuzzy black box approach.
146:00 And that's what made it really intelligent.
146:04 You might say, "Well, Leo, but so what?"
146:06 Here's the thing.
146:09 Academia and science are all about taking that fuzzy stuff and formalizing it into, you know, a scientific method and
146:19 all of that, thinking that that solves the problem of self-deception and that you're going to be able to use it to understand reality at the highest
146:27 levels.
146:29 And that's not going to work. That's the point.
146:34 And the point is that they don't understand that it's not going to work.
146:42 And I'm not just talking about the future, like, in the future there will be things that science doesn't understand because its current method is
146:47 limited. No, I'm saying that there are very advanced, profound things that I can teach you right now, today, that a
146:56 scientist and a rationalist will not understand, simply because they're operating from that paradigm.
147:03 And they don't understand that. And therefore, I literally can't teach them the most advanced truths about the
147:09 universe, because they won't be able to comprehend it, because their mind is limited by this paradigm. That's the
147:15 point.
147:19 More examples of nebulosity. Is the sky blue? What counts as the sky? What counts as blue? Are birds dinosaurs? Is there water in a coconut?
147:34 There's liquid in there. And if you do a microscopic analysis, most of that liquid is water. But
147:44 if, let's say, I needed water for some sort of laboratory experiment,
147:49 could I use a coconut to get that water? No, probably not. Even though
147:54 technically there is water in a coconut. Or is there? Depends on how you want to look at it. See, the issue here is that
148:00 the question of whether there is water in a coconut doesn't have an objective answer. It all depends relativistically
148:05 on what you are asking the question for, its purpose, what you need it to do for you.
148:12 That's nebulosity.
148:15 That's nebulosity. Nebulosity is not just about objects not
148:18 Nebulosity is not just about objects not being concrete enough and crisp enough
148:20 being concrete enough and crisp enough and defined enough. It's also about how
148:23 and defined enough. It's also about how we're using these concepts as well. How
148:25 we're using these concepts as well. How we're using and talking about these
148:26 we're using and talking about these objects.
148:33 Here's an interesting philosophical question. You know, there are these debates about trans people: are trans women really women, and so forth. Here's
148:40 a really interesting way to phrase it.
148:43 Let's say in a hundred years we have the technology to do perfect surgery
148:47 to give a biological woman a working penis and sperm and hormones, so that all of it works and she's able to impregnate
148:57 a biological woman.
149:00 Is she still a woman, or is she now a man?
149:05 That's the really interesting question within the trans question, right?
149:09 Because right now we don't have good enough surgery to do this. But in a hundred years, you can imagine that we will have
149:14 this kind of surgery available. Well, in that case, what really is a man and a
149:20 woman?
149:22 I'd like you to contemplate that for yourself.
149:26 yourself. There are also higher order questions
149:28 There are also higher order questions that we ask and that we deal with all
149:30 that we ask and that we deal with all the time for sensemaking purposes. For
149:31 the time for sensemaking purposes. For example, is that person a Marxist? Is
149:33 example, is that person a Marxist? Is that person a fascist? Is that person a
149:37 that person a fascist? Is that person a a Nazi?
149:43 Oftentimes there are not objective answers to these questions.
149:46 answers to these questions. And notice that the term science is
149:49 And notice that the term science is itself nebulous. What is science? I dare
149:51 itself nebulous. What is science? I dare you to define it.
149:57 For example, is Actualized already science?
150:03 How about the term "reasonable"? I dare you to try to define reasonable. What the fuck is reasonable? That's the most
150:08 nebulous of terms. What about "crazy" and "absurd"?
150:13 These are nebulous terms. Why does that matter? Well, because when I start telling you advanced
150:19 truths from trans-rational domains of intelligence and consciousness, you're going to call me crazy and absurd.
150:40 But you can't even define what crazy and absurd are. And yet you're using those terms to try to deny what other
150:45 people are telling you, to deny certain truths and facts and realities.
150:53 How about the word "truth"? Everyone uses the word, or the concept, truth, but no scientist can define it. I dare you:
150:58 define what truth is.
151:07 See, rationalism cannot even define what truth is. And yet it entirely depends upon it. How do you even scientifically
151:12 answer the question, what is truth? How do you run a scientific experiment to prove that?
151:22 How do you do a peer-reviewed research study on what truth is? And yet science requires the concept of truth. You can't
151:26 do science without this concept. Some people say, "Oh, well, Leo, you know, science doesn't care about the truth."
151:33 I mean, you're so not understanding the depth of this problem.
151:37 Of course science cares about truth.
151:39 You wouldn't be arguing with anybody if you didn't care about truth.
151:44 The only reason you argue is because you have some vague notion of what truth is, and you think you've got it
151:50 and the other person doesn't. That's the only reason you're arguing.
151:54 That's the only reason you're debunking, or that you're even skeptical about anything: because you have a deep
151:59 intuition for truth, but you can't define it.
152:07 Most of the things your mind cares about, attaches meaning to, and makes truth claims about are hopelessly
152:11 nebulous, and cannot be reduced down to anything formal, atomic, objective, or rigorously scientific.
152:19 Now, here the rationalist will pull this move and say, "Well, yeah, Leo, sure, I agree with you. In practice, you're right, we
152:25 can't do it, because, you know, we can't track the exact physical coordinates of every atom
152:31 in the universe. It's too much, it's too complicated, we can't do all that. But in theory, we could."
152:42 And the answer is no, you can't. In theory, you can't. This is self-deception.
152:46 The mind works in a very loose, informal, abstract, nebulous way for very important reasons. Sensemaking cannot be
152:56 strictly formal.
152:58 And to think that it can is a very deep epistemic and ontological mistake.
153:03 You cannot treat sensemaking like physics and math.
153:13 Physics, math, science: these are all just small subsets of a larger thing called understanding, or sensemaking.
153:22 And you cannot create a separation, because here the scientific person or the rationalist will want to say, "Yeah,
153:25 Leo, I mean, you're talking about stuff, I understand all this, but you're not talking about science. We have, like,
153:32 scientific questions over here, and then the stuff that you're talking about, these are not scientific questions, and we just have to come to terms with that."
153:39 But again, see, you're creating a distinction between scientific questions and non-scientific questions. But you
153:46 don't know what are scientific questions and what are not, because you can't even define what
153:50 science is in an objective, rigorous way.
153:56 And yet you're assuming, see, you're assuming you know these differences and distinctions. And you're assuming that
154:01 they actually exist for you to know them, and that your mind is not constructing them. That's the self-deception.
154:14 Rationalism requires boundaries and categories to be objective, crisp, simple, non-perspectival, non-relativistic.
154:23 Rationalism makes simplifications of complex phenomena, because full descriptions are impossible, and it
154:27 assumes and requires reducible complexity. But many things, most things, are not reducibly complex.
154:38 Quoting David Chapman, quote: "Rationalism attempts to reinterpret nebulosity as linguistic vagueness and uncertainty." End quote.
154:45 And quoting him again: "Rationalism's goal is to prove that rationality is guaranteed correct or optimal, and
154:51 nebulosity makes this impossible." End quote.
154:56 See,
154:59 to the rationalists, it seems like the issues and problems I'm presenting here are just linguistic
155:04 problems.
155:10 But that's not what is being said. What's being said is that these are epistemic problems, and, even deeper, that
155:14 these are ontological problems,
155:18 and that you can't fix this with any amount of formalizing your language.
155:28 Rationalism tries to work around these problems by, quote, "adding formal machinery," end quote.
155:32 This formal machinery obfuscates the problem with counterproductive complexity.
155:39 Rationalism invents ad hoc machinery to try to overcome these issues,
155:45 and it seems like it's going to work, but it doesn't work,
155:51 and it doesn't recognize that.
155:55 and it doesn't recognize that. Now an important little nuance here is
155:56 Now an important little nuance here is that nebulosity
155:58 that nebulosity does not mean that things are just
156:00 does not mean that things are just subjective in a sense of it's just all a
156:03 subjective in a sense of it's just all a matter of opinion and it's all arbitrary
156:05 matter of opinion and it's all arbitrary and anything goes.
156:07 and anything goes. Some categories are more effective than
156:09 Some categories are more effective than others.
156:15 Nebulosity means rather that um you have to be very careful here not to make a
156:16 to be very careful here not to make a straw man of relativism. Go see my
156:19 straw man of relativism. Go see my episode called Understanding Relativity
156:21 episode called Understanding Relativity or Relativism Part One where I make a
156:24 or Relativism Part One where I make a steel man of of relativity. Um because
156:27 steel man of of relativity. Um because it's it's often um straw man. But
156:31 it's it's often um straw man. But nebulosity does not mean uh
156:34 nebulosity does not mean uh that anything goes. What it means is
156:37 that anything goes. What it means is that when you say something, you have to
156:39 that when you say something, you have to qualify it. You have to say that you
156:41 qualify it. You have to say that you know X is true under these in
156:44 know X is true under these in circumstances from this point of view
156:47 circumstances from this point of view for a certain purpose that I have as a
156:52 for a certain purpose that I have as a mind trying to understand the world.
156:56 mind trying to understand the world. It gets rid of this simplification in
156:58 It gets rid of this simplification in science that science is just this
156:59 science that science is just this objective neutral thing. You know when
157:01 objective neutral thing. You know when we're doing science we're just
157:01 we're doing science we're just objectively observing the world. No,
157:03 objectively observing the world. No, when you're doing science, you're asking
157:05 when you're doing science, you're asking questions for specific motivated
157:07 questions for specific motivated purposes of survival, from a human point
157:10 purposes of survival, from a human point of view, from within a human frame, from
157:13 of view, from within a human frame, from within a human mind, limited by human
157:15 within a human mind, limited by human sanity. And within that, you're taking a
157:18 sanity. And within that, you're taking a point of view, you're asking questions
157:19 point of view, you're asking questions about certain variables, certain
157:21 about certain variables, certain phenomenon in the world. All of this is
157:23 phenomenon in the world. All of this is biased, deeply biased.
157:26 biased, deeply biased. And then you're you're mixing these
157:27 And then you're you're mixing these things up in your own mind which biases
157:29 things up in your own mind which biases it even further. And you're adding it to
157:31 it even further. And you're adding it to your own worldview and paradigm and
157:33 your own worldview and paradigm and sensemaking system which biases it even
157:35 sensemaking system which biases it even further.
157:43 For example, "humans evolved from chimps" is true from a certain point of view.
157:47 Did humans really evolve from chimps? Technically, no. Because the common ancestor between humans and chimps is
157:57 not a chimp. It's something in between.
158:01 Chimps actually evolved, you know, on their own for millions of years, alongside humans, to become chimps.
158:07 So our common ancestor is not technically a chimp. But when I say humans evolved from chimps, every one of
158:13 you understands what I'm talking about.
158:15 Right? Of course, the pedants in the audience, those of you who are rationalist pedants, might say, "Oh, Leo,
158:21 no, technically humans did not evolve from chimps." Yeah, you can say all that stuff. But see, if you're smart, if
158:27 you're actually intelligent, you'll realize that pedantry is not intelligence.
158:32 Pedantry is a low form of intelligence.
158:34 A higher form of intelligence is to understand that when I say humans evolved from chimps, you understand my
158:39 point. I don't literally mean that a human came from a chimpanzee's vagina.
158:44 That's not literally what's being said. What's being said, if you're intelligent and you're using your
158:50 intuition in a fuzzy way, is that
158:55 humans came from something more primitive than a human, which was kind
159:00 of chimp-like. That's more of what I'm saying when I say humans evolved from chimps.
159:13 For example, when I say the sky is blue, that's true from a certain point of view,
159:17 right? A pedant might say, "Well, Leo, no, the sky is technically not blue, because,
159:21 you know, at night it's black, and in the morning it's orange, and then, you know,
159:24 if there are clouds, it's white or it's gray." See, a pedant will do
159:28 that. That's a lower form of intelligence. But when I say the sky is
159:31 blue, you're not going to nitpick, you know, which shade of blue Leo is talking
159:36 about. You generally understand the truth of that statement.
159:39 See, truth is not this binary thing of, like, that's true, that's false, and then
159:44 that's all that there is. Truth is, you know, we're capturing shades of
159:50 truth, different perspectives on truth.
159:54 truth, different perspectives on truth. For example, when we say that women give
159:56 For example, when we say that women give birth,
159:58 birth, a woman is someone that gives birth as a
160:01 a woman is someone that gives birth as a definition for woman, that's generally
160:03 definition for woman, that's generally true. We kind of can all understand and
160:05 true. We kind of can all understand and agree with that. But then we also
160:06 agree with that. But then we also understand there's a lot of edge cases
160:08 understand there's a lot of edge cases and exceptions where women are not able
160:09 and exceptions where women are not able to give birth. Some women are born
160:11 to give birth. Some women are born without the ability other people other
160:13 without the ability other people other women have you know uh accidents and and
160:17 women have you know uh accidents and and diseases and so forth that prevent that
160:18 diseases and so forth that prevent that whatever.
160:21 whatever. So is a woman defined by the ability to
160:23 So is a woman defined by the ability to give birth?
160:25 give birth? Technically no.
160:29 You can imagine creating an alternative race of, let's call them women, who are just like regular women except that we completely remove their ability to give birth. They still have all the other features of women: the psychology of women, the moodiness, the emotions, the physical needs, and so forth. They might even have the anatomical features, the breasts, the genitals and so forth, but they're simply not able to give birth because of some genetic tinkering we do at the very micro scale.
161:07 And they look fully like women. In fact, we could even make them look extra feminine, more feminine than your average woman. So for all intents and purposes, when you see this being walking around, you're going to call it a woman. You're going to think of it as a woman, even though it's not technically able to give birth.
161:40 So is that still a woman? Well, there's no objective answer to that. It depends on how you want to look at it.
161:50 And if I did call it a woman, you would understand where I'm coming from; you would understand what I mean, even though technically it doesn't give birth.
161:56 Another example, a kind of controversial one: I might want to make the claim that men care about truth more than women. Is that true or false?
162:14 In what sense is it true to say that? Grant me for a second that it's true in some sense. And see, that "in some sense" plays a very important part, because I mean it in some sense and I don't mean it in some other sense. I also may not mean it across the board; I might mean it just as an average, as a generalization.
162:36 But see, a lot of the truth claims we make about the world, a lot of the ways in which we understand the world, are generalizations. Then the question becomes: what are the right ways to generalize, and what are the wrong ways? This is where a lot of subjectivity and relativity comes into play.
162:54 But suppose we want to say that men care about truth more than women do.
163:08 See, if you're a scientist and you say, "No, Leo, that's not a scientific statement," well, you've got a problem here. Because if there's any truth at all in that statement, supposedly it should be part of science, right? Do you really want to say that science is this very formal thing, and that all these other truths outside of science are things science is not going to look at, not going to care about, or will completely ignore? Well, then your science is extremely limited and crippled.
163:38 On the other hand, in what sense do men care about truth more than women? In what sense can we say that's scientifically true? How do we demonstrate or prove that? Even if it were true, how would we prove it? That's very difficult. See, this kind of thing gets you into the hairiness.
164:00 But look, why do I say this? I'm not trying to be misogynistic or sexist here. If you as a man are trying to understand women, how women work, how women think, their psychology, which is very important for a man's survival to understand, because otherwise you're going to be a crippled man, then you have to understand women's relationship to truth. It's different than it is for a man, generally speaking. And it's important for you to understand that, and that understanding is part of a larger sensemaking you are doing as you grow up in the world.
164:36 It's also different from science, but it's not outside the realm of science either; we don't want to say that. What I want you to see is that we have sensemaking and understanding, which is this giant, generalized thing, very fuzzy, very hairy, chaotic, and abstract. And within it we have a little island that we might call formal academic science.
165:04 And you might say, "Well, so what?" Well, the crux of it is this: what you really care about as a mind is not really science. Even if you're a scientist, what I'm trying to convince you of is that what's really important is not science. What's really important is that you get the entire thing right, that you get sensemaking right. If you get sensemaking right, then you can do science. But if you are focused only on science and you don't get sensemaking right, you will get science wrong, because sensemaking is not contained in science; science is contained in sensemaking.
166:00 As a rationalist, you're missing the forest for the trees.
166:03 And all of this is happening ultimately because reality, the world, the universe, is not a clockwork. It's a mind. The difference between a mind and a physical system is that a mind is fluid. It blends between things. It's fuzzy, it's nebulous. That's the nature of mind: it's fluid, and it's fluid because it's unified. Therefore none of the distinctions are fixed. They don't have a fixed objective reality. They all dissolve; they're all temporary.
166:33 And you can't fix this by sharpening up language or by formalizing it. Rationalism assumes that ordinary language is defective because it's not clear enough. But this lack of clarity, this messiness of ordinary language, is actually a feature and not a bug.
166:59 Now quoting David Chapman: "Whether or not an object belongs to a category often does not have a definite true or false answer." End quote. It depends on spectrums and degrees; it depends on perspective; it depends on purpose. Categorization is contextual.
167:16 Which category should we use? That is not an objective, factual question. For example: categories of mental illness, categories of species, categories of political party, categories of bad science, or "this is anomalous data that we should throw away." These kinds of categories are not atomic facts that you find in the world.
167:44 Nebulosity is not merely a matter of degree or ambiguity. It also depends on your needs, your goals, and your purposes. What do you need the concept or category for?
167:53 For example, what counts as water depends on what you need it for. Is there water in a coconut? Depends on what you need the water for.
168:10 If you're asking because you're thirsty and need to quench your thirst, then in a certain sense there is: you can drink a coconut and it'll quench your thirst. If you're asking for the purposes of running a chemistry experiment because you need pure H2O, then no, there isn't.
168:30 Another example: is that object straight? Depends on what you need it for. How straight do you need it? It might be straight for some macro, general purpose. If you're building some furniture and you have a board, is that board straight? Well, for the purposes of building this couch, yeah, it's straight enough. But if you were going to use it to build a skyscraper, it's not straight enough, because a skyscraper is so tall that you need everything to be extra straight; the tolerances for what counts as straight are much tighter. And then, in a strictly technical sense, nothing is ever straight.
169:19 So there's even a relative factor here: in what sense are you asking the question, and what kinds of questions are you asking? Your questions themselves are not clear-cut. How do you formulate the question? How you formulate your questions very much determines what kinds of answers you're going to get.
169:35 So there are virtually no absolute truths to be found at the level of macro objects. And often what's meant by "true" is just "practically true enough," true for a particular purpose. And your mind understands this intuitively, without you needing to formalize it. That's the power of your mind.
170:01 Humans invent and select ontological categories to suit their survival. Categorization is inherently, irreducibly nebulous. Ontologically, sameness and difference are also relative. I don't have time to explain that here; go watch my long four-hour episode called Sameness Versus Difference, where we go into it in depth. Identity itself is relativistic and mind-dependent. Contemplate that doozy right there.
170:29 Also consider that true and false are not just objective properties of the physical world. David Chapman gives this example: consider a book full of lies. A book full of lies is not physically false, because it is just a collection of atoms, and atoms can't be false. The falseness of the lies in the book only exists inside your mind. A mind is needed to interpret the symbols and deal with the meanings in order to make sense of the world.
171:01 So this simplistic idea that you can just boil truth down to atomic facts doesn't fly. It doesn't fly because when you're reading a book, even a science textbook, you might find a sentence and say, "That's not true, because I did an experiment that showed it's false." But how do you know what's true or false? How do you make sense of that? See, the truth or falsity is not in the atoms; it's in your own mind, in the symbols and the connections between symbols in your own mind.
171:39 Quoting David Chapman: "There exist no definite, absolutely true answers to most questions, however they are stated." There is no objective answer to "Is Trump a fascist?" There is no objective answer to "Is there water in a coconut?" There is no objective answer to "Are apples healthy?"
172:02 Most issues and objects relevant to human life are macro-scale, and even non-physical abstractions.
172:10 We run into questions like: is drinking red wine good or bad for you? These kinds of macro, abstract, nebulous questions. Should you give your children a vaccine?
172:24 Does candidate X have a mental illness? Or take a medical example: let's say I'm a scientist running a medical trial and I want to study depression. Now I have to create the category of depression, and I have to somehow screen depressed people to be part of my trial. But how do I define what depression is in a formal way?
172:44 See how nebulous that is. And depending on how I define depression, how I filter people into my study, that's going to influence the ultimate conclusion of my scientific study of whether some drug helps or hurts depressed people.
173:02 Or take the question: that UFO video you saw on YouTube, should you take it seriously? Very nebulous question. Or: do psychedelics damage the brain? That's a difficult question to answer in general. How about this one: do psychedelics reveal legitimate scientific truths about reality? How do you answer that scientifically? Or this one: does such-and-such a thing count as abuse, or does such-and-such a thing count as theft?
173:39 Well, people have different opinions about that.
173:45 Or a question like: is this scientific instrument, this telescope, calibrated properly? See, you can't have a fully formal method for calibrating a scientific instrument. You have to actually use intelligence to see whether your instruments are calibrated properly, because there are a hundred different ways an instrument could be miscalibrated, and you can't possibly formalize them all. And yet you need to know that your instruments are properly calibrated in order to do science.
174:19 And how do you know that your judgment of the instrument's calibration is sound, when you could be self-deceived about that? See, how do you invent a system of rules to prevent that self-deception? Now we're getting into some really interesting epistemology. These are the real issues that plague science.
174:45 Or a question like: how do you interpret the results of this quantum mechanics experiment correctly? Lots of disagreement about that. Or a question like: do terrorists have a legitimate grievance against the American empire? Are terrorists justified in their violent attacks? What is the rational, correct answer to those questions?
175:12 And you might say, "Well, Leo, this is terrorism; it has nothing to do with science." But all of it is connected and unified by sensemaking, understanding, and truth.
175:24 Ultimately, when people argue with each other about things, they're arguing over truth. Some terrorist will say, "Yeah, we have a legitimate grievance against the government," and somebody else will say, "No, they don't," and you're arguing over truth. In science, you're arguing over truth too. Did the dinosaurs die out because of a comet, or a meteorite, or some virus, or whatever? You're arguing over truth.
175:49 All of this is connected by the more general problem of how you distinguish truth from falsehood. See, your mind is always doing that. You need to do that in order to survive and to understand anything.
176:01 How do you judge whether what I'm telling you is true or false? How do you judge? You say, "Well, Leo, science is the most true thing that we have, so we just use science." But how did you determine that science is the most true thing we have? Did you read a textbook? How did you know that the science textbook was true?
176:31 How about a question like, "Does that computer contain pornography?" Well, what counts as pornography? How do you formalize what pornography is? You might say, "What does it matter?" Well, suppose there's a legal case: somebody is found to have porn on their computer, it's underage porn, and now the government is prosecuting them. Now you need to make a legal case about whether that was truly underage porn on that computer or it wasn't. Well, how do you formalize that? This is why our legal system is notoriously ambiguous and nebulous. Laws are deliberately written to be ambiguous. Why? Because if you write the law too formally, you're going to create all sorts of room for loopholes, and then criminals are going to exploit that. See how interconnected all of this is? It goes so deep.
177:26 Is that thing over there misinformation? How do you define misinformation? Is that person lying? How do you define lying versus delusion versus self-deception versus just a difference of perspective? Is that person gaslighting me? How do you define gaslighting? Is that organization over there a cult? How do you define what a cult is?
178:00 Is that fetus a person with human rights? How do you define human rights? How do you define a person? Does a person begin at the moment of conception, or only when it comes out of the woman's body? When does a person become a person?
178:22 Here's another interesting example: is Christianity true? See, the problem is that these rationalists don't just argue about science stuff. They argue about religious stuff, too. A rationalist wants to say, "No, Christianity is not true." But then how do you define Christianity? And how do you define true? And which version of Christianity are we talking about? There might be some versions which are true and some which are false. And then which parts of Christianity need to be true to say that Christianity is true? Do all the parts need to be true, or just some of the parts, like 50%, 80%? Who determines that percentage?
178:58 Consider, for example, that God existed, but everything else in the Bible was false. So the only truthful part is that God exists, but everything else about Christianity is false. Does that mean Christianity is false, or is it true? In a certain sense, it's false in the details, but it's true in the most important thing. You might say the most important thing about Christianity is that, if there is a god, it recognizes a god, right? So if it gets that right, but everything else wrong, is it still worthwhile to teach it?
179:40 See, these are the kind of nebulous, hairy questions that humans deal with. And not just humans; scientists deal with them all the time too. These are issues of sensemaking.
179:57 And you can't just say, "Well, yeah, there's all these fuzzy human issues, and then there's hard science, and what's really true is the hard science. All this fuzzy, sentimental, chaotic human stuff, yeah, we don't know how to deal with it, but it doesn't really matter, because it's not the hard truth." See, you can't even say that, because you can't even define this difference.
180:24 You don't know if that's true. You don't know if this is a valid distinction that you're making. You certainly haven't proved it. You certainly haven't determined this using science. That's not science. That's a vague intuition that you have, at best. There's nothing scientific about it, nothing formal about it. There is no formal proof that the fuzzy stuff is over here and the hard science facts are over there. There's no formal proof of that.
181:06 See, just to say that non-scientific questions do not matter is itself a non-scientific, bad form of sensemaking. Do you see that? And just to say that everything important is scientific, that is not science. You can't say that, because that's not science.
181:33 And yet scientists say that kind of stuff and believe that kind of stuff all the time. They make those kinds of arguments.
181:41 So the key for you to understand here is that making sense of science requires more than science.
181:51 Rationalism tries to dodge ontology and metaphysics because they would reveal nebulosity and doom the rationalist project. But epistemology and ontology cannot be separated, because we have a unity. If objects are nebulous, then statements about those objects must also be nebulous. Rationalism tries to solve this nebulosity by reducing everything to atomic physics. But atomic physics doesn't apply to macro objects, and it doesn't deal with the actual sensemaking world that humans live in. All the things that humans reason about are at the macro scale, where physics does not apply. And so, in this way, rationality and rationalism create a flatland.
182:35 And when confronted with this problem, rationalists double down and try to add more logical structure to fix it, because they think that something like this must work. But it will never work. That's the point. The entire domain of concepts is nebulous. Mind is a nebulous, implicit thing.
182:59 Here's a long quote from David Chapman. He says, quote, "The hope of rationalism is that some mechanical criterion or procedure could provide certainty, understanding, and control by eliminating non-rational factors. This is not possible because rationality by itself can't deal with the nebulous macro-sized world at all. Abstract formal reasoning cannot reach into that realm. It requires reasonable activity as a bridge." End quote.
183:32 So higher-order phenomena cannot be reduced to atomic facts, and you cannot say that higher-order phenomena are unreal, untrue, unimportant, and unscientific, because they're crucial to human life, and also crucial to science as well. They're even crucial in your application of logic and your reasoning about mathematics.
183:58 Crucial aspects of reality are not atomic facts. For example, what is truth? That's not an atomic fact. What is the self, or the ego? That's not an atomic fact. What is consciousness? That's not an atomic fact. What is corruption? What is love? What is intelligence? What is the understanding of survival, the way that I teach it? What is insight? How does self-deception work? What is a paradigm? What is an orgasm? What is sanity? What is relativity? What is good? What is bad? These are not atomic facts. And yet these are fundamental to your sense of self and to everything you understand, including science.
184:38 Science and rationality themselves are plagued with higher-order questions and issues. That's the performative paradox, or contradiction, of this rationalist project. What is rationality? You can't define it. Is rationalism the best worldview? That's not an atomic fact. Does the scientific method capture all of reality? That's not an atomic fact. Is that thing pseudoscience? That's not an atomic fact that you find out in the world. You can't look through a microscope and discover that the scientific method captures all of reality. See, "Is science true?" is not an atomic fact.
185:18 People just think that "science is true" is scientific. No, it's not. To say that is not scientific. Or to say something like "everyone should be rational." That's not an atomic fact, and that's not a scientific statement.
185:34 But people who are deep into scientism and rationalism are not conscious enough to understand this. They lack the self-reflection and the objectivity, even as they are professing to be objective. That's the irony of all this. That's the twistedness of it and the absurdity of it.
185:58 So sensemaking cannot be reduced to atomic facts. Natural language is loose and informal precisely because it needs to be. Over-formalizing it would lead to a severe limiting of the mind and an oversimplification of reality, to falsehood and error, not to truth, and it would stifle mental creativity. Your mind needs to be loose to handle reality, because reality is ontologically nebulous and fluid.
186:33 And an even deeper point here is that rationalism doesn't merely assume a rigid ontology. It actually creates that reality through projection and self-limitation of the mind. See, what you have as a rationalist is an infinite mind. This infinite mind then gets limited by the fact that you want the world to be rigid, ontologically rigid and simple, to fit into these categories. So what happens is that you take an infinite, fluid mind and you compress it down into the rigid categories that you think should be out there, that you want to see, that your ego-mind wants, because it's just easier to deal with these kinds of rigid categories.
187:20 And so what ends up happening is that you end up creating it, and then projecting it, and then seeing it. You see the rigid categories out there, and then you tell yourself, "Oh, look, it's exactly the way that I thought it was. The world is made out of rigid, objective categories. How do I know that? Because I see it out there." But what you're not conscious of is that you projected that and you created it out of a larger, infinite mind.
187:50 You don't even recognize the world as a mind anymore, because all you see is a world out there. You don't see your own self or your own mind. You just see the objective world.
188:06 Like, "Leo, that cat and that dog, that man and that woman, that's not my mind. That's not myself. That's the objective world. And my mind is in here." But see, you created that difference. You created that, and you're not seeing that you created it. That's the tragedy of this thing. You have to see this profound paradigm lock that you've created for yourself. If you insist on it deeply enough, then it'll become your reality, in a sense.
188:39 Of course, there are limits to that. You can't just imagine unicorns and have them materialize. But when it comes to projecting conceptual categories, you're very, very good at that. When it comes to projecting your own paradigm onto the world: a Christian literally feels like he's in a Christian world. And a Muslim feels like she's in a Muslim world. And a scientist feels like they're in a material world. Why? Because you projected and created it. And it's entangled with your sense of self and sanity, such that when we start to deconstruct it, you're going to feel like you're losing your mind. That's not a mistake. That's exactly correct.
189:20 See, rationalism is not conscious that the mind is creating distinctions and that reality is made out of distinctions. The distinction between the objective and the subjective cannot hold as rationalism assumes.
189:33 Physical objects do depend on the subjective ideas you have about them, contrary to what rationalism assumes. Rationalism does not see how it projects and constructs rigor onto mind at large, onto consciousness and reality at large. Rationalism doesn't understand that the world is the mind. Rationalism underestimates the degree to which the mind mediates reality. Rationalism assumes that the mind is not central to understanding reality, which it is.
190:07 This is why rationalism thinks that you can make a computer that will just plug away mechanically and understand the world. You can't do that, because not only is mind central, mind is reality. There is no other reality but mind. And rationalism doesn't take this possibility seriously. It calls it woo. And as soon as rationalism can label something as woo, it doesn't need to take it seriously anymore.
190:44 All because rationalism assumes mind and reality are objectively distinct things, which they're not. See, once you start to understand what I'm saying here, this now opens up the door to legitimate mysticism.

190:58 All that mysticism ultimately is, is just the realization that reality and mind, physical reality and mind, are one. That's what mysticism is, ultimately. That's not woo. That's literally what's true.
191:17 Rationalism also assumes that the self is objectively distinct from reality. But you are reality. The self is reality. There's no other reality but the self.
191:33 Rationalism takes definition for granted, as though everything just has a definition. Right? There is a definition for a cat, a dog, and so forth. This definition question is central to this problem of demystification. See, the reason that rationalists want to demystify everything is because they think everything has a crisp definition. Mysticism is a recognition that you can't define infinity. Infinity is literally undefined. But the rationalist mind doesn't understand this. It thinks that it can define everything through analysis, because it ultimately assumes that reality is finite.
192:10 And so the rationalist mind ends up projecting its definitions onto the world. The rationalist mind looks out into the world and sees finite objects. It doesn't see infinity. It can't see infinity, because it insists that the only things that are real are those things that have definitions. But infinity can't have a definition. And if it has no definition, then it's not real, according to the rationalist. See, this is a paradigm lock.
192:38 But the definition of any object is never fully known, because every object is actually infinite and part of infinity. Every object is a sub-infinity of infinity, which is why, ultimately, you can't define anything. All your definitions are just vague attempts to somehow reduce the infinite complexity down to some manageable thumbnail that your mind can work with.
193:12 See, what you need to start to recognize is that you use concepts to think without knowing what anything is in full. Your own mind does not know the primitives that it thinks in. All the key primitives are undefined, unknown, and implicit. For example, I'm going to give you a list of words here. You're going to know in general what all these words are, but at the same time, you don't know what any of them are. So, here we go: game, man, alive, electron, particle, energy, time, force, truth, concept, rationality, science, intelligent, object, objective, matter, physical, consciousness, mind, formal, quantity, quality, woo, irrational, sanity.
194:04 Technically, none of you know what any of these things are. On the other hand, you kind of know what I'm talking about when I say these words. You're not that deeply confused.
194:23 Yet, even though you don't have definitions for any of these terms, you use them all the time to make sense of the world. And you use them all the time in science, too. You couldn't do science without knowing any of these words. So that's the situation that we're in.
194:36 So what the rationalist needs to notice is that his own mental process is not formal or rigorous. It is inherently fuzzy, sketchy, and intuitive.
194:55 And your mind couldn't work any other way. Or if it did, you'd be very stupid. And you certainly couldn't do science or math or logic or anything sophisticated like that. You'd basically be like one of those robots caught in a logic loop.
195:21 See, think of it: when a child is born, how does the child make sense of or understand anything at all? Because the child doesn't have definitions in its own mind. The child's mind is not operating on formality of any kind. How does a child know, when you point at an apple, that that's an apple? Not formally, in any way. The child groks it. The child intuits it through a fuzzy, intuitive process.
195:49 How does a child know the difference between a cat and a dog? How does a child know what abstract things are? You might tell a child about happiness. How does a child learn what happiness is? How does a child learn what suffering is? How does a child learn what a color is? These are gross abstractions. How does a child's mind figure this out? Through a very nebulous, intuitive process.
196:21 And in fact, I'm going to go one step further for you here. And this is where your mysticism allergy, you know, your woo allergy, watch out. What I want to claim is that the child's process of learning the difference between a cat and a dog is actually a mystical process.
196:50 Entertain that idea. I want you to seriously entertain the idea that without a mystical process, without a mystical intelligence, the child literally could not understand the difference between a cat and a dog, or what a color is, or what happiness is. And your child would just be like an animal. It would be like a baboon, which doesn't know these higher abstract concepts.

197:13 So here's the thing that I'm trying to teach you: you have a baboon, you have a human child, then you have a human adult, then you have, let's say, a highly rational MIT scientist. And what's the difference between all of these? Well, the baboon understands some stuff. Of course, a baboon can differentiate between a cat and a dog, not in a conceptual way, but in a perceptual way. The human child now has more advanced concepts. Think of it this way.
198:01 A baboon lives in a purely sensory kind of world, with no higher concepts, no abstractions, right? The child gets some abstractions. It learns concepts like happiness, cat and dog, color. It learns about science and so forth. And then, as you move up in human development, you learn more advanced abstractions.
198:23 Now, the human scientist tells himself, "Well, we're just being rational and scientific. There's no mysticism going on."
198:26 But really, think of it this way. Here's a recontextualization for you.
198:32 To go from baboon to child in terms of mental understanding, that is a step up in mysticism. The child is more mystical than a baboon because it's able to understand abstract concepts like color.
198:49 The scientist is able to understand even more abstract concepts, like, you know, relativity, spacetime, and all sorts of fancy scientific jargon concepts.
199:01 That is actually a step up in mystical intelligence. You need more intelligence, more abstract capacity, more intuition to do that, because you couldn't get there through a formal process, just like a baboon cannot use a formal process to get to the child's level of understanding of anything.
199:18 And likewise, beyond the scientist here, you can open your mind so deeply that you expand to a level beyond that. There's the difference between a baboon and a child; there's an equal level difference between a child and a scientist, and an equal level difference between a scientist and me.
199:37 My understanding of reality does not compute from the lower paradigm.
199:44 So a baboon cannot understand anything that it's missing from what the child understands. And a child can't understand what it's missing from what a scientist understands. And a scientist can't understand what it's missing from what I understand.
199:57 And what I'm trying to teach you is what I understand. [laughter] Which is why we're having this whole conversation. Right?
200:05 The problem here is that you're so paradigm-locked in each one of these cases that you can't even imagine anything above you.
200:18 So if you're a scientist, if you're an MIT scientist, I want you to imagine that there's something above you that's more above you than the difference between you and a baboon.
200:26 And you can't imagine what that is, but I know what that is, because I have it and I can teach it to you, but only if you're extremely open-minded.
200:39 [sighs] That's where we're at.
200:44 So, fundamental to what intelligence is, is the ability to handle vagueness, fuzziness, nebulosity, uncertainty, lack of information, ambiguity, perspective.
201:06 You ask, "Well, Leo, where's the mystical stuff?" The mysticism is just in your ability to understand the difference between a cat and a dog. That's already mystical.
201:25 Your ability to count from one to two to three to four is already mystical. You're just not grasping it because you take it for granted.
201:34 If a baboon got the ability to count numbers, it would think it has a mystical superpower.
201:42 But you, because you just take everything for granted, because you're just like a fucking baboon in human form, you take everything for granted. Even as a scientist, you take it all for granted. All of your intelligence, all of your understanding capacity, all of your intuitive abilities, all of it, you take for granted, because you're so fucking stupid.
202:04 To you, that's just nothing. It's nothing. It's just normal. It's just atoms. All it is is atoms and neurons.
202:12 You don't understand the insane mystical intelligence that is inside of you. Just to understand this episode required such crazy levels of intelligence that no other animal on this planet can do it other than humans. And even most humans won't understand what I'm talking about.
202:38 Mysticism is not about seeing crazy fucking unicorns and shit flying around you and ghosts and whatever and Jesus.
202:43 It's about recognizing that all of the ordinary stuff that you thought was just atoms is not actually atoms. It's a profound mystical experience. Your entire life is one giant, incredible mystical experience, and you're just sleepwalking through it as a scientist, as a materialist, as an atheist.
203:02 That's all we're talking about here. [snorts]
203:16 So, let's wrap it up here. We still have a ton of content. All the most interesting content is still yet to come.
203:20 But before we get there, let me give you a homework assignment. If you like this content, you must go read metarationality.com. A lot of the examples and information that I pulled was from that website. That website is like a book. It'll take you weeks of reading. There's a lot of dense, technical material there to read through. Go read it.
203:42 Credit again to David Chapman for a lot of me borrowing his insights here and quoting from him and all that. He did some great work. However, his work is still limited. It is still very bounded within rationality, and there's something beyond his work. To understand that, that's what we need part two and part three for. So, we're going to get to that quite soon.
204:06 So, we're done here. That's it for part one. Please check out my life purpose course. Check out my book list if you're looking for profound books to read; I've compiled a list of the best books that I've read. Check out my blog. I'm going to fix that soon and keep updating it more. That's coming soon. And let me just end on this.
204:30 Part two is coming in the next two weeks. I know sometimes when I do these series, I can take a long time between episodes. I'm being very careful not to do that, because all these parts need to go together. So, in about two weeks I'll release part two, and another two weeks after that I'll release part three. So, don't worry. All of that is coming in the next month. You'll have the whole three-part series, and all of it will be one seamless body of work. Make sure that you don't quit halfway through here.
204:57 through here. Uh like I said, most people who watch my
205:00 Uh like I said, most people who watch my videos only watch for 25 minutes.
205:03 videos only watch for 25 minutes. Imagine finding this information and
205:06 Imagine finding this information and then only watching for 25 minutes and
205:08 then only watching for 25 minutes and missing all of it.
205:11 missing all of it. How tragic is that, right? How tragic is
205:14 How tragic is that, right? How tragic is that? Um,
205:16 that? Um, so you're only going to get from this
205:18 so you're only going to get from this work what you put into it. I I expect
205:20 work what you put into it. I I expect you to
205:23 you to spend at least a thousand hours watching
205:25 spend at least a thousand hours watching and listening to my content. At least
205:28 and listening to my content. At least you need at least a thousand hours to
205:29 you need at least a thousand hours to understand what I'm trying to teach you.
205:32 understand what I'm trying to teach you. And so it's very important that you
205:34 And so it's very important that you found you got so lucky that you found
205:35 found you got so lucky that you found this material basically by accident. you
205:37 this material basically by accident. you stumbled upon it through some YouTube
205:39 stumbled upon it through some YouTube algorithm or whatever.
205:41 algorithm or whatever. Um, but see, the saddest thing is that
205:45 Um, but see, the saddest thing is that people find this content by accident.
205:47 people find this content by accident. They're like one in a million to find
205:49 They're like one in a million to find this content. You find it, but then you
205:51 this content. You find it, but then you get lazy and complacent, as we usually
205:54 get lazy and complacent, as we usually do, and then you fall off track. You
205:56 do, and then you fall off track. You forget about it. You only watch halfway.
205:59 forget about it. You only watch halfway. I say this because whenever I release
206:01 I say this because whenever I release one of these deep series, part one, part
206:03 one of these deep series, part one, part two, part three, I get like a 100,000
206:06 two, part three, I get like a 100,000 views on part one, 50,000 views on part
206:09 views on part one, 50,000 views on part two, and then 25,000 views on part
206:11 two, and then 25,000 views on part three, which is the most important part.
206:21 It's baffling just how complacent people are with this kind of information.
206:24 This information will change your life. I spent 20 years trying to understand this. Seriously, this is like all of my philosophical contemplations boiled down into 10 hours. You think 10 hours is a lot to ask? It's nothing compared to the 20 years I've spent thinking about this stuff. Hundreds of hours, thousands of hours went into just this.
206:50 And with my work, we're always building towards something bigger. Everything is interconnected. We're interconnecting more and more stuff together, so the best is always yet to come. So, make sure you stick around. Do not quit halfway through.