0:00 This is Endgame. I'm Amanda Cassatt. This
0:02 episode is with George Hotz. Hotz, best
0:04 known in some circles as Geohot, hacker
0:06 of the iPhone and Sony PlayStation
0:08 turned AI founder with the open-source
0:10 self-driving device, comma, and the
0:12 elegant and concise machine learning
0:14 framework tinygrad. We talk about
0:15 where AI is going and of course the
0:17 macro, what on earth's going on with the
0:19 US government and with China. Give it a
0:21 watch or listen wherever you consume
0:22 podcasts.
0:26 Okay, the first question I wanted to ask
0:28 is let's just place you in the spectrum
0:32 of AI people. So we have the Yudkowsky
0:36 style AI doomer on one hand and we have
0:39 the Dario Amodei style Machines of
0:42 Loving Grace abundance utopian on the
0:46 other side. You are neither of these
0:48 things. Let's just lay the foundation.
0:51 Where are you on the spectrum and what
0:53 do you think is going to happen? I think
0:55 you have to separate the what's going to
0:58 happen from the how do you feel about
1:00 it? So let's have two separate
1:03 conversations. What's going to happen
1:04 and how do you feel about it? Machines
1:07 are going to take over and replace
1:08 humanity and I feel good about this. So
1:12 let's talk about the steps of number
1:14 one. Then we'll return to our feelings.
1:17 How did the machines do it? How does it
1:18 happen? Robin Hanson's Age of Em is,
1:21 despite getting a bunch of the hows
1:25 wrong, I think the overarching story it
1:27 tells is
1:28 right. It's just a question of economic
1:31 competition. Um, if machines can produce
1:34 more value for less input cost, they
1:36 will out compete
1:38 humanity and that is the natural way of
1:42 things. So what's the timeline? So right
1:44 now we start with LLMs. Take us on the
1:47 journey from how we start with these
1:49 chat bots and how that ends up with
1:51 human economic value.
1:54 I mean decreasing to zero. No, this
1:56 journey started in 1950. The journey
1:58 started the minute someone had a
1:59 computer. This is not a there's nothing
2:01 new about LLMs. There's nothing new
2:03 about AI. AI is just a blanket word for
2:06 kind of like things machines can't do
2:08 yet, but now like machines can kind of
2:10 do everything. So we kind of call it AI.
2:12 But no, this journey has been happening
2:14 for a long time. There used to be people
2:16 whose job it was to sit in a room and
2:18 add numbers up. And then machines came
2:22 in and did that much more
2:24 cost-effectively than those people. For
2:27 a person, you have to give them food,
2:28 give them housing. For the machine, you
2:31 don't. And it adds numbers a lot better
2:33 than people. So, it's just a
2:35 continuation of the same trend. And that
2:37 trend will continue smoothly into the
2:40 future. There is no singularity. There
2:42 is no inflection point. It's just a smooth
2:45 trend continuing. So the singularity is
2:48 not coming. The singularity is not
2:49 nearer. We're just going to slide
2:51 unconsciously into it and past it. I
2:53 don't even know what the singularity
2:55 actually means. If you define the
2:58 singularity as a point where machines
3:01 can do most of the productive work, I
3:04 argue that probably even
3:06 happened 20 years ago. If you take the
3:09 singularity as machines could do all of
3:11 the productive work, I don't think that
3:12 will ever happen due to comparative
3:14 advantage. Explain comparative
3:15 advantage. It's like a trade thing. Even
3:17 if one country is capable of producing
3:20 everything for cheaper than another
3:22 country, it still makes sense for those
3:24 two countries to trade
3:26 because yeah, comparative advantage.
3:28 It's in the term. Yeah.
3:30 So it's more proximal for that country
3:32 to create things to serve their market
3:34 even if it's more expensive for them to
3:36 create things. No. Okay, comparative
3:38 advantage. The concrete example is like
3:40 you have good A and good B and country
3:42 one and country two. Country one can
3:44 produce both good A and B cheaper than
3:46 country two. But does that mean they
3:48 shouldn't trade? Well, no. Because there
3:50 could be a comparative advantage. If
3:52 country one can produce B way cheaper
3:55 and produce A only slightly cheaper, it
3:57 actually makes sense for country two to
3:58 produce A and for them to trade for
4:01 B. So there will always be this
4:03 comparative advantage with humans. Even
4:05 if machines can do everything better
4:06 than humans, humans will always have a
4:09 comparative advantage. So human economic
4:11 value isn't going to zero because of
4:13 comparative advantage. And so let's talk
4:15 about where those comparative advantages
4:17 are. Just to plot you on
4:19 that plot with Yudkowsky and Dario Amodei.
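The comparative-advantage arithmetic described above can be made concrete with a small sketch. The labor costs below are hypothetical, chosen only to mirror the "country one is better at both goods, but much better at B" setup; the point is that the trade decision turns on opportunity cost, not absolute cost.

```python
# A minimal sketch of the comparative-advantage arithmetic described above.
# Hypothetical labor costs (hours per unit): country 1 is absolutely cheaper
# at both goods, only slightly cheaper at A, and much cheaper at B.
cost = {
    "country1": {"A": 9, "B": 2},
    "country2": {"A": 10, "B": 8},
}

for country, c in cost.items():
    # Opportunity cost of one unit of A, measured in units of B forgone.
    opp_cost_a = c["A"] / c["B"]
    print(f"{country}: one unit of A costs {opp_cost_a:.2f} units of B")

# country1: one unit of A costs 4.50 units of B
# country2: one unit of A costs 1.25 units of B
# Country 2 gives up less B per unit of A, so even though it is worse at
# everything in absolute terms, it should produce A and trade for B.
```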
4:23 So Dario Amodei is also of the opinion
4:25 that machines are going to replace
4:27 people at every task, except he's
4:29 joyful about it and he's excited to have
4:31 a life that is free from labor and to
4:34 devote himself in his essay to mountain
4:36 biking. And then you have Yudkowski who
4:39 thinks that yeah machines can replace
4:41 people but also they might explode the
4:43 world and we might be in some doomsday
4:45 scenario where we all die because of
4:46 some error made through AI. So let's
4:49 let's plot this kind of comparative
4:51 advantage where humans are going to be
4:53 in that because you also have people
4:54 like Karpathy saying agency, right?
4:56 Like the human value prop
4:59 is having the agency to use all these
5:01 tools. The locus of agency is still
5:03 going to be with humans in terms of how
5:05 they instrumentalize the machines. No,
5:07 it's nothing like this. All right. It
5:09 it's more like some form of they work
5:11 for each other. What's they work for
5:13 each other and where does that come
5:15 from?
5:16 Uh, well, it comes from Rick and Morty
5:18 about that. No, no, no, no, no. You see,
5:20 it's not it's not slavery. They work for
5:21 each other. So, there's a Rick and Morty
5:23 episode where Rick builds a car battery.
5:26 And it turns out the way the car battery
5:27 works is that there's a little
5:28 civilization inside the car battery. And
5:30 Rick siphons 20% of the excess power of
5:34 the civilization to power his car. And
5:36 Morty says, "Rick, that's slavery." And
5:38 Rick's like, "Well, you see, no, no, no,
5:40 they work for each other." And then just
5:42 the excess power is devoted to my car.
5:45 So no, I I think that there'll always be
5:47 human economies in the same way today
5:49 there's ant economies. Even though
5:52 humans could do everything economically
5:53 more effective than ants, if we wanted
5:56 to build an anthill, we can do it very
5:57 economically effectively compared to
5:59 ants. Ants still work for each other.
6:01 Mhm. So that that's kind of what I mean
6:04 by so at the end of the day, humans are
6:07 only using machines mostly because we
6:10 want things from each other and in order
6:12 to get things from each other. And so
6:14 the human value prop will still be alive
6:16 because at the end of the day, what's
6:18 the goal of using machines anyway? It's
6:21 to change your interaction with other
6:22 humans. No, I think that this whole
6:24 thing puts the story of the universe
6:27 what's the what's the like like what
6:29 what they used to believe about how the
6:30 earth was the center of the universe?
6:32 Anthropocentric or something?
6:34 Geocentrism. Yeah. Yeah. Yeah. But this
6:36 is like like anthropocentrism, right?
6:38 It's the idea that you're putting humans
6:40 at the center of the story when I mean
6:43 you can take like e/acc at its best will
6:45 talk about the story of the universe is
6:49 basically the uh mastery of entropy.
6:52 Yeah. The human mastery of entropy. No,
6:54 no, no, no. Humans using machines to
6:56 master entropy. No, there's no humans in
6:58 the story. It's entropy itself, right?
7:00 It's this idea that we can create
7:01 structures. It's the idea that we can
7:03 locally combat entropy. Who's we?
7:06 Doesn't matter. The inhabitants of the
7:07 universe. I mean,
7:09 humans privilege human stories in the
7:12 same way I'm sure ants privilege ant
7:13 stories. So, ants don't care, but
7:16 there's this species called humans
7:18 that's more advanced than them that
7:19 could crush them if they really tried
7:20 hard to. It's not worth it, though.
7:22 Obviously, ants don't care about that
7:23 because primarily what they want from
7:25 life is from other ants. And so, the
7:27 fact that these other things exist is
7:28 almost irrelevant to them. Is that what
7:30 our relationship with machines is going
7:32 to be? I mean, all kinds of things will
7:34 exist, right? There's some ants that
7:36 don't interact with humans. There's some
7:37 ants probably deep in the Amazon jungle
7:39 that have never seen a human that have
7:40 never thought once about humans. And
7:42 then there's some ants that come infest
7:43 your trash can. So, we're the ant
7:45 infestation in AI's trash can. We're
7:47 everything. Humans will be
7:50 all sorts of different things. There is
7:53 no there is no shared common human
7:54 destiny. So, there are these two
7:57 phenomena and I'm trying to fit them
7:59 together. So there's this story where
8:03 machines can do everything a human can
8:05 do better and more economically whether
8:07 it's physical labor with AI robotics or
8:11 whether it's cognitive labor with things
8:13 that look like LLMs but will improve. So
8:15 there's that human labor eventually
8:18 reaches almost zero except for where
8:20 there's comparative advantage. So
8:22 there's that story and then there's the
8:24 story oh human labor now human labor
8:26 never reaches zero. So human labor
8:28 continues to be valuable. What do you
8:30 mean by valuable? Right? Like is ant
8:32 labor valuable to other ants? Sure.
8:34 Yeah. Then human labor will still be
8:35 valuable. I think maybe makes sense to
8:37 go a layer of abstraction out and try to
8:39 answer try try to contextualize the
8:41 question. I think the thing that people
8:43 worry about is that they're not going to
8:46 have a job. People worry about all sorts
8:48 of stuff. Well, people worry about the
8:50 universe laughs at your worries. People
8:52 worry about their survival, right?
8:53 People worry about their Maslow hierarchy
8:55 of needs. People worry about, am I going
8:58 to have food, water, shelter. Oh no, you
9:00 might die. You might join all the
9:02 billions of humans before you who die.
9:04 How am I doing relative to my
9:05 expectations? How am I doing relative to
9:07 my culture's expectations? The people in
9:10 my community. There's absolute how
9:12 you're doing in absolute terms. There's
9:14 how you're doing in relative terms. So
9:16 people have fears because of AI that
9:19 they're not going to do as well either
9:21 in absolute or relative terms than they
9:23 are now or than they expect. Well,
9:25 people have a lot of things and they'll
9:27 have options to take pills to make those
9:29 things go away if they so choose. But
9:31 what you're saying is there's nothing
9:32 about AI that inherently is going to
9:35 mean people people's relative or
9:38 absolute quality of life is going to go
9:40 down. No. So people should not worry
9:42 about that. Well, it depends what you
9:43 mean by relative. Absolute. No way.
9:45 Relative to other people. Well, I mean
9:47 that's stupid. Some humans are going
9:49 to coexist with the AIs and be
9:52 fabulously wealthy. we'll have
9:54 trillionaires and stuff that that if
9:56 you're worried about your relative
9:58 wealth, like I have some bad news for
10:00 you. Um, but if you're worried about
10:02 absolute, if you're worried about like
10:04 starvation or AI showing up with a
10:06 machine gun and shooting you
10:09 in the head for some reason, that's not going
10:11 to happen. Well, let's talk about wealth
10:12 for a moment. So if a person alive today
10:17 believes your thesis about what's going
10:19 to happen with AI, what can they do to
10:22 get economically aligned with that
10:24 outcome? What what should they invest in
10:26 in order to be one of those people that
10:28 benefits? So maybe let's see, it's 2
10:30 million years ago and you are a bonobo.
10:33 Okay. Was I a bonobo 2 million years
10:35 ago? Sounds about right. Right.
10:37 Ancestors. I don't know. 2 million, 10
10:39 million, 100 million, whatever. I don't
10:41 not one of those. I'm not one of those
10:43 animal history guys. But so you're
10:46 asking the question as a bonobo Mhm. I
10:49 see the rise of humanity coming. Mhm. Uh
10:51 what should I invest in? It's
10:54 as ridiculous as that. What
10:56 about within my bonobo
10:58 lifespan? So there's nothing to do
10:59 within our lifespan. What do you mean
11:01 nothing to do? What what are you trying
11:03 to do? Well, there's everything to do in
11:05 our lifespan. You could you could you
11:07 could like go to Tibet and climb a
11:09 mountain. But the question is how to
11:11 economically align with the rise of AI
11:13 that's going to happen within our
11:15 lifetime. Is there a way to do that? Is
11:16 there a way to make bets on the future
11:19 materializing how you think it will be?
11:21 That'll be useful to people. Look, I
11:25 don't, this is very alpha-probing,
11:28 and I avoid things that are
11:31 alpha-probing. Yeah, of course there are.
11:33 You can think about them. I don't even
11:35 understand why. Like, I'll say
11:39 why: I just kind of ignore that
11:41 question.
11:44 Your
11:47 outcomes in life, over your 80-year,
11:50 100-year life expectancy, are so much
11:53 more aligned with, sorry, so much more
11:56 correlated with humanity as a whole
11:58 than any decisions you make. How so? In
12:00 one world, we can end up in a nuclear
12:03 war. We can end up devastating all the
12:06 technological capacity, setting things
12:08 back. We could stop AI today by
12:10 setting off all
12:11 the nuclear bombs. Yeah. Yeah. That
12:13 would certainly set AI back 20 years at
12:15 least. It might not stop it forever, but
12:17 it will have so much more impact on your
12:20 individual life than any decisions you
12:22 make. So with respect to that question,
12:24 how can I get ahead? That's probably the
12:27 wrong question. Just ask how can I not
12:29 fall behind? And the answer to that is
12:31 nothing more than figure out how to
12:32 produce value for other people. Figure
12:34 out how to produce more than you
12:35 consume. And always make sure you're
12:36 producing more than you're
12:38 consuming. Think about how to
12:39 sustainably produce more than you're
12:41 consuming. And as long as you keep doing
12:43 that, well, the universe wants you
12:47 around. The thing that I was trying to
12:48 steer you to with that admittedly alpha
12:51 heavy question was toward your essay,
12:53 Nobody Profits, and the the thesis
12:56 around that. Maybe you can frame for us
12:58 who's trying to corner off the value
13:00 from AI. And this this whole thing is
13:03 it's just so ridiculous. Like they're
13:04 obviously going to lose. It's not even
13:06 worth like anyone who believes in
13:08 intellectual property is a clown, right?
13:11 There's no there's no there's no future
13:13 in that. That doesn't make any sense,
13:14 right? But that's not the thesis of
13:16 nobody profits. But set up the
13:17 intellectual property argument. So
13:19 intellectual property is this idea. If
13:21 if I had a machine tomorrow that could
13:24 make unlimited amounts of rice and you
13:26 could just it's like it's like a rice
13:27 cooker, but you always open it and it's
13:28 always full with new rice. We could feed
13:30 the world. We could solve world hunger.
13:34 But we won't because you see that would
13:36 interfere with rice company profits. So
13:39 big rice would come and um confiscate
13:41 your infinite rice cooker. Of course, I
13:43 mean forget even about big rice. You
13:44 know, you can look at the problem with
13:46 like uh people who are hungry in Africa
13:49 and it's not really due to food
13:50 scarcity. It's due to okay, some UN
13:53 programs going to give food to this
13:54 country. Who are you giving the food to?
13:56 You're not going to be able to hand a
13:58 bowl to each person. You're going to
13:59 give it to the government. And the
14:01 government says, well, we could either
14:03 give the food to the people or we could,
14:05 you know,
14:07 not right. We're just going to we're
14:09 going to we're going to hoard it. We're
14:10 going to create we're going to create
14:12 scarcity around this abundance because I
14:15 can personally profit off of that
14:16 scarcity. Now, of course, the rice
14:17 machine doesn't exist, but the the uh
14:21 music copying machine does, the video
14:23 copying machine does, the idea copying
14:25 machine does. There's a Joe Biden quote
14:28 where he says uh people downloading
14:32 movies is no different from someone
14:33 smashing the window at Tiffany's and
14:35 grabbing the bracelet. And like, no, it
14:38 is. That's a Biden quote. I thought that
14:39 was that warning message they used to
14:40 play on DVDs in the morning. You
14:42 wouldn't download a car. Yay. Wait,
14:44 man. I can download cars. I'd
14:47 love to download a car. That'd be sick.
14:48 Like, that's the kind of future I want
14:50 to live in. So, what intellectual
14:51 property is is it takes a good that's
14:53 fundamentally non-scarce and uses
14:56 violence to create a regime of scarcity
14:58 around it. It's nonsensical and it will
15:00 go away. So, currently, if you use Chat
15:02 GPT image generation, for some reason,
15:05 you can use the Studio Ghibli style because
15:06 it's a general style. It's not an IP
15:10 protected name, but then if you ask it
15:12 to create something in the style of um
15:16 Tim Burton or if you ask it to create
15:18 something in the style of Lisa Frank, it
15:21 won't do it. Are those the kinds of IP
15:22 protections that you think are going to
15:24 vanish? All I heard was someone has a
15:26 gun to Sam Altman's head and told him
15:28 that he better not generate any
15:29 copyrighted Tim Burton or Anne Frank
15:32 stuff. Sarah Frank, whatever. Lisa
15:34 Frank. Lisa Frank. There you go. Yeah.
15:36 No, but I mean, no. Yeah, of course.
15:38 Right. The there's no fundamental reason
15:39 the AI can't do that. In fact, by
15:41 nature, the AI can do that and there's
15:44 an extra set of guard rails put on top
15:46 of it to prevent that. Of course, that's
15:47 going to go away. I think that gets us
15:49 to AI alignment. Do you think that the
15:52 AIs that are emerging today are aligned
15:55 with people or do you think they are
15:57 misaligned? Uh, mostly aligned. I I
16:00 really see very little difference
16:02 between alignment and capability.
16:04 Alignment is just, I asked the AI to do
16:06 something and did it do it? And yeah,
16:09 sure, you'll have the people who are
16:10 focused on this whole idea of, well, if I
16:13 ask ChatGPT how to build a bomb it won't
16:15 tell me, so it's not aligned with me. I'm
16:16 like, libertarians. My much
16:20 bigger concern about alignment is when I
16:21 ask it to prove the Riemann hypothesis and
16:24 it doesn't do that, so it's misaligned.
16:25 Alignment and capability are the same
16:27 thing. It's not fully aligned with humans
16:28 because it's not capable of answering
16:30 all the questions you ask it I don't
16:31 even think aligned is the right
16:33 question. I think there's only a
16:35 question of capability. There's just a
16:37 question of saying, "What can this AI
16:39 do?" What kinds of things do you think
16:40 governments should actually pressure AI
16:43 companies to prohibit their AIS from
16:45 doing? Do you think there are things
16:47 that I Do you think there are legitimate
16:51 reasons why a government should have
16:52 that gun to Samman's head? I'm not even
16:54 like going to like entertain this like
16:56 like governments are
16:59 a relatively new construct in history.
17:03 How so? Some kind of government has
17:05 always No. What? A tribe has a
17:08 government. Not really. What's a
17:11 government? Yeah. Like like this concept
17:13 of oh well should we use policy to
17:16 influence the D? This is nonsense. Sure,
17:19 it might make some sense for the next 10
17:21 or 20 years, but like the overall arc of
17:24 history bends toward we are literally a
17:26 bunch of bonobos talking about well
17:29 should our bonobo tribes do something to
17:32 limit the rise of humans? Should we
17:34 take action about the human question? So
17:36 you don't think that government will
17:38 ultimately have any say in what happens
17:40 or doesn't happen with AI? Yeah. People
17:42 will just the incentives are so great to
17:46 build certain kinds of things that
17:48 regardless of whatever restrictions are
17:50 in place they will be trampled. Yeah. I
17:54 mean I don't even think it's incentives.
17:56 Sure. You can like kind of frame it as
17:57 incentives. there is
18:01 a, maybe this is like Whig
18:03 history, but I do think that there is a
18:07 direction in
18:10 which, maybe I'm just so
18:13 deeply a believer in fundamental
18:15 capitalism, but
18:17 the direction flows towards wealth.
18:24 The direction? What do you mean? History,
18:26 history trends toward wealth. There's
18:28 local aberrations against that.
18:32 Sometimes you'll have like a local okay
18:34 this is a you know a collapse or
18:36 whatever but the overall trend is
18:39 towards Yeah. So again, like when AI is
18:46 spreading von Neumann probes across the
18:49 galaxy, wow, the little Earth government
18:52 is saying, "You can't make a Tim Burton
18:54 picture." Like, it doesn't make any
18:56 sense, right? Enforce your IP regime on
19:00 Mars. How you going to enforce it? IP is
19:04 this complete aberration enforced by
19:06 really the US. And as we see a decline
19:08 in US power, we'll see a decline in
19:09 intellectual property. I don't think the
19:10 Chinese have the same idea on it.
19:12 China, the US, and Europe have always taken
19:16 pretty fundamentally different
19:18 philosophical approaches to technology
19:21 and IP, right? So we have the US with
19:23 this IP enforcement regime. We have the
19:27 we have the EU that's even more extreme
19:30 with things like the right to be
19:32 forgotten with kind of gumming up the
19:33 works with all these cookie policies and
19:36 GDPR information policies. So they're
19:38 all all the way on the extreme. Then the
19:40 other extreme we have China which
19:42 doesn't have the same concept around IP
19:45 and there were you know concerns that
19:47 DeepSeek uses some of OpenAI's
19:51 resources or copies them. Do you think
19:52 that that's the future? Do you think
19:54 that we're going to have all of the
19:56 values siphoned out of US and European
19:59 projects by people in countries that are
20:01 willing to copy? No. Like, again, I
20:04 look at this from a whole different
20:06 perspective, a whole different scale. Of
20:09 course when things are no longer
20:11 competitive they go for protectionism.
20:13 That's it. Well, the
20:14 characters in Silicon Valley the TV show
20:16 perfectly encapsulate kind of the
20:18 stereotypical vision of what each of
20:20 those countries is right like all of the
20:22 guys that are that are building the tech
20:24 company in Silicon Valley are building
20:25 proprietary technology. Then Jian-Yang
20:28 comes and copies it and brings it to
20:29 China and scales it up. Is is that a
20:32 real reflection of the dynamics between
20:35 countries building technologies? No, I
20:37 don't think so. I think that the era of
20:39 China copying is is over and China's
20:41 innovating now. I think I I like to
20:43 think about it more like that. More like
20:44 this. So like back in the day, you might
20:46 have had like the Romans and the Greeks
20:47 and the Romans and the Greeks would have
20:49 a war. And both of them believed that
20:52 the gods that they prayed to determine
20:54 their success in that war. Mhm. We
20:56 prayed to Athena. We prayed to Janus.
20:58 We prayed to Hera. You prayed to Zeus. I
21:01 don't know. I don't know the names of
21:02 the gods. But it turns out none of these
21:05 gods actually matter which one you pray
21:06 to. Whether you pray to Athena or
21:08 Janus, it doesn't matter. But some
21:10 people started praying to the metal god.
21:13 See, these guys were making their swords
21:14 out of wood, those guys were making their swords out
21:16 of wood. Hephaestus, the god of metalworking.
21:18 A new a new tribe came along and started
21:20 praying to the metal god. And the thing
21:21 is the way that you worship the metal
21:22 god is that you make your swords out of
21:24 metal instead of out of wood. And wow.
21:28 Wow. The metal god, that turned out
21:29 to be the right god. The people who
21:31 prayed to the metal god started doing
21:33 really well in
21:35 war. Everyone else is hitting people
21:37 with wooden sticks and one guy shows up
21:38 with a metal sword metal sword guy. So I
21:42 think about the arc of technology much
21:43 more like that than any of these local
21:45 sort of things. There is a correct god
21:47 to pray to and the winner will simply be
21:52 whoever is best at communing with that
21:54 god. Who's the correct god to pray to
21:56 now? And is there a group of people
21:58 that's doing it right? The Chinese are
22:00 doing it far better than the West. And what
22:02 is that god? If you Yeah. Like I think
22:05 you can look at the difference between
22:07 the West prays to some god of capital
22:10 and the Chinese pray to some god of,
22:13 no, we make steel in a big factory. But
22:15 what about the profit? I don't worry
22:17 about that. The Chinese might believe
22:20 that 10 competing steel firms eking
22:23 out. I think, I mean, I guess I'm
22:26 really against the Peter Thiel
22:28 idea of a monopoly. Monopoly is bad.
22:30 Competition is good. Competition which
22:32 grinds everybody to small profit margins
22:35 is good. What's going on that's
22:37 different between the US and Chinese
22:39 economy? The financialization over here
22:42 and the focus on raw materials in China.
22:45 I don't even think it's just that.
22:47 I think there's many more factors at
22:50 play here that have more to do with the
22:53 fact
22:54 that kind of just expectations. I think
22:57 the expectations for a worker in the US
22:59 is much higher than the expectations for
23:01 a worker in China. Of course, as
23:03 countries fall in prosperity,
23:04 expectations of their workers go up and
23:06 then they're disrupted by the Romans.
23:08 The Romans,
23:10 they're sitting on toilets that they can
23:12 flush. That's how they... We're
23:14 in the woods.
23:16 Kill them,
23:18 and the barbarians win. And that's just
23:19 the cycle of history.
23:22 The people who have no expectations, the
23:24 people who do sleep in the dirt every
23:25 night end up killing the people in the
23:27 cities and taking their silk uh taking
23:29 their their all their nice city things
23:31 and then 20 years later you look around
23:33 and they're toga-wearing. Gauls and
23:35 barbarians in your metaphor? They end up
23:37 becoming city people of course. So the
23:39 whole thing flips and then, well, the
23:40 cycle starts over and then there's new
23:41 barbarians who look oh they used to be
23:44 they used to be tough they used to sleep
23:45 in the dirt like us but now they took
23:47 over the city and yeah and so today
23:50 there are two groups of people. There's
23:52 the proverbial barbarians. There's the
23:53 proverbial city people. Some people have
23:56 become weak, have become sort of like
23:58 Time Machine Eloi, forgotten how to do
24:00 things. Have forgotten how to work hard.
24:02 And then there are the people that
24:04 remember those things that are going to
24:05 beat them. Is that the conceit here?
24:07 Yeah. I mean, again, I think it also
24:08 comes down to to expectations. Pass a
24:12 90% tariff on Vietnam. Great. We can
24:14 bring all those great jobs to America.
24:16 Do you want to work in a Do you want to
24:18 work in a textile factory and get paid
24:19 $3 an hour and sewing the same shirts
24:21 for Nike over and over again? You can.
24:22 We can bring that job to America. So,
24:25 between comma and tiny corp, you're thinking
24:27 about software and hardware. You're
24:29 thinking about shipping, manufacturing,
24:32 margins. What do you think of Trump's
24:34 tariffs? Tariffs are regulations.
24:36 There's no difference between tariffs
24:37 and regulations. Tariffs are
24:39 protectionist regulations and all forms
24:41 of protectionism. You you can't win in a
24:44 competition ever with protectionism.
24:46 Protectionism is just you saying I give
24:48 up. But what about, just to steelman the
24:51 pro-tariff perspective? Wouldn't you
24:52 agree that if all of the different
24:55 countries, if Pangaea broke up again,
24:58 kind of like the Curtis Yarvin
25:00 idea, if Pangaea broke up again, no
25:02 one could shift between their countries,
25:04 no one had cultural exchange between
25:05 their countries for a hundred years, and
25:07 then we put everything back together,
25:09 everyone would be better, everyone would
25:10 be stronger, we'd have more interesting
25:12 things to trade. So by that logic, you
25:15 could take America, you could close it
25:17 off through tariffs for some short
25:18 period of time that would incentivize
25:21 the return of American manufacturing,
25:24 American saleable ingenuity and
25:27 products. And so then when you open it
25:29 up again, it would have a better balance
25:30 of trade with the external world. I
25:32 think a pro-tariff person would say
25:34 that's the plan. Okay. Well, your
25:36 tariffs aren't nearly high enough.
25:37 That's the plan. Then it's not enough.
25:40 If that's the plan, try a blockade. I'm
25:41 more supportive of a blockade than I am
25:43 of 90% tariffs. Well, the current
25:45 tariffs are what 20 25%. No, some of
25:47 them are some of them are higher. What
25:48 is the actual wage for a textile worker
25:50 in Vietnam? I couldn't tell you. You can
25:53 look it up. Let's say $3 an hour. Okay.
25:55 What is the or Cambodia at least would
25:57 be three. Maybe Vietnam's a little more
25:58 now. What would you have to pay that
25:59 person in America? America has a minimum
26:01 wage. I'm not sure exactly what it is.
26:03 Maybe it's $15. We can we can fact check
26:05 it. Yeah. So, you're talking 5x right
26:09 there. at least 5x. So a 90% tariff is
26:13 going to do nothing except create
26:15 inefficiency. If you want to truly
26:18 change the incentives, you need
26:20 something that looks a lot more like a
26:22 blockade. If you want to do the
26:24 Yarvin
26:26 Pangaea thing. So, complete
26:29 withdrawal. Yeah. And then import
26:31 export. Complete withdrawal. And and
26:33 then even if you were to say okay fine
26:35 okay what do you say 3 to 15? Okay 500%.
26:37 We'll do 500% tariff. If you do a 500%
26:40 tariff, the market for subverting
26:43 tariffs will be massive. The the illegal
26:46 shipping market will exceed the shipping
26:48 market. If you truly enforced a
26:51 technical blockade, the US can do that.
26:55 That would work. So you have uh
26:58 gunboats preventing trade from
27:01 happening. Yeah. If the US Navy actually
27:03 wanted to do a blockade of all foreign
27:06 imports into America, they could. And
27:09 that would create the kind of again
27:11 through a very very painful adjustment
27:13 period that would
27:15 create this this independent America but
27:20 a tariff on the order of 20 to 90% is
27:24 just going to introduce inefficiency.
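To put rough numbers on the point about tariff size: a toy labor-cost-only comparison. The $3 and $15 hourly wages are the hypothetical figures from the conversation, the hours-per-shirt figure is invented for illustration, and real landed costs involve far more than wages.

```python
# Toy comparison of landed cost vs domestic cost when wages differ roughly 5x.
foreign_wage = 3.0     # $/hr, assumed Vietnam/Cambodia textile wage
domestic_wage = 15.0   # $/hr, assumed US wage floor
hours_per_shirt = 0.5  # illustrative only

foreign_cost = foreign_wage * hours_per_shirt
domestic_cost = domestic_wage * hours_per_shirt

for tariff in (0.25, 0.90, 4.00, 5.00):
    landed = foreign_cost * (1 + tariff)
    verdict = "import still cheaper" if landed < domestic_cost else "domestic competitive"
    print(f"tariff {tariff:4.0%}: landed ${landed:.2f} vs domestic ${domestic_cost:.2f} -> {verdict}")

# With a 5x wage gap, the import stays cheaper until the tariff passes 400%,
# which is the sense in which a 20-90% tariff mostly adds cost without
# moving where the work gets done.
```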
27:27 So just it's the wrong order of
27:28 magnitude. So in your view, tariffs of
27:32 this scale are and protectionism in
27:35 general is the wrong approach because
27:38 instead of trying to make America
27:40 actually competitive with overseas
27:43 producers, it's trying to
27:45 artificially shut them off from
27:47 competing with them when you can't
27:49 actually do that. They're just out there
27:51 and all you're doing is something
27:52 artificial that's not going to work out
27:54 long term. The world just cuts you out.
27:56 That's what happens. So let's say you
27:58 were in charge of let's say you were in
28:01 Elon's role. What would you be
28:03 suggesting that's different in order to
28:05 restore American dynamism, bring about
28:08 the golden age? Well, three basic
28:10 things. One is massive deregulation.
28:13 Again, tariffs are regulations. Minimum
28:15 wage is a regulation. All of these
28:17 things are regulations that make America
28:19 non-competitive.
28:21 Why can't we pay people $3 an hour to
28:24 make to make uh textiles in America? Do
28:27 you want an actual answer to that? Sure.
28:29 Or should we go through the other two?
28:30 Well, let's start with an actual answer
28:31 to that. I think the actual answer is
28:34 because then those people wouldn't be
28:35 able to pay the costs associated with
28:37 living in the place where that factor
28:39 is. Why are the costs of living high?
28:41 because there's a cabal of real estate
28:45 developers
28:46 preventing mayors from adding more
28:49 housing, which artificially inflates the cost
28:51 of housing. We're going to have to
28:52 massively deregulate that. But it'd be
28:54 hard to do all those things at the same
28:55 time, right? It's going to be jagged. So
28:57 that person everything is deregulated.
29:00 Congratulations. Yeah, but it takes time
29:02 to build the new housing, right? Sure.
29:03 It'll take some time. But again, the
29:05 same people who are maybe before we
29:07 bring in the textile workers, we got to
29:08 bring in the $3 an hour building
29:10 builders. Well, I'm actually that that
29:12 gets to my second point, which is we
29:13 have to bring a lot of people in. Well,
29:15 so let's say suddenly there's the $3
29:17 building builders. Won't those buildings
29:19 fall down? Won't people won't people be
29:21 frightened to live in a $3 building? Do
29:23 the buildings, do the buildings in Dubai
29:25 fall down? You and I were in Dubai
29:26 together for the flood. Like, the
29:28 buildings didn't fall down, but we
29:30 watched what happens when a society is
29:33 as levered up as possible to create
29:36 economic growth and isn't thinking about
29:39 what happens if the third shoe drops.
29:41 The city was flooded. Yeah. Okay. Maybe
29:45 maybe buildings in China. I would trust
29:47 a Chinese city and Dubai. In China, to
29:50 be fair, there's a lot of regulation.
29:52 And people would be terrified to build a
29:54 building that would collapse because
29:56 there's a really powerful government
29:58 that is making sure that the building
30:01 isn't
30:03 Yeah. Okay. I guess I guess I'll be a
30:06 little more clear about what I mean by
30:07 get rid of all regulation. I don't mean
30:09 get rid of building code. I mean get rid
30:11 of permitting. So the difference between
30:14 uh building code and permitting. If for
30:16 permitting I have to if I want to build
30:18 a house I have to ask for permission. I
30:21 have to get approval from some zoning
30:23 board, from some ridiculous group of
30:24 people who are absolute clowns who are
30:26 basically incentivized to tell you no
30:29 because they own houses and they don't
30:30 want more supply. No, no, no, no, no,
30:33 no, no. So, that's the kind of
30:34 regulation I'm saying that needs to go
30:36 away right away. The kind of regulation
30:38 that says if you are building a building
30:41 uh you know more than six stories, you
30:42 can't make it out of wood. Sure, that's
30:44 a good that's a that's a building code.
30:46 So your first step is deregulate
30:48 everything that's capturable but keep
30:50 the regulations that simply make it safe
30:53 to exist. I mean again I sure I guess
30:56 you can say building code is a
30:57 regulation but building code is not a if
31:00 you want to build a building it has to
31:01 be this code fine but no one can say
31:03 whether you can build a building or not.
31:04 That's just the rule for the buildings
31:05 you have to build in this country. But
31:07 then you have a group of people whose
31:08 job it is to look at all the buildings
31:10 and tell what they're up to code. Sure.
31:13 Come on. Is this that hard? And that
31:15 group isn't capturable. Well, that group
31:18 isn't going to say you're building on
31:19 the wetlands of the No, no, no, no, no.
31:21 That's not Whoa, whoa, whoa. That's not
31:22 a building code. The building code
31:24 doesn't say anything about the wetlands.
31:25 The building code says Does the building
31:26 code say something about not harming the
31:28 environment? No, no. Of course not.
31:30 What? Harming the environment? What do
31:32 you mean? So, your building can just
31:33 emit all kinds of crud out? No. No, no,
31:35 no, no, no, no, no, no, no. Like again,
31:39 um, sure. You have to
31:41 draw a line between a regulation that
31:44 says, you know, you can't
31:47 emit sulfurous organic compounds, but
31:50 again that's not really a regulation,
31:52 that's just kind of like a, sure, I
31:55 guess they're all regulations, and then
31:56 you need a group of people to enforce
31:57 the regulations. Yeah. Again, is this so
32:00 difficult? Like like like you can make
32:02 it all sound so difficult, but come on.
32:05 A 5-year-old can see the difference
32:07 between, we have to protect the
32:09 wetland of the African spotted
32:12 frog, versus, well, you can't, you know, dump
32:15 lead into the river. Do you think we
32:16 should protect the wetlands of any of
32:18 the African spotted frogs? No. So just
32:21 let that rip. Let there be, you know,
32:23 loss of
32:24 biodiversity from human activity. I
32:27 mean, yeah. I guess like fundamentally
32:29 I'm a progressive. I'm not a
32:30 conservative.
32:32 And conservation is inherently
32:34 conservative. Yeah. So, you wouldn't be
32:35 upset if we lost a significant amount of
32:38 species biodiversity, but produced a lot
32:40 more factories and housing such that we
32:42 could remove other regulations such like
32:44 such as minimum wage. Well, a few things
32:46 about like the species biodiversity. If
32:48 there's an economic reason for it to
32:50 exist, it will continue to exist. If
32:53 there's like an economic reason for the
32:54 African frog to exist on those wetlands.
32:57 Yeah. And I think, a human
32:59 economic reason, like, there's
33:02 no such thing as a human
33:03 economic reason or an AI economic
33:04 reason. There just is, like, okay, an
33:08 economy.
33:10 If there there's a reason for them to
33:11 exist, they'll continue to exist. I'm
33:13 all
33:14 for preservation efforts, privately
33:17 funded preservation efforts.
33:19 This gets into like very specific
33:23 things. I think that we don't even need
33:25 to get this specific to see what the
33:26 major problem is, which is that if you
33:29 ask a lot of people to I could snap my
33:32 fingers right now and there's 5x as much
33:34 housing, right? No, no, no African fraud
33:36 spotted frogs destroyed. No uh stuff
33:40 emitted. No, no undercutting American
33:42 labor. I could snap my fingers right now
33:44 and there's 5x more housing. Way too
33:46 many people would say no to that. And
33:49 again, this is the same thing as
33:50 intellectual property. If I could snap
33:52 my fingers right now and say there's
33:54 five times as many copies of uh White
33:57 Lotus in the world. So, of course,
34:00 there'd be people up in arms screaming
34:02 being like, "No, no, we need to enforce
34:04 intellectual property rights." Right?
34:06 When you take something that is
34:07 fundamentally non-scarce and then
34:09 artificially make it scarce, that's the
34:12 kind of person who just needs to be
34:14 shot. I don't know what else to say. So,
34:17 I want to tease out exactly what you're
34:19 saying here, which is you're comparing
34:21 the incentives around home ownership,
34:23 not wanting more supply to come on the
34:25 market, which brings the value of your
34:26 home down. Yeah. To trying to enforce IP
34:29 ownership, saying we've created this IP,
34:31 you have to pay us a fee to use it.
34:33 You're basically saying there shouldn't
34:34 be any form of rent seeking, whether
34:36 it's physical kind of rent seeking
34:39 behavior around your house price or
34:41 whether it's digital kind of behavior
34:44 for like a software subscription. Any
34:46 person who's making a decision to
34:48 artificially limit the supply of
34:51 something for their own personal
34:54 economic benefit. If you think about the
34:56 economy as a pie, any person who would
34:59 choose to not grow the pie or even worse
35:01 shrink the pie in order for them to get
35:03 a larger percent of it just needs to be
35:04 shot. A lot of people like I think if
35:06 you asked I think about this a lot as
35:09 like the crypto space grows. I I think
35:11 about the the goals of crypto like and I
35:14 don't want to get too derailed down this
35:15 rabbit hole because I want to get to the
35:16 other two points of what you would do
35:17 but I think about how many people what
35:21 if they were given the devil's bargain
35:22 of the goal of true permissionless
35:25 finance for everyone that's configurably
35:27 private that's fluid that's borderless
35:30 that exists and it flourishes in the
35:32 world but you personally can't profit
35:34 from it which do you choose that that
35:36 exists and you can't profit from it or
35:38 it doesn't exist and you can maybe
35:40 profit from its attempts. I feel like
35:42 most people in or a lot of people in
35:45 various technology fields for their
35:47 version of their version of the future
35:49 would would do the second they would
35:51 choose the second option. Yeah. So let
35:53 me be absolutely clear about who we're
35:55 shooting here. If you have a person who
35:57 given the choice of the pie gets bigger
36:00 but their absolute share gets smaller
36:03 like their absolute dollars gets
36:04 smaller. No, I don't fault that person.
36:06 Okay? Right? That's a person following
36:07 rational
36:09 incentives, and you can't fault that
36:10 person. I'm talking about a person who's
36:12 upset because their relative share gets
36:14 smaller. The pie is 100. The pie goes from
36:18 100 to 200 and you go from two to three.
36:22 In an absolute sense, you have more, but
36:24 in a relative sense, you have less.
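In numbers, the distinction being drawn is just this (a trivial sketch of the pie example):

```python
# The pie doubles; your slice grows in absolute terms but shrinks as a share.
pie_before, slice_before = 100, 2
pie_after, slice_after = 200, 3

print(slice_before / pie_before)   # 0.02  -> 2% of the old pie
print(slice_after / pie_after)     # 0.015 -> 1.5% of the new pie
print(slice_after > slice_before)  # True  -> absolutely better off
```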
36:26 And if that person is saying, because I relatively
36:29 have less, I'm going to block all of
36:32 this, that person needs to be shot. And
36:34 by shot, you mean metaphorically? No,
36:36 no, no, no, no. I mean I mean shot.
36:39 Yeah. Okay. So, suggestion number one. I
36:42 mean, look, if they're talking about
36:43 eating the billionaires, I think that
36:45 there's some billionaires who are bad
36:47 who should be
36:49 eaten and some who shouldn't, right? And
36:51 you'll decide. No, I I I have a very
36:53 clear formula for deciding. I give a
36:54 very clear formula for deciding. It's
36:56 those who are preventing growth of the
36:58 pie in order to prevent the shrinking of
37:01 their relative share. So again, I'm not
37:05 going to get into nuance. Obviously, I
37:06 don't believe there should be no
37:07 regulation at all. Anybody can just
37:09 build everything. We don't need fire
37:10 codes. We don't need earthquake codes.
37:11 Obviously, I don't believe in that. But
37:14 I'm saying that fundamentally most
37:16 people in society or a good number of
37:18 people in society aren't even on team
37:21 we should have more. And like you're not
37:25 even you're not even having the same
37:27 debate. If we're having a debate over
37:29 regulation, I might come from a
37:31 perspective of, yeah, I mean, we don't
37:32 want buildings to collapse. But if
37:34 you're coming from a perspective, no, we
37:36 don't want more buildings because then
37:37 the value of my building will go down
37:39 because there'll be more supply. I mean,
37:40 you just need to like that's the point
37:42 where you just need to take guns out and
37:43 shoot people. Like that's it. I think
37:45 the brilliant point is comparing the
37:47 physical rent seeking of the home
37:49 ownership and building ownership to the
37:52 digital rent seeking around IP. All
37:54 right. Recommendation one was qualified
37:57 deregulation. You said you had three
37:59 recommendations if you were in Elon's
38:01 role advising Trump to make America
38:03 great again. What are the other two? Uh
38:06 the second is high-skilled immigration.
38:09 Okay, take us through it. Massive
38:10 high-skilled immigration. So I think the
38:12 open border is disgusting. I couldn't
38:14 believe during the Biden administration
38:16 how demoralizing it must have been.
38:19 Comma brings in people, a lot of
38:21 Europeans. How demoralizing is it if
38:24 you're a European immigrant
38:26 to
38:28 America working through this paperwork
38:31 to try to follow the rules and then you
38:34 see Tik Toks every day of people
38:35 literally just walking across the
38:37 border. It's so demoralizing. It's the
38:40 worst selection function I've ever seen.
38:43 You can't have this. Well, to be fair,
38:45 this was Elon's suggestion to Trump. Oh,
38:48 yeah. I mean, maybe it's the same one. I
38:49 I think Elon and I probably have very
38:51 similar perspectives on all of these
38:52 things. So yeah, I mean you need to
38:54 close a border, right? I'm not talking
38:55 about I'm not talking about open
38:56 borders. I'm not talking about we don't
38:58 have a country. This is this is
38:59 We have a country. There's a
39:01 border. And every single person who
39:02 wants to cross that border is absolutely
39:04 going to be documented, background
39:06 checked, and all of that stuff. Mhm. Now
39:08 that said, any person who can produce
39:11 more value than they consume, here's
39:14 your work visa. Welcome to America. One
39:16 day process. You want to come to
39:18 America? You want to come work here? You
39:20 want to produce value for
39:22 America?
39:23 Absolutely. No, but no. You see, that's
39:26 going to lower American job. No. No.
39:29 We're going to grow the pie.
39:31 So, the US's ability to be the leader in
39:35 technology that it wants to be is really
39:38 going to be driven by its ability to
39:40 attract the best talent to come and live
39:42 and build things in America. America is
39:44 a multicultural, multi-racial society
39:46 with a civil religion. Everybody
39:49 is welcome, who wants to come produce
39:51 value and not be a criminal. That's it.
39:55 And I'm not saying citizenship right
39:56 away for everybody. I think that work
39:58 visa is right away for everybody.
40:00 Everybody, yeah, you get a chance, come
40:02 to work visa, come to America, come get
40:04 an American job, pay American taxes
40:06 because people think of jobs as a fixed
40:09 pie. Jobs are so completely not a fixed
40:11 pie. If we bring in the best people,
40:13 they're going to create companies that
40:14 create jobs to bring in more people to
40:16 employ Americans. Wealth is good for
40:18 everybody. The absolute growth of wealth
40:20 is good for everybody. So yeah, if we
40:22 bring in a hundred million people from
40:24 abroad, 100 million highskilled people.
40:27 Now, you know, like Lee Kuan Yew said, you
40:29 can't have an economy of fruit pickers.
40:30 I I don't know. I'm I'm I'm mostly pro
40:33 immigration across the board, but who
40:35 should immediately be allowed in is
40:37 anybody who's getting paid a six figure
40:39 salary. And so the third recommendation,
40:41 I think you need a path to citizenship,
40:42 too. And I think the path to citizenship
40:44 simply looks like, how much have
40:46 you paid to the IRS? Once your total
40:48 accumulated payments to the IRS have exceeded a
40:50 certain amount and you've stayed in the
40:52 country for a certain amount of time,
40:53 congratulations. You can be a citizen if
40:55 you want. And so you said you had three
40:56 recommendations. What's the third one?
40:58 Deregulation, uh,
41:00 immigration, and I think you get most.
41:03 Oh. Oh, the third one. Okay.
41:07 So, this one I don't think Elon's
41:09 proposing. America has a real problem
41:11 with financialization.
41:13 the financial system. So, this is
41:15 actually from an Andrew Tate video. Normally,
41:16 I think, oh gosh. Well, normally
41:19 I think, you know, but you
41:20 can really, I really try to separate
41:22 the art from the artist. I'm not even
41:24 saying, I'm just saying normally he's a
41:25 clown, but this one I really liked. He's
41:26 like, "Look, you look at the community.
41:28 Let's talk about Elgen, Ohio. I don't
41:30 know if that's a real place, but you can
41:31 imagine Elgen, Ohio." And in Elgen,
41:33 Ohio, there's a there's a shoe shiner,
41:35 there's a baker, and there's a barista.
41:37 Uh and they're passing dollars back and
41:39 forth. The shoe shiner buys coffee from
41:42 the barista. The barista gets their shoe
41:43 shined at the shoe shiner. They work for
41:45 each other. They work for each other. Um
41:47 except they're all paying with credit
41:48 cards. And every time you pay with a
41:51 credit card, the credit card company
41:53 centered in New York takes 3%. Mhm. So
41:56 each time you have that transaction, 3%
41:59 is being siphoned off. And you can just
42:01 see how all the wealth is being siphoned
42:03 out of that community. I don't think you
42:04 need Andrew Tate to point that out.
42:06 Well, I just I saw an Andrew look look.
42:08 I mean there's a reason he's successful.
42:10 He puts it in such a way,
42:13 you know. No other country does that,
42:14 right? Every other country does not
42:17 allow a third party credit card company
42:20 or bank to take two 3% out of every
42:22 transaction. You look at the new system
42:24 for payments in India that's now being
42:26 adopted in Dubai. It's all zero fee but
42:29 run by the government. Yeah. I mean
42:30 again you don't even have to look to new
42:32 things in weird countries. You can just
42:34 look to cash. Cash doesn't have this
42:36 problem. The cash doesn't degrade at the
42:38 3%. So if you go back to 1950, that
42:42 community was paying in cash and there
42:45 wasn't this degradation. But now they've
42:46 moved to credit cards and all the wealth
42:48 is just being siphoned out by the
42:50 financial sector.
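A toy model of the wealth-siphoning point: the $1,000 pool and the assumption that the whole pool turns over on each round of payments are invented for illustration; only the 3% card fee comes from the conversation.

```python
# Toy model: a fixed pool of local money circulates between the shoe shiner,
# the baker, and the barista, and every card payment skims 3% out of town.
fee = 0.03
local_wealth = 1000.0  # hypothetical dollars circulating in the community

for turnover in range(1, 101):
    local_wealth *= (1 - fee)  # 3% of each turnover leaves the community
    if turnover in (10, 50, 100):
        print(f"after {turnover:3d} turnovers: ${local_wealth:,.2f} left locally")

# after  10 turnovers: $737.42
# after  50 turnovers: $218.07
# after 100 turnovers: $47.55
# With cash (fee = 0), the full $1,000 keeps circulating.
```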
42:52 So I'm all for deregulation everywhere
42:55 except on industries that should pretty
42:57 much not exist like finance.
43:00 Let's double down on finance should not
43:03 exist. What do you mean by that? So,
43:05 don't people need to So, so finance is a
43:08 set of incentives for properly
43:09 allocating capital. You you should be
43:11 able to get a loan to start a business.
43:13 I should be able to um lend you money if
43:15 I believe you're going to be able to pay
43:17 it back if I want to and charge some
43:18 interest based on the risk I'm taking
43:20 on. Finance allows risk to move
43:23 correctly around a society. I think we'd
43:25 just be better off without it. The
43:27 entire industry? Neither a borrower nor a
43:29 lender be? Just no, no credit, no
43:33 borrowing, no lending? No. I think no
43:35 venture, I think that that's extreme, but
43:38 I think that, no stock market? I think,
43:41 oh yeah, I mean I think if
43:43 you ask the question like where we are
43:45 right now is the is the status quo or is
43:48 zero finance a better place and I think
43:50 that zero finance is a far better place
43:53 even in like The Merchant of Venice
43:55 there's a financier, right? Like,
43:57 finance has existed since, you know,
43:59 the 1600s. And regardless of
44:02 what laws I pass or
44:05 regulations I pass, you're never going to
44:06 completely get rid of finance. Let's
44:08 talk about some actual regulations that
44:10 I would pass. Right. I'm not going to,
44:11 I'm not saying that
44:13 every person... Biblical, and make usury a
44:15 crime? Well, I don't think we go that
44:16 far, but no, I think that I'm not saying
44:18 we're going to shoot every banker. We're
44:19 just going to have to constrain the
44:20 bankers in a few ways. The biggest
44:22 problem with the financial system in
44:23 America now is that the money is fake.
44:24 Take us through that. I know a lot of
44:26 crypto people think that. Well, Bitcoin
44:28 is Bitcoin is fake, too. Bitcoin is
44:30 fake, too. There's no We'll come back to
44:32 that one. Yeah. Um, so the money is
44:34 created by a humanmade system. If you
44:36 want more gold, if you want there to be
44:38 more gold, there's no amount of people.
44:40 I couldn't if I held a gun to every
44:42 single person's head in the world
44:44 simultaneously and I said double the
44:46 supply of gold, they just couldn't do
44:48 it. But it's only humans that have
44:50 decided that gold is valuable. Gold
44:52 isn't inherently valuable in some
44:54 context outside of humans placing that
44:56 value upon. I'm not even talking about
44:57 the value of gold. I'm just talking
44:59 about the natural scarcity of gold.
45:00 Again, it really comes down to the
45:01 scarcity argument. Our money today,
45:04 there's nothing fundamentally making it
45:05 scarce. It's all numerator, no
45:07 denominator. You can make the
45:08 denominator be whatever you want. But
45:09 even everyone knows the denominator of
45:11 Bitcoin. It's not even a denominator
45:13 question. 21 million. Well, right now it
45:16 can change, right? So, so I don't think
45:17 so. But you can argue that it won't
45:19 change. But you agree that if I had a
45:22 gun to everybody's head in the world at
45:24 the same time, I could change that
45:25 denominator of Bitcoin. You could, but
45:27 if you had a gun to everyone's head, you
45:29 could decide on the price of gold, too.
45:31 No, not the price. I'm saying the
45:32 amount, the denominator. Yes. You could
45:34 not You could not decide on the amount
45:36 of gold. Exactly. And because I can't
45:39 The gold is real scarcity. Gold is
45:43 actually scarce. Physical. Well, yeah.
45:46 If we want more gold, there's ways to
45:47 get it. We can mine for it. And think
45:51 about gold mining. Think about all the
45:53 industries that it supports. Okay, we're
45:55 going to have to build heavy machinery.
45:56 Well, that's going to need steel. That's
45:58 going to need metal. We're going to use
45:59 machine learning models to predict where
46:01 there's likely to be gold. Great. That
46:02 employs data scientists, right?
46:04 Eventually, there's going to be gold and
46:06 asteroids. We're going to fund a space
46:07 program because we got to go get more
46:09 shiny rock. That natural scarcity from
46:13 gold is why it works. What it's actually
46:16 valued at. I mean, sure, it's of course
46:18 it's humans agreeing. We can all say
46:19 gold is valueless or gold is worth seven
46:22 billion dollars a gram tomorrow, but
46:24 because the denominator is fixed by
46:26 nature and in order to increase that
46:28 denominator, you have to do real
46:29 productive work. Your complaint about
46:32 modern finance is that it's divorced
46:35 from reality in that there's no external
46:38 hard constraint to the production of new
46:42 value. Well, again, it's artificial
46:45 scarcity, whether it's enforced with guns or whether
46:48 it's, I mean Bitcoin,
46:50 again, it's enforced with friction.
46:51 So friction, whatever, it's basically
46:54 enforced with so much friction that it's
46:56 extremely unlikely that, you say, the
47:00 hard fork, are they going to accept
47:01 it? It's so difficult to coordinate
47:03 intentionally that they're very
47:05 unlikely. But so difficult is so
47:07 different from impossible. Of course.
47:10 Yeah. Of course. And right now we're at
47:12 gold all-time highs, but not at
47:14 Bitcoin all-time highs. But for both of them,
47:15 what the actual prices are, this stuff is
47:18 so local and doesn't
47:19 matter. The thing that actually
47:22 makes gold good is that the denominator
47:25 is controlled by nature, not by
47:29 consensus of people. So what would you
47:31 actually do as your third suggestion in
47:33 order to fix... let's not even talk about
47:35 America, who cares... in order to fix
47:37 the over-financialization that
47:39 you're talking about, would you try to
47:41 get everything back onto the gold
47:42 standard? Absolutely. That's where you have
47:43 to start. And by the way, the US dollar
47:45 is going to zero.
47:46 Like, the US dollar is going to zero. Why
47:48 is the dollar going to zero? What are
47:50 fake Roman dollars
47:52 worth today? Take a Roman dollar. If the
47:55 Roman dollar is a physical coin made of
47:57 gold, it's actually still very valuable
47:58 today. It's probably worth more than it
48:00 was in the Roman Empire. Well, so Rome
48:03 notoriously had a currency that it
48:06 eventually decided to devalue by mixing
48:09 the precious metal with cheaper metal so
48:11 that it could fund more different kinds
48:12 of... At least there's still some precious
48:14 metal in there, right? At least the
48:15 denominator has gone up in a clear
48:17 way. My point is, even if you have that
48:19 Roman coin that's now only half gold,
48:20 it's still very valuable today. But
48:22 compare that gold
48:24 coin to, like, the German mark,
48:27 or whatever fake currency
48:29 Germany had after World War I, the
48:31 currency during the Weimar Republic. My
48:33 history is not that great. I'm not going
48:34 to say
48:36 that. But I don't need history to say: if
48:38 your currency is made of paper and
48:40 backed by nothing, when your empire
48:42 falls, the value of that currency goes
48:44 to zero and all empires will fall. So,
48:47 but I don't think modern Americans
48:49 necessarily
48:50 even care that the dollar would go to
48:54 zero if America fails. They don't think
48:56 America is going to fail anytime soon. I
48:58 mean, sure, maybe it won't. But my point
49:00 is the US dollar is going to zero. We've
49:03 agreed on that. Now we're just arguing
49:05 about the time horizon. Well, humanity
49:08 is going to zero on some kind of time
49:09 horizon. Well, yeah. Okay. Well, we're not
49:12 talking about the dollar going to zero on the
49:13 time horizon of the heat death of the
49:14 universe here. Do you think that within
49:16 my lifetime the American dollar
49:19 is going to zero? I don't think so. I
49:20 think there's a decent chance. What
49:22 would you put as the chances? Be a
49:24 betting man for a moment. 50/50 that the
49:26 US dollar goes to zero in our lifetime.
49:28 Us being mid-30s? We might live forever.
49:31 I might live a really long time. But
49:32 yeah, I think even if we
49:34 live to 100... Yeah. Wow. Yeah.
49:36 50/50 the dollar goes to zero. Yeah. All
49:38 right. And so what do you think the
49:41 right approach is if you think that
49:43 that's possible? So there's the Balaji
49:45 approach of gathering people up to start
49:48 their own network states. There's the
49:51 sort of Praxis approach of, okay, can we
49:54 actually start a country and physically
49:56 colocate people? What do you think
49:58 there is to do about this? I mean, I
49:59 think that's extreme. Why doesn't
50:01 the US just issue a new currency backed
50:04 by gold? It exists in parallel with the
50:06 US dollar. We'll call it USG, US gold.
50:09 We'll make some real nice-looking
50:11 bills. We'll hire some decent
50:12 graphic designers and make our money not
50:14 look like crap. Whoever's making the
50:16 money in Hong Kong, it looks real good.
50:18 So we'll make some good-looking money
50:19 that's backed by gold and you can start
50:21 using it. I think governments would
50:25 really hesitate to re-back their currency
50:27 in gold. Of course, because they've been
50:29 running a scam for so long. So, we got
50:30 to get rid of that government and put in
50:31 a new government that's going to build
50:32 decent currency. Here's another thing. I
50:35 mean, it's the same thing they're
50:36 already doing in India. You can
50:38 have VUSG. VUSG is backed by real
50:42 gold, redeemable for real gold. Has lots
50:44 of warnings. You know, Google's thing
50:46 about "don't be evil," right? Like,
50:47 if I ever started a company, mine
50:49 would be "don't be evil." And if we ever
50:51 remove "don't be evil," it's because we
50:53 are evil now, right? Like, that would
50:54 be the slogan, right? You
50:56 would say the quiet part out loud.
50:57 Exactly. So, you know, I'd make sure
50:59 to put in a lot of friction. And it
51:02 is friction. It's always going to
51:03 fundamentally be friction that
51:05 says VUSG is
51:09 redeemable for gold. And the minute
51:12 it's not redeemable for gold, this
51:14 currency is going to zero. You should
51:15 get out. Right? That is printed on
51:17 every bill. On every bill it'll say: this bill is
51:23 backed by physical gold and redeemable
51:26 for physical gold at this address, and
51:28 the day that it is not, get out of this
51:30 currency because it's going to zero. And
51:31 that's written on all the bills. You
51:33 want to enforce that as much as possible,
51:34 and then I'll set up an Alipay-style system where
51:39 people can pay each other. You don't
51:40 actually have to carry around lumps of
51:42 gold or bills or anything, with 0% fees.
51:45 And that thing would be run by the
51:46 government? It'd be run by private
51:47 companies. I think that we could
51:49 actually use blockchain technology for
51:51 that. Stablecoins are perfect for it.
51:52 In fact, Franklin is building something
51:54 kind of... I have no idea what the hell a
51:56 stablecoin is. It's...
51:59 I'm not going to... It's fiat-currency
52:00 pegged. Yeah,
52:04 it's fiat-currency pegged. So, in
52:05 other words, it's pegged. It's pegged to...
52:08 It's pegged to...
52:09 Yeah, it sounds
52:11 great. You know what? I
52:14 shouldn't have said the word blockchain.
52:15 We're going to use an openly
52:16 replicable MySQL database that anybody
52:19 can verify all the transactions of. Or
52:21 maybe we'll put some ZK on top of it so
52:23 that they're private. We'll still store
52:25 the transactions in MySQL, just so it's
52:26 not a blockchain.
52:29 How are you going to determine if this
52:30 MySQL database is up to date? Well, we'll
52:33 have stakers. We'll have stakers who
52:34 attest to the state hash. Isn't that
52:36 just like a blockchain, dude? Are you
52:37 telling me blockchains were just MySQL
52:38 databases the whole time? No, because
52:40 they're much crappier than MySQL databases.
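For what it's worth, that half-joking design sketches out in a few lines of Python. Everything here (names, structure) is illustrative, not a real protocol, and a production version would sit on an actual replicated SQL database rather than an in-memory list:

    # Toy sketch: a replicated ledger whose state is summarized by a hash
    # that independent "stakers" can recompute and attest to.
    import hashlib, json

    ledger = []  # stand-in for a replicated SQL table

    def append_tx(sender, receiver, grams_of_gold):
        ledger.append({"from": sender, "to": receiver, "grams": grams_of_gold})

    def state_hash():
        # deterministic hash over the full transaction history
        return hashlib.sha256(json.dumps(ledger, sort_keys=True).encode()).hexdigest()

    def attest(staker):
        # a staker recomputes the hash from their own replica and publishes (signs) it
        return {"staker": staker, "hash": state_hash()}

    append_tx("alice", "bob", 1.5)
    print(attest("staker-1"))  # replicas agree exactly when the hashes match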
52:43 If someone said, "Here's a MySQL database,
52:45 it can handle 10 transactions per second,"
52:46 you're like, "Dude, what is this, the 1950s?"
52:49 Make a currency that's backed by gold, and you
52:51 have a low-friction, 0%-fee way for people to
52:55 pass it back and forth, right? You want...
52:57 oh, well, who's going to deal with
52:58 chargebacks? Yeah, that's literally the
53:00 job of a government. Who's going to pay
53:01 for this? That's literally something I
53:03 would love my tax dollars to go to: a
53:06 well-administered 0% payment system for
53:08 everybody, so nobody can rent-seek on
53:09 that. Incredible. Let's talk about... It's
53:11 not going to even cost that much. It can
53:13 cost nothing. Let's talk about compute.
53:14 Let's talk about compute, GPUs, CPUs.
53:19 Um, what do you think people don't
53:21 understand about Nvidia GPUs and the
53:25 market for different kinds of chips? I
53:28 don't think people understand anything
53:29 about it. I think people really have a
53:34 cargo-cult understanding; they'll
53:36 just repeat something and it
53:37 doesn't make any sense. People don't
53:39 know what CUDA is. You start with that.
53:40 It's like, let's start with what CUDA
53:42 is. It's actually funny. I tweeted that
53:43 no one knows what CUDA is. And then
53:45 Chris Lattner replied and said, actually,
53:47 at Modular, we know what CUDA is. And
53:49 I'm like, all right, Chris Lattner, you
53:50 know what CUDA is. I'm not talking about
53:52 you. CUDA is an ecosystem. I said, it's an
53:55 ecosystem for what? Like, AI stuff. You
53:59 threw the AI stuff into CUDA. You use
54:01 the CUDA for the AI stuff. And what does
54:03 that have to do with Nvidia? Nvidia
54:05 makes the ecosystem. The CUDA ecosystem
54:07 includes Nvidia™ hardware. So a way
54:10 that you could describe this is that
54:11 CUDA enables Nvidia to have a moat
54:14 around its hardware and that if one were
54:18 to surmount the software moat around
54:22 Nvidia's hardware, then their lock on
54:26 the future might be disrupted. Yeah. So
54:29 the moat is, like, you have an ecosystem. If
54:31 something's the dominant ecosystem,
54:32 people want to build in the dominant
54:34 ecosystem. Mhm. So you have a lot of
54:36 people who build... Yeah. You have network
54:38 effects. People build in the Nvidia
54:40 CUDA ecosystem and it's not an open
54:42 ecosystem. AMD cards can't run CUDA.
54:45 Mhm. And, like, it's not even that...
54:47 It's proprietary to Nvidia. Yes. And it's not
54:49 even that AMD cards can't run CUDA.
54:51 That's kind of the wrong way of
54:53 thinking about it. It's that there's a
54:56 lot of complexity inherent to CUDA that
54:59 makes it only work well on Nvidia
55:01 hardware. Every couple years you'll get
55:03 someone who's like, "Oh, I built a
55:05 translation layer that will let me run
55:06 CUDA on AMD." And that's never going to
55:08 work. AMD is the number two competitor
55:11 behind Nvidia. Yeah. Because it's never
55:13 going to be fast. Oh, OpenCL. Like, I
55:16 think all these things are kind of just
55:17 being done at the wrong abstraction
55:19 layer. But this is a specific tinygrad
55:20 bet. I think Chris Lattner
55:22 basically has the same bet: instead of
55:25 operating at this very
55:27 specific-looking API layer, where you
55:30 define a kernel and it has a global size and
55:31 a local size (CUDA is like that, Metal is
55:33 like that, HIP is like that), instead of
55:35 that, you just say, okay, I have the tensor
55:37 compute and I have hardware to run the
55:39 tensor compute; how do I best compile the
55:41 tensor compute for that hardware? That's
55:42 what tinygrad is.
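A minimal sketch of that contrast, assuming a recent tinygrad install; the kernel-level side is summarized in comments, and the specific calls below are just the commonly documented tinygrad basics, not anything prescribed in the conversation:

    # Kernel-level APIs (CUDA, OpenCL, Metal, HIP) roughly ask you to:
    #   1. write a kernel in a C dialect,
    #   2. pick a global size and a local size,
    #   3. launch it and manage device memory yourself.
    # The tinygrad bet: describe only the tensor compute and let the
    # framework compile it for whatever backend is present.
    from tinygrad import Tensor, Device

    a = Tensor.rand(1024, 1024)
    b = Tensor.rand(1024, 1024)
    c = (a @ b).relu()     # the tensor compute, no kernels or launch sizes
    c.realize()            # tinygrad schedules and compiles the kernels here
    print(Device.DEFAULT)  # CUDA, AMD, METAL, CPU... same code either way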
55:44 And so you think that no company is
55:47 going to be able to put a moat around
55:50 their compute hardware? I
55:52 think that companies are going to try.
55:55 Our mission statement at Tiny Corp is to
55:58 commoditize the petaflop. What does
56:00 that mean? Our investors are like, George, do
56:01 you really want to say that? I'm
56:02 like, yeah, I really want to say that.
56:03 When something's a commodity, there's no
56:06 rent-seeking available. So none of
56:08 the stuff people are doing wrong with IP
56:10 and with their houses. Yeah. So, for
56:13 the petaflop to be commoditized, what
56:15 that actually means is there are many
56:16 chips. The idea of a petaflop, the
56:18 idea of a chip that can do a
56:19 floating-point calculation, is obviously a
56:20 commodity. There are many chips you can
56:22 buy that can do floating-point
56:23 calculations. Nvidia GPUs, AMD GPUs,
56:26 even Intel's things are capable of doing
56:27 floating-point calculations. Many, many of
56:29 them. Can you give people a sense of how
56:31 much compute one petaflop is? Like how
56:34 much compute is in an iPhone? How much
56:36 compute is in a data center? So peta is
56:38 10 to the 15. Um, an iPhone is on the order
56:42 of 10 teraflops. So, 100 iPhones is one
56:45 petaflop. Got it. So, maybe it's 100
56:47 teraflops in a new iPhone, 100 TOPS. So,
56:49 maybe it's 10 iPhones, but somewhere
56:51 between 10 and 100 iPhones.
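Spelling out that arithmetic, using the rough per-phone figures quoted above rather than measured benchmarks:

    PETAFLOP = 1e15        # 10^15 floating-point operations per second

    older_iphone = 10e12   # ~10 TFLOPS, the first figure quoted
    newer_iphone = 100e12  # ~100 TFLOPS/TOPS, the revised figure

    print(PETAFLOP / newer_iphone)  # 10.0  -> ~10 newer iPhones per petaflop
    print(PETAFLOP / older_iphone)  # 100.0 -> ~100 older iPhones per petaflop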
56:54 And so, if you succeed in that mission
56:56 and the petaflop becomes a commodity,
56:58 what does the world look like? So,
57:00 currently there's an extreme premium for
57:02 Nvidia flops over other flops. When you
57:05 look at the 4090 versus the 7900 XTX,
57:09 they're very similar looking cards. They
57:11 have very similar hardware. I mean,
57:15 there's nuance and
57:16 details here. But my point is, per flop
57:19 there's a 2 to 3x premium for Nvidia
57:22 hardware because of the network effect
57:23 because of CUDA because of the moat.
57:25 Yes. So there's that 2 to 3x premium and
57:28 then there's another 2 to 3x premium
57:29 which is a technical premium where
57:31 people buy these data center cards that
57:33 use, like, HBM memory, because it has higher
57:36 bandwidth and lower power but I don't
57:37 think you need that either. I think what
57:39 you basically want is
57:41 a huge number of racks of consumer GPUs
57:45 running your models. So just to be
57:48 clear there are two kinds of GPUs that
57:49 are being sold. There are kinds that are
57:51 intended to be put in data centers and
57:53 there are kinds that are sold directly
57:54 to the consumer. and these two things
57:56 have different margins. It's not all
57:59 about margins. There is a margin aspect
58:01 to it, but there's also just some
58:02 technical decisions being made in the
58:04 data center GPUs that make them more
58:05 expensive. So, when you look at just a
58:08 flop, if you think about a flop on an
58:11 AMD consumer card versus a flop on an
58:13 Nvidia data center card, there's a 10x
58:16 price difference in that flop. And that
58:19 10x is what I want to commoditize. Mhm.
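To put a toy number on that gap: the prices and TFLOPS figures below are made-up placeholders for illustration, not quotes from anyone in the conversation:

    def dollars_per_tflop(price_usd, tflops):
        return price_usd / tflops

    # Hypothetical numbers, chosen only to show the shape of the gap.
    consumer_card   = dollars_per_tflop(1_000, 120)    # e.g. ~$1,000 card, ~120 TFLOPS
    datacenter_card = dollars_per_tflop(30_000, 300)   # e.g. ~$30,000 card, ~300 TFLOPS

    print(round(consumer_card, 1))                    # ~8.3 $/TFLOP
    print(round(datacenter_card, 1))                  # ~100.0 $/TFLOP
    print(round(datacenter_card / consumer_card, 1))  # ~12x, the order-of-magnitude premium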
58:22 And then also I think it could be more
58:25 like a 100x if there was more of a
58:29 market down here. There's not that much
58:31 of a market for consumer AMD flops.
58:33 There's a huge market for Nvidia data
58:36 center flops. If there's more of a
58:38 market down here for just like any kind
58:40 of flops, all flops are, like, commoditized.
58:43 Every flop should be equivalent. That's
58:45 what a commodity really is. Compare a
58:47 fungible token to a non-fungible token. I
58:50 want flops to be fungible. I want
58:52 every flop to be created equal. That's
58:54 the dream. That's the dream of Tiny.
58:56 All flops are created equal. And Tiny
58:58 allows you to use those flops to do
59:01 useful compute. And so you just started
59:03 using some flops from AMD. Been using
59:04 AMD for a long time. AMD finally came
59:07 around and agreed to give me some
59:10 computers. Okay. Again, I can buy the
59:13 computers, but I like to test
59:18 for buy-in: are you interested in what
59:19 I'm doing or not? Thank you for the
59:21 computers, AMD. A good cultural test, and
59:23 I think that AMD passing my cultural
59:24 test means that they're understanding
59:27 the value of this kind of stuff and I
59:30 see a bright future for them. So you
59:33 have this external culture test but also
59:36 you are running a very unique internal
59:38 culture around how Tiny is organized and
59:41 managed. Maybe you can talk a little bit
59:43 about that. Um I don't think that it's
59:45 like that radical anymore. I think
59:47 everything is basically going to go to
59:48 this format. So, we'll start with:
59:51 we don't do interviews. Your resume goes
59:54 right in the trash. God, you went to
59:56 college? Yeah, I don't care about that.
59:57 The only way that we hire is from people
59:59 who've contributed to the tinygrad
60:00 repo. Mhm. And then we have bounties.
60:02 So, you can get paid right away. And if
60:04 you are really skilled, you can make... we
60:07 had one guy who was just tearing
60:08 through the bounties, and he was
60:10 making two grand a week just tearing
60:12 through things, basically, and not
60:14 even full-time. If you are very skilled,
60:16 you can get paid a full-time salary off
60:18 of bounties alone. Are these folks
60:20 contributing to your bounties? Are they
60:22 themselves using AI for coding? The
60:24 thing about using AI for coding is AI is
60:28 below the bar of the person we would
60:30 hire at Tiny Corp. So, some companies
60:32 may have a very low bar. If you have a
60:34 very low bar, like if you
60:37 have employees that are worse coders
60:39 than ChatGPT, oh god, how did you hire
60:42 them? What was your process here? Uh,
60:44 they should all be fired and they should
60:46 all be replaced by ChatGPT, then. So we
60:49 encourage everybody to use AI to the
60:51 fullest of their ability. This isn't
60:52 like, we're not going to say, oh, you
60:54 can't use AI for the interview. No, use
60:55 AI. Please use AI. Please use AI when
60:57 you work here. The only thing that I
60:59 care about at the end of the day is the
61:00 final product. And I think that's where
61:03 everything is going to move. It's a
61:04 machine learning concept. You have a
61:06 training set and a test set. And you
61:08 want your training set and your test set
61:10 to come from the same distribution. If
61:11 your interview is your training set and
61:12 the actual work is the test set, you
61:14 want your interview and your work to be
61:17 as close as possible together. Think
61:18 about it. If you're a FAANG company and
61:20 you do LeetCode interviews, and then
61:22 what you actually have to do is fix this
61:25 bug in this web endpoint for people in
61:27 Nigeria trying to pay in M-Pesa, like,
61:31 there's just such a disconnect
61:33 there. So that disconnect should go to
61:34 zero. And that's kind of the first
61:38 premise of Tiny Corp. The only way to
61:39 get hired here is to contribute to
61:41 tinygrad, and then once you do get hired
61:42 here, it's the same job. Are there
61:44 specific tools that you feel like are
61:46 better and worse for assisting these top
61:49 level coders currently? No, it's
61:52 all the same knowledge as everybody
61:54 else. I think that they're... You don't
61:56 have a preference amongst them? I think
61:58 they're good for different things. Um,
62:01 ChatGPT is probably the all-around
62:03 best. Uh, Grok is recent. It's got
62:07 great recency, but it's prone to
62:09 hallucination. Uh, and then DeepSeek is
62:11 the best at, like, thinking. If you want
62:13 something that's really going to think
62:14 through something, GPT is kind of lazy.
62:16 DeepSeek works hard.
62:20 Uh, maybe something to be said there
62:22 about America and China, but, uh, yeah.
62:24 So, yeah, I think I use some
62:25 combination of DeepSeek, ChatGPT, and
62:28 Grok. Same with everybody else. One of
62:30 the risks that you and I have talked
62:31 about with AI and of course you're not a
62:33 doomer in the Yudkowsky sense is what
62:36 you've referred to as wireheading. What
62:38 is that risk and do you see that
62:41 happening now? So wireheading is simply
62:43 when you've hijacked your own reward
62:45 function. So your brain has some reward
62:48 function. Your brain has things that it
62:49 finds pleasurable, like food, like sex,
62:52 like productive work hopefully. But it
62:55 unfortunately also has other things that
62:57 it finds pleasurable, like TikTok and
63:00 Twitter and fentanyl and meth. Uh, and
63:04 these are unfortunate because they're
63:08 nonproductive. They're
63:10 these non-productive ways that stimulate
63:12 your pleasure center. I don't
63:14 think they're all like truly terrible
63:16 things and in moderation, whatever. But
63:19 the term wireheading comes from this
63:22 experiment done on rats. They drill a
63:25 little hole in the back of the rat's
63:26 brain, bring a little electrode
63:29 right into the brain stem, and the
63:31 electrode would pulse a little bit
63:32 of electricity deep down in the brain
63:34 where your pleasure center is. They hooked
63:36 it up to a button, and the button gives
63:38 you a little electricity and the rat's
63:39 like, "Oh, button. Oh, button. Yeah,
63:42 button. button. Oh, button button
63:46 button. And the rats die, and they
63:48 don't die because of an overdose of
63:50 pleasure or anything like this. They die
63:52 because why would you eat when you could
63:53 press button? Entertaining ourselves to
63:54 death. Entertaining ourselves to death.
63:56 Amusing ourselves to death. I think
63:57 that's the that's the Neil Postman,
63:59 right? Uh yeah. So I think that this is
64:02 that this is definitely happening and
64:03 that we will
64:05 lose some chunk of humanity to it. Do
64:08 you see it happening already? Yeah, but
64:11 not to the extremes, which... well. I
64:13 was talking to a 21-year-old who just
64:16 graduated from a great school and she
64:18 told me that a big chunk of her
64:20 classmates are dysfunctionally addicted
64:23 to TikTok. As in, they live with their
64:25 parents. They can't get out of bed and
64:27 do stuff. They aren't looking for jobs.
64:29 They're just watching TikTok all day.
64:30 Do you think we're starting to head in
64:32 that direction? I'm always reluctant to...
64:36 You can go back to the quotes where
64:37 Socrates said, "Wow, the youth of today
64:39 is so corrupt and obsessed with..."
64:42 There's always that article, right?
64:44 There's always that article of the older
64:45 people saying the kids are not all
64:47 right. Yeah. So, I'm not so sure that
64:50 we've really even seen it yet. Um, and
64:54 I'll get to what it might sort of look
64:56 like. And then people like, "Oh,
64:57 fentanyl. People are opiate addicts."
64:59 Yeah, but like then you go back to like
65:00 the Opium Wars a hundred years ago. I'm not
65:02 so sure this stuff was that different. I
65:05 mean, it's stronger. Yeah, but I don't
65:07 know. It's all interpreted
65:09 through a cultural lens, right? There's
65:10 no problem with fentanyl in China today.
65:12 Well, there's no crisis of meaning and
65:13 purpose. Yeah. So, I'm not so sure that
65:16 these things, even TikTok, are
65:18 reflective of it. But I mean,
65:19 there's a great story. It's
65:22 My Little Pony: Friendship is Optimal.
65:25 And I'm not going to
65:26 give away the... You're going to out
65:27 yourself as a brony. Oh, I mean, I have
65:29 watched many episodes
65:31 of My Little Pony: Friendship is Magic.
65:33 I can name the core six, uh, you know.
65:35 So, yeah. Um, I can name the different
65:37 types of ponies. Yeah, I do. So, the
65:38 rationalist version. Yeah. And I
65:41 think that wireheading is going to
65:42 look a lot more like that. Or we can
65:44 talk about the concept of heaven
65:45 banning. Heaven banning. Some guy
65:47 on Twitter, nearcyan, popularized
65:50 it, and it's an interesting idea. Heaven
65:53 banning is this: imagine you're
65:55 posting on social media and you're
65:57 getting lots of replies praising you,
66:00 giving you the kind of feedback you're
66:01 looking for, giving you likes, giving you
66:03 retweets, but of course all of these
66:05 people you're interacting with are AI
66:07 bots. I don't think that wireheading is
66:09 possible without sufficiently advanced
66:11 intelligence. I mean, maybe if you do it
66:14 with a drill and an electrode, sure; a
66:16 drill and an electrode will bypass
66:19 anything. But if you're trying to do
66:22 wireheading through an app, I don't
66:24 think wireheading through an app is
66:25 going to be possible until that app has
66:28 sufficiently more intelligence than you.
66:31 What are you doing personally today
66:33 yourself to protect yourself from
66:36 wireheading and various kinds of
66:39 amusing-ourselves-to-death addictions? There's many...
66:41 You've asked several of these sorts
66:42 of questions, like...
66:45 They're like alpha mining. And I feel
66:48 that there's way too much of this in the
66:49 world. Like way too much of these like,
66:51 "Oh, can you tell me the secret to being
66:53 a good programmer?" And I tell people
66:54 the story. You know how I became a good
66:56 programmer? I'm working on it. Oh, no,
66:59 no, no, no, no. So, there's a
67:01 volcano. It's a true story. So,
67:04 in Indonesia, you
67:07 know, there's the island of Bali. Uh,
67:09 you can go all the way to the west of
67:10 that island, and then you can get on a
67:11 boat, and that boat will take you to the
67:13 main island. Mhm. And on the main
67:14 island, there's this city called
67:16 Banyuwangi. No one's heard of this city.
67:18 Maybe you've heard of Jakarta, maybe
67:20 you've heard of Surabaya, but no one's
67:21 heard of Banyuwangi. Okay. And then an
67:23 hour outside of Banyuwangi, there is this
67:26 volcano called Ijen. And this volcano
67:28 is the only volcano in the world where
67:30 there are blue flames. There's sulfur
67:33 released by the volcano. The sulfur is
67:35 lit on fire and it turns blue. So one
67:38 night I set out in a
67:40 car, went to the volcano. You have to
67:43 climb up the volcano. I rented a gas
67:45 mask from an Indonesian man. True
67:47 story. I put the gas mask on
67:49 and I climbed down to the bottom of
67:52 the volcano and I saw the blue flames
67:54 and at that point I understood
67:56 programming. Okay. So just to defend my
67:58 question because I feel like it
67:59 requires defending. People like Cal
68:01 Newport, Digital Minimalism, have
68:03 written books with the intention of
68:05 helping people uh protect themselves
68:07 against companies spending billions of
68:09 dollars with data scientists to
68:11 monopolize their attention and change
68:13 how they think. You have no such
68:15 recommendations to offer and think it's
68:16 an illegitimate exercise to make such
68:19 recommendations. Is that the same guy
68:20 who made Newport cigarettes?
68:23 I don't think so. I'm sure they were
68:24 there to help people, too. Big family.
68:26 Big family. Lots of different... Or you can
68:28 read Seven Habits of Highly
68:30 Effective People, or How to Win Friends
68:32 and Influence People, or The 4-Hour Work
68:34 Week, or you can read... And do you think
68:35 those things are valueless? Yeah, of
68:38 course. You don't think people picking
68:39 up habits by mimesis is useful? I mean,
68:42 I don't think they're like valueless.
68:43 Like they're fine books to read. They're
68:44 not like, "Oh god, you read that book.
68:45 It's a terrible book and it'll lead you
68:47 in the wrong way." But no, I don't
68:49 think that... I think that a lot
68:51 of people spend way too much time
68:53 thinking about what I should do or how I
68:55 should optimize things and they should
68:56 actually just do things. I think that
68:58 people spend a lot of time in
69:00 trances, failing to introspect about
69:03 how they're thinking, how they're
69:04 spending their time. And I think helping
69:06 people break that trance enables people
69:08 to live more self-directed lives instead
69:10 of falling victim to companies that just
69:13 want their attention for money. I'm an
69:15 emo kid. Non-conforming as can be. You'd
69:17 be non-conforming, too, if you looked just
69:19 like me.
69:22 All right. Well, is that a good place to
69:24 leave it? Those are my questions. Thanks
69:26 for being on, George Hotz. I appreciate
69:29 it. Bye, Amanda Show. Watch the other
69:32 episodes. Watch my Twitch stream. Don't
69:34 watch my Twitch stream. Thank you for
69:36 listening to this episode of Endgame.
69:38 Please follow me on X, Amanda Cassatt.
69:40 Two S's, two T's. Follow the company I
69:42 founded, Serotonin HQ, and listen
69:44 wherever you get your podcasts.