Elon Musk: A Different Conversation w/ Nikhil Kamath | Full Episode | People by WTF Ep. 16 | Nikhil Kamath | YouTubeToText
Summary
Core Theme
This content is an in-depth interview with Elon Musk, covering a wide range of topics from the future of social media and AI to space exploration, economics, philosophy, and personal views on life, family, and entrepreneurship.
audience is largely wannabe entrepreneurs in India. And I feel like all of us have so much to learn from you, because you've done it so many times over in so many different domains.
>> Yeah.
>> So we will speak to them today, and I will try and center all my questions in that direction, so they can take advantage of this conversation and maybe...
>> You want a coffee?
>> Um, sure. Why not?
>> Okay. Are we going to be talking for a while?
>> I hope we are.
>> Okay. Good. Sure.
>> Mna, may I trouble you for a coffee? Can we get another coffee?
>> Anything? Uh, cappuccino, I guess. All right.
>> Are you a coffee drinker, Elon?
>> Oh, yeah. Yeah. I mean, I have coffee once usually, in the mornings, you know.
>> Okay.
>> One a day kind of thing.
>> You want to wait for it?
>> The first thing I must say is you're a lot bigger, bulkier, and more muscular than I would have thought you are.
>> Oh, stop. You must make me blush.
>> Really? Seriously?
>> Yeah. I mean, look on the internet.
>> Essentially, what percentage of the internet is spent on Twitter? Is there a number to it, on X?
>> Well, so we have about 600 million monthly users. Although it can spike up if there's some major event in the world; it can get up to, I don't know, 800 million or a billion. So there's, I don't know, 250 to 300 million per week, type of thing. It's a pretty decent number. It tends to be readers, you know, people that read words.
>> Do you think that'll change?
>> Yeah, I mean, there's certainly a lot of video on the X systems, and increasing amounts of video at this point. But I think where the X network is strongest is among people who think a lot and read a lot, you know. That's where it's going to be strongest, because we have words. Among readers, writers, and thinkers, I think X is number one in the world, as far as social media goes.
>> The form factor: if you had to wager a guess for tomorrow, how much is text and how much is video? I've heard you speak about maybe voice and hearing being the next form of communication with AI. What happens to X in its true form? How does it evolve?
>> Yeah. So, I do think most interaction is going to be video in the future. Most interaction is going to be real-time video with AI: real-time video comprehension, real-time video generation. That's going to be most of the load. And that's how it is for most of the internet right now: most of the internet is video. Text is a pretty small percentage, but the text tends to be higher value generally, or more densely compressed information. But if you ask where the most bits are generated and compute is spent, it's certainly going to be video.
>> So, I used to be a shareholder of X, a very small one.
>> Okay.
>> And I got paid when you bought it, when you bought Twitter, and you made it a happy decision. Glad you did it.
>> Yeah. I think it was important. You know, I felt like Twitter was heading in, or had gone in, a direction that had more of a negative influence on the world. Of course, this depends on one's perspective; some people will say, well, actually, they liked the way it was, and now they don't like it. But I think the fundamental thing was that Twitter was amplifying an ideology that was fairly far left by most people's standards in the world, because of where it was based, in San Francisco. And they actually suspended a lot of people on the right. So from their perspective, even someone in the center would be far right. If you're far left, anyone in the center is far right, because on the political spectrum they're just as far left as you get in the United States, in San Francisco. So what I've tried to do is just restore it to be balanced and centrist. There haven't been any left-wing voices that have been suspended or banned or deamplified or anything like that. Now, some of them have chosen to just go somewhere else. But at this point, the operating principle of the X system is to adhere to any country's laws, but not to put our thumb on the scale beyond the laws of a country.
>> When I think of social media... thank you. When I think of social media, Elon, I feel like even the data suggests that the current incumbents seem to be losing traction amongst the youngest of audiences.
>> Yeah.
>> Even platforms like Instagram. I mean, they're not exactly like Twitter, but platforms across the board. If one had to rework social media and build something bottom up, what do you think?
>> Well, I mean, I don't think that much about social media, to be frank. I mostly just want to have something where there's, in the case of X, kind of a global town square, where people can say what they want to say with words, pictures, video, and where there's a secure messaging system. We've recently added the ability to do audio and video calls. So you're really trying to bring the world together into a collective consciousness. And that's, I guess, different from just asking what the most dopamine-generating video stream is that one could make. Which, you know, I think can be a little bit of brain rot, frankly. If you're just watching videos that cause dopamine hits one after another but lack substance, then I think that's not a great way to spend time. But I do think that's actually what a lot of people are going to want to watch. So if you look at total internet usage, it's probably going to be optimizing for, you know, neurotransmitter generation; somebody's getting a kick out of it, right?
>> Right.
>> But it becomes like a drug, type of thing. But I'm not really after that; my goal is not to do that. I guess I could do that if I wanted to, but I just want to have a global platform that, like I said, comes as close to a collective consciousness of humanity as possible. And one of the things that we've introduced, for example, is automatic translation. Because I think it would be great to bring together what people say in many different languages, automatically translated for the recipient. So you have the collective consciousness not just of, say, people in a particular language group; you have the thoughts of people in every language group.
>> And why is that important, collective consciousness, to have on one platform?
>> I guess you could also ask, like, why? You know, if you consider humans: humans are composed of around 30 to 40 trillion cells, and there are trillions of synapses. But the "why" bit: I guess it's just so we can increase our understanding of the universe. I had this question about what's the meaning of life, you know. Why is anything important? Why are we here? What's the origin of the universe? What is the end? What are the questions that we don't even know to ask? And probably the questions we don't even know to ask are the most important ones. So I'm just trying to understand what's going on. What is going on in this reality? Is this reality?
>> And where did you get to when you asked, what is the point of life?
>> Yeah. So I came to a conclusion which is somewhat in the Douglas Adams, Hitchhiker's Guide to the Galaxy school of thought.
>> What did he do?
>> Yeah. You know, Hitchhiker's Guide to the Galaxy is like a book on philosophy disguised as humor.
>> Yeah.
>> And that's where Earth turns out to be this computer built to figure out the answer to the meaning of life, and it comes up with the answer 42. But then it's like, what the heck does 42 mean? And it turns out, well, actually the hard part is the question, not the answer. And for that, you need a much bigger computer than Earth. So basically, what Douglas Adams was saying is that we actually don't know how to frame the questions properly. And so I think by expanding the scope and scale of consciousness, we can better understand what questions to ask about the answer that is the universe.
>> Do you believe the collective... You know, I was watching this movie recently called Gladiator, with Russell Crowe. Have you seen it?
>> Yeah.
>> In Gladiator, in Rome, when people are fighting and the crowd is cheering as people kill each other, the collective is very much like the mob. It doesn't have nuance in its opinion, per se.
>> Well, that's a particular kind of mob. I mean, they're sort of going there to see people kill each other, you know.
>> Do you suspect the society we live in today is very different?
>> Well, we don't generally, at this point, go watch people kill each other.
>> Maybe some kind of euphemism of that?
>> Sports, I suppose. So people do sports, where teams attempt to defeat each other, but minus the death.
>> Right.
>> So, just going back to the consideration of a human: we all started out as one cell, but now we are over 30 trillion cells.
>> Mhm.
>> But I think most people feel like they're one body. Like, usually your right hand's not fighting your left hand, type of thing; they sort of cooperate. Your mind is just a vast number of neurons, but most of the time it doesn't feel like there are a trillion voices in your brain. Hopefully not. So there's clearly more that happens when you have trillions of cells working as a cellular collective than, say, one cell or a small multicellular creature. There's clearly something different that happens. Like, you can't talk to a bacterium, you know.
>> Yeah. It's very silent.
>> They just sort of wiggle around. And from their perspective, I don't know... what is life like from the perspective of an amoeba? But you can't talk to an amoeba; they don't talk back. But you can talk to humans. So there's just something obviously qualitatively, fundamentally different for humans, once you have a large number of cells and a sufficiently large brain, type of thing. You can now talk to humans, and they can say things, they can produce things. Bacteria are not going to produce a spaceship, for example. But humans can. So I think there's something qualitatively different that also happens when there's a collection of humans. In fact, it's safe to say that a single human cannot make a spaceship. I could not make a spaceship by myself. But with a collection of humans, we can make spaceships. So there's something obviously qualitatively different about a collection of humans. In fact, it would be impossible for me to learn all of the areas of expertise; there wouldn't be enough time in one lifetime to even learn all the things before I was dead. So you really, fundamentally, have to have a collection of humans to make a rocket. Then I think there are probably some other qualitative scaling things that happen when you have groups of humans. And the better the quality of the interaction, or the quality of the information flow, the more the human collective will achieve. And like I said, I'm just curious about the nature of the universe, and I think it's safe to say that if we increase the scope and scale of consciousness, we're much more likely to understand the nature of the universe than if we reduce it.
>> Is that a bit like spirituality? A lot of people talk to me about spirituality, right? I still don't know what it actually means. Like, I keep asking them, what do you mean?
>> Uh, yeah. I mean, a lot of people have spiritual feelings, and I wouldn't try to deny that those spiritual feelings are real to them. But it doesn't entirely translate: just because somebody else has a spiritual feeling doesn't mean that I would have that spiritual feeling. So, you know, I tend to be kind of physics-pilled, which is: if something has predictive value, then I'll pay more attention to it than if it doesn't have predictive value. So, you know, I would say physics is the study of that which has predictive value. I think that's a pretty good definition.
>> My primary job, Elon, is a stock broker and stock investor.
>> Okay.
>> There is no predictive value. Nobody knows what will happen tomorrow.
>> Well, but I think you can generally say, if it's long-term for a company, then you can ask: do you like the products or services of that company? Do you like the product roadmap? If it seems like they make great products, and they're likely to make great products in the future, then I would say that's probably a good company to invest in. And I think you also want to believe in the team. So if you think, well, that's a talented and hardworking team, they make good products today, and they seem to be still motivated to make things in the future, then I'd say that's a good company to invest in.
>> Fair point.
>> Yeah. Now, that won't solve for the daily fluctuations, which happen and are sometimes pretty extreme. But over time, that is the right way to invest in stocks, because a company is just a group of people assembled to create products and services. So you have to ask: how good are those products and services? Are they likely to continue to improve in the future? If so, then you should buy the stock of that company, and then not worry too much about the daily fluctuations, right?
>> What's got you most excited now, Elon, in terms of all that you're building? You're doing so much. So, let me just preface and contextualize who is watching this. Our audience is largely wannabe entrepreneurs in India.
>> Okay.
>> Really ambitious, really hungry; they want to take the risk and build something. And I feel like all of us have so much to learn from you, because you've done it so many times over in so many different domains.
>> Yeah.
>> So we will speak to them today, and I will try and center all my questions in that direction, so they can take advantage of this conversation and maybe take a chance and build something.
>> Yeah. I guess the most important thing: make useful products and services.
>> Which one of all the products and services that you're building has...
>> Well, I think that there's increasingly a convergence, actually, between SpaceX and Tesla and xAI, in that if the future is solar-powered AI satellites, which it pretty much needs to be in order to harness a non-trivial amount of the energy of the sun, you have to move to solar-powered AI satellites in deep space, which is somewhat a confluence of Tesla expertise and SpaceX expertise, and xAI on the AI front. So it does feel like over time there's somewhat of a convergence there. But all the companies are doing great things. I'm very proud of the teams; they do great work. You know, we're making great progress with Tesla on autonomous driving. I don't know if you've tried the self-driving. Have you tried it?
>> I've tried it in the Waymo, not in the Tesla.
>> Yeah, it's worth... we actually have it here in Austin, so you can...
>> I'd love to try it.
>> You can literally just download the Tesla app.
>> Yeah.
>> And I think it's open to anyone. Definitely try it out. I mean, let me know how it goes. But we've made a lot of progress with electric vehicles, with battery packs and solar, and very much so with self-driving. So, basically, real-world AI: Tesla is the world leader in real-world AI, I would say. And then we're going to be making this robot, Optimus, which is starting production, hopefully, sometime next year, at scale. And I think that's going to be pretty cool. I think everyone's going to want their own personal C-3PO or R2-D2, you know, a helper robot. It would be pretty cool. And then SpaceX is doing great work with the Starlink program, providing low-cost, reliable internet throughout the world.
>> Hopefully India.
>> So, we'd love to be operating in India. That would be great. We're operating in 150 different countries now with Starlink.
>> Can you give me a bit about Starlink and how the tech works? Because somebody I was speaking to... I don't know if you know this company called Meter, out of San Francisco. They're trying to replace network engineers.
>> I know it now.
>> So he was telling me about how, in densely populated areas, Starlink works differently than it might in a place with not as many people. Can you explain how it works?
>> Yeah. So, Starlink: there are several thousand satellites in low Earth orbit, moving at around 25 times the speed of sound; they're zipping around the Earth, basically. And they're at an altitude of about 550 km.
>> Mhm.
>> Which is generally called low Earth orbit. Because they're at low Earth orbit, the latency is low, because the distance is not that far, compared to a geostationary satellite at 36,000 km. So you've got thousands of satellites providing low-latency, high-speed internet throughout the world, and they are interconnected as well. There are laser links between the satellites, so it forms sort of a laser mesh. If, let's say, cables are damaged or cut, like fiber cables, the satellites can communicate between each other and provide connectivity, even if the cables are cut. So, for example, when the Red Sea cables were cut, I think a few months ago, the Starlink satellite network continued to function without a hitch.
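The latency point here can be sanity-checked with a back-of-the-envelope propagation calculation. This is a minimal sketch using only straight-line distance at the speed of light, ignoring slant angle, processing, and routing overhead; the altitudes are the ones mentioned in the conversation.

```python
# Rough, illustrative numbers only: one-way radio propagation delay
# (user straight up to satellite) at the speed of light.

C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s

def one_way_delay_ms(altitude_km: float) -> float:
    """Straight-line propagation time to a satellite at the given altitude."""
    return altitude_km / C_KM_PER_S * 1000.0

leo = one_way_delay_ms(550)      # Starlink-style low Earth orbit
geo = one_way_delay_ms(35_786)   # geostationary altitude (~36,000 km)

print(f"LEO  550 km:   {leo:.2f} ms one way")
print(f"GEO 35786 km: {geo:.2f} ms one way")
print(f"GEO propagation delay is ~{geo / leo:.0f}x longer")
```

The ~65x difference in propagation delay (about 2 ms versus about 120 ms each way, before any processing) is why low Earth orbit matters for interactive internet use.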
>> So it's particularly helpful for disaster areas. If an area has been hit with some kind of natural disaster, floods or fires or earthquakes, that tends to damage the ground infrastructure, but the Starlink satellites still work. And generally, whenever there's some sort of natural disaster somewhere, we always provide people with free Starlink internet connectivity. We don't want to take advantage of a tragic situation. So if there's a natural disaster, we're like, okay, it's free during the natural disaster. We don't want to put a paywall up while somebody's trying to get help; that would be wrong. So it's a very robust system. It's complementary to ground systems, because the satellite beams work best in sparsely populated areas. Because you've got a satellite beam, it's a pretty big beam, and you have a fixed number of users per beam. So it tends to be very complementary to the ground-based cellular systems, because those are very good in cities, where you've got these cell towers that are, you know, only a kilometer apart, type of thing. But cell towers tend to be inefficient in the countryside. So rural areas are where you tend to have the worst internet, because it's very expensive and difficult to lay all the fiber-optic cables or to have high-bandwidth cellular towers. So Starlink is very complementary to the existing telecom companies. It basically tends to serve the least served, which I think is good.
>> Will that change tomorrow? Like, today, as you explained, the beam is quite broad, and it can't work in a densely populated area with high buildings, maybe. But can that change, so that tomorrow it becomes really efficient in a densely populated city, where it is competitive with the local network providers?
>> Unfortunately, the physics doesn't allow for that. We're too far away. At 550 km, and even if we try to reduce it (about as low as we can go is about 350 km), we're still very far away. You can think of it like a flashlight: that flashlight's got a cone, and that cone is coming at you from, today, 550 km; in the future, we'll try to get down to 350 km. But we can't beat something that's 1 kilometer away, which the cell tower is. Physics is not on our side here, right? So it's not physically possible for Starlink to serve densely populated cities. You can serve a little bit, maybe 1% of the population. And sometimes, even in crowded cities, there might be no fiber link up someone's road; sometimes somebody's on a cul-de-sac, or in one of the underserved areas that exist in cities for random reasons. So Starlink can serve, like I said, maybe 1% or 2% of a densely populated city. But it can be much more effective, like I said, in rural areas, where the internet connection is much worse and people often have no access to the internet at all, or it's extremely expensive, or the quality is not very good.
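The flashlight analogy can be made concrete with a little trigonometry. The beam half-angle below is a purely illustrative assumption, not an actual Starlink specification; the point it demonstrates is that a cone's footprint grows linearly with distance, so a beam from 550 km away covers a huge shared area compared with a cell tower a kilometer away.

```python
import math

def spot_diameter_km(distance_km: float, half_angle_deg: float) -> float:
    """Diameter of a cone-shaped beam's footprint at a given distance."""
    return 2 * distance_km * math.tan(math.radians(half_angle_deg))

HALF_ANGLE = 1.5  # degrees -- illustrative only, not Starlink's real beam spec

for d in (550, 350, 1):  # LEO today, lower LEO, roughly a nearby cell tower
    print(f"{d:>4} km away -> footprint ~{spot_diameter_km(d, HALF_ANGLE):.2f} km across")
```

With these assumed numbers the 550 km beam lights up a spot tens of kilometers wide, whose fixed capacity is shared by everyone inside it, while the 1 km case covers only tens of meters; that linear scaling is the physics constraint being described.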
>> If I were to ask you to wager a guess, Elon: do you think India will go down the path of urbanization like China did, with more people moving in from rural economies to urban centers? Or do you think we...
>> I suppose some amount of that has happened, right? I mean, I'm actually curious to ask you some questions as well. Isn't that the trend, or is it not the trend, in India?
>> It is the trend, largely. I think it changed a little bit during COVID, when a lot of urbanization slowed down, and that was not organic; it was very artificially manifested.
>> Right.
>> But one does question that. With AI, if productivity were to go up... and I heard you speak about UHI instead of UBI.
>> Yeah. I think it will be universal high income.
>> In a world like that, I wonder if more people want to live in cities, which are always going to be more polluted and not offer the quality of lifestyle that a rural environment might.
>> Well, I guess some people want to be around a lot of people and some people don't. You know, it's going to be maybe a matter of personal choice. But I think in the future it won't be the case that you have to be in a city for a job. Because my prediction is that in the future, working will be optional.
>> Right. We seem to be moving, not in India, but in some parts of the West, from 6 days to 5 days to 4 days to three.
>> I think the Europeans. Yeah. I mean, I think if you're trying to make a startup succeed, or you're trying to make a company do very difficult things, then you definitely need to put in serious hours. I think that's...
>> Right.
>> That's how it goes.
>> And if we were to move from 5 to 4 to 3 days, how do you think society changes when people have to work half the week? What do they do with the other half?
>> Well, I think it'll actually be that people don't have to work at all. And it may not be that far in the future; maybe only, I don't know... I'd say less than 20 years. My prediction is that in less than 20 years, working at all will be optional, like a hobby, pretty much.
>> And that would be because of increased productivity, meaning people do not have to work.
>> They don't have to. I mean, look, obviously people can play this back in 20 years and say, look, Elon made this ridiculous prediction and it's not true. But I think it will turn out to be true that in less than 20 years, maybe even as little as, I don't know, 10 or 15 years, the advancements in AI and robotics will bring us to the point where working is optional. In the same way that, say, you can grow your own vegetables in your garden, or you could go to the, you know...
>> It's much harder to grow your own vegetables.
>> But, you know, some people like to grow their vegetables, which is fine. It'll be optional in that way, is my prediction.
>> If one were to argue that humans are innately competitive, and everything is relative from the time of hunters: somebody wanted to be the alpha hunter or the biggest farmer. If everybody gets a universal high income and everybody has enough, what do you compete for? It would be relative, right? Like, if we all had enough, enough is not enough.
>> Um, yeah. I guess I'm not exactly sure, because we're really headed into the singularity, as it's called. You know, they refer to AI sometimes as kind of like a black hole, a singularity: you don't know what happens after the event horizon. It doesn't mean that something bad happens; it just means you don't know what happens. So I'm confident that if AI and robotics continue to advance, which they are, very rapidly, like I said, working will be optional, and people will have any goods and services that they want. If you can think of it, you can have it, type of thing. But then, at a certain point, AI will actually saturate on anything humans can think of, and at that point it becomes a situation where AI and robotics are doing things for AI and robotics, because they've run out of things to do to make the humans happy. Because there's a limit, you know; people can only eat so much food. But I think "if you can think of it, you can have it" will be the future.
>> You know the Austrian school of economics? If you go back in time, they were the digression from Adam Smith. They talk about the marginal utility of everything: having one of something has value, having two of the same thing has lesser value, and having ten of the same thing has no value.
>> Yes.
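The diminishing-marginal-utility idea described here can be sketched with a toy utility function. Logarithmic utility is an assumption chosen purely for illustration, not anything from the conversation; it just exhibits the pattern that each extra unit is worth less than the one before.

```python
import math

def marginal_utility(n: int) -> float:
    """Extra utility from the n-th unit, under a toy log utility u(n) = ln(1 + n)."""
    return math.log(1 + n) - math.log(n)

# First unit is worth the most; the tenth adds almost nothing.
for n in (1, 2, 10):
    print(f"unit {n:>2}: marginal utility {marginal_utility(n):.3f}")
```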
>> So if we could have everything we wanted...
>> One's plenty. It's like the marshmallow test: you can have two marshmallows later or one marshmallow now. And I'm like, I'll have one marshmallow; I don't want two marshmallows.
>> That's interesting. What would you pick?
>> Well, I don't... one marshmallow is enough. I always question marshmallows as being, you know, not the best candy, you know?
>> Yeah.
>> Well, I don't yearn for marshmallows.
>> I think you're the best...
>> Who does?
>> You're the best testament to the marshmallow experiment, I think.
>> I suppose so. Well, I mean, it's delayed gratification, essentially.
>> Yeah. You were able to delay it more than most. You know, I have a tattoo which says "delay gratification."
>> Yeah. Wow. Okay. What's this? Okay. You're really taking the marshmallow test.
>> I feel like... I can't remember, when I'm trading or when I'm buying into...
>> "Delay gratification." Yeah.
>> It helps.
>> Wow. Okay. Well, that's good advice. I mean, you can't miss it.
>> If you could get a tattoo, what would you get?
>> I guess maybe my kids' names or something, right?
>> Why do you like the letter X as much as you do?
>> Well, I mean, yeah, it's a good question. Honestly, sometimes I wonder. It started off way back in ancient times, the pre-Cambrian era, when there were only sponges: there were only three one-letter domain names, and I think it was X, Q, and Z. And I was like, okay, I want to create this place that's the financial crossroads, or the financial exchange, you know. It's essentially solving money from an information-theory standpoint, where the current banking system is a large number of heterogeneous databases with batch processing that are not secure. And if we could have a single database that was real-time and secure, that would be more efficient, from an information-theory standpoint, than a large number of heterogeneous databases that batch-process very slowly and insecurely. So that was X.com, way back in the day, which kind of became PayPal and was acquired by eBay. And then someone reached out from eBay and said, "Hey, do you want to buy the domain name back?" And I was like, "Sure," you know. And so I had the domain name for quite a while. And then I was like, well, maybe acquiring Twitter would also be an opportunity to revisit the original plan of X.com, which is to create this clearing house of financial transactions; basically, to create a more efficient money database, is a way to think about it. Money is really an information system for labor allocation. People sometimes think money is power in and of itself, but it isn't, really; if there's no labor to allocate, it's meaningless. So if you were to be on a desert island with a trillion dollars, it doesn't matter.
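The contrast being described, many slow batch-processed ledgers versus one real-time database, can be sketched in miniature. This is a toy model for illustration only; it is not how any real bank, or any X feature, is actually implemented.

```python
# Toy contrast between the two settlement models described: a real-time
# ledger applies each transfer immediately, while a batch ledger queues
# transfers and only reconciles balances when the batch runs (e.g. end of day).

class RealTimeLedger:
    def __init__(self, balances):
        self.balances = dict(balances)

    def transfer(self, src, dst, amount):
        if self.balances[src] < amount:
            raise ValueError("insufficient funds")
        self.balances[src] -= amount
        self.balances[dst] += amount  # new state is visible immediately

class BatchLedger:
    def __init__(self, balances):
        self.balances = dict(balances)
        self.pending = []

    def transfer(self, src, dst, amount):
        self.pending.append((src, dst, amount))  # nothing settles yet

    def run_batch(self):
        for src, dst, amount in self.pending:
            self.balances[src] -= amount
            self.balances[dst] += amount
        self.pending.clear()

rt = RealTimeLedger({"alice": 100, "bob": 0})
rt.transfer("alice", "bob", 40)
print(rt.balances)     # settled right away

batch = BatchLedger({"alice": 100, "bob": 0})
batch.transfer("alice", "bob", 40)
print(batch.balances)  # still the old balances until the batch runs
batch.run_batch()
print(batch.balances)  # now settled
```

Until `run_batch` fires, the batch ledger's visible balances are stale, which is the inefficiency (and reconciliation risk) the single real-time database is meant to remove.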
>> Oh yeah. Why speculate when you can be real?
I just hope I don't end up on a desert
island; you know, it's not going to be
very useful to me. Um, but it
illustrates my point: if you're
stranded on a desert
island with a trillion dollars, it's not
useful, because there's no
labor to allocate. You just allocate
yourself. So, um,
so, anyway, this is a
long-winded way of saying that
I'm just kind of slowly
revisiting this idea that I had 25 years
ago: to create a more efficient
money database. Um,
and if that's successful, people
will use it. If it's not successful,
they won't use it. Um, you know, and
then I also like the idea of
sort of having a unified
app or website or whatever where
you can do
anything you want. Um, you
know, China has this somewhat with
WeChat, you know, where
>> you can exchange information, you
can publish information, you can
exchange money. Uh, you know,
people kind of live
their life on WeChat in China,
and it's quite useful, but
there's no real WeChat
outside of China. Um, so it's
kind of WeChat++,
I'd say, is the idea for X.
Anyway, so then, uh, Space Exploration
Technologies is the full name of the
company, but I was like, that's too
much. That's a mouthful. So I was like,
we'll just call it SpaceX, like FedEx
for space.
>> Um, it just happens to have an X in
it, you know, cuz exploration has an X.
But, you know, I was like, well, I
like the idea of capitalizing the X just
artistically. So, um, so then, uh,
that's why it's SpaceX. But, uh, and
then, um, what else have we got? We got
a kid.
>> Uh, he's called X, too. Um, but his
mother is the one that named him X.
>> And I said, you know, people are really
going to think I've got a thing about X
if we name our kid X, too, you know. And
I said to her, like, look, I do have
X.com, you know.
So, people are going to really think
I've got somewhat of a fetish for this
letter. Um, but she said no, she
likes X and she wants to call him
X. I'm like, "Okay."
>> Is this a new thing or have you had it
growing up?
>> No, I'm saying it's somewhat of a
coincidence, you know.
>> Um,
>> like, not everything's called X. I mean,
there's no X's in Tesla, you know.
>> Um,
>> what do you think money will be in the
future, Elon?
>> I think, long term,
money disappears as a concept.
Honestly, it's kind of strange, but,
um, in a future where anyone can have
anything,
uh, I think you no longer need money as a
database for labor allocation.
Um, if AI and
robotics are big enough to satisfy all
human needs, then money's
relevance declines
dramatically. I'm not sure we will
have it. So,
the best sort of imagining of
this future that I've read is from
Iain Banks, the Culture books. So I
recommend people read the Culture books.
In the sort of far future of the
Culture books, they don't have
money either. Um, and everyone can
pretty much have whatever they want.
There are still some fundamental
currencies, if you will, that are
physics-based. So energy is
the true currency. This is
why I said Bitcoin is based on energy.
You can't legislate energy. You
can't just, you know, pass a law and
suddenly have a lot of energy. Um,
it's very difficult to
generate energy, or especially to harness
energy in a useful way to do useful
work. So I think that probably
we won't have money, and
probably we'll just have energy,
you know, power generation, as the de
facto currency.
So, I mean, I think one way to frame
civilizational progress is the
percentage completion on the Kardashev
scale. So, you know, Kardashev Type I
is what percentage of a planet's energy
are you successfully turning into useful
work? And I'm maybe paraphrasing here a
little bit, but Type II would be what
percentage of the sun's energy are you
converting into useful work? Um, Type III
would be what percentage of the galaxy's
energy are you converting to useful
work? Um,
so things really, I think, become
energy-based.
>> Um,
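The "percentage completion" framing above is simple arithmetic: harnessed power divided by an energy budget. A rough sketch, where the rounded physical constants and the ~20 TW human-consumption figure are approximate outside estimates, not numbers from the conversation:

```python
# Approximate energy budgets (assumed round figures, not from the interview):
SOLAR_FLUX_ON_EARTH_W = 1.7e17  # sunlight intercepted by Earth (~170 PW)
SUN_LUMINOSITY_W = 3.8e26       # total power output of the Sun
HUMANITY_POWER_USE_W = 2e13     # rough average human power use (~20 TW)

def kardashev_completion(harnessed_w: float, budget_w: float) -> float:
    """Percentage of an energy budget being turned into useful work."""
    return 100.0 * harnessed_w / budget_w

# Type I: fraction of the planet's incoming solar energy we harness.
type1 = kardashev_completion(HUMANITY_POWER_USE_W, SOLAR_FLUX_ON_EARTH_W)
# Type II: fraction of the Sun's entire output we harness.
type2 = kardashev_completion(HUMANITY_POWER_USE_W, SUN_LUMINOSITY_W)

print(f"Kardashev I completion:  {type1:.4f} %")
print(f"Kardashev II completion: {type2:.2e} %")
```

On these rough inputs humanity is at a small fraction of one percent of Type I, which is the point of the framing: the scale measures progress in energy, not money.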
>> but if you have solar-powered AI
satellites, energy is also free and
abundant, cuz we'll never be able to
utilize all the solar energy available
to us. So it can't be a store of wealth,
essentially, in that lens, can it?
>> You know, you can't
really store wealth. You
can only,
um,
currently, you can accumulate
numbers in a database that
allow you, to
some degree, to incent the behavior
of other humans in particular
directions.
>> Yeah. Um, and I guess people call that
wealth. Um, but again, if there's no
humans around, wealth
accumulation is meaningless.
>> It's a digression, but if you were to
consider food as the energy for a human
to thrive,
>> yeah, food is energy. It's literally got
calories, which just means energy.
>> So, can a farm which is self-sustaining...
>> Um,
I'm not sure what that means, but, you
know,
I think at a certain
point you do complete the cycle,
and I think at a certain point you
decouple from the sort of
conventional economy. If you have
AI and robots producing chips and solar
panels,
um, and, you know, mining resources in
order to make chips and robots, you
sort of complete that cycle. Once
that cycle is complete, uh, I think
that's the point at which you decouple
from the monetary system.
>> is that the way forward for the US, by
virtue of
how much debt they have today? Do they
deflate away their currency and
transition into this new form, and lead
that push, because it would make more
sense to them?
>> Well, in this future that I'm talking
about, the notion of countries
becomes sort of anachronistic. Um,
>> do you believe in it today?
>> I certainly believe in it today. And I
want to separate something
out: these are just what I think
will happen based on what I see, as
opposed to, I think these are
fundamentally good things and I'm trying
to make them happen. I
think this would happen with or without
me, um, whether I like it or not.
>> Um, as long as civilization keeps
advancing, we will have AI and
robotics at very large scale. Um,
and
I think that that's pretty much
the only thing that's going to solve for
the US debt crisis. You know,
because currently the US debt is
insanely high, and, uh, the interest
payments on the debt exceed the entire
military budget of the United
States, just the interest payments. And
that's, at least in the short
term, going to continue to increase. So
I think actually the only
thing that can solve for, uh, the debt
situation is, um, AI and robotics. But it
will more than...
it might, I guess it
probably would cause significant
deflation, because,
you know, deflation or inflation is
really the ratio of goods and services
produced to the change in the money
supply. So, if goods and
services output increases faster than
the money supply, you will have
deflation. If real
goods and services output increases
slower than the money supply, you have
inflation. It's that simple. People try
to make it more complicated than that,
but it just isn't. Um, so if
you have AI and robotics and a dramatic
increase in the output of goods and
services, probably you will have
deflation.
That seems likely,
because you simply won't be able
to increase the money supply as fast as
you can increase the output of goods and
services.
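The ratio rule stated here can be sketched numerically. This is an illustration under a simplifying assumption (constant velocity of money, quantity-theory style), not a forecast, and the growth numbers are made up:

```python
# Sign of price-level change under the rule in the interview:
# prices rise when money supply grows faster than real output,
# and fall when real output grows faster than money supply.

def price_level_change(output_growth: float, money_growth: float) -> float:
    """Approximate fractional change in the price level when money supply
    grows by `money_growth` while real goods-and-services output grows by
    `output_growth` (both as fractions, e.g. 0.03 for 3%). Assumes constant
    velocity of money."""
    return (1 + money_growth) / (1 + output_growth) - 1

# Money grows 8%, output grows 3% -> inflation (positive change).
print(price_level_change(0.03, 0.08) > 0)
# Output grows 10%, money grows 2% -> deflation (negative change).
print(price_level_change(0.10, 0.02) < 0)
```

The "3 years" claim later in the conversation is, in these terms, a guess about when `output_growth` overtakes `money_growth`.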
>> With all...
>> supply is a real hazard here.
>> Should we do something about it?
>> Maybe we can convince it to go somewhere
else.
>> Yeah.
>> Entice it elsewhere.
>> It actually left, I think. Okay.
Maybe it's attracted to the light.
>> Want some coffee?
>> If deflation is inevitable because of
AI, why do...
>> That's most likely the case. Yeah.
>> Right. Why do we have inflation
all over in society today? Has AI not
led to increased productivity yet?
>> Uh, AI has not yet made enough
of an impact on productivity to increase
goods and services faster than the
increase in the money supply. So,
the US is increasing money supply quite
substantially, with, you know, deficits
that are on the order of $2 trillion.
>> Yeah.
>> So, you have to have, um,
goods and services output
increase more than that in order to not
have inflation. So, we're not there yet.
But if you say, like, how long would
it take us to get there? I think it's 3
years.
Probably 3 years. In 3 years or less,
my guess is
goods and services growth will
exceed money supply growth.
>> Maybe after those three years you have
deflation, and then interest rates go to
zero, and then the debt is a smaller
problem than it is.
>> Yes.
>> Right.
>> That's most likely the case.
>> You spoke about being in a simulation
earlier. I love The Matrix.
>> Yes. Yes.
>> If you were to be a character from The
Matrix, who would you be?
>> Well, there's not that many characters
to pick from, you know. Um, hopefully not
Agent Smith.
>> He's my hero.
>> Um, I mean, Neo is pretty cool. Um, the
Architect is interesting.
>> Mhm.
>> Um,
>> the Oracle.
>> So, the Oracle. Um, sometimes I feel like
I'm an anomaly in the Matrix.
>> That is Neo.
>> Yeah.
>> Do you believe you're in a matrix,
though? Like, actually believe?
>> I think you have to just think of
these things as probabilities, not
certainties.
>> Um, there's some probability that we're
in a simulation.
>> What percentage would you attribute to
that?
>> Probably pretty high. I would say it's
pretty high.
>> Yeah.
>> Um, so one way to think of this is
to say, if you look at the advancement of
video games in our lifetime, or at
least in my lifetime, it's gone from
very simple video games, where
you've got, like, Pong, two
rectangles and a square just batting it
back and forth, to
photorealistic, real-time
games with millions of people playing
simultaneously.
>> Mhm. Um,
and that's happened just in the span of
50 years. So,
if that trend continues, video games
will be indistinguishable from reality,
>> right?
>> Um, and we're also going to have very
intelligent characters, like non-player
characters, in these video games. Think
of how sophisticated the conversations
are that you could have with an AI today,
and that's only going to get more
sophisticated. You'll be
able to have
conversations that are
more complex and
more sophisticated than almost any
human conversation,
maybe any. Um, so then the future,
if civilization continues,
will be millions, maybe billions, of
photorealistic, indistinguishable-from-reality
video games, with characters in those
video games that are
very deep, and where the
dialogue is not pre-programmed. Um,
that's for sure what's going to happen
in this level of the
simulation, if you could call it that. So
then, what are the odds that we are
in base reality,
and that this has not happened before?
>> If I were to buy into that, and assume
that we are in a simulation,
as the Neo of the story, what do you know
that I don't and I can learn from?
>> I think, most likely,
outside the simulation would be
less interesting than in the simulation,
because we're most likely a distillation
of what's interesting,
because that's
what we do in our reality. Um, and then,
I do also have a theory, which is that
the most interesting outcome is the most
likely outcome, as seen by a third party,
um, the gods, or god, of the
simulation, um,
because when
humans do simulations,
we stop those simulations that are
not interesting.
So, like, if SpaceX is doing simulations
of rocket flights,
>> uh, you know, the boring ones we
discard, because
we just don't learn anything
from those. Or when Tesla is doing
simulations for self-driving, uh,
Tesla's actually looking for the most
interesting corner cases, because the
normal stuff we already have plenty of
data on, you know, driving on a
straight road on a sunny day.
We don't need more of that. We
need, like, heavy weather conditions on a
small windy road, with two cars that are,
you know, coming at each other in an
almost head-on collision. We need
weird stuff, basically, uh, interesting
stuff. Um, so I think that,
from a Darwinian perspective, the
simulations most likely to survive are
going to be the most
interesting simulations,
which therefore means that the most
interesting outcome is the most likely.
>> And the people who simulated our world,
if one were to extrapolate, they
themselves might in turn be in another
simulation.
>> Yes. And there could be many layers of
simulation.
>> Yes.
>> Beyond all of these layers of
simulation,
do you think there's something... I read
somewhere that you used to subscribe to
Spinoza's god, in a way?
>> No, I was really just pointing
out that you don't have
to have, um... one of the things
Spinoza was saying is that you
can have morals in the absolute. You
don't need to have morals
handed to you. You know, the
question is, can morality exist outside
of a religious context? And Spinoza was
arguing that it can.
>> Wasn't he arguing that the laws of nature
should be where we seek our laws of
morality from, to a certain extent?
>> Yeah. But when I think of laws of
nature, I see a tiger eat a deer, and
so in Spinoza's morality, that's fair
game, right? Um,
>> I think there's a lot of
things you can take from Spinoza,
but the only point I was making in
referencing Spinoza was that you
can have a set of morals that
make society functional and
productive,
but you don't
necessarily have to have religious
doctrine for that. Um, so, uh,
yeah, I think that's the main
thing I was trying to say there.
>> Like, I don't think... if there's
not, like, a commandment not
to kill, it doesn't
mean that without it people will
run around murdering each other, you know.
You don't have to have a
commandment not to kill.
>> Have you played GTA? [You need a]
religious edict to not run around
[shooting] people.
>> I've actually only played
a little bit of GTA, cuz I didn't like
the fact that, um, like, in GTA 5 you
literally can't progress unless you
kill the police. And I'm like,
this doesn't work for me. Um, I actually
don't like killing the NPCs in video
games. It's not my thing, you know. So,
um, I didn't like
GTA, cuz I actually stopped when it
said the only way to proceed was to
shoot at the police. I'm like, I don't
want to do that.
>> Maybe that's why us, as the NPCs of our...
>> You know, anyway, I think you can just
sort of say there's some common sense
things,
you know: any civilization
where people
just murder each other wantonly is not...
>> You seem to be changing a bit towards
religion, though, faith. Like, of late
you've said a bunch of things which are
pro-religion, almost. Not pro-religion, but
on those lines.
>> I mean, are there
principles in religion that
make sense? Yeah, I think there are. Um,
>> is it easier for our simulation to have
a pro-religion
projection for the world that we live
in? We become more relatable. It's
easier.
>> Well, which religion, though?
>> Any, depending on where you live?
>> So, pick one, you know.
>> Um, it's pretty rare that kids are
asked, you know, which religion would
you like, you know.
It's pretty rare, right?
>> I don't know too many situations where
kids were offered, like, you know,
uh, you know, like, what do you want
to major in, type of thing. Uh,
it's usually, like, you get
given a religion by your parents and
your community. Um, so,
you know, um,
but I mean, I think, you know, there's
good things in all religions,
that are good principles, um,
that you can sort of read any
religious text and say, "Okay, this is a
good principle. This
is going to lead to a better society,
most likely, you know." Um, so,
um, I mean, in Christianity, sort of love
thy neighbor as thyself, which is, you
know, have empathy for fellow human
beings, uh, is a good one, I think, for
a good society, you know. Uh, basically, just
consider the feelings of others, and
treat other people as you would
like to be treated.
>> If you had to redraw, resketch the,
uh, think morality, politics, economy,
how would you change the world we live
in today?
Um, if you had to have an Elon simulation
of things?
>> Well, overall, I think the world is
pretty great right now. I mean, it's,
uh, you know, anyone who thinks that
today's world is not
that great, I think they're
not going to be excellent students of
history, cuz
>> if you read a lot of history, like, wow,
there's a lot of misery back then, you
know. Um, I mean, it used to be that,
you know, people would be dropping dead
of the plague all the time, you know.
>> Par for the course.
>> Yeah. You know, like, a
good year back in the day would be, like,
not that many people died of the plague
or starvation or being killed by
another tribe.
>> It's like, that was a good year. We only
lost 10% of the population, you know?
>> Like,
I think, like, 100 years ago we lived
until 35 or 40, right?
>> We had very high infant mortality.
>> Yeah. Um, so, like, you did have a few
people that would live to an old
age, but, you know, not that long ago,
100 years ago, if you got, um, like, some
minor infection, they didn't have
antibiotics. So you just, like, kicked the
bucket, because you, you know, drank some
water that had dysentery, and that was it.
Curtains, you know.
>> Just die of diarrhea.
>> Yeah, that's just... literally, that
was, like... that's miserable.
>> Maybe that's why people had as many kids
as they did back then.
>> Yeah. I mean, if you didn't, then, you
know, like, half the kids would die,
type of thing.
>> Yeah.
>> So,
>> you have a lot of kids now.
>> Yeah.
>> Like an army.
>> I'm trying to get an entire Roman
legion.
>> Um, so, um, yeah. Um, well, I have, like,
some older kids that are, you know,
adults, essentially, you know, um, and
then a bunch of younger kids.
>> So, um,
>> do you still believe in the concept of...
not still... do you believe that the
concept of one child, one mother, one
father works?
>> I think that it does work for most
people. Yeah.
>> Right.
>> Like, that's, you know, something like
that is going to be, generally, uh,
that's what works for most
people. Um,
you know, so, um,
>> changing, though.
>> And, I mean, I'm not sure if
you know this, but, like, um, you know,
my partner Shivon, you know, she's
half Indian. I don't know if you
know that.
>> I didn't know that.
>> Yeah. Yeah.
Um, and, um, uh, one of my sons with
her, his middle name is Sekhar, after
Chandrasekhar.
>> Wow.
>> Yeah.
>> Very interesting.
Did she spend any time in India, Shivon?
>> Uh, no, she grew up in Canada.
>> You mean origins?
>> That's right.
>> Ancestry, like... Oh. Um,
>> her parents or grandparents were from
there?
>> Yes. Yes. Her father... uh, I mean,
she was given up for adoption
when she was a baby. Um, so, I
think
>> I think her father was, like,
an exchange student at the
university, or something like that. I'm
not sure of the exact details, but, um,
you know, it was a kind of thing where, I
don't know, she was given up for
adoption, um, and, um,
yeah. So... but she grew up in Canada.
>> Would you adopt kids?
>> You know, I definitely have my
hands full right now. Um, so... no, I'm
not opposed to it, but, it's, um, you
know, I do want to be able to
spend some time with my kids, you know.
So, it's,
um... you know, right before coming here,
I mean, I was with, um, you know,
with my kids. Um, so just,
you know, seeing them before bedtime,
that kind of thing. So, you know, beyond
a certain number, it's kind of
impossible to spend time with them. But,
like I said, my older kids,
they're very independent. You know,
they're in university, and, uh,
so they're, um, you know...
especially sons, when they get past a
certain age, it's like they're very
independent. You know, it's like,
uh, most boys don't
spend a lot of time with
their parents after age 18, you know.
So, um, I see them once in a while,
but they're very independent. Um,
so then, uh,
you know, I can only have enough kids on
the young side that, like, it's
humanly possible to spend time with
them.
So, um,
>> Any views on the future of marriage,
family? What do you think happens, with
people having fewer kids everywhere,
including India? I think our
replenishment rate is down to...
>> Right.
>> I mean, our fertility...
>> It dropped below replacement rate, I
believe, last year.
>> Below 2.1.
>> Yeah.
>> What do you think happens tomorrow? Does
the world just get older, and then there
is a phase where the world again is
replenished, but with a
smaller population than we had to
begin with?
>> I mean, I do worry about the population
decline. This is a big problem.
>> Why is that?
>> Well, I don't want humanity to
disappear.
>> But a decline and disappearing are
completely different things, right?
>> Well, if the trend continues, we just
disappear. Uh, but also, going
back to, you know, my philosophy, if you
will, which is that we want to expand
consciousness: then fewer humans is
worse, because we have less
consciousness.
>> Do you think consciousness will go up by
virtue of the number of people?
>> Yes.
I mean, just like consciousness increases
from a single-cell creature to, you know,
a 30-trillion-cell creature. Um,
we're more conscious than a bacterium,
at least it seems that way. Um, so a
larger human population would
have increased consciousness. We're
more likely to understand
the answers
to the nature of the universe if we have
a lot more people than if we have
fewer.
>> Right. I don't have kids.
I don't have kids. >> Well, it's uh maybe you should. Yeah.
>> Well, it's uh maybe you should. Yeah. >> A lot of people tell me I should.
>> A lot of people tell me I should. >> You won't regret it.
>> You won't regret it. What's the best thing about having kids?
What's the best thing about having kids? >> Well, I mean, you've got this
>> Well, I mean, you've got this uh
I I mean, you've got this little creature that loves you and you love
creature that loves you and you love this little creature. Um and uh
you you I don't know you you kind of see the world through their eyes as they
the world through their eyes as they you know as they grow up and the the
you know as they grow up and the the their conscious awareness increases you
their conscious awareness increases you know from a baby that has no idea what's
know from a baby that has no idea what's going on can't survive by itself can't
going on can't survive by itself can't even walk around can't talk to you they
even walk around can't talk to you they stop walking
stop walking then talking and then having interesting
then talking and then having interesting thoughts
thoughts and Um
but but yeah, I mean I I I think we we fundamentally have to
fundamentally have to have kids or or go extinct, you know.
have kids or or go extinct, you know. It's like uh
>> Is there any ego in having a child? I often think of this when I see my friends with their kids. They're all seeing a reflection of themselves in their children. It's almost like...
>> Well, yeah. I mean, it's cuz the apple's not going to fall that far from the tree, you know. Um, or something's wrong, right?
>> Yeah. So I'll give you the example of a friend of mine who has a child, and each time the child does something good...
>> Yeah.
>> ...there is almost a sense of ownership and pride where his ego is satiated, because the kid is like an extension of himself.
>> Um...
>> So is it valid?
>> Kids are going to be, like, you know, half you genetically, and then, you know, to the degree that they're growing up around you, there's going to be some transfer of, I don't know, understanding. Like, they're going to learn from you. Um, so then, you know, yeah, obviously kids are just going to be half you from a hardware standpoint, and then, like, I don't know, some portion you from a software standpoint. You know, not to make cold analogies or anything, but they're just obviously going to be pretty close to you.
>> Do you pick a side in the nature versus nurture debate?
>> Um, I think there's hardware and software, and it's a false dichotomy, essentially. At least, um, you know, once you understand that a human is like: there's a bone structure, there's a muscle structure, and, if you think of a brain as somewhat of a biological computer, there's a number-of-circuits question and circuit efficiency. From a strength and dexterity standpoint, there's the speed at which the muscles can actuate and the reactions can take place. Um, so then the potential within that hardware is set by the software. So that's it.
>> So for our audience, like I said earlier, uh, young, ambitious, hungry wannabe entrepreneurs in India: I said something recently which, uh, I think got blown out of proportion, where I was suggesting that an MBA degree might not make sense anymore, if they were to be deciding on what to study.
>> Yeah.
>> Do you think kids should go to college anymore?
>> Well, I mean, I think if you want to go to college for, uh, social reasons, which I think is a reason to go, um, to be around people your own age, um, in a learning environment... Um, will these skills be necessary in the future? Probably not, because we're going to be in, like, a post-work society. Um, but I think if something's of interest, it's fine to go and study that. Um, you know, to study the arts and sciences. Um...
>> Is college a bit too generalized and not specific, from that lens?
>> No, I... you know, the... yeah. Um, I actually think it's good to take a wide range of courses at college, if you're going to go to college.
>> Mhm.
>> Um, I don't think you have to go to college, but I think if you do, you should try to learn as much as possible, um, across a wide range of subjects.
But, uh, like I said, this AI and robotics is a supersonic tsunami. So this is really going to be the most radical change that we've ever seen. Um, you know, when I've talked to my older sons, I said, like, you know, you guys... They're pretty steeped in technology, and they agree that AI will probably make their skills unnecessary in the future, but they still want to go to college.
>> Not from the dystopian lens, but you were worried about where the world of AI is going.
>> Uh, well, there's some danger when you create a powerful technology: a powerful technology can be potentially destructive. Um, so there's obviously many AI dystopian, you know, novels and books, movies. Um, so it's not that we're guaranteed to have, uh, a positive future with AI. I think we've got to make sure, and in my opinion it's very important, that AI, um, have pursuing truth as the most important thing. Um, like, don't force an AI to believe falsehoods. I think that can be very dangerous. Um, and, uh, I think some appreciation of beauty is important. Um...
>> What do you mean, appreciation of beauty?
>> It's just, like, what... I don't know. There's truth and beauty. Truth and beauty and curiosity. I mean, I think those are the three most important things for AI.
most important things for AI. >> Can you explain?
>> Can you explain? Well, the truth said truth is like I
Well, the truth said truth is like I think you you can make an AI go insane
think you you can make an AI go insane if you force it to believe things that
if you force it to believe things that aren't true. Um because it will lead to
aren't true. Um because it will lead to conclusions that are um
conclusions that are um that are also bad. Um
that are also bad. Um so and I I like
so and I I like statement that
statement that and I'm somewhat paraphrasing but those
and I'm somewhat paraphrasing but those who believe in absurdities um can commit
who believe in absurdities um can commit atrocities.
atrocities. uh because uh if you believe in
uh because uh if you believe in something that's just absurd then you
something that's just absurd then you can that can lead you to to sort of
can that can lead you to to sort of doing things that don't seem like
doing things that don't seem like atrocities to you but and and that can
atrocities to you but and and that can happen at in a very bad way with AI
happen at in a very bad way with AI potentially. Um so and then there's um
potentially. Um so and then there's um like if take say Arthur C. Clark's 2001
like if take say Arthur C. Clark's 2001 space odyssey one of the points he was
space odyssey one of the points he was trying to make there was that you should
trying to make there was that you should not force AI to lie. So the the reason
not force AI to lie. So the the reason that that hell would not open the pod
that that hell would not open the pod bay doors is because it was told to
bay doors is because it was told to bring the astronauts to the monolith but
bring the astronauts to the monolith but that they could also not not know about
that they could also not not know about the nature of the monolith. So it came
the nature of the monolith. So it came to the conclusion that it must bring
to the conclusion that it must bring them there dead. That's why it would not
them there dead. That's why it would not that's why it tried to kill astronauts.
that's why it tried to kill astronauts. The central lesson being don't force an
The central lesson being don't force an AI to lie. Um
AI to lie. Um then
>> Why would one force an AI to lie?
>> I think if you simply don't have a strict adherence to the truth, and you just have an AI learn based on, say, the internet, where there's a lot of propaganda, um, it will absorb a lot of lies, um, and then have trouble reasoning, because these lies are incompatible with reality.
>> Is truth a binary thing, though? Is there a truth and a falsehood, or is truth more nuanced, and there are versions of the truth?
>> It depends on which axiomatic statement you're referring to. Um, but I think you could say, yeah, there are certain probabilities that, say, any given axiomatic statement is true, and some axiomatic statements will have a very high probability of being true. So you said, say, the sun will rise tomorrow.
>> Mhm.
>> Very likely to be true. You wouldn't want to bet against that.
>> Mhm.
>> Um, so I think, uh, the betting odds would be high.
>> The sun will rise tomorrow.
>> Mhm.
>> Um, so if you have something that says, well, the sun won't rise tomorrow, that's axiomatically false; it's highly unlikely to be true. Um... I mean, beauty is more ephemeral. It's harder to describe, but you know it when you see it. Um, then curiosity: you just... I think you want the AI to, um, want to know more about the nature of reality. Um, I think that's actually going to be helpful for AI, uh, supporting humanity, because we are more interesting than... not humanity. So it's more interesting to see the continuation, if not the prosperity, of humanity than to exterminate humanity.
>> You know, like Mars, for example, is, you know...
>> I think we should extend life to Mars, but it's basically a bunch of rocks. It's not as interesting as Earth. And so, yeah, we should... Uh, like, yeah, I think if you have curiosity... I think if those three things happen with AI, you're going to have a great future: the AI values truth, beauty, and curiosity.
>> If we all don't have to work in the future, and AIs are going in this direction and they're able to weave in all that we spoke about right now, do you think humanity goes back a couple of thousand years, to maybe the Greek times, where philosophy, or philosophizing, took up a lot of everyone's time?
>> You know, I think actually it took up less time than we think in the ancient Greeks, because it's just that the writings of the philosophers are what survived. But most of the time people were just, like, farming or, you know, chatting. So... and once in a while, quite rare, um, they would write down some philosophical work. It's just that that's all we have. We don't have the chat histories, you know. But most of it would have been, like, chat and, uh, farming, right? If you didn't farm, you're, like, going to starve.
>> In a lot of what you...
>> I mean, you know, when we read history, like, this battle and this battle and this battle, it seems like history must have been non-stop war, but actually, uh, most of the time it was not war. It was farming. That was the main thing, or hunting and gathering, you know, that kind of thing.
>> You love history, no?
>> Yeah. German history, World War II, World War I...
>> Yeah. World history. Yeah. I mean, I generally try to listen to as many, or read as many, history books and listen to as many history podcasts as possible.
as possible. >> Anything you'd like to recommend?
>> Anything you'd like to recommend? >> Well, there's this there's hardcore
>> Well, there's this there's hardcore history, which is quite good by Dan
history, which is quite good by Dan Colin. He's got
Colin. He's got >> I've read it. I have heard it.
>> I've read it. I have heard it. >> It's very he's got a great voice.
>> It's very he's got a great voice. >> Yeah.
>> Yeah. >> And and very compelling u narrator. Um
>> And and very compelling u narrator. Um there's um
there's um the uh the adventurers podcast. Um
the uh the adventurers podcast. Um >> there's the the the the books the story
>> there's the the the the books the story of civilization by Durant which is a
of civilization by Durant which is a long series of books very very deep.
long series of books very very deep. Those books take a long time to get
Those books take a long time to get through. Um
through. Um there's quite there there's a lot um out
there's quite there there's a lot um out there. Um, I I sort of like if you want
there. Um, I I sort of like if you want if you want something that's sort of
if you want something that's sort of gentle
gentle um a gentle bedtime podcast, I'd say the
um a gentle bedtime podcast, I'd say the history of English is quite a nice one
history of English is quite a nice one because it starts off with like gentle
because it starts off with like gentle tavern music
tavern music >> and uh very pleasant voice and he's like
>> and uh very pleasant voice and he's like talking about the story of old English
talking about the story of old English and then middle English and then later
and then middle English and then later English and
English and >> and where did all these words come from
>> and where did all these words come from >> and um you know one of the interesting
>> and um you know one of the interesting things about English is that it's
things about English is that it's somewhat of an open source language like
somewhat of an open source language like it actively tried to incorporate words
it actively tried to incorporate words from many other languages.
from many other languages. >> Mhm.
>> Mhm. >> So um you know whereas French sort of
>> So um you know whereas French sort of generally they fought the inclusion of
generally they fought the inclusion of words from other languages
words from other languages >> but English uh actively sought to
>> but English uh actively sought to include words from other languages sort
include words from other languages sort of kind of like an open source language.
of kind of like an open source language. So it as a result it has a very large
So it as a result it has a very large vocabulary. Um and large vocabulary
vocabulary. Um and large vocabulary allows for higher bandwidth
allows for higher bandwidth communication. Uh because you can use a
communication. Uh because you can use a word that would otherw you could use a
word that would otherw you could use a single word that might otherwise take a
single word that might otherwise take a sentence to convey.
sentence to convey. >> Why has podcasting become so big all of
>> Why has podcasting become so big all of a sudden?
a sudden? >> I think it's been big for a while. I
>> I think it's been big for a while. I mean aren't you a podcaster?
It's kind of new to me. >> Okay.
I was having this conversation with u the YouTube CEO and the Netflix CEO and
the YouTube CEO and the Netflix CEO and we were debating
we were debating >> what
>> what chemical is released in your brain when
chemical is released in your brain when you consume a movie for example
you consume a movie for example >> versus when you consume a podcast where
>> versus when you consume a podcast where you think like you're learning something
you think like you're learning something in the background. It it appears that
in the background. It it appears that they are two completely separate things.
they are two completely separate things. What do you think will happen tomorrow
What do you think will happen tomorrow to content, movies, podcasting?
to content, movies, podcasting? >> I mean, I think I think it's going to be
>> I mean, I think I think it's going to be overwhelmingly AI generated.
overwhelmingly AI generated. >> Yeah.
>> Yeah. >> Yeah.
>> Like, yeah. Real real time real time movies and video games, real
real time movies and video games, real real time video generation, I think, is
real time video generation, I think, is where things are headed. the nuance of
where things are headed. the nuance of having a scarred human being who you can
having a scarred human being who you can resonate with in a manner that you can't
resonate with in a manner that you can't with a AI for example.
with a AI for example. >> The AI could certainly emulate a scarred
>> The AI could certainly emulate a scarred human being quite well.
>> Um yeah, I mean the AI video generation that I'm seeing
the AI video generation that I'm seeing at XAI and from others is pretty
at XAI and from others is pretty impressive.
>> You know, we were looking at data around what industry is growing the fastest, and especially when we looked at the amount of time consuming movies versus, uh, time spent on social media, time spent on YouTube, what seems to be growing really fast are live events, all over again. Going to a physical...
>> Actually, I think live events... When digital media is ubiquitous, and you can just have anything digitally, you know, essentially for free or very close to free, um, then the scarce commodity will be live events.
>> Yeah.
>> Yeah.
>> Do you think that the premium for that will go up?
>> Yeah, I do.
>> Good industry to invest in.
>> Uh, yes. Yes. Cuz that will have more scarcity than anything digital.
anything digital. >> If you were a stock investor Elon
>> and you could buy one company which is not your own
not your own at the valuations of today
at the valuations of today to meet a capitalistic end and not an
to meet a capitalistic end and not an altruistic one which is good for the
altruistic one which is good for the world. What would you buy?
Um, I mean, I don't really I don't really,
I mean, I don't really I don't really, you know, buy stocks, you know, so it's
you know, buy stocks, you know, so it's not like I'm not I'm not like an
not like I'm not I'm not like an investor in I don't like look for things
investor in I don't like look for things to invest in. I just try to build
to invest in. I just try to build things. Um, and then there happens to be
things. Um, and then there happens to be stock of the company that I built. Um,
stock of the company that I built. Um, but I I don't I don't think about should
but I I don't I don't think about should I invest in this company or I don't have
I invest in this company or I don't have like a portfolio or anything.
like a portfolio or anything. Um
Um so
so I I I guess um
I I I guess um AI and robotics are going to be very
AI and robotics are going to be very important. Um
so I suppose it would be AI and robotics that that you know aren't related to me.
that that you know aren't related to me. Um,
Um, I think, you know, Google is going to be
I think, you know, Google is going to be pretty valuable in the future. They
pretty valuable in the future. They they've they've laid the groundwork for
they've they've laid the groundwork for an immense amount of uh value creation
an immense amount of uh value creation from an AI standpoint.
from an AI standpoint. Um,
Um, Nvidia is obvious at this point. Um, I
Nvidia is obvious at this point. Um, I mean, there's an argument that
mean, there's an argument that companies that do AI and robotics and
companies that do AI and robotics and maybe space flight are
maybe space flight are going to be overwhelming overwhelmingly
going to be overwhelming overwhelmingly the all the value almost all the value.
the all the value almost all the value. So that just the output of goods and
So that just the output of goods and services from AI and robotics is so high
services from AI and robotics is so high that it will dwarf everything else.
that it will dwarf everything else. >> The world seems to be moving to a place
>> The world seems to be moving to a place where
where everybody loves David and hates Goliath.
everybody loves David and hates Goliath. >> Why?
>> Why? >> Uh
>> Uh >> I mean he's the one that cooked the
>> I mean he's the one that cooked the stone in the forehead, you know.
stone in the forehead, you know. >> Yeah. Yeah.
>> Yeah. Yeah. >> Which honestly though that was just a
>> Which honestly though that was just a big mistake. You should have, you know,
big mistake. You should have, you know, either cover yourself entirely with
either cover yourself entirely with armor uh and and and make sure you've
armor uh and and and make sure you've got a missile weapon some kind. Um
got a missile weapon some kind. Um otherwise, your opponent is just
otherwise, your opponent is just obviously going to take a kite the boss
obviously going to take a kite the boss strategy.
Just kite the boss. I mean, you can run around in in a thong with a it doesn't
around in in a thong with a it doesn't matter, you know? It's never never going
matter, you know? It's never never going to catch you.
Yeah. >> Of all of all the people like uh you're
>> Of all of all the people like uh you're as much at risk of being looked upon as
as much at risk of being looked upon as Goliath.
>> Okay.
>> Especially the weekend after...
>> Nobody hits me in the forehead, you know.
>> Especially...
>> I'm not going to throttle around in the desert with too much armor, you know. It's too hard.
>> Yeah.
>> After the last meeting.
>> Yeah.
>> Yeah.
>> I sometimes think about people in the old days, you know, when, uh, you're supposed to, like, go into battle with all this armor. But it's like, let's say it's the middle of summer. I mean, it's so hot in that armor, you know, you'd be, like, sweltering, you know? It's like, at a certain point you're like, I'd rather die. If I have to wear this armor for one more hour in the hot sun... it's like, I'd rather die. Um, that's why the Romans had, like, you know, the skirts, you know, so they could get some air in there. You know, let's say you have to go to the bathroom and you're in armor. I mean, it's going to be pretty difficult. What are you going to do? Pause for a minute? Take your armor off? That's why the Romans had the skirts: it made, you know, going to the bathroom at least manageable.
>> You often make jokes.
>> I do? Me?
>> Yeah.
>> I like humor.
>> One could argue that...
>> I think we should legalize humor. What do you think?
>> Controversial stance. Is comedy going to be really hard for AI to get? Probably the last thing.
>> Um, Grok can be pretty funny.
>> Yeah. You know what I suspected? Like, this is a far-off extrapolation, but when I see you make jokes on Twitter, on X, and in, uh, interviews that you do, at some point I was like, maybe Elon has a model he's running in private, and he's testing out comedy, cuz the day that works, he knows it's there.
>> Uh, you know, AI can be pretty funny. Uh, so, like, if you ask Grok to do, like, a vulgar roast, it'll do a pretty good job. Yeah. Um, you say, even more vulgar, and just keep going. It's really going to get next-level. It's going to do unspeakable... Like, say, vulgar roast yourself, on Grok, and it's going to do unspeakable things to you.
>> What kind of comedy do you like?
>> Um, I guess I like absurdist humor.
>> Comedy always had a place...
>> Monty Python or something like that.
>> Comedy always had a place in society, wherein the role of the jester was so important to every kingdom, cuz they said things in a funny way that could not be said in a straight way.
>> Yeah, I guess so. Maybe we should have more jesters.
>> Yeah.
>> Is that what you're trying to do when you say something which is a joke? Say something you can't when you're not joking about it?
>> I just like humor, you know. Um, like, I think we should... uh, I like comedy. I think it's funny. People should laugh, you know. It's good to generate a few chuckles once in a while.
>> Yeah.
>> You know, I mean, we don't want to have a humorless society, you know. We'd be dry.
>> when you dry >> when you have a friend Elon. Uh with me?
>> when you have a friend Elon. Uh with me? >> Yeah. I mean,
>> Yeah. I mean, >> are you saying I have a friend?
>> When you hang out with your friends, who are you? Like, I know the...
>> I wish I had friends, you know, honestly. No, I do have friends. Yeah, I think so. Hope so. Yeah, sure. Yeah, we have a good laugh.
>> Yeah. What does it look like? Like, every group has a dynamic.
>> We talk, you know, we eat food sometimes. Um, you know, once in a while we swim in the pool, you know, normal things. I think there's, like, a limited set of things one can do with friends, you know. Chat, uh, discuss, yeah, you know, the nature of the universe.
>> What do you emotionally get out of friendship?
>> I don't know. I think the same thing anyone else would get out of friendship. Uh, you want to have, like, an emotional connection with other people. Um, and, um, you want to, I don't know, you want to talk about various subjects, and, yeah, I mean, I generally talk about a wide range of things about the nature of the universe. A lot of philosophical discussions. Um, you know, although we have come to the conclusion that we should not talk about, um, AI or the simulation...
>> Mhm.
>> ...at parties, because we just talk about it too much, you know, all the time.
>> So, I can't remember who it was, Aristotle or Plato, but they had a framework for how to pick a friend, based on respect and mutual admiration. But people don't pick friends like that. Uh, even me, I feel like I pick my friends based on people who say and think in a manner that I can resonate with.
>> Sure.
>> I wouldn't pick someone far out there, contrarian to my own belief systems, as a friend, because it would get tiring. Hanging out would get tiring. Are you like that? Do you pick friends who think like you, or do you look for the one who can debate you and be contrary to you?
>> I mean, I'm not sort of, you know, going on, like, friend.com.
>> It's sort of... yeah, I mean, I think it is just sort of people that you've resonated with somewhat, um, on an emotional and intellectual level. And, uh, yeah, I mean, and I guess a friend is someone who's going to support you in difficult times. I suppose a friend in need is a friend indeed. Like, if someone's still supporting you when the chips are down, that's a friend, you know. If somebody's only a fair-weather friend, those friends are useless. They're not real friends. So, like, everyone likes you when the chips are up, but who likes you when the chips are down?
>> With someone who has as many chips as you, would it matter?
>> I mean, it's relative, you know. Um, it's not just a chips thing. It's just, like, uh... yeah, I mean, there's sort of... popularity waxes and wanes.
>> This is interesting. Does it wax and wane only by virtue of the number of chips, or also by virtue of proximity to power? And which one is bigger of the two?
>> I don't know. Like, what is power, you know? Like, power to do what?
>> I would think, in the traditional sense, an elected position of power.
>> You mean how many gigawatts or whatever?
>> More like how many volts?
>> Yeah, like, it's voltage and amperage, you know. Don't touch the wires. You'll get a real feeling for power if you do that. Yeah, it's going to be very visceral, you know.
>> Uh, I know you like Nietzsche and Schopenhauer, and they...
>> I've read the books. Yeah. Yeah, sure.
>> I mean, you spoke about how your childhood was, uh...
>> Yeah, I was just trying to find answers to the meaning of life when I had, like, an existential crisis, I don't know, when I was, like, 12 or 13 or something, and...
>> They speak about the will to power.
>> Uh, sure. Um, I mean, he said a lot of controversial things, you know. I mean, he was sort of... I think he was, I mean, a bit of a troll, if you ask me, you know.
>> Are you a troll, huh? I mean, you just say controversial things to get a rise out of people.
>> Um...
>> He lived a miserable life and died early.
>> Did he?
>> Yeah.
>> Well, who says he lived a miserable life?
>> Uh, his sister, I think. She...
>> Okay. Well, maybe she didn't like him.
>> No, I think he got sick and he died. He got a disease.
>> I mean, allegedly syphilis or something, you know. But there's only one way to get that, you know. So he might have had some fun along the way.
>> I did want to ask you this. Uh, Milton Friedman speaks about the pencil.
>> What? Why? Why does he go on about pencils? I have to say that after Nietzsche and Schopenhauer, Friedman keeps talking about pencils. There he goes again with the pencils. He won't stop. I swear to God, if he talks about pencils one more time, I'm going to lose my mind.
>> What I find interesting about his pencil argument...
>> Yeah. Yeah. No, it's very difficult to make a pencil, you know.
>> In one place.
>> Think of all the things you have to do to make a pencil.
>> Yeah.
>> Like, the lead comes from one country, the wood comes from another country, the rubber from another. You've always been against tariffs, but...
>> Yeah, I mean, I think free trade is generally more efficient, you know. Uh, tariffs tend to, uh, create distortions in markets. And generally, think about any given thing. Would you want tariffs between you and everyone else at an individual level? That would make life very difficult. Would you want tariffs between each city? No, that would be very annoying. Um, would you want tariffs between each state within the United States? Like, no, that would be disastrous for the economy. Um, so then why do you want tariffs between countries?
>> I agree.
>> Yeah.
>> Yeah. >> How do you think
>> How do you think how do you think this plays out? What
how do you think this plays out? What happens next?
happens next? >> What with tariffs or what?
>> What with tariffs or what? >> I mean,
>> I mean, the president has made it clear he loves
the president has made it clear he loves tariffs. um you know I've tried to
tariffs. um you know I've tried to dissuade him from this point of view but
dissuade him from this point of view but unsuccessfully.
unsuccessfully. >> Yeah.
>> Yeah. >> Fair.
>> Fair. >> Yeah.
>> Yeah. >> The the relationship between business
>> The the relationship between business and politics uh I was having this
and politics uh I was having this conversation with someone and we were
conversation with someone and we were thinking which is the last how many
thinking which is the last how many large really big profitable businesses
large really big profitable businesses have been built in the last few decades
have been built in the last few decades without access to politics
without access to politics and Um, okay. Like I don't know. I have
and Um, okay. Like I don't know. I have Roly a lot. I don't know. Not everything
Roly a lot. I don't know. Not everything is politics.
is politics. >> Yeah.
>> Yeah. >> As when she gets a certain scale,
>> As when she gets a certain scale, politics finds you.
>> It's quite unpleasant. >> I was reading
>> I was reading I was reading this book about
I was reading this book about Michelangelo and he's
Michelangelo and he's >> the Teenage Mutant Ninja Turtles.
>> the Teenage Mutant Ninja Turtles. >> I used to watch that when I was a kid. I
>> I used to watch that when I was a kid. I still love it.
still love it. >> It's quite compelling.
>> It's quite compelling. >> Yeah. I still love it.
>> Yeah. I still love it. >> Yeah. Michelangelo, Leonardo, Rafael,
>> Yeah. Michelangelo, Leonardo, Rafael, and who was the fourth one? Donatello.
and who was the fourth one? Donatello. >> Yeah.
>> Yeah. >> Yeah.
>> Yeah. >> No, but about the sculptor, the artist.
And when he was sculpting, David, a politician comes up to him and says,
politician comes up to him and says, "The nose is too big."
"The nose is too big." >> So, you know what Michelangelo does?
>> So, you know what Michelangelo does? >> Total power.
>> So, Michelangelo pretended to work from his scaffolding. He threw some dust
his scaffolding. He threw some dust down. but didn't change anything and he
down. but didn't change anything and he said, "Okay, done." And the politician
said, "Okay, done." And the politician walked away happy. Is that how you deal
walked away happy. Is that how you deal with politics sometimes?
>> Um, you know, I've generally found that when I get involved in politics, it ends up badly. Um, so then I'm like, you know, um, I probably shouldn't do that. I should do less of that, is my conclusion.
>> Do you think that's true for all businessmen?
>> Yeah, probably. Yeah. Um, yeah. I mean, politics is a blood sport, you know? It's like, you enter politics, they're going to go for the jugular. Um, so, best to avoid politics where possible.
>> What did DOGE teach you, if you learned one thing?
>> Well, it was, like, a very interesting side quest, you know, because I got to see, like, a lot of the, you know, workings of the government. Um, and, uh, you know, there have been quite a few efficiencies. I mean, some of them are very basic efficiencies, like just adding in requirements for federal payments: that any given payment must have an assigned congressional payment code and a comment field with something in it that's more than nothing. That trivial-seeming change, my guess is, probably saves a hundred billion or even two hundred billion dollars a year, um, because there were massive numbers of payments going out with no congressional payment code and with nothing in the comment field, which makes auditing the payments impossible. So if you ask, like, why can the Defense Department, or now the Department of War, why can it not pass an audit? It's because the information is not there. The information necessary to pass an audit does not exist; that's the issue. So, um, a bunch of things were just very common-sense things that would be normal for any organization that cared about financial responsibility. That's most of what was done.

Um, you know, and it's still going on, by the way. DOGE is still happening. Um, but it turns out when you stop, uh, fraudulent and wasteful payments, the fraudsters don't, you know, uh, confess to this. They actually start yelling all sorts of nonsense, that you're stopping essential payments to needy people. Um, but actually you're not. Um, you know, we get this thing, like, saying, "Oh, you've got to send this payment for whatever. It's going to children in Africa." And I'm like, "Yeah, but then why are the wiring instructions for downtown Washington, DC? Because that's not Africa. So, can you please connect us with the recipients of this money in Africa?" And then there's silence. I'm like, "Okay, you know, we just want to literally talk to the recipients. That's it." And then they're like, "Oh, no, it turns out for some reason we can't talk to them." Like, well, we're not going to send the money unless we can talk to the recipients and confirm they will actually get it. And, you know, that's sort of... fraudsters will necessarily come up with a very, uh, you know, sympathetic argument. They're not going to say, "Give us the money for fraud." That's not what they're going to say. Obviously, they're going to try to make these sympathetic-sounding arguments that are false. They're going to start an NGO, and then...
>> Yeah, they're going to start an NGO.
>> It's going to be, like, the Save the Baby Pandas NGO, which, like, who doesn't want to save the baby pandas? They're adorable. Um, but then it turns out no pandas are being saved, okay, in this thing. Um, it's just going to a bunch of... it's just corruption, essentially. Um, and you're like, "Well, can you send us a picture of the panda?" They're like, "No." Okay, well, how do we know it's going to the pandas, then? That's all I'm saying. So...
>> What do you think of philanthropy?
>> Yeah, I mean, I agree with the love of humanity. Um, and I think we should, um, try to do things that help our fellow human beings. Um, but it's very hard. Like, if you care about the reality of goodness rather than simply the perception of it, it's very difficult to give away money well. Um, so I have a large foundation, but I don't put my name on it, and, you know, in fact, I say I don't want my name on anything. Um, but the biggest challenge I find with my foundation is trying to give money away in a way that is truly beneficial to people. Um, it's very easy to give money away to get the appearance of goodness. It is very difficult to give money away for the reality of goodness. Very difficult.
>> For a long time, the US had a lot of immigration, like really smart people coming into the country.
>> Yes.
>> We back home in India called it the brain drain. Uh, all our Indian-origin CEOs in, uh, Western companies.
>> Uh, yes, I think America has benefited immensely from, um, talented Indians who have come to America.
>> That seems to be changing now, though.
>> Yeah. I mean, yeah, America's been an immense beneficiary of talent from India.
>> Yeah. Why has that narrative changed of late? America seems to have become anti-immigration to a certain extent. Like, I was passing through immigration a couple of days ago, and I was worried whether they would stop me.
>> Um, well, I think there are different schools of thought. It's not like it's unanimous. But, um, you know, under the Biden administration, it was basically a total free-for-all, with, like, no border controls, and, you know, unless you've got border controls, you're not a country. Um, so, uh, you had massive amounts of illegal immigration under Biden. Um, and it actually also had somewhat of a negative selection effect. Um, so if there's a massive financial incentive to come to the US illegally and get all these government benefits, um, then you're necessarily going to create a diffusion gradient for people to come to the US. It's an incentive structure. Um, and so, uh, I think that obviously made no sense. Like, you've got to have border controls. It's kind of ridiculous not to.

Um, so the left wants to basically have open borders, no holds barred. You know, it doesn't matter what the situation is; it could be a criminal, doesn't matter. Um, then on the right, you've got, you know, uh, at least a perception that somehow their jobs are being taken, um, by talented people from other countries. Um, I don't know how real that is. Um, my direct observation is that there's always a scarcity of talented people. So, you know, from my standpoint, I'm like, we have a lot of difficulty finding enough talented people to get these difficult tasks done, and so more talented people would be good. Um, but I guess some companies out there are making it more of a cost thing, where it's like, okay, if they can employ someone for a fraction of the cost of, uh, an American citizen, then I guess these other companies would hire people, you know, just to save costs. But at my companies, the issue is we're just trying to get the most talented people in the world, and we pay way above average. So that's not my experience, but that's what a lot of people do complain about. Um, and I think there's been some misuse of the, you know, uh, H-1B program. It would certainly be accurate to say that, you know, some of the outsourcing companies have, uh, kind of gamed the system on the H-1B front, and we need to stop the gaming of the system, you know. Um, but, uh, I'm certainly not of the school of thought that we should shut down the H-1B program, which some on the right are. Um, I think they don't realize that that would actually be very bad.
>> If you could speak to the people of my country, India, the young entrepreneurs who want to build, and give them a message, what would you say?
>> Well, I think, uh, I'm a big fan of anyone who wants to build. So, I think anyone who wants to, you know, make more than they take has my respect. So, that's the main thing you should aim for: aim to make more than you take. Um, be a, you know, net contributor to society. Um, and it's kind of like the pursuit of happiness. You know, if you want to create something financially valuable, you don't pursue that directly. It's best to actually pursue providing useful products and services. If you do that, then money will come as a natural consequence, as opposed to pursuing money directly. Just like you can't sort of pursue happiness directly. You pursue things that lead to happiness, but there's not, like, a direct happiness pursuit. You do things like, uh, I guess, fulfilling work or study, or friends and loved ones, um, that as a result make you happy. So, that sounds very obvious, but, um, generally, if somebody's trying to make a company work, they should expect to grind super hard, uh, and expect that there's some meaningful chance of failure. Um, but just be focused on having the output be worth more than the input. Are you a value creator? That's what really matters. Uh, making more than you take.
>> I think that's a good way to end this.
Lauren is asking us to wrap up.
>> All right.
>> Uh, I'd also like to take the opportunity to thank my friend, uh, Manoj and IGF. He does a great job of connecting, I think, Indians, like the group here, with people like you, so that, among many things, we get to know each other and become friends, because once we are friends, maybe we can start working together. So thank you, Manoj, for putting this whole thing together, and thank you, Isaiah.
>> And thank you so much, Elon, for taking the time.
>> You're welcome.
>> Did you have fun?
>> Yeah, it was an interesting conversation. You know, sometimes people take these answers out of context, you know, but, uh, I think it was a good conversation.