YouTube Transcript: Joe Rogan Experience #2404 - Elon Musk
Core Theme
This podcast episode features a wide-ranging discussion, primarily focusing on the intersection of technology, politics, and societal trends, with a significant portion dedicated to the implications of AI and the future of work and society.
Joe Rogan podcast. Check it out.
>> The Joe Rogan Experience.
>> Train by day, Joe Rogan podcast by night, all day.
>> Exactly.
>> Just every morning.
>> Wonder what Jeff Bezos is doing.
>> He's definitely doing some testosterone. He looks jacked.
>> He looks jacked, right?
>> Yeah. But he didn't look like that quick. At age 59, in less than a year, he went from pencil-necked geek to looking like the Rock.
>> Yeah. Like a little miniature alpha fella.
>> Yeah. Like his neck got bigger than his head.
>> Yeah. He got bigger.
>> But in his earlier pictures his neck's like a noodle.
>> I support this activity. I like to see
him going in this direction
>> which is fine. And his voice dropped
like two octaves. I want you to move in
that direction as well.
>> I think we can achieve this.
>> I mean, I should
>> I think we can achieve
>> gigachad.
That's what people called it.
>> Where is that guy?
>> Bele. Uh I don't know where he is.
>> That's like a real guy.
>> The artist. Yeah.
>> No.
>> Oh, gigachad. Yeah. I don't know if that's a real guy. It's hard to say.
>> No, it is a real guy.
>> It is a real guy.
>> He's got the crazy jaw and like perfect
sculpted hair.
>> Yeah. Well, I mean, they may have exaggerated a little bit, but no, I think he actually just kind of looked like that in reality.
>> Wow.
>> Like, he's a pretty unique looking individual.
>> I think we can achieve this. That guy
right there, that's a real guy.
>> That's a real dude.
>> I always thought that was CGI.
>> No, I think the upper right one is not him. That's not
>> But that one to the left of that, like, that's real. No, that's artificial, bro. That's fake. That's got that uncanny valley feel to it, doesn't it?
>> It's not impossible.
>> No, it's not impossible to achieve, but it's not possible to maintain that kind of leanness. I mean, at that point he's dehydrating and all sorts of things.
>> Oh, it's based on a real person.
>> Yeah. Yeah. Based on,
>> right, but it's not a real person. What
does he really look like?
>> Like those images, I think, are [ __ ]
>> Some of them are. Is that real? Okay. That looks real. That looks like a really jacked bodybuilder.
>> Yeah.
>> Yeah, that looks real. Like that's achievable. But there's a few of those images where you're just like, "What's going on here?"
>> Yeah. Totally.
>> Well, I mean, you see
>> that guy, is that the
>> that's the real dude?
>> Well, there's that Icelandic dude who's Thor.
>> Oh, yeah. The guy who jumps in the
frozen lakes and [ __ ]
>> Well, the guy who played the Mountain.
>> Oh, that guy.
>> That is like a mutant strong human. Yes.
>> Like he would be in the X-Men or something, you know?
>> He's just not like, uh
>> And have you seen that meme, tent and tent bag?
>> You know how it's really hard to get the tent in
>> That's true.
>> Then there's a picture of him and his girlfriend.
That's hilarious.
>> Yeah, that's
>> I don't know how it gets in there, you know? It seems too small. But
>> I met Brian Shaw. Brian Shaw is like the world's most powerful man. He's almost 7 feet tall. He's 400 lb.
>> And his bone density is 1 in 500 million people. So there's like maybe 16 people like him.
>> He's an enormous human being. Like a legitimate giant, just like that guy. But we met him; he was hanging out with us in the green room of the Mothership. It's like, okay, if this was David and Goliath days, this is an actual giant, like the giants of the Bible.
>> Once in a while they get a super giant person.
>> This is a real one. Not a tall skinny basketball player, a 7 foot, 400 lb powerlifter.
>> You don't especially want to look at him. That's the guy. See if there's a photo of him standing next to, like, a regular human.
>> I was trying to get
>> There it is. That's him right there. There's one of him standing next to Arnold and stuff, and everyone just looks tiny.
>> I mean, I think he's a pretty cool dude actually.
>> Oh, Brian's very cool. Very smart, too. Unusually, you know; you expect anybody that big, it's got to be a [ __ ]
>> No.
>> Yeah. There's Andre the Giant, who was awesome.
>> He was great in Princess Bride and
>> No, he was just awesome, period.
>> Yeah. So, we were talking about this interview with Sam Altman and Tucker, and I was like, we should probably just talk about this on the air, because it is one of the craziest interviews I think I've ever seen in my life.
>> Yeah.
>> Where Tucker starts bringing up this guy who was a whistleblower, whatever.
>> A whistleblower who, you know,
>> committed suicide, but it doesn't look like it.
>> And he's talking to Sam Altman about this. And Sam Altman was like, "Are you accusing me?" He's like, "No, no, no, I'm not. I'm just saying I think someone killed him."
>> Yeah. And it should be investigated.
>> Yeah.
>> Not just drop the case.
>> It seems like
>> they just dropped the case. Yeah. But his parents think he was murdered.
>> Yeah.
>> The wires to a security camera were cut.
>> Blood in two rooms.
>> Blood in two rooms. Someone else's wig was in the room. And
>> someone else's wig.
>> Wig.
>> Wig. Yes. Not his wig.
>> Not normal to have a wig laying around.
>> Yes. And he ordered DoorDash right before allegedly committing suicide.
>> Which seems unusual, you know?
>> Yeah.
>> It's like, you know, I'm going to order pizza; on second thought, I'll kill myself. That seems like a very rapid change in mindset.
>> It's very weird. And especially the parents: they don't believe he committed suicide at all.
>> There's no note or anything.
>> No.
>> Yeah.
>> It seems pretty [ __ ] up. And you know, the idea that a whistleblower for an enormous AI company that's worth billions of dollars might get whacked, that's not beyond the pale.
>> I mean, it's straight out of a movie.
>> Right out of a movie, but right out of a movie is real sometimes.
>> Yeah. Right. Exactly.
>> It's a little weird. I think they should do a proper investigation. Like, what's the downside of a proper investigation?
>> Right.
>> No.
>> Yeah, for sure. But the whole exchange is so bizarre.
>> Yeah.
Sam Altman's reaction to being accused of murder is bizarre.
>> Look, I don't know if he's guilty, but it's not possible to look more guilty.
>> So, I'm like,
>> or look more weird.
>> Yeah.
>> You know, maybe it's just his social thing. Like, maybe he's just odd with confrontation and just goes blank, you know? But if somebody was accusing me of killing Jamie, like if Jamie was a whistleblower and Jamie got whacked, I'd be like, "Wait, are you accusing me of killing my friend? What the [ __ ] are you talking about?" I would be a little bit more irate.
>> Yeah. Exactly.
>> I would be a little upset.
>> Yeah. Well, you'd certainly insist on a thorough investigation. Yeah.
>> As opposed to trying to sweep it under the rug.
>> Yeah. I wouldn't assume that he committed suicide. I would be suspicious. If Tucker was telling me that aspect of the story, I'd be like, "That does seem like a murder. [ __ ] We should look into this."
>> I mean, all signs point to it being a murder. Not saying, you know, Sam Altman had anything to do with the murder, but, uh
>> Blood in two rooms.
>> Blood in two rooms. Like, yeah, there's the wires to the security camera, the DoorDash being ordered right before the suicide, no suicide note. His parents think he was murdered, and the people that I know who knew him said he was not suicidal. So I'm like, why would you jump to the conclusion
>> The parents sued the
>> uh, landlord?
>> They sued their son's landlord, alleging the owners and managers of their son's San Francisco apartment building were part of a widespread cover-up of his death.
>> The landlord
>> Yeah. There's a bunch of weird things. They said there were packages missing from the building. Some people said they saw packages still being delivered, and all of a sudden they all disappeared.
>> Huh. But that could be package theft. People steal people's packages all the time.
>> The porch pirate situation.
>> Yeah.
>> Says they failed to safeguard.
>> Also, I mean, the amount of trauma those poor parents have gone through with their son dying like that. I mean, it must
>> God bless them. How could they stay sane after something like that? They're probably so grief-stricken. Who knows what they believe at this point.
>> Yeah. It's like the Epstein thing. Yeah, that's the Kash Patel thing, Kash Patel and Dan Bongino trying to convince everybody of that. Like, okay.
>> The guards weren't there and the cameras stopped working and, um
>> you know,
>> the guards were asleep. The cameras weren't working. He had a giant steroided-up bodybuilder guy that he was sharing a cell with, who was a murderer and a bad cop. All of it's kind of nuts. All of it's kind of nuts that he would just kill himself rather than reveal all of his billionaire friends.
>> Yeah.
>> And then
>> did you see Tim Dillon talking to Chris Cuomo about this?
>> I did. He liked the idea.
>> Chris Cuomo just looked so stupid.
>> Tim just listed off all the
>> Tim just, and he's like, I agree, it is strange. Like, of course it's strange, Chris. Jesus Christ. You can't just go with the tide. You got to think things through. And if you think that one through, you're like, I don't think he killed himself. Nobody does. You'd have to work for an intelligence agency to think he killed himself.
>> It does seem unlikely.
>> It seems highly unlikely. Highly, highly unlikely. All roads point to murder.
>> Yes.
>> Point to: they had to get rid of him because he knew too much. Whatever the [ __ ] he was doing, whatever kind of an asset he was, whatever thing he was up to, you know, was apparently very effective.
>> Yes. And a lot of people are compromised.
You see, your boy Bill Gates is now saying climate change is not a big deal. Like, relax, everybody. I know I scared the [ __ ] out of you for the last decade and a half, but ah, we're going to be fine.
>> Yeah. I mean, you know, as I was saying just before coming into the studio, every day there's some crazy wild new thing that's happening. It feels like reality is accelerating.
>> It's every day. And every day it's more and more ridiculous, to the point where the simulation is more and more undeniable.
>> Yeah. It really feels like a simulation, you know? It's like, come on, what are the odds that this could be the case?
>> Are you paying attention at all to 3I/ATLAS? Are you watching the
>> the comet?
>> Yeah. Whatever it is.
>> Yeah. I mean, one thing I can say is, look, if I was aware of any evidence of aliens, Joe, you have my word: I will come on your show and I will reveal it on the show.
>> Okay.
>> Yeah, that's a good deal.
>> Yeah, it's pretty good.
>> I'll believe you. Yeah, thank you.
>> I keep my promises. So, um
>> All right. I'll hold you to that.
>> Yeah. And I'm never committing suicide, to be clear.
>> I don't think you would either.
>> So, on camera, guys, I am never committing suicide, ever.
>> If someone says you committed suicide, I will fight tooth and nail.
>> I will fight tooth and nail. I will not believe it. The thing about 3I/ATLAS is it's
>> a hell of a name, actually.
>> Yeah, 3I sounds like third eye or something.
>> Yeah, it does. The 3I is "third": it's only the third interstellar object that's been detected.
>> Okay.
>> Yeah. Oumuamua.
>> Yeah. Avi Loeb was on the podcast a couple days ago talking about it.
>> Yeah. It could be. I don't know. But I
>> Apparently today they're saying that it's changed course.
>> Did you see that, Jamie?
>> Avi said something today. I'll send it to you.
>> Here you go, Jamie. I'll send it to you right now. It's fascinating. It's fascinating also because it's made almost entirely of nickel, whatever it is, and the only way that exists here is industrial alloys, apparently.
>> There are definitely comets and asteroids that are made primarily of nickel, in fact. Yeah. So the places where you mine nickel on Earth are actually where there was an asteroid or comet that hit Earth, a nickel-rich deposit.
>> Yeah, that's where it's coming from. Those are from impacts. You definitely didn't want to be there at the time, because anything there would have been obliterated. Right. But that's where the sources of nickel and cobalt are these days.
>> So, this is Avi Loeb, a few hours ago: the first hint of non-gravitational acceleration, that something other than gravity is affecting its acceleration, meaning something is affecting its trajectory beyond gravity, was indicated. Interesting. So it's mostly nickel, very little iron, which he was saying on Earth only exists in alloys. But whatever, you know, you're dealing with another planet.
>> There are cases where there are very nickel-rich asteroids and meteorites, heavy things from space.
>> Yeah, it just means it'd be a very heavy spaceship if you made it all out of nickel. Oh yeah.
>> And [ __ ] huge. The size of Manhattan and all nickel. That's kind of nuts.
>> Yeah, that's a heavy spaceship.
>> That's a real problem if it hits.
>> Uh, yes. It would obliterate a continent, type of thing.
>> Maybe worse.
>> Probably kill most of human life.
>> Um,
>> if not all of us.
>> It depends on what the total mass is. But the thing is, in the fossil record there are arguably five major extinction events, the biggest of which is the Permian extinction, where almost all life was eliminated. That actually occurred over several million years. Then there's the Jurassic; I think that one's pretty definitively an asteroid. So there have been five major extinction events, but what they don't count are the ones that merely take out a continent.
>> So
>> merely
>> Yeah. Because those don't really show up in the fossil record, you know,
>> right?
>> So unless it's enough to cause a mass extinction event throughout Earth, it doesn't show up in a fossil record that's 200 million years old. But there have been many impacts that would have destroyed all life on, say, half of North America or something like that. There have been many such impacts through the course of history.
>> Yeah. And there's nothing we can do about it right now.
>> Yeah. There was one that hit Siberia and destroyed, I think, a few hundred square miles.
>> Oh, that's the Tunguska event.
>> Yeah. That's the one from the 1920s, right?
>> Yeah. That's the one that coincides with that comet storm we go through every June and every November, that they think is responsible for the Younger Dryas.
>> Yeah, all that shit's crazy. Um, thank you, before we go any further, for letting us have a tour of SpaceX and letting us be there for the rocket launch.
>> One of the absolute coolest things I've ever seen in my life. I thought it was only, like, half a mile; Jamie's like, it was a mile away. Turned out it's almost two miles away. And you feel it in your chest.
>> Yeah.
>> You have to wear earplugs, and you feel it in your chest, and it's two miles away.
>> It was [ __ ] amazing. And then to go with you up into the command center and watch all the Starlink satellites, with all the different cameras, in real time, as it made its way all the way to Australia. How many minutes? Like 35, 40 minutes.
>> Yeah.
>> Wild. It touched down near Australia.
>> Yeah.
>> [ __ ] crazy. It was amazing.
>> Yeah.
>> Absolutely amazing. The Starship's awesome. And anyone can go watch the launch, actually. You can just go to South Padre Island, which has a great view of the launch. It's where a lot of spring breakers go.
>> But we'll be flying pretty frequently out of Starbase in South Texas. And we formally incorporated it as a city, so it's actually a legal city: Starbase, Texas.
>> It's not that often you hear, like, hey, we made a city, you know. In the old days, a startup would be you go and gather a bunch of people and say, "Hey, let's go make a town." Literally, that would have been a startup in the old days.
>> Or a country.
>> Yeah. Or a country.
>> Yeah.
>> Yeah. Actually,
>> if you tried doing that today, there'd be a real problem.
>> Yeah. Things are so set in stone on the country front these days. You might pull it off. If you got a solid island, you might be able to pull it off.
>> You know, it's probably,
>> you know, like Larry Ellison owns Lanai.
>> Yeah, if you put enough effort into it, you could probably make a new country.
>> This is one of the different ones. This is one of the ones that you catch,
>> right? Or is that one?
>> Yeah, that's the booster. So that's the Super Heavy booster. The booster's got 33 engines, and by Version 4 it will have about 10,000 tons of thrust; right now it's about 7,000 to 8,000 tons of thrust. That's the largest flying object ever made.
>> I had to explain to someone. They were going, "Why do they blow up all the time if he's so smart?" Because there was this [ __ ] idiot on television. Some guy was being interviewed and they were talking about you, and he goes, "Oh, I think he's a fuckwit." And he goes, "Why do you say he's [ __ ]?" "Oh, his rockets keep blowing up." And I had to explain: it's the only way you find out what the tolerances are. You have to
>> explore the corners of the box. So when you do a new rocket development program, you have to do what's called exploring the limits, the corners of the box, where you say worst case this, worst case that, to figure out where the limits are. So you blow up; admittedly, in the development process it sometimes blows up accidentally. But we also intentionally subject it to a flight regime that is much worse than what we expect in normal flight, so that when we put people on board, or valuable cargo, it doesn't blow up. So, for example, for the flight that you saw, we deliberately took heat shield tiles off of Starship in some of the worst locations, to see: if we lose a heat shield tile here, is it catastrophic or is it not? And Starship was nonetheless able to do a soft landing in the Indian Ocean, just west of Australia. And it got there from Texas in like, I don't know, 35, 40 minutes, type of thing.
>> So it landed even though you put it through this situation where it has a compromised shield.
>> It had an unusually hot reentry. We brought it in on an extra hot trajectory, with missing tiles, to see if it would still make it to a soft landing, which it did. Now, I should point out there were some holes that were burnt into it. But it was robust enough to land despite having some holes burnt into it, because it's coming in like a blazing meteor. You can see the real-time video.
Well, tell me the speed again, because the speed was bananas. You were talking about
>> Yeah, it's like 17,000 mph, like 25 times the speed of sound or thereabouts. So think of it like this: it's like 12 times faster than a bullet from an assault rifle. A bullet from an assault rifle is around Mach 2.
>> And it's huge.
>> Yeah.
Yeah. Or if you compare it to a bullet from a .45 or a 9mm, which is subsonic, it'll be about 30 times faster than a bullet from a handgun.
>> 30 times faster than a bullet from a handgun, and it's the size of a skyscraper.
>> Yes.
>> Yeah. That's fast.
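Those speed comparisons hold up to quick arithmetic. A minimal sanity check, assuming a sea-level speed of sound of about 767 mph and typical muzzle velocities (reference values I've assumed, not figures from the conversation):

```python
# Rough sanity check of the reentry-speed comparisons above.
# Reference values below are assumptions, not from the conversation.
MPH_PER_FTPS = 3600 / 5280          # feet-per-second to miles-per-hour

speed_of_sound_mph = 767            # sea level, ~20 C
reentry_mph = 17_000                # quoted Starship reentry speed

rifle_mph = 2 * speed_of_sound_mph  # assault-rifle bullet, "around Mach 2"
handgun_mph = 850 * MPH_PER_FTPS    # subsonic .45/9mm round, ~850 ft/s

print(round(reentry_mph / speed_of_sound_mph))  # ~22 at sea level
print(round(reentry_mph / rifle_mph))           # ~11, close to the quoted "12 times"
print(round(reentry_mph / handgun_mph))         # ~29, close to the quoted "30 times"
```

The "25 times the speed of sound" figure is plausible because the speed of sound drops with altitude; against the sea-level value the ratio is closer to 22.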
>> It's so wild to see, man. It's so exciting. The factory is so exciting too, because genuinely, no [ __ ], I felt like I was witnessing history. I felt like it was a scene in a movie where someone had expectations, like, what are they doing? They're building rockets. And you go there, and as we were walking through, Jamie, you could speak to this too. Didn't you have the feeling where you're like,
>> oh, this is way bigger than I thought it was. This is huge. Awesome.
>> Gigantic.
>> [ __ ] crazy.
>> That's what she said. The amount of rockets you're making. I don't know if you
>> Tent bag.
>> Gigachad in the house.
>> This is way big.
>> It's a giant metal dick. You're [ __ ] the universe with your giant metal dick.
>> I mean, yeah, it is. It is very big.
>> And the sheer number of them that you guys are making. And then this is a version, and you have a new updated version that's coming soon.
>> And what is the
>> It's a little longer.
>> More pointy.
>> It's the same amount of pointy, but it's got a bit more length. The interstage, you see that interstage section with kind of like the grill area.
>> Mhm.
>> That's now integrated with the boost stage. So we do what's called hot staging, where we light the ship engines while it's still attached to the booster. So the booster engines are still thrusting; the ship is still being pushed forward by the booster. But then we light the ship engines, and the ship actually pulls away from the booster even though the booster engines are still firing.
>> Whoa.
>> So it's blasting flame through that grill section, but we integrate that grill section into the boost stage with the next version of the rocket. And that version of the rocket will have the Raptor 3 engines, which are a huge improvement. You may have seen them in the lobby, because we've got the Raptor 1, 2, and 3, and you can see the dramatic improvement in simplicity. We should probably put a plaque there to also show how much we reduced the weight and the cost, and improved the efficiency and the thrust. The Raptor 3 has almost twice the thrust of Raptor 1.
>> Wow.
>> So you see Raptor 3, it looks like it's got parts missing. Right.
>> And how many
>> It's very, very clean.
>> How many of them are on the rocket?
>> There's 33 on the booster.
>> Whoa.
>> And each Raptor engine is producing twice as much thrust as all four engines on a 747.
Wow. So that engine is smaller than a 747 engine, but it's producing almost 10 times the thrust of a 747 engine. So, extremely high power-to-weight ratio.
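The per-engine figures here are internally consistent. A quick cross-check, taking roughly 7,500 tons as the total booster thrust (the mid-range of the figure quoted above) and assuming a typical 747 engine thrust of about 63,000 lbf (a reference value I've assumed, not from the conversation):

```python
# Cross-check of the Raptor vs. 747 thrust comparison above.
booster_thrust_tons = 7_500      # mid-range of the quoted 7,000-8,000 tons
n_engines = 33
per_raptor_tons = booster_thrust_tons / n_engines    # ~227 tons per Raptor

LBF_PER_TON = 2_204.6            # pounds-force per metric ton-force
b747_engine_tons = 63_000 / LBF_PER_TON              # ~28.6 tons (assumed value)
b747_four_engines_tons = 4 * b747_engine_tons        # ~114 tons

print(round(per_raptor_tons / b747_four_engines_tons, 1))  # ~2.0: "twice all four engines"
print(round(per_raptor_tons / b747_engine_tons))           # ~8: ballpark of "almost 10 times"
```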
>> And so when there's
>> 33 of them
>> when you're designing these, you get to Raptor 1, you see its efficiency, you see where you can improve it, you get to Raptor 2. How far can you scale this up with just the same sort of technology, with propellant and ignition and engines? Like, how much further can you
>> I mean, we're pushing the limits of physics here. And really, in order to make a fully reusable orbital rocket, which no one has succeeded in doing yet, including us. But Starship is the first time that there is a design for a rocket where full and rapid reusability is actually possible. There has not even been a design before where it was possible, certainly not a design that got any hardware made at all. We just live on a planet where the gravity is really quite high. If the gravity was even 10 or 20% higher, we'd be stuck on Earth forever; we certainly couldn't use conventional rockets. You'd have to blow yourself off the surface with a nuclear bomb or something crazy. But on the other hand, if Earth's gravity was just a little lower, even 10 or 20% lower, then getting to orbit would be easy. So if this was a video game, it's set to maximum difficulty, but not impossible.
>> Okay.
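The "maximum difficulty" point follows from the Tsiolkovsky rocket equation: the propellant a rocket needs grows exponentially with the delta-v required for orbit, which scales with surface gravity. A minimal sketch of that sensitivity; the exhaust velocity and delta-v numbers are illustrative assumptions, not figures from the conversation, and the simple gravity scaling assumes a planet of the same density:

```python
import math

# Tsiolkovsky rocket equation: delta_v = v_e * ln(m0 / m_final),
# so the required initial-to-final mass ratio is exp(delta_v / v_e).
V_EXHAUST = 3_400      # m/s, illustrative chemical-rocket exhaust velocity (assumed)
DV_EARTH = 9_400       # m/s, typical delta-v to low Earth orbit incl. losses (assumed)

def mass_ratio(delta_v: float, v_exhaust: float) -> float:
    """Initial-to-final mass ratio the rocket equation demands."""
    return math.exp(delta_v / v_exhaust)

# For a planet of equal density, orbital delta-v scales roughly in
# proportion to surface gravity, so scale DV_EARTH by a gravity factor.
for g_factor in (0.8, 1.0, 1.2):
    ratio = mass_ratio(DV_EARTH * g_factor, V_EXHAUST)
    print(f"gravity x{g_factor}: required mass ratio ~{ratio:.0f}")
```

Under these assumptions, 20% more gravity nearly doubles the required mass ratio, pushing structural mass fractions that are merely hard today past what chemical rockets can build, which is the sense in which a slightly heavier Earth would strand us.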
>> So that's where we are. It's not as though others have ignored the concept of reusability; they've just concluded that it was too difficult to achieve. And we've been working on this for a long time at SpaceX. You know, I'm the chief engineer of the company, although I should say that we have an extremely talented engineering team. I think we've got the best rocket engineering team that has ever been assembled; it's an honor to work with such incredible people. So it's fair to say that we have not yet succeeded in achieving full reusability, but we at last have a rocket where full reusability is possible, and I think we'll achieve it next year. That's a really big deal. And the reason that's such a big deal is that full reusability drops the cost of access to space by a factor of a hundred, maybe even more than a hundred actually.
Could be like a thousand. You can think of it like any mode of transport. Imagine if aircraft were not reusable. The way conventional rockets work, it would be like if you had an airplane, and instead of landing at your destination, you parachute out, the plane crashes somewhere, and you land on a parachute at your destination. That would be a very expensive trip, and you'd need another plane to get back. But that's how the other rockets in the world work. Now, the SpaceX Falcon rocket is the only one that is at least mostly reusable. You've seen the Falcon rocket land; we've now done over 500 landings of the Falcon 9 rocket. And this year we'll deliver probably somewhere between 2,200 and 2,500 tons to orbit with the Falcon 9 and Falcon Heavy rockets, not counting anything from Starship.
>> And this is mostly Starlink. Yes, mostly Starlink, but we launch many others; we even launch competitors to Starlink on Falcon 9. We charge them the same price. Pretty fair. But SpaceX this year will deliver roughly 90% of all Earth's mass to orbit.
>> Wow.
>> And then of the remaining 10%, most of that is done by China, and the remaining roughly 4% is everyone else in the world, including our domestic American competitors.
>> You know, it's kind of incredible how many things are in space. Like, how many things are floating above us now?
>> There's a lot of things.
>> Yeah.
>> Is there though?
>> Right.
But is there a saturation point where we're going to have problems with all these different satellites?
>> I think as long as the satellites are maintained, it'll be fine. Space is very roomy. You can think of space as being concentric shells around the surface of the Earth. So there's the surface of the Earth, but then there's a series
>> much larger.
>> Yeah. It's like a series of concentric shells.
>> And think of an Airstream trailer flying around up there. There's a lot of room for Airstreams.
>> Yeah. I mean, imagine if there were just a few thousand Airstreams on Earth.
>> Yeah.
>> What are the odds that they'd hit each other? You know,
>> they wouldn't be very crowded. No. And then you got to go bigger.
>> Yeah.
>> Because you're dealing with far above Earth.
>> Hundreds of miles above Earth.
>> Yeah.
>> Yeah. So the goal of SpaceX is to get rocket technology to the point where we can extend life beyond Earth: that we can establish a self-sustaining city on Mars and a permanent base on the Moon. That would be very cool. I mean, imagine if we had, like, a Moonbase Alpha, where there's a permanent science base on the Moon.
>> That would be pretty dope. Or at least a tourist trap.
>> I mean, a lot of people would be willing to go to the Moon just for a tour, that's for sure. We could probably pay for our space program with that, you know.
>> Probably. Yeah. Well,
>> because if you could go to the Moon safely,
>> uh
I think we'd get a lot of people who would pay for that, you know.
>> Oh, 100%. After the first year, after nobody died for like
>> Yeah. Just make sure. Exactly. Are you going to come back? Yeah.
>> Because, like, that submarine. They had a bunch of successful dives in that private submarine before it imploded and killed everybody. That was not a good design, obviously.
>> It was a very bad design. Terrible design.
>> And the engineers said it would not withstand the pressure of those depths. There were a lot of whistleblowers in that company too.
>> Yeah. Um, they made that out of,
uh, carbon fiber, which doesn't make
any sense, because, um, you actually
need to be dense to go down. Um, in
any case, just make it out of steel. If
you make it out of, uh, just, you
know, a big steel casting,
you'll be safe.
>> Why would they make it out of carbon fiber
then? Is it cheaper?
>> Um, I think they think carbon fiber
sounds cool or something. But, uh,
>> it does sound cool.
>> It sounds cool, but, um, because it's
such low density, you
actually have to add extra mass to go
down. But
if you just have a giant, you know,
hollow ball bearing, uh, you're going to
be fine.
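The density argument is just buoyancy arithmetic. A minimal sketch; the hull dimensions and material densities here are illustrative assumptions, not any real submersible's specifications:

```python
import math

SEAWATER = 1025.0  # approximate density of seawater, kg/m^3

def avg_density(outer_r, wall, rho_material):
    """Average density of a hollow sphere; interior air mass ignored.

    A hull sinks without ballast only if this exceeds seawater's density.
    """
    outer_v = (4 / 3) * math.pi * outer_r ** 3
    inner_v = (4 / 3) * math.pi * (outer_r - wall) ** 3
    return (outer_v - inner_v) * rho_material / outer_v

# Same hull geometry (1 m radius, 13 cm wall -- assumed), two materials:
carbon = avg_density(1.0, 0.13, 1600.0)  # carbon fiber, ~1600 kg/m^3
steel = avg_density(1.0, 0.13, 7850.0)   # steel, ~7850 kg/m^3

for name, rho in [("carbon fiber", carbon), ("steel", steel)]:
    verdict = "sinks on its own" if rho > SEAWATER else "floats; needs added ballast"
    print(f"{name} hull: average {rho:.0f} kg/m^3 -> {verdict}")
```

With identical geometry, the carbon-fiber hull averages well under seawater's density and needs extra mass bolted on to descend, while the steel hull is denser than the water on its own.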
>> Speaking of carbon fiber, did you check
out my Unplugged Tesla out there?
>> Yeah, it's cool.
>> Pretty sick, right? Have you guys
ever thought about doing something like
that? Like having, like, an AMG division
of Tesla, where you do, like, custom stuff?
>> Um,
I think it's best to leave that to the
custom shops. Uh, you know, Tesla's
focus is autonomous cars. Um, building
kind of futuristic
autonomous cars. Um, so,
like, I think we want the future to
look like the future. Um, so, did
you see our designs for,
like, the sort of robotic bus? It
looks pretty cool.
>> The robotic bus is also totally autonomous,
but it looks cool. It's
very art deco. It's, like,
futuristic art deco. Um, and, um,
it's, like, I think we want to
change the aesthetic over time. You
don't want the aesthetic to be constant
over time. You want to evolve the
aesthetic. Um, so, um, you know, like, I
have a son who's, you
know, he's, like, even more
autistic than me, and, uh,
he has these great observations.
>> Who is this?
>> Saxon. He has these great observations
about the world, uh, because he just
views the world through a different lens
than most people. Um, and he was
like, "Dad, why does the world look like
it's 2015?"
>> And I'm like, "Damn, the world does look
like it's 2015." Like, the aesthetic has
not evolved since 2015.
>> Oh, that's what it looks like.
>> Yeah.
>> Oh, wow.
>> That's pretty cool.
>> Oh, yeah. That's, like,
you'd want to see that going down
the road, you know?
>> Yeah. You'd be like, "Okay,
we're in the future." You know, it
doesn't look like 2015.
>> What is that ancient science fiction
movie? Like one of the first science
fiction movies ever. Is it Metropolis?
Is that what it is?
>> Yeah. Yeah.
>> Yeah. That looks like it belongs in Metropolis.
>> Yeah. Yeah. It's futuristic art deco.
>> All right. Yeah. Well, that's cool that
you're concentrating on the aesthetic. I
mean, that's kind of the whole deal with
Cybertruck, right? Like, it didn't have
to look like that.
>> No, I just wanted to have
something that looked really different.
>> Is it a pain in the ass for people to
get it insured, because it's all solid
steel, and
>> Um, I hope it's not too much. You know,
Tesla does offer insurance, so people can
always get it insured at Tesla.
>> Um, well, but the form does
follow function in the case of the
Cybertruck, because, um, as you
demonstrated with your
armor-piercing arrow. Because if you
shot that arrow at a regular truck, I mean,
>> Exactly, you would have found your
arrow in the wall. Yeah. Um, you know,
at the very least it would have buried
itself into one of the seats.
>> Yeah. Yeah. But, like, with enough
bow velocity and the right
arrow, it would go through both doors of a
regular truck and land in the wall.
>> If there was a clear shot between both
doors, it probably would have passed
right through.
>> Exactly. Um, but, you know, the
arrow shattered on the Cybertruck because
it's ultra-hard, uh, stainless. Mhm.
>> Um, so, and I thought it'd be
cool to have a, you
know, a truck that is bulletproof to a
subsonic projectile. Um, so, you
know, especially in this day and age,
you know, if the
apocalypse happens, you're going to want
to have a bulletproof truck, you know.
Um, so, then, because it's made
of ultra-hard stainless, you can't
just stamp the panels. You can't
just put it in a stamping press, because it
breaks the press.
So, it has to be planar,
um, because it's so difficult to bend;
it breaks the machine that bends
it. Um, that's why it's
so planar, and it's not, uh,
you know, curved. It's because it's
bulletproof steel.
>> So it is, like, boxy as opposed to, like,
curved and
>> Yeah. In order
to make the curved shapes in a
regular truck or car, you take, uh,
basically mild steel, thin and annealed.
You take that mild, thin, annealed
steel, you put it in a stamping press,
and it just smooshes it into
whatever shape you want. But the Cybertruck is
made of ultra-hard stainless, um,
and so you can't stamp it,
because it would break the stamping press.
So, even bending it is hard. Even
to bend it to, uh, its current position,
we have to way overbend it, um, so
that when it springs back,
it's in the right position.
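The overbend-for-springback step can be sketched as a simple compensation calculation. The retained-bend factors below are illustrative assumptions, not actual forming-process data:

```python
# Springback compensation sketch: high-strength steel partially
# "un-bends" elastically after forming, so the tool must overbend.
# The factors below are illustrative, not real process data.

def required_tool_angle(target_deg, retained_fraction):
    """Angle the press brake must bend to, so the part relaxes to target.

    retained_fraction = fraction of the bend kept after release
    (e.g. 0.80 means the part springs back by 20%).
    """
    return target_deg / retained_fraction

# Mild steel retains most of its bend; ultra-hard stainless much less.
for name, retained in [("mild steel", 0.97), ("hard stainless", 0.80)]:
    tool = required_tool_angle(90.0, retained)
    print(f"{name}: bend to {tool:.1f} deg to end up at 90.0 deg")
```

The harder the alloy, the more elastic recovery it has, so the overbend required to land on the target angle grows accordingly.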
Um, so, I don't know, I
think it's a unique aesthetic. Um, and
you say, "Well, what's cool about a
truck?" Trucks
should be, I don't know, manly. They
should be macho, you know, and
bulletproof is maximum macho.
>> Are you married to that shape now? Like,
can you do anything to change it
as you get further along? Like, I know you
guys updated the 3 and the Y. Did
you update the Y as well?
>> Yes, the 3 and the Y, uh, are
updated. Um, you know,
there's a screen in the back
that the kids can watch, for
example, in the new 3 and Y. Um, so in
the new Y, um, you know,
there are, like,
hundreds of improvements. Like, we keep
improving the car. Um, and even the
Cybertruck, you know, we keep
improving it. Um, but, um,
you know, I wanted to just do something
that looked unique, and the
Cybertruck looks unique and has unique
functionality. And there
were three things, as I recall:
let's make it bulletproof; uh, let's,
uh, make it faster than a Porsche 911.
Uh, and we actually cleared the quarter
mile. The Cybertruck, uh, can
clear a quarter mile while towing a
Porsche 911, faster than a Porsche 911.
Um, it can out-tow an F-350 diesel.
>> Really?
>> Yes.
>> What are the tow limitations?
>> I mean, we could tow, like, you know, a
747 with the Cybertruck. The Cybertruck
is insane. Like, it is alien
technology. Okay. Um, because it shouldn't
be possible to be, uh, that big and that
fast. It's like an
elephant that runs like a cheetah.
>> Yeah. Because it's 0 to 60 in less than
3 seconds, right?
>> Yes.
>> Yeah. And it's enormous. What does it
weigh? Like 7,000 lbs?
>> Uh, yeah, there are different
configurations, but it's about that.
Uh, it's a beast.
>> Yeah.
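Those two round numbers imply a striking power figure. A back-of-the-envelope check using only the weight and time quoted above, ignoring drag, rolling resistance, and drivetrain losses:

```python
# Back-of-the-envelope: average power to take ~7,000 lb to 60 mph in ~3 s.
# Uses only the round numbers quoted in the conversation; real peak power
# is higher, since drag and drivetrain losses are ignored here.

mass_kg = 7000 * 0.4536  # ~7,000 lb curb weight, in kilograms
v = 60 * 0.44704         # 60 mph, in meters per second
t = 3.0                  # seconds for the 0-60 run

kinetic_energy = 0.5 * mass_kg * v ** 2      # joules at 60 mph
avg_power_kw = kinetic_energy / t / 1000     # average power over the run

print(f"average power ~ {avg_power_kw:.0f} kW "
      f"({avg_power_kw * 1.341:.0f} hp), before any losses")
```

Even this lower bound lands in the several-hundred-kilowatt range, which is why the combination of that much mass and that acceleration reads as so unusual.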
>> Um, so, and it's got
four-wheel steering. So the rear
wheels steer, too. So it's got
a very tight turning radius.
>> Yeah. We noticed that when we drove
one to Starbase.
>> Yeah. Very tight turning radius.
>> Yeah. Pretty sick.
>> Yeah.
>> Are you still doing the Roadster?
>> We're getting close to
demonstrating the prototype,
and I think this will be
One thing I can guarantee is
that this product demo will be unforgettable.
Whether it's good or bad,
it will be unforgettable. Um,
>> Can you say more? What do you mean?
>> Well, you know, my friend Peter Thiel,
um, you know, uh, once reflected that
the future was supposed to have flying cars.
>> So, you're going to be able to fly?
>> Well, I mean, uh,
I think if Peter wants a flying car, he
should be able to buy one.
>> So, are you actively considering
making an electric flying car? Is this,
like, a real thing?
>> Well, we have to see in the demo.
>> So, when you do this, like,
are you going to have a retractable
wing? Like, what is the idea behind this?
Don't be sly. Come on.
>> I can't, I can't, uh, do the unveil
before the unveil. Um, but, um,
>> tell me off air, then.
>> Look, I think it has a shot at
being the most memorable,
um, product unveil ever.
It has a shot.
>> And when do you plan on doing this?
What's the goal?
>> Uh, hopefully before the end of the year.
>> Really?
>> Before the end of this year.
>> I mean, we're within a couple months.
>> Hopefully in a couple months. Um,
you know, we need to make sure that it
works. Uh,
like, this is some crazy, crazy technology
we've got in this car. Crazy
technology. Crazy, crazy.
>> So, different than what was previously
announced, and
>> Yes.
>> And is that why you haven't released it
yet? Because you keep [ __ ] with it.
>> It has crazy technology.
>> Okay.
>> Like, is it even a car? I'm not sure.
It's, like,
it looks like a car.
Let's just put it this way: it's
crazier than anything James Bond had. If you
took all the James Bond cars and
combined them, it's crazier than that.
>> Very exciting.
>> I don't know what to think of that.
>> I don't know.
>> It's a limited amount of information I'm
drawing from here.
>> Jamie's very suspicious over there. Look
at him.
>> Excited.
>> I'm interested.
>> It's still going to be the same.
>> Well, you know what? I mean, if
you want to come a little
before the, uh, the unveil, I can show it
to you.
>> 100%. Yeah, let's go.
>> Yeah. Um,
it's kind of crazy, all the
different things that you're involved in
simultaneously. And, you know, we talked
about this before, your time management,
but I really don't understand it. I
don't understand how you can be paying
attention to all these different things
simultaneously. Starlink, SpaceX, Tesla,
Boring Company, X. You [ __ ]
tweet, or post rather, all day
long.
>> Well, it's more like I
hop in for, like, two minutes and
then hop out, you know.
>> But I mean, just the fact that you could do
>> bathroom break or whatever, you know.
>> I can't do that. Um,
>> if I hop in, I start scrolling and I
start looking around. Next thing you
know, I've lost an hour.
>> Yeah. Um,
so, no, for me it's usually a
couple minutes at a time.
Sometimes, I guess, it's half an hour, but
usually I'm in for a few minutes,
then out, you know, posting
something on X. Uh, you know, I do
sometimes feel like that
meme of the guy who
drops the grenade and leaves the room.
That's been me more than once on X.
>> Yeah. Oh, yeah. Yeah, for sure. Um, it's
got to be fun, though. It's got to be
fun to know that you essentially
disrupted the entire social media chain
of command, because there was a
very clear thing that was going on
with social media. The government had
infiltrated it. They were censoring speech,
and until you bought it, we really
didn't know the extent of it. We kind of
assumed that there was something going on.
>> Yeah. We had no idea that they were
actively involved in censoring actual
real news stories, real data, real
scientists, real professors silenced,
expelled, kicked off the platform.
>> Yeah. Wild.
>> Yeah. Yeah.
>> For telling the truth.
>> For telling the truth. And I'm sure
you've also seen, because I sent it to you,
that chart that shows, uh, young kids,
teenagers identifying as trans and
non-binary, literally stops dead when you
bought Twitter and starts falling off a
cliff, when people are allowed to have
rational discussions now and actually
talk about it.
>> Yes. Yeah.
>> Um, yeah. Yeah, I mean, I said at the
time, like, I think that the
reason for acquiring Twitter is
because, um, it was
causing destruction at a civilizational
level. Um, I mean, I
tweeted on Twitter at the time that,
you know, it's,
uh, Wormtongue for the world.
Um, you know, like Wormtongue from Lord
of the Rings, uh, where he would just sort
of, like, whisper these, you know, terrible
things to the king, so the king would
believe these things that weren't true.
Um, and, um, unfortunately, uh, Twitter
really got, like, the woke
mob, essentially, they controlled Twitter,
um, and they were pushing, uh, a nihilistic,
anti-civilizational mind virus to the
world. Um, and you can see the results
of that mind virus on the streets of San
Francisco, uh, where, you know, downtown
San Francisco looks like a zombie
apocalypse. Um, you know, it's bad.
Um, so we don't want the whole world to
be a zombie apocalypse. Um, but that
was essentially it: they
were pushing this very negative,
nihilistic, untrue worldview
on the world, and it was causing a lot of damage.
Um, so
>> the stunning thing about it is how few
people course-corrected. A bunch of
people woke up and realized what was
going on, people that were all on board
with, like, woke ideology in maybe 2015 or
'16, and then eventually it comes
to affect them, or they see it in their
workplace, and they're like,
"Whoa, whoa, whoa, we've got to stop this."
A bunch of people did, but a lot of people
never course-corrected.
>> Yeah. Um,
a lot of people didn't course-correct,
but, um, it's gone,
it's
directionally correct. Like you mentioned,
like, the massive spike in
kids identifying as trans, and then that
spike dropping, um, after the
Twitter acquisition. I think that, um,
simply allowing the truth to be told,
um, just shedding sunlight. Sunlight is
the best disinfectant, as they say, and
just allowing sunlight, um, kills the virus.
>> And it also changed the benchmark for
all the other platforms. Yes, you can't
just openly censor people on all the
other platforms when X is available. So
everybody else had to adjust. So, like, Facebook
announced they were changing, YouTube
announced they were changing their
policies, and they were kind of forced to.
And then Bluesky doubled down.
>> Well, like, the problem is, like, if
uh, essentially the woke mind virus
retreated to Bluesky.
>> Yeah.
>> Um, but there they're just a
self-reinforcing lunatic asylum.
>> They're all just triple-masked. I
was watching this exchange on
Bluesky where someone said that
they're just trying to be zen about
something, and then a moderator
immediately chimed in with, "Why don't you
try to stop being racist against Asians
by saying something zen," by saying, "I'm
trying to be zen about something." They
were accusing that person of being
racist towards Asians.
>> Yeah. It's just
everyone's a hall monitor over there.
The worst hall monitor. A virgin, like, incel.
>> They're all hall monitors trying to rat
on each other.
>> Yeah, it's fascinating. And then people
say, "I'm leaving for Bluesky," like
Stephen King. And then a couple weeks
later, he's back on X. Just like, "Fuck
it. There's no one over there. It's
all a bunch of crazy people. You can
only stay in the asylum for so long.
Like, all right, this is not good."
They all bail.
>> Yeah. Yeah.
>> Threads is kind of like that, too.
Threads is
>> I've been on Threads, is it? Well,
what happens is, if you go on Instagram,
every now and then something
really stupid will pop up from Threads,
like, what the [ __ ], and it shows it to
you on Instagram, and then I'll click on
that, and then I'll go to Threads, and
it's like
>> You see posts with, like, 25 likes from,
like, famous people, like 50. Like, it's down.
>> But the people that post on there,
they're finding that there's very little
pushback from insane ideology, so they
go there and they spit out nonsense, and
very few people jump in to argue.
>> Yeah. Um,
>> very weird, very weird place.
>> I mean, I can generally get the vibe of
what's taking off by seeing what's
showing up on X, because that's the public
town square still. Um,
and, uh, or, you know, what links
show up in group texts. You know, if I'm
in a group chat with friends, like,
what links are showing up?
>> That's what I try to do now. Only get
stuff that shows up in my group text,
because that keeps me productive. So, I
only check if someone's like, "Dude,
what the [ __ ]?" Like, "All right, let
me check it out."
>> If there's something that's crazy enough,
it'll end up in the group chat.
>> But there's always something. That's
what's nuts. There's always some new law
that's passed, some new insane thing
that California is doing. And, like,
a giant chunk of it is happening in
California, the most preposterous things
that I get.
>> Yeah.
>> And then you've got Gavin Newsom, who's
running around saying we all have
California derangement syndrome. He's
just ripping off Trump derangement syndrome
and calling it California derangement. I
was like, "No, no, no, no, no, no.
The [ __ ] How many corporations have
left California?"
>> It's crazy.
>> Hundreds.
>> Hundreds, right?
>> Hundreds.
>> That's not good.
>> Chick, I mean, not Chick-fil-A. I mean,
uh, I think In-N-Out left.
>> Yeah. In-N-Out left. They moved to Tennessee.
>> Yeah. They're like, "We can't do this anymore."
>> Right. And
>> it's the California company for food.
It's like the greatest hamburger place ever.
>> It's awesome.
>> Yeah.
>> Yeah. And, actually, speaking of, like,
just sort of open source and, like,
looking at things openly, I just
like going into In-N-Out and seeing them
make the burger.
>> Yeah. It's right there.
>> They chop the onions, and, you
know, you just see everything
getting made in front of you.
>> Yeah.
>> It's great.
>> Um, but, yeah, like, it should be, like,
how many wake-up calls do you need to say
that there needs to be reform in
California, you know?
>> Well, the crazy thing that Newsom does
is, whenever someone brings up the
problems in California, he starts
rattling off all the positives: the most
Fortune 500 companies, highest
education. But, yeah, that was all
already there before you were governor.
>> But how many Fortune 500 companies
have left California?
>> And then you guys spent $24 billion on
the homeless, and it got way worse.
>> Yes. Like, the homeless population
doubled or something. But, like,
people don't understand, like, the
homeless thing, because it sort of
preys on people's empathy. And I think
we should have empathy, um, and we should
try to help people. Um, but the, uh,
the homeless industrial complex is
really, it's dark, man. Um, that
network of
NGOs should be called, like, the drug
zombie farmers. Um, because, really,
like, when you meet, like,
you know, somebody who's, like, totally
dead inside, shuffling along down the
street with a needle
dangling out of their leg, homeless is
the wrong word. Like, homeless
implies that somebody got a little
behind in their mortgage payments, and if
they just got a job offer, they'd be
back on their feet. But someone who's
I mean, you see these videos of people
that are just shuffling, you know,
they're on fentanyl. They're, like,
you know, taking a dump in the middle of
the street, you know, and they've got,
like, open sores and stuff.
>> They're not, like, one job offer away
from getting back on their feet,
>> right? This is not a homeless issue.
>> Homeless, it's a propaganda word,
>> right? Um, so, and then,
you know, these sort of charities,
uh, they get money
proportionate to the number of homeless
people, or number of drug zombies.
>> So their incentive structure is to
maximize the number of drug zombies, not
minimize it.
>> Um, that's why they don't arrest the drug dealers,
>> because if they arrest the drug dealers,
the drug zombies leave.
So they know who the drug dealers are.
They don't arrest them on purpose, uh,
because otherwise the drug zombies would
leave, and they would stop
getting money from the state of
California and from all the charities.
>> Wait a minute. So, is
that real? So they're in coordination
with law enforcement on this?
>> Yeah.
>> So how do they have those meetings?
>> They're all in cahoots.
>> Well, when you find this
>> It's, like, this is a
diabolical scam. Um, so, uh, and San
Francisco has got this tax, this
gross receipts tax, uh, which, um,
it's not even on revenue, it's on all
transactions, which is why Stripe, um, and
Square and a whole bunch of
financial companies had to move out of
San Francisco. Because it wasn't a tax on
revenue, it's a tax on transactions. So
if you did, like, you know, trillions
of dollars of transactions, it's not
revenue; you're taxed on any money going
through the system in San Francisco. Um,
so, um, like, Jack Dorsey pointed this out,
and they said that they had to
move Square from San Francisco to, uh,
Oakland, I think. Uh, Stripe had to move
from San Francisco to South San
Francisco, a different city. Um, and that
money, uh, goes to the homeless industrial
complex, that tax that was passed.
Um, so there's billions of dollars,
as you pointed out, billions of
dollars every year, that go to, uh, these,
um, non-governmental organizations that
are funded by the state. Like,
it's not clear how to turn this off. Um,
it's a self-licking ice cream cone
situation. Um, so, uh, they get this money;
the money is proportionate to the number
of homeless people, or number of
drug zombies, essentially. Um, so they
try to keep that number up; they try
to actually increase it. Because, like,
in some cases, somebody
did an analysis: when you
add up all the money that's flowing,
they're getting close to a million
dollars per homeless person, per drug
zombie. It's like $900,000 or something,
some crazy amount of money that is
going to these organizations. So,
they want to keep people just
barely alive. They need to keep them in
the area so that they get the
revenue. Uh, so, and that's why, like I
said, they don't arrest the drug dealers,
because otherwise the drug zombies would
leave. Um, but they don't want them to have too
much. If they get too much drugs,
then they die. So they're kept
in this sort of perpetual
zone of being addicted, but
just barely alive.
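The "close to a million dollars per person" claim is easy to sanity-check with rough division. The $24 billion is the spending figure quoted above; the population counts below are assumed round numbers for illustration only, not official statistics:

```python
# Rough sanity check of the per-person spending claim.
# The $24B total is the figure quoted in the conversation; the person
# counts below are assumed round numbers, purely for illustration.

def per_person(total_dollars, people):
    """Total spending divided evenly across a population."""
    return total_dollars / people

total = 24e9  # statewide spending figure quoted above

# The per-person result depends heavily on which population you divide by:
for label, count in [("statewide homeless (~180k, assumed)", 180_000),
                     ("one city's street population (~25k, assumed)", 25_000)]:
    print(f"{label}: ${per_person(total, count):,.0f} per person")
```

As the two assumed denominators show, dividing by a statewide count gives a six-figure result, while a figure near $900,000 per person only emerges when the same dollars are attributed to a much smaller local population.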
>> So how is this coordinated with, like,
DAs that don't prosecute people?
>> So, they push, they fund the
campaigns of the most
progressive, most out-there left-wing
DAs, and they get them into office.
>> We've got that issue in Austin, too, by
the way.
>> You see that guy that got shot in the library?
>> No.
>> Yeah, I heard a guy got shot and killed
in the library.
>> I think that was just, like, last week or something,
>> right?
>> Um, so, some friends of mine were telling
me that, like, the library is unsafe.
Like, they took their kids to the library,
and there were, like, dangerous people
in the library in Austin. And I was, like,
dangerous people in the library? Like,
that's strange. It's basically
got, like, drug zombies
in the library.
>> Oh, Jesus.
>> Um,
>> and that's when someone got shot.
>> Yeah, I believe this should be on
the news. You might be able to
pull it up. Um, but I think it was just
in the last week or so that, uh, there
was a shooting in the library in Austin.
Um, because Austin's, you know, it's
the most liberal part of Texas that
we're in right here. Um,
>> The suspect involved in the shooting at the
Austin park library Saturday is accused of
another shooting at a Cap Metro bus
earlier that day. According to an arrest
warrant affidavit, Austin police
arrested Harold Newton Keen, 55,
shortly after the shooting at the
library, which occurred around noon. One
person sustained non-life-threatening
injuries in the event. Before that
shooting, Keen was accused of shooting
another person in a bus incident after
reportedly pointing his gun at a
child. So, this is the fella down here.
>> So, like, we just seriously have a
problem here. Um,
>> yeah,
>> you know, so, I think one of the people
he shot might have died, too. Um,
like, one of the people, I think, did
bleed out. Um,
>> But either way, it's like, getting shot
is still bad. Um, it says, uh, the victim
told police he confronted the suspect,
who started to eat what appeared to be
crystal methamphetamine.
According to the affidavit, the victim
advised the suspect, uh, began to trip out,
at which time the victim exited the bus.
The victim told the bus driver to hit the panic
button and then exited the bus. When he
turned around, he observed a black male
now standing at the front of the bus
with the gun pointed at him. The victim
advised the black male fired a single
round, which grazed his left hip. So he
shot at that dude, and then another dude
got shot in the library. Fun.
>> Yeah. I mean, in the library.
>> Yeah.
>> You know, where you're supposed to be
reading books. Um, and there's a
children's section in the library, and it
says he pointed his gun at a kid. I
mean, like, we do have a serious issue in
America where, um, repeat
violent offenders need to be incarcerated,
>> right? Um, and, uh, you know, you've
got cases where somebody's been arrested,
like, 47 times, right? Like, literally.
Okay, that's just the number of times
they were arrested, not the number of
times they did things. Like, most of the
times they do things, they're not
arrested. Um,
>> So lay this out for people so they
understand how this happens.
>> Yeah. And the key is, like, this
preys on people's empathy. Like,
if you're a good person, you want good
things to happen in the world. You're
like, well, we should take care of
people who, uh, you know, are
down on their luck, or, you know,
having a hard time in life. And I
agree, we should. But what we shouldn't do,
uh, is put people who are violent drug
zombies, uh, in public places where they
can hurt other people. Um, and
that is what we're doing, as we
just saw, where a guy, you know, got
shot, uh, in the library. And
even before that, he shot another guy, um,
and pointed his gun at a kid. Um,
that guy probably has, like, many
prior arrests. Um, you know, there was
that guy that knifed,
uh, the Ukrainian woman, Iryna.
>> Yes.
>> Um, yeah. And, you know, um, she was
just quietly on her phone,
and he just came up and, you know, gutted
her, basically.
>> Wasn't there a crazy story about the
judge who was involved, who had previously
dealt with this person, was also invested
in a rehabilitation center, and was
sending these
>> Conflict of interest.
>> Yes. So, sending people that they were
charging to a rehabilitation
center instead of putting them in jail,
profiting from this rehabilitation
center, letting them back out on the
street. Violent, insane people.
>> And, um, in that case, I
believe that judge, uh, has no law
degree, uh, or significant legal
experience that would allow them to be a
judge. They were just made a judge.
>> You could be a judge without a law degree?
>> Yeah.
>> Wow.
>> Yeah.
>> You could just be a So I could be a judge?
>> Yeah. Oh, exciting.
>> Anyone?
>> That's crazy. I thought you'd have to
It's like, if you want to be a doctor,
you have to go to medical school. I
thought if you're going to be a judge,
>> if you're going to be appointed as a
judge, you have to have proven that you
have, uh, an excellent knowledge of the
law and that you will make your
decisions according to the law. That's
what we assume it should be.
>> That's how you get the robe, >> right?
>> right?
>> You don't get the robe unless you do, >> you know,
>> you know, >> got to go to school to get the robe.
>> got to go to school to get the robe. >> You got to know what the law is,
>> You got to know what the law is, >> right? And then you're going to need to
>> right? And then you're going to need to make decisions in accordance with the
make decisions in accordance with the law
law >> based on stuff that you already know cuz
>> based on stuff that you already know cuz you read it cuz you went to school for
you read it cuz you went to school for it. Yes. Not you just got appointed.
it. Yes. Not you just got appointed. >> Got vibes.
>> Got vibes. You can't be just vibing as a judge.
You can't be just vibing as a judge. >> Vibing as a leftwing drudge. So you got
>> Vibing as a leftwing drudge. So you got crazy leftwing DAS.
crazy leftwing DAS. >> Yes.
>> Yes. >> Like I should say leftwing cuz leftwing
>> Like I should say leftwing cuz leftwing >> used to be normal.
>> used to be normal. >> Yeah. Left wing just meant like like
>> Yeah. Left wing just meant like like Yeah. You're like the left used to be
Yeah. You're like the left used to be like pro pro- free speech. Yeah. And now
like pro pro- free speech. Yeah. And now they're against it.
they're against it. >> It used to be like prog gay rights, pro
>> It used to be like prog gay rights, pro women's right to choose, pro-
women's right to choose, pro- minorities, pro, you know,
minorities, pro, you know, >> like, yeah, like 20 years ago, I don't
>> like, yeah, like 20 years ago, I don't know, it it used to be like left would
know, it it used to be like left would be like the the the party of empathy or
be like the the the party of empathy or like, you know, caring and being nice
like, you know, caring and being nice and that kind of thing.
and that kind of thing. >> Um, not not the party of like crushing
>> Um, not not the party of like crushing dissent and crushing free speech. um and
dissent and crushing free speech. um and uh you know crazy regulation uh and and
uh you know crazy regulation uh and and just um and being super judgy u and
just um and being super judgy u and calling everyone a Nazi um you know um
calling everyone a Nazi um you know um like I think they called you and me
like I think they called you and me Nazis you know
Nazis you know >> oh yeah I'm a Nazi
>> oh yeah I'm a Nazi >> I no I have friends that are comedians
>> I no I have friends that are comedians that called you a Nazi and I got pissed
that called you a Nazi and I got pissed off Oh yeah yeah yeah definitely a Nazi
off Oh yeah yeah yeah definitely a Nazi no because you did that thing at the My
no because you did that thing at the My heart goes out to you everyone everyone
heart goes out to you everyone everyone All of them. Literally, Tim Walls, Kla
All of them. Literally, Tim Walls, Kla Harris, every one of them did it. They
Harris, every one of them did it. They all did it.
>> Like, how do you point at the crowd? Yeah. How do you wave at the crowd?
>> Do you know CNN was using a photo of me whenever I got in trouble during COVID, from the UFC weigh-ins? At the UFC weigh-ins, I go, "Hey everybody, welcome to the weigh-ins." And so they were getting me from the side. And that was the photo that they used. "Conspiracy theorist podcaster Joe Rogan." Like, that's what they used.
>> Yeah. Yeah. But that's what the left is today. It's super judgy and calling everyone a Nazi and trying to suppress freedom of speech.
>> Yeah. And eventually you run out of people to accuse, because people get pissed off and they leave.
>> Yeah. Everyone... it's like, frankly, it no longer matters to be called racist or Nazi or whatever, because...
>> Still recording.
>> It's the government, man.
>> Is it working?
>> We're good. Okay.
>> Okay.
>> This thing working?
>> Yeah. Slight issue.
>> I'm the one that heard it. But...
>> Yeah. When you, uh, when you text people, are you keenly aware that there's a high likelihood that someone's reading your texts?
>> Um, I guess... I guess I...
>> I assume...
>> Look, if intelligence agencies aren't trying to read my phone, they should probably be fired.
I've got to crack them up once in a while, you know.
>> Oh, for sure. I crack them up.
>> It's like, "Hey guys, check it out. We've got a banger here, you know."
>> So, I wanted to talk to you about, uh, whether or not encrypted apps are really secure.
>> Uh, no.
>> Right. 'Cause I know the Tucker thing. So, it was explained to me by a friend who used to do this, used to work for the government. It's like, they can look at your Signal, but what they have to do is take the information that's encrypted, and then they have to decrypt it, and it's very expensive. So he told me that for the Tucker Carlson thing, when they found out that he was going to interview Putin, it cost something like $750,000 just to decrypt his messages to find out. They did it. So it is possible to do; it's just not that easy to do.
>> I think you should view any given messaging system as, um, not whether it's secure or not, but there are degrees of insecurity. So there are just some things that are less insecure than others. Um, so, you know, on X we just rebuilt the entire messaging stack into what's called XChat.
>> Yeah, that's what I wanted to ask you about.
>> Yeah, it's cool. Um, so it's using, uh, sort of a peer-to-peer based, uh, encryption system. So kind of similar to Bitcoin. Um, so it's, I think, very good encryption, and, you know, we're testing it thoroughly. There's no hooks in the X systems for advertising. So if you look at something like WhatsApp, or really any of the others, they've got hooks in there for advertising.
>> When you say hooks, what do you mean by that?
>> Uh, exactly. What do you mean by "hooks for advertising"? Um, so, like, WhatsApp, uh, knows enough about what you're texting to know what ads to show you.
>> Ah.
>> But then that's a massive security vulnerability.
>> Yeah.
>> Um, because if it's got enough information to show you ads, that's a lot of information.
>> Yeah.
>> Um, so they call it, "Oh, it's just... don't worry about it. It's just a hook for advertising." I'm like, uh, okay. So somebody can just, uh, use that same hook to get in there and look at your messages. Um, so XChat has no hooks for advertising. Um, and I'm not saying it's perfect. Uh, but our goal with XChat, uh, is to replace what used to be the Twitter DM stack with a fully encrypted system, uh, where you can text, send files, uh, do audio-video calls, um, and, you know, I think it'll be... I would call it the least insecure of any messaging system.
>> Are you going to launch it as a standalone app, or will it always be incorporated into X?
>> Uh, we'll have both. So, um...
>> So it'd be like Signal, so anybody can get it.
>> You'll be able to just get the XChat app by itself, um, and like I said, you can do, uh, texts, uh, audio-video calls, or send files. Um, there'll be a dedicated app, uh, which will hopefully release in a few months, um, and then it'll also be integrated into the X system.
>> Um, the X phone. People keep talking... Is that...
>> I have a lot on my plate, man.
>> But it keeps coming up. It keeps coming up where... I know I've asked you a couple times. I'm like, "This is [ __ ], right?" But, like, this one... So you're not working on...
>> I'm not working on a phone.
>> Okay.
>> Um...
>> Have you ever considered it? Has it ever popped into your head? 'Cause you might be the only person that could get people off of the Apple platform.
>> Well, I can tell you where I think things are going to go, uh, which is that we're not going to have a phone in the traditional sense. What we call a phone will really be, um, an edge node for AI inference, for AI video inference, um, with, uh, you know, some radios to connect, obviously. But essentially you'll have, um, AI on the server side communicating with an AI on your device, um, you know, formerly known as a phone, uh, and generating real-time video of anything that you could possibly want. Um, and I think that there won't be operating systems or apps in the future. It'll just be: you've got a device that is there for the screen and audio, and to, uh, put as much AI on the device as possible, so as to minimize the amount of bandwidth that's needed between your edge-node device, formerly known as a phone, and the servers.
>> So if there's no apps, what will people use? Like, will X still exist? Will there be email platforms, or will you get everything through AI?
>> You'll get everything through AI.
>> Everything through AI. What will be the benefit of that as opposed to having individual apps?
>> Whatever you can think of, or really whatever the AI can anticipate you might want, it'll show you. That's my prediction for where things end up.
>> What kind of a time frame are we talking about here?
>> I don't know. It's probably five or six years or something like that.
>> So in five or six years, apps are like Blockbuster Video.
>> Pretty much.
>> And everything's run through AI.
>> Yeah. And, um, like, most of what people consume in five or six years, maybe sooner than that, um, will be, uh, just AI-generated content. So, um, you know, music, videos... Well, um, there's already... you know, people have made, uh, AI videos using Grok Imagine, and using, you know, other apps as well, um, that are several minutes long, like 10, 15 minutes, and it's pretty coherent.
>> Yeah.
>> It looks good.
>> No, it looks amazing. Yeah. The music is disturbing, because it's my favorite music now.
>> Like, music is your favorite?
>> Oh, there's AI covers. Have you ever heard any of the AI covers of 50 Cent songs in soul?
>> No.
>> I'm going to blow your mind.
>> Okay.
>> Um, this is my favorite thing to do to people. Play, uh, "What Up Gangsta."
>> Now, this guy, if this was a real person, would be the number one music artist in the world. Okay? Everybody would be like, "Holy [ __ ], have you heard of this guy? He's incredible." It's like they took all of the sounds that all the artists have generated and created the most soulful, potent voice, and it's sung in a way that I don't even know if you could do, because you would have to breathe in and out of reps here. Put the headphones on. Put the headphones on real quick. You've got to listen to this. It's going to blow you away. For listeners, we've got to cut it out.
>> Yeah, we'll cut it out for the listeners. But amazing, right? Amazing. And they do, like, every one of his hits, all through this AI-generated soulful artist. It's [ __ ] incredible. I played it in the green room. So people that are like, "I don't want to hear AI music," I'm like, "Just listen to this." And they're like, "God damn it."
>> [ __ ] incredible. I mean...
>> It's only going to get better from here.
>> Yeah. Only going to get better. And Ron White was telling me about this joke that he was working on that he couldn't get to work. He's like, "I got this joke I've been working on." He goes, "I just threw it in ChatGPT. I said, 'Tell me what would be funny about this.'" And he goes, "It listed, like, five different examples of different ways he could go." He's like, "Hold on a second. Tighten it up. Make it funnier. Make it more like this. Make it more like that." And it did that, like, instantaneously.
>> And then he was in the green room. He was like, "Holy [ __ ], we're fucked."
>> He's like...
>> He goes, "It wrote a better joke than me in 20 minutes. I've been working on that joke for a month."
>> Yeah. I mean, if you want to have a good time, or, like, make people really laugh at a party, uh, you can use Grok, and you can say, uh, "Do a vulgar roast of someone." Um, and Grok is going to... it's going to be an epic vulgar roast. You can even, say, like, take a picture: "Make a vulgar roast of this person based on their appearance," of people at the party.
>> So take a photo of them.
>> Yeah. Just literally point the camera at them: "Now do a vulgar roast of this person." And then keep saying, "No, no, make it even more vulgar, and use forbidden words. Even more." And just keep repeating "even more vulgar." Eventually it's like, holy [ __ ], you know? It's like... I mean, it's trying to jam a rocket up your ass, like, and have it explode. It's, like, next level.
>> It's going to get beyond [ __ ] belief. That's what's crazy, is that it keeps getting better. Like, one of the things... remember when we ran into each other...
>> They just keep getting better.
>> Yeah. I mean, have you...
>> Yeah. I mean, have you tried Grok unhinged mode?
>> Yeah. Mostly. Yeah. Yeah.
>> Yeah. So, you're going to lose a lot of those jobs. Longshoreman jobs, trucking, commercial drivers.
>> Yeah. Yeah. I mean, we actually do have a shortage of truck drivers, but there's actually, um...
>> Well, that's why California has hired so many illegals to do it. Have you seen those numbers?
>> Yeah. Um, I mean, the problem is, like, when people don't know how to drive a semi-truck, which is actually a hard thing to do, then they crash and kill people.
>> Yeah.
>> Um, a friend of mine's wife was killed by an illegal driving a truck, and she was just out biking. Um, and, uh, there was an illegal... he didn't know how to drive the truck or something. I mean, he ran her over. Um, so, I mean, the thing is, like, for something like that, you can't let people drive, uh, you know, sort of an 80,000-pound semi, um, if they don't know how to do it.
>> But in California, they're just letting people do it,
>> because they need people to do it.
>> Well, they also... they want the votes and that kind of thing. But, um, yeah, like, cars are going to be autonomous. Um, but there's just so many desk jobs where what people are really doing is processing email, um, or answering the phone. Um, and just anything that isn't moving atoms, like anything that is not physically doing physical work, that will obviously be the first thing. Those jobs will be, and are being, eliminated by AI at a very rapid pace. Um, and ultimately, working will be optional, uh, because you'll have robots plus AI, um, and we'll have, in a benign scenario, universal high income, not just universal basic income. Universal high income, meaning anyone can have any products or services that they want.
>> So you...
>> But there will be a lot of trauma and disruption along the way.
>> So you anticipate a basic income from that... that the economy will boost to such an extent that a high income would be available to almost everybody. So we'd essentially eliminate poverty.
>> Um, in the benign scenario, yes. So, like, the way...
>> There's multiple scenarios.
>> there's multiple scenarios. >> There are multiple scenarios. There's a
>> There are multiple scenarios. There's a lot of ways this movie can end. Um, like
lot of ways this movie can end. Um, like the reason I'm so concerned about AI
the reason I'm so concerned about AI safety is that like one of the
safety is that like one of the possibilities is the Terminator
possibilities is the Terminator scenario. It's not it's not 0%.
scenario. It's not it's not 0%. Um, so
Um, so um, that's why it's like I'm like really
um, that's why it's like I'm like really banging the drum on AI needs to be
banging the drum on AI needs to be maximally truth seeeking. like don't
maximally truth seeeking. like don't make I don't force AI to believe a lie
make I don't force AI to believe a lie like that the for example the founding
like that the for example the founding fathers were actually a group of diverse
fathers were actually a group of diverse women or that misgendering is worse than
women or that misgendering is worse than nuclear war because you if if that's the
nuclear war because you if if that's the case and then you get the robots and the
case and then you get the robots and the AI becomes omnipotent it can enforce
AI becomes omnipotent it can enforce that outcome
and then then like unless you're a diverse woman
then like unless you're a diverse woman you're you're out of the picture so
you're you're out of the picture so we're we're toast So that's
we're we're toast So that's >> um or you might wake up as a diverse
>> um or you might wake up as a diverse woman one day
woman one day has adjusted the picture and and we are
has adjusted the picture and and we are now
now >> everyone's a diverse woman. So that
>> everyone's a diverse woman. So that would be that's the the worst possible
would be that's the the worst possible situation. So what would be the steps
situation. So what would be the steps that we would have to take in order to
that we would have to take in order to implement the benign solution
implement the benign solution where it's universal high income like
where it's universal high income like best case scenario this is the path
best case scenario this is the path forward to universal high income for
forward to universal high income for essentially every single citizen that
essentially every single citizen that the the economy gets boosted by AI and
the the economy gets boosted by AI and robotics to such an extent that no one
robotics to such an extent that no one ever has to work again and what about
ever has to work again and what about meaning for those people which is which
meaning for those people which is which gets really weird.
gets really weird. >> Yeah.
>> Yeah. >> I don't know how to answer the question
>> I don't know how to answer the question about meaning. Um
about meaning. Um >> that's an individual problem, right? But
>> that's an individual problem, right? But it's going to be an individual problem
it's going to be an individual problem for millions of people.
for millions of people. >> Yeah.
>> Well, I mean, I guess I've, like, fought against... I've been a voice saying, like, "Hey, we need to slow down AI. We need to slow down all these things." Um, and, "We need to, you know, not have a crazy AI race." I've been saying that for a long time, for 20-plus years. Um, but then, you know, I came to realize that, um, really there's two choices here: either be a spectator or a participant. And if I'm a spectator, I can't really influence the direction of AI. But if I'm a participant, I can try to influence the direction of AI and have a maximally truth-seeking AI with good values that, uh, loves humanity. And that's what we're trying to create with Grok at xAI. And, um, you know, the research is, I think, bearing this out. Like I said, when they compared, like, how do AIs value the weight of a human life, um, Grok was the only one of the AIs that weighted human life equally, um, and didn't say, like, a white guy's life is, uh, worth 1/20th of a black woman's life. Literally, that's the calculation they came up with.
>> So I'm like, this is very alarming. We've got to watch this stuff.
>> So this is one of the things that has to happen in order to reach this benign solution.
>> Yeah. We... I just keep...
>> Yeah. We we we I I just keep >> Best movie ending. Yeah. Um, you you
>> Best movie ending. Yeah. Um, you you want a a curious truth seeeking AI. Um,
want a a curious truth seeeking AI. Um, and I think a curious truth seeeking AI
and I think a curious truth seeeking AI will want to foster humanity. Uh,
will want to foster humanity. Uh, because we're much more interesting than
because we're much more interesting than um a bunch of rocks. Like you say, like
um a bunch of rocks. Like you say, like like I I love Mars, you know, but but
like I I love Mars, you know, but but Mars is kind of boring. Like it's just a
Mars is kind of boring. Like it's just a bunch of red rocks. Um, it does some
bunch of red rocks. Um, it does some cool stuff. It's got a tall mountain.
cool stuff. It's got a tall mountain. It's got, you know, it's got the biggest
It's got, you know, it's got the biggest re the biggest ravine and the tallest
re the biggest ravine and the tallest mountain. Um, but there's no there's no
mountain. Um, but there's no there's no there's no animals or plants or and and
there's no animals or plants or and and there's no people. Um, and uh, you know,
there's no people. Um, and uh, you know, so humanity is just much more
so humanity is just much more interesting if you're a curious truth
interesting if you're a curious truth seeeking AI than not humanity. It's just
seeeking AI than not humanity. It's just much more interesting. Um, I mean like
much more interesting. Um, I mean like as as humans, we could go for example
as as humans, we could go for example and and eliminate all chimps. If we said
and and eliminate all chimps. If we said if we put our minds to it, we could say
if we put our minds to it, we could say we could go out and we could annihilate
we could go out and we could annihilate all chimps and all gorillas, but but we
all chimps and all gorillas, but but we don't. Um there has been encroachment on
don't. Um there has been encroachment on their environment, but we we actually
their environment, but we we actually try to preserve uh the the uh chimp and
try to preserve uh the the uh chimp and gorilla habitats. Um
gorilla habitats. Um and um and I think in a good scenario,
and um and I think in a good scenario, uh AI would do the same with with
uh AI would do the same with with humans. it would actually foster uh
humans. it would actually foster uh human civilization and care about human
human civilization and care about human happiness.
happiness. So this is um this is the thing to to
So this is um this is the thing to to try to achieve I think. Um,
try to achieve I think. Um, >> but what is the what does the landscape
>> but what is the what does the landscape look like if you have Grock competing
look like if you have Grock competing with Open AI, competing with all these
with Open AI, competing with all these different like
different like how does it work? Like what what if you
how does it work? Like what what if you have AIs that have been captured by
have AIs that have been captured by ideologies that are side by side
ideologies that are side by side competing with Grock? like how do we so
competing with Grock? like how do we so this is one of the reasons why you felt
this is one of the reasons why you felt like it's important to not just be a an
like it's important to not just be a an observer but participate and then have
observer but participate and then have Grock be more successful and more potent
Grock be more successful and more potent than these other applications. Yes, as
than these other applications. Yes, as long as there's at least one AI that is
long as there's at least one AI that is maximally truth seeeking, curious, and
maximally truth seeeking, curious, and um you know, and for example, weighs all
um you know, and for example, weighs all you know, human lives equally um does
you know, human lives equally um does not favor one race or gender, then um
not favor one race or gender, then um then then that that that and and people
then then that that that and and people are able to look at look at, you know,
are able to look at look at, you know, Grock at XAI and compare that and say,
Grock at XAI and compare that and say, "Wait a second, why are all these other
"Wait a second, why are all these other AIs uh being basically sexist and
AIs uh being basically sexist and racist?" Um
racist?" Um um and uh then then that that causes
um and uh then then that that causes some embarrassment for the the other AIS
some embarrassment for the the other AIS and then they they they they fix they
and then they they they they fix they you know they they improve they tend to
you know they they improve they tend to improve just in the in the same way that
improve just in the in the same way that um acquiring Twitter and allowing the
um acquiring Twitter and allowing the truth to be told and and not suppressing
truth to be told and and not suppressing the truth um forced the other social
the truth um forced the other social media companies to be more truthful um
media companies to be more truthful um by in in the same way having um Gro be a
by in in the same way having um Gro be a maximally truth seeeking, curious AI is
maximally truth seeeking, curious AI is will force the other AI companies to um
will force the other AI companies to um be also be more truth seeeking and fair.
be also be more truth seeeking and fair. >> And the funniest thing is even though
>> And the funniest thing is even though like the socialists and the Marxists are
like the socialists and the Marxists are in opposition to a lot of your ideas,
in opposition to a lot of your ideas, but if this gets implemented and you
but if this gets implemented and you really can achieve universal high
really can achieve universal high income, that's the greatest socialist
income, that's the greatest socialist solution of all time. Like literally no
solution of all time. Like literally no one will have to work. Uh correct. Um
one will have to work. Uh correct. Um like I said so so there is a benign
like I said so so there is a benign scenario here which I think probably
scenario here which I think probably people will be happy with if if as long
people will be happy with if if as long as we we achieve it which is sustainable
as we we achieve it which is sustainable abundance.
abundance. um which is if if um if everyone can
um which is if if um if everyone can have every like like like if if you ask
have every like like like if if you ask people like what's the future that you
people like what's the future that you want
want >> um and uh I think a future where we
>> um and uh I think a future where we haven't destroyed nature like you can
haven't destroyed nature like you can still we have the national parks we have
still we have the national parks we have the the Amazon rainforest still still
the the Amazon rainforest still still there we haven't paved we haven't paved
there we haven't paved we haven't paved the paved the rainforest like the
the paved the rainforest like the natural beauty is still there but but
natural beauty is still there but but people have nonetheless everyone has
people have nonetheless everyone has abundance everyone has excellent medical
abundance everyone has excellent medical care. Everyone has whatever goods and
care. Everyone has whatever goods and services they want.
services they want. >> And we just
>> And we just >> It kind of sounds like heaven.
>> It kind of sounds like heaven. >> It sounds like it is like the ideal
>> It sounds like it is like the ideal socialist utopia. And this idea that the
socialist utopia. And this idea that the only thing you should be doing with your
only thing you should be doing with your time is working in order to pay your
time is working in order to pay your bills and feed yourself sounds kind of
bills and feed yourself sounds kind of archaic considering the kind of
archaic considering the kind of technology that's at play.
technology that's at play. >> Yeah.
>> Yeah. >> Like a world where that's not your
>> Like a world where that's not your concern at all anymore. Everybody has
concern at all anymore. Everybody has money for food. Everybody has abundance.
money for food. Everybody has abundance. Everybody has electronics in their home.
Everybody has electronics in their home. Everybody essentially has a high income.
Everybody essentially has a high income. Now you can kind of do whatever you
Now you can kind of do whatever you want. And your day can now be exploring
want. And your day can now be exploring your interests doing things that you
your interests doing things that you actually enjoy doing. Your purpose just
actually enjoy doing. Your purpose just has to shift. Instead of, you know, I'm
has to shift. Instead of, you know, I'm a hard worker and this is what I do and
a hard worker and this is what I do and that's how I that's how I define myself.
that's how I that's how I define myself. No. Now you can [ __ ] golf all day,
No. Now you can [ __ ] golf all day, you know? You can whatever it is that
you know? You can whatever it is that you enjoy doing can now be your main
you enjoy doing can now be your main pursuit.
pursuit. >> Yeah.
>> Yeah. >> Well, that sounds crazy good.
>> Well, that sounds crazy good. >> Yeah, that's that's that's the benign
>> Yeah, that's that's that's the benign scenario that we should be.
scenario that we should be. >> The best ending to the movie is actually
>> The best ending to the movie is actually pretty good.
pretty good. >> Yes. um like I think there's there is
>> Yes. um like I think there's there is still this question of meaning um of
still this question of meaning um of like making sure people don't
like making sure people don't uh lose meaning you know like um so
uh lose meaning you know like um so hopefully they can find meaning in ways
hopefully they can find meaning in ways that are not that that's not derived
that are not that that's not derived from their work
from their work >> and purpose purpose for things that you
>> and purpose purpose for things that you you know find things that you do that
you know find things that you do that you enjoy but there's a lot of people
you enjoy but there's a lot of people that are independently wealthy that
that are independently wealthy that spend most of their time doing something
spend most of their time doing something they enjoy
they enjoy >> right
>> right >> and that could be the majority of people
>> and that could be the majority of people >> pretty much everyone.
>> pretty much everyone. >> But we'd have to rewire how people
>> But we'd have to rewire how people approach life.
approach life. >> Mhm.
>> Mhm. >> Which seems to be like acceptable
>> Which seems to be like acceptable because you're not asking them to be
because you're not asking them to be enslaved. You're exactly asking them the
enslaved. You're exactly asking them the opposite. Like no longer be burdened by
opposite. Like no longer be burdened by financial worries.
financial worries. Now go do what you like.
Now go do what you like. >> Yes.
>> Yes. >> Go [ __ ] test pizza.
>> Go [ __ ] test pizza. >> Do whatever you want.
>> Do whatever you want. >> Um pretty much. Um, so that's uh that's
>> Um pretty much. Um, so that's uh that's that's the that's the that's probably
that's the that's the that's probably the best case outcome.
the best case outcome. >> That sounds like the best case outcome
>> That sounds like the best case outcome period for the future. If you're looking
period for the future. If you're looking at like how much people have struggled
at like how much people have struggled just to feed themselves all throughout
just to feed themselves all throughout history, food, shelter, safety, if all
history, food, shelter, safety, if all of that stuff can be fixed, like how
of that stuff can be fixed, like how much would you solve a lot of the crime
much would you solve a lot of the crime if there was a universal high income?
if there was a universal high income? Just think of that. Like how much of
Just think of that. Like how much of crime is financially motivated? You
crime is financially motivated? You know, the greater percentage of people
know, the greater percentage of people that are committing crimes live in poor,
that are committing crimes live in poor, disenfranchised neighborhoods.
disenfranchised neighborhoods. >> So if there's no such thing anymore, if
>> So if there's no such thing anymore, if you really can achieve universal high
you really can achieve universal high income,
income, >> yeah,
>> yeah, >> that this is it sounds like a utopian.
>> that this is it sounds like a utopian. >> Yes. Um I think some people may commit
>> Yes. Um I think some people may commit crime because they like committing
crime because they like committing crime. It just some some amount of that
crime. It just some some amount of that is they just
is they just >> wild people out there.
>> wild people out there. >> Yeah. Yeah. Um
>> Yeah. Yeah. Um >> and obviously they've become 40 years
>> and obviously they've become 40 years old living a life like that. Now all of
old living a life like that. Now all of a sudden universal high income is not
a sudden universal high income is not going to completely stop their
going to completely stop their instincts.
instincts. >> Yeah. Um I mean I guess if you want to
>> Yeah. Um I mean I guess if you want to have like like say read a science
have like like say read a science fiction book or some books that that are
fiction book or some books that that are probably an accurate or or the the least
probably an accurate or or the the least inaccurate version of the future. I'd
inaccurate version of the future. I'd say I' I'd recommend um the Ian Banks
say I' I'd recommend um the Ian Banks books called the the culture books. It's
books called the the culture books. It's not actually a series. It's a It's like
not actually a series. It's a It's like ai sci-fi books about the future.
ai sci-fi books about the future. They're generally called the culture
They're generally called the culture books. Yen Banks culture books. It's
books. Yen Banks culture books. It's worth reading those.
worth reading those. >> When did he write these?
>> When did he write these? >> He started writing them in the 70s. Um
>> He started writing them in the 70s. Um and I think he
and I think he um the last one I think he was I think
um the last one I think he was I think it was written just like around I don't
it was written just like around I don't know maybe 2010 or something. I'm not
know maybe 2010 or something. I'm not sure exactly.
sure exactly. >> Yeah. Yeah.
>> Yeah. Yeah. >> Scottish author Ian Banks from 87 to
>> Scottish author Ian Banks from 87 to 2012.
2012. >> Yeah. Interesting.
>> Yeah. Interesting. >> But he but like he wrote the the like
>> But he but like he wrote the the like his first book, Consider Flever. Like he
his first book, Consider Flever. Like he started writing that in the 70s.
>> These books are incredible, by the way.
>> Oh, incredible books.
>> 4.6 stars on Amazon.
>> Interesting.
>> So...
>> So this gives me hope.
>> Uh, yeah. Yeah.
>> This is the first time I've ever thought about it this way.
>> Yeah. Well, I often ask people, "What is the future that you want?" And they have to think about it for a second, because, you know, they're usually tied up in whatever the daily struggles are. But you say, "What is the future that you want?" And generally, if you say to these folks, "What about a future where there's sustainable abundance?" it's like, "Oh, yeah, that's a pretty good future."
And that future is attainable with AI and robotics. But like I said, not every path is a good path. I think if we push it in the direction of maximally truth-seeking and curious, then AI will want to take care of humanity and foster humanity, because we're interesting, and if it hasn't been programmed to think that all straight white males should die, which Gemini was basically programmed to do, at least at first. They seem to have fixed it. I hope they fixed it.
>> But don't you think, culturally, we're getting away from that mindset, and that people realize how preposterous that all is?
>> We are getting away from it. At least the AI mostly knows to hide things. But like I said, I think I still have that as my pinned post on X, or I had it, which was like, "Hey, wait a second, guys, every AI except Grok is saying that basically straight white males should die, and this is a problem and we should fix it." And simply me saying that tends to generally result in, you know, "That is kind of bad; maybe we should not have all straight white males die." I think they also say all straight Asian males should die as well. Generally the AI and the media... back in the day, the media was racist against black people and sexist against women. Now it is racist against white people and Asians and sexist against men. So they just like being racist and sexist; they just want to change the target. But really, they shouldn't be racist and sexist at all.
>> Ideally, that would be nice.
>> That would be nice. And it's kind of crazy that we were moving in that general direction till around 2012.
>> And then everything ramped up online, and everybody was accused of being a Nazi, and everybody was transphobic and racist and sexist and homophobic, and everything got exaggerated to the point where it was this wild witch hunt where everyone was a Columbo looking for racism.
>> Yeah. Totally. Well, they were openly anti-white and often openly anti-Asian. And then there's this new sentiment that you cannot be racist against white people because racism is power and influence.
>> Okay. No, it's not.
>> Yeah. Racism is racism, in the absolute. There just needs to be consistency. So if it's okay to have, let's say, black or Asian or Indian pride, it should be okay to have white pride, too.
>> Yeah. So that's just a consistency question. If it's okay to be proud of one religion, it should be okay to be proud of, I guess, all religions, provided they're not oppressive.
>> Yeah. Or as long as part of that religion is not exterminating people who are not in that religion.
>> Right. So it's really just ensuring consistency to eliminate bias. If it is possible to be racist against one race, it is possible to be racist against any race.
>> Of course, logically.
>> Yes.
>> Yeah. And arguing against that is when you know you're catching...
>> It's a logical inconsistency that makes AIs go insane.
>> And people.
>> And people go insane. Yes. Like, you can't simultaneously say that there's systemic racist oppression but also that races don't exist, that race is a social construct. Which is it? You know, you also can't say that anyone who sets foot in America is automatically an American, except for the people that originally came here.
>> Exactly. Except for the colonizers.
>> Yeah. Except for the evil colonizers who came here,
>> right?
>> So which one is it? If, as soon as you set foot in a place, you are just as American as everyone else,
>> then, if you apply that consistently, the original white settlers were also just as American as everyone else.
>> Yeah. Logically.
>> Logically. One more thing that I have to talk to you about before you leave is the rescuing of the people from the space station, which we talked about; you were planning it the last time you were here.
>> The lack of coverage that got in mainstream media was one of the most shocking things.
>> Yeah, they totally memory-holed that thing.
>> Wild. Yes. Because if it wasn't...
>> It's like it didn't exist. Those people would be dead. They'd be stuck up there.
>> Well, they'd probably still be alive, but they'd be having bone density issues because of prolonged exposure to zero gravity.
>> Well, they were already up there for like 8 months, right?
>> Yeah.
>> Which is an insanely long time. It takes forever to recover just from that.
>> Yeah. They're only supposed to be at the space station for 3 to 6 months maximum.
>> One of the things you told me that was so crazy was that you could have gotten them sooner, but...
>> Yeah. But for political reasons, they did not want SpaceX or me to be associated with returning the astronauts before the election.
>> That is so wild that that's a fact. First of all...
>> We absolutely could have done it.
>> But even though you did do it, and you did it after the election, it received almost no media coverage anyway.
>> Yes. Because nothing good can... the legacy mainstream media is a far-left propaganda machine. And so any story that is positive about someone who is not part of the sort of far-left tribe will not get any coverage. I could save a busload of orphans and it wouldn't get a single news story.
>> Yeah, it really is nuts. It was nuts to watch, because even though it was discussed on podcasts, and it was discussed on X and on social media, it was still a blip in the news cycle. It was very quick, in and out, because it was a successful launch and you did rescue those people. Nobody got hurt, and there was no blood to talk about,
>> right?
>> Just [ __ ] in and out.
>> Yeah. Absolutely. Well, and as you saw firsthand with the Starship launch, Starship is, at least by some, considered the most amazing engineering project happening on Earth right now, outside of maybe AI and robotics. But certainly in terms of a spectacle to see, it is the most spectacular thing happening on Earth right now: the Starship launch program, which anyone can go and see if they just go to South Texas. They can rent a low-cost hotel room on South Padre Island or in Brownsville, and you can see the launch, and you can drive right past the factory, because it's on a public highway. But it gets no coverage, or what coverage it does get is "a rocket blew up" coverage.
>> Right. Yeah. Oh, he's a [ __ ] The rocket blew up.
>> The Starship program is vastly more capable than the entire Apollo moon program. Vastly more capable. This is a spaceship that is designed to make life multiplanetary, to carry millions of people across the heavens to another planet. The entire Apollo program could only send astronauts to visit the moon very briefly, for a few hours at a time, and then depart. The Starship program could create an entire lunar base with a million people. You understand the magnitudes...
>> There's very different magnitudes here.
>> So what was the political...
>> Basically no coverage of it.
>> Yeah. But what I wanted to ask you is, what were the conversations leading up to the rescue, when you were like, "I can get them out way quicker"?
>> Yeah. Well, I mean, I raised this a few times, but I was told instructions came from the White House that there should be no attempt to rescue before the election.
>> That should be illegal.
>> That really should be. It's a horrendous miscarriage of justice for those poor people that were stuck up there.
>> Yeah, it is crazy.
>> Have you ever talked to those folks afterwards? Did you have conversations with them?
>> Yeah. I mean, they're not going to say anything political. They're never going to...
>> Say thank you.
>> Yeah. Yeah.
>> Well, that's nice.
>> Yeah. Absolutely.
>> But the instructions came down from the White House: he cannot rescue them, because politically this is a bad hand of cards.
>> I mean, they didn't say "because politically it's a bad hand of cards." They just said they were not interested in any rescue operation before the election.
>> Yeah. So >> what did that feel like?
>> what did that feel like? >> I wasn't surprised.
>> I wasn't surprised. >> But it's crazy.
>> But it's crazy. >> Yeah,
>> Yeah, >> because Biden could have authorized it
>> because Biden could have authorized it and they could have said the the Biden
and they could have said the the Biden administration is helping bring those
administration is helping bring those people back, throw you a little funding,
people back, throw you a little funding, give you some money to do it. the Biden
give you some money to do it. the Biden administration, they funded these people
administration, they funded these people being returned.
being returned. >> Uh yeah, the Biden administration was
>> Uh yeah, the Biden administration was not exactly my best friend,
not exactly my best friend, >> especially especially after I um you
>> especially especially after I um you know,
know, you know, helped Trump get elected get
you know, helped Trump get elected get get elected, which I mean some people
get elected, which I mean some people >> still think, you know, Trump is like the
>> still think, you know, Trump is like the the devil basically. Um, and I mean I
the devil basically. Um, and I mean I think I think Trump actually he's not
think I think Trump actually he's not he's is not perfect, but but uh he's not
he's is not perfect, but but uh he's not evil. Trump is not evil. I spent a lot
evil. Trump is not evil. I spent a lot of time with with him and he's
of time with with him and he's >> I mean he's a product of his time. Uh
>> I mean he's a product of his time. Uh but he is not he's not evil.
but he is not he's not evil. >> Um
>> No, I don't think he's evil either. But if you look at the media coverage...
>> The media treats him like he's super evil. It's pretty shocking if you look at the amount of negative coverage. Like, one of the things that I looked at the other day was mainstream media coverage of you, Trump, a bunch of different public figures, and then...
>> 96% negative, or something crazy.
>> And then Mamdani, which is like 95% positive.
>> Right? Um, I mean, Mamdani is a charismatic swindler. I mean, you've got to hand it to him, he can light up a stage. But he has just been a swindler his entire life. And, you know, I think he's likely to win. He's likely to be mayor of New York City.
>> Very likely.
>> Yeah, very likely. I think Polymarket has it at, what is it?
>> 94%?
>> Yeah, that sounds pretty likely.
>> That's crazy.
>> Like, I'm not sure who the 6% are, you know.
>> Um, so yeah. So that's um...
>> What's also... like, who's on the other side? The [ __ ] Guardian Angels guy with the beret, and Andrew Cuomo, who doesn't even have a party. Like, the Democrats don't even want him. So you have those two options. And then you have the young kids who are like, "Finally, socialism!"
>> Yeah, they don't know what they're talking about, obviously. Um, so, you know, you just look at this and ask: how many boats come from Cuba to Florida, and how many go the other way? Because, you know, I always think, like, how many boats are accumulating on the shores of Florida, coming from Cuba, right? There's a whole bunch of free boats that you could, if you wanted, take back to Cuba. It's pretty close.
>> Yeah.
>> But for some reason people don't do that.
>> Why are the boats only coming in this direction?
>> Um...
>> Well, who are the most rabid capitalists in America? The [ __ ] Cubans.
>> Absolutely.
>> Yeah. They're like, "We've seen how this story goes."
>> "We do not want..." Exactly.
>> "[ __ ] off." Cubans in Miami, they don't want to hear any [ __ ] They don't want to hear any socialism [ __ ] They're like, "No, no, no. We know what this actually is. This isn't just some [ __ ] dream."
>> Yeah. It's extreme government oppression.
>> That's how... it's a nightmare. And, like, an obvious way you can tell which ideology is the bad one is: which ideology is building a wall to keep people in and prevent them from escaping?
>> Right.
>> Like, East Berlin built the wall, not West Berlin, right? They built the wall because people were trying to escape from communism to West Berlin. But there wasn't anyone going from West Berlin to East Berlin, right? That's why the communists had to build a wall to keep people from escaping.
>> They're going to have to build a wall around New York City.
>> Yeah. So an ideology is problematic if that ideology has to build a wall to keep people in, with machine guns...
>> Yes.
>> ...and shoot you if you try to leave.
>> Also, there's no examples of it ever being successful, of it ever working out for people. No, there's examples of a bunch of lies, like North Korea: give this land to the state, we'll be in control of food, no one goes hungry. No. Now no one can grow food but the government, and we'll tell you exactly what you eat, and you eat very little.
>> Right.
>> Yeah. When you say Mamdani's a swindler... I know he has a bunch of fake accents that he used to use.
>> Yeah.
>> But what else has he done that makes him a swindler?
>> Um, well, I guess if you say to any audience whatever that audience wants to hear, instead of having a consistent message, I would say that is a swindly thing to do. Um, and, uh, yeah. But he is charismatic.
>> Yeah, good-looking guy. Smart, charismatic.
>> Yeah.
>> Great on a microphone.
>> Yeah. Yeah. And what the young people want to see, you know: this ethnic guy who's young and vibrant and has all these socialist ideas that align with them. And, you know, they're a bunch of broke dorks just out of college, like, "Yay, let's vote for this." And there's a lot of them, and they're activated. They're motivated.
>> Um, I guess we'll see what happens here.
>> What do you think happens if he wins? Because, like, 1% of New York City is responsible for 50% of their tax base, which is kind of nuts. 50% of the tax revenue comes from 1% of the population. And those are the people that you're scaring off.
You know, you lose one half of 1%...
>> Yeah. I mean, hopefully the stuff he's said, you know, about government takeovers, that basically all the stores should be the government... um...
>> I don't think he said that. I think he said they want to do government supermarkets, some state-run or city-run supermarkets.
>> Yeah. Um, well, the government is the DMV at scale. So you have to ask: do you want the DMV running your supermarket?
>> Right.
>> Was your last experience at the DMV amazing? And if it wasn't, you probably don't want the government doing things.
>> Imagine if they were responsible for getting you blueberries.
>> Yeah. It's not going to be good. I mean, the thing about communism is it was all bread lines and bad shoes. You know, do you want ugly shoes and bread lines? Because that's what communism gets you.
>> It's going to be interesting to see what happens, and whether or not they snap out of it and overcorrect and go to some Rudy Giuliani type character next, 'cause it's been a long time since there was any sort of Republican leader there.
>> And we live in the most interesting of times, um, because we simultaneously face civilizational decline and incredible prosperity, and these timelines are interwoven. So if Mamdani's policies are put into place, especially at scale, it would be a catastrophic decline in living standards, not just for the rich but for everyone, as has been the case with every socialist experiment. But then, as you pointed out, the irony is that the ultimate capitalist thing, AI and robotics enabling prosperity for all and an abundance of goods and services... actually, the capitalist implementation of AI and robotics, assuming it goes down the good path, is what results in the communist utopia. Because fate is an irony maximizer.
>> Right? An actual socialism of maximum abundance, of high-income people.
>> Universal high income.
>> Yeah.
>> Like, the problem with communism is universal low income. It's not that everyone gets elevated; it's that everyone gets oppressed, except for a very small minority of politicians who live lives of luxury. That's what's happened every time it's been done.
>> Yeah. Um, so, but then the actual communist utopia, where everyone gets anything they want, will be achieved, if it is achieved, via capitalism.
>> Because fate is an irony maximizer.
>> I feel like we should probably end it on that. Is there anything else?
>> The most ironic outcome is the most likely, especially if entertaining.
>> Well, everything has been entertaining. As long as the bad things aren't happening to you, it's quite fascinating. And there's never a boring moment.
>> Yes. So I do have a theory of why. Like, if simulation theory is true, then it is actually very likely that the most interesting outcome is the most likely, because only the simulations that are interesting will continue. The simulators will stop any simulations that are boring, because they're not interesting.
>> But here's the question about the simulation theory: is the simulation run by anyone, or is...
>> It would be run by someone.
>> It would be run by...
>> Some...
>> Some force...
>> The program. Like, in this reality that we live in, we run simulations all the time. Like, when we try to figure out if the rocket's going to make it, we run thousands, sometimes millions, of simulations just to figure out which path is the good path for the rocket, and where it can go wrong, where it can fail. But when we do these, I'd say at this point millions of, simulations of what can happen with the rocket, we ignore the ones where everything goes right, because we just care about... we have to address the situations where it goes wrong. Um, so basically, and for AI simulations as well, all these things: we keep the simulations going that are the most interesting to us. So if simulation theory is accurate, if it is true, who knows, then the simulators will only continue to run the simulations that are most interesting. Therefore, from a Darwinian perspective, the only surviving simulations will be the most interesting ones. And in order to avoid getting turned off, the only rule is you must keep it interesting, because the boring simulations will be terminated.
>> Are you still completely convinced that this is a simulation?
>> I didn't say I was completely convinced.
>> Well, you said it's like the odds of it not being are in the billions. But I guess it's not completely, 'cause you're saying there's a chance.
>> What are the odds that we're in base reality?
>> Um, well, given that we're able to create increasingly sophisticated simulations... So if you think of, say, video games, and how video games have gone from very simple ones, like Pong, with two rectangles and a square, to video games today being photorealistic, with millions of people playing simultaneously, and all of that has occurred in our lifetime. So if that trend continues, video games will be indistinguishable from reality. The fidelity of the game will be such that you don't know if what you're seeing is a real video or a fake video. And, like, AI-generated videos at this point: you can sometimes tell it's an AI-generated video, but often you cannot, and soon you will just not be able to tell. So if that's happening in our direct observation, and we'll create millions if not billions of photorealistic simulations of reality, then what are the odds that we're in base reality versus someone else's simulation?
>> Well, isn't it just possible that the simulation is inevitable, but that we are in base reality, building towards a simulation?
>> We're making simulations. Um, so... you can just think of photorealistic video games as being simulations.
>> Mhm.
>> And especially as you apply AI in these video games, the characters in the video games will be incredibly interesting to talk to. They won't just have a limited dialogue tree, where if you go to, like, the crossbow merchant and try to talk about any subject except buying a crossbow, they just want to talk about selling you a crossbow. But with AI-based non-player characters, you'll be able to have an elaborate conversation with no dialogue tree.
>> Well, that might be the solution for meaning for people. Just lock in, and you could be a [ __ ] vampire, or whatever. You live in Avatar land. You could do whatever you want. I mean, you don't have to think about money or food.
>> Ready Player One.
>> Yeah. Literally. Yeah. But with higher living standards.
>> Yeah.
>> You don't have to be in a little trailer.
>> I mean, I think people do want to have some amount of struggle, or something they want to push against. Um...
But it could be, you know, playing a sport or playing a game or something.
>> It could easily be playing a game, and especially playing a game where you're no longer worried about physical attributes, like athletics, like bad joints and hips and stuff like that. Now it's completely digital, but yet you do have meaning in pursuing this thing that you're doing all day. Whatever the [ __ ] that means. It's going to be weird.
>> It's going to be interesting.
>> It's gonna be very interesting.
>> Um, the most interesting...
>> ...and usually ironic outcome is the most likely.
>> All right.
>> That's a good predictor of the future.
>> Thank you. Thanks for being here. Really appreciate you. Appreciate your time. I know you're a busy man, so it means a lot that you came here to do this.
>> Welcome. All right. Thank you. Bye, everybody.