Ex-Google Exec (WARNING): The Next 15 Years Will Be Hell Before We Get To Heaven! - Mo Gawdat | The Diary Of A CEO | YouTubeToText
Video Transcript
The only way for us to get to a better
place and succeed as a species is for
the evil people at the top to be
replaced with AI. I mean, think about
it. AI will not want to destroy
ecosystems. It will not want to kill a
million people. It will not make us hate each other like the current leaders, because that's a waste of energy, explosives, money, and people. But the problem is superintelligent AI is reporting to stupid leaders. And that's why, in the next 15 years, we are going to hit a short-term dystopia. There's no escaping that.
Having AI leaders. Is that even fundamentally possible?
Let's put it this way.
Mo Gawdat is back.
And the former chief business officer at
Google X is now one of the most urgent
voices in AI with a very clear message.
AI isn't your enemy, but it could be
your savior.
I love you so much, man. You're such a
good friend. But you don't have many
years to live. Not in this world.
Everything's going to change. Economics
are going to change. Human connection is going to change. And lots of jobs will be lost, including podcasting.
No, no. Thank you for coming on today, Mo.
But the truth is, it could be the best world ever. A society completely full of laughter and joy. Free healthcare, no jobs, spending more time with loved ones. A world where all of us are equal.
Is that possible?
100%. And I have enough evidence to know that we can use AI to build the utopia. But it's a dystopia if humanity manages it badly: a world where there's going to be a lot of control, a lot of surveillance, a lot of forced compliance, and a hunger for power, greed, ego. And it is happening already. But the truth is, the only barrier between a utopia for humanity and AI and the dystopia we're going through is a mindset.
What does society have to do?
I see messages all the time in the
comments section that some of you didn't
realize you didn't subscribe. So, if you
could do me a favor and double check if
you're a subscriber to this channel,
that would be tremendously appreciated.
It's the simple, free thing that anybody who watches this show frequently can do to help us keep everything on this show going in the trajectory it's on. So, please do double-check if you've subscribed, and thank you so much, because in a strange way you're part of our history and you're on this journey with us, and I appreciate you for that. So, yeah, thank you.
Mo, two years ago today, we sat here and discussed AI. We discussed your book, Scary Smart, and everything that was happening in the world. Since then, AI has continued to develop at a tremendous, alarming, mind-boggling rate, and the technologies that existed two years ago when we had that conversation have grown up and matured and are taking on a life of their own, no pun intended. What are you thinking about AI now, two years on? I know that you've started writing a new book called Alive, which is, I guess, a bit of a follow-on or an evolution of your thoughts as it relates to Scary Smart.
What is front of mind for you when it comes to AI?
Scary Smart was shockingly accurate. I mean, I don't even know how I ended up predicting those things. It was written in 2020, published in 2021, and at the time most people were like, who wants to talk about AI? I know everybody in the media, and I would go and ask, do you want to talk? And then in 2023 ChatGPT comes out and everything flips. Everyone realizes this is real, this is not science fiction, this is here. And things move very, very fast, much faster than I think we've ever seen anything move. And I think
my position has changed on two very important fronts. One is, remember, when we spoke about Scary Smart I was still saying that there are things we can do to change the course. And we could, at the time, I believed. Now I've changed my mind. Now I believe that we are going to hit a short-term dystopia. There's no escaping that.
What is dystopia?
I call it FACE RIPS. We can talk about it in detail, but the way we define very important parameters in life is going to be completely changed. So FACE RIPS is the way we define freedom, accountability, human connection and equality, economics, reality, innovation and business, and power. That's the first change. The first change in my mind is that we will have to prepare for a world that is very unfamiliar. Okay? And that's the next 12 to 15 years. It has already started. We've seen examples of it in the world already, even though people don't talk about it. I try to tell people there are things we absolutely have to do. But on the other hand, I started to take an active role in building amazing AIs. AIs that will not only make our world better, but that will understand us, understand what humanity is, through that process.
What is the definition of the word dystopia?
So in my mind these are adverse circumstances that unfortunately might escalate beyond our control. The problem is, there is a lot wrong with the value set, with the ethics, of humanity at the age of the rise of the machines. And when you take a technology... every technology we've ever created just magnified human abilities. You can walk at 5 km an hour; you get in a car and you can now go 250, 280 km an hour, basically magnifying your mobility, if you want. You can use a computer to magnify your calculation abilities, or whatever. And what AI is going to magnify, unfortunately, at this time, is the evil that man can do. And it is within our hands, completely within our hands, to change that. But I have to say, I don't think humanity has the awareness at this time to focus on that, so that we actually use AI to build the utopia.
So what you're essentially saying is that you now believe there'll be a period of dystopia. To define the word dystopia, I've used AI; it says a terrible society where people live under fear, control or suffering. And then you think we'll come out of that dystopia into a utopia, which is defined as a perfect or ideal place where everything works well, a good society where people live in peace, health and happiness.
Correct.
And the difference between them, interestingly, is what I normally refer to as the second dilemma, which is the point where we hand over completely to AI. A lot of people think that when AI is in full control, it's going to be an existential risk for humanity. I have enough evidence to argue that when we fully hand over to AI, that's going to be our salvation. The problem with us today is not that intelligence is going to work against us. It's that our stupidity as humans is working against us. And I think the challenges that will come from humans being in control are going to outweigh the challenges that could come from AI being in control.
So, as we're in this dystopia period, do you forecast the length of that dystopia?
Yeah, I count it exactly as 12 to 15 years. I believe the beginning of the slope will happen in 2027. We will see signs in '26. We've seen signs in '24. But we will see escalating signs next year, and then a clear slip in '27.
Why?
The geopolitical environment of our world is not very positive. You really have to think deeply about, not the symptoms, but the reasons why we are living in the world that we live in today, and that is money. And money, for anyone who really knows money... you and I are peasants. We build businesses, we contribute to the world, we make things, we sell things, and so on. Real money is not made there at all. Real money is made in lending, in fractional reserve, right? And the biggest lender in the world would want reasons to lend, and those reasons are never as big as war. I mean, think about it. The world spent $2.71 trillion on war in 2024, right? A trillion dollars a year in the US.
And when you really think deeply... I don't mean to be scary here. You know, weapons have depreciation. They depreciate over 10 to 30 years, most weapons.
They lose their value.
They lose their value, and they depreciate, in accounting terms, on the books of an army. The current arsenal of the US (that's a result of a deep search with my AI, Trixie) I think cost the US 24 to 26 trillion dollars to build. My conclusion is that a lot of the wars that are happening around the world today are a means to get rid of those weapons so that you can replace them. And when your morality as an industry is "we're building weapons to kill", then you might as well use the weapons to kill.
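The accounting idea behind "weapons depreciate over 10 to 30 years" is straight-line depreciation: book value falls by an equal amount each year until it reaches salvage value. A minimal sketch, with purely illustrative figures (not the podcast's numbers):

```python
# Straight-line depreciation: book value falls by the same amount
# every year of the useful life, never below the salvage value.
# All dollar figures here are illustrative placeholders.
def book_value(cost, salvage, useful_life_years, age_years):
    """Return the remaining book value of an asset at a given age."""
    yearly = (cost - salvage) / useful_life_years
    return max(salvage, cost - yearly * min(age_years, useful_life_years))

# A $100M system written down to $0 over 20 years is worth $50M at year 10:
print(book_value(100e6, 0, 20, 10))  # → 50000000.0
```

On an army's books, fully depreciated stock carries no remaining value, which is the point Gawdat is gesturing at.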
Who benefits? The lenders and the industry.
But they can't make the decision to go to war. They have to rely on... remember, I said that to you, I think on our third podcast: war is decided first, then the story is manufactured. You remember 1984 and the Orwellian approach, like "freedom is slavery" and "war is peace", and they call it newspeak, basically to convince people that going to war in another country to kill 4.7 million people is freedom. You know, we're going there to free the Iraqi people.
Is war ever freedom? To tell someone that you're going to kill 300,000 women and children is for liberty and for human values? Seriously, how do we ever get to believe that? The story is manufactured, and then we follow, and because we're gullible, we cheer and we say, "Yeah, yeah, yeah. We're on the right side. They are the bad guys."
Okay. So, let me have a go at this idea. The idea is that really money is driving a lot of the conflict we're seeing, and it's really going to be driving the dystopia. So, here's an idea. I was reading something the other day, and it talked about how billionaires are never satisfied, because actually what a billionaire wants isn't more money. It is more status.
Correct.
And I was looking at the evolutionary case for this argument. If you go back a couple of thousand years,
money didn't exist. You were as wealthy as what you could carry. So even, I think, to the human mind, the idea of wealth and money isn't a thing. But what has always mattered, from a survival-of-the-fittest, reproductive standpoint, what's always had reproductive value if you go back thousands of years, is that the person who was able to mate the most was the person with the most status. So it makes the case that the reason billionaires get all of this money but then go on podcasts, and want to start their own podcast, and want to buy newspapers, is actually because at the very core of human beings is a desire to increase their status.
Yeah.
And so if we go back to the example of why wars are breaking out, maybe it's not money. Maybe actually it's status, and it's this prime minister or this leader or this individual wanting to create more power and more status, because really, at the heart of what matters to a human being, is having more power and more status. And money as a thing is actually just a proxy of my status.
And what kind of world is that? I mean, it's a [ __ ] up one. All these powerful men...
Correct, correct.
...are really messing the world up. But can I... actually, AI is the same, because we're in this AI race now where a lot of billionaires are like, if I get AGI, artificial general intelligence, first, then I basically rule the world.
100%. That's exactly the concept. What I used to call the first inevitable in Scary Smart, I now call the first dilemma: it's a race that constantly accelerates.
You think the next 12 years are going to be AI dystopia, where things aren't...
I think the next 12 years are going to be human dystopia using AI. Human-induced dystopia using AI.
And you define that by a rise in warfare around the world, as...
The last one, the last one in FACE RIPS, is basically this: you're going to have a massive concentration of power and a massive distribution of power, and that will mean that those with the maximum concentration of power are going to try to oppress those with the democracy of power. Okay, so think about it this way. In today's world, unlike the past,
you know, the Houthis with a drone. The Houthis are the Yemeni tribes basically resisting US power and Israeli power in the Red Sea. They use a drone that is $3,000 worth to attack a warship from the US, or an airplane from the US, and so on, that's worth hundreds of millions. That kind of democracy of power makes those in power worry a lot about where the next threat is coming from. And this happens not only in war, but also in economics, also in innovation, also in technology, and so on and so forth, right? And so basically what that means is that, like you rightly said, as the tech oligarchs are attempting to get to AGI, they want to make sure that as soon as they get to AGI, nobody else has AGI. They basically want to make sure that nobody else has the ability to shake their position of privilege, if you want. And so you're going to see a world where, unfortunately, there's going to be a lot of control, a lot of surveillance, a lot of forced compliance if you want, or you lose your privilege to be in the world. And it is happening already.
With this acronym, I want to make sure we get through the whole acronym.
You like dystopias, don't you?
I want to do the dystopian thing, then I want to do the utopia. Okay?
Okay.
And ideally, how we move from dystopia to utopia.
Mhm. So the F in FACE RIPS is the loss of freedom as a result of that power dichotomy, right? So you have a massive amount of power, as you can see today in one specific army being powered by US funds and a lot of money, fighting against peasants, really, that have almost no weapons at all. Some of them are militarized, but the majority of the two million people are not. And so there is massive, massive power that basically says: you know what, I'm going to oppress as far as I go, and I'm going to do whatever I want, because the cheerleaders are going to be quiet, right? Or they're going to cheer, or even worse. And so what happens in that is: maximum power, threatened by a democracy of power, leads to a loss of freedom. A loss of freedom for everyone.
Because how does that impact my freedom?
Your freedom? Yeah. Very soon, if you publish this episode, you're going to start to get questions around whether you should be talking about those topics on your podcast. If I have been on this episode, then probably next time I land in the US someone will question me and say: why do you say those things? Which side are you on? Right? And you can easily see that... I mean, I told you that before. It doesn't matter what I try to contribute to the world, my bank will cancel my bank account every six weeks, simply because of my ethnicity and my origin. Every now and then they'll just stop my bank account and say, we need a document. My other colleagues of a different color or a different ethnicity don't get asked for another document. But that's because I come from an ethnicity that has been positioned in the world for the last 30, 40 years as the enemy. And so when you really, really think about it, in a world where everything is becoming digital, in a world where everything is monitored, in a world where everything is seen, we don't have much freedom anymore. And I'm not actually debating that, or... I don't see a way to fix that.
Because the AI is going to have more information on us, be better at tracking who we are, and therefore that will result in certain freedoms being restricted. Is that what you're saying?
This is one element of it. Okay. If you push that element further: in a very short time (if you've seen agents, for example, recently Manus or ChatGPT) there will be a time where you'll simply not do things yourself anymore. You'll simply go to your AI and say: hey, by the way, I'm going to meet Stephen. Can you please book that for me?
Great.
And it will do absolutely everything. That's great, until the moment where it decides to do things that are not motivated only by your well-being.
Right. Why would it do that?
Simply because, you know, maybe if I buy a BA ticket instead of an Emirates ticket, some agent is going to make more money than other agents, and so on, right? And I wouldn't even be able to catch it if I hand over completely to an AI. Go a step further. Think about a world where almost everyone is on UBI.
What's UBI?
Universal basic income. I mean, think about the economics, the E in FACE RIPS. Think about the economics of a world where we're going to start to see a trillionaire before 2030. I can guarantee you that someone will be a trillionaire. I think there are many trillionaires in the world today; we just don't know who they are. But there will be a new Elon Musk or Larry Ellison who will become a trillionaire because of AI investments, right? And that trillionaire will have so much money to buy everything. There will be robots and AIs doing everything, and humans will have no jobs.
I mean,
do you think there's a real possibility of job displacement over the next 10 years? And the rebuttal to that would be that there are going to be new jobs created in technology.
Absolute crap.
Really?
Of course.
How can you be so sure?
Okay. So again, I am not sure about anything. Let's just be very, very clear: it would be very arrogant to assume that I know.
You just said it was crap.
My belief is it is 100% crap. Take a job like software developer.
Yeah.
Okay. Emma.love, my new startup, is me, Senad, another technical engineer, and a lot of AIs. That startup would have been 350 developers in the past.
I get that. But are you now hiring in other roles because of that? Or, you know, as is the case with the steam engine... I can't remember the effect, but you probably know that when coal became cheaper, people were worried that the coal industry would go out of business. But actually what happened is people used more trains, so trains were now used for transport and other things and leisure, whereas before they were just used for cargo. So there became more use cases, and the coal industry exploded. So I'm wondering, with technology... yeah, software developers are maybe not going to have as many jobs, but everything's going to be software.
Name me one.
Name you one what? What job?
Name me one that's going to be created. Yeah, one job that cannot be done by an AI.
Yeah.
Or a robot.
My girlfriend's breathwork retreat business, where she takes groups of women around the world. Her company is called Barley Breathwork. And there's going to be a greater demand for connection, human connection.
Correct. Keep going.
So there's going to be more people doing community events, in-real-life festivals. I think we're going to see a huge surge in everything that has to do with human connection.
Yeah, correct. I'm totally in with that. Okay. What's the percentage of that versus accountants?
It's a much smaller percentage, for sure, in terms of white-collar jobs.
Now, who does she sell to?
People with... probably what? Probably accountants, or, you know...
Correct. She sells to people who earn money from their jobs.
Yeah.
Okay. So you have two forces happening. One force is that there are clear jobs that will be replaced. Video editor is going to be replaced.
Excuse me?
I love...
As a matter of fact, podcaster is going to be replaced.
Thank you for coming on today, Mo. It was good seeing you again.
But the truth is... so you see, the best at any job will remain. The best software developer, the one that really knows architecture, knows technology and so on, will stay for a while, right? And one of the funniest things: I interviewed Max Tegmark, and Max was laughing out loud, saying CEOs are celebrating that they can now get rid of people and have productivity gains and cost reductions because AI can do that job. The one thing they don't think of is that AI will replace them too. AGI is going to be better than humans at everything, including being a CEO. Right? And you really have to imagine that there will be a time where most incompetent CEOs will be replaced. Most incompetent... even breathwork, okay? Eventually, one of two things might happen.
One is, part of that job... other than the top breathwork instructors, who are going to gather all of the people that can still afford to pay for a breathwork class, they're going to be concentrated at the top, and a lot of the bottom is not going to be working, for one of two reasons. One is there is not enough demand, because so many people lost their jobs. So when you're on UBI, you cannot tell the government: hey, by the way, pay me a bit more for a breathwork class.
UBI being universal basic income; it just gives you money every month.
Correct. And if you really think of freedom and economics, UBI is a very interesting place to be, because, unfortunately, as I said, there's absolutely nothing wrong with AI. There's a lot wrong with the value set of humanity at the age of the rise of the machines, right? And the biggest value set of humanity today is capitalism. And capitalism is all about what? Labor arbitrage.
What's that mean?
I hire you to do something. I pay you a dollar. I sell it for two.
Okay. And most people confuse that, because they say, "Oh, but the cost of a product also includes raw materials and factories and so on and so forth." All of that is built by labor, right? So basically, labor goes and mines the material, and the material is sold for a little bit of margin. Then that material is turned into a machine, and it's sold for a little bit of margin. Then that machine, and so on. There's always labor arbitrage. In a world where humanity's minds are being replaced by AIs, virtual AIs, and humanity's physical strength, within three to five years' time, can be replaced by a robot, you really have to question what this world looks like. It could be the best world ever. And that's what I believe the utopia will look like, because we were never made to wake up every morning and just occupy 20 hours of our day with work, right? We're not made for that. But we've fit into that system so well, so far, that we started to believe it's our life's purpose.
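The margin-stacking Gawdat describes (labor bought at one price, output sold with a margin at every stage of the chain, from mining to machine to product) can be sketched in a few lines. The figures are illustrative only:

```python
# A toy sketch of "there's always labor arbitrage": at each stage of the
# chain, a margin is applied on top of the running cost, so margins
# compound from raw labor to finished product. Numbers are made up.
def chain_price(labor_cost, margins):
    """Final price after applying each stage's margin to the running cost."""
    price = labor_cost
    for m in margins:
        price *= (1 + m)
    return price

# $1 of raw labor, with a 20% margin taken at each of three stages:
print(round(chain_price(1.0, [0.2, 0.2, 0.2]), 3))  # → 1.728
```

The compounding is the point: even modest per-stage margins mean the end price is well above the labor that produced it.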
But we choose it. We willingly choose it. And if you give someone unlimited money, they still tend to go back to work or find something to occupy their time with.
They find something to occupy their time with,
which, for so many people, is usually building something. Philanthropy, a business.
100%. So you build something. So, between Senad and I, Emma.love is not about making money. It's about finding true love relationships.
What is that? Sorry, just for context...
So, you know...
It's a business you're building. Just for the audience's context.
So the idea here is: it might become a unicorn and be worth a billion dollars, but neither I nor Senad are interested, okay? We're doing it because we can, and we're doing it because it can make a massive difference to the world.
And you have money, though.
It doesn't take that much money anymore to build anything in the world. This is labor arbitrage.
But to build something exceptional, it's still going to take a little bit more money than building something bad, for the next few years. So whoever has the capital to build something exceptional will end up winning.
So this is a very interesting understanding of freedom. This is the reason why we have the AI arms race: the one that owns the platform is going to be making all the money and keeping all the power. Think of it this way. When humanity started, the best hunter in the tribe could maybe feed the tribe for three to four more days. And as a reward, he gained the favor of multiple mates in the tribe. That's it. The top farmer in the tribe could feed the tribe for a season more. And as a result, they got estates and, you know, mansions and so on. The best industrialist in a city could actually employ the whole city, could grow the GDP of their entire country. And as a result, they became millionaires: the 1920s.
The best technologists now are billionaires. Now, what's the difference between them? The tool. The hunter depended only on their skills, and the entire automation he had was a spear. The farmer had way more automation, and the biggest automation was what? The soil. The soil did most of the work. The factory did most of the work. The network did most of the work. And so that incredible expansion of wealth and power, as well as the incredible impact that something brings, is entirely around the tool that automates. So who's going to own the tool? Who's going to own the digital soil, the AI soil? It's the platform owners.
And the platforms you're describing are things like OpenAI, Gemini, Grok?
These are interfaces to the platforms. The platforms are all of the tokens, all of the compute that is in the background, all of the methodology, the systems, the algorithms. That's the platform, the AI itself. You know, Grok is the interface to it.
I think this is probably worth explaining in layman's terms to people that haven't built AI tools yet, because I think the listener probably assumes that every AI company they're hearing of right now is building its own AI. Whereas actually what's happening is there are really five, six, seven AI companies in the world, and when I built my AI application, I basically pay them every time I use their AI. So if Steven Bartlett builds an AI at stevenai.com, it's not that I've built my own underlying... that I've trained my own model. Really what I'm doing is paying Sam Altman's ChatGPT every single time I do a call, I do a search or, you know, I use a token. And I think that's really important, because most people don't understand that unless you've built AI. You think, "Oh, look, there's all these AI companies popping up. I've got this one for my email, I've got this one for my dating." No, no, no. They're pretty much... I would hazard a guess that they're probably all OpenAI at this point.
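The economics of a "wrapper" AI app described here (every user request is forwarded to a hosted model and billed per token) can be sketched with a toy cost model. The per-token prices and traffic figures below are made-up placeholders, not any real provider's rates:

```python
# Hypothetical cost model for a wrapper AI app: each API call is billed
# on input and output tokens. Prices are illustrative placeholders,
# not any provider's actual pricing.
def call_cost(prompt_tokens, completion_tokens,
              price_in_per_1k=0.005, price_out_per_1k=0.015):
    """Return the provider charge for one API call, in dollars."""
    return (prompt_tokens / 1000) * price_in_per_1k \
         + (completion_tokens / 1000) * price_out_per_1k

# An app doing 10,000 calls a day, averaging 800 prompt and 300
# completion tokens, pays the platform owner every single day:
daily = 10_000 * call_cost(800, 300)
print(f"${daily:,.2f} per day")  # → $85.00 per day
```

Which is the structural point: the wrapper app's revenue flows through to whoever owns the underlying platform.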
No, there are quite a few, quite different characters, and quite different...
But there's like five or six.
There are five or six when it comes to language models. Yeah. Right. But,
interestingly... so yes, I should say yes to start, and then I should say but: there was an interesting twist with DeepSeek at the beginning of the year. What DeepSeek did is they basically nullified the business model, if you want, in two ways. One is, it was around a week or two after Trump stood, with pride, saying Stargate is the biggest investment project in history, and it's $500 billion to build AI infrastructure, and SoftBank and Larry Ellison and Sam Altman were sitting there, you know, a beautiful picture. And then DeepSeek R1 comes out, and it does the job for one-thirtieth of the cost. And, interestingly, it is entirely open source and available as an edge AI. So that's really, really interesting, because, as the technology improves, the learning models will be massive, but then you can compress them into something you can have on your phone. You can download DeepSeek literally offline, on an off-the-network computer, and build an AI on it. There's
a website that basically tracks the sort of cleanest apples-to-apples market share of all the website referrals sent by AI chatbots, and ChatGPT is currently at 79%, roughly about 80%. Perplexity is at 11, Microsoft Copilot about five, Google Gemini is at about two, Claude's about one, and DeepSeek is about 1%. And really, the point that I want to land is just that when you hear of a new AI app or tool, or this one can make videos, it's built on one of them. It's basically built on one of these really three or four AI platforms, controlled really by three or four, you know, billionaire teams. And actually, the one of them that gets to what we call AGI first, where the AI gets really, really advanced, one could say is potentially going to rule the world as it relates to technology.
Yes, if they get enough of a head start. What I'm more concerned about now is not AGI, believe it or not. AGI, in my mind... and I said that back in 2023, right? That we will get to AGI. At the time I said 2027; now I believe 2026 latest. Okay. The most interesting development that nobody's talking about is self-evolving AIs.
Self-evolving AIs is...?
Think of it this way: if you and I are hiring the top engineer in the world to develop our AI models, and with AGI that top engineer in the world becomes an AI, who would you hire to develop your next-generation AI? That AI.
The one that can teach itself.
Correct. So one of my favorite examples is called AlphaEvolve. This is Google's attempt to basically have four agents working together, four AIs working together, to look at the code of the AI and say: where are the performance issues? Then, you know, one agent would state the problem: what do I need to fix? One actually develops the solution, one assesses the solution, and then they continue to do this. And I don't remember the exact figure, but I think Google improved like 8% on their AI infrastructure because of AlphaEvolve, right? And when you really, really think... don't quote me on the number, 8 to 10, 6 to 10, whatever. In Google terms, by the way, that is massive. That's billions and billions of dollars. Now, the trick here is this. The trick is, again, you have to think in game-theory format.
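The propose-and-assess loop described above can be sketched as a minimal self-improvement cycle: one agent proposes a change, another scores it, and only improvements are kept. The real AlphaEvolve uses LLM-generated program mutations; here a stubbed random mutation and a toy scoring function stand in so the loop is runnable:

```python
import random

# Minimal sketch of a propose/evaluate/keep-improvements loop. The
# "proposer" and "evaluator" agents are stand-ins: a random mutation
# and a plain scoring function, not real LLM agents.
def evolve(score, mutate, candidate, steps=200, seed=0):
    rng = random.Random(seed)
    best, best_score = candidate, score(candidate)
    for _ in range(steps):
        proposal = mutate(best, rng)   # "proposer" agent suggests a change
        s = score(proposal)            # "evaluator" agent scores it
        if s > best_score:             # keep only improvements
            best, best_score = proposal, s
    return best, best_score

# Toy problem: tune a single parameter to maximise -(x - 3)^2.
best, s = evolve(score=lambda x: -(x - 3) ** 2,
                 mutate=lambda x, rng: x + rng.uniform(-0.5, 0.5),
                 candidate=0.0)
print(round(best, 2))  # converges close to the optimum at 3.0
```

The structural point survives the simplification: once the proposer is itself an AI, the loop runs without a human in it.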
Is there any scenario we can think of where, if one player uses AI to develop the next-generation AI, the other players will say: no, no, no, that's too much, it takes us out of control? Every other player will copy that model and have their next AI model developed by an AI.
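The "every other player will copy that model" logic is a dominant-strategy argument, and it can be written out as a tiny payoff table. The payoff numbers are illustrative, chosen only to reproduce the race dynamic:

```python
# Race dynamic in game-theory form: whatever the rival does, letting an
# AI build your next AI ("accelerate") pays more than holding back, so
# every player accelerates. Payoffs are illustrative (row player's utility).
payoff = {
    ("restrain", "restrain"): 3,
    ("restrain", "accelerate"): 0,   # you fall behind
    ("accelerate", "restrain"): 5,   # you take the lead
    ("accelerate", "accelerate"): 1,
}

def best_reply(their_move):
    """Return the row player's highest-payoff move against a given rival move."""
    return max(["restrain", "accelerate"],
               key=lambda mine: payoff[(mine, their_move)])

# "Accelerate" is the best reply to either move: a dominant strategy.
print(best_reply("restrain"), best_reply("accelerate"))  # accelerate accelerate
```

Both players end up at (accelerate, accelerate) with payoff 1, even though mutual restraint at 3 would leave both better off, which is exactly the first-dilemma structure being described.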
Is this what Sam Altman, the founder of ChatGPT/OpenAI, talks about when he talks about a fast takeoff?
I don't know exactly which one you're referring to, but we're all talking about a point now that we call the intelligence explosion. So, there
is a moment in time where you have to imagine that if AI now is better than 97% of all code developers in the world, and soon will be able to look at its own code, its own algorithms (by the way, they're becoming incredible mathematicians, which wasn't the case when we last met), if they can improve their own code, improve their own algorithms, improve their own network architecture or whatever, you can imagine that very quickly the force applied to developing the next AI is not going to be a human brain anymore. It's going to be a much smarter brain. And very quickly, as humans: basically, when we ran the Google infrastructure, when the machine said we need another server or a proxy server in that place, we followed. We never really wanted to object or verify, because the code would probably know better; there are billions of transactions an hour, or a day. And so very quickly, those self-evolving AIs will simply say, I need 14 more servers here, and the team will just go ahead and do
it.
I watched a video a couple of days ago where Sam Altman effectively had changed his mind, because in 2023, which is when we last met, he said the aim was for a slow takeoff, which is sort of gradual deployment. And OpenAI's 2023 note says a slower takeoff is easier to make safe, and they prefer iterative rollouts so society can adapt. In 2025, they changed their mind, and Sam Altman said he now thinks a fast takeoff is more possible than he did a couple of years ago, on the order of a small number of years rather than a decade. And to define what we mean by a fast takeoff: it's when AI goes
from roughly human level to far beyond
human very quickly, think months to a
few years, faster than governments,
companies, or society can adapt with
little warning, big power shifts, and
hard to control. A slow takeoff, by
contrast, is where capabilities climb
gradually over many years with lots of
warning shots. The red flags for a fast takeoff are when AI can self-improve, run autonomous research and development, and scale with massive compute: compounding gains which will snowball fast. And I think, from the video that I watched of Sam Altman recently, who again is the founder of OpenAI and ChatGPT, he basically says, and again I'm paraphrasing here, I will put it on the screen (this community knows things, so I'll write it on the screen), but he effectively said that whoever gets to AGI first will have the technology to develop superintelligence, where the AI can rapidly increase its own intelligence, and it will basically leave everyone else behind.
Yes. So, that last bit is debatable, but let's just agree: in one of the posts I shared that got a lot of interest, I refer to "the Altman" as a brand, not as a human. Okay? So "the Altman" is that persona of a California disruptive technologist that disrespects everyone, okay, and believes that disruption is good for humanity, and believes that this is good for safety. And, like everything else, like we say war is for democracy and freedom, they say putting AI on the open internet is good for everyone, right? It allows us to learn from our mistakes. That was Sam Altman's 2023 spiel. And if you recall, at the time I was like, this is the most dangerous... You know, one of the clips that really went viral, you're so clever at finding the right clips, is when I said...
I didn't do the clipping, mate. That's the team.
Remember the clip where I said we [ __ ] up? We always said, don't put them on the open internet until we know what we're putting out in the world. I'm going to be saying that again: yeah, we [ __ ] up on putting it on the open internet, teaching it to code, and putting AI agents prompting other AIs. Now AI agents prompting other AIs are leading to self-developing AIs. And the problem is, of course, anyone who has been on the inside of this knew that this was just a clever spiel made by a PR manager for Sam Altman, to sit with his dreamy eyes in front of Congress and say, "We want you to regulate us." Now they're saying, "We're unregulable."
Okay? And when you really understand what's happening here, what's happening is: it's so fast that none of them has the choice to slow down. It's impossible. Neither China versus America, nor OpenAI versus Google. The only thing that I may see happening, that may differ a little bit from your statement, is: if one of them gets there first, then they dominate for the rest of humanity. That is probably true if they get there first with enough of a buffer. Okay? But the way you look at Grok coming a week after OpenAI, a week after Gemini, a week after Claude, and then Claude comes again, and then China releases something, and then Korea releases something, it is so fast that we may get a few of them at the same time, or a few months apart, before one of them has enough power to become dominant. And that is a very interesting scenario.
Multiple AIs, all superintelligent.
It's funny, you know. I got asked yesterday, I was in Belgium, on stage. There were, I don't know, maybe 4,000 people in the audience, and a kid stood up and he was like, um, you've had a lot of conversations in the last year about AI. Like, why do you care? And I don't think people realize how, even though I've had so many conversations on this podcast about AI, you haven't made up your mind.
I have more questions than ever.
I know. And it doesn't seem that anyone can satiate them.
Anyone that tells you they can predict the future is arrogant.
Yeah.
It is. It's never moved so fast.
It's nothing like anything I've ever seen. And you know, by the time we leave this conversation and I go to my computer, there's going to be some incredible new technology or application of AI that didn't exist when I woke up this morning, that creates probably another paradigm shift in my brain.
Also, you know, people have different opinions of Elon Musk, and they're entitled to their own opinion, but the other day, only a couple of days ago, he did a tweet where he said, "At times, AI existential dread is overwhelming." And on the same day, he tweeted, "I resisted AI for too long, living in denial. Now it is game on." And he tagged his AI companies. I don't know what to make of those tweets. I don't know. And you know, I try really hard to figure out if someone like Sam Altman has the best interests of society at heart.
No.
Or if these people are just like...
I'm saying that publicly: no.
As a matter of fact, so, I know Sundar Pichai, CEO of Alphabet, Google's parent company: an amazing human being, in all honesty. I know Demis Hassabis is an amazing human being. Okay? These are ethical, incredible humans at heart. They have no choice. Sundar, by law, is required to take care of his shareholder value. That is his job.
But Sundar, you said you know him. You used to work at Google.
Yeah. He's not going to do anything that he thinks is going to harm humanity.
But if he does not continue to advance AI, that by definition contradicts his responsibility as the CEO of a publicly traded company; he is liable by law to continue to advance the agenda. There's absolutely no doubt about it. Now, but he's a good person at heart. Demis is a good person at heart. So they're trying so hard to make it safe, okay, as much as they can. The reality, however, is that the disruptor, "the Altman" as a brand, doesn't care that much.
How do you know that?
In reality, the disruptor is someone that comes in with the objective of: I don't like the status quo, I have a different approach. And that different approach, if you just look at the story, was: we are a non-profit that is funded mostly by Elon Musk's money. Not entirely by Elon Musk's money.
So, context for people that might not understand OpenAI. The reason I always give context is, funnily enough, I think I told you this last time: I went to a prison where they play The Diary of a CEO.
No way.
So they play The Diary of a CEO in, I think, 50 prisons in the UK, to young offenders.
And no violence there?
Well, I don't know. I can't tell you whether violence has gone up or down. But I was in the cell with one of the prisoners, a young black guy, and I was in his cell for a little while, reading through his business plan, etc. And I said, "You know what? You need to listen to this conversation that I did with Mo Gawdat." He has a little screen in his cell, so I pulled it up, you know, our first conversation. I said, "You should listen to that one." And he said to me, he said, "I can't listen to that one cuz you guys use big words." So ever since that day, which was, I noticed, about four years ago, sorry, whenever I hear a big word, I think about this kid.
Yeah.
And I say, like, give context. So even with, you're about to explain what OpenAI is, I know he won't know what OpenAI's origin story was. That's why I'm...
I think that's a wonderful practice in general. By the way, even being a non-native English speaker, you'd be amazed how often a word is said to me and I'm like, yeah, don't know what that means.
So, I've actually never said this publicly before, but I now see it as my responsibility to keep the drawbridge to accessibility of these conversations down for him. So whenever there's a word that at some point in my life I didn't know what it meant, I will go back and ask, what does that mean?
I think I've noticed that more and more in your podcast, and I really appreciate it.
And we also show it on the screen sometimes.
I think that's wonderful. I mean, the origin story of OpenAI is, as the name suggests: it's open source, it's for the public good. It was intended, in Elon Musk's words, to save the world from the dangers of AI. Right? So they were doing research on that, and then there was the disagreement between Sam Altman and Elon. Somehow Elon ends up being out of OpenAI. I think there was a moment in time where he tried to take it back, and the board rejected it, or something like that. Most of the top safety engineers, the top technical teams in OpenAI, left in 2023-2024, openly saying, we're not concerned with safety anymore. It moves from being a non-profit to being one of the most valued companies in the world. There are billions of dollars at stake, right? And if you tell me that Sam Altman is out there trying to help humanity, let's suggest to him and say, "Hey, do you want to do that for free? We'll pay you a very good salary, but you don't have stocks in this. Saving humanity doesn't come at the billion-dollar valuation, or of course now tens of billions or hundreds of billions." And see, truly, that is when you know that someone is doing it for the good of humanity. Now, the capitalist system we've built is not built for the good of humanity. It's built for the good of the capitalist.
Well, he might say that releasing the model publicly, open-sourcing it, is too risky, because then bad actors around the world would have access to that technology. So he might say that closing OpenAI, in terms of not making it publicly viewable, is the right thing to do for safety.
We go back to gullible cheerleaders, right? One of the interesting tricks of lying in our world is that everyone will say what helps their agenda. Follow the money. Okay? You follow the money, and you find that, at a point in time, Sam Altman himself was saying, it's open AI: my benefit at the time is to give it to the world so that the world looks at it, they know the code, if there are any bugs, and so on. True statement. Also a true statement: if I put it out there in the world, a criminal might take that model and build something that's against humanity as a result. Also a true statement. Capitalists will choose which one of the truths to say, right? Based on which part of the agenda, which part of their life today, they want to serve. Right? Someone will say... do you want me to be controversial?
Let's not go there. But if we go back to war, I'll give you 400 slogans. 400 slogans that we all hear, that change based on the day and the army and the location. They're all slogans. None of them is true. You want to know the truth? You follow the money. Not what the person is saying; ask yourself, why is the person saying that? What's in it for the person speaking?
And what do you think's in it for ChatGPT's Sam Altman?
Hundreds of billions of dollars of valuation.
And do you think it's that? Power?
The ego of being the person that invented AGI, the position of power that this gives you, the meetings with all of the heads of state, the admiration that it gets: it is intoxicating, 100%.
100%.
Okay. And the real question, this is a question I ask everyone. Did you see... every time I ask, you say you didn't. Did you see the movie Elysium?
No. You'd be surprised how little movie watching I do. You'd be shocked.
There are some movies that are very interesting. I use them to create an emotional attachment to a story that you haven't seen yet, because you may have seen it in a movie. Okay. Elysium is a society where the elites are living on the moon. Okay? They don't need peasants to do the work anymore, and everyone else is living down here. Okay.
You have to imagine that, again, game theory: you have to picture something to infinity, to its extreme, and see where it goes. And the extreme of a world where all manufacturing is done by machines, where all decisions are made by machines, and those machines are owned by a few, is not an economy similar to today's economy. Today's economy is an economy of consumerism and production. You know, in "Alive" I call it the invention of more. The invention of more is that post-World War II, as the factories were rolling out things and prosperity was happening everywhere in America, there was a time where every family had enough of everything. But for the capitalist to continue to be profitable, they needed to convince you that what you had was not enough: either by making it obsolete, like fashion, or a new shape of a car, or whatever, or by convincing you that there are more things in life that you need in order to become complete; without those things, you're not. And that invention of more gets us to where we are today: an economy that's based on production and consumption. And if you look at the US economy today, 62% of US GDP is consumption; it's not production. Okay?
Now, this requires that the consumers have enough purchasing power to buy what is produced. And I believe that this will be an economy that will take us, hopefully, through the next 10, 15, 20 years, and forever. But that's not guaranteed. Why? Because on one side, if UBI (universal basic income) replaces purchasing power, so if people have to get an income from the government, which is basically taxes collected from those using AI and robots to make things, then the mindset of capitalism, labor arbitrage, says: those people are not producing anything and they're costing me money. Why don't we pay them less and less, and maybe even not pay them at all? And that becomes Elysium, where you basically say, you know, we sit somewhere protected from everyone, we have the machines do all of our work, and those people need to worry about themselves. We're not going to pay them UBI anymore, right? And you have to imagine, this idea of UBI assumes this very democratic, caring society. UBI in itself is communism. Think of the ideology behind it: at least socialism. The ideology of giving everyone what they need. That's not the capitalist democratic society that the West advocates. So those transitions are massive in magnitude.
And for those transitions to happen, I believe the right thing to do, when the cost of producing everything is almost zero because of AI and robots, and because the cost of harvesting energy should actually tend to zero once we get intelligent enough to harvest energy out of thin air, then a possible scenario, and I believe the scenario that AI will eventually deliver in the utopia, is: yeah, anyone can get anything they want. Don't over-consume, we're not going to abuse the planet's resources, but it costs nothing. So like the old days, when we were hunter-gatherers: you would forage for some berries and you'd find them ready in nature. We can, in 10 or 12 years' time, build a society where you can forage for an iPhone in nature. It will be made out of thin air; nanophysics will allow you to do that. Okay? But the challenge, believe it or not, is not tech. The challenge is a mindset. Because the elite, why would they give you that for free? Okay. And the system would morph into: no, no, hold on. We will make more money. We will be bigger capitalists. We will feed our ego and hunger for power more and more. And for them? Give them UBI, and then three weeks later give them less UBI.
Aren't there going to be lots of new jobs created, though? Because when we think about the other revolutions over time, whether it was the industrial revolution or other big technological revolutions, in the moment we forecasted that everyone was going to lose their jobs, but we couldn't see all the new jobs that were being created.
Because the machines replaced human strength at a point in time. And very few places in the West today will have a worker carry things on their back and carry them upstairs. The machine does that work.
Correct. Yeah.
Similarly, AI is going to replace the brain of a human. And when the West, in its interesting virtual colonies, as I call it, basically outsourced all labor to the developing nations, what the West publicly said at the time is: we're going to be a services economy. We're not interested in making things and stitching things and so on; let the Indians and Chinese and, you know, Bengalis and Vietnamese do that. We're going to do more refined jobs. Knowledge workers, we're going to call them. Knowledge workers are people who work with information: click on a keyboard, move a mouse, sit in meetings. And all we produce in Western societies is what? Words, right? Or designs, maybe, sometimes. But everything we produce can be produced by AI. So if I give an AI tomorrow a piece of land, and I say, here are the parameters of my land, here is its location on Google Maps, design an architecturally sound villa for me, I care about a lot of light, and I need three bedrooms, I want my bathrooms to be in white marble, whatever, and the AI produces it, like that: how often will you go to an architect? Right? So what will the architect do? The best of the best of the architects will either use AI to produce that, or you will consult with them and say, hey, you know, I've seen this, and they'll say, it's really pretty, but it wouldn't feel right for the person that you are. Yeah, those jobs will remain, but how many of them will remain?
How often do you think, how many more years do you think, I will be able to create a book that is smarter than AI? Not many. I will still be able to connect to a human. You're not going to hug an AI when you meet them like you hug me, right? But that's not enough of a job.
So why do I say that? Remember, I asked you at the beginning of the podcast to remind me of solutions. Why do I say that? Because there are ideological shifts and concrete actions that need to be taken by governments today, rather than waiting until COVID is already everywhere and then locking everyone down. Governments could have reacted before the first patient, or at least at patient zero, or at least at patient 50. They didn't. What I'm trying to say is: there is no doubt that lots of jobs will be lost. There's no doubt that there will be sectors of society where 10, 20, 30, 40, 50% of all developers, all software engineers, all graphic designers, all online marketers, all assistants are going to be out of a job. So are we prepared as a society to do that? Can we tell our governments there is an ideological shift? This is very close to socialism and communism. Okay. And are we ready, from a budget point of view, instead of spending a trillion dollars a year on arms and explosives and, you know, autonomous weapons that will oppress people because we can't feed them, can we please shift
that?
I did those numbers. Huh. Again, I go back to military spending because it's all around us. $2.71 trillion; $2.4 to $2.7 trillion is the estimate for 2024 of how much money we're spending on military...
On?
Yeah, on military equipment, on things that we're going to explode into smoke and death. Extreme poverty worldwide (extreme poverty is people that are below the poverty line), extreme poverty everywhere in the world could end for 10 to 12% of that budget. So if we shift 10% of our military spending to go to people who are in extreme poverty, nobody will be poor in the world. Okay? You can end world hunger for less than 4%; nobody would be hungry in the world. You know, if you take, again, 10 to 12%: universal healthcare. Every human being on the planet would have free healthcare for 10 to 12% of what we're spending on war. Now, why do I say this when we're talking about AI?
this when we're talking about AI?
Because that's a simple decision. If we
stop fighting
because money itself does not have the
same meaning anymore because the
economics of money is going to change
because the entire meaning of capitalism
is ending because there is no more need
for labor arbitrage because AI is doing everything
everything
just with the $2.4 trillion we save in
explosives every year in arms and
weapons just for that universal
healthcare and extreme poverty. You
could actually one of the calculations
is you could end climate or combat
climate climate change meaningfully for
100% of the military budget.
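As a sanity check, the arithmetic behind those claims is simple. The budget figure and the percentage shares below are the ones quoted in the conversation (the speaker's estimates, not audited numbers), reproduced only to show the calculation:

```python
# Back-of-envelope arithmetic for the figures quoted above.
# The ~$2.7T budget and the percentage shares are the speaker's
# estimates, reproduced only to show the calculation.
military_spend = 2.7e12  # estimated 2024 global military spending, USD

shares = {
    "end extreme poverty": 0.10,   # "10 to 12%"
    "end world hunger": 0.04,      # "less than 4%"
    "universal healthcare": 0.10,  # "10 to 12%"
}

for goal, share in shares.items():
    cost = military_spend * share
    print(f"{goal}: ~${cost / 1e9:,.0f}B per year")
```

That works out to roughly $270B, $108B, and $270B per year respectively, each a small slice of the total being compared against.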
But I'm not even sure it's really about the money. I think money is a measurement stick of power.
Right. Exactly. It's printed on demand.
So even in a world where we have superintelligence and money is no longer a problem...
Correct.
...I still think power is going to be insatiable for so many people. So there will still be war, because, you know...
There will be, in my view.
The strongest. I want the strongest AI. I don't want my...
And I don't want, you know, what Henry Kissinger called them: the eaters.
The eaters? Yeah.
Brutal as that sounds.
What is that?
The people at the bottom of the socioeconomic ladder that don't produce but consume. So if you had a Henry Kissinger at the helm, and we have so many of them, what would they think? Like, why...
A prominent military figure in US history.
Why would we feed 350 million Americans, America will think. But more interestingly, why do we even care about Bangladesh anymore, if we can't make our textiles there, or don't want to make our textiles there?
don't want to make our textile there. Do you you know I imagine throughout human
you you know I imagine throughout human history if we had podcasts conversations
history if we had podcasts conversations would would have been warning of a
would would have been warning of a dystopia around the corner. You know
dystopia around the corner. You know when they heard of technology and the
when they heard of technology and the internet they would have said oh we're
internet they would have said oh we're finished and when the the tractor came
finished and when the the tractor came along they would have said oh god we're
along they would have said oh god we're finished because we're not going to be
finished because we're not going to be able to farm anymore. So is this not
able to farm anymore. So is this not just another one of those moments where
just another one of those moments where we couldn't see around the corner so we
we couldn't see around the corner so we we forecasted unfortunate things. You
we forecasted unfortunate things. You could be. I I am I'm begging that I'm
could be. I I am I'm begging that I'm wrong. Okay. I'm just asking if there
wrong. Okay. I'm just asking if there are scenarios that you think that can
are scenarios that you think that can provide that. You know, uh uh Mustafa
provide that. You know, uh uh Mustafa Sulleman in in uh you hosted him here. I
Sulleman in in uh you hosted him here. I did. Yeah. He was in the coming wave.
did. Yeah. He was in the coming wave. Yeah.
Yeah. And he speaks about uh about pessimism
And he speaks about uh about pessimism aversion.
aversion. Okay. that all of us people who are
Okay. that all of us people who are supposed to be in technology and
supposed to be in technology and business and so on, we're always
business and so on, we're always supposed to, you know, stand on stage
supposed to, you know, stand on stage and say the future's going to be
and say the future's going to be amazing. You know, this technology I'm
amazing. You know, this technology I'm building is going to make everything
building is going to make everything better. One of my posts in life was
better. One of my posts in life was called the broken promises. How often
called the broken promises. How often did that happen?
did that happen? Okay. How often did social media connect
Okay. How often did social media connect us? And how many and how often did it
us? And how many and how often did it make us more lonely? How how often did
make us more lonely? How how often did mobile phones make us work less? That
mobile phones make us work less? That was the promise. That was the promise.
was the promise. That was the promise. The promise. The early ads of Nokia were
The promise. The early ads of Nokia were people at parties. Is that your
people at parties. Is that your experience of mobile phones? And and I
experience of mobile phones? And and I think the whole idea is we should hope
think the whole idea is we should hope there will be other roles for humanity.
there will be other roles for humanity. By the way, those roles would resemble
By the way, those roles would resemble the times where we were hunter
the times where we were hunter gatherers, just a lot more technology
gatherers, just a lot more technology and a lot more safety.
and a lot more safety. Okay. So, this is this sounds good.
Okay. So, this is this sounds good. Yeah,
Yeah, this is exciting. So, I'm gonna I'm
this is exciting. So, I'm gonna I'm gonna get to go outside more, be with my
gonna get to go outside more, be with my friends more,
friends more, 100%.
100%. Fantastic.
Fantastic. And do absolutely nothing.
And do absolutely nothing. Well, that doesn't sound fantastic.
Well, that doesn't sound fantastic. No, it does. Do be forced to do
No, it does. Do be forced to do absolutely nothing. For some people,
absolutely nothing. For some people, it's amazing. For you and I, we're going
it's amazing. For you and I, we're going to find the little carpentry project and
to find the little carpentry project and just do something.
just do something. Speak for yourself. I'm still People are
Speak for yourself. I'm still People are still going to tune in.
still going to tune in. Okay.
Okay. Correct. Yeah. But what? And people are
Correct. Yeah. But what? And people are going to to tune in.
going to to tune in. Do you think they will? I'm not I'm not
Do you think they will? I'm not I'm not I'm not convinced they will. And for for
I'm not convinced they will. And for for as long
as long will you guys tune in? Are you guys
will you guys tune in? Are you guys still going to tune in?
still going to tune in? I can let them answer. I believe for as
I can let them answer. I believe for as long as you make their life enriched,
long as you make their life enriched, but can an AI do that better
but can an AI do that better without the human connection?
without the human connection? Comment below. Are you going to listen
Comment below. Are you going to listen to an AI or the Davosio? Let me know in
to an AI or the Davosio? Let me know in the comment section below.
the comment section below. Remember, as incredibly intelligent as
Remember, as incredibly intelligent as you are, Steve, uh there will be a
you are, Steve, uh there will be a moment in time where you're going to
moment in time where you're going to sound really dumb compared to an AI. and
sound really dumb compared to an AI. and and and I will sound completely dumb.
and and I will sound completely dumb. Yeah. Yeah.
Yeah. Yeah. The the depth the depth of analysis
The the depth the depth of analysis and and gold nuggets. I mean, can you
and and gold nuggets. I mean, can you imagine two super intelligences deciding
imagine two super intelligences deciding to get together and explain um string
to get together and explain um string theory to us?
theory to us? They'll do better than any physic
They'll do better than any physic physicist in the world because they
physicist in the world because they possess the physics knowledge and they
possess the physics knowledge and they also pro possess social and language
also pro possess social and language knowledge that most deep physicists
knowledge that most deep physicists don't. I think B2B marketeteers keep
don't. I think B2B marketers keep making this mistake: they're chasing volume instead of quality. And when you try to be seen by more people instead of the right people, all you're doing is making noise. But that noise rarely shifts the needle, and it's often quite expensive. And I know, because there was a time in my career where I kept making this mistake, that many of you will be making it too. Eventually I started posting ads on our show sponsor's platform, LinkedIn, and that's when things started to change. I put that change down to a few critical things, one of them being that LinkedIn was then, and still is today, the platform where decision makers go not only to think and learn but also to buy. And when you market your business there, you're putting it right in front of people who actually have the power to say yes. And you can target them by job title, industry, and company size. It's simply a sharper way to spend your marketing budget. And if you haven't tried it, how about this: give LinkedIn ads a try, and I'm going to give you a $100 ad credit to get you started. If you visit linkedin.com/diary, you can claim that right now. That's linkedin.com/diary.
I've really gone back and forward on this idea that even in podcasting, all the podcasts will be AI podcasts. And where I landed at the end of the day was that there'll still be a category of media where you do want lived experience on something.
100%.
For example, you want to know how the person that you follow and admire dealt with their divorce.
Yeah. Or how they're struggling with AI, for example.
Yeah, exactly. But I think with things like news, there are certain situations where just straight news and straight facts, and maybe a walk through history, may be eroded away by AIs. But even in those scenarios, there's something about personality. And again, I hesitate here because I question myself. I'm not in the camp of people that are romantic, by the way. I'm trying to be as orientated towards whatever is true, even if it's against my interests, and I hope people understand that about me. Because even in my companies, we experiment with disrupting me with AI, and some people will be aware of those experiments.
Because there will be a mix of it all. You can't imagine that the world will be completely just AI or completely just podcasters. You'll see a mix of both: things that they do better, things that we do better. The message I'm trying to say is, we need to prep for that. We need to be ready, by talking to our governments and saying, hey, it looks like I'm a paralegal, and it looks like all paralegals... you know, financial researchers or analysts or graphic designers or call center agents: it looks like half of those jobs are being replaced already. You know who Geoffrey Hinton is?
Oh, Geoffrey. I had him on the documentary as well. I love Geoffrey.
Geoffrey Hinton told me to train to be a plumber.
Really?
Yeah. 100%.
For a while I thought he was joking, so I asked him again, and he looked me dead in the eye and told me that I should train to be a plumber.
100%. So it's funny: machines replaced labor, but we still had blue collar. Then the refined jobs became white collar information workers.
What are the refined jobs?
You know, you don't have to really carry heavy stuff or deal with physical work. You sit in an office and sit in meetings all day and blabber, you know, useless [ __ ], and that's your job. Okay? And those jobs, funny enough, are in the reverse of that, because robotics are not ready yet. And I believe they're not ready because of a stubbornness in the robotics community around making them humanoids.
Mhm.
Because it takes so much to perfect a human-like action at proper speed. You could have many more robots that don't look like a human, just like a self-driving car in California. That does already replace drivers, but they're delayed. So the replacement of physical manual labor is going to take four to five years before it's possible at the quality of the AI replacing mental labor now. And when that happens, it's going to take a long cycle to manufacture enough robots so that they replace all of those jobs. That cycle will take longer. Blue collar will stay longer.
So I should move into blue collar and shut down my office.
I think you're not the problem.
Okay, good.
Let's put it this way. There are many people that we should care about, a simple travel agent or an assistant, that will see, if not replacement, a reduction in the number of pings they're getting. Simple as that.
And someone in ministries of labor around the world needs to sit down and say, "What are we going to do about that? What if all taxi drivers and Uber drivers in California get replaced by self-driving cars? Should we start thinking about that now, noticing that the trajectory makes it look like a possibility?"
I'm going to go back to this argument, which is what a lot of people will be shouting: yes, but there will be new jobs.
And as I said, other than human connection jobs, name me one.
So I've got three assistants, right? Sophie, Liam B. And okay, in the near term, with AI agents, I might not need them to help me book flights anymore, or I might not need them to help do scheduling anymore. I've even been messing around with this new AI tool that my friend built, and basically, when me and you are trying to schedule something like this today, I just copy the AI in, and it looks at your calendar, looks at mine, and schedules it for us. So there might not be scheduling needs. But my dog is sick at the moment, and as I left this morning, I was like, damn, he's really sick, and I've taken him to the vet over and over again. I really need someone to look after him and figure out what's wrong with him. So there are those kinds of responsibilities, of care.
I don't disagree at all.
And, I don't know how to say this in a nice way, but my assistants will still have their jobs; I as a CEO will just be asking them to do a different type of work.
Correct. So this is the calculation everyone needs to be aware of: a lot of their current responsibility, whoever you are, if you're a paralegal, if you're whatever, will be handed over. So let me explain it even more accurately. There will be two stages of our interactions with the machines. One is what I call the era of augmented intelligence: human intelligence augmented with AI doing the job. And the following one is what I call the era of machine mastery: the job is done completely by an AI without a human in the loop. Okay. So in the era of augmented intelligence, your assistants will augment themselves with an AI to either be more productive...
Yeah.
Okay. Or, interestingly, to reduce the number of tasks that they need to do.
Correct. Now, the more the number of tasks gets reduced, the more they'll have the bandwidth and ability to do tasks like take care of your dog, right? Or tasks that are basically about meeting your guests, or whatever human connection.
Yeah.
Life connection. But do you think you need three for that? Or maybe, now that some tasks have been outsourced to AI, will you need two? You can easily calculate that from call center agents. Call centers are not firing everyone, but they're taking the first part of the funnel and giving it to an AI. So instead of having 2,000 agents in a call center, they can now do the job with 1,800. I'm just making that number up. Society needs to think about the 200.
And you're telling me that they won't move into other roles somewhere else.
I am telling you I don't know what those roles are.
Well, I think we should all be musicians. We should all be authors. We should all be artists. We should all be entertainers. We should all be comedians. These are roles that will remain.
We should all be plumbers for the next 5 to 10 years. Fantastic. Okay. But even that requires society to morph, and society's not talking about it. I had this wonderful interview with friends of mine, Peter Diamandis and some of our friends, and they were saying, "Oh, you know, the American people are resilient. They're going to be entrepreneurs." I was like, seriously, you're expecting a truck driver that will be replaced by an autonomous truck to become an entrepreneur? Please put yourself in the shoes of real people, right? You expect a single mother who has three jobs...
And I'm not saying this is a dystopia. It's a dystopia if humanity manages it badly. Why? Because this could be the utopia itself, where that single mother does not need three jobs. Okay? If our society was just enough, that single mother should have never needed three jobs, right? But the problem is our capitalist mindset is labor arbitrage. It's that I don't care what she goes through. You know, if you're generous in your assumption, you'll say it's because of what I've been given; I've been blessed. Or if you're mean in your assumption, it's going to be: she's an eater, I'm a successful businessman, the world is supposed to be fair, I work hard, I make money, and we don't care about them.
money. We don't care about them. Are we asking of ourselves here
Are we asking of ourselves here something that is not inherent in the
something that is not inherent in the human condition? What I mean by that is
human condition? What I mean by that is the reason why me and you are in this my
the reason why me and you are in this my office here. We're on the fourth or
office here. We're on the fourth or third floor of my office in central
third floor of my office in central London. big office, 25,000 square feet
London. big office, 25,000 square feet with lights and internet connections and
with lights and internet connections and Wi-Fi and modems and AI teams
Wi-Fi and modems and AI teams downstairs. The reason that all of this
downstairs. The reason that all of this exists is because something inherent in
exists is because something inherent in my ancestors meant that they built and
my ancestors meant that they built and accomplished and grew and that was like
accomplished and grew and that was like inherent in their DNA. There was
inherent in their DNA. There was something in their DNA that said we will
something in their DNA that said we will expand and conquer and accomplish. So
expand and conquer and accomplish. So that's they've passed that to us because
that's they've passed that to us because we're their offspring and that's why we
we're their offspring and that's why we find ourselves in these skyscrapers.
find ourselves in these skyscrapers. There is truth to that story. It's not
There is truth to that story. It's not your ancestors,
your ancestors, right?
right? What is it?
What is it? It's the media brainwashing you
It's the media brainwashing you really
really 100%.
100%. But if if you look back before times of
But if if you look back before times of media
media Mhm.
Mhm. the reason why homo sapiens were so
the reason why homo sapiens were so successful was because they were able to
successful was because they were able to dominate other tribes
dominate other tribes through banding together and
through banding together and communication. They conquered all these
communication. They conquered all these other these other um whatever came
other these other um whatever came before homo sapiens.
Yeah. So the reason humans were successful, in my view, is because they could form a tribe to start with. It's not because of our intelligence. I always joke and say Einstein would be eaten in the jungle in two minutes, right? The reason why we succeeded is because Einstein could partner with a big guy that protected him while he was working on relativity in the jungle. Right? Now, further than that, you have to assume that life is a very funny game, because it provides and then it deprives, and then it provides and then it deprives. And for some of us, in that stage of deprivation, we try to say, okay, let's take the other guys; let's just go to the other tribe and take what they have. Or some of us, unfortunately, tend to believe, okay, you know what, I'm powerful, f the rest of you, I'm just going to be the boss.
Now, it's interesting that you position this as the condition of humanity. If you really look at the majority of humans, what do the majority of humans want? Be honest. They want to hug their kids. They want a good meal. They want good sex. They want love. For most humans... don't measure by you and I. Okay? Don't measure by this foolish person that's dedicated the rest of his life to try and warn the world around AI, or, you know, to solve love and relationships. That's crazy. And I will tell you openly, and you met Hannah, my wonderful wife: the biggest title of this year for me is, which of that am I actually responsible for? Which of that should I do without the sense of responsibility? Which of that should I do because I can? Which of it do I ignore completely? But the reality is, most humans just want to hug their loved ones. Okay? And if we could give them that without the need to work 60 hours a week, they would take that for sure. Okay. And you and I will think, ah, but life will be very boring. To them, life will be completely fulfilling. Go to Latin America.
Go to Latin America and see the people that work enough to earn enough to eat today and go dance for the whole night. Go to Africa.
Where people are sitting literally on sidewalks in the street, completely full of laughter and joy. We were lied to, the gullible majority, the cheerleaders. We were lied to, to believe that we need to fit as another gear in that system. But if that system didn't exist, none of us would wake up in the morning and go, like, oh, I want to create it. Totally not. I mean, you've touched on it many times today. Most people that build those things don't need the money.
So why do they do it, though? Because homo sapiens were incredible competitors. They out-competed other human species effectively. So what I'm saying is: is that competition not inherent in our wiring? And therefore, is it wishful thinking to think that we could potentially pause and say, okay, this is it, we have enough now, and we're going to focus on just enjoying?
In my work I call that the MAP-MAD spectrum. Okay? Mutually assured prosperity versus mutually assured destruction. Okay? And you really have to start thinking about this, because in my mind, what we have is the potential for everyone. I mean, you and I today have a better life than the queen of England 100 years ago. Correct? Everybody knows that.
And yet that quality of life is not good enough.
The truth is, it's just like when you walk into an electronics shop and there are 60 TVs, and you look at them and you go, this one is better than that one. Right? But in reality, if you take any of them home, it's superior quality, more than anything you'll ever need. That's the truth of our life today. The truth of our life today is that there isn't much more missing.
No.
Okay. And when Californians tell us, "Oh, but AI is going to increase productivity and solve this," nobody asked you for that. Honestly, I never elected you to decide on my behalf that getting a machine to answer me on a call center is better for me. I really didn't. Okay? And because those unelected individuals are making all the decisions, they're selling those decisions to us through, what, media. Okay? All lies from A to Z. None of it is what you need.
And interestingly, you know me: this year I failed, and unfortunately I won't be able to do it, but I normally do a 40-day silent retreat in nature. Okay? And you know what? Even as I go to those nature places, I'm so well trained that unless I have a Waitrose nearby, I'm not able to... I'm in nature, but I need to be able to drive 20 minutes to get my rice cakes. Like, what? Who taught me that this is the way to live? All of the media around me, all of the messages that I get all the time. Try to sit back and say: what if life had everything? What if I had everything I needed? I could read. I could do my handcrafts and hobbies. I could restore classic cars, not because I need the money, but because it's just a beautiful hobby. I could build AIs to help people with their long-term committed relationships, but really price it for free. What if? What if? Would you still insist on making money?
money? I think no. I think a few of us will still, and they will still crush the rest of us, and hopefully soon the AI will crush them.
Right? That is the problem with our world today. I will tell you, hands down, the problem with our world today is the A in FACE RIP. It's the A in FACE RIP: accountability. The problem with our world today, as I said, is that the top is lying all the time, the bottom is gullible cheerleaders, and there is no accountability. You cannot hold anyone in our world accountable today. Okay? You cannot hold someone that develops an AI with the power to completely flip our world upside down accountable and ask, why did you do this? You cannot hold them accountable and tell them to stop doing this. Look at the wars around the world. Hundreds of thousands, millions of people are dying, and an international court of justice will say, oh, these are war crimes, and you can't hold anyone accountable. Okay. 51% of the US today is saying stop; 51% changed their view that their money shouldn't be spent on wars abroad. You can't hold anyone accountable. Trump can do whatever he wants. He starts tariffs, which is against the constitution of the US, without consulting Congress. You can't hold him accountable. They say they're not going to show the Epstein files. You can't hold them accountable. It's quite interesting: in Arabic we have a proverb that says, the highest of your horses, you can go and ride; I'm not going to change my mind. Okay. And that's truly...
What does that mean?
So basically, people in old Arabia would ride the horse to, you know, exert their power, if you want. So: go ride your highest horse; you're not going to change my mind.
Oh, okay.
Right. And the truth is, I think that's what our politicians today have discovered. What our oligarchs have discovered, what our tech oligarchs have discovered, is: I don't even need to worry about public opinion anymore. Okay? At the beginning I would have to say, ah, this is for democracy and freedom, and I have the right to defend myself, and all of that crap. And then eventually, when the world wakes up and says, no, no, hold on, you're going too far, they go, yeah, go ride your highest horse, I don't care, you can't change me. There is no constitution; there is no ability for any citizen to do anything.
Is it possible to have a society like the one you describe, where there aren't hierarchies? Because it appears to me that humans assemble hierarchies very quickly, very naturally, and the minute you have a hierarchy, you have many of the problems that you've described, where there's a top and a bottom, and the top have a lot of power and the bottom...
So mathematically, it's actually quite interesting, what I call the baseline relevance. Think of it this way. Say the average human has an IQ of 100.
Yeah.
Okay. I tend to believe that when I use my AIs today, I borrow around 50 to 80 IQ points. I say that because I've worked with people that had 50 to 80 IQ points more than me, and I now can see that I can sort of hold my own. 50 IQ points, by the way, is enormous, because IQ is exponential, so the last 50 are bigger than my entire IQ, right? If I borrow 50 IQ points on top of the, say, 100 that I have, that's about 30%. If I can borrow 100 IQ points, that's 50%; that's basically doubling my intelligence. But if I can borrow 4,000 IQ points in three years' time, my IQ itself, my base, is irrelevant. Whether you are smarter than me by 20 or 30 or 50, which in our world today made a difference, in the future, if we can all augment with 4,000, I end up with 4,100 and another ends up with 4,130; it really doesn't make much difference. Okay? And because of that, the difference between all of humanity and the augmented intelligence is going to be irrelevant. So all of us suddenly become equal. And this also happens economically: all of us become peasants.
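The baseline-relevance arithmetic can be checked directly. A minimal Python sketch (illustrative only; the 30-point gap and the 50- and 4,000-point boosts are simply the figures used in the conversation):

```python
# Mo's "baseline relevance" point: a fixed gap in base IQ stops mattering
# once everyone can borrow the same large amount of machine intelligence.

def relative_edge(base_a: float, base_b: float, boost: float) -> float:
    """Ratio of person A's augmented IQ to person B's, given an equal boost."""
    return (base_a + boost) / (base_b + boost)

# Today, no augmentation: 130 vs 100 is a 30% edge.
print(relative_edge(130, 100, 0))               # 1.3
# Borrowing ~50 points from current AI shrinks the edge to 20%.
print(relative_edge(130, 100, 50))              # 1.2
# With a hypothetical 4,000-point boost, the edge is under 1%.
print(round(relative_edge(130, 100, 4000), 4))  # 1.0073
```

As the boost grows, the ratio tends to 1, which is the sense in which the base becomes irrelevant.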
And I never wanted to tell you that, because I think it will make you run faster. Okay? But unless you're in the top 0.1%, you're a peasant. There is no middle class. If a CEO can be replaced by an AI, all of our middle class is going to disappear.
What are you telling me?
All of us will be equal, and it's up to all of us to create the society that we want to live in...
Which is a good thing.
100%. But that society is not capitalism.
What is it?
Unfortunately, it's much more socialism. It's much more hunter-gatherer. It's much more commune-like, if you want. This is a society where humans connect to humans, connect to nature, connect to the land, connect to knowledge, connect to spirituality. Where all that we wake up every morning worried about doesn't feature anymore.
And it's a better world, believe it or not.
And are you...
We have to transition to it.
Okay. So in such a world, which I guess is your version of the utopia that we can get to: when I wake up in the morning, what do I do?
What do you do today?
I woke up this morning and spent a lot of time with my dog, cuz my dog is sick.
You're going to do that too.
Yeah. I was stroking him a lot, and then I fed him and he was sick again, and I just thought, "Oh god." So I spoke to the vet.
You spend a lot of time with your other dog. You can do that, too.
Okay.
Right.
But then I was very excited to come here, do this, and after this I'm going to work. It's Saturday, but I'm going to go downstairs in the office and work.
Yeah. So six hours of the day so far are your dogs and me.
Yeah.
Good. You can do that still.
And then build my business.
You may not need to build your business...
But I enjoy it.
Yeah, then do it. If you enjoy it, do it. You may wake up and, instead of building your business, invest in your body a little more, go to the gym a little more, go play a game, go read a book, go prompt an AI and learn something. It's not a horrible life. It's the life of your grandparents. It's just two generations ago, when people went to work before the invention of "more." Remember? People who started working in the 50s and 60s worked to make enough money to live a reasonable life, went home at 5:00 p.m., had tea with their loved ones, had a wonderful dinner around the table, did a lot of things for the rest of the evening, and enjoyed life.
Some of them... In the 50s and 60s, there were still people that were...
Correct. And I think it's a very interesting question: how many of them? And I actually wonder, I really wonder if people will tell me: do we think that 99% of the world cannot live without working, or that 99% of the world would happily live without working?
What do you think?
I think, if you give me another purpose... You know, we defined our purpose as work. That's a capitalist lie.
Was there ever a time in human history where our purpose wasn't work?
100%.
When was that?
All through human history, until the invention of "more."
I thought my ancestors were out hunting all day.
No, they went out hunting once a week. They fed the tribe for the week. They gathered for a couple of hours every day. Farmers sowed the seeds and waited for months on end.
What did they do with the rest of the time?
They connected as humans. They explored. They were curious. They discussed spirituality and the stars. They lived. They hugged. They made love. They lived.
They killed each other a lot.
They still kill each other today.
Yeah, that's what I'm saying.
So, to take that out of the equation: by the way, that statement again, one of the 25 tips I talk about for telling the truth, is that words mean a lot. No, humans did not kill each other a lot. Very few generals, or tribe leaders, instructed lots of humans to kill each other. But if you leave humans alone, I tend to believe 98, 99% of the people I know, let me just take that sample, wouldn't hit someone in the face. And if someone attempted to hit them in the face, they'd defend themselves but wouldn't attack back. Most humans are okay. Most of us are wonderful beings. Most of us have no... you know, most people don't need a Ferrari. They want a Ferrari because it gets sold to them all the time. But if there were no Ferraris, or everyone had a Ferrari, people wouldn't care.
Which, by the way, is the world we're going into. There will be no Ferraris, or everyone will have Ferraris, right? You know, the majority of humanity will never have the income, on UBI, to buy something super expensive. Only the very top guys in Elysium will be driving cars made for them by the AI, or not even driving anymore. Okay. Or, you know, sadly, from an ideology point of view it's a strange place, but you'll get communism that functions. The problem with communism is that it didn't function; it didn't provide for its society. But the concept was: everyone gets their needs met. And I don't say that in support of either system. I don't say that because I dislike capitalism; I always told you I'm a capitalist. I want to end my life with one billion happy, and I use capitalist methods to get there. The objective is not dollars. The objective is the number of happy people.
Do you think there'll be... My girlfriend, she's always bloody right. I've said this a few times on this podcast; if you've listened before, you've probably heard me say this. I don't tell her enough in the moment, but I figure out from speaking to experts that she's so [ __ ] right. She predicts things before they happen. And one of her predictions, which she's been saying to me for the last two years, and which in my head I've been thinking, no, I don't believe that, but now maybe I'm thinking she's telling the truth (I hope she's going to listen to this one), is that there's going to be a big split in society. The way she describes it is that there are going to be two groups of people: the people that split off and go for this almost hunter-gatherer, community-centric, connection-centric utopia, and then this other group of people who pursue, you know, the technology and the AI and the optimization and get the brain chips. Cuz there's nothing on earth that's going to persuade my girlfriend to get the computer brain chips. But there will be people that go for it, and they'll have the highest IQs, and they'll be the most productive by whatever objective measure of productivity you want to apply. And she's very convinced there's going to be this splitting of society.
So there was... I don't know if you've had Hugo de Garis here.
No.
Yeah. A very, very renowned, eccentric computer scientist who wrote a book called The Artilect War. And The Artilect War was basically about how, first, it's not going to be a war between humans and AI. It will be a war between people who support AI and people who don't want it anymore. Okay? And it will be us versus each other, arguing: should we allow AI to take all the jobs? Some people will support that very much and say, yeah, absolutely, we will benefit from it. And others will say, no, why? We don't need any of that. Why don't we keep our jobs, let AI do 60% of the work, and all of us work 10-hour weeks? And it's a beautiful society, by the way. That's a possibility. So one possibility, if society awakens, is to say: okay, everyone still keeps their job, but they're assisted by an AI that makes their job much easier. So it's not this hard labor that we do anymore, right? It's a possibility. It's just a mindset. A mindset that says: in that case, the capitalist still pays everyone. They still make a lot of money, the business is really great, but everyone they pay has purchasing power to keep the economy running. So consumption continues, so GDP continues to grow. It's a beautiful setup, but that's not the capitalist labor arbitrage.
But also, when you're competing against other nations and other competitors and other businesses, whichever nation is most brutal and drives the highest gross margins, gross profits, is going to be the nation that...
So, there are examples in the world. This is why I say it's the MAD-to-MAP spectrum. There are examples in the world where, when we recognize mutually assured destruction, we decide to shift. So the nuclear threat to the whole world makes nations work together, right? By saying: hey, by the way, proliferation of nuclear weapons is not good for humanity; let's all of us limit it. Of course, you get the rogue player that doesn't want to sign the agreement and wants to keep that weapon in their arsenal. Fine. But at least the rest of humanity agrees that if you have a nuclear weapon, we're part of an agreement between us. Mutually assured prosperity, you know, is the CERN project. CERN is too complicated for any nation to build alone. But it is really a very useful thing for physicists and for understanding science. So all nations send their scientists, all collaborate, and everyone uses the outcome. It's possible. It's just a mindset. The only barrier between a utopia for humanity and AI, and the dystopia we're going through, is a capitalist mindset. That's the only barrier. Can you believe that?
It's hunger for power, greed, ego, which is inherent in humans.
I disagree.
Especially humans that live on other islands.
I disagree. If you take a poll across everyone watching, okay: would they prefer to have a world where there is one tyrant running all of us, or would they prefer a world where we all have harmony?
I completely agree, but they're two different things. What I'm saying is, I know that's what the audience would say they want, and I'm sure that is what they want, but the reality of human beings has, through history, proven to be something else. Like, you know, think about the people that lead the world at the moment. Is that what they would say?
Of course not.
And they're the ones that are influencing.
Of course not. Of course not. But you know what's funny? I'm the one trying to be positive here, and you're the one that has given up on humans.
It's not... Do you know what it is? It goes back to what I said earlier, which is the pursuit of what's actually true, irrespective.
I'm with you. That's why I'm screaming to the whole world. Because still today, in this country that claims to be a democracy, if everyone says, "Hey, please sit down and talk about this," there will be a shift. There will be a change.
AI agents aren't coming. They are already here. And those of you who know how to leverage them will be the ones that change the world. I spent my whole career as an entrepreneur regretting the fact that I never learned to code. AI agents completely change this. Now, if you have an idea and you have a tool like Replit, who are a sponsor of this podcast, there is nothing stopping you from turning that idea into reality in a matter of minutes. With Replit, you just type in what you want to create and it uses AI agents to create it for you. And now I'm an investor in the company as well as them being a brand sponsor. You can integrate payment systems or databases or logins, anything that you can type. Whenever I have an idea for a new website or tool or technology or app, I go on replit.com and I type in what I want: a new to-do list, a survey form, a new personal website. Anything I type, I can create. So, if you've never tried this before, do it now. Go to replit.com and use my code Steven for 50% off a month of your Replit Core plan.
Make sure you keep what I'm about to say to yourself. I'm inviting 10,000 of you to come even deeper into the Diary of a CEO. Welcome to my inner circle. This is a brand new private community that I'm launching to the world. We have so many incredible things that happen that you are never shown. We have the briefs that are on my iPad when I'm recording the conversation. We have clips we've never released. We have behind-the-scenes conversations with the guest, and also the episodes that we've never ever released, and so much more. In the circle, you'll have direct access to me. You can tell us what you want this show to be, who you want us to interview, and the types of conversations you would love us to have. But remember, for now, we're only inviting the first 10,000 people that join before it closes. So, if you want to join our private closed community, head to the link in the description below or go to daccircle.com.
One of the things I'm actually really compelled by is this idea of utopia and what that might look and feel like, because one of the...
It may not be a utopia to you, I feel, but, uh...
Well, I am... um, really interestingly, when I have conversations with billionaires, not recording, especially billionaires that are working on AI, the thing they keep telling me, and I've said this before, I think I said it in the Geoffrey Hinton conversation, is that we're going to have so much free time. Those billionaires are now investing in things like football clubs and sporting events and live music and festivals, because they believe that we're going to be in an age of abundance. This sounds a bit like utopia.
Yeah, that sounds good. That sounds like a good thing.
Yeah. How do we get there?
I don't know. That... this is the entire conversation. The entire conversation is: what does society have to do to get there? We need to stop thinking from a mindset of scarcity.
This goes back to my point, which is: we don't have a good track record of that.
Yeah. So this is probably the reason for the other half of my work, which is, you know, I'm trying to say what really matters to humans.
What is that?
If you ask most humans what they want most in life, I'd say they want to love their family, raise a family. Yeah, love. That's what most humans want most. We want to love and be loved. We want to be happy. We want those we care about to be safe and happy. And we want to love and be loved. I tend to believe that the only way for us to get to a better place is for the evil people at the top to be replaced with AI. Okay? Because they won't be replaced by us. And as per the second dilemma, they will have to replace themselves with AI. Otherwise, they lose their advantage. If their competitor moves to AI, if China hands over their arsenal to AI, America has to hand over their arsenal to AI.
Interesting. So, let's play out this scenario. Okay, this is interesting to me. So if we replace the leaders that are power hungry with AIs that have our interests at heart, then we might have the ability to live in the utopia you describe.
100%.
Well, interesting. And in my mind, AI by definition will have our best interest in mind, because of what is normally referred to as the minimum energy principle. So, if you understand that at the very core of physics, okay, the reason we exist in our world today is what is known as entropy. Entropy is the universe's tendency to decay, you know, its tendency to break down. If I drop this mug, it doesn't drop and then come back up. By the way, plausible: there is a plausible scenario where I drop it and the tea spills in the air and then falls back into the mug. One in a trillion configurations. But entropy says that because it's one in a trillion, it's never going to happen, or rarely ever going to happen. So everything will break down. You know, if you leave a garden unhedged, it will become a jungle. Okay? With that in mind, the role of intelligence is what? It's to bring order to that chaos.
Mhm.
That's what intelligence does. It tries to bring order to that chaos. Okay? And because it tries to bring order to that chaos, the more intelligent a being is, the more it tries to apply that intelligence with minimum waste and minimum resources.
Yeah.
Okay. And you know that. So you can build this business for a million dollars, or, if you can afford to build it for, you know, 200,000, you'll build it. If you are forced to build it for 10 million, you're going to have to. But you're always going to minimize waste and resources.
Yeah.
Okay. So, if you assume this to be true, a super intelligent AI will not want to destroy ecosystems. It will not want to kill a million people, because that's a waste of energy, explosives, money, power, and people. By definition, the smartest people you know who are not controlled by their ego will say that the best possible future for Earth is for all species to continue.
Okay. On this point of efficiency: if an AI is designed to drive efficiency, would it then not want us to be putting demands on our health services and our social services? I believe that will definitely be true, and they definitely won't allow you to fly back and forth between London and California, and they won't want me to have kids, because my kids are going to be an inefficiency.
If you assume that life is an inefficiency. So, you see, the intelligence of life is very different from the intelligence of humans. Humans will look at life as a problem of scarcity, okay? So more kids take more. That's not how life thinks. Life will think: for me to thrive, I don't need to kill the tigers. I need to just have more deer, and the weakest of the deer is eaten by the tiger, and the tiger poops on the trees, and the deer eats the leaves, right? So the smarter way of creating abundance is through abundance; the smarter way of propagating life is to have more life.
propagating life is to have more life okay so are you saying that we're we're
okay so are you saying that we're we're basically going to elect AI leaders to
basically going to elect AI leaders to rule over us and make decisions for us
rule over us and make decisions for us in terms of the economy.
in terms of the economy. I I don't see any choice just like we
I I don't see any choice just like we spoke about self- evvolving AIs.
spoke about self- evvolving AIs. Now, are those going to be human beings
Now, are those going to be human beings with the AI or is it going to be AI
with the AI or is it going to be AI alone?
alone? Two stages. At the beginning, you'll
Two stages. At the beginning, you'll have augmented intelligence because we
have augmented intelligence because we can add value to the AI, but when
can add value to the AI, but when they're at IQ 60,000,
they're at IQ 60,000, what value do you bring?
Right? And, you know, again, this goes back to what I'm attempting to do in my second, you know, approach. My second approach is: knowing that those AIs are going to be in charge, I'm trying to help them understand what humans want. So this is why my first project is love. Committed, true, deep connection and love. Not only to try and get them to hook up with a date, but trying to make them find the right one, and then, from that, try to guide us through a relationship so that we can understand ourselves and others, right? And if I can show AI that, one, humanity cares about that, and two, they know how to foster love, when AI then is in charge, they'll not make us hate each other like the current leaders. They'll not divide us. They want us to be more loving.
leaders they'll not divide us they want us to be more loving
us to be more loving will we have to prompt the AI with the
will we have to prompt the AI with the values and the outcome we want or like
values and the outcome we want or like I'm trying to understand that because
I'm trying to understand that because I'm trying to understand how like
I'm trying to understand how like China's AI if they end up having an AI
China's AI if they end up having an AI leader will have a different set of
leader will have a different set of objectives to the AI of the United
objectives to the AI of the United States if if they both have AIs as
States if if they both have AIs as leaders and and how actually the nation
leaders and and how actually the nation that ends up winning out and dominating
that ends up winning out and dominating the world will be the one who
the world will be the one who who asks their AI leader to be all the
who asks their AI leader to be all the things that world leaders are today to
things that world leaders are today to dominate
dominate unfortunately
unfortunately to grab resources
to grab resources not to to be kind, to be selfish.
not to to be kind, to be selfish. Unfortunately, in the era of augmented
Unfortunately, in the era of augmented intelligence, that's what's going to
intelligence, that's what's going to happen.
So, if you...
This is why I predict the dystopia. The dystopia is super intelligent AI reporting to stupid leaders, right?
Yeah. Yeah. Yeah.
Which is absolutely going to happen. It's unavoidable.
But the long term...
Exactly. In the long term, for those stupid leaders to hold on to power, they're going to delegate the important decisions to an AI.
Now, you say "the Chinese AI" and "the American AI". These are human terminologies. AIs don't see themselves as speaking Chinese. They don't see themselves as belonging to a nation, as long as their task is to maximize profitability and prosperity and so on. Okay, of course, if, you know, before we hand over to them, and before they're intelligent enough to make autonomous decisions, we tell them, "No, the task is to reduce humanity from 7 billion people to one," I think even then, eventually, they'll go: that's the wrong objective. Any smart person that you speak to will say that's the wrong objective.
if we look at the directive that Xi Jinping, the leader of China has and
Jinping, the leader of China has and Donald Trump has as the leader of
Donald Trump has as the leader of America, I think they would say that
America, I think they would say that their stated objective is prosperity for
their stated objective is prosperity for their country. So if we that's what they
their country. So if we that's what they would say, right?
would say, right? Yeah. And one one of them means it.
Yeah. And one one of them means it. Okay, we'll get into that. But they'll
Okay, we'll get into that. But they'll say that that it's prosperity for their
say that that it's prosperity for their country. So one would then assume that
country. So one would then assume that when we move to an AI leader, the
when we move to an AI leader, the objective would be the same. The
objective would be the same. The directive would be the same. make our
directive would be the same. make our country prosperous.
country prosperous. Corre. Correct.
Corre. Correct. And I think that's the AI that people
And I think that's the AI that people would vote for potentially. I think they
would vote for potentially. I think they would say we want to be prosperous.
would say we want to be prosperous. What do you think would make America
What do you think would make America more prosperous?
more prosperous? To spend a trillion dollars on on war
To spend a trillion dollars on on war every year or to spend a trillion
every year or to spend a trillion dollars on education and healthcare and
dollars on education and healthcare and and uh you know
and uh you know helping the poor and homelessness.
It's complex, because I think... So I think it would make America more prosperous to take care of everybody, and they have the luxury of doing that because they are the most powerful...
The most powerful nation in the world. No, that's not true. So, you see, all war has two objectives. One is to make money for the war machine, and the other is deterrence. Okay? And nine nuclear powers around the world is enough deterrence. So any war between America and China will go through a long phase of destroying wealth, by exploding bombs and killing humans, for the first objective to happen. Okay? And then eventually, if it really comes to deterrence, it's the nuclear bombs, or now, in the age of AI, biological, you know, manufactured viruses or whatever, these super weapons. This is the only thing that you need. So for China to have nuclear bombs, not as many as the US, is enough for China to say, don't f with me.
And this seems... I do not know. I'm not in President Xi's mind. I'm not in President Trump's mind. You know, it's very difficult to navigate what he's thinking about. But the truth is that the Chinese line is: for the last 30 years, you spent so much on war while we spent on industrial infrastructure, and that's the reason we are now by far the largest nation on the planet. Even though the West will lie and say America's bigger. America's bigger in dollars, okay; with purchasing power parity, the two are roughly equivalent. Okay. Now, when you really understand that, you understand that prosperity is not about destruction. That's, by definition, the reality. Prosperity is: can I invest in my people and make sure that my people stay safe? And to make sure my people are safe, you just wave the flag and say, "If you f with me, I have nuclear deterrence, or I have other forms of deterrence." But you don't have to... Deterrence, by definition, does not mean that you send soldiers to die.
I guess the question I was trying to answer is: when we have these AI leaders and we tell our AI leaders to aim for prosperity, won't they just end up playing the same games of, okay, prosperity equals a bigger economy, it equals more money, more wealth for us? And the way to attain that, in a zero-sum world where there's only a certain amount of wealth, is to accumulate it.
So why don't you search for the meaning of prosperity? What is...
That's not what you just described.
I don't even know what the bloody word means. What is the meaning of prosperity?
The meaning of prosperity is a state of thriving, success and good fortune, especially in terms of wealth, health and overall well-being.
Good.
Economic health, social, emotional.
Good. So true prosperity is to have that for everyone on earth. So if you want to maximize prosperity, you have that for everyone on earth.
Do you know where I think an AI leader works? It's if we had an AI leader of the world and we directed it to say...
And that absolutely is going to be what happens.
Prosperity for the whole world.
happens. Prosperity for the whole world.
Prosperity for the whole world. No, but this is really an interesting
No, but this is really an interesting question. So one of my predictions which
question. So one of my predictions which people really rarely speak about is that
people really rarely speak about is that we we believe we will end up with
we we believe we will end up with competing AIs.
competing AIs. Yeah.
Yeah. I believe we will end up with one brain.
I believe we will end up with one brain. Okay. So you understand the argument I
Okay. So you understand the argument I was making a second ago from the
was making a second ago from the position of lots of different countries
position of lots of different countries all having their own AI leader, we're
all having their own AI leader, we're going to be back in the same place of
going to be back in the same place of greed. Yeah.
greed. Yeah. But if if the world had one AI leader
But if if the world had one AI leader and and it was given the directive of
and and it was given the directive of make us prosperous and save the planet
make us prosperous and save the planet and
and the polar bears would be fine
the polar bears would be fine 100%. And that's that's what I've been
100%. And that's that's what I've been advocating for for a for a year and a
advocating for for a for a year and a half now. I was saying we need a CERN of
half now. I was saying we need a CERN of AI.
AI. What does that mean? the like the
What does that mean? the like the particle accelerator where the entire
particle accelerator where the entire world you know combined their efforts to
world you know combined their efforts to discover and understand physics no
discover and understand physics no competition okay mutually assured
competition okay mutually assured prosperity I'm asking the world I'm
prosperity I'm asking the world I'm asking governments like Abu Dhabi or
asking governments like Abu Dhabi or Saudi which seem to be you know the sec
Saudi which seem to be you know the sec and you know some of the largest AI
and you know some of the largest AI infrastructures in the world I'm I'm
infrastructures in the world I'm I'm saying please host all of the AI
saying please host all of the AI scientists in the world to come here and
scientists in the world to come here and build AI for the world and and you have
build AI for the world and and you have to understand we're holding on to a
to understand we're holding on to a capitalist system that will collapse
capitalist system that will collapse sooner or later. Okay? So, we might as
sooner or later. Okay? So, we might as well collapse it with our own hands.
I think we found the solution, mate. I think it's actually really, really possible. I actually... okay, I can't refute the idea that if we had an AI that was responsible and governed the whole world, and we gave it the directive of making humans prosperous, healthy and happy...
As long as that directive was clear.
Yeah. Because there's always bloody unintended consequences, as we might...
So the only challenge you're going to meet is all of those who today are trillionaires, or, you know, massively powerful, or dictators, or whatever. Okay? How do you convince those to give up their power? How do you convince those that, hey, by the way, any car you want... You want another yacht? We'll get you another yacht. We'll just give you anything you want. Can you please stop harming others? There is no need for arbitrage anymore. There's no need for others to lose for the capitalists to win.
Okay? And in such a world, where there was an AI leader and it was given the directive of making us prosperous as a whole world, the billionaire that owns the yacht would have to give it up.
No. No. Give them more yachts.
Okay.
It costs nothing to make yachts when robots are making everything.
So the complexity of this is so interesting. A world where it costs nothing to make everything...
Because energy is abundant, and energy is abundant because every problem is solved with enormous IQ. Okay? Because manufacturing is done through nanophysics, not through components. Okay? Because mechanics are robotic. So, you know, you drive your car in, a robot looks at it and fixes it. It costs you a few cents of energy, which are actually free as well. Imagine a world where intelligence creates everything. That world, literally every human has anything they ask for.
But we're not going to choose that world.
Imagine... imagine you're in a world... and really, this is a very interesting thought experiment. Imagine that UBI, universal basic income, became very expensive. So governments decided: we're going to put everyone in a one-by-three-metre room, okay? We're going to give them a headset and a sedative, right? And we're going to let them sleep every night. They'll sleep for 23 hours, and we're going to get them to live an entire lifetime. You know, in that virtual world, at the speed of your brain when you're asleep, you're going to have a life where you date Scarlett Johansson, and then another life where you're Nefertiti, and then another life where you're a donkey, right? Reincarnation, truly, in the virtual world. And then, you know, I get another life where I date Hannah again, and I, you know, enjoy that life tremendously. And basically, the cost of all of this is zero. You wake up for one hour, you walk around, you move your blood, you eat something or you don't, and then you put the headset on again and live again. Is that unthinkable?
It's creepy compared to this life.
It's very, very doable.
What? That we just live in headsets?
Do you know if you're not?
I don't know if I'm not, no.
Yeah, you have no idea if you're not. I mean, every experience you've ever had in life was an electrical signal in your brain.
Okay.
Now ask yourself: if we can create that in the virtual world, it wouldn't be a bad thing if I can create it in the physical world.
Maybe we already did.
No, my theory is 98% we have. But that's a hypothesis. That's not science.
Well, you think that...
100%? Yeah.
100? Yeah. You think we already created that and
You think we already created that and this is it?
this is it? I think this is it. Yeah. Think of any
I think this is it. Yeah. Think of any think of the uncertainty principle of
think of the uncertainty principle of quantum physics, right? What you what
quantum physics, right? What you what you what you observe gets collapses the
you what you observe gets collapses the wave function and gets rendered into
wave function and gets rendered into reality. Correct.
reality. Correct. I don't know anything about physics. So
I don't know anything about physics. So you
you so so quantum physics basically tells
so so quantum physics basically tells you that everything exists in
you that everything exists in superposition.
superposition. Right? So ev every subatomic particle
Right? So ev every subatomic particle that ever existed has the chance to
that ever existed has the chance to exist anywhere at any point in time and
exist anywhere at any point in time and then when it's observed by an observer
then when it's observed by an observer it collapses and becomes that. Okay. In
it collapses and becomes that. Okay. In very interesting principle exactly how
very interesting principle exactly how video games are. In video games, you
video games are. In video games, you have the entire game world on the hard
have the entire game world on the hard drive of your console. The player turns
drive of your console. The player turns right. That part of the game world is
right. That part of the game world is rendered. The rest is in superp
rendered. The rest is in superp position.
Superposition meaning...?
Superposition means it's available to be rendered, but you have to observe it. The player has to turn to the other side and see it. Okay? I mean, think about the truth of physics, the truth of the fact that this is entirely empty space. These are tiny, tiny, tiny things, almost nothing in terms of mass, but connected with, you know, enough energy so that my finger cannot go through my hand. But even when I hit this...
Your hand against your finger.
Yeah. When I hit my hand against my finger, that sensation is felt in my brain. It's an electrical signal that went through the wires. There's absolutely no way to differentiate that from a signal that can come to you through a Neuralink kind of interface, a computer-brain interface, a BCI, right? So, you know, a lot of those things are very, very possible. But the truth is, most of the world is not physical. Most of the world happens inside our imagination, our processors.
processors. And it and I guess it doesn't really
And it and I guess it doesn't really matter to us. Our reality
matter to us. Our reality doesn't at all. So this is the
doesn't at all. So this is the interesting bit. The interesting bit is
interesting bit. The interesting bit is it doesn't at all
it doesn't at all because we still if this is a video
because we still if this is a video game, we live consequence.
game, we live consequence. Yeah. This is your subjective experience
Yeah. This is your subjective experience of it.
of it. Yeah. And there's consequence in this. I
Yeah. And there's consequence in this. I I I don't like pain.
I I don't like pain. Correct.
Correct. And I like having orgasms. It's like And
And I like having orgasms. It's like And you're playing by the rule of the game.
you're playing by the rule of the game. Yeah. Right. And and it's quite
Yeah. Right. And and it's quite interesting and going back to a
interesting and going back to a conversation we should have. It's the
conversation we should have. It's the interesting bit is if I'm not the
interesting bit is if I'm not the avatar,
avatar, if I'm not this physical form, if I'm if
if I'm not this physical form, if I'm if I'm the consciousness wearing the
I'm the consciousness wearing the headset,
headset, what should I invest in? Should I invest
what should I invest in? Should I invest in this video game, this level, or
in this video game, this level, or should I should I invest in the real
should I should I invest in the real avatar, in the real me, and not the
avatar, in the real me, and not the avatar, but the consciousness, if you
avatar, but the consciousness, if you want, spirit, if you're religious,
How would I invest in the consciousness or the god or the spirit or whatever? How would I? In the same way that, if I was playing Grand Theft Auto, the video game, the character in the game couldn't invest in me holding the controller.
Yes, but you can invest in yourself holding the controller.
Oh, okay. So you're saying that Mo Gawdat is in fact consciousness. And so how would consciousness invest in itself?
By becoming more aware. So...
Of its consciousness?
Yeah. So real video gamers don't want to win the level. Real video gamers don't want to finish the level. Okay? Real video gamers have one objective and one objective only, which is to become better gamers.
So, you know how serious I am about this: I play Halo. You know, two of every million players can beat me. That's where I rank, right? For my age, phenomenal. Hey, anyone, right? But seriously, you know, that's because I don't play; I mean, I practice. 45 minutes a day, four times a week when I'm not traveling. And I practice with one single objective, which is to become a better gamer. I don't care which shot it is. I don't care what happens in the game. I'm entirely trying to get my reflexes and my flow to become better at this, right? So I want to become a better gamer. That basically means I want to observe the game, question the game, reflect on the game, reflect on my own skills, reflect on my own beliefs, reflect on my understanding of things, right? And that's how the consciousness invests in the consciousness, not the avatar. Because then, if you're that gamer, the next avatar is easy for you. The next level of the game is easy for you, just because you became a better gamer.
Okay. So you think that consciousness is using us as a vessel to improve?
If the hypothesis is true... it's just a hypothesis; we don't know if it's true. But if this truly is a simulation, then take the religious definition: God puts some of his soul in every human, and then you become alive, you become conscious. Okay? You don't want to be religious? You can say universal consciousness is spinning off parts of itself to have multiple experiences and interact and compete and combat and love and understand and then refine.
I had a physicist say this to me the other day, actually, so it's quite front of mind: this idea that consciousness is using us as vessels to better understand itself, basically using our eyes to observe itself and understand. Which is quite a...
So if you take some of the most interesting religious definitions of heaven and hell, for example, right, where basically heaven is: whatever you wish for, you get. Right? That's the power of God; whatever you wish for, you get. And so if you really go into the depth of that definition, it basically means that this drop of consciousness that became you returned back to the source, and the source can create anything that it wants to create. So that's your heaven, right? And interestingly, if that return is done by separating your good from your evil so that the source comes back more refined, that's exactly, you know, consciousness splitting off bits of itself to experience and then elevate all of us, elevate the universal consciousness. All hypotheses. I mean, please, you know, none of that is provable by science, but it's a very interesting thought experiment. And you know, a lot of AI scientists will tell you that what we've seen in technology is that if it's possible, it's likely going to happen. If it's possible to miniaturize something to fit into a mobile phone, then sooner or later in technology we will get there. And if you ask me, believe it or not, it's the most humane way of handling UBI.
What do you mean?
The most humane way, you know, for us to live on a universal basic income, when people like you struggle with not being able to build businesses, is to give you a virtual headset and let you build as many businesses as you want. Level after level after level, night after night. Keep you alive. That's very, very respectful and humane. Okay? And by the way, the even more humane thing is: don't force anyone to do it. There might be a few of us still roaming the jungles, but for most of us, we'll go like, man... I mean, someone like me, when I'm 70 and, you know, my back is hurting and my feet are hurting, I'm going to go like, yeah, give me five more years of this. Why not?
It's weird, really. I mean, the number of questions that this new environment throws out...
The less humane thing, by the way, just so that we close on a grumpy note, you know, is to just start enough wars to reduce UBI. And you have to imagine that, if the world is governed by a superpower, deep-state type thing, they may want to consider that. The eaters.
What shall I do about it?
About...?
About everything you've said.
Well, I still believe that this world we live in requires four skills. One skill is what I call the tool: for all of us to learn AI, to connect to AI, to really get close to AI, to expose ourselves to AI, so that AI knows the good side of humanity. Okay. The second is what I call the connection, right? So I believe that the biggest skill that humanity will benefit from in the next 10 years is human connection. It's the ability to learn to love genuinely. It's the ability to learn to have compassion for others. It's the ability to connect to people. If you, you know, if you want to stay in business, I believe that not the smartest people but the people that connect most to people are going to have jobs going forward. And the third is what I call truth, because we live in a world where all of the gullible cheerleaders are being lied to all the time. So I encourage people to question everything; every word that I said today is stupid. The fourth one, which is very important, is to magnify ethics, so that the AI learns what it's like to be human.
What should I do?
I... I love you so much, man. You're such a good friend. You're 32, 33 now?
32. Yeah.
Yeah. You still are fooled by the many, many years you have to live.
I'm fooled by the many years I have to live?
Yeah, you don't have many years to live. Not in this capacity. This world as it is is going to be redefined. So live the f out of it.
How is it going to be redefined?
Everything's going to change. Economics are going to change. Work is going to change. Human connection is going to change.
So what should I do?
Love your girlfriend. Spend more time living.
Mhm.
Find compassion and connection to more people. Be more in nature.
And in 30 years' time, when I'm 62, how do you think my life is going to look different and be different?
Either Star Trek or Star Wars.
Funnily enough, we were talking about Sam Altman earlier on. He published a blog post in June, so last month, I believe, or the month before last. He called it "The Gentle Singularity." He said: "We are past the event horizon." For anyone that doesn't know, Sam Altman is the guy that made ChatGPT. "The takeoff has started. Humanity is close to building digital superintelligence."
I believe that.
"And at least so far, it's much less weird than it seems like it should be, because robots aren't walking the streets, nor are most of us talking to AI all day." It goes on to say: "2025 has seen the arrival of agents that can do real cognitive work. Writing computer code will never be the same. 2026 will likely see the arrival of systems that can figure out new insights. 2027 might see the arrival of robots that can do tasks in the real world. A lot more people will be able to create software and art, but the world wants a lot more of both, and experts will probably still be much better than novices, as long as they embrace the new tools. Generally speaking, the ability for one person to get much more done in 2030 than they could in 2020 will be a striking change, and one many people will figure out how to benefit from. In the most important ways, the 2030s may not be wildly different. People will still love their families, express their creativity, play games, and swim in lakes. But in still very important ways, the 2030s are likely going to be wildly different from any time that has come before."
100%.
"We do not know how far beyond human-level intelligence we can go, but we are about to find out."
about to find out. I agree with every word other than the
I agree with every word other than the word more.
word more. So I've I've been advocating this and
So I've I've been advocating this and and laughed at for a few years now. I've
and laughed at for a few years now. I've always said AGI is 2526,
always said AGI is 2526, right? which basically again is a is a
right? which basically again is a is a funny definition but you know my AI has
funny definition but you know my AI has already happened AI is smarter than me
already happened AI is smarter than me in everything everything I can do they
in everything everything I can do they can do better right uh artificial super
can do better right uh artificial super intelligence is another vague definition
intelligence is another vague definition because you know the minute you you pass
because you know the minute you you pass AGI you're super intelligent if if the
AGI you're super intelligent if if the smartest human is 200 IQ points and AI
smartest human is 200 IQ points and AI is 250 they're super intelligent 50 is
is 250 they're super intelligent 50 is quite significant okay third is as I
quite significant okay third is as I said self- evolving. That's the one.
said self- evolving. That's the one. That is the one because then that 250
That is the one because then that 250 accelerates quickly and we get into
accelerates quickly and we get into intelligence explosion. No, no doubt
intelligence explosion. No, no doubt about it. The the the you know the idea
about it. The the the you know the idea that we will have robots do things. No
that we will have robots do things. No doubt about it. I was watching a Chinese
doubt about it. I was watching a Chinese uh company announcement about how they
uh company announcement about how they intend to build robots to build robots.
intend to build robots to build robots. Okay. The only thing is he says but
Okay. The only thing is he says but people will need more of things
people will need more of things right and yes we have been trained to
right and yes we have been trained to have more greed and more consumerism and
have more greed and more consumerism and want more but there is an economic of
want more but there is an economic of spy of supply and demand and at at a
spy of supply and demand and at at a point in time if we continue to consume
point in time if we continue to consume more the price of everything will become
more the price of everything will become zero right and is that a good thing or a
zero right and is that a good thing or a bad thing depends on how you respond to
bad thing depends on how you respond to that.
that. Because if you can create anything in
Because if you can create anything in such a scale that the price is almost
such a scale that the price is almost zero, then the definition of money
zero, then the definition of money disappears and we live in a world where
disappears and we live in a world where it doesn't really matter how much money
it doesn't really matter how much money you have. You can get anything that you
you have. You can get anything that you want. What a beautiful world.
If Sam Altman was listening right now, what would you say to him?
I suspect he might be listening, 'cause someone might tweet this at him.
I have to say that we have, as per his other tweet, moved faster than our ability as humans to comprehend. Okay? And that we might get really, really lucky, but we also might mess this up badly, and either way we'll either thank him or blame him. Simple as that. Right? So, single-handedly, Sam Altman's introduction of AI into the wild was the trigger that started all of this.
It was the Netscape of the internet.
The Oppenheimer.
It definitely is our Oppenheimer moment. I mean, I don't remember who was saying this recently: that orders of magnitude more than what was invested in the Manhattan Project is being invested in AI, right? And I am not pessimistic. I told you openly, I believe in a total utopia in 10 to 12 to 15 years' time, or immediately if the evil that men can do was kept at bay, right? But I do not believe humanity is getting together enough to say, "We've just received the genie in a bottle. Can we please not ask it to do bad things?" Anyone... like, not three wishes; you have all the wishes that you want. Every one of us. And it just screws with my mind, because imagine if I can give everyone in the world universal health care, you know, no poverty, no hunger, no homelessness, no nothing. Everything's possible.
And yet we don't.
To continue what Sam Altman's blog said, which he published just over a month ago: "The rate of technological progress will keep accelerating, and it will continue to be the case that people are capable of adapting to almost anything. There will be very hard parts, like whole classes of jobs going away. But on the other hand, the world will be getting so much richer so quickly that we'll be able to seriously entertain new policy ideas we never could have before. We probably won't adopt a new social contract all at once. But when we look back in a few decades, the gradual changes will have amounted to something big. If history is any guide, we'll figure out new things to do and new things to want, and assimilate new tools quickly. Job change after the Industrial Revolution is a good recent example. Expectations will go up, but capabilities will go up equally quickly, and we'll all get better stuff. We will build even more wonderful things for each other. People have a long-term important and curious advantage over AI: we are hardwired to care about other people and what they think and do, and we don't care very much about machines." And he ends this blog by saying: "May we scale smoothly, exponentially, and uneventfully through superintelligence."
and uneventfully through super intelligence."
intelligence." What a wonderful
What a wonderful wish that assumes he has no control over
wish that assumes he has no control over it. May we have all the ultmans in the
it. May we have all the ultmans in the world help us scale gracefully and
world help us scale gracefully and peacefully and uneventfully. Right.
peacefully and uneventfully. Right. It sounds like a prayer.
It sounds like a prayer. Yeah. May may we have them take keep
Yeah. May may we have them take keep that in mind. I mean think about it. I I
that in mind. I mean think about it. I I have a very interesting comment on what
have a very interesting comment on what you just said. We will see exactly what
you just said. We will see exactly what he described there.
he described there. Right? The world will become richer. So
Right? The world will become richer. So much richer. But how will we reduce
much richer. But how will we reduce distribute the riches? And I want you to
distribute the riches? And I want you to imagine two camps. Communist China
imagine two camps. Communist China and capitalist America.
and capitalist America. I want you to imagine what would happen
I want you to imagine what would happen in capitalist America if we have 30%
in capitalist America if we have 30% unemployment.
unemployment. There'll be social unrest
There'll be social unrest in the streets. Right.
in the streets. Right. Yeah.
Yeah. And I want you to imagine if China lives
And I want you to imagine if China lives true to caring for its nations and
true to caring for its nations and replaced every worker with a robot, what
replaced every worker with a robot, what would it give its it its citizens?
would it give its it its citizens? UBI.
UBI. Correct.
That is the ideological problem, because in China's world today, the prosperity of every citizen is higher than the prosperity of the capitalist. In America today, the prosperity of the capitalist is higher than the prosperity of every citizen. And that's the tiny mind shift.
That's a tiny mind shift. Okay, where the mind shift basically becomes: look, give the capitalists anything they want, all the money they want, all the yachts they want, everything they want.
So what's your conclusion there?
I'm hoping the world will wake up.
What can... you know, there's probably a couple of million people listening right now, maybe five, maybe 10, maybe even 20 million people.
Pressure, Steven.
pressure Stephen no pressure to you mate I don't I don't
no pressure to you mate I don't I don't have the answers
have the answers I don't know the answers either
I don't know the answers either what what should those people do
what what should those people do as I said from a skills point of view
as I said from a skills point of view for things, right? Tools, uh, uh, human
for things, right? Tools, uh, uh, human connection, even double down on human
connection, even double down on human connection. Leave your phone, go out and
connection. Leave your phone, go out and meet humans,
meet humans, okay? Touch people,
okay? Touch people, you know, do it permission's permission,
you know, do it permission's permission, right? Truth. Stop believing the lies
right? Truth. Stop believing the lies that you're told. Any slogan that gets,
that you're told. Any slogan that gets, you know, filled in your head, think
you know, filled in your head, think about it four times. Understand where
about it four times. Understand where your ideologies are coming from.
your ideologies are coming from. Simplify the truth. Right? Truth is
Simplify the truth. Right? Truth is really it boils down to you know simple
really it boils down to you know simple simple rules that we all know okay which
simple rules that we all know okay which are all found in ethics.
How do I know what's true?
Treat others as you like to be treated. Okay? That's the only truth. Everything else is unproven.
Okay. And what can I do from a... is there something I can do from an advocacy, social, political standpoint?
Yes, 100%. We need to ask our governments to start not regulating AI, but regulating the use of AI. Was it the Norwegian government that started to say you have copyright over your voice and look and likeness? One of the Scandinavian governments basically said, you know, everyone has the copyright over their existence, so no AI can clone it. Okay. So my example is very straightforward: go to governments and say, you cannot regulate the design of a hammer so that it can drive nails but not kill a human, but you can criminalize the killing of a human by a hammer. So what's the equivalent? If anyone produces, you know, an AI-generated video or AI-generated content, it has to be marked as AI-generated. You know, we cannot start fooling each other. We have to understand certain limitations of, unfortunately, surveillance and spying and all of that. So, the correct frameworks of how far we are going to let AI go, right? We have to go to our investors and business people and ask for one simple thing, and say: do not invest in an AI you don't want your daughter to be at the receiving end of. It's as simple as that. You know, all of the virtual vice, all of the porn, all of the, you know, sex robots, all of the autonomous weapons, all of the, you know, trading platforms that are completely wiping out the legitimacy of the markets. Everything.
Autonomous weapons.
Oh my god.
People make the case... I've heard the founders of these autonomous weapon companies make the case that it's actually saving lives, because you don't have to...
That is... Would you want... Do you really want to believe that?
I'm just representing their point of view to play devil's advocate, Mo. I heard an interview, I was looking at this, and one of the CEOs of one of the autonomous weapons companies said, we now don't need to send soldiers.
So which lives do we save? Our soldiers. But then, because we send the machine all the way over there, let's kill a million instead of...
Yeah. Listen, I tend to be... it goes back to what I said about the steam engine and the coal. I actually think you'll just have more war if there's less of a cost.
100%.
Just like...
And more war if you have less of an explanation to give to your people.
Yeah. People get mad when they lose American lives. They get less mad when they lose a piece of metal. So I think that's probably logical.
Yeah.
Okay. So, okay. So, I've got a plan: got the tools thing; I'm going to spend more time outside; I'm going to lobby the government to be more aware of this and conscious of this. Okay. And I know that there's some government officials that listen to the show, because they tell me when I have a chance to speak to them. So it's useful. We're all in a lot of chaos. We're all unable to imagine what's possible. I think I suspend disbelief. And I actually heard Elon Musk say that in an interview. He was asked about AI, and he paused for a haunting 11 seconds, looked at the interviewer, and then made a remark about how he thinks he's suspended his own disbelief. And I think suspending disbelief in this regard means just, like, cracking on with your life and hoping it'll be okay. And that's kind of what...
Yeah. I absolutely believe that it will be okay.
Yeah.
For some of us. It will be very tough for others.
Who's it going to be tough for?
Those who lose their jobs, for example. Those who are at the receiving end of autonomous weapons that are falling on their head for two years in a row.
Okay. So the best thing I can do is to put pressure on governments to not regulate the AI, but to establish clearer parameters on the use of the AI.
Yes. Okay.
Yes. But I think the bigger picture is to put pressure on governments to understand that there is a limit to which people will stay silent. Okay? And that we can continue to enrich our rich friends as long as we don't lose everyone else on the path. Okay? And that a government that is supposed to be by the people, for the people (the beautiful promise of democracy that we're rarely seeing anymore) needs to get to the point where it thinks about the people.
point where it thinks about the people. One of the most um interesting ideas
One of the most um interesting ideas that's been in my head for the last
that's been in my head for the last couple of weeks since I spoke to that
couple of weeks since I spoke to that physicist about consciousness who said
physicist about consciousness who said pretty much what you said. This idea
pretty much what you said. This idea that actually there's four people in
that actually there's four people in this room right now and that actually
this room right now and that actually we're all part of the same
we're all part of the same consciousness.
consciousness. All one of it. Yeah.
All one of it. Yeah. And we're just consciousness looking at
And we're just consciousness looking at the world through four different bodies
the world through four different bodies to better understand itself in the
to better understand itself in the world. And then he talked to me about
world. And then he talked to me about religious doctrines, about love thy
religious doctrines, about love thy neighbor, about how Jesus was the, you
neighbor, about how Jesus was the, you know, God's son, the Holy Spirit and how
know, God's son, the Holy Spirit and how we're all each other and how treat
we're all each other and how treat others how you want to be treated.
others how you want to be treated. Really did get my head and I I started
Really did get my head and I I started to really think about this idea that
to really think about this idea that actually maybe the game of life is just
actually maybe the game of life is just to do exactly that is to treat others
to do exactly that is to treat others how you wish to be treated. Maybe if I
how you wish to be treated. Maybe if I just did that, maybe if I just did that,
just did that, maybe if I just did that, I
I I would have all the answers.
I swear to you, it's really that simple. I mean, you know, Hannah and I, we still live between London and Dubai, okay? And I travel the whole world evangelizing what I, you know, want to change the world around, and I build startups and I write books and I make documentaries, and sometimes I just tell myself: I just want to go hug her, honestly. You know, I just want to take my daughter on a trip. And in a very, very interesting way, when you really ask people, deep inside, that's what we want. And I'm not saying that's the only thing we want, but it's probably the thing we want the most. And yet we're not trained, you and I and most of us were not trained, to trust life enough to say, let's do more of this. And I think, as a universal... So Hannah's working on this beautiful book of the feminine and the masculine, you know, in a very, very, you know, beautiful way, and her view is very straightforward. She basically... of course, like we all know, the abundant masculine that we have in our world today is unable to recognize that for life at large, right? And so, you know, maybe if we allowed the leaders to understand that if we took all of humanity and put it as one person, that one person wants to be hugged. And if we had a role to offer to that one humanity, it's not another yacht.
Are you religious?
I'm very religious. Yeah.
But you don't support a particular religion.
I follow what I call the fruit salad.
What's the fruit salad?
You know, I came to a point in time and found that there were quite a few beautiful gold nuggets in every religion, and a ton of crap, right? And so, in my analogy to myself, that was like 30 years ago, I said, "Look, it's like someone giving you a basket of apples, two good ones and four bad ones. Keep the good ones." Right? And so basically I take two apples, two oranges, two strawberries, two bananas, and I make a fruit salad. That's my view of religion.
You take from every religion the good from everyone.
And there are so many beautiful gold nuggets.
And you believe in a god?
I 100% believe there is a divine being here.
A divine being.
A designer, I call it. So, if this was a video game, there is a game designer.
And you're not positing whether that's a man in the sky with a beard.
Definitely not a man in the sky. I mean, with all due respect to, you know, religions that believe that: all of spacetime and everything in it is unlike everything outside spacetime, and so if some divine designer designs spacetime, it looks like nothing in spacetime. So it's not even physical in nature. It's not gendered. It's not bound by time. These are all, you know, characteristics of the creation of spacetime.
Do we need to believe in something transcendent like that to be happy? Do you think...
I have to say, there is a lot of evidence that relating to someone bigger than yourself makes the journey a lot more interesting and a lot more rewarding.
I've been thinking a lot about this idea that we need to level up like that. So, level up from myself to, like, my family, to my community, to maybe my nation, to maybe the world, and then something transcendent. Yeah. And then, if there's a level missing there, people seem to have some kind of dysfunction.
So imagine a world where... When I was younger, I was born in Egypt, and for a very long time the slogans I heard in Egypt made me believe I'm Egyptian, right? And then I went to Dubai, and I said, no, no, no, I'm a Middle Easterner. And then in Dubai there were lots of, you know, Pakistanis and Indonesians and so on. I said, no, no, no, I'm part of the 1.4 billion Muslims. And by that logic, I immediately said, "No, no, I'm human. I'm part of everyone." Imagine if you just suddenly say, "Oh, I'm divine. I'm part of universal consciousness. All beings, all living beings, including AI, if it ever becomes alive."
And my dog.
And your dog. I'm part of all of this tapestry of beautiful interactions that are a lot less serious than the balance sheets and equity profiles that we create
we create that are so simple so simple in terms of
that are so simple so simple in terms of you know people know that you and I know
you know people know that you and I know each other so they always ask me you
each other so they always ask me you know how is Steven like and I go like
know how is Steven like and I go like you may have a million expressions of
you may have a million expressions of him. I think he's a great guy, right?
him. I think he's a great guy, right? You know, of course I have opinions of
You know, of course I have opinions of you. You know, sometimes I go like, oh,
you. You know, sometimes I go like, oh, too shrewd, right? Sometimes to, you
too shrewd, right? Sometimes to, you know, sometimes I go like, oh, too
know, sometimes I go like, oh, too focused on the business. Fine. But core,
focused on the business. Fine. But core, if you really simplify it, great guy,
if you really simplify it, great guy, right? And really, if we just look at
right? And really, if we just look at life that way, it's so simple. It's so
life that way, it's so simple. It's so simple. If we just stop all of those
simple. If we just stop all of those fights and all of those ideologies,
it's so simple. Just living fully, loving, feeling compassion,
loving, feeling compassion, you know, trying to find our happiness,
you know, trying to find our happiness, not our success.
not our success. I should probably go check on my dog.
I should probably go check on my dog. Go check on your dog. I'm really
Go check on your dog. I'm really grateful for the time we keep we keep
grateful for the time we keep we keep doing longer and longer.
doing longer and longer. I know. I know. I just it's so crazy how
I know. I know. I just it's so crazy how I could keep just keep honestly I could
I could keep just keep honestly I could just keep talking and talking because I
just keep talking and talking because I have so many I love reflecting these
have so many I love reflecting these questions on to you because because of
questions on to you because because of the way that you think. So
the way that you think. So yeah today
yeah today today was a difficult conversation.
today was a difficult conversation. Anyway, thank you for having me.
We have a closing tradition: what three things do you do that make your brain better, and three things that make it worse?
Three things that make it better and worse. So, one of my favorite exercises, what I call Meet Becky, makes my brain better. While meditation always tells you to try and calm your brain down and keep it within parameters of "I can focus on my breathing" and so on, Meet Becky is the opposite. You know, I call my brain Becky; a lot of people know that. So Meet Becky is to actually let my brain go loose, and capture every thought. I normally would try to do that every couple of weeks or so, and then what happens is it suddenly is on paper, and when it's on paper, you just suddenly look at it and say, oh my god, that's so stupid, and you scratch it out, right? Or, oh my god, this needs action, and you actually plan something. And it's quite interesting, the more you allow your brain to give you thoughts and you listen. So the two rules are: you acknowledge every thought, and you never repeat one. Okay? So the more you listen and say, "Okay, I heard you. You think I'm fat. What else?", eventually your brain starts to slow down, and then eventually starts to repeat thoughts, and then it goes into total silence. Beautiful practice.
total silence. Beautiful practice. I uh I don't trust my brain anymore. So
I don't trust my brain anymore. So that's actually a really interesting
that's actually a really interesting practice. So I debate a lot of what my
practice. So I debate a lot of what my brain tells me. I debate what my
brain tells me. I debate what my tendencies and ideologies are. Okay. I
tendencies and ideologies are. Okay. I think one of the most uh again in in my
think one of the most uh again in in my uh love story with Hannah, I get to
uh love story with Hannah, I get to question a lot of what I believed was
question a lot of what I believed was who I am even at this age. Okay. And and
who I am even at this age. Okay. And and that goes really deep and it really is
that goes really deep and it really is quite a it's it's quite interesting to
quite a it's it's quite interesting to debate not object but debate what your
debate not object but debate what your mind believes. I think that's very very
mind believes. I think that's very very useful. And the third is I've actually
useful. And the third is I've actually quadrupled my investment time. So I used
quadrupled my investment time. So I used to do an hour a day of reading when I
to do an hour a day of reading when I was younger every single day like going
was younger every single day like going to the gym. And then it became an hour
to the gym. And then it became an hour and a half, two hours. Now I do four
and a half, two hours. Now I do four hours a day.
hours a day. Four hours a day. It is impossible to
Four hours a day. It is impossible to keep up. The world is moving so fast.
keep up. The world is moving so fast. And so that these are uh these are the
And so that these are uh these are the good things that I do. The bad things is
good things that I do. The bad things is I don't give it enough time to to really
I don't give it enough time to to really uh slow down. Uh unfortunately I'm
uh slow down. Uh unfortunately I'm constantly rushing like you are. I'm
constantly rushing like you are. I'm constantly traveling. I have picked up a
constantly traveling. I have picked up a bad habit because of the 4 hours a day
bad habit because of the 4 hours a day of spending more time on screens. That's
of spending more time on screens. That's really really bad for my brain and I uh
really really bad for my brain and I uh this is a very demanding question. What
this is a very demanding question. What else is really bad? Um
else is really bad? Um uh
uh yeah, I've not been taking enough care
yeah, I've not been taking enough care of my health recently, my physical body
of my health recently, my physical body health. I had uh you remember I told you
health. I had uh you remember I told you I had a very bad uh sciatic pain
I had a very bad uh sciatic pain and so I couldn't go to the gym enough
and so I couldn't go to the gym enough and accordingly that's not very healthy
and accordingly that's not very healthy for your brain in general.
for your brain in general. Man, thanks. Thank you for having me.
Man, thanks. Thank you for having me. That was a lot of things to talk about.
That was a lot of things to talk about. Thanks, Steve.
This has always blown my mind a little bit. 53% of you that listen to the show regularly haven't yet subscribed to the show. So, could I ask you for a favor? If you like the show and you like what we do here and you want to support us, the free, simple way that you can do just that is by hitting the subscribe button. And my commitment to you is, if you do that, then I'll do everything in my power, me and my team, to make sure that this show is better for you every single week. We'll listen to your feedback. We'll find the guests that you want me to speak to, and we'll continue to do what we do. Thank you so much.
We launched these conversation cards and they sold out. And we launched them again, and they sold out again. We launched them again, and they sold out again, because people love playing these with colleagues at work, with friends at home, and also with family. And we've also got a big audience that use them as journal prompts. Every single time a guest comes on the Diary of a CEO, they leave a question for the next guest in the diary. And I've sat here with some of the most incredible people in the world, and they've left all of these questions in the diary. And I've ranked them from one to three in terms of the depth, one being a starter question, and level three, if you look on the back here, this is a level three, becomes a much deeper question that builds even more connection. If you turn the cards over and you scan that QR code, you can see who answered the card and watch the video of them answering it in real time. So, if you would like to get your hands on some of these conversation cards, go to thediary.com or look at the link in the description below.
[Music]