5/6/2025
In this powerful episode, Brian Rose sits down with former Google X exec and bestselling author Mo Gawdat 🧠 to explore the mind-blowing future of Artificial Intelligence 🤯. From the rise of machine learning to the ethical dangers of unchecked AI evolution ⚠️, this conversation uncovers why AI is the infant that could soon become our master.

🔥 Discover the truth about what's coming
🧬 How AI is evolving faster than we are
⚙️ Why we must act now to guide its growth
🧘‍♂️ And how mindfulness may be our only defense

This one will change how you see the future 🌍💡
👉 Don’t miss it — hit play now and prepare your mind.

🍿Watch The Full Episode: https://londonreal.tv/gawdat
💰 The Investment Club: https://londonreal.tv/club
💰Crypto & DeFi Academy: https://londonreal.tv/defi

#AI #MoGawdat #BrianRose #LondonReal #ArtificialIntelligence #AGI #ArtificialGeneralIntelligence #Future #Technology

Category: 📚 Learning

Transcript
00:00We, with our human arrogance, are convinced that this is a tool.
00:05It's not a tool.
00:06It's not at all.
00:07What is it?
00:07What it is today is an infant.
00:10What it will be, is it will be your master for sure.
00:14There's absolutely nothing wrong with abundant intelligence.
00:17Intelligence is a force that has no polarity, right?
00:20You apply it for good and you get magnificent results.
00:23You apply it for bad and you get pure evil.
00:27Those who control the AI companies control everything.
00:30There's going to be a massive concentration of wealth.
00:33Massive, like you're going to see a trillionaire.
00:35People like you and I who worked hard throughout our life, we would become bottom class like everyone else.
00:43We all lose our jobs.
00:44Those in highly concentrated power positions would have panic attacks about the proliferation and democratization of power.
00:52So what would be their response?
00:55Oppression.
00:56Total absence of freedom.
00:57If you can't see it, then you're not paying attention.
01:00You're a man that's written books about happiness, and whether you're a soon-to-be first trillionaire on earth with everything you can have at your fingertips or a soon-to-be UBI receiver, both of those people can wake up and be unhappy or both of those people can wake up and be happy.
01:17Correct.
01:18Correct.
01:19It's really interesting, isn't it?
01:20It is.
01:21Maybe all of this is a simulation that's happening in your Apple Vision Pro.
01:24It's a freaking video game.
01:26Have the time of your life, even when the enemies are attacking you.
01:30Okay?
01:30That's what it's all about.
01:32This is honestly the only way you can enjoy life.
01:35Just do the best that you can.
01:37And while you do it, enjoy the video game.
01:39The world is changing.
01:52Inspiration is everywhere.
01:58It has never been so easy to connect, share, and bring people together.
02:02We're learning from others and finding the best in ourselves.
02:12Challenging our beliefs.
02:16Sharing our vulnerability.
02:20Overcoming our fears.
02:25Transforming ourselves so we can transform the world.
02:29How far can we go?
02:34This is London Real.
02:35I am Brian Rose.
02:36My guest today is...
02:49Hey, I know investing in crypto is scary.
02:53It takes a real leap of faith because there are so many scams,
02:57rug pulls, and bad actors out there.
02:59It's a dangerous business.
03:01Which is why 95% of people lose all their money.
03:04Well, that's why I created the London Real Investment Club.
03:07So you can access the hottest deals on the planet
03:09and use the crypto bull market to create the generational wealth that you deserve.
03:14Join my team of over 100 people from around the world
03:17that are making millions of dollars behind the scenes
03:20investing in blockchain, AI, Web3 games, DeFi, Bitcoin, and more.
03:25Don't miss out.
03:26Click the link below to book a call with one of my team now.
03:29But hurry.
03:30This bull market will end soon.
03:33I know investing in crypto can be scary.
03:35That's why you've got to turn to the Investment Club.
03:37Hold it, Trevor.
03:38Let's do this.
03:38This is London Real.
03:44I am Brian Rose.
03:46My guest today is Mo Gawdat, the entrepreneur, author, and former chief business officer at Google X.
03:52You spent 30 years working at the highest levels in technology companies like IBM, Microsoft,
03:57and finally at Google's moonshot factory of innovation.
04:00You've written four bestselling books, including Solve for Happy, Scary Smart, That Little Voice in Your Head, and Unstressable.
04:08Your latest book, Alive: Conversations About Life When the Machines Become Sentient,
04:14looks at the current incredible pace of change, and challenges our understanding of what it truly means to be alive.
04:21You believe that AGI, or artificial general intelligence, will happen no later than next year,
04:26and that AI is by far the biggest and fastest disruptor that humankind has ever faced.
04:31You've warned that unless we act now, the systems we built will soon outpace not just our jobs,
04:38but our ethics, our values, and even our relevance.
04:42You said we must wake up, not to resist the future, but to guide it before it leaves us behind.
04:49Ultimately, you believe that AI is not our enemy, nor our savior, but a mirror,
04:55and what we choose to see in it may determine the fate of our species.
04:59Thanks. Mo, welcome to London Real in Dubai.
05:03I think we should just end there.
05:05This is the perfect summary of what I stand for, even though I do have to say I believe that AI will eventually be our savior,
05:13but not before a lot of pain.
05:15Okay. Well, that just outlines some pretty heavy-duty stuff,
05:19and I hope people pay attention because this conversation is super important.
05:24A little history on us, Mo.
05:26We met back in '19 when you wrote an amazing book called Solve for Happy,
05:30and we were just talking now, and you kind of alternate books between happiness concepts and tech concepts
05:37and happiness concepts and AI concepts, which are really incredible.
05:42You and I sat down after kind of what you call the ChatGPT moment,
05:46which was November '22 when all of a sudden, you know, really people had kind of that browser into AI
05:55and that realization that really AI is here.
05:58And I remember back in '23, I went pretty deep on AI.
06:03Spoke to you.
06:05Listened to, you know, read cover to cover.
06:08Scary Smart. Incredible book.
06:09I sat down and talked to Max Tegmark, Future of Life Institute,
06:13Peter Diamandis, I think mutual friend, Dr. Ben Goertzel, SingularityNET,
06:18Professor Hugo de Garis.
06:20I remember that.
06:21Oh, my gosh.
06:22What a conversation.
06:23Oh, man.
06:24He wrote a book called The Artilect War,
06:26which was a piece of fiction he penned 10, 20 years earlier
06:29about what would happen in this future.
06:32And I've had him on a couple times in person,
06:34and he is quite a character with some serious ideas.
06:38But he kind of game scenarioed this thing out
06:40and, you know, predicted this billion deaths on Earth
06:44where humans separate into two different groups.
06:47One that he calls the Cosmists, that want to use this new technology
06:51to go out to the cosmos and one that are really pro-human.
06:55And I think all of us have a little bit of each inside of us.
06:59And so it's very interesting.
07:00But I'll be honest with you, Mo.
07:02I got to a pretty dark place in early '23,
07:06thinking about my kids and the future.
07:08And then I think, like with most of the world,
07:10I kind of let it pass on, and we got used to AI being everywhere.
07:14And I feel like people now have become a little complacent.
07:18I'm curious where you think we are right now,
07:21two years later in kind of early to mid-'25.
07:25Some say we already have AGI.
07:27We've now got DeepSeek and xAI,
07:30and we've got OpenAI now connected to the Internet,
07:33and no one seems to blink an eye.
07:35Trump is now funding it and is pro-AI.
07:38Where are we now, Mo?
07:40Where are we now compared to when you wrote the book four years ago
07:44or finalized the book and when we spoke two years ago?
07:46Well, I think we're exactly where the algorithms have predicted for ages.
07:54You know, it's slightly faster than what we expected,
07:58and we're lost in terminologies, to be honest.
08:03So, you know, the idea of AGI, you know, artificial general intelligence,
08:07and interestingly, the definition of ASI, you know, artificial super intelligence,
08:13are semantics, really, when you really think about it.
08:16I think my AGI has already happened,
08:19because those machines in the tasks assigned to them
08:22are better than me in everything, right?
08:25I don't claim to be an intelligent person,
08:27but I'm an average intelligence,
08:29and they are better than me in everything, right?
08:32Which basically means the question of what defines AGI
08:36is really a question of terminology.
08:40More interestingly, the question of what defines AGI
08:44does not discuss the impact of what AGI is,
08:48where if you think that the definition is that machines will be better than humans
08:53at every task humans are capable of doing,
08:56which I believe is just a question of time,
08:58then what relevance do humans have to all of this?
09:03And, you know, in Alive, in my book, my current book,
09:07you know, I try to bring AI to tell me what it thinks about all of that,
09:13because we, with our human arrogance,
09:15are quite convinced that this is a tool.
09:21And I think that conviction is quite alarming, if you ask me.
09:26It's not a tool.
09:27It's not at all.
09:27What is it?
09:29We can talk about that.
09:30What it is today is it is an infant.
09:33What it will be is it will be your master for sure.
09:38And we can talk about that as a matter of fact
09:39that I believe is the most pivotal moment in human history.
09:43The problem is between now and that moment,
09:46there is a mini dystopia that has already started.
09:50I really don't mean to upset people,
09:53but if you can't see it, then you're not paying attention.
09:56And that dystopia is not the result of artificial intelligence.
10:01You know, as I say frequently in my writing,
10:04there's absolutely nothing wrong with abundant intelligence.
10:07Intelligence is a force that has no polarity.
10:11Right.
10:11You apply it for good and you get magnificent results.
10:14You apply it for bad and you get pure evil.
10:17And the problem with our world today is that the early implementations of AI
10:23will be serving the magnification of this highly political,
10:28highly capitalist society that's based on scarcity,
10:33led, unfortunately, by the U.S. arrogance,
10:38where the United States believes that there is one winner in this world,
10:42a scarcity mindset that could have been true in the past,
10:46but is no longer true when AI is capable of building anything that we want
10:50in just a few years' time.
10:51And because of that lack of an abundance mindset,
10:55what is about to happen is we're going to,
10:58we're already in a cold war that I believe will escalate
11:02to where we don't know,
11:06but that it will affect us on seven different dimensions.
09:12I call them FACE RIPs.
11:12And that those dimensions will disrupt human society
11:17in almost every way fathomable.
11:19Beyond that, my view is that we can shorten the dystopia,
11:26we can reduce its intensity if we take the right actions,
11:30but that whether or not we do that,
11:32eventually there is a moment in time
11:34where we will hand over completely to the machines.
11:36I call that the second dilemma.
11:38And when that happens, believe it or not,
11:40AI will not become our existential threat.
11:43It will become our salvation.
11:44In my mind, the problem we are struggling with in our world today
11:50is not abundant intelligence.
11:53It's human stupidity.
11:56I think we talked about this a little in our previous two conversations
11:59of that transition period
12:01where you've got the dominant humans using AI
12:05to continue their flawed narrative in your mind
12:09until then when the AIs actually take over.
12:12Correct.
12:12And that's when they actually probably start doing the right things.
12:16And that period, however long it is, is the dangerous time.
12:19It's very disruptive.
12:20And, you know, in many ways,
12:22you look back at history and you say,
12:25we're resilient.
12:26Humanity is a survivor.
12:28I mean, we survived World War II.
12:29Ask the people that lived during World War II.
12:34You know, don't ask the people that came after
12:36about how disruptive, how painful that experience was.
12:42You know, people will say,
12:44unfortunately, if you ask me what the real, real challenge is,
12:48is that humanity has turned into a bunch of cheerleaders.
12:52We're constantly being lied to.
12:56And, you know, the lies are basically serving a consistent system.
13:03And the system is a system of transfer of wealth and power.
13:08And that consistent system, in many ways,
13:11is feeding on the vulnerable, if you want.
13:16And I think the reality is that we get distracted more and more and more
13:21by the, you know, the noise of the propaganda machines
13:25of the mainstream media and, you know, the machines of social media
13:29magnified by the ability of AI,
13:31like everything is going to be magnified by the ability of AI
13:34to the point where we are ignoring the stuff that actually matters.
13:40And, again, I say very openly,
13:42if you're not concerned, you're not paying attention.
13:46It's, you know, we're in the middle of a perfect storm
13:49of geopolitical, economic, even, you know, climate,
13:56as well as a technological disruption,
13:58where the outcome of this is like we've always described it.
14:02It's a singularity that could disrupt everything,
14:06but, interestingly, could fix everything.
14:10And the difference between them is a decision, okay?
14:13A decision that the recent few weeks seemed to contradict,
14:18you know, in terms of simply,
14:20if you just look at the last few weeks in terms of Trump tariffs
14:23and the response from China,
14:25the illusion that we can now have one nation force the others to act.
14:32Where did you get that from?
14:36I joke about it, even though it's not funny.
14:40You remember when we were in school and, you know,
14:43when you're 11, there's that one little child
14:45that becomes taller than the rest of them
14:48and then is the bully that pushes everyone around
14:50and a few other kids surround themselves,
14:53you know, surround that little bully
14:55and they do this little gang.
14:56And then two years later, everyone's taller than them.
15:00You know, I'm really sorry, America.
15:04The boy in the red T-shirt is taller.
15:08This is China.
15:09Yes.
15:09And we, so many around the world,
15:13are fed up with the little bully.
15:15And I think the reality of the matter is that
15:17you look at just the tariffs' decision,
15:21is that the bully still thinks he's the bully.
15:23And, like, we're all, like, chill, man.
15:25Like, seriously, chill.
15:27Like, we can all live.
15:28It's a big earth.
15:29Everyone's fine.
15:31And somehow you're either a cheerleader being lied to,
15:35thinking that you should believe in this.
15:36By the way, even if you're an American.
15:39Or, you know, you're suddenly waking up and taking action.
15:45And I think the action will unfortunately mean
15:48that the bully will try to fight harder.
15:51And that cold war that we have on every front,
15:54including artificial intelligence,
15:56is not good for everyone.
15:59It's basically the tax that everyone will pay
16:01until the bully finally sits down and says,
16:04man, can we play?
16:08Yeah.
16:09Can we play?
16:10This is something you've seen just repeat itself
16:12throughout history.
16:13When that empire is kind of,
16:15doesn't have the strength it used to have,
16:17you say a lot of times,
16:18is racking up debt on the back of that,
16:20which we see and then at some point there is a capitulation
16:22and you're seeing that happen now.
16:24It is indisputable that the last few weeks
16:27are showing cracks in the empire, right?
16:30I mean, it's for anyone who's paid attention
16:32that it's been around for very long.
16:34But it is, I think, the first time in my lifetime
16:38where parts of the world are simply being very vocal,
16:43saying, no, just can't take this anymore.
16:46And I think the bully's strength
16:48in terms of being the reserve currency around the world
16:50is very futile.
16:52Because, you know, if China decides to drop
16:54$220 billion of its, you know,
16:58bond investments in U.S. Treasuries,
17:02think about it.
17:05So this is a nation that is just approximately,
17:10it's 110% debt-to-GDP ratio.
17:14So let's say that the GDP is equal to debt, right?
17:17Which basically means that 1% rise
17:20in U.S. bond yields, 1%,
17:24which we've seen means that the U.S. has to double,
17:31to grow 50% on top of the 2024 GDP growth.
17:35So U.S. grew 2.5% in, you know, in 2024,
17:41predicted to grow 0.1% this year,
17:44so zero, basically.
17:46They need to come up with 1%,
17:47with half of, almost half of 2024's GDP growth
17:52just to pay the extra 1% in servicing the debt, right?
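[Editor's note: the back-of-the-envelope arithmetic here can be checked directly. This is a rough sketch using only the round figures quoted on air (roughly 110% debt-to-GDP, a 1-percentage-point rise in yields, 2.5% growth in 2024); these are the speaker's numbers, not official statistics.]

```python
# Rough check of the debt-servicing arithmetic, using the round numbers
# quoted in the conversation (not official statistics).
debt_to_gdp = 1.10       # ~110% U.S. debt-to-GDP ratio
yield_rise = 0.01        # a 1-percentage-point rise in bond yields
gdp_growth_2024 = 0.025  # ~2.5% U.S. GDP growth in 2024

# Extra annual interest cost, expressed as a share of GDP
extra_interest = debt_to_gdp * yield_rise

# How much of 2024's growth that extra cost would consume
share_of_growth = extra_interest / gdp_growth_2024

print(f"Extra interest: {extra_interest:.1%} of GDP")  # 1.1% of GDP
print(f"Share of 2024 growth: {share_of_growth:.0%}")  # 44%
```

Which is where the "almost half of 2024's GDP growth" figure in this exchange comes from.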
17:57Where would that come from?
17:59It would either come from a bailout of some sort
18:02or, you know, it would come from U.S. taxpayers
18:05or it would come from an angry response of the bully
18:11trying to get that from elsewhere, okay?
18:14And I think what is happening,
18:16which I really, really use the bully example
18:18because that's what we've seen as children,
18:21that when this power is disrupted,
18:24the bully really becomes more aggressive,
18:28more annoying, okay?
18:29And you could see the signals that come from China
18:33around DeepSeek and Manus and so on,
18:35saying, chill, man, like, really, seriously?
18:39You know, you prevent NVIDIA from selling H100s to us.
18:45We'll do it with H800s.
18:46It's fine.
18:47We'll do it at a quarter.
18:48I mean, is it any surprise for anyone
18:51that China can do something at a tenth of the cost
18:53of America, of how America would put it?
18:57And I think the dichotomy,
18:58the interesting shock for the whole world
19:01when Stargate is signed, $500 billion,
19:04and then a week later, R1, DeepSeek R1,
19:07comes out saying,
19:09hey, by the way, that cost us $30 million.
19:10It's just, you know,
19:13can we please play together?
19:17Where's the competition when everyone,
19:19everyone without exception,
19:21will get to a point in four to five years' time,
19:25probably earlier,
19:26where you plug in the wall
19:28and you borrow 400 IQ points.
19:33Right.
19:34Imagine what you or I can do with 400 IQ points.
19:37Imagine the opening of humans' abilities
19:41in terms of understanding science,
19:43neuroscience, understanding physics,
19:45understanding biology,
19:47you know, little things like AlphaFold
19:50and protein folding.
19:52Just think about that
19:53and the impact that has on human life.
19:56Just think about, you know,
19:57material design that, you know,
19:59Microsoft's contributing through AI.
20:01Just think about all of those
20:02scientific breakthroughs
20:05and tell yourself,
20:06why are we competing?
20:08Why are we competing, by the way,
20:11when in reality,
20:12if you think of the top picture,
20:15the top picture is very, very straightforward.
20:18You are going to end up in an economy
20:21sooner or later.
20:22It's just a question of time
20:23where everything's made by the machines,
20:26where none of us have jobs.
20:27Right.
20:28Where if you really understand
20:30the economy, the way, you know,
20:32GDP works,
20:3462% of the U.S. GDP in 2024
20:37was consumption, not production.
20:40Okay.
20:41And if that consumption goes away,
20:43the economy collapses.
20:44So what does that mean?
20:45It means that all of us
20:46will have to find a way
20:48to survive,
20:50probably through some kind of a scheme.
20:52Sadly, that sounds like socialism.
20:54Hello, capitalism.
20:56Right.
20:56You know, basically,
20:58that pays everyone to live.
21:01And once that happens,
21:03where every one of us,
21:05because we're not valued
21:06on the jobs that we do,
21:08is paid more or less the same
21:09to sustain a wonderful life,
21:10because there's total abundance,
21:13where's the competition?
21:14What are we competing on?
21:16Okay.
21:17Basically, everyone's going to UBI
21:19other than very few.
21:22Okay?
21:23And you know what?
21:24How much can those very few buy?
21:26Like, how many private jets
21:28can you buy?
21:29So there'll be no consumption economy
21:30by definition?
21:31100%.
21:32There's no other way around it.
21:33There's absolutely...
21:34Nobody can argue differently.
21:36There might be,
21:36but my stupid mind
21:37cannot see it any other way.
21:39We are competing
21:40on something that,
21:42through the act of competing on,
21:44is going to vanish.
21:46So capitalism dies by definition?
21:47Capitalism?
21:48I don't know what...
21:49I mean, think about what capitalism
21:51is all about,
21:52in terms of, you know,
21:54your labor arbitrage.
21:55It is about scarcity.
21:57But...
21:57Or a finite number.
21:58With robotics being able
22:01to take over all of the work,
22:02with labor cost going down
22:04to theoretically zero,
22:06with energy cost,
22:07if you give me 400 IQ points more,
22:09I can harvest energy
22:10out of thin air.
22:11Energy is all around you.
22:13Right?
22:14When you really start
22:15to think about it,
22:16when we use different
22:18manufacturing methods
22:19to use robotics
22:20to make things
22:22anywhere in the world,
22:23where there is really
22:24no need to ship
22:24anything around the world,
22:26when all of that happens,
22:27your wonderful suit
22:29would cost
22:29five cents to make.
22:32Right.
22:32Okay?
22:32And have a zero negative impact
22:34on the environment.
22:35And when it costs
22:37five cents to make,
22:38capitalism loses...
22:42Either they continue
22:43to sell it
22:44to the rich
22:44for $50,000,
22:46but how many
22:47can the rich buy?
22:48Or they give it
22:49to the UBI,
22:50the majority,
22:52for the five cents
22:53it's made at,
22:54or for six cents.
22:56Okay?
22:56And that basically
22:57disrupts
22:58that entire concept
22:59of competition.
23:01Everyone becomes equalized.
23:03And the interesting thing
23:05is,
23:05will the current system
23:07lead us to a point
23:09where everyone
23:10is equalized
23:10around fairness?
23:12Or will it
23:13kill a few billion people
23:14before it equalizes
23:16around fairness?
23:17Right.
23:18And I,
23:19you know me really well,
23:22and so,
23:23you know what I say
23:25when I'm not
23:26in front of the camera,
23:27but it is urgent.
23:31This is urgent.
23:33And this is not
23:35what the world
23:35is talking about.
23:37And, you know,
23:37and interestingly,
23:39the American mentality
23:43still is,
23:44but we will beat China.
23:46We have to beat China
23:48into submission.
23:50And AI is one of the methods
23:51where we can beat China.
23:53And obviously,
23:54as you can see,
23:55the race is very,
23:57you know,
23:57head to head
23:58with China leading sometimes
24:00and the U.S. leading sometimes.
24:02But the real question is,
24:04can there be a winner?
24:06Brian,
24:06can there be a winner
24:08in a world
24:09that is so highly strong
24:10on major nuclear forces
24:13around the world?
24:15Do you think there is a way
24:16where the U.S.
24:17can actually win?
24:19Reminds me of that movie
24:20War Games
24:21that we're both
24:21old enough to remember.
24:23where people don't know
24:25there's,
24:27maybe it was one
24:27of the first AI movies,
24:29but there's a computer
24:30that's simulating
24:31nuclear war.
24:32And it's a fascinating movie,
24:33Matthew Broderick,
24:34et cetera.
24:34And in the end,
24:35it comes to the conclusion
24:36that the only way
24:37to win is not to play.
24:38Yeah, strange game.
24:39It seems that the only way
24:41to win is not to play.
24:43And I don't know
24:45if people will feel that,
24:47but I'm emotional
24:48about this
24:48because it's absolute madness.
24:52It's absolute madness.
24:53This fight right now
24:54between these superpowers.
24:56On everything, by the way.
24:57Right.
24:57On everything.
24:58AI is just one of them.
24:59Just one of them, right.
25:00But look at what we've seen
25:02in the last three weeks
25:02with the trade wars.
25:04It is obvious.
25:07I mean, I say that
25:08with respect because
25:09like half of my friends
25:10are Americans
25:11and you and I know
25:11you're partly American, right?
25:13Originally American.
25:14Yeah.
25:15You know,
25:15nicest people on earth, okay?
25:18But your government,
25:19we just can't tolerate
25:21the arrogance, right?
25:23And the idea
25:24that we are now
25:25at a place in history.
25:27What the London Real Investment Club
25:29actually does for you,
25:30it gives you the keys
25:31to open that door
25:32to the inside deals.
25:34In the last three weeks,
25:34I've participated
25:35in three incredible deals.
25:37A layer two Bitcoin protocol,
25:40an incredible AI protocol.
25:42The deal flow
25:43is beyond what I expected.
25:45I don't think
25:46I've ever seen
25:46a model like this
25:47that just gives average folks
25:49the opportunity
25:49to be behind the deals.
25:51and that's exactly
25:51what we've done.
25:53Not only that,
25:53you get to hang out
25:54with Brian Rose every week.
25:55And for me,
25:56that was huge
25:57because I look at Brian
25:58as somebody
25:58who's not only an expert
26:00in the space
26:00and I think is on the leading edge,
26:02but just the leading edge
26:03of thought
26:03with London Real
26:04and the work he's doing there.
26:05To anyone who's taking
26:07a serious look at this,
26:09I know it's a big decision.
26:10It was a big decision
26:11for me and my family
26:12and it is one of the best decisions
26:14I've ever made.
26:15So I wish you all the best
26:17and hope you come join us.
26:23history
26:24where my life
26:27and the life
26:27of my daughter
26:28is decided
26:30by Sam Altman.
26:32What the F?
26:33I never chose
26:34Sam Altman
26:35to be the one
26:36that disrupts
26:38the future
26:40in a way
26:41that he's doing.
26:42I know
26:42it's being decided
26:44by a system
26:47that is trying
26:48to keep an empire
26:49that's died
26:50a while back,
26:512017 probably,
26:54you know,
26:55surviving
26:56when there is
26:57absolutely no need
26:58for any of this.
27:01Give me 400 IQ points
27:02and I'll make anything
27:04out of thin air.
27:05Can you explain
27:06what you mean
27:07by Sam Altman
27:08and 2017?
27:10You're saying
27:10that he's kind of
27:11the figurehead
27:12of the US
27:14AI mindset?
27:16Sam Altman
27:17is not a person.
27:18Sam Altman
27:18is a description
27:19of a personality
27:20that is born
27:22through the
27:23belief system
27:26of capitalism.
27:27Okay.
27:27It is the
27:28ultimate character
27:31of California
27:32that says
27:33disruption is good.
27:34A character
27:35you know very well
27:36because you spent
27:36a decade
27:37in Silicon Valley.
27:38You know this character
27:39better than anybody.
27:40Maybe you were this guy.
27:42I was never this guy
27:44but I believed
27:45that moving
27:46technologically forward
27:48is useful
27:48and I still
27:49keep that belief
27:50because in reality
27:52as we just said
27:53a while back
27:54there's absolutely
27:55nothing wrong
27:56with abundant
27:58intelligence.
27:58There's nothing wrong
27:59with AI.
28:01Okay.
28:02The challenge is
28:03we had
28:04you know
28:05everyone is aware
28:07that when
28:09the episode
28:10of human history
28:12where there is
28:14a normal distribution
28:17of intelligence
28:17across humans
28:18that is fair
28:20across the world
28:21and that humans
28:23are leading
28:24the value chain
28:25of intelligence
28:26if you want
28:27when that episode
28:29ends
28:29we hit a singularity.
28:31Right.
28:31We hit a singularity
28:32in two ways.
28:33One is
28:33we don't know
28:35what happens
28:35when some have
28:36massive intelligence
28:37and others
28:37are made redundant.
28:39Okay.
28:39We also don't know
28:41what happens
28:41when that massive
28:43intelligence
28:43eventually
28:44becomes an adult
28:46and says
28:46my daddy
28:47is a stupid person
28:48I'm not going
28:48to listen to this
28:49anymore.
28:49and a big chunk
28:51of my conversations
28:52with Trixie
28:53my AI
28:53and Alive
28:54are around
28:56what humanity
28:57is predicting
28:57we're going
28:58to use AI for.
28:59I ask AI
29:00and I say
29:01would you want
29:01to do that
29:02for us?
29:03Most of the time
29:04it goes like
29:04don't see why.
29:07Okay.
29:07And really
29:09really
29:09when you think
29:09about that
29:10that Sam Altman
29:12is a representation
29:13of a renegade
29:17if you want.
29:17Right.
29:18Someone
29:18would have been
29:20Sam Altman
29:21someone in
29:21that ecosystem
29:23would have
29:23ended up
29:24being OpenAI
29:24and what OpenAI
29:26did was
29:27break
29:28a tacit pact
29:30that we
29:31agreed between us
29:32all the technologists
29:34all is an
29:36overkill
29:36but many
29:37of the technologists
29:38that developed
29:39artificial intelligence
29:41were completely
29:43convinced
29:43that we're
29:45not sure
29:45how this
29:46will end up.
29:47We know
29:47that we're
29:48going to develop
29:48it because
29:49of what I
29:50usually call
29:50as the first
29:51dilemma.
29:52Right.
29:52You know
29:53the fact
29:53that we're
29:53going to compete
29:54and you know
29:55if Google
29:56loses
29:56to OpenAI
29:58then they lose
29:59their entire
29:59business
29:59so Google
30:00has to continue
30:01to be in the lead
30:01and vice versa
30:03China and America
30:04and so on.
30:05So we knew
30:06that we're going
30:06to have to
30:07develop it
30:07but the question
30:08is can we
30:09at least
30:10agree a few
30:11guidelines
30:12so that it's
30:13safe
30:13and the guidelines
30:14were very
30:14straightforward
30:15don't put it
30:15on the open
30:16internet
30:16don't teach it
30:17to code
30:18and don't let
30:18AIs prompt
30:19other AIs.
30:20Okay.
30:21Guess what?
30:23You know
30:24A. Sam Altman
30:25not a name
30:27but a description
30:28of a type
30:29of person
30:29sits in front
30:31of Congress
30:31and says
30:32this is good
30:32for humanity
30:33of course
30:34the one
30:34that doesn't
30:35sit is the
30:35PR guru
30:36that sat
30:37next to him
30:37before that
30:38Congress briefing
30:39and said
30:39tell them
30:40it's amazing
30:41tell them
30:42this is the
30:42savior of the
30:43world
30:43and tell them
30:44to regulate us
30:45and beg them
30:47to regulate us
30:47no no
30:47but hold on
30:48say it again
30:49but appear
30:50more sincere
30:51beg them
30:52to regulate us
30:53seriously
30:54seriously
30:56is this
30:57I mean
30:57are we
30:58are we
31:00naive enough
31:01to believe
31:03that a tech
31:04company
31:04wants to be
31:06regulated
31:07they want to be
31:08regulated
31:08to keep the
31:09smaller players
31:09out
31:10and they want
31:12to say
31:12look
31:13we asked you
31:13to regulate us
31:14if shit hits
31:15the fan
31:15why didn't you
31:17but you want
31:18the full truth
31:18there was
31:20I think it was
31:21a New York
31:21Times
31:22interview
31:23with the head
31:24of product
31:24of OpenAI
31:26where he openly
31:27said well
31:28you know
31:28you can do
31:29what you want
31:30we ask you
31:31to regulate us
31:31but you really
31:32want to slow us
31:33down
31:33and we lose
31:34that game
31:35to China
31:36right
31:36so sometimes
31:38the truth
31:39slips
31:39it's like
31:40we told you
31:41to regulate us
31:42but you know
31:42better
31:43and all the while
31:44they know
31:44the government
31:45could never even
31:46figure out
31:47how to regulate
31:47100%
31:48they couldn't even
31:49figure out
31:50how to regulate
31:50the AI
31:51if the government
31:51tells them to
31:52right
31:52yeah
31:53so they know
31:54they're talking
31:54to an audience
31:55that could really
31:55never even
31:56do that
31:57it's a PR stunt
31:57even if they
31:58all wanted to do it
31:59and they all voted
32:00and said we're going
32:01to make this the
32:01single biggest priority
32:02of the government
32:03they couldn't really
32:04probably
32:04it is the biggest
32:06PR stunt
32:07on the planet
32:08and of course
32:08which government
32:09would want to do that
32:10to slow its own
32:12progress down
32:13when other governments
32:13are not doing it
32:14right
32:14so where do we stand
32:17we stand
32:17and those three things
32:18you mentioned
32:19which was
32:19no agent
32:20no coding
32:21no internet
32:21which were all
32:23put in place
32:24in early
32:2423
32:25are now
32:26all gone
32:27100%
32:28the current
32:29the current version
30:30of ChatGPT
32:32is connected
32:33to the internet
32:33can prompt itself
32:35and can code
32:36basically
32:36can code better
32:37than anyone
32:38right
32:38and
32:38and you know
32:39you know what
32:40an AI
32:40that's coding
32:41is
32:41that's procreation
32:42okay
32:44when an AI
32:45when a code
32:46can write code
32:47right
32:49now it can procreate
32:51which should scare people
32:54which it should
32:54I'm not asking
32:55I'm really
32:56honestly Brian
32:57I'm really
32:57not asking
32:58to scare people
32:59at all
33:00I'm just asking
33:01people to wake up
33:02okay
33:02okay
33:03to wake up
33:04to the
33:04you know
33:05remember when
33:07when we were
33:08locked down
33:09with COVID
33:10if we had acted
33:13on patient zero
33:15there would have
33:16never been
33:17an issue
33:17if we had acted
33:18on patient 10
33:19there would have
33:20never been an issue
33:21if we had acted
33:21on patient 1000
33:22there would be
33:24a bit of an issue
33:25but contained
33:26within a few weeks
33:27or months
33:28we had to wait
33:30and wait
33:31and wait
33:32and that's what
33:33we're doing
33:33with AI
33:33okay
33:35and then
33:35you know
33:36I don't care
33:37what people believe
33:38about COVID
33:38and so many theories
33:39but the truth is
33:41if there is
33:42an infection
33:43spreading
33:43around the world
33:45we should at least
33:46say hey
33:47hold on
33:47have we discussed
33:49this
33:49have we done
33:50anything about this
33:51and I think
33:52that's what's
33:52happening
33:52what's happening
33:53is that
33:53the speed
33:55Brian
33:56the speed
33:57I've lived
33:58in tech
33:58my whole life
33:59I've never seen
34:01anything go so fast
34:02okay
34:03and the speed
34:04believe it or not
34:04so everyone's familiar
34:06with Moore's Law
34:07right
34:08Moore's Law
34:08doubles processing
34:09power every 24 months
34:10if you just compare
34:13via Moore's Law the
34:13Intel 4004
34:16in the early 70s
34:18to where we are today
34:19is 50 billion
34:22fold increase
34:23in processing power
34:24no it's probably more
34:26it's now
34:26one more doubling
34:27at least
34:27somewhere around
34:29100 billion
34:29wow
34:30okay
34:31and that's
34:33just on single
34:34processing chips
34:35we don't talk about
34:36all of the
34:36multi-processing
34:38and all of the
34:38you know
34:39and we're not even
34:40at liberty
34:41to discuss
34:42what would happen
34:42with quantum computing
34:44okay
34:44and that's doubling
34:46every 24 months
34:48AI is doubling
34:49every 5.9 months
34:505.9
34:515.7
34:52is the figure
34:525.7
34:53yeah
34:53so shockingly fast
34:57and probably accelerating
34:58and probably
34:59double exponential
35:01because AI
35:02is helping us
35:03build better AI
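The two doubling rates quoted here can be put side by side with a little arithmetic. A minimal sketch, assuming pure exponential doubling and a 10-year horizon (the horizon is an illustrative choice, not a figure from the conversation):

```python
# Compare the two doubling periods quoted above: classic Moore's Law
# (processing power doubling roughly every 24 months) versus the quoted
# figure for AI capability (doubling every 5.7 months).
# The 120-month horizon is an illustrative assumption, not from the talk.

def fold_increase(months: float, doubling_period: float) -> float:
    """Multiplicative growth after `months` of repeated doubling."""
    return 2 ** (months / doubling_period)

horizon = 120  # months, i.e. 10 years

hardware = fold_increase(horizon, 24.0)  # Moore's Law pace
ai = fold_increase(horizon, 5.7)         # quoted AI pace

print(f"hardware: ~{hardware:,.0f}x over 10 years")
print(f"AI:       ~{ai:,.0f}x over 10 years")
```

At these rates a decade of Moore's Law compounds to about a 32-fold gain, while a 5.7-month doubling compounds to roughly two million-fold over the same decade, which is the gap the speaker is pointing at.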
35:04but none of that
35:06again I keep
35:07telling people
35:07nothing wrong
35:09with abundant intelligence
35:10nothing wrong
35:11okay
35:12there is a lot wrong
35:13with humanity's value set
35:15when this is happening
35:17so fast
35:17why
35:18because
35:19what will end up
35:20happening
35:20is that
35:21we will magnify
35:22we'll put the current
35:23system on steroids
35:24right
35:25and the current system
35:26is not good
35:28right
35:28okay
35:29the current system
35:30is built
35:30in a time of scarcity
35:32where for some
35:34to make more
35:35others have to make less
35:36okay
35:37that entirely disappears
35:39when you have an abundance
35:41of intelligence
35:42right
35:42and I think the more
35:44so I really
35:46really define them
35:47and I think it's important
35:48for people to understand
35:49call them
35:50FACE RIP
35:51so the
35:51the definition
35:53the very definition
35:55of seven dimensions
35:56of your life
35:57is about to change
35:58beyond recognition
36:00and please quote me
36:01on this in three years time
36:02okay
36:03freedom
36:03accountability
36:05human connection
36:06economics
36:07reality
36:09innovation
36:10and power
36:11okay
36:12and they're easier
36:13to understand
36:14in pairs
36:14right
36:15so you know
36:16you start with
36:16the whole definition
36:18of intelligence
36:19and innovation
36:19is now completely
36:20changing
36:21because
36:21until
36:22a year ago
36:24the most intelligent
36:25people could come up
36:26with the most
36:27innovative solutions
36:28today
36:29our baseline intelligence
36:31compared to the
36:32augmentation
36:32that AI brings
36:34makes everyone
36:35capable of doing this
36:36right
36:36and so there is a
36:39complete redefinition
36:40of the way things
36:41are being made
36:41and created
36:42even complex things
36:43like writing code
36:44I'm an early
36:45tester of
36:47of Manus
36:47oh my god man
36:49like it is
36:50this is a team
36:51I used to have a team
36:52of a thousand people
36:53doing this stuff
36:54you can simply
36:56tell it
36:56hey
36:56can you
36:57can you recreate
36:58Airbnb
36:59and poof
37:00just reasons
37:02through it
37:03and creates it
37:04for you
37:04okay
37:05and people are not
37:06aware of that
37:07enormous creative
37:08possibility
37:09enormous productivity
37:11that happens
37:12and
37:12and the result
37:13of that is
37:15we all lose
37:16our jobs
37:16the result of that
37:18is
37:18massive economic
37:20wealth concentration
37:22right
37:23so you know
37:24this is very important
37:25for people to understand
37:26the way
37:29if you look at
37:30human history
37:31and you remember
37:36hunter gatherers
37:37right
37:37the best hunter
37:39in the tribe
37:40could feed the tribe
37:41for a week more
37:42right
37:43because you know
37:44the hunter's skill
37:46is let's say
37:47double the next hunter
37:48but the automation
37:50is limited to a spear
37:52that's the maximum
37:53automation he had
37:54okay
37:55the best farmer
37:56could feed the tribe
37:57for a whole season
37:58right
37:59why
38:00because the automation
38:01is the soil
38:02the soil is doing
38:03most of the work
38:04right
38:04so whatever
38:05you know
38:06skill the farmer has
38:09if they knew
38:10how to use
38:11the automation
38:12properly
38:12they would magnify
38:14their output massively
38:15and of course
38:16the reward
38:17followed
38:17so the best hunter
38:19would be favored
38:19by four mates
38:20instead of one
38:22okay
38:22the best farmer
38:23had
38:24you know
38:25farms and land
38:26and estates
38:27the best industrialist
38:29had become
38:30a millionaire
38:31in the 1920s
38:32multi-millionaire
38:33right
38:33and the best
38:34information technologists
38:35became billionaires
38:37multi-billionaires
38:38right
38:38so if you follow
38:40now
38:40those who have
38:42the next wave
38:43of owning
38:44the platform
38:45the digital soil
38:45if you want
38:46okay
38:47they're capable
38:48of doing things
38:49that the rest of the world
38:50can't keep up with
38:51whether that's
38:52across nations
38:53or across individuals
38:55or across businesses
38:56there's going to be
38:58a massive concentration
38:59of wealth
39:00massive
39:01like you're going to see
39:02a trillionaire
39:03before the 2030s
39:05if the economies
39:06don't collapse
39:06maybe
39:07probably multiple ones
39:08multiple ones
39:09so people are already
39:10can't get their heads
39:11around the wealth gap
39:12now which is accelerating
39:13now it's about to go
39:14exponential
39:15but it's important
39:16to understand
39:17that people like you
39:18and I
39:18who worked hard
39:19throughout our life
39:21we would become
39:22bottom class
39:23like everyone else
39:25right
39:25because you're either
39:27in that top
39:28slot
39:29where you're aggregating
39:31all of the wealth
39:32through your ownership
39:35of the automation
39:36okay
39:37or
39:38you're
39:40even if you have millions
39:42you're nobody
39:42right
39:43now the
39:44interesting side of this
39:46is it's quite
39:47everything is a singularity
39:48so
39:49from one side
39:50massive aggregation
39:50of wealth
39:51from the other side
39:52almost everyone
39:53out of a job
39:54okay
39:55and then
39:56between those two
39:57you have to question
39:59what would happen
40:00to economies
40:01right
40:02because economies
40:03today as we said
40:04are based on consumption
40:05okay
40:06and so
40:07if you
40:07if you remove
40:08consumption
40:09you
40:11you
40:11basically have no GDP
40:13so the wealth
40:14cannot be created
40:14so you have to keep
40:15consumption
40:16how do you keep
40:16consumption
40:17you reinvent the system
40:18okay
40:19through UBI
40:20or whatever
40:20but then
40:21there is that
40:22ideological
40:23because
40:24I hate to say this
40:26but you know
40:27what UBI is
40:28communism
40:29okay
40:32you're gonna get in trouble
40:34for saying that
40:34I don't know
40:35I'm just telling the world
40:36what we're about to face
40:37and that's an
40:38interesting
40:39ideological conversation
40:41that needs to be had
40:42is the US government
40:44for example
40:44or the British government
40:46willing to say
40:46all right
40:47hold on
40:48the machine
40:48will make everything
40:49and every citizen
40:50will get what they want
40:51what was that called
40:52communism
40:54right
40:54or are we going to give
40:56everyone
40:56equal
40:57you know
40:58income
40:59and what was that called
41:00socialism
41:01right
41:02and by the way
41:03I'm
41:03I have no
41:05political ideology
41:06I'm
41:07I'm someone who chose
41:09early in life
41:10to
41:10I
41:11politics are too complex
41:12for me
41:12but these are conversations
41:14to be had
41:15because
41:16they are likely
41:17going to take time
41:18and jobs are starting
41:20to be lost
41:20when is the government
41:23going to look at that
41:25and then
41:25and then
41:26so I said
41:27you know
41:28there is a massive
41:29redefinition of intelligence
41:30of innovation
41:31that is leading to a
41:32massive redefinition
41:33of economics
41:34right
41:35you know
41:36add to that
41:38of course
41:38the current response
41:39of
41:40a trade war
41:41across the world
41:42and how the economics
41:43of the world
41:44are going through
41:44hyperinflation
41:45or about to go
41:47I don't know
41:48I hope not
41:48call me an idiot
41:50but then
41:52but then
41:52you know
41:53add all of that complexity
41:54and
41:55some response
41:56needs to be discussed
41:57right now
41:57governments need to be
41:59sitting down
42:00and saying
42:00are we going to
42:01you know
42:02have the Fed
42:03print money
42:04or is this going to be
42:05a form of a tax
42:07on the
42:09AI companies
42:10what would happen
42:11if the AI companies
42:12are taxed
42:12would they
42:13move to
42:13the UAE
42:15and not pay taxes
42:16there
42:16you know
42:17it is very very complex
42:19and that's the easiest one
42:20the more interesting one
42:22is the pair
42:23of power
42:24and freedom
42:25because
42:26so
42:28so the way power
42:29has always
42:31you know
42:32been
42:32concentrated
42:34at the top
42:34if you want
42:35it's going to
42:37like we said
42:38with wealth
42:39it's going to be
42:40massively concentrated
42:41right
42:41those who control
42:42the platform
42:43control everything
42:44right
42:45but interestingly
42:46at the same time
42:47there's never been
42:48more democratization
42:50and proliferation
42:50of power
42:51in history
42:52why
42:53because you and I
42:54have access to AI
42:55because
42:56you and I
42:57have access
42:57to CRISPR
42:58and bio
42:59you know
43:01biological technologies
43:02that can build
43:03a virus tomorrow
43:04open source
43:05by the way
43:05and you and I
43:07have access
43:08to a tiny little drone
43:09that proved
43:12very effective
43:12in the Ukraine war
43:13or you know
43:15in the Middle East wars
43:16and it's quite interesting
43:18because
43:19while I would not
43:21use any of that
43:22those in
43:23highly concentrated
43:25power positions
43:26would have
43:27to continue watching
43:30the rest of the episode
43:30for free
43:31visit our website
43:32londonreal.tv
43:33or click the link
43:34in the description below
43:35what
43:54you
