  • 6/1/2025
Transcript
00:00:00All right. Thanks for taking the time.
00:00:27How are you doing, brother?
00:00:30They keeping you busy?
00:00:33Yeah. I mean...
00:00:37It's rarely a slow week.
00:00:40I mean, in the world as well.
00:00:43Yeah.
00:00:44I mean, any given week, it just seems like things are getting out of hand.
00:00:47It's definitely a simulation.
00:00:48We've agreed on this at this point.
00:00:51I mean, look, if we are in some Alien Netflix series, I think the ratings are high.
00:00:56How are the freedom of speech wars going?
00:01:04This is a...
00:01:06You've been at war for two years now.
00:01:10Yes.
00:01:11The price of freedom of speech is not cheap, is it?
00:01:14I think it's like 44 billion, something like that.
00:01:16Just...
00:01:17Run numbers.
00:01:18Give or take a billion.
00:01:20Yeah, run numbers.
00:01:23Yeah.
00:01:24It's pretty nutty.
00:01:25There is like this weird movement to quell free speech kind of around the world.
00:01:32And that's something we should be very concerned about.
00:01:36You know, you have to ask, like, why was the First Amendment, like, a high priority?
00:01:42It was like number one.
00:01:43Number one.
00:01:44It's because people came from countries where if you spoke freely, you would be imprisoned
00:01:51or killed.
00:01:52And they were like, well, we'd like to not have that here.
00:01:55Because that was terrible.
00:01:56Yeah.
00:01:57And actually, you know, there's a lot of places in the world right now if you're critical
00:02:04of the government, you get imprisoned or killed.
00:02:08Right.
00:02:09Yeah.
00:02:10We'd like to not have that.
00:02:11Are you concerned?
00:02:12Can I add to that?
00:02:13Yeah.
00:02:14I mean, I suspect this is a receptive audience to that message.
00:02:18You know, I think we always thought that the West was the exception to that, that we knew
00:02:29there were authoritarian places around the world, but we thought that in the West, we'd
00:02:33have freedom of speech.
00:02:34And we've seen, like you said, it seems like a global movement.
00:02:37In Britain, you've got teenagers being put in prison for memes, opposing...
00:02:42Yes.
00:02:43It's like you liked a Facebook post, throw them in the prison.
00:02:47Yeah.
00:02:48It's like people have gotten actual, you know, prison time for, like, obscure comments on social media.
00:02:56Not even shitposting yet.
00:02:57Like not even...
00:02:58Yeah.
00:02:59It's crazy.
00:03:00Like Pavel got thrown in prison.
00:03:01Right.
00:03:02You've got Pavel...
00:03:03I was like, what is the massive crime that...
00:03:07Right.
00:03:08Pavel in France, and then of course we've got Brazil with Judge Voldemort.
00:03:13That one seems like the one that impacts you the most.
00:03:16Can you...
00:03:17I guess we are trying to figure out, is there some reasonable solution in Brazil?
00:03:30The...
00:03:31You know, the concern...
00:03:32I mean, I want to just make sure that this is framed correctly.
00:03:38And, you know, funny memes aside, the nature of the concern was that, at least at X Corp, we
00:03:47had the perception that we were being asked to do things that violated Brazilian law.
00:03:53So, obviously we cannot, as an American company, impose American laws and values on other countries.
00:04:02That, you know, we wouldn't get very far if we did that.
00:04:06But we do, you know, think that if a country's laws are a particular way, and we're being asked to...
00:04:16We think we're being asked to break them, then...
00:04:19And be silenced about it, then obviously that is no good.
00:04:22Mm-hmm.
00:04:23So, I just want to be clear.
00:04:24This sometimes comes across as...
00:04:28Elon's trying to just be a crazy, whatever, billionaire, and demand outrageous things from other countries.
00:04:36And, you know, while that is true...
00:04:45In addition, there are other things that I think are, you know, valid, which is like, we obviously can't...
00:04:57You know, I think any given thing that we do at X Corp, we've got to be able to explain in the light of day,
00:05:03and not feel that it was dishonorable, or, you know, we did the wrong thing, you know?
00:05:10So, we don't...
00:05:13That was the...
00:05:15That's the nature of the concern.
00:05:16So, we actually are in sort of discussions with the, you know, judicial authorities in Brazil to try to, you know, run this to ground.
00:05:28Like, what's actually going on?
00:05:32Like, if we're being asked to break the law, Brazilian law, then that's...
00:05:37That obviously should not be...
00:05:38Should not sit well with the Brazilian judiciary.
00:05:42And if we're not, and we're mistaken, we'd like to understand how we're mistaken.
00:05:46I think that's a pretty reasonable position.
00:05:49I'm a bit concerned, as your friend, that you're going to go to one of these countries, and I'm going to wake up one day,
00:05:57and you're going to get arrested, and like, I'm going to have to go bail you out or something.
00:06:00Like, this is...
00:06:01Feels very acute.
00:06:02Like...
00:06:03Yes.
00:06:04I mean, it's not a joke now.
00:06:05Like, they're literally saying, like...
00:06:07Yeah.
00:06:08You know, it's not just Biden saying, like, we have to look into that guy.
00:06:10Now it's become quite literal, like, this...
00:06:12I don't know, who was the guy who just wrote the...
00:06:14Was it the Guardian piece about, like...
00:06:17Oh, yeah, yeah.
00:06:18There have been three articles, and I think in the past three weeks...
00:06:22Robert Reich.
00:06:23But it wasn't just him.
00:06:24It was three different articles.
00:06:26Three different articles.
00:06:27That doesn't...
00:06:28That's a trend.
00:06:29Calling for me to be imprisoned.
00:06:32Right.
00:06:33In the Guardian.
00:06:34You know.
00:06:35Guardian of what?
00:06:36What are they protecting, exactly?
00:06:38Yeah.
00:06:39Guardian of...
00:06:40I don't know.
00:06:41Of...
00:06:42Authoritarianism?
00:06:43Yeah.
00:06:44Guardian of...
00:06:45Censorship?
00:06:46Censorship.
00:06:47But the premise here is that you bought this thing, this online forum, this communication
00:06:54platform, and you're allowing people to use it to express themselves.
00:06:58Therefore, you have to be jailed.
00:07:00I don't understand the logic here.
00:07:02Right.
00:07:03There's...
00:07:04What do you think they're actually afraid of at this point?
00:07:07What's the motivation here?
00:07:08Well, I mean, I think if somebody's sort of trying to push a false premise on the world, and that premise can be undermined with public dialogue, then they will be opposed to public dialogue on that premise, because they wish their false premise to prevail.
00:07:26Right.
00:07:27So I think the issue there is, if they don't like the truth, then they want to suppress it.
00:07:37So now, the issue is...
00:07:46I distinguish that from my son, who's also called X.
00:07:49Yes.
00:07:50Right.
00:07:51You have...
00:07:52You have parental goals, and then you have goals for the company.
00:07:57Everything's just called X, basically.
00:07:58Yes.
00:07:59Very difficult disambiguation.
00:08:00The car, the son...
00:08:01Yeah.
00:08:02It's X everything.
00:08:03So what we're trying to do is simply adhere to the laws in a country. So if
00:08:12something is illegal in the United States or if it's illegal in Europe or Brazil or wherever it
00:08:18might be, then we will take it down or we'll suspend the account because we're not there to
00:08:26make the laws. But if speech is not illegal, then what are we doing? Okay, now we're injecting
00:08:33ourselves in as a censor and where does it stop and who decides? And where does that path lead?
00:08:44I think it leads to a bad place. So if the people in a country want the laws to be different,
00:08:52they should make the laws different. But otherwise, we're going to obey the law in each jurisdiction.
00:08:58Right. That's it. It's not more complicated than that. We're not trying to flout the law. I'm going
00:09:02to be clear about that. We're trying to adhere to the law. If laws change, we will change. And if
00:09:07the laws don't change, we won't. We're just literally trying to adhere to the law. It's pretty
00:09:12straightforward. Yes, it's very straightforward. And if somebody thinks we're not adhering to the
00:09:17law, well, they can file a lawsuit. Bingo. Also very straightforward. Yes. I mean,
00:09:22there are European countries that don't want people to promote Nazi propaganda. Yes. They have
00:09:26some sensitivity to it. Well, it is illegal. It is illegal in those countries. And in those
00:09:30countries, if somebody puts that up, you take it down. Yes. But they typically file something
00:09:35and say, take this down. Yes. No, in some cases, it is just obviously illegal. Like,
00:09:40you don't need to file a lawsuit for, you know, if something is just, you know, unequivocally
00:09:46illegal, we can literally read the law. This violates the law. You know, anyone can see that.
00:09:51Like, you know, you don't need, like, if somebody is stealing, you don't need, let me check the law
00:09:56on that. Okay. Oh, no, they're stealing stuff. Let's talk about it. So we had JD Vance here this
00:10:03morning. He did a great job. And, you know, one of the things is there's this image on X of like,
00:10:09basically like you, Bobby, Trump, and JD are like the Avengers, I guess. And then there's another meme
00:10:18where you're in front of a desk where it says D-O-G-E. Yeah. The Department of Governmental
00:10:23Efficiency. Yes. Yes. I posted that one. Tell us about it. I made it using Grok, the Grok image
00:10:29generator. And I posted it. Tell us about this push for efficiency. How do you do it? Well, I mean,
00:10:41I think with great difficulty, but, you know, look, it's been a long time since there was a serious
00:10:53effort to reduce the size of government and to remove absurd regulations. Yeah. And, you know,
00:11:01last time there was a really concerted effort on that front was Reagan in the early 80s. We're 40
00:11:06years away from a serious effort to remove, you know, regulations that don't serve the greater good
00:11:16and reduce the size of government. And I think it's just, if we don't do that, then what's happening
00:11:23is that we get regulations and laws accumulating every year until eventually everything's illegal.
00:11:29And that's why we can't get major infrastructure projects done in the United States. Like if you
00:11:34look at the absurdity of the California high-speed rail, I think they've spent $7 billion and have
00:11:39a 1,600-foot segment that doesn't actually have rail in it. I mean, your tax dollars at work?
00:11:45I mean, that's an expensive 1,600 feet of concrete, you know? And I mean, I think it's like,
00:11:53you know, I realize sometimes I'm perhaps a little optimistic with schedules, but,
00:11:59you know, I mean, I wouldn't be doing the things I'm doing if I was, you know, not an optimist. So,
00:12:12but at the current trend, you know, California high-speed rail might finish sometime next century.
00:12:19Maybe. But probably not. We'll have teleportation by that time.
00:12:22Yeah, exactly. AI will do everything at that point. So I think you really have to think of, you know, the United
00:12:33States and many countries, it's arguably worse than the EU, as being like Gulliver tied down by a
00:12:39million little strings. And like any one given regulation is not that bad, but you've got a
00:12:45million of them. Or millions, actually. And then eventually you just can't get anything done.
00:12:52And this is a massive tax on the consumer, on the people. It's just they don't realize that
00:12:59there's this massive tax in the form of irrational regulations. I'm going to give you a recent
00:13:06example that, you know, is just insane, is that like SpaceX was fined by the EPA $140,000
00:13:14for, they claimed, dumping potable water on the ground. Drinking water. This is at Starbase. And we're like, we're in a tropical thunderstorm region. That stuff comes
00:13:29from the sky all the time. And there was no actual harm done. You know, it was just water to cool the
00:13:36launch pad during liftoff. And there's zero harm done. And they're like, they agree, yes, there's zero
00:13:42harm done. And we're like, okay, so there's no harm done. And, um, but you want us to pay a $140,000
00:13:48fine. It's like, yes, because you didn't have a permit. Okay. We didn't know there was a permit
00:13:53needed for zero harm, fresh water being on the ground in a place where fresh water falls
00:13:59from the sky all the time. Got it. Next to the ocean. Next to the ocean. Because there's a little bit of water there too. Yeah. I mean, sometimes it rains so much, the roads are flooded. So we're like,
00:14:10you know, how does this make any sense? Yeah. And then they were like, well, we're not going to process any more of your applications for Starship launch unless you pay this $140,000. So they just ransomed us. And we're like, okay, so we paid the $140,000. But it's like, this is no good. I mean, at this rate, we're never going to get to Mars.
00:14:31I mean, that's the confounding part here. Yeah. We're acting against our own self-interest. Putting aside fresh water, hey, the rocket makes a lot of noise, so I'm certain there are some complaints about noise once in a while. But sometimes you want to have a party, or you want to make progress, and there's a little bit of noise. So we trade off a little bit of noise for massive progress, or even fun. When did we stop being able to make those trade-offs? But talk
00:15:04about the difference between California and Texas, where you and I now reside. Texas, you were able to
00:15:12build the Gigafactory. I remember when you got the plot of land and then it seemed like it was less
00:15:19than two years when you had the party to open it. Yeah. From start of construction to completion was 14 months. Is there anywhere on the planet that would go
00:15:32faster? Is like China faster than that? Uh, China was 11 months. Got it. So Texas, China, 11 and 14
00:15:39months, California, how many months? And just to give you a sense of size, the, our Tesla Gigafactory in
00:15:47China is three times the size of the Pentagon, which was the biggest building in America. Uh,
00:15:51no, there are bigger buildings, but the Pentagon is a pretty big one. Yeah. Or it was the biggest.
00:15:54In units, in units of Pentagon, it's like three. Okay. Three pentagons and counting. Yeah. Got it.
00:16:01In 14 months. Just the regulatory approvals in California would have taken two years. Yeah. So that's the issue. Where do you think the regulation helps? Like for the
00:16:15people that will say, we need some checks and balances. We can't have some, because for every
00:16:19good actor like you, there'll be a bad actor. So where is that line then? Yeah. I mean, the way to do sensible deregulation and reduction in the size of government is just to be very public about it and say, like, which of these rules do you want? If the public is really excited about a rule and wants to keep it, we'll just keep it.
00:16:42Yeah. And here's the thing about the rules: if removing a rule turns out to be bad, we'll just put it right back. Okay. And then, you know, problem solved. It's like,
00:16:53it's easy to add rules, but we don't actually have a process for getting rid of them. That's
00:16:57the issue. There's no garbage collection for rules. Um, when we were, um, watching you work,
00:17:05David and I and Antonio, um, in that first month at Twitter, which was all hands on deck and you
00:17:11were doing zero based budgeting, you really quickly got the costs under control. And then
00:17:16miraculously, everybody said the site would go down, and you added 50 more features. So maybe explain, because there were so many articles saying that Twitter is dead forever. There's no way it could possibly even continue at all. It was almost like
00:17:34the press was rooting for you to fail. It's like, all right, let's write the obituary.
00:17:37Like, here's the obituary. Uh, they were all saying their goodbyes on Twitter. Remember that?
00:17:40Yeah. Yeah. Yeah. They were all leaving and saying their goodbyes cause the site was going to melt
00:17:44down and totally failing. And, uh, all the journalists left. Yeah. And which is, if you ever want to like
00:17:51hang out with a bunch of hall monitors, oh my God, threads is amazing. Every time I go over there
00:17:56and post, they're like, they're really triggered. But yeah, I mean, if you like being condemned
00:18:01repeatedly, then yes, you know, for reasons that make no sense, then threads is the way to go.
00:18:06Yeah. It's really, it's, it's the most miserable place on earth. If Disney's the happiest,
00:18:11this is the anti-Disney. But if you were to go into government, into the Department of Education, or pick the department. You've worked with a lot of them, actually. You can't go in there
00:18:21and zero based budget. Okay. We get it. But if you could just pare two, three, four, five percent of
00:18:28those organizations, what kind of impact could that have?
00:18:33Yeah. I mean, I think we'd need to do more than that. I think ideally, yeah, compounding every
00:18:38year, two or three percent a year. I mean, it would be better than what's happening now.
00:18:44Yeah. Look, if Trump wins, and obviously I suspect there are people with mixed feelings about whether that should happen, we do have an opportunity to do kind of a once-in-a-lifetime deregulation and reduction in
00:19:05the size of government. Because the other thing, besides the regulations, is that America is also going bankrupt extremely quickly. And everyone seems to be sort of whistling past the graveyard on this one. But they're all grabbing the silverware. Everyone's stuffing their pockets with the silverware before this Titanic sinks.
00:19:25Well, you know, the defense department budget is a very big budget. It's a trillion dollars a year, DOD plus Intel. And interest payments on the national debt just exceeded the defense department budget; they're over a trillion dollars a year, just in interest, and rising. We're adding a trillion dollars to our debt, which our kids and grandkids are going to have to pay somehow, every three months. And soon it's going to be every two months, and then every month. And then the only thing we'll be able to pay is interest. It's just like a person, at scale, that has racked up too much credit card debt. This does not have a good ending. And so we have to reduce
00:20:26the spending. Let me ask one question. Cause I've brought this up a lot and the counter argument I
00:20:30hear, which I disagree with. Um, but the counter argument I hear from a lot of politicians is
00:20:35if we reduce spending, cause right now, if you add up federal state and local government spending,
00:20:40it's between 40 and 50% of GDP. So nearly half of our economy is supported by government spending
00:20:49and nearly half of people in the United States are dependent directly or indirectly on government
00:20:54checks and, uh, either through contractors, uh, that, that the government pays or they're employed
00:20:58by government, um, entity. So if you go in and you take too hard an ax too fast, you will have
00:21:05significant contraction, job loss and recession. What's the balancing act, Elon? Just thinking
00:21:11realistically, cause I'm a hundred percent on board with you. The steps, the next set of steps,
00:21:16however, assume Trump wins and you become the chief of DOGE. And I think the challenge is how quickly can we go in, how quickly can things change, without all the contraction and job loss. Yeah. So I guess, how do you really address it when so much of the
00:21:49economy and so many people's jobs and livelihoods are dependent on government spending?
00:21:52Well, I mean, I do think it's sort of, you know, a false dichotomy. It's not like no government spending is going to happen. You really have to ask whether
00:22:05it's at the right level. Um, and, and just remember that, that, you know, any, any given person,
00:22:10if they are doing things in a less efficient organization versus a more efficient organization,
00:22:16their contribution to the economy, their net output of goods and services will reduce. Um,
00:22:22I mean, you've got a couple of clear examples between, uh, East Germany and West Germany,
00:22:26North Korea and South Korea. Um, I mean, North Korea, they're starving, uh, South Korea. It's
00:22:31like amazing. It's the compounding effect of productivity gains. Yeah. Yeah. It's night and
00:22:36day. Um, and so in the North, North Korea, you've got a hundred percent government. Um, and
00:22:41in South Korea, you've got probably, I don't know, 40% government. It's not zero. Um, and yet you've
00:22:46got a standard of living that is probably 10 times higher in South Korea, at least. Exactly.
00:22:51And then East and West Germany. In West Germany, just thinking in terms of cars, you had BMW, Porsche, Audi, Mercedes. And East Germany, which is a random line on a map, the only car you could get was a Trabant, which is basically a lawnmower with a shell on it. And it was extremely unsafe. There was a 20-year wait, so you'd, like, put your kid
00:23:21on the list as soon as they're conceived. Um, and, and even then only, I think, um, you
00:23:28know, a quarter of people maybe got this lousy car. So that's just an interesting example of, like, basically the same people, different operating
00:23:37system. And it's not like West Germany was some, you know, capitalist heaven. It was quite socialist, actually. So when you look, you know, probably
00:23:52it was half, half government in West Germany and a hundred percent government in East Germany.
00:23:57And again, call it at least a five to 10X standard of living difference, and even qualitatively vastly better. And amazingly, in this modern era, people sometimes have this debate as to which
00:24:11system is better. Well, I'll tell you which system is better. Um, the one that doesn't
00:24:15need to build a wall to keep people in. Okay. That's how you can tell. Okay.
00:24:21Yeah. It's a dead giveaway. Spoiler alert. Dead giveaway.
00:24:27Do they climb the wall to get out or to come in? If you have to build a barrier to keep people in, that is the bad system. It wasn't West Berlin that built the wall. Okay. They
00:24:39were like, you know, anyone who wants to flee West Berlin, go ahead. Um, speaking of walls.
00:24:43So it was, you know, and, and, and if you look at sort of the flux of boats from Cuba, there's
00:24:49a large number of boats from Cuba and there's a bunch of free boats that you, anyone can take
00:24:55to, to go back to Cuba. Plenty of seats. There's like, Hey, wow, an abandoned boat. I could use
00:25:01this boat to go to Cuba, where they have communism. Awesome. And yet nobody picks up those boats and does it. Amazing. So, yeah. Wait, so your point is jobs will be
00:25:15created. If we cut government spending in half, jobs will be created fast enough to make up
00:25:19for, right. Yeah. Obviously, you know, I'm not suggesting that people be, like, immediately turfed out, you know, tossed out with no severance and, you know, can't pay their mortgage. Then you'd see some reasonable
00:25:32off ramp, uh, where, yeah. Yeah. Um, so reasonable off ramp where, you know, they're still, um,
00:25:39you know, earning, they're still receiving money, but have like, I don't know, a year or two
00:25:42to find jobs in the private sector, which they will find. And then they will be in a different operating system. Again, you can see the difference. East
00:25:51Germany was incorporated into West Germany, living standards in East Germany, uh, rose
00:25:55dramatically. Um, so. Well, in, in four years, if you could shrink the size, the size of the
00:26:01government with Trump, what would be a good target just in terms of like ballpark? I mean,
00:26:06are you trying to get me assassinated before this even happens? No, no. Pick a low number. I mean,
00:26:11you know, there's that old phrase, go postal. I mean, it's like they might. Yeah. So we'll keep
00:26:16the post office. I mean, I'm going to need a lot of security details guys. Yes. I mean,
00:26:22the, uh, the sheer number of disgruntled former government employees is, you know,
00:26:27quite a scary number. I mean, I might not make it, you know. I was saying low single-digit cuts every year for four years would be palatable. Yeah. And I like your idea of an off-ramp. But the thing is
00:26:36that if you have a once-in-a-lifetime or once-in-a-generation opportunity and you don't take serious action, and you have four years to get it done, and it doesn't get done... How serious is Trump about this? Like you've talked
00:26:53to him about it. Yeah. Yeah. I think he is very serious about it. Um, and no, I, I think actually
00:27:00the reality is that if we get rid of nonsense regulations and shift people from the government
00:27:05sector to the private sector, we will have immense prosperity. Um, and, and I think we'll have a
00:27:11golden age in this country and it'll be fantastic. Um, you have a bunch of critical milestones
00:27:23coming up. Um, yeah, in fact, uh, there's an important, a very exciting launch, um, that,
00:27:29uh, is maybe happening tonight. So if that, if the weather is, is holding up, then I'm going
00:27:34to leave here, head to Cape Canaveral, uh, for the, um, the, the Polaris Dawn mission, which
00:27:39is a private mission, so funded by Jared, um, Jared Isaacman, and he's, um, awesome guy.
00:27:45And this will be the first commercial spacewalk, and it'll be at the highest altitude since Apollo. So it's the furthest from Earth that anyone's gone. Yeah.
00:28:02And, you know, what comes after that? Let's assume that's successful.
00:28:08I sure hope so, man. No pressure. Yeah, absolutely, astronaut safety is the priority. Man, if I had, like, all the wishes I could save up, that would be the one to put them on. So, you know, space is dangerous.
00:28:31So, the next milestone after that would be the next flight of Starship, which is ready to fly. We are waiting for regulatory approval. It really should not
00:28:57be possible to build a giant rocket faster than the paper can move from one desk to another.
00:29:10That stamp is really hard. Approved.
00:29:14Yeah.
00:29:15You ever see that movie, Zootopia? There's, like, a sloth
00:29:19Yeah, he's coming in for the approval.
00:29:23Yeah, yeah, and then they accidentally tell a joke, and then it's like, oh, no, this is
00:29:27No, here we go.
00:29:28This is gonna take a long time.
00:29:29Sorry, sorry.
00:29:29But, yeah, Zootopia, the funny thing is, I went to the DMV about a year after Zootopia came out, to get my, whatever, license renewal, and the guy, in an exercise of incredible self-awareness, had the sloth from Zootopia in his cube. And he was actually swift.
00:29:54Yeah.
00:29:56With that, with the mandate.
00:29:57Beat the sloth.
00:29:58Yeah, yeah, no.
00:29:59Beat personal agency.
00:30:00Personal agency.
00:30:01No, I mean, sometimes people, like, think the government is more competent than it is.
00:30:09I'm not saying that there aren't competent people in the government, they're just in an
00:30:12operating system that is inefficient.
00:30:14Once you move them to a more efficient operating system, their output is dramatically greater, as we've seen when East Germany was reintegrated with West Germany, and the same people were vastly more prosperous with a basically half-capitalist operating system.
00:30:33But, I mean, for a lot of people, maybe their most direct experience with the government is the DMV, and the important thing to remember is that the government is the DMV at scale.
00:30:52Right.
00:30:52That's the government.
00:30:54Got the mental picture.
00:30:55How much do you want to scale it?
00:30:56Yeah.
00:31:00Sorry, can you go back to Chamath's, um, uh, question on Starship?
00:31:04So, you, you announced just the other day, Starship going to Mars in two years, and...
00:31:08Yeah, by the way.
00:31:09Huh?
00:31:10Yeah, yeah.
00:31:12And then, uh, four years for a crewed, uh, aspirational launch, uh, next window.
00:31:16Yeah, I mean...
00:31:17And how much is the government involved?
00:31:18I mean, I'm not saying, like, set your watch by these, you know.
00:31:21But based on our current progress with Starship, we were able to successfully reach orbital velocity twice, uh, we were able to achieve soft landings of
00:31:34the booster and the ship in water, uh, and that's despite the ship having, you know, half
00:31:40its flaps cooked off, um, you can see the video on the X platform, it's quite exciting.
00:31:45Um, so, you know, we, we, we think we'll be able to have, to launch reliably, and repeatedly,
00:31:53and quite quickly, um, and the, the, the fundamental holy grail breakthrough for rocketry for, for,
00:32:01to, what, the fundamental breakthrough that is needed for life to become multi-planetary,
00:32:05is a, a rapidly reusable, reliable rocket.
00:32:41The first requirement for a program is that success is in the set of possible outcomes.
00:32:44That sounds pretty obvious,
00:32:47but there are often projects where success
00:32:51is not in the set of possible outcomes.
00:32:54And so for Starship, full reusability is not only
00:32:59in the set of possible outcomes,
00:33:00it is being proven with each launch.
00:33:03And I'm confident it will succeed.
00:33:05It's simply a matter of time.
00:33:06And if we can get some improvement
00:33:12in the speed of regulation,
00:33:14we could actually move a lot faster.
00:33:18So that would be very helpful.
00:33:21And in fact, if something is done
00:33:24about reducing regulation
00:33:28and sort of speeding up approvals,
00:33:30and to be clear, I'm not talking about anything unsafe,
00:33:32it's simply that the processing of the safe thing
00:33:36can be done as fast as the rocket is built,
00:33:40not slower,
00:33:41then we could become a spacefaring civilization
00:33:45and a multi-planet species,
00:33:47and be out there among the stars in the future.
00:33:49And, you know, it's just incredibly important
00:33:57that we have things that we find inspiring,
00:34:01that you look to the future
00:34:04and say the future's going to be better than the past,
00:34:06things to look forward to.
00:34:08And, like, kids are a good way to assess this.
00:34:13Like, what are kids fired up about?
00:34:14And if you could say, you know,
00:34:18you could be an astronaut on Mars.
00:34:21You could maybe one day go beyond the solar system.
00:34:25We could make Star Trek, Starfleet Academy real.
00:34:30That is an exciting future.
00:34:32That is inspiring.
00:34:36You know, I mean, you need things that move your heart.
00:34:39Right.
00:34:41Yeah.
00:34:43Fuck yeah.
00:34:44Fuck yeah.
00:34:46Let's do it.
00:34:47Fuck.
00:34:47I mean, it...
00:34:49Like, life can't just be about
00:34:50solving one miserable problem after another.
00:34:53There's got to be things that you look forward to as well.
00:34:55Yeah.
00:34:56And do you think you might have to move it
00:34:59to a different jurisdiction to move faster?
00:35:01I've always wondered if, like...
00:35:02Rocket technology is considered
00:35:05an advanced weapons technology,
00:35:06so we can't just go do it, you know...
00:35:08In another country.
00:35:08Yes.
00:35:09Yeah, interesting.
00:35:10And if we don't do it,
00:35:11other countries could do it.
00:35:12I mean, they're so far behind us,
00:35:15but theoretically,
00:35:16there is a national security,
00:35:19you know, justification here,
00:35:21if somebody can put their thinking caps on,
00:35:23like, do we want to have this technology
00:35:25that you're building,
00:35:26the team's working so hard on,
00:35:27stolen by other countries,
00:35:28and then, you know,
00:35:29maybe they don't have as much red tape.
00:35:31I wish people were trying to steal it.
00:35:35So, no one's trying to steal it.
00:35:38It's just too...
00:35:39It's just too crazy, basically.
00:35:44And that's for you.
00:35:46Yeah, it's way too crazy.
00:35:47Elon, what do you think is going on
00:35:49that led to Boeing building the Starliner
00:35:54the way that they did?
00:35:56They were able to get it up.
00:36:00But not complete.
00:36:01But can't complete.
00:36:02They can't finish.
00:36:03Can't finish.
00:36:04I don't understand.
00:36:05And now you're going to have to go up and finish.
00:36:11Well, I mean,
00:36:13I think Boeing is a company that...
00:36:16They do so much business with the government,
00:36:18they have sort of impedance-matched to the government.
00:36:20So they're basically one notch
00:36:23away from the government, maybe two.
00:36:25They're not far from the government
00:36:27from an efficiency standpoint
00:36:28because they derive so much of their revenue
00:36:30from the government.
00:36:32And a lot of people think,
00:36:33well, SpaceX is super dependent on the government,
00:36:35and actually, no,
00:36:36most of our revenue is commercial.
00:36:42And that was the case,
00:36:45I think, at least up until perhaps recently,
00:36:49because they have a new CEO
00:36:51who actually shows up in the factory.
00:36:53The CEO before that,
00:36:55I think, had a degree in accounting
00:36:57and never went to the factory
00:36:58and didn't know how airplanes flew.
00:37:02So, I think if you are in charge of a company
00:37:05that makes airplanes fly
00:37:07and a spacecraft go to orbit,
00:37:12then it can't be a total mystery
00:37:14as to how they work.
00:37:15So, you know,
00:37:21I'm like, sure,
00:37:23if somebody's like running Coke or Pepsi
00:37:24and they're like great at marketing or whatever,
00:37:27that's fine
00:37:29because it's not a sort of technology-dependent business.
00:37:34You know,
00:37:35or if they're running a, you know,
00:37:37financial consulting firm
00:37:38and they have degrees in accounting,
00:37:39that makes sense.
00:37:41But I think, you know,
00:37:43if you're the cavalry captain,
00:37:45you should know how to ride a horse.
00:37:47Pretty basic.
00:37:47Yeah.
00:37:48Yeah.
00:37:49Great deal.
00:37:51It's like,
00:37:51it's disconcerting
00:37:52if the cavalry captain just falls off the horse.
00:37:56He's not going to inspire the team.
00:37:59I'm sorry, I'm scared of horses.
00:38:00He gets on backwards.
00:38:01I'm like, oops.
00:38:04Shifting gears to AI,
00:38:06Peter was here earlier
00:38:07and he was talking about how so far
00:38:09the only company to really make money off AI
00:38:11is NVIDIA with the chips.
00:38:13Do you have a sense yet
00:38:14of where you think the big applications
00:38:17will be from AI?
00:38:19Is it going to be an enabling self-driving?
00:38:21Is it going to be enabling robots?
00:38:23Is it transforming industries?
00:38:25I mean,
00:38:26it's still, I think,
00:38:27early in terms of
00:38:28where the big business impact is going to be.
00:38:30Do you have a sense yet?
00:38:31Yeah.
00:38:37I think the spending on AI
00:38:47probably runs ahead of,
00:38:49I mean, it does run ahead
00:38:50of the revenue right now.
00:38:51There's no question about that.
00:38:54But the rate of improvement of AI
00:38:57is faster than any technology
00:38:58I've ever seen by far.
00:38:59And it's,
00:39:06I mean,
00:39:07for example,
00:39:08a Turing test used to be a thing.
00:39:11Now,
00:39:12you know,
00:39:12your basic open source
00:39:14random LLM
00:39:15running on a frigging Raspberry Pi
00:39:17probably could,
00:39:18you know,
00:39:20beat the Turing test.
00:39:23So there's,
00:39:25I think actually,
00:39:27the good future of AI
00:39:31is one of immense prosperity
00:39:34where
00:39:35there is
00:39:37an age of abundance,
00:39:40no shortage
00:39:41of goods and services.
00:39:43Everyone can have
00:39:44whatever they want
00:39:45except for things
00:39:47we artificially define
00:39:48to be scarce,
00:39:49like some special artwork.
00:39:52But for anything that is
00:39:53a manufactured good
00:39:54or provided service,
00:39:55I think,
00:39:57with the advent of AI
00:39:58plus robotics,
00:39:59the cost
00:40:01of goods and services
00:40:03will
00:40:04trend to zero.
00:40:08Like,
00:40:08I'm not saying
00:40:09it'll be actually zero,
00:40:10but it'll be,
00:40:13everyone will be able
00:40:14to have anything they want.
00:40:16That's the good future.
00:40:17Of course,
00:40:19you know,
00:40:19in my view,
00:40:20that's probably 80% likely.
00:40:21So look on the bright side.
00:40:24Only 20%,
00:40:2520% probability of annihilation.
00:40:27It's nothing.
00:40:29Is the 20% like,
00:40:31what does that look like?
00:40:33I don't know, man.
00:40:34I mean,
00:40:35frankly,
00:40:35I do have to go
00:40:36engage in some degree
00:40:37of deliberate suspension
00:40:38of disbelief
00:40:39with respect to AI
00:40:40in order to sleep well.
00:40:41and even then.
00:40:45Because I think
00:40:46the actual issue,
00:40:47the most likely issue
00:40:49is like,
00:40:49well,
00:40:49how do we find meaning
00:40:50in a world
00:40:51where AI can do
00:40:51everything we can do
00:40:52a bit better?
00:40:53That is perhaps
00:40:55the bigger challenge.
00:40:58Although,
00:40:59you know,
00:41:00at this point,
00:41:00I know more and more people
00:41:01who are retired
00:41:01and they seem to
00:41:02enjoy that life.
00:41:03but I think
00:41:07that may be,
00:41:08maybe there'll be
00:41:08some crisis of meaning
00:41:09because the computer
00:41:11can do everything
00:41:12you can do
00:41:13but better,
00:41:14so maybe that'll be
00:41:15a challenge.
00:41:16but really,
00:41:20you know,
00:41:21you need the sort
00:41:22of end effectors.
00:41:23You need the
00:41:23autonomous cars
00:41:26and you need
00:41:28the sort of
00:41:28humanoid robots
00:41:29or general purpose robots.
00:41:32But once you have
00:41:33general purpose
00:41:34humanoid robots
00:41:35and autonomous vehicles,
00:41:40really,
00:41:41you can build anything.
00:41:45And I think
00:41:46that there's no
00:41:46actual limit
00:41:47to the size
00:41:49of the economy.
00:41:50I mean,
00:41:50there's obviously,
00:41:51you know,
00:41:51the mass of Earth,
00:41:52you know,
00:41:52like that will be
00:41:53one limit.
00:41:55But,
00:41:56you know,
00:41:57the economy
00:41:58is really just
00:41:58the average
00:41:59productivity per person
00:42:00times number of people.
00:42:02That's the economy.
00:42:04And if you've got
00:42:06humanoid robots
00:42:07that can do,
00:42:09you know,
00:42:10where there's no real
00:42:11limit on the number
00:42:11of humanoid robots
00:42:12and they can operate
00:42:15very intelligently,
00:42:17then there's no
00:42:18actual limit
00:42:18to the economy.
00:42:20There's no meaningful
00:42:20limit to the economy.
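The identity in play here, total output equals average productivity per person times the number of workers, human or robot, is simple enough to sketch. A minimal illustration in Python; every number below is a made-up assumption, not a figure from the conversation:

```python
def economic_output(productivity_per_worker: float, workers: float) -> float:
    """Toy model: total output = average productivity per worker x worker count."""
    return productivity_per_worker * workers

# Hypothetical, illustrative numbers only:
humans = 5e9                   # assumed working population
human_productivity = 20_000.0  # assumed annual output per human worker

baseline = economic_output(human_productivity, humans)

# Add humanoid robots at a 2:1 ratio, each assumed half as productive:
robots = 2 * humans
robot_productivity = human_productivity / 2

with_robots = baseline + economic_output(robot_productivity, robots)

# Output scales linearly with effective workers, so this assumed robot
# fleet doubles total output; adding more keeps raising it.
assert with_robots == 2 * baseline
```

The point of the sketch is only that the number of workers is the free variable in this model; once robots remove the cap on that variable, output has no obvious ceiling short of physical limits like the mass of Earth.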
00:42:21You guys just turned
00:42:23on Colossus,
00:42:24which is like
00:42:25the largest
00:42:25private compute cluster,
00:42:28I guess,
00:42:28of GPUs anywhere.
00:42:29Is that right?
00:42:30It's the most powerful
00:42:32supercomputer of any kind.
00:42:33which sort of speaks
00:42:36to what David said
00:42:37and kind of what
00:42:37Peter said,
00:42:38which is a lot
00:42:39of the kind
00:42:40of economic value
00:42:41so far of AI
00:42:42has entirely
00:42:44gone to NVIDIA.
00:42:45But there are people
00:42:46with alternatives
00:42:47and you're actually
00:42:47one with an alternative.
00:42:48Now,
00:42:49you have a very
00:42:49specific case
00:42:50because Dojo
00:42:50is really about
00:42:51images and large images,
00:42:53huge video.
00:42:55Yeah,
00:42:56I mean,
00:42:56the Tesla problem
00:42:57is different
00:42:58from the,
00:42:59you know,
00:43:01the sort of LLM problem.
00:43:03The nature
00:43:04of the intelligence,
00:43:05and what matters
00:43:09in the AI,
00:43:10is different,
00:43:11to the point
00:43:13you just made,
00:43:13which is that
00:43:14in Tesla's case,
00:43:15the context
00:43:16length is very long.
00:43:18So you've got
00:43:18gigabytes of context.
00:43:19Gigabyte of context
00:43:20windows, yeah.
00:43:20Yeah, you've got,
00:43:21you know,
00:43:22sort of.
00:43:23I was just bringing it up.
00:43:25Kind of billions
00:43:25of tokens of context,
00:43:26nutty amount of context
00:43:28because you've got
00:43:29seven cameras
00:43:31and if you've got
00:43:32several,
00:43:32you know,
00:43:33let's say you've got
00:43:33a minute of several
00:43:35high def cameras,
00:43:37then that's gigabytes.
00:43:40So the Tesla problem
00:43:41is you've got to compress
00:43:42a gigantic context
00:43:44into the pixels
00:43:47that actually matter,
00:43:51and condense that
00:43:53over time.
00:43:55So, in both
00:43:56the time dimension
00:43:57and the space dimension,
00:43:58you've got to compress
00:43:58the pixels
00:43:59in space
00:44:01and the pixels
00:44:02over time,
00:44:02and then have
00:44:06that inference done
00:44:08on a tiny computer,
00:44:09relatively speaking,
00:44:12a few hundred watts.
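What compressing "in the time dimension and the space dimension" means mechanically can be sketched with plain array pooling. The shapes and factors below are illustrative assumptions, not Tesla's actual pipeline:

```python
import numpy as np

def pool_spacetime(frames: np.ndarray, t_factor: int, s_factor: int) -> np.ndarray:
    """Average-pool a (time, height, width, channels) video tensor by
    t_factor in time and s_factor in each spatial dimension."""
    t, h, w, c = frames.shape
    assert t % t_factor == 0 and h % s_factor == 0 and w % s_factor == 0
    x = frames.reshape(t // t_factor, t_factor,
                       h // s_factor, s_factor,
                       w // s_factor, s_factor, c)
    return x.mean(axis=(1, 3, 5))

# Tiny stand-in for one camera's clip; real numbers (seven cameras, a minute
# of high-def video) put the raw context in the gigabyte range.
video = np.ones((24, 96, 128, 3), dtype=np.float32)
compressed = pool_spacetime(video, t_factor=4, s_factor=8)
print(compressed.shape)  # (6, 12, 16, 3): 4x fewer frames, 64x fewer pixels per frame
```

A real driving stack would use learned encoders rather than naive averaging, but the bookkeeping is the same: shrink the tensor along time and space until the remaining context fits the inference budget.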
00:44:14It's a Tesla-designed
00:44:15AI inference computer,
00:44:17which is, by the way,
00:44:18still the best,
00:44:19there isn't a better thing
00:44:20we could buy
00:44:20from suppliers.
00:44:22So the Tesla-designed
00:44:23AI inference computer
00:44:24that's in the cars
00:44:25is better than anything
00:44:26we could buy
00:44:27from any supplier.
00:44:28Just, by the way,
00:44:29that's kind of a,
00:44:30the Tesla AI chip team
00:44:33is extremely good.
00:44:33You guys,
00:44:34in the design,
00:44:35there was a technical paper
00:44:36and there was a deck
00:44:36that somebody on your team
00:44:37from Tesla published
00:44:39and it was stunning to me.
00:44:41You designed your own
00:44:42transport control,
00:44:43like, layer over Ethernet.
00:44:44You were like,
00:44:45ah, Ethernet's not good enough
00:44:46for us.
00:44:46You have this TTPoE
00:44:49or something
00:44:49and you're like,
00:44:49oh, we're just going
00:44:50to reinvent Ethernet
00:44:51and, like,
00:44:51string these chips.
00:44:52It's pretty incredible stuff
00:44:53that's happening over there.
00:44:54Yeah.
00:44:55No, the team,
00:44:57the Tesla chip design team
00:44:59is extremely,
00:45:00extremely good.
00:45:02So.
00:45:03But is there a world
00:45:04where, for example,
00:45:05other people over time
00:45:06that need, you know,
00:45:07some sort of, like,
00:45:08video use case
00:45:09or image use case
00:45:10could theoretically,
00:45:11you know,
00:45:11you'd say,
00:45:12oh, why not?
00:45:13You know,
00:45:13I have some extra cycles
00:45:14over here,
00:45:14so.
00:45:15We should kind of
00:45:16make you a competitor
00:45:16of NVIDIA.
00:45:17It's not intentionally,
00:45:18per se, but.
00:45:20Yeah, I mean,
00:45:21the,
00:45:22you know,
00:45:24there's this training
00:45:25and inference,
00:45:26and we do have
00:45:27those two projects
00:45:28at Tesla.
00:45:29We've got Dojo,
00:45:29which is the training
00:45:30computer,
00:45:31and then,
00:45:33you know,
00:45:34our inference chip,
00:45:37which is in every car,
00:45:38inference computer.
00:45:39So,
00:45:42and Dojo,
00:45:43we've only had Dojo 1.
00:45:44Dojo 2 is,
00:45:47you know,
00:45:47should be,
00:45:47we should have Dojo 2
00:45:48in volume towards
00:45:49the end of next year.
00:45:51And that will be,
00:45:53we think,
00:45:54sort of comparable
00:45:56to a B200-type
00:46:00training system.
00:46:03And,
00:46:03you know,
00:46:05so there's,
00:46:06I guess there's some
00:46:06potential for that
00:46:07to be used
00:46:09as a service.
00:46:12But,
00:46:13you know,
00:46:19with Dojo, I guess I have
00:46:20some improved confidence,
00:46:24but I think
00:46:26we won't really know
00:46:27how good Dojo is
00:46:29until probably
00:46:30version 3.
00:46:31It usually takes
00:46:32three major iterations
00:46:33on a technology
00:46:36for it to be excellent.
00:46:36and we'll only
00:46:38have the second
00:46:39major iteration
00:46:39next year.
00:46:42The third iteration,
00:46:44I don't know,
00:46:44maybe late,
00:46:45you know,
00:46:4626 or something
00:46:47like that.
00:46:48How's the,
00:46:48how's the Optimus
00:46:49project going?
00:46:50I remember
00:46:50when we talked last,
00:46:51and you said
00:46:52this publicly,
00:46:53that it's in
00:46:54doing some light
00:46:55testing inside
00:46:56the factory.
00:46:57Yeah.
00:46:57So it's actually
00:46:58being useful.
00:47:00What's the bill
00:47:00of materials
00:47:00and when,
00:47:02you know,
00:47:02for something like
00:47:03that at scale,
00:47:04so when you start
00:47:04making it like
00:47:05you're making
00:47:05the Model 3 now
00:47:06and there's a
00:47:06million of them
00:47:07coming off
00:47:07the factory line,
00:47:09what would they
00:47:09cost?
00:47:1020, 30,
00:47:1040,000 dollars
00:47:11you think?
00:47:12Yeah.
00:47:13I mean,
00:47:13I've discovered
00:47:14really that,
00:47:15you know,
00:47:17anything made
00:47:18in sufficient volume
00:47:19will asymptotically
00:47:20approach the cost
00:47:21of its materials.
00:47:25So,
00:47:25now there's,
00:47:26I should say,
00:47:27there's,
00:47:28some things are
00:47:29constrained by
00:47:29the cost of
00:47:30intellectual property
00:47:31and like paying
00:47:32for patents and
00:47:32stuff.
00:47:33So a lot of,
00:47:34you know,
00:47:34what's in a chip
00:47:36is like paying
00:47:37royalties
00:47:38and depreciation
00:47:40of the chip fab.
00:47:41So,
00:47:42but the actual
00:47:42marginal cost
00:47:43of the chips
00:47:43is very low.
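The claim that sufficient volume drives cost toward the materials floor is just fixed-cost amortization, and can be sketched in a few lines; every number below is a made-up illustration, not a Tesla figure:

```python
def unit_cost(materials: float, fixed_costs: float, volume: float) -> float:
    """Toy cost model: per-unit materials and labor, plus fixed costs
    (tooling, R&D, IP royalties, fab depreciation) amortized over volume."""
    return materials + fixed_costs / volume

materials = 8_000.0       # assumed per-unit materials and labor
fixed = 1_000_000_000.0   # assumed development and tooling spend

print(unit_cost(materials, fixed, 10_000))     # 108000.0 at low volume
print(unit_cost(materials, fixed, 1_000_000))  # 9000.0 at a million units a year

# As volume grows, the amortized term vanishes and the unit cost
# asymptotically approaches the cost of materials.
```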
00:47:46So Optimus
00:47:47obviously
00:47:48is a humanoid robot;
00:47:50it weighs much less
00:47:51and it's much smaller
00:47:52than a car.
00:47:54So the,
00:47:55you could expect
00:47:55that in high volume
00:47:57and I'd say
00:48:00that you also
00:48:01probably need
00:48:01three production
00:48:02versions of Optimus.
00:48:03So you need
00:48:04to refine the design
00:48:05three,
00:48:06at least
00:48:07three major times
00:48:08and then you need
00:48:10to scale production
00:48:10to sort of
00:48:13the million unit
00:48:13plus per year level.
00:48:15And
00:48:16I think
00:48:17at that point
00:48:18the cost of
00:48:22the labor and materials
00:48:23on Optimus
00:48:24is probably not
00:48:25much more than
00:48:26$10,000.
00:48:27Yeah,
00:48:28and that's a decade
00:48:28long journey maybe?
00:48:29Basically think of it
00:48:30like Optimus
00:48:31will cost less
00:48:32than
00:48:33a
00:48:34small car.
00:48:36Right.
00:48:37So
00:48:37at scale volume
00:48:40with
00:48:40three major iterations
00:48:41of technology
00:48:42and so
00:48:43if a small car
00:48:44costs $25,000
00:48:46it's probably
00:48:48like $20,000
00:48:49for an Optimus
00:48:51for a humanoid robot
00:48:52that can be
00:48:53your buddy
00:48:54like a combination
00:48:55of R2-D2
00:48:56and C-3PO
00:48:57but better.
00:48:59Yeah,
00:48:59I mean...
00:49:00Honestly,
00:49:01I think people
00:49:01are going to get
00:49:02really attached
00:49:02to their humanoid robot
00:49:04because I mean
00:49:04like you look at
00:49:05sort of
00:49:06you watch Star Wars
00:49:06and it's like
00:49:07R2-D2
00:49:07and C-3PO
00:49:08I love those guys.
00:49:10Yeah.
00:49:11You know,
00:49:11they're awesome.
00:49:11And their personality
00:49:14and I mean
00:49:15all R2 could do
00:49:17is just beep at you.
00:49:18Right.
00:49:19It couldn't speak English.
00:49:22And you need C-3PO
00:49:23to translate the beeps.
00:49:25So you're in year two
00:49:26of that
00:49:26if you did
00:49:27two or three years
00:49:28per iteration
00:49:28or something
00:49:29it's a decade long journey
00:49:30for this to hit
00:49:31some sort of scale.
00:49:33I would say
00:49:34major iterations
00:49:35are less than
00:49:36two years
00:49:36so
00:49:36it's probably
00:49:39on the order
00:49:40of five years.
00:49:41Yeah.
00:49:44Maybe six
00:49:44to get to
00:49:45a million units a year.
00:49:47And at that price point
00:49:48everybody can afford one.
00:49:49Yes.
00:49:50On planet Earth.
00:49:50I mean it's going to be
00:49:52that one to one
00:49:52two to one
00:49:53what do you think
00:49:54ultimately
00:49:54if we're sitting here
00:49:55in 30 years
00:49:56the number of robots
00:49:57on the planet
00:49:58versus humans.
00:50:00Yeah.
00:50:00I think the number of robots
00:50:01will vastly exceed
00:50:01the number of humans.
00:50:03Vastly.
00:50:03Vastly exceed.
00:50:04I mean you have to say
00:50:04who would not want
00:50:06their robot buddy.
00:50:09Everyone wants
00:50:09a robot buddy.
00:50:10All right.
00:50:11You know
00:50:13this is like
00:50:14especially if it can
00:50:16you know
00:50:16it can take care
00:50:19of your
00:50:19take your dog
00:50:20for a walk
00:50:21it could
00:50:21you know
00:50:22mow the lawn
00:50:23it could
00:50:24watch your kids
00:50:25it could
00:50:26you know
00:50:27it could
00:50:28teach your kids
00:50:29it could
00:50:30we could
00:50:30we could also
00:50:31send it to Mars
00:50:31yeah
00:50:32we could send a lot
00:50:33of robots to Mars
00:50:34to do the work needed
00:50:35to make it
00:50:36a colonized planet
00:50:38for humans
00:50:38and Mars is already
00:50:39the robot planet
00:50:39there's like a whole
00:50:40bunch of you know
00:50:41robots like rovers
00:50:42and helicopter
00:50:43yes only robots
00:50:44so yeah
00:50:46so yeah
00:50:46no I think
00:50:49the sort of
00:50:49useful
00:50:51humanoid robot
00:50:52opportunity
00:50:53is the single
00:50:54biggest
00:50:54opportunity
00:50:55ever
00:50:57because
00:51:02if you assume
00:51:03like
00:51:03I mean
00:51:04the ratio
00:51:05of humanoid robots
00:51:06to humans
00:51:07is going to be
00:51:07at least
00:51:07two to one
00:51:08maybe three to one
00:51:09because everybody
00:51:10will want one
00:51:11and then there'll
00:51:12be a bunch of
00:51:12robots that
00:51:13you don't see
00:51:14that are making
00:51:14goods and services
00:51:15And you think
00:51:16it's one generalized
00:51:17robot that then
00:51:18learns how to do
00:51:18different tasks, or...
00:51:19yeah
00:51:20hey
00:51:21I mean
00:51:21we are a
00:51:22generalized
00:51:23yeah we're
00:51:24a general
00:51:25we're just
00:51:25made of meat
00:51:26yeah exactly
00:51:27we're a meat bot
00:51:29a generalized meat bot
00:51:30yeah I mean
00:51:31I'm operating
00:51:31my meat puppet
00:51:32you know
00:51:32so yeah
00:51:35we are actually
00:51:36and by the way
00:51:38it turns out
00:51:38like as we're
00:51:40designing Optimus
00:51:40we sort of
00:51:41learn more
00:51:41and more
00:51:42about why
00:51:43humans are
00:51:44shaped the way
00:51:44they're shaped
00:51:45and you know
00:51:46and why we have
00:51:48five fingers
00:51:48and why your
00:51:49little finger
00:51:49is smaller
00:51:50than you know
00:51:51your index finger
00:51:51you know
00:51:53obviously why
00:51:55you have
00:51:55opposable thumbs
00:51:56but also
00:51:57why, for example,
00:51:59the major muscles
00:52:00that operate
00:52:01your hand
00:52:02are actually
00:52:03in your forearm,
00:52:06and your fingers
00:52:06are primarily
00:52:07operated by tendons.
00:52:11The muscles
00:52:11that actuate your fingers
00:52:12are located there;
00:52:14the vast majority
00:52:15of your finger strength
00:52:16is actually
00:52:17coming from
00:52:17your forearm,
00:52:18and your fingers
00:52:20are being operated
00:52:20by tendons,
00:52:22little strings.
00:52:25and so
00:52:26the current version
00:52:27of the Optimus
00:52:28hand
00:52:28has the actuators
00:52:30in the hand
00:52:31and has only
00:52:3211 degrees
00:52:33of freedom
00:52:33so it doesn't have
00:52:35all the degrees
00:52:36of freedom
00:52:36of a human hand,
00:52:37which has
00:52:38depending on how
00:52:38you count it
00:52:39roughly 25 degrees
00:52:40of freedom
00:52:40and
00:52:43and
00:52:45and
00:52:45and it's also
00:52:46like
00:52:46not strong enough
00:52:48in certain ways
00:52:49because the actuators
00:52:50have to fit in the hand
00:52:51so the next generation
00:52:53Optimus hand
00:52:54which we have
00:52:55in prototype form
00:52:56the actuators
00:52:58have moved to the forearm
00:52:59just like a human
00:53:00and they operate
00:53:01the fingers
00:53:02through cables
00:53:03just like a human hand
00:53:04and
00:53:06and
00:53:07and the next generation
00:53:08hand has
00:53:0822 degrees of freedom
00:53:09which we think
00:53:11is
00:53:12enough to do
00:53:14almost anything
00:53:14that a human can do
00:53:16and
00:53:19presumably
00:53:19I think it was written
00:53:21that
00:53:21X and Tesla
00:53:23may work together
00:53:24and
00:53:24you know
00:53:25provide services
00:53:26but my immediate thought
00:53:27went to
00:53:28oh if you just
00:53:28provide a grok
00:53:29to the robot
00:53:29then the robot
00:53:30has a personality
00:53:31and can process
00:53:32voice
00:53:33and video
00:53:34and images
00:53:34and all of that stuff
00:53:35as we wrap here
00:53:37I think
00:53:39you know
00:53:40everybody talks
00:53:41about
00:53:41all the projects
00:53:42you're working on
00:53:43but
00:53:43people don't know
00:53:44you have a great
00:53:45sense of humor
00:53:45that's not true
00:53:47oh you do
00:53:47you do
00:53:48people don't see it
00:53:49but
00:53:50I would say
00:53:51I know for me
00:53:52the funniest week
00:53:53of my life
00:53:53or one of the funniest
00:53:54was when you did SNL
00:53:55and we got
00:53:56and you
00:53:56I got to tag along
00:53:58maybe you saw it
00:53:59maybe
00:54:01behind the scenes
00:54:03like
00:54:03some of your funniest
00:54:04recollections
00:54:05of
00:54:06that chaotic
00:54:07insane
00:54:07week
00:54:09when we laughed
00:54:10for 12 hours a day
00:54:11it was a little terrifying
00:54:12on the first couple of days
00:54:13but
00:54:13yeah
00:54:14I was a bit worried
00:54:16at the beginning there
00:54:17because
00:54:17frankly
00:54:18nothing was funny
00:54:19day one was rough
00:54:22rough
00:54:23yeah
00:54:25so
00:54:26I mean
00:54:26it's like a rule
00:54:28but can't you guys
00:54:28just say it
00:54:29just say the
00:54:29stuff that got on
00:54:31the cutting
00:54:31the funniest skits
00:54:34were the ones
00:54:34they didn't let you do
00:54:35that's what I'm saying
00:54:35can you just say it
00:54:36there were a couple
00:54:36of funny ones
00:54:37yeah
00:54:37that they didn't let you do
00:54:38you can say it
00:54:38so that he doesn't get
00:54:39I mean
00:54:40how much time
00:54:41do we have here
00:54:41well we should just
00:54:42give him one or two
00:54:43because it was
00:54:44in your mind
00:54:46which one do we regret
00:54:47most
00:54:47not getting on air
00:54:48you really want
00:54:52to hear that
00:54:52I mean
00:54:53I mean it was
00:54:55a little spicy
00:54:56it was a little funny
00:54:58okay
00:55:00here we go
00:55:03all right
00:55:03here we go guys
00:55:04all right
00:55:07so one of the things
00:55:10that I think
00:55:11everyone's been
00:55:12sort of wondering
00:55:13this whole time
00:55:13is
00:55:14is Saturday Night Live
00:55:16actually live
00:55:17like
00:55:19live
00:55:19live live
00:55:20live
00:55:20or do they have
00:55:21like a delay
00:55:22or like
00:55:23just in case
00:55:24you know
00:55:25there's a war probe
00:55:25malfunction
00:55:26or something like that
00:55:27is it like a
00:55:29you know
00:55:29truly live
00:55:30five second delay
00:55:31what's really going on
00:55:32but there's a way
00:55:34to test this
00:55:34right
00:55:35we can't walk away
00:55:37there's a way to test this
00:55:38which is
00:55:40we don't tell them
00:55:41what's going on
00:55:42I walk on
00:55:43and say
00:55:43this is the script
00:55:44I'll throw it on the ground
00:55:46we're going to find out
00:55:47tonight
00:55:48right now
00:55:49if Saturday Night Live
00:55:50if Saturday Night Live
00:55:51is actually live
00:55:53and the way
00:55:55that we're going to do this
00:55:56is I'm going to
00:55:57take my cock out
00:55:58this is the greatest pitch ever
00:56:07and if
00:56:08if you see my cock
00:56:10you know
00:56:13it's true
00:56:15and if you don't
00:56:17it's been a lie
00:56:18a lie
00:56:19all these years
00:56:20all these years
00:56:21we're going to bust them
00:56:22right now
00:56:23and this
00:56:24we're pitching this
00:56:25yeah yeah
00:56:26so we're pitching this
00:56:26on Zoom
00:56:27yeah we're pitching this
00:56:28on Zoom
00:56:28on like a Monday
00:56:29because it's COVID
00:56:30yeah we're like
00:56:31kind of hungover
00:56:32from the weekend
00:56:32and we're like
00:56:33pitching this
00:56:33we're in Miami
00:56:33yeah
00:56:34and it's
00:56:35you know
00:56:36Jason's on
00:56:37and
00:56:39Mike and you
00:56:40yeah
00:56:40and Mike
00:56:41you know
00:56:42I've essentially got, like,
00:56:42you know
00:56:43my friends who I think
00:56:44are sort of
00:56:45you know
00:56:46quite funny
00:56:46you know
00:56:48Jason's quite funny
00:56:49I think like
00:56:50Jason's the closest thing
00:56:52to Cartman
00:56:53that exists
00:56:54in real life.
00:56:54we have a joke going
00:56:57that he's Butters
00:56:58and I'm Cartman
00:56:59yeah
00:56:59so
00:57:01and then
00:57:03I heard Mike's
00:57:04pretty funny too
00:57:05so
00:57:06so we come in
00:57:08like just like
00:57:09guns blazing
00:57:09guns blazing
00:57:10with like ideas
00:57:11and we didn't realize
00:57:12like actually
00:57:13you know
00:57:14that's not how it works
00:57:15and
00:57:15that's
00:57:17that's normally like
00:57:18actors
00:57:18and they just get told
00:57:19what to do
00:57:20and like oh
00:57:20what you mean
00:57:21we can't just like
00:57:22do funny things
00:57:23that we thought of
00:57:24what
00:57:25they're
00:57:25they're watching this
00:57:26and on the zoom
00:57:27they're aghast
00:57:28at like Elon's pitch
00:57:30yeah
00:57:30and it's silence
00:57:31like so
00:57:32and I'm like
00:57:33and I was like
00:57:33is this thing working
00:57:34is this
00:57:35are we muted
00:57:36is our mic on
00:57:37and they're like
00:57:38we hear you
00:57:39and then
00:57:39and then
00:57:39after a long silence
00:57:41like Mike
00:57:42Mike just says
00:57:43the word crickets
00:57:43and they're not laughing
00:57:45so
00:57:46not even a chuckle
00:57:48I'm like
00:57:48what's going on
00:57:49and then Elon
00:57:49explains the punchline
00:57:50which is
00:57:51exactly
00:57:52so
00:57:52there's more to it
00:57:54okay
00:57:55yes
00:57:55that's just the beginning
00:57:59so Elon says
00:58:01so
00:58:01so then I'm
00:58:02so then I'm like
00:58:03so
00:58:04so
00:58:04so
00:58:04so I say
00:58:05look
00:58:05I'm gonna
00:58:06I'm gonna reach down
00:58:08into my pants
00:58:12into my pants
00:58:13and I stick my hand
00:58:14in my pants
00:58:14and I'm gonna
00:58:15and I'm
00:58:15and I'm gonna pull my cock
00:58:16and I tell this to the audience
00:58:17and the audience
00:58:18is gonna be like
00:58:18what
00:58:19and
00:58:25and then I pull out
00:58:26a baby rooster
00:58:28you know
00:58:31yes
00:58:31and it's like
00:58:32okay this is
00:58:33kind of PG
00:58:34you know
00:58:34it's like
00:58:35not that bad
00:58:35it's like
00:58:36this is my tiny cock
00:58:41and it's like
00:58:43what do you think
00:58:45and so then
00:58:47and do you think
00:58:48it's a nice cock
00:58:48I mean
00:58:49I like it
00:58:50and I pitch
00:58:50I'm like
00:58:51and then Kate McKinnon
00:58:52walks out
00:58:53yeah exactly
00:58:53and I'm like
00:58:54oh no
00:58:54but you haven't heard
00:58:55half of it
00:58:55so Kate McKinnon
00:58:56comes out
00:58:56yeah
00:58:56and she says
00:58:57Elon
00:58:58I expected
00:58:59you would have
00:59:00a bigger cock
00:59:01yeah
00:59:02I was like
00:59:03I don't mean
00:59:04to disappoint you Kate
00:59:05but
00:59:05yeah
00:59:07but I hope you like it anyway
00:59:08and then
00:59:09but Kate's
00:59:11gotta come out
00:59:12with her cat
00:59:12okay
00:59:13right
00:59:13so
00:59:14and Kate says
00:59:16you can see where this is going
00:59:18and I say
00:59:19nice
00:59:20wow
00:59:22that's a nice pussy
00:59:22you've got there Kate
00:59:23wow
00:59:25that's amazing
00:59:27um
00:59:27it looks a little wet
00:59:29was it raining outside
00:59:30and then
00:59:33Kate says
00:59:35do you mind if I stroke your pussy
00:59:36is that cool
00:59:37it's like
00:59:38oh no
00:59:39Elon
00:59:39actually
00:59:40can I hold your cock
00:59:41of course
00:59:42of course Kate
00:59:43you will
00:59:43and then they trade
00:59:44hold my cock
00:59:45um
00:59:46and then
00:59:46you know
00:59:47we exchange
00:59:48and I think
00:59:48just the audio version
00:59:49of this is pretty good
00:59:50right
00:59:50um
00:59:51and
00:59:53and um
00:59:53you know
00:59:54it's just like
00:59:54wow
00:59:55I really like
00:59:56um
00:59:57stroking your cock
00:59:59and then
01:00:02Elon says
01:00:03I'm really enjoying
01:00:04stroking your pussy
01:00:06yes
01:00:06of course
01:00:07and
01:00:08um
01:00:08yeah
01:00:09so
01:00:09you know
01:00:10they're looking at us
01:00:11like
01:00:12oh my god
01:00:13what have we done
01:00:14inviting these lunatics
01:00:15on the program
01:00:16yeah
01:00:17and then
01:00:18they said
01:00:18like
01:00:19well
01:00:21it is Mother's Day
01:00:22it's Mother's Day
01:00:25we might not want to go
01:00:27with this one
01:00:27while the mom's in the audience
01:00:28and I'm like
01:00:29well that's a good point
01:00:30fair
01:00:31fair
01:00:31it might be a bit uncomfortable
01:00:33for all the moms in the audience
01:00:34maybe
01:00:34I don't know
01:00:35I don't know
01:00:35maybe they'll dig it
01:00:36maybe they'll like it
01:00:37so
01:00:38yeah
01:00:39yeah
01:03:44that's the
01:00:46cold open
01:00:47that didn't make it
01:00:48we didn't get that
01:00:48on the air
01:00:49um
01:00:50but uh
01:00:51we did fight for Doge
01:00:53yes
01:00:54and we got Doge
01:00:55on the air
01:00:55I mean there's a bunch of things
01:00:56that I said
01:00:57that were just not on the script
01:00:58like they have these like
01:00:59cue cards for
01:00:59what you're supposed to say
01:01:00and I just didn't say it
01:01:02I just went off the rails
01:01:02yeah
01:01:03they didn't see that coming
01:01:05yeah
01:01:06it's live
01:01:06well
01:01:07it's live
01:01:09and uh
01:01:10so
01:01:11the
01:01:12Elon wanted to do Doge
01:01:14this is the other one
01:01:15and he wanted to do Doge
01:01:16on late night
01:01:17and he says
01:01:18hey J-Cal
01:01:18can you um
01:01:19make sure
01:01:19oh yeah
01:01:20I wanted to do the Doge Father
01:01:22like you sort of redo the
01:01:23you know
01:01:23that scene from
01:01:24uh
01:01:24the the Godfather
01:01:26I mean you kind of need the music
01:01:27to cue things up
01:01:28you bring me
01:01:33on my daughter's wedding
01:01:35this and you ask for Doge
01:01:37yeah you got
01:01:38Marlon Brando
01:01:38and I give you Bitcoin
01:01:39but you want Doge
01:01:40exactly
01:01:41you really got to set the mood
01:01:43you got to have a tuxedo
01:01:44and this sort of job office
01:01:45and the Doge Father
01:01:46and you got to be
01:01:47talking like Marlon Brando
01:01:49and I said
01:01:52you come to me
01:01:53on this day
01:01:54of my Doge's wedding
01:01:56and you ask me
01:01:58for your private keys
01:01:59are you even a friend
01:02:04you call me
01:02:06the Doge Father
01:02:07so
01:02:09that's potential
01:02:14they had great potential
01:02:15so they come to me
01:02:16and I'm talking to Colin Jost
01:02:19who's got a great sense of humor
01:02:21and he's amazing
01:02:21he loves Elon
01:02:22and he's like
01:02:23we can't do it
01:02:24because of the law
01:02:25and stuff like that
01:02:25the law
01:02:26and liability
01:02:28so I said
01:02:29it's okay
01:02:29Elon called Comcast
01:02:31and
01:02:32he put in an offer
01:02:34and they just accepted it
01:02:35he just bought NBC
01:02:37so it's fine
01:02:39yeah
01:02:39and Colin Jost looks at me
01:02:42and I just sold it
01:02:43so good
01:02:43and he's like
01:02:44you're serious
01:02:46I'm like
01:02:46yep
01:02:47we own NBC now
01:02:49yeah
01:02:50and he's like
01:02:51okay well that kind of
01:02:52changes things doesn't it
01:02:53I'm like absolutely
01:02:54we're a go on Doge
01:02:56yeah
01:02:57and then he's like
01:02:58you're fucking with me
01:02:58and I'm like
01:02:59I'm fucking with you
01:03:00or are we
01:03:02or are we
01:03:03it was the greatest
01:03:05week of
01:03:07and that like
01:03:08is like
01:03:08two of ten stories
01:03:10yeah
01:03:10we'll save the other eight
01:03:12yeah
01:03:12but it was
01:03:13and I was just so happy
01:03:15for you
01:03:15to see you
01:03:17have a great week
01:03:18of just joy
01:03:19and fun
01:03:19and letting go
01:03:20because you were
01:03:21launching rockets
01:03:22you're dealing with
01:03:22so much bullshit
01:03:23in your life
01:03:23to have those moments
01:03:25yeah
01:03:25to share them
01:03:26and just laugh
01:03:27it was just so great
01:03:29yeah
01:03:29more of those moments
01:03:30I think we gotta get you back
01:03:32on SNL
01:03:33who wants him back
01:03:34on SNL
01:03:34one more time
01:03:35all right ladies
01:03:36and gentlemen
01:03:37our bestie
01:03:38Elon Musk
01:03:38thank you for the love of the Lord
01:03:49and thank you for the love of the Lord
01:03:50and thank you for the love of the Lord
01:03:51and thank you for the love of the Lord
01:03:52and thank you for the love of the Lord
01:03:53and thank you for the love of the Lord
01:03:55and thank you for the love of the Lord
01:03:56and thank you for the love of the Lord
01:03:57and thank you for the love of the Lord
01:03:58and thank you for the love of the Lord
01:03:59and thank you and for the Lord
01:04:00and thank you for the love of the Lord
01:04:01and thank you for the love of the Lord
01:04:02and thank you for the love of the Lord
01:04:03and thank you for the love of the Lord
01:04:04and thank you for the Lord
01:04:05and thank you for the love of the Lord
