- 5/31/2025
Friday Night Live 30 May 2025
In this episode, I discuss the profound implications of artificial intelligence on society, prompted by Mike Cernovich's warnings about rapid advancements. I challenge the narrative that technology boosts productivity, citing how it often leads to inefficiency and bureaucratic overload.
While I acknowledge AI's potential—drawing from my own experiences with technology in research and the promise of self-driving vehicles—I candidly address the likely job losses it may bring. I urge listeners to adapt to change and reflect on their own journeys, advocating for mutual care in relationships while maintaining boundaries in our emotional investments.
GET MY NEW BOOK 'PEACEFUL PARENTING', THE INTERACTIVE PEACEFUL PARENTING AI, AND THE FULL AUDIOBOOK!
https://peacefulparenting.com/
Join the PREMIUM philosophy community on the web for free!
Subscribers get 12 HOURS on the "Truth About the French Revolution," multiple interactive multi-lingual philosophy AIs trained on thousands of hours of my material - as well as AIs for Real-Time Relationships, Bitcoin, Peaceful Parenting, and Call-In Shows!
You also receive private livestreams, HUNDREDS of exclusive premium shows, early release podcasts, the 22 Part History of Philosophers series and much more!
See you soon!
https://freedomain.locals.com/support/promo/UPB2025
Transcript
00:00:00Good evening, third time Lucky. Welcome, welcome to Friday Night Live, 30th of May, 2025.
00:00:05Freedomain.com slash donate. Let's keep things spicy.
00:00:09If you've got great questions, let me know, or let me know if you'd like me to start with a rant. I've kind of just been ranting.
00:00:19Ranting? Would you like a rant?
00:00:20Would you like a rant? Because Mike Cernovich, Monsieur Michael von Cernovich, had a bit of a banger tweet that went out, I think, yesterday or the day before, and somebody's just sort of pointing it out here, where he was saying, what was he saying here?
00:00:41He said, let me just get this here. He said, I'm not a doomer and have always been pro-tech acceleration.
00:00:48Everyone I'm talking to who works in AI, man, we're about to get hit by a freight train.
00:00:55What do you think, Steph? Will AI really be a freight train?
00:00:59Well, what are your thoughts? I'm happy to hear your thoughts.
00:01:03I can certainly share my thoughts about all of this, and you can tell me what you think of it.
00:01:10It is going to be a rant. I will be straight up about that.
00:01:13And it is going to be a rant that may be a little too spicy for you.
00:01:19It's going to be a rant that is going to be frank.
00:01:26So I will have my rant, and then you can tell me what you think.
00:01:31Now, here's my rant.
00:01:34I don't care. I don't care.
00:01:37I look within, right? I look within, and I try to be honest with you.
00:01:43I aim to be honest with you guys.
00:01:44I don't care.
00:01:48So some of the studies are not great with regards to AI and productivity.
00:01:53There was one study of managers which found that AI had improved their productivity by only about 3%.
00:02:02Computers had a sort of similar thing.
00:02:07Everyone said, oh, my God, computers are going to make us so much more efficient.
00:02:10It's not really the case.
00:02:12Human beings adapt to technology and have an amazing genius at finding ways to destroy productivity gains from technology.
00:02:23It's really quite remarkable how we're able to do that.
00:02:26So, oh, email is going to be so much more efficient.
00:02:28And no, because sometimes phone calls are much better, right?
00:02:33I mean, a lot of people who have social anxiety just end up emailing all the time.
00:02:37They're not very good at communicating.
00:02:38You can't read between the lines, whereas you can read people's emotions with regards to especially face-to-face calls.
00:02:45You can read people's emotions on a phone call much better than you can an email.
00:02:49And, of course, the email flood, the, oh, it's so much easier.
00:02:53Now we have Zoom or now we have other things.
00:02:56We can just do these meetings in so much more of an efficient manner.
00:02:59My God, it's beautiful.
00:03:00It's like, no, now you just have a bunch of made-up HR jobs where people have endless meetings with no particular purpose or point.
00:03:07So, we have an amazing ability to completely screw up productivity gains.
00:03:13Now, in a free society, that wouldn't really be the case, of course, but it certainly is the case in the society that we have now.
00:03:20I have to watch this tendency in myself to not keep tinkering and fussing with things to the point where I wreck the productivity gains of having all of this great technology.
00:03:30So, for instance, I've done some article reviews recently, and I just hold my tablet, and I'm not, oh, let's do the separate, slightly better audio with the, you know, all of this.
00:03:42It's like, no, just boot it up and talk into the microphone.
00:03:45It's fine.
00:03:48It's fine.
00:03:48And to not fuss with massive productivity gains, as you can see, you know, the studio here is not very, it is not very sophisticated.
00:04:01It is me inside an aging ping-pong ball of gray testicle doominess.
00:04:07And I really want you guys to focus on the ideas and the arguments and not be distracted by some sort of background nonsense.
00:04:14So, there is that.
00:04:16Now, on the other hand, on the other hand, I think that the technology is a lot further ahead than people think.
00:04:23I've worked in tech R&D, and the stuff that's in the pipeline is way better than the stuff that's out here in the world.
00:04:31And so, I think with regards to physical robots, everyone's like, yeah, but they can't clean toilets.
00:04:36Yeah, they can.
00:04:37Yeah, but they can't, you know, assemble this out of the other.
00:04:40It's like, yes, they can.
00:04:42They really can do absolutely wild and amazing stuff.
00:04:45Now, they can't do philosophy yet, but I'll give you sort of an example of an AI productivity that I used.
00:04:53I'm working on this new novel, which is, oh, my God, it's so good.
00:04:57Oh, my God, it's so good.
00:05:01I, you know, sometimes I amaze even myself.
00:05:04It's an old line from Star Wars.
00:05:05But, no, it's really, I've had a real breakthrough in writing and all of that.
00:05:10So, I needed to, you know, one of the characters is a lawyer, and so I needed a legal situational scenario.
00:05:16Now, in the past, I'd have buried myself in books and done research for a week or two.
00:05:21But with AI, you can just say, give me this scenario.
00:05:24Tell me, give me a Canadian context and, you know, whatever, right?
00:05:29And it can map it all out for you and get all the research done together for you.
00:05:32It's really amazing as far as that goes.
00:05:34So, for me, it is really helpful and good for these kinds of things because when you're a novelist, you have to become an expert in just about everything.
00:05:45And AI is just a remarkable research assistant as far as all of that goes.
00:05:51Now, we've got self-driving cars.
00:05:56I told you guys a couple of months ago, I went for a test drive in a Cybertruck.
00:06:03Not that I'm buying a Cybertruck, obviously.
00:06:05My car is eight years old, something like that, still younger than my phone.
00:06:11Oh, no, my phone's about six years old.
00:06:15But I was just curious about the technology.
00:06:18It will blow your gourd, man.
00:06:20It will blow your gourd.
00:06:21It is unbelievably great.
00:06:24It is unbelievably great.
00:06:25So, it drives you to your destination, no hassle, no problem.
00:06:29And also, when you get into a parking space, you get into a parking lot, it'll show you the parking spots around you.
00:06:37You touch the parking spot, and it figures out how to get your car, your Cybertruck, into that parking spot.
00:06:44It's mind-blowing.
00:06:45And it feels like it's doom.
00:06:46Your car feels possessed, and it is possessed by Elon Musk excellence, musculence, or whatever you want to call it.
00:06:55It's, like, staggeringly good.
00:06:57And that is the future.
00:06:59I mean, that is the future.
00:07:02People, you know, you have, like, from what I read recently, you have, like, a 10% chance of getting into an accident over the next five years.
00:07:08But that's cut in half and more if you have a self-driving car.
00:07:13And if you let it handle it all, it goes down even further.
00:07:16So, I mean, car crashes kill, like, what, 30,000, 35,000 people a year in the States.
00:07:21So, that's the future.
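(A back-of-the-envelope sketch of the numbers above, taking the 10%-over-five-years figure and the "cut way in half" claim at face value; both inputs are the transcript's off-the-cuff figures, not verified statistics.)

```python
# Rough arithmetic for the crash-risk claims above. The 10% five-year risk
# and the halving with self-driving are the transcript's figures, assumed
# here for illustration only.
five_year_risk = 0.10
annual_risk = 1 - (1 - five_year_risk) ** (1 / 5)  # ~2.1% per driver per year
halved_annual = annual_risk / 2                    # the "cut way in half" claim
us_deaths_per_year = 35_000

print(f"implied annual crash risk per driver: {annual_risk:.2%}")
print(f"with self-driving, if halved: {halved_annual:.2%}")
print(f"deaths if the halving held fleet-wide: {us_deaths_per_year // 2:,}")
```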
00:07:23And I remember having this.
00:07:25I did a show a couple of years ago about Tucker Carlson's book about we can't let self-driving trucks out there on the road.
00:07:38Sure we can.
00:07:40Right now, I was reading about a guy on X.
00:07:43He runs eight trucks.
00:07:45He runs eight trucks.
00:07:46His insurance, which is fairly minimal insurance, is well over $120,000 a year.
00:07:53That's crazy.
00:07:55So, yeah, self-driving trucks are coming along.
00:07:59Manual dexterity robots are coming along.
00:08:02And AI is, I mean, AI is already reviewing resumes.
00:08:09They're starting to have AI call people up and do interviews to see if they're answering questions correctly and so on.
00:08:16AI is creating teaching plans, learning plans, studying plans, which means tutors are out of the way.
00:08:25So, yeah, AI is coming to eat everyone's lunch.
00:08:30I mean, not mine.
00:08:32AI is not original in that way.
00:08:34So, what I do is original.
00:08:37It's why I keep doing it.
00:08:38It's why I'm not doing the same thing time after time, week after week.
00:08:41I couldn't stand it.
00:08:43It would be appalling to me.
00:08:44So, I do new things.
00:08:46And those new things are fun for me.
00:08:48Hopefully, they're fun and enjoyable for you.
00:08:50But honestly, I don't care.
00:08:52I do not care what happens to people with regards to AI.
00:09:01And I'll tell you what.
00:09:02Yeah, AI is going to be great for therapy.
00:09:05AI is, you know, I've already got, not that I'm a therapist, but, you know, if you're a subscriber to freedomain.locals.com or subscribestar.com slash freedomain, you've got a whole call-in show.
00:09:16We've programmed up an AI with my call-in shows and given it some good instructions, and people are finding it very, very helpful.
00:09:25Now, again, it's not going to be original in the way that I am, but it's going to be pretty amazing, and it's certainly going to be helpful, and, you know, cost-benefit, right?
00:09:34It's great.
00:09:35So, I don't care.
00:09:36I don't care.
00:09:37In particular, I don't care what happens to the, I don't know, how do we put this?
00:09:47How do we put this?
00:09:48How do we put this?
00:09:52Those who are more easily replaced by AI, I don't particularly care.
00:09:59And everybody who says that they do care is just kind of lying to you, right?
00:10:04I mean, think of, like, at the beginning of the 20th, at the beginning of the 19th century, right, like, 90% or more of Americans were involved in farming.
00:10:14Now, it's down to about 2%, right?
00:10:16So, 88% of people got kicked out of the countryside.
00:10:20They lost all of their farm skills.
00:10:22They lost all of their manual labor skills because they had to transfer them to some other environment, usually some sort of proletariat situation.
00:10:29Did anybody really care?
00:10:30Was there a massive dislocation?
00:10:32There certainly was, right?
00:10:34I remember at the end of the Second World War, sorry, it sounds like I remember directly, but I remember hearing about the end of the Second World War, you had, you know, millions of soldiers coming back from the war, right?
00:10:46And these were men who'd been traumatized and had been doing terrible things, killed people, I guess.
00:10:51I mean, no, the majority of them didn't even fire their weapons in war, but some of them certainly did.
00:10:56And they were coming, flooding back into the American economy, and all of the economists and the socialists and the leftists and the blah, blah, blahs were like, oh, we've got to have a big program to help to settle these people and get them going from war to some sort of peacetime activity.
00:11:11And we had that, right?
00:11:12And by the time they'd even got the plan sorted out about how they might do it, everyone had come back and self-organized, had self-organized.
00:11:20I've had to change and reframe what it is that I do based upon wild economic dislocations that I ranted about on Wednesday that you can't plan for a goddamn thing, because the economy is like a drunken Uber driver possessed by Mario Andretti.
00:11:37And the way that I drive my car on the highway, which is exactly the speed limit, you know, creating a vast train of people behind me who just like to follow me.
00:11:45I don't have that many followers left on social media, but on the highway, I have a lovely train of people honking and gesticulating their absolute enjoyment of me going the speed limit.
00:11:56It's beautiful.
00:11:57It's a beautiful thing.
00:11:57It warms my heart, and I know that some of them are so excited about my speed that they actually want to bump into me from behind to show their enthusiasm and maybe bump me one kilometer over the speed limit.
00:12:09But I don't care.
00:12:12I don't care because none of that.
00:12:17I mean, so when I was first started out in the business world, everybody needed a secretary, and I had a secretary.
00:12:25At one point, I had two, because, you know, to organize, to plan, to keep this calendar going, to book your business travel, blah, blah, blah.
00:12:33And then what happened was, you know, like Expedia came along, and other calendar things came along, Outlook came along, and you could just schedule your own stuff.
00:12:43And, you know, millions of secretaries and receptionists were gone.
00:12:49And was there racking, oh, my God, it's so terrible.
00:12:52It's just horrible.
00:12:53It's so appalling, blah, blah, blah, right?
00:12:55So the alternative media, which, of course, I was, you know, king of alt media for quite some years in terms of views and downloads, we ate the mainstream media's lunch, right?
00:13:08Because we could actually deal with facts, and we didn't have an HR department.
00:13:13So I wasn't crying that I was putting people out of work in the mainstream media.
00:13:19I mean, I remember going to show my daughter, and she's like, Daddy, how big is your show?
00:13:23And I said, well, okay, here's my tweet.
00:13:25Here's, you know, just an average tweet of mine.
00:13:26Here's how many views we get.
00:13:27Let's go to CNN.
00:13:28Let's go to New York Times.
00:13:29Let's go to NBC and see how they're doing.
00:13:31And I was like, many, many times bigger.
00:13:34Was I crying about that?
00:13:35No.
00:13:35I was like, yeah, you win, and other people don't win.
00:13:39Again, I look forward to Hollywood being completely wrecked by AI video generation.
00:13:48I mean, at some point, and I mentioned this a year or two ago, at some point, you'll just, I'll be able to feed one of my novels into an AI, and it will create a movie for me based upon my characters and the dialogue.
00:13:57And obviously, I'll be able to tweak it and all of that, but that's going to save, like, 20, 30, 50 million dollars, because to actually film one of my novels, it would be at least a 12-to-14-hour miniseries and cost two to three hundred million dollars.
00:14:15And at some point, the quality of the story will be what counts, because the more capital you need for something, the more political everything becomes, because then you've got a choke point, right?
00:14:25So, I don't care.
00:14:27I don't care about all of the people who are going to lose their jobs through AI.
00:14:31Why?
00:14:31A couple of reasons.
00:14:32One, nobody cared that I lost my job to censorship.
00:14:35Nobody cared that I lost my career to a large degree due to deplatforming.
00:14:40And I have, I have, I have, as a rigid foundational moral rule, 100%, 100%, it has never failed me, and it never will.
00:14:52Well, treat people the best you can, the first time you meet them after that, treat them as they treat you.
00:14:58Why would I care about people losing their jobs in the world, and losing their careers, and having to retrain, and change, and this, nobody gave a shit, and that happened to me?
00:15:07Really?
00:15:08I mean, I know you guys did, and I care about you, and all of that, and we'll continue to have these conversations if I can help you transition in that way.
00:15:15I certainly will.
00:15:16But no, I don't care.
00:15:18I don't care.
00:15:18That's number one.
00:15:19Number two, the people who have seasonal jobs will often go on welfare during the off-season, or go on unemployment insurance, and milk it to the very last pfennig, pfennig, pfennig, milk it to the very last shekel.
00:15:35And I don't care.
00:15:36I care about as much of people who don't work as the people who don't work cared about my wallet when it came to paying their benefits, right?
00:15:50So, like, the poor, the underemployed, the kind of lazy, the people who just vote for more and more stuff taken out of the wallets of people like us.
00:16:00Well, how much compassion and care did they show for us?
00:16:03Did they say, well, you know, we kind of want the unearned.
00:16:06We kind of want free everything.
00:16:07We want free health care, free education, free dental care.
00:16:10We want all this free stuff.
00:16:11And I want unemployment insurance.
00:16:14I want old age pensions.
00:16:15I want welfare.
00:16:17I want rent controls.
00:16:18I want food stamps.
00:16:19I want all of this free stuff.
00:16:22How much compassion did those people show to, say, you or me who had to foot the bill for all of this stuff?
00:16:29Well, the answer is not only did they not show any compassion, they were directly predatory upon our jugular.
00:16:37They took food out of our children's mouths.
00:16:39They took resources out of our homes.
00:16:42They made our homes smaller.
00:16:43They made our cars older.
00:16:45We had to get by on less because you had the 10,000 predatory mosquitoes draining every drop of blood from your system.
00:16:51So you had to stagger forth like your entire ass had been encased in Lyme-disease-carrying ticks.
00:16:58So I don't care.
00:17:03I don't care.
00:17:05I don't care about people who don't care about me.
00:17:08And I sure as shit, I'm not going to care about people who've preyed upon me my entire adult life.
00:17:13I mean, I got my first job when I was 10 years old.
00:17:18I've never taken a dime of welfare.
00:17:20I've never taken unemployment insurance.
00:17:22I've never, I don't expect to get much of a pension.
00:17:25I mean, it's, and my daughter is homeschooled.
00:17:28So it's all just predatory.
00:17:34I mean, would you give your last kidney to somebody who trafficked your first kidney by offering you a blowjob in a Mexican toilet?
00:17:42And you woke up with your kidney missing and badly sewn up.
00:17:45And then, oh, I need your kidney.
00:17:46It's like, you already took one.
00:17:48I don't care to give another.
00:17:49This is the whole UBI question.
00:17:51Universal basic income.
00:17:53Nope.
00:17:54I don't care.
00:17:55I mean, they'll probably try and go for it.
00:17:56And they'll probably, maybe they'll be successful.
00:17:58But I don't care.
00:18:00I do not care.
00:18:01You know, one of the prices that society pays by preying upon the productive is the productive don't give a shit about the people preying on them.
00:18:11At least I would suggest that.
00:18:12I think that's a healthy moral attitude.
00:18:13I don't care.
00:18:19They have preyed upon my wallet in time and like half my life is sacrificed to endlessly pay for, at the point of a knife, people's terrible decisions.
00:18:31I don't care.
00:18:34Oh, you're going to lose your job to AI?
00:18:35I don't care.
00:18:36I mean, I trained in software and then the software stuff was all outsourced and displaced.
00:18:44I trained in academics and then, you know, white guys were not exactly going to be hired as a whole, right?
00:18:51I trained in managing a very large social media empire.
00:18:55And then that was rug pulled for reasons of hysteria and witch burning and political correctness.
00:19:01And nobody cares.
00:19:02I mean, again, you guys, fantastic.
00:19:04I'm very much loyal to you guys, but I've had to rebrand myself so many times in my life that why would I care that other people have to rebrand themselves?
00:19:13I mean, nobody cared about this.
00:19:15I don't know.
00:19:16People just lie about all of this caring.
00:19:18They just lie about it.
00:19:19They just say stuff.
00:19:21You know, it's like all the people who are like, oh, yes, we should take everyone in from the world.
00:19:24And it's like, okay, I've got some people here.
00:19:25Can they move into your house?
00:19:26Oh, my God, no.
00:19:27Right?
00:19:27Like, Martha's Vineyard had some migrants dropped off and they got the literal army out there to yank them away within 48 hours.
00:19:37People just say stuff.
00:19:38And so, yes, there'll be a dislocation, I assume, to a large degree because of AI.
00:19:43And then there'll be lots of people who are like, oh, I just care so much.
00:19:47It's like, no, you don't.
00:19:48No, you don't.
00:19:50We know.
00:19:50Because if people cared enough, there wouldn't be a dislocation.
00:19:55Guaranteed.
00:19:56So, for instance, when cars came along, the people who made and sold the horse and buggies, the people who shoveled all the horse manure off the streets of New York and eventually put it in a casket and had it resurrected as Gavin Newsom,
00:20:12like all of those people were people like, ooh, you know, ooh, you know, my cousin Vinny is a horse and buggy manufacturer, so let's not build, let's not buy a car.
00:20:25Don't buy a car, right?
00:20:27People will.
00:20:29They'll buy the fucking cars.
00:20:31And so they don't care.
00:20:33People will take advantage of AI.
00:20:34People will use AI and they won't care.
00:20:37They won't say, oh, you know, a therapist is $200 an hour, but I'm going to go pay that because I don't want to use a therapy AI or whatever it is, right?
00:20:50Eliza was a program way back in the day that was used as a sort of pseudo-therapist, right?
00:20:56Oh, tutors, you know, AI can really create a customized learning plan for me, but I don't want to do that because, you know, these tutors need that.
00:21:03They don't care.
00:21:04If they did care, it wouldn't happen.
00:21:06So let's say that people prefer self-driving cars, self-driving trucks, right?
00:21:14Because it's cheaper and the insurance is lower and they're less dangerous and they don't have to sleep and blah, blah, blah, right?
00:21:21And they don't get tired.
00:21:22They don't get distracted.
00:21:23They don't fiddle with the radio.
00:21:24They don't have, you know, CB, Breaker, Breaker, 10-4, Good Buddy, Kurt Russell stuff going on, right?
00:21:29They don't need that, right?
00:21:30I need to take this approach and apply it better.
00:21:36Thanks, Steph.
00:21:36Oh, I'm begging you.
00:21:38I'm begging you.
00:21:39People, I'm on my knees begging you.
00:21:43Do not care more about people than they care about you.
00:21:52Do not care about people more than they care about you.
00:21:55You will be wrecked, exploited, and most people trick you into pathological altruism and then suck the fucking bone marrow out of your balls.
00:22:06Somebody says, I spent $200 an hour on therapists who just said, mm-hmm, as a response throughout the session.
00:22:15Yes, but how do you feel about it?
00:22:17Well, freedomain.com slash donate.
00:22:19Don't forget to help out the show.
00:22:21It ain't free.
00:22:22Nothing is free.
00:22:23Nothing is free.
00:22:25Nothing is free.
00:22:27So, you know, I mean, I'm sure, I mean, okay, hit me with a number here.
00:22:33Hit me with a number, right?
00:22:35Hit me with a number.
00:22:37How many times have you had to learn substantial new skills?
00:22:46How many?
00:22:48I think for me, it's probably, you know, I mean, I was into acting and playwriting, and then people found out that I was a free market capitalist and hated my guts and didn't want to have anything to do with me and wouldn't hire me.
00:22:59And, you know, I got half booted out of theater school after they found out about my politics.
00:23:04So, how many times have you had to, how do you qualify that?
00:23:09How do you quantify that?
00:23:11Think for yourself, man.
00:23:12Come on, this is a philosophy show.
00:23:15I got to think for me, and I'm not trying to, I think for me, it's been at least six.
00:23:19Like, I'm thinking, well, I did the gold panning thing, prospecting thing.
00:23:24Then the tax laws changed, and I couldn't do that anymore.
00:23:27And then I went to, I did an English degree, and then I did theater school.
00:23:33Then I did a history degree.
00:23:34And then I did my graduate school, but I had to leave academia because I could tell the woke stuff was coming, and white males were an endangered species.
00:23:40And then I went into the business world.
00:23:41I did tech, and then I did marketing.
00:23:43And then I did novel writing.
00:23:45I did playwriting.
00:23:46I did alternate media stuff, the podcasting thing and all of that.
00:23:54So, it's, you have to redo a lot, right?
00:23:59So, what have we got here?
00:24:02Twice, five or more.
00:24:04I've got six.
00:24:05Three to four times.
00:24:07Oh, so, that's interesting.
00:24:08Two, pretty young, though.
00:24:09Every two years, five to six.
00:24:11I feel like it's a million things.
00:24:12It's a jack-of-all-trades housewife.
00:24:13It's like a lot.
00:24:14As far as job stuff, though, probably like five to six times.
00:24:17About five times, oil and gas, food industry, 18-wheeler design, structural design, helicopters, four to five, six, four.
00:24:24Almost all coaches I've had calls with, with the exception of maybe one or two, were scams, were scams who didn't provide anything of value, and it was a sunk cost.
00:24:34Yeah, eight or more, five.
00:24:35Yeah.
00:24:36I mean, so, we have to revamp all the time.
00:24:37Now, let me ask you this.
00:24:39Let me ask you this.
00:24:40How many people have given you deep empathy and sympathy?
00:24:44I mean, maybe not personally, but you as a class.
00:24:46How many people have given you, you know, deep, empathetic sympathy and all of that for all of the retraining that you've had to do?
00:24:54Because this is new in human life.
00:24:56Like, you were raised as a stonemason in the Middle Ages.
00:24:59You did what your father did.
00:25:00You were raised as a stonemason.
00:25:01What were you doing the rest of your life?
00:25:03You were a stonemason.
00:25:04If you were a priest, you were a priest.
00:25:05If you were a baker, you were a baker.
00:25:06If you were a blacksmith, you were a blacksmith.
00:25:09So, this whole having to reinvent yourself all the time and learning about a particular industry and then having some stupid rule or maybe technology, some government rule, regulation, whatever, just change the whole damn thing.
00:25:26You know, I was hoping to get all the way through university by doing all this gold panning and prospecting stuff because, you know, I learned my skills.
00:25:32I learned how to do it.
00:25:33And then the tax laws changed and the oil and gas exploration was no longer subsidized.
00:25:38That whole industry just dried up and evaporated.
00:25:40Right.
00:25:45You know, I mean, I knew a woman who was a purser.
00:25:51She was a purser on an airline, which is like the head stewardess.
00:25:55And then 9-11 happened and she lost her job.
00:26:00Just shit that happens, right?
00:26:02Just shit that happens, right?
00:26:04So, let's see.
00:26:06Somebody says, I started in acting, then fashion, then marketing media, and now therapy.
00:26:11She means as a therapist, not in therapy.
00:26:13So, we got zero, zero, one, zero.
00:26:15These are people who've given you sympathy.
00:26:17Zero, zero, ha, one, zero.
00:26:19At least three big shifts in tech approach and soft skills development two to three times.
00:26:23That's James.
00:26:24People giving you sympathy.
00:26:25Zero.
00:26:26One person.
00:26:27Oh, no, empathy.
00:26:28You change because you have to.
00:26:30LOL, that creek bed's dry.
00:26:32Zero.
00:26:33Not the only, not only the time spent to relearn, then all the money spent on courses, books, etc.
00:26:39Yeah.
00:26:39Zero, although I did love the changes.
00:26:41Every time I changed skill sets, it was to do something better.
00:26:45Yeah.
00:26:45Anyone remember learn to code?
00:26:47Forget sympathy.
00:26:48There was contempt.
00:26:49Okay.
00:26:49We also know that there has been a real focus on, in a wide variety of areas in society, a real focus on hiring people who are not white and male, right?
00:27:02I mean, the DEI stuff and so on, right?
00:27:04This is currently going on.
00:27:07This confrontation between, in America, you're not supposed to base any laws on race.
00:27:11And we, of course, know that a lot of American universities give preferences to people who are not white and Asian and so on.
00:27:17So, you made particular assumptions based upon that when you were growing up.
00:27:25And then there was a rug pull, right?
00:27:26So, how much sympathy, if you're a white male, have you gotten from the media, from people around you because you are, your hiring prospects are diminished?
00:27:38How much sympathy has the media given you?
00:27:44Well, you know, it's tough.
00:27:45We understand.
00:27:46It's difficult.
00:27:47You know, we've got to find some way to balance these things.
00:27:50Right?
00:27:54Wow.
00:27:54Thank you, Steph.
00:27:55Love the love.
00:27:55What a gift.
00:27:56Freedomain.com.
00:27:57But yeah, how much sympathy for men, regardless of race? How much sympathy do you get? Well, Dr. Warren Farrell did a good job on this.
00:28:08There's a great book on this called The Boy Crisis.
00:28:10So, the bias and bigotry and sexism against boys in the educational system.
00:28:21I mean, there are very few males teaching at the younger levels.
00:28:27And we know for a simple fact that there's massive bias against boys.
00:28:31Because when you take essays and tests and all of that, give them to the teacher to grade, and strip off the name so the grader can't tell whether it's a boy or a girl, the boys' marks go up considerably.
00:28:43Yeah, got to have those women-owned businesses and all of that.
00:28:46Right?
00:28:47So, how many times have you received sympathy from the world, from people as a whole, for being a male or for being white and all this kind of stuff, if you have challenges in the hiring process?
00:29:01And, you know, you can see some of these challenges, and I'm sure you've experienced that.
00:29:05So, why would you give sympathy to people who don't have sympathy for you?
00:29:13Right?
00:29:13I mean, if AI replaces a bunch of HR, human resources stuff, well, particularly if you're a male and the focus has been on hiring women,
00:29:23why would you give sympathy to the people in HR who might be replaced by AI, right?
00:29:32I don't care.
00:29:34This is a foundational contract you must have in society.
00:29:38You have to have this contract in society that you do not give people more compassion and empathy than they give you.
00:29:47You don't do it.
00:29:48You do not be a spiritual whore, and I use that term with great emphasis, and it's an insult to whores, because at least whores get paid.
00:29:58Do not be a spiritual whore.
00:30:00Do not be a slave.
00:30:01Do not have this commandment of be nice, be nice, be nice.
00:30:04It is a relationship.
00:30:08Consideration, compassion, empathy, concern, care, love, sympathy is a relationship.
00:30:13It is not a fucking commandment that you must do like gravity.
00:30:18It is not that at all.
00:30:20It is not that at all.
00:30:23Not that at all.
00:30:27You will lose everything.
00:30:29If that's your role, you will lose everything.
00:30:33Everything.
00:30:34I mean, your history, your culture, everything.
00:30:35I mean, this is the argument to some degree with deportations, right?
00:30:42I mean, again, I'm not a statist, and I would like nothing more than a truly free society, and I've written about this in my novel, The Future.
00:30:50I talk about immigration there, how it works in a free society, so that's what I want.
00:30:54But in terms of the general things, like, well, we've got to have compassion for the people who came into America illegally.
00:30:59It's like, but where's the reciprocity?
00:31:05How much compassion do the people who come in illegally have for the preferences and laws of the host country?
00:31:13So.
00:31:16An AI HR, says someone, sounds like it would be massively dystopic.
00:31:20Dystopian?
00:31:21But the funny thing is, it would probably be more objective than the HR women we have currently.
00:31:26Well.
00:31:28No.
00:31:30So prejudice in people is undocumented in general, right?
00:31:36Prejudice is hidden.
00:31:37It's not like there are all of these things written out, right?
00:31:39So prejudice is undocumented.
00:31:41However, AI code is documented, right?
00:31:46So if there's bias in HR, you can't look in people's heads to see their bias.
00:31:52But you can sure as shit parse the AI code to see if it's programmed to be biased.
00:31:57So people are reverse engineering the bias that's in AI all the time.
00:32:02You can see it in Google, right?
00:32:03That's pretty obvious.
00:32:04But it would be much more objective.
00:32:10So for instance, if you have AI as part of your hiring process, well, in America, it's against the Civil Rights Act to have any preferences or obstacles based on race.
00:32:24So you would run your AI through that and make sure that there wasn't any biases or prejudices, whereas you can't run people's heads through that.
00:32:36You can't run through the internal thoughts, right?
00:32:39There's no breaking down the source code of the brain.
00:32:42But you sure as shit can break down the source code of your AI and see if it's biased against race or sex.
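(To make that auditability point concrete, here is a minimal sketch of a paired bias test on an AI screener. Everything in it is hypothetical: score_resume is an invented stand-in for whatever model a company actually runs, and the names and keyword weights are made up. The technique is simply to hold the resume fixed, vary only the name, and measure the score gap, which is exactly what you cannot do with a human reviewer's private judgment.)

```python
# A minimal sketch of auditing an AI screener as described above: identical
# input, one protected attribute varied, score gap measured directly.
# score_resume is a hypothetical stand-in; a real audit calls the real model.

def score_resume(text: str) -> float:
    """Toy stand-in for an AI resume screener (invented keyword weights)."""
    keywords = {"python": 2.0, "sql": 1.5, "manager": 1.0}
    lowered = text.lower()
    return sum(weight for kw, weight in keywords.items() if kw in lowered)

BASE_RESUME = "Five years of Python and SQL experience; team manager."
NAMES = ["Emily Walsh", "Lakisha Carter", "Wei Chen", "Jamal Brooks"]  # invented

def paired_audit(base: str, names: list[str]) -> dict[str, float]:
    """Score the identical resume under different applicant names."""
    return {name: score_resume(f"{name}\n{base}") for name in names}

if __name__ == "__main__":
    scores = paired_audit(BASE_RESUME, NAMES)
    for name, score in scores.items():
        print(f"{name:15s} -> {score:.2f}")
    spread = max(scores.values()) - min(scores.values())
    # Any nonzero spread on otherwise-identical inputs is documented,
    # reproducible evidence of bias, unlike a human reviewer's hidden bias.
    print(f"score spread across names: {spread:.2f}")
```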
00:32:47So, I disagree.
00:32:55I think AI.
00:32:57And now, an AI.
00:33:02Somebody says, I bet when Steph lost his job in oil and gas.
00:33:05No, it was a gold panning.
00:33:07Society didn't set up a job fair to help everyone that lost their oil and gas jobs.
00:33:11Yeah, of course.
00:33:12Of course.
00:33:13Of course.
00:33:14So, AI, if it has source code and so on.
00:33:19See, AI, you can test the biases because AI will simply do what it's told, right?
00:33:24Whatever the program tells.
00:33:25However, you can't test the biases in HR because people will just lie to you.
00:33:30No, no, no.
00:33:31We don't do any of this and that, blah, blah, blah, right?
00:33:34So, AI has the possibility of being much more objective and rational and non-discriminatory.
00:33:43Isn't that what we want?
00:33:45Non-discriminatory.
00:33:48Non-discriminatory.
00:33:53All right.
00:33:56Somebody says, I think AI has the potential to drastically change the economic landscape
00:34:01to the point where the world 10 years from now will be impossible to predict.
00:34:05It does worry me a bit.
00:34:06AI could eventually do every job.
00:34:08For me, the question is, what opportunities will this make us?
00:34:13For us, no one could have predicted jobs that exist now while new teachers replaced.
00:34:21Okay, I don't know.
00:34:22That last sentence kind of broke apart like a meteor in the upper atmosphere.
00:34:26So, please, please double check your words.
00:34:29Please double check your questions.
00:34:30Don't make me read stuff that's bad.
00:34:33Otherwise, you know, if you don't check your questions, you're just going to get replaced
00:34:36by AI.
00:34:38Seems to me that if your work requires the most judgment, AI is less likely to be able to replace you,
00:34:42as with hiring.
00:34:43No.
00:34:44No.
00:34:44AI is good at judging things.
00:34:46AI is just not good at creating things.
00:34:48It's a word guesser.
00:34:49And I've got whole presentations on AI, which I did a year or two ago.
00:34:52You can find them at fdrpodcasts.com.
00:34:55Somebody says, I was looking into paying a virtual assistant before AI.
00:35:01Once chat GPT came along, solved all my problems for 20 bucks a month.
00:35:05The best investment helps me immensely for my work.
00:35:07The thought of it getting better excites me.
00:35:09Yeah, fantastic.
00:35:11Somebody says, oh, James says, I take back my comment on AI being crippled.
00:35:15Grok is basically, it's incredibly useful at finding information in research.
00:35:18Yeah.
00:35:19Honestly, it's amazing.
00:35:20And if you're going to pay for any of them, my humble opinion, I have got no stake in the
00:35:25game.
00:35:25I've got no dog in the fight.
00:35:27Grok is by far the best.
00:35:28It's the one I trust the most.
00:35:29And even it's a little woke, but not too bad.
00:35:33Somebody says, James says, I came up with a question to ask that would probably have taken
00:35:37a week to find the answer to, if not more.
00:35:38Grok does it all in about 60 seconds.
00:35:40Yeah.
00:35:41What do you think will happen to those people who get displaced because of AI?
00:35:44Don't care.
00:35:45Don't care.
00:35:47I don't care.
00:35:50Nobody cared when I got displaced as a white.
00:35:52Nobody cared when I got displaced as a male.
00:35:53Nobody cared when I got displaced as a gold panner or prospector.
00:35:57Nobody cared when I got displaced out of academia.
00:35:59Nobody cared when the boom-bust cycle of tech came along.
00:36:04Nobody cared when I got displaced because, you know, there are a lot of people are getting
00:36:08displaced because of, you know, mass sort of human movement and so on.
00:36:12I don't care.
00:36:13No, who cared?
00:36:14Who cared?
00:36:15Like, were there big articles when I got deplatformed?
00:36:17Were there big articles saying, yeah, but, you know, what's Steph going to do now?
00:36:21I mean, his audience base has cut enormously.
00:36:23We're just, we're so worried about what's Steph, how is he going to pay the bills?
00:36:27How is he going to survive?
00:36:29I mean, he spent, you know, 15 years building up this immense audience of millions of people
00:36:36and, you know, who racked themselves with, oh my God, what's going to happen to Steph
00:36:42now that he's been deplatformed and 95 or more percent of his audience has, poof, right?
00:36:47Nobody cared.
00:36:48That's fine.
00:36:50That's, I got to tell you, I love it.
00:36:53Love it.
00:36:53I love that no one cared.
00:36:58Absolutely loved that no one cared because now I don't have to.
00:37:02I don't have to shred one fucking neuron in my brain.
00:37:06I don't have to have one micro drop of cortisol or any kind of stress hormone in my system
00:37:11about anybody else.
00:37:13No, again, you guys, absolutely beautiful, wonderful.
00:37:16Let's keep that going.
00:37:17Let's keep the conversation going.
00:37:18But, um, freedom for others is freedom for you.
00:37:24You absolutely must grind this into your bone marrow.
00:37:27Freedom for others is freedom for you.
00:37:32They don't have to care about you.
00:37:35All I want to say is that they don't really care about us.
00:37:39I'm not going to sing the whole thing, of course, but it's beautiful.
00:37:43It's absolutely beautiful.
00:37:49Because when people don't care about you, they're handing you an immense, glorious helium
00:37:59balloon, take-it-to-the-stars gift of absolutely not giving a fuck about them.
00:38:03Honestly, it's glorious.
00:38:07Now, I love the people who care about me.
00:38:08I love being bound to those bonds of reciprocal obligation and love and beautiful.
00:38:15Love it.
00:38:15And once you really do have people who care about you in your life, you'll never be the
00:38:22same.
00:38:23Love absolutely reshapes you from the ground up.
00:38:26It is a reboot, a blank slate.
00:38:28It rebuilds you in an entirely different shape from the ground up.
00:38:31You become an absolutely different animal when you are deeply and truly loved.
00:38:35And I wish that for everyone.
00:38:37I wish for you to feel the glories of love.
00:38:40I wish for you to receive the benefits of love.
00:38:42I went for a lovely two-hour lunch with my wife today and just talked about everything
00:38:48under sun and moon after I went to work this morning at a cafe with my daughter.
00:38:55And then we went for a lovely long walk and talked about everything under the sun.
00:38:59So I would say that when you have gone through the experience of genuinely being loved and loving
00:39:09someone, many, more than one person, I hope, then you get this absolutely glorious gift of
00:39:18recognizing when people don't care about you, right?
00:39:22Right?
00:39:23You have this absolutely glorious gift.
00:39:27And the way that I view it is like an employer.
00:39:30If when I had jobs, you know, and you guys pay me, right?
00:39:33So you're my managers, you're my customers, you're my employers.
00:39:36But when I had a job, if the company or the manager paid me, I owed them the work.
00:39:45Sure.
00:39:46I owed them the work.
00:39:47Fantastic.
00:39:52If they didn't pay me, I didn't owe them shit.
00:39:57And it's beautiful, right?
00:39:58So if people pay you with love, respect, adoration, and sympathy, empathy, care, concern,
00:40:04and whatever, right?
00:40:05If people pay you, then you pay them back.
00:40:06It's reciprocal.
00:40:07It's beautiful.
00:40:08If people don't pay you, you don't owe them anything.
00:40:10Would you continue to go to work for a company that didn't pay you?
00:40:15No.
00:40:16So, oh, but the company needs you.
00:40:17It's like, but they're not paying me.
00:40:19So because they're not paying me, I don't owe them anything.
00:40:24Would you go, would you go to work for a company that stole from you?
00:40:28Did not only did they not pay you, they stole from you.
00:40:30Well, that's the people who are constantly voting for more and more of your paycheck.
00:40:36Anyway.
00:40:41Somebody says, I was asking Grok some questions earlier today, doing a bit of research and
00:40:45asking hypothetical questions.
00:40:46It was crazy to me how much information it was giving me.
00:40:50It practically wrote me a small book on the subject.
00:40:53Yeah.
00:40:53Well, they asked for more handouts.
00:40:59What if they don't get that?
00:41:02So I'll tell you what change is.
00:41:08And this is a rant and a bit.
00:41:12So change is all the little conversations that you have.
00:41:14So if you sort of spread this idea that you don't owe people compassion who don't have
00:41:19compassion for you, and you sure as hell do not owe people compassion who prey upon
00:41:23you through the power of the state to take things out of your wallet every second of
00:41:30every day, of every month, of every year, of every decade, of every century.
00:41:34Well, maybe not a century yet, but hopefully it won't last that long.
00:41:38It's all about these little.
00:41:39So when people say, oh my gosh, what about all the people whose jobs are going to get
00:41:42displaced by AI?
00:41:44It's like, do those people care that other people's jobs have been displaced?
00:41:50No.
00:41:52So it's these little conversations that you have where you just open people's minds.
00:41:56Just open people's minds.
00:41:58People are like, well, you know, the conquistadors came and the Aztecs and the Mayans.
00:42:02It's like, well, the Aztecs in particular were brutal, violent, evil, rapey, cannibalistic,
00:42:09you know, ripping open the hearts of hundreds of children every weekend to satisfy their
00:42:13god, who drank the adrenochrome-fueled tears of tortured children.
00:42:18It's like, good riddance.
00:42:20Good riddance.
00:42:23One of the reasons why the Aztecs were so easy to conquer is every other tribe that they
00:42:28dominated, exploited, raped, abused, tortured, and murdered rose up against them
00:42:32the moment that the conquistadors came along.
00:42:34Good riddance.
00:42:34It's all these little things, all these little things, little conversations you have, you
00:42:40never know, little conversations, you must have them, you must tell the truth, you must
00:42:45tell the truth.
00:42:48All these little conversations, they will spread, they will change people's minds, and so on,
00:42:51right?
00:42:52So right now, everyone's turning themselves, oh my God, but all these people, they're going
00:42:55to lose their jobs.
00:42:56It's like, and?
00:42:58They'll figure it out.
00:43:00They'll figure it out.
00:43:01Human beings survived ice ages, right?
00:43:05Sudden warming periods.
00:43:06They survived droughts.
00:43:07They survived plagues, locusts.
00:43:10They survived CNN.
00:43:13Okay, like, maybe not.
00:43:14I don't think people survive CNN, but that's a different story.
00:43:17So, they'll figure it out.
00:43:20Yeah, they'll figure it out.
00:43:20It's not my job.
00:43:21Not my job to live other people's lives for them.
00:43:24And here's the other thing, too.
00:43:25Like, everybody knows the economy changes, right?
00:43:32I mean, how many people are, oh my God, it's so sad.
00:43:34All the people who, they used to make these rotary dial phones, and now nobody wants rotary
00:43:38dial phones.
00:43:39Oh, it's so sad.
00:43:40All the people who used to make physical touchpads for cell phones and flip phones, they're,
00:43:45like, nobody cares.
00:43:45Everybody's like, oh, upgrade, man, fantastic.
00:43:47Fantastic, fantastic, let's, you know, like, the moment that, what was it, 2008 or whatever,
00:43:542009, when Apple finally produced a phone with a selfie camera, because they didn't
00:44:00realize just how narcissistic the population was.
00:44:03Got a camera facing the other way.
00:44:04No, no, selfie camera.
00:44:06Nobody cared.
00:44:07Nobody cared that Kodak went out of business.
00:44:10Kodak originally, I think, did not pursue the digital camera, even though it was invented
00:44:15by one of their guys.
00:44:17But nobody cared.
00:44:19Nobody cared.
00:44:20Oh, but, I mean, I could buy a digital camera, or I could buy a phone with a digital camera,
00:44:25but what about the 20,000 people who work for Kodak?
00:44:28What's going to happen to them, man?
00:44:29They don't care.
00:44:30They don't care.
00:44:31Everybody's like, charge the fuck forward.
00:44:35I don't care how many boom, boom, boom, boom, boom.
00:44:38I don't care how many people we drive over into the race to the better future.
00:44:41Ba-dung.
00:44:42Blood-soaked people, less right, careers, shredded, everything, torn apart.
00:44:47Their eyeballs are stuck in the fucking caterpillar tracks of your progress tank.
00:44:51Nobody cares.
00:44:52Nobody cares.
00:44:54And everybody who pretends to care is just yanking your chain and sucking on your bone marrow.
00:44:59So, if people say, well, what's going to happen to all the people who are displaced?
00:45:03Okay, how much have you cared about everyone else who's been displaced?
00:45:07It's not going to be almost 90% of the population.
00:45:09But that was farming.
00:45:12Oh, but that was slower.
00:45:13Yeah, well.
00:45:16People can't say, I want progress and I want continuity.
00:45:23It's retarded.
00:45:23I really want to go on vacation and stay at home at the same time.
00:45:28If I can just go forward and backward, north and south at the same time, I'd be happy.
00:45:33I want progress.
00:45:34I want new things.
00:45:35I want better things.
00:45:36And I want all the people employed in the previous industries to still continue their work without any change.
00:45:41I want a digital camera and I also want all the people at Kodak and all the people who would process the print cameras, the film cameras.
00:45:49I also want them.
00:45:50La, la, la, la.
00:45:51Oh, shut up.
00:45:52Shut up.
00:45:54I mean, nobody sits there and says, well, I really like computer-generated imagery in movies.
00:46:03Like, it's really gripping and absorbing and all of that.
00:46:06But at the same time, what about all these people who used to make these models and do it all by hand?
00:46:12What about all of them?
00:46:13Okay.
00:46:14So, if you don't want those people to lose their jobs, don't go see any movies with CGI.
00:46:19But people just like, Marvel shit.
00:46:21Great.
00:46:22That's fecal matter being reverse cocaine snorted up into my frontal lobes.
00:46:26I'll go take it.
00:46:28I'll get a cool little cup with Thor's ass cheek on it.
00:46:34Ah.
00:46:36So, yeah, everybody's just like, yeah, I want the new shit.
00:46:38I want the new shit.
00:46:39And that new shit means people in the old shit don't have a job anymore.
00:46:45And think of how much has changed over the last 30, 40, 50 years, right?
00:46:49Think how much has changed.
00:46:52People didn't say, well, there are these new fixed-wing airplanes that can get you from A to Z way quicker than a train or a chug-chug-chug car, right?
00:47:04Nobody cared.
00:47:04Nobody cared.
00:47:06Fuck, I'll take a plane, man.
00:47:07That's way, way cooler.
00:47:08Does anybody sit there and say, well, but I'll take a plane.
00:47:11The highway people, the car people, the train people, the bus people, they just don't do as well, man.
00:47:16Oh, I don't care.
00:47:18I mean, you care about progress.
00:47:20You care about making things more efficient.
00:47:22You care about getting things better and faster.
00:47:25Think about all the people who used to make 8088 chips or what was it, the Z80, ZX80, the Sinclair, the old 1K RAM.
00:47:35Think of all the people who used to make CRT monitors.
00:47:38I think of all the people who used to make 286 chips and so on.
00:47:41Nobody cares.
00:47:42Those skills are all gone.
00:47:44All of those factories are torn apart and it's all revamped.
00:47:46Nobody cares.
00:47:48It's beautiful.
00:47:49They're going to change.
00:47:50Okay.
00:47:50If you care so much about change and you care so much about people, then don't progress.
00:47:57Don't progress.
00:47:57Go buy yourself a rotary dial phone and take a fucking horse and buggy everywhere.
00:48:01Oh, I don't want to do that.
00:48:02Well, then there's going to be change.
00:48:03Yeah, I just don't have any patience for this stuff.
00:48:10It's just noise.
00:48:12But we've got to have concern.
00:48:13This is a big, big change.
00:48:15Yeah.
00:48:19Yeah.
00:48:19They don't care.
00:48:21They don't care.
00:48:24All right.
00:48:26Let's get to your questions and comments.
00:48:29And it's beautiful, man.
00:48:31I'll just tell you this as I get there.
00:48:32It's beautiful to not give a shit, right?
00:48:36Your circle of care is the people who care about you.
00:48:40That's your circle of care.
00:48:41Everybody else can go pound sand.
00:48:44Go fill your boots.
00:48:45As an ex-army friend of mine used to say, go fill your boots.
00:48:52All right.
00:48:53Somebody says, I give more energy to people than they deserve, but it's work related.
00:48:57At least I got the respect of the people.
00:48:59One told me I'm the most underrated person there.
00:49:02No, that's not respect at all.
00:49:06That's calling you an exploited loser, frankly.
00:49:11Yeah, man.
00:49:12You deserve so much more.
00:49:13Now give me more.
00:49:15Come on, man.
00:49:17Demand your value.
00:49:18Google goes crap now.
00:49:26I can't, honestly, I cannot remember the last time I did a Google search.
00:49:30I can't, I can't imagine.
00:49:32I can't imagine going to Google for anything other than propaganda.
00:49:36Anyone who would have held job fairs for the displaced was celebrating you being de-platformed.
00:49:44Yeah.
00:49:44Beautiful.
00:49:45Beautiful.
00:49:47Beautiful.
00:49:48Don't care about me.
00:49:50Fantastic.
00:49:51Don't care about me.
00:49:53I love it.
00:49:54I love it.
00:49:55I eat it up.
00:49:56I eat it up like a Greenwald sausage for breakfast.
00:50:00Beautiful.
00:50:01I don't care because I don't have to care now.
00:50:03I am liberated.
00:50:04Like, I'm the kind of person, because I have a fair amount of skills and brainpower,
00:50:09who feels a lot of responsibility towards humanity.
00:50:11I really do.
00:50:12Like I could take all that stuff on.
00:50:14And, uh, beautiful.
00:50:17It, it breaks the chains.
00:50:20It, you know, I feel a little bit like Gulliver, you know, in Gulliver's Travels, he goes to,
00:50:23what's it, Lilliput?
00:50:24And then there's all these tiny people that actually became a word, Lilliputian, right?
00:50:28And there's all these tiny people and, and they'd bind them down on the beach and stuff like that.
00:50:32And I've got all these obligations.
00:50:35I've got to take care of people.
00:50:36I've got a responsibility of brain and philosophy and communications.
00:50:39And I got to do good for the world.
00:50:41And it's like, ah, it was, it was, it was kind of exciting torture was the way that I
00:50:46would phrase all of that.
00:50:48And, uh, I felt that obligation so keenly.
00:50:52I can't even tell you.
00:50:53It was brutal.
00:50:54It was brutal.
00:50:56I mean, take a silly example, right?
00:50:58Imagine you had the power to cure infections by touching them.
00:51:02I mean, how would you rest?
00:51:03How would you get to bed?
00:51:04You'd realize every time you fell asleep, there'd be like 10,000 people dying of infections
00:51:07that you could have helped.
00:51:08But it's, uh, it's horrible to have this level of ability and obligation.
00:51:14And the two for me, noblesse oblige, whatever you want to call it.
00:51:17I did not earn my brain power.
00:51:19I did not earn my linguistic abilities.
00:51:21I've developed and honed them or whatever, right?
00:51:23But you know, people take singing lessons because they're already good singers, right?
00:51:26So I felt this just crushing weight of obligation to make the world a better place to, especially
00:51:34when I saw that I was able to talk about things that other people weren't willing to touch.
00:51:39And rightly so, I guess, if you want a big sustainable career or whatever, but, uh, I
00:51:43just, I felt just a massive weight of obligation.
00:51:46Like the whole world was my child and I have to get up if the child's crying and take care
00:51:50of it.
00:51:50I really did, honestly.
00:51:53And the de-platforming was not, it was not a massive shock, right?
00:51:58I mean, I've said a million times who got me de-platformed me, I got me de-platformed,
00:52:03but the, um, you know, when you spend your life helping people and then you fall down a
00:52:09well and everybody wanders off, even though they can hear you crying out for help.
00:52:13When you, when you get out of that fucking well, you are getting out of more than the
00:52:17well and you were getting out of all reciprocal obligations.
00:52:21And it's a beautiful thing.
00:52:23It is a way to wake up from delusion where there is no reciprocity.
00:52:27There is no obligation.
00:52:29Fuck that.
00:52:30There is no obligation.
00:52:31And it is incredibly liberating.
00:52:33And, uh, I truly, truly invite you into this absolute freedom that if people don't care
00:52:42about you, do not care about them.
00:52:45Do not care about them.
00:52:47I mean, imagine, imagine if England had not had the white man's burden.
00:52:51Imagine if England had not said, well, we've got to go and civilize the whole world and
00:52:55turn the whole world into England, right?
00:52:57Because it wasn't like the Indians really cared about the Raj.
00:52:59They hated it.
00:53:00They still hate it.
00:53:02And, um, uh, in Africa, they weren't thanking everyone.
00:53:05I mean, imagine if you just had lack of reciprocity.
00:53:08I mean, oh, anyway, it's amazing.
00:53:11It's amazing.
00:53:13Uh, honest question.
00:53:14I'm with you on don't care.
00:53:16Totally get it and believe it.
00:53:17However, do you believe in giving to charity, giving tithes, paying it forward to people who
00:53:21have no idea that you exist for them to care about you?
00:53:24I'm curious on your approach to the idea of giving unto others.
00:53:27That's, that's a great question.
00:53:29I do give some money to charity.
00:53:30For sure.
00:53:30Yeah.
00:53:31I do give some money to charity.
00:53:32Uh, I do spend time helping people as a whole and so on.
00:53:37So, uh, I, I think, but that's because it gives me pleasure and I feel good about it,
00:53:42honestly.
00:53:43But it's not a sense.
00:53:46It's not obligation.
00:53:47It's not obligation.
00:53:50NPCs without a purpose, says someone, are weaponized.
00:53:52I don't care for them, but I do care what happens with them.
00:53:55Look at how replaced people have been weaponized in de-industrialized cities.
00:53:59No, but that's just because of the welfare
00:54:04state, right?
00:54:05Um, Aztecs ripping out hearts.
00:54:09It sounds like you're reading current news in Mexico.
00:54:11Well, you know, Mexico is just reverting to its original state.
00:54:14Yeah.
00:54:18People survived socialism.
00:54:20Yeah.
00:54:20Yeah.
00:54:21Yeah.
00:54:21I mean, where is the sympathy for the Russians, of whom 70 million were slaughtered
00:54:29under communism?
00:54:30Where's the sympathy for them?
00:54:31Where's the sympathy for the millions of Germans who were brutalized, raped and destroyed,
00:54:36after the Second World War?
00:54:38Of course, a lot of them didn't have anything to do with the war and so on.
00:54:41All right, so, how many, and this is a boomer thing
00:54:50too, but not just a boomer thing.
00:54:52How many people care about the young and the national debt?
00:55:06How many people say, yeah, we really have to cut this,
00:55:09we really have to cut the spending.
00:55:12We have to reform the unfunded liabilities because, holy shit, every kid
00:55:16born in America is born into a million dollars or more of debt and unfunded
00:55:21liabilities.
00:55:21It's absolute slavery.
00:55:22Most people make about a million dollars over the course of their life.
00:55:25It's absolute fucking bankster slavery.
00:55:29We've got to do something about it.
00:55:31They don't care.
00:55:32They don't care.
00:55:33Do you care about the unborn?
00:55:35Or what about the million babies in the womb killed in America every year?
00:55:41Nobody cares.
00:55:42I mean, sorry, I shouldn't say nobody cares, but in general, the media, the mainstream, they
00:55:45don't care.
00:55:47Just people who come along and try and provoke this sympathy from you.
00:55:50They're just flexing their fingers to squid-finger out and spider-finger your wallet.
00:55:56That's all.
00:55:56That's all.
00:55:57All those manual operators of telephone lines lost their jobs when digital
00:56:04switching came along.
00:56:04Absolutely.
00:56:06You used to see all these white people, often women, plugging all of these
00:56:09cords in, eh, it's all gone.
00:56:12Yeah.
00:56:15Wish we could still travel by horse on a day to day.
00:56:18I ditched my car.
00:56:21Well, a horse almost rendered me infertile in Africa.
00:56:24That was rough, man.
00:56:25Um, okay, since you insist, I'll tell the story.
00:56:28So, uh, I was, um, visiting a friend of my father's, uh, who was a missionary.
00:56:34And this is when I went hunting.
00:56:36And what happened was, I took a horse and went out into the bush and
00:56:43toured around the big property.
00:56:46Now this horse had recently had a foal.
00:56:49Uh, it was a female horse.
00:56:50She'd recently had a foal.
00:56:51So getting her to leave was hard, and I was foolishly dressed; I'm a boxers kind of guy.
00:56:58So I was in boxers and track pants, right?
00:57:00So, uh, balls were hanging and clanging, right?
00:57:03And so I couldn't get the horse to move.
00:57:07I had to keep urging the horse to try to get her away.
00:57:10And, oh, yeah, well, she's got a foal.
00:57:11She doesn't want to leave, right?
00:57:12So we ended up getting quite far away.
00:57:14It takes a couple of hours.
00:57:15And then we turn and head for home.
00:57:17And she just, man, she just takes off like Wile E. Coyote.
00:57:21And I'm not an expert horse person at all.
00:57:23And I'm just hanging along.
00:57:24And it was like, I'm just getting natted and natted and natted on that horse leather.
00:57:29Oh, man, that was an ugly afternoon.
00:57:31Not as bad as the one time I went on a camel, though.
00:57:33That also was like, I could taste my own testicles in my forehead.
00:57:37Ouch.
00:57:38So, yeah, I can live without the horses.
00:57:42Somebody says, Paula says, I needed this.
00:57:44I don't care, rant.
00:57:45Oh, I needed this.
00:57:46I don't care, rant.
00:57:47I've been dealing with my stepsister to move our very senior parents out of their house
00:57:51to a senior living center.
00:57:52When she gets frustrated with them, she says, I wish they would die.
00:57:55I had to tell her how disgusting I found this.
00:57:57I've probably lost her as a family member.
00:58:00I probably lost her as a family member.
00:58:02I don't care.
00:58:03I wish they would die.
00:58:08Listen, I mean, I get that's a terrible sentiment, but why don't you just listen to her?
00:58:12Let her rant.
00:58:13Let her get it off her chest, right?
00:58:14All right.
00:58:20I was laid off early this year, probably in part due to AI.
00:58:23I work in AI.
00:58:24I mean, here's the thing.
00:58:26I mean, imagine all of the terrible job losses that are going to happen to all of the horrible
00:58:32banksters when crypto becomes the standard currency of the planet, right?
00:58:36Think of all of the people who work in finance, 6'5, blue eyes, right?
00:58:42Think of all the people.
00:58:43I mean, all of the people who are, you know, 33 to 1, hyperextended on their bullshit intergalactic
00:58:50loans, right?
00:58:51I mean, think of all of the people who print money and steal from the fixed income poor.
00:58:56Think of all the people who work for Visa and take their 3% of every transaction known
00:59:02to man and God and all by Satan.
00:59:05I just think of all of the Federal Reserve employees and all the banksters and the predators
00:59:09and the lenders and the investors and all of that.
00:59:12Imagine if your money grows in value just for existing, then you don't need to invest
00:59:16in shit.
00:59:17You don't need to be in the fucking stock market, which nobody wants to be in except
00:59:19four guys and a dog.
00:59:21Nobody wants to be in there, but you're herded in there because otherwise the stealth
00:59:25fingers of the banksters take your fucking life and your balls and your money just by
00:59:29sleeping, breathing and being alive.
00:59:31And after you die, they take even more.
00:59:33Are you going to sit there and say, ah, yes, but the poor central bankers and the poor
00:59:39massive instrument-creating predatory people who are constantly making money because everyone's
00:59:45herded into the stock market because their shit gets stolen while they sleep through
00:59:49inflation?
00:59:49What's going to happen to all these people?
00:59:51Oh, no.
00:59:51I mean, you know, when they got rid of slavery, there were a lot of people heavily invested
00:59:58in buying, owning, transporting and selling slaves.
01:00:02What happened?
01:00:04Do you care?
01:00:05I don't.
01:00:06I'm good.
01:00:06Good.
01:00:06I'm glad that slavery was outlawed, right?
01:00:08Oh, man, my heart absolutely breaks for the Bernie Madoffs of the world.
01:00:24Yeah, Sam Bankman-Fried and his semi-goblin girlfriend, they've got some really good-looking
01:00:30people to play them.
01:00:31See, sometimes they can up people's looks.
01:00:33Oh, my gosh.
01:00:35Must run early, early morning with a long weekend.
01:00:37Later, good people.
01:00:38Thank you for dropping by.
01:00:40Thank you for dropping by.
01:00:42FreeDomain.com slash donate to help out the show.
01:00:45FreeDomain.com slash donate to help out the show.
01:00:50I really would appreciate that.
01:00:52Thank you, of course, for your support.
01:00:54It means the world and a half to me.
01:00:57Let me sound off.
01:00:58Thank you, Matt, Joseph, Anthony, Lloyd, David, you Anglo-Saxon people.
01:01:07I appreciate that.
01:01:09I appreciate that.
01:01:11What happens if an AI is allowed to short-term trade?
01:01:13Infinite money glitch.
01:01:15Don't know.
01:01:16Don't know.
01:01:16Don't care.
01:01:18Yeah, I don't.
01:01:19I don't.
01:01:19I don't care.
01:01:21Like, what happens if?
01:01:22What happens if?
01:01:24I don't care.
01:01:25I don't want to care.
01:01:26It doesn't matter to me.
01:01:27And at this point, like, it's been, gosh, almost five years since I was, almost a half
01:01:32decade since I was deplatformed, right?
01:01:35Almost, right?
01:01:36And lost most of my audience, massive chunks of income and reach, and, you know, I had to
01:01:41adjust.
01:01:43And what I did was I said, okay, I've done enough politics.
01:01:45Now I want to do beauty, which is why I wrote, I'm on my third novel.
01:01:50I read my novels as audiobooks, and love it.
01:01:55I love it.
01:01:56I love it.
01:01:57And I am broken from obligation.
01:02:01I don't say to hell with the world, because that would be to have anger and hostility.
01:02:05I've just, it's this glorious intergalactic indifference.
01:02:08Because I have more investment in a movie from 1940 than I do in people's fate in the current
01:02:16world.
01:02:17So it's been almost a half decade.
01:02:22I barely even, every now and then, every now and then, it'll be like, oh, this thing
01:02:28happened in the world, and I have a strong opinion about it.
01:02:30And it's like, hmm, you know, that, that Dagny Taggart thing where she leaps at the phone
01:02:37towards the end of Atlas Shrugged, and she has to will herself to not care.
01:02:42And for the most part, I'm just chugging along.
01:02:44It's great, beautiful, don't care.
01:02:47And for the most part, it's an automatic process.
01:02:54Every now and then, there's like, you know, the old muscle memory.
01:02:58I played, yesterday I booted up.
01:03:00We haven't played in, I don't know how long, years.
01:03:03But my daughter and I used to play Rocket League, like side-by-side screens.
01:03:07And we booted it up, and it took like, I don't know, 20 minutes for the muscle memory to kick
01:03:12in.
01:03:12She's much better out in the field, and I'm better in goal, because she's better at the
01:03:16game as a whole than I am.
01:03:17Why?
01:03:18Because she's here to replace me, friends.
01:03:20So, yeah, it just takes a while for the muscle memory to kick in.
01:03:24And it's the same thing with the indifference thing.
01:03:25Like, oh, what's going to happen to these people?
01:03:28I don't care.
01:03:29I don't care.
01:03:30I don't care any more than they cared about me.
01:03:34I will give a kidney to the people I care about.
01:03:38I really will.
01:03:39I mean, my care, because it's kind of like either or to some degree, right?
01:03:44Because the more you care about the world, right?
01:03:46This is the out-group versus in-group preference, right?
01:03:48The more you care about the world, the less available you are to care about the people close
01:03:52to you.
01:03:52The more you care about the people close to you, the less available you are to care about
01:03:55the world.
01:03:56And I am massively invested in friends and family.
01:04:00And you, as an audience, honestly, thank you again so much.
01:04:03I'm massively invested in friends and family.
01:04:05And I am not invested in the world as a whole.
01:04:13What about this?
01:04:14What about that?
01:04:15What about the other?
01:04:17I don't wish it ill.
01:04:18I don't wish it well.
01:04:19I just...
01:04:21They pass right through you like a ghost.
01:04:25Baby, you're nowhere.
01:04:31Yeah.
01:04:32So, it is liberating and it is philosophical.
01:04:37It is liberating and it is philosophical.
01:04:41And I have no regrets.
01:04:45I am absolutely thrilled that I did what I did.
01:04:49You know, the sort of fix the world stuff.
01:04:53I am absolutely thrilled that I did what I did.
01:04:56I have no complaints about any of it.
01:04:59I would have...
01:05:00I would have massively regretted it if...
01:05:08I had not done it.
01:05:15I would have massively regretted it.
01:05:18Because I wouldn't have known for sure, right?
01:05:27All right.
01:05:29Somebody says,
01:05:31In science fiction,
01:05:32there is often a technology that is somewhat far-fetched.
01:05:36In your novel, The Future,
01:05:38I thought the angels were exactly that.
01:05:40The overly convenient plot mechanism
01:05:42to justify the world you described.
01:05:43Now we see that AI
01:05:44is already a potential foundation of the angels.
01:05:48Yeah.
01:05:49Yeah.
01:05:50Yeah.
01:05:51Yeah.
01:05:52I mean, I never programmed AI
01:05:54because I was a programmer like 30 years ago.
01:05:57But I definitely did goal-seeking
01:05:59and optimization stuff and all of that.
01:06:01So, sniffing your way
01:06:02to getting best business practices
01:06:06based upon the data and so on.
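[Editor's illustration: since he's describing a general technique here, a minimal sketch of that kind of goal-seeking optimization, assuming a toy pricing objective; the function names and numbers are hypothetical, not anything from the actual systems he built.]

```python
# Hypothetical goal-seeking sketch: nudge a parameter at random and keep
# the change only when the score improves, "sniffing" toward a better
# business practice based on the data.
import random

def score(price: float) -> float:
    # Toy objective: revenue, with demand falling as price rises.
    demand = max(0.0, 100.0 - 2.0 * price)
    return price * demand

def hill_climb(start: float, steps: int = 2000, step_size: float = 0.5) -> float:
    best, best_score = start, score(start)
    for _ in range(steps):
        candidate = best + random.uniform(-step_size, step_size)
        if score(candidate) > best_score:  # keep only improvements
            best, best_score = candidate, score(candidate)
    return best

print(f"best price found: {hill_climb(10.0):.2f}")  # analytic optimum is 25.00
```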
01:06:09So, all right.
01:06:10We've had one tip on locals.
01:06:13If you're...
01:06:13Or on Rumble.
01:06:16If you are finding what I'm saying helpful
01:06:18and I'm telling you,
01:06:19one of the best pieces of advice you can get
01:06:21is to not care about people
01:06:23who don't care about you.
01:06:24And the fact that I can put it
01:06:25entertainingly and engagingly
01:06:26and the fact that I live it myself as well
01:06:29is great.
01:06:31Great advice.
01:06:32Great advice.
01:06:33And where else do you get it?
01:06:34Where else do you get this kind of stuff?
01:06:36FreeDomain.com.
01:06:38Nowhere, that's the case.
01:06:40So, of course,
01:06:40you can go to FreeDomain.com
01:06:42slash donate
01:06:42and help out there.
01:06:45Or you can help right here
01:06:46on the apps on Rumble and Locals.
01:06:48Now,
01:06:48don't forget,
01:06:50you get this wonderful free book.
01:06:52It's a free book.
01:06:52This book called The Future
01:06:54will absolutely inspire you
01:06:56about everything that
01:06:57we are working towards
01:06:58or fighting for.
01:06:59So,
01:07:01I hope that you will
01:07:04check it out.
01:07:07Thank you, Soda.
01:07:07I appreciate that.
01:07:09I appreciate that.
01:07:10What would that buy?
01:07:13Six minutes of your time?
01:07:15I'm just kidding.
01:07:16I'm just waiting.
01:07:17You'll be listening for an hour.
01:07:18But no,
01:07:19I appreciate that.
01:07:19And so,
01:07:21let me see if there's other things
01:07:23that I have stored for y'all
01:07:25to get a hold of.
01:07:29What do I have stored for you?
01:07:33Only 49% of web traffic
01:07:34comes from humans.
01:07:35The other 51% are bots.
01:07:3837% are malicious,
01:07:41powered by AI
01:07:41and designed to manipulate
01:07:42what you see,
01:07:43think,
01:07:43and believe.
01:07:45Yeah,
01:07:45they flood comments.
01:07:46So,
01:07:47one of the things that happens
01:07:48to public figures
01:07:50is
01:07:52you get these bots
01:07:54that goad you
01:07:55into talking about things
01:07:57that will get you deplatformed.
01:07:59Unfortunately,
01:08:00I didn't need a lot of goading.
01:08:01But yes,
01:08:01there's definitely this
01:08:02goading thing.
01:08:03So,
01:08:04if they target someone,
01:08:05what they'll do is
01:08:05they'll flood them with bots
01:08:06saying,
01:08:07well,
01:08:07why don't you talk about this?
01:08:08You chicken,
01:08:09you coward.
01:08:10And here's the other thing
01:08:10that you should talk about
01:08:11and blah,
01:08:11blah,
01:08:11blah,
01:08:12right?
01:08:12and
01:08:15be careful of that.
01:08:22Be careful of that.
01:08:26The Zed blog wrote
01:08:27yesterday,
01:08:30I've written about
01:08:31modern sophists
01:08:32and sophistry
01:08:33quite a bit
01:08:33using people like
01:08:35Stefan Molyneux
01:08:36as a good example
01:08:37of the type.
01:08:38Jordan Peterson
01:08:38is another useful example
01:08:40of how they operate.
01:08:41Notice that he relies
01:08:42heavily on weird
01:08:43facial expressions
01:08:44and body movement.
01:08:45It is a tactic
01:08:46to suggest opinions
01:08:47about the other party
01:08:48in the minds of those
01:08:48watching the show.
01:08:50It fails here
01:08:51because Peterson
01:08:51cannot control the cameras
01:08:52and the staging
01:08:53that leaves him
01:08:54with just his words
01:08:55and that is not
01:08:55a good ground
01:08:56for him
01:08:57at this point
01:08:58in his career.
01:08:59Yeah,
01:09:00so,
01:09:00I mean,
01:09:00this is interesting.
01:09:04This is interesting
01:09:05because
01:09:06to be animated
01:09:12because,
01:09:13you know,
01:09:13I mean,
01:09:14I'm fairly animated.
01:09:16I mean,
01:09:16halfway towards,
01:09:17I'm halfway towards
01:09:18a Jim Varney cartoon,
01:09:20right?
01:09:20I'm very animated.
01:09:22That's not fake.
01:09:23I am generally animated
01:09:24as a whole.
01:09:25When I tell stories
01:09:26or when I talk,
01:09:27I'm animated.
01:09:28My voice is animated.
01:09:29My gestures are animated.
01:09:32I wear my heart
01:09:33on my sleeve
01:09:33and all this kind of stuff,
01:09:34right?
01:09:35So,
01:09:35what happens is,
01:09:36and the Zed blog,
01:09:37if anybody has,
01:09:40if anyone can find it,
01:09:41you can send it to me,
01:09:42host at freedomain.com
01:09:43where he's written
01:09:44about sophistry,
01:09:46right?
01:09:46I mean,
01:09:47I can be accused
01:09:48of a lot of things.
01:09:49Sophistry is just
01:09:50not one of them
01:09:50because
01:09:51sophistry is making
01:09:53the worst argument
01:09:53appear the better
01:09:54and I've built
01:09:56my arguments up
01:09:57from a blank slate
01:09:58first principles,
01:09:59right?
01:10:00I've got a whole,
01:10:01was it,
01:10:0219-part introduction
01:10:03to philosophy series
01:10:04where I build things up
01:10:05assuming we know nothing,
01:10:06like Cartesian style,
01:10:07Descartes'
01:10:08Meditations,
01:10:13right?
01:10:14And so,
01:10:16given that I have
01:10:19built all my arguments
01:10:20up from first principles,
01:10:21you can't call that
01:10:23sophistry.
01:10:24I mean,
01:10:24you could disagree
01:10:25with my arguments,
01:10:26of course,
01:10:26and,
01:10:26you know,
01:10:27I'm sure that some
01:10:27could be improved,
01:10:28but with regards to ethics,
01:10:31you know,
01:10:31I start with
01:10:32we know nothing,
01:10:32we have no virtue,
01:10:33there's no such thing
01:10:34as virtue,
01:10:34how do we build up
01:10:35a system of ethics
01:10:36from nothing?
01:10:37That's the opposite
01:10:37of sophistry.
01:10:40So,
01:10:41building things up
01:10:42from first principles
01:10:43is unassailable
01:10:45if the first principles
01:10:47are valid
01:10:47and the arguments
01:10:48are valid.
01:10:48It's absolutely,
01:10:49you can't do better
01:10:50than that.
01:10:50You cannot do better
01:10:51than blank slate
01:10:52building things up
01:10:53from first principles.
01:10:58Somebody says,
01:10:59I can't claim
01:11:00to be looking at you
01:11:01all that much,
01:11:01honestly.
01:11:02I mostly pay attention
01:11:03to audio and podcasts
01:11:04as a whole.
01:11:13I'm talking to a lot
01:11:14of people.
01:11:15Yes,
01:11:15but I,
01:11:16but me,
01:11:17but I,
01:11:18but me,
01:11:19see,
01:11:19there's an example
01:11:20of the animation,
01:11:21right?
01:11:23I mean,
01:11:23this is,
01:11:24a lot of women,
01:11:26a lot of women
01:11:27talk about
01:11:28their personal experience
01:11:28rather than
01:11:29generalized understanding
01:11:31and a woman says,
01:11:32well,
01:11:32I'm a woman
01:11:33and I don't do that,
01:11:33right?
01:11:34Wow,
01:11:35that was nasty.
01:11:35Oh,
01:11:47Dave,
01:11:48Dave,
01:11:48Dave,
01:11:49please spend some time
01:11:50around men.
01:11:52I'm sorry that you were
01:11:53raised without a father.
01:11:54I'm sorry that you don't
01:11:55have much male influence.
01:11:58But,
01:12:00I mean,
01:12:01I'm making a sort
01:12:02of general case
01:12:03about being animated
01:12:04and somebody says,
01:12:05well,
01:12:05I don't really look at you.
01:12:06I only listen.
01:12:07But I said,
01:12:07my voice is animated,
01:12:08right?
01:12:09So when I say
01:12:11my face is animated
01:12:12and my voice is animated
01:12:13for somebody saying,
01:12:14well,
01:12:15I,
01:12:15me, me,
01:12:15I don't look at you.
01:12:17I only listen to you.
01:12:17But it still doesn't matter;
01:12:18again,
01:12:19I'm making a case
01:12:19in general.
01:12:21And this is somebody
01:12:23talking about
01:12:23their own particular
01:12:24personal experience.
01:12:26I assumed that was common.
01:12:27Perhaps that's a bit
01:12:28narcissistic of me.
01:12:29Well,
01:12:30I'm not saying
01:12:30it's narcissistic.
01:12:31I'm just saying that
01:12:32try to listen
01:12:33to a general argument
01:12:34without inserting
01:12:35your own particular
01:12:36experience,
01:12:37right?
01:12:38This really is,
01:12:39if you're a tall woman
01:12:42and men say,
01:12:44or a man or a woman,
01:12:45anyone says,
01:12:46men are on average
01:12:47taller than women,
01:12:48you say,
01:12:48well,
01:12:48I'm a tall woman,
01:12:50right?
01:12:50So,
01:12:51so,
01:12:54the Zed blog,
01:12:56it's interesting.
01:12:57I don't know
01:12:57anything about them,
01:12:59but,
01:13:00why would you
01:13:02analyze facial
01:13:04expressions
01:13:04rather than
01:13:06the argument itself?
01:13:07Because,
01:13:07it's interesting,
01:13:08because what's being
01:13:09said here,
01:13:10and I,
01:13:11I'm not saying this is,
01:13:12I'm not going to read
01:13:12this guy's mind
01:13:13or whatever,
01:13:13but I think that
01:13:14the effects of that
01:13:15is saying,
01:13:16well,
01:13:16if you have,
01:13:17if you're animated,
01:13:18if you're
01:13:19entertaining,
01:13:20if you're engaging,
01:13:21if you make some jokes
01:13:22or you roll your eyes
01:13:22or whatever it is,
01:13:23right?
01:13:23If you're,
01:13:24if you're animated,
01:13:25then you're a sophist,
01:13:26right?
01:13:27Well,
01:13:27that's disarming
01:13:28effective communications,
01:13:30because then,
01:13:31you know,
01:13:31you really can't be
01:13:32animated,
01:13:33you have to
01:13:34pull a full
01:13:35Novocaine on the brain,
01:13:37Sam Harris
01:13:38kind of approach.
01:13:40sorry,
01:13:41sorry,
01:13:41I moved my head
01:13:41a little,
01:13:41let me get it
01:13:42fixed back
01:13:43a little bit
01:13:43there,
01:13:44and you have
01:13:44to be a
01:13:45droning monotone
01:13:47ostrich egg
01:13:48of barely perceivable
01:13:50rational arguments,
01:13:51and maybe you'll be
01:13:52right,
01:13:52and maybe,
01:13:53um,
01:13:55you'll have
01:13:56something effective
01:13:58to say,
01:13:59but the way in which
01:14:01you don't want
01:14:02to be a sophist
01:14:02is you just don't
01:14:03want to be animated,
01:14:04you don't want
01:14:05to be engaging,
01:14:07you don't want
01:14:08to make any jokes,
01:14:09you just want
01:14:10to drone,
01:14:11like a slowly
01:14:13buzzing tinnitus
01:14:14bee of
01:14:16steady metronome
01:14:17argumentation.
01:14:19Sorry,
01:14:20even that language
01:14:20is probably a little
01:14:21bit too florent,
01:14:22so you just have
01:14:23to be calm
01:14:25and reasonable
01:14:26and focused
01:14:27and monotone,
01:14:28because Lord knows
01:14:28if you have any
01:14:29facial expressions
01:14:30or any animation
01:14:31in your voice
01:14:31or any spontaneity
01:14:32in your communication,
01:14:33you're a sophist.
01:14:35Well,
01:14:36that's trying
01:14:41to make people
01:14:41self-conscious
01:14:42about weird
01:14:44facial expressions.
01:14:46Who the fuck
01:14:46are you to say
01:14:47what is a weird
01:14:47facial expression?
01:14:48If I have
01:13:49facial expressions,
01:14:50that's my face.
01:13:54Nothing's weird about it.
01:14:55So it's a way,
01:14:56and again,
01:14:56maybe he's done
01:14:57a lot of more analysis,
01:14:58but I just thought
01:14:59that was interesting.
01:15:01It wasn't a particularly
01:15:01big tweet,
01:15:02but if anyone
01:15:03can find
01:15:04where he's talked
01:15:05about me,
01:15:06I'd be curious.
01:15:07I'd be curious
01:15:07to see that.
01:15:10Somebody,
01:15:11this is Overton Windmill,
01:15:14said,
01:15:14I'm not a libertarian,
01:15:16but Grok's attempt
01:15:18to prove
01:15:19Stefan Molyneux's
01:15:20UPB
01:15:20in 100 words
01:15:22for a 100 IQ audience.
01:15:24So this is
01:15:25Grok's attempt
01:15:26to prove
01:15:27universally preferable
01:15:28behavior,
01:15:29my theory of ethics,
01:15:30my fact of ethics.
01:15:32So prove
01:15:33Stefan Molyneux's
01:15:34UPB
01:15:34in 100 words
01:15:36for a 100 IQ audience.
01:15:39And this is
01:15:40what Grok said.
01:15:41If you steal,
01:15:42you want to keep
01:15:43your stuff,
01:15:44but stealing
01:15:45says others
01:15:46can take it too.
01:15:48That's a contradiction.
01:15:49You can't want both.
01:15:50Same with lying
01:15:51or murder.
01:15:51They need a victim
01:15:52who disagrees,
01:15:53so they can't be universal.
01:15:55Truth-telling
01:15:55or respecting property
01:15:56works for all,
01:15:57no contradictions.
01:15:58UPB holds
01:15:59because actions
01:16:00must pass this logic test
01:16:01to be moral.
01:16:02If they fail
01:16:02like theft,
01:16:03they're wrong
01:16:04for everyone,
01:16:05proving UPB's
01:16:06universal standard.
01:16:07That's pretty good.
01:16:09Lord knows
01:16:10as I cruise
01:16:10into my hour 20,
01:16:11concision is not
01:16:12necessarily my
01:16:13strong suit,
01:16:14but that was
01:16:14pretty good, man.
01:16:16That's pretty good.
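[Editor's illustration: one toy way to formalize the universalizability test Grok sketches above; the data fields and the single check are my own simplification, not the book's formal apparatus.]

```python
# Toy universalizability check: a behavior that requires an unwilling
# victim cannot be preferred by everyone at once, so it fails the test.
from dataclasses import dataclass

@dataclass
class Behavior:
    name: str
    requires_unwilling_victim: bool  # e.g. theft needs an owner who objects

def is_universalizable(b: Behavior) -> bool:
    # "Everyone should prefer this" contradicts needing someone who doesn't.
    return not b.requires_unwilling_victim

for b in [Behavior("theft", True), Behavior("murder", True),
          Behavior("truth-telling", False), Behavior("respecting property", False)]:
    print(b.name, "passes" if is_universalizable(b) else "fails")
```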
01:16:23I wonder if that fellow
01:16:24has ever criticized
01:16:25anyone for being boring.
01:16:27Yes, so the left
01:16:28is very entertaining
01:16:29and let's give credit
01:16:31where credit is due.
01:16:32They can't meme
01:16:32to save their lives,
01:16:33but they're pretty fantastic
01:16:35at snarky,
01:16:36bitchy comedy.
01:16:38Right?
01:16:39I mean,
01:16:39they're really good
01:16:40at that.
01:16:41I mean,
01:16:41if you ever want to like,
01:16:43oh, what's his name?
01:16:44The guy who, they say,
01:16:45looks like a librarian
01:16:48or a parrot.
01:16:48What was his name?
01:16:49Oh, it's on the tip
01:16:55of my tongue.
01:16:56He's got a whole TV show,
01:16:57you know,
01:16:57the owl-eyed
01:16:57British.
01:17:00Is he a British guy?
01:17:01Somebody will help me out.
01:17:04Does Grok agree
01:17:04with UPB?
01:17:05No.
01:17:06No.
01:17:06Grok does not agree
01:17:07with UPB
01:17:08because AI can't agree
01:17:10with anything.
01:17:10It's just a word guesser.
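[Editor's illustration of the "word guesser" point: a toy bigram model that predicts the next word purely from counted frequencies. Real LLMs are vastly more sophisticated, but the core move is the same: score candidate continuations and emit a likely one, with no beliefs or agreement involved. The corpus here is made up.]

```python
# Toy "word guesser": count which word follows which, then predict the
# most frequent continuation. No understanding, just frequencies.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def guess_next(word: str) -> str:
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else "<unknown>"

print(guess_next("the"))  # -> "cat", the most frequent continuation
```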
01:17:12John Oliver.
01:17:15John Oliver.
01:17:16You'd think I remember that
01:17:17because I was in the musical
01:17:18Oliver when I was a kid.
01:17:19But, yeah,
01:17:20John Oliver.
01:17:21Somebody has written
01:17:22a whole thing
01:17:24about how John Oliver's...
01:17:26I'm going to see
01:17:27if I can find it.
01:17:28How John Oliver's
01:17:29shtick works.
01:17:31John Oliver
01:17:32show pattern.
01:17:35I think I'll find it.
01:17:36If not,
01:17:37I will waste
01:17:39a half hour
01:17:39of everybody's life
01:17:40doing that.
01:17:44Yeah,
01:17:45I don't...
01:17:46But, yeah,
01:17:46I mean,
01:17:47it's just snarky,
01:17:48right?
01:17:48It's just snarky stuff.
01:17:50Yeah,
01:17:50I'm not going to find it.
01:17:52But,
01:17:53somebody did a whole thing.
01:17:54I think it was on Reddit
01:17:55breaking down
01:17:56John Oliver's shtick
01:17:57and all of that stuff.
01:17:58But they're very entertaining.
01:17:59You look at Seth Meyers
01:18:00and John Oliver
01:18:02and
01:18:02Jon Stewart
01:18:05and so on.
01:18:06They're very entertaining
01:18:07or
01:18:08they're just
01:18:11the sort of
01:18:12late night hosts.
01:18:13They're very entertaining
01:18:14and they're funny
01:18:14and they're animated,
01:18:15right?
01:18:16And they're passionate
01:18:16and they're engaging,
01:18:18right?
01:18:18So,
01:18:20heaven forbid,
01:18:21you know,
01:18:21somebody try
01:18:22to do something.
01:18:24Thank you,
01:18:25Dorbans.
01:18:25Heaven forbid somebody
01:18:26try and do something
01:18:27engaging or entertaining
01:18:28who's not on the left,
01:18:29right?
01:18:29And again,
01:18:30I wouldn't say
01:18:30I'm particularly
01:18:31on the right,
01:18:32but all of that,
01:18:33right?
01:18:33So,
01:18:38yeah,
01:18:38it's really sad.
01:18:42And it's just a way
01:18:43of crippling people.
01:18:44It's just a way
01:18:45of crippling people,
01:18:46that's all.
01:18:48Oh,
01:18:48you can't be too animated
01:18:50because that's having
01:18:50weird facial expressions
01:18:52and if you have weird
01:18:52facial expressions,
01:18:53you're a sophist,
01:18:54so you've got to go back
01:18:54to being a monotone.
01:18:56Nobody's going to listen
01:18:56to you,
01:18:57right?
01:18:58Nobody's going to listen
01:18:58to you.
01:19:00It's just a way
01:19:00of crippling your enemies,
01:19:02so to speak,
01:19:03right?
01:19:03St. Louis Post-Dispatch
01:19:09wrote,
01:19:10a bald eagle
01:19:11thought to be hurt
01:19:12was really just
01:19:13too fat to fly,
01:19:14Missouri officials say.
01:19:16And somebody wrote,
01:19:18homie took being
01:19:18America's mascot
01:19:19to the next level.
01:19:26There's a woman,
01:19:27I don't particularly
01:19:29enjoy making fun
01:19:31of people's attributes,
01:19:32but there's a woman,
01:19:33I think she's on TikTok
01:19:34or something like that
01:19:35and she has this
01:19:35absolutely ginormous mouth.
01:19:37Like,
01:19:38it's a big mouth.
01:19:39I have a big forehead,
01:19:41she has a big mouth
01:19:42and the comments
01:19:44are actually very funny.
01:19:45She smiles in six
01:19:46different time zones.
01:19:47Her smile starts on Monday
01:19:49and ends on Sunday.
01:19:51She speaks in IMAX.
01:19:52She can whisper
01:19:53in her own ear.
01:19:54She doesn't talk,
01:19:55she broadcasts.
01:19:56She speaks for all of us.
01:19:57She can sing two songs
01:19:58at the same time.
01:20:00She'd give you,
01:20:01she'd give Pac-Man
01:20:02a run for his money.
01:20:03Her lipstick bill
01:20:03is higher than her rent.
01:20:05She ain't giving
01:20:06French kisses,
01:20:07she gives Europe kisses.
01:20:08She gives blow careers,
01:20:10not jobs.
01:20:11Favorite tea,
01:20:12Lipton.
01:20:12It's very funny.
01:20:13And you know how,
01:20:15oh,
01:20:15the audio crapped out here.
01:20:17It's an article.
01:20:18A teen finds out
01:20:19that the anonymous
01:20:20internet bully
01:20:21who harassed her
01:20:21for a year
01:20:22is actually her mom.
01:20:24That's wild.
01:20:24Some of those stories
01:20:25are just amazing.
01:20:27If a Minecraft world
01:20:27is infinite,
01:20:28how does the sun
01:20:29go around it?
01:20:30Very important questions
01:20:31from the world
01:20:32of engineering.
01:20:35I'm still,
01:20:36on social media,
01:20:38I will still follow
01:20:38a couple of Dungeons
01:20:39and Dragons.
01:20:41I will still follow
01:20:42a couple of Dungeons
01:20:43and Dragons accounts.
01:20:44I still just have
01:20:46such a soft spot
01:20:47for that game.
01:20:48And I have talked
01:20:49about it as a whole
01:20:51every now and then.
01:20:52Hit me with a "Y"
01:20:53if it would be
01:20:54interesting for people
01:20:55to,
01:20:57hit me with a "Y"
01:20:58if it would be
01:20:58interesting for people
01:20:59to do a D&D night
01:21:00if I host
01:21:01as a Dungeon Master.
01:21:03Would you be interested
01:21:04in doing a D&D night?
01:21:10Because I still have
01:21:11a great deep
01:21:12and soft spot
01:21:13for the game
01:21:14as a whole.
01:21:15and they're usually,
01:21:17one of the things
01:21:18I love about Dungeons
01:21:19and Dragons
01:21:19is it can be
01:21:20so absolutely,
01:21:21completely
01:21:21and jaw-droppingly
01:21:23cry till the milk
01:21:25comes out of your nose
01:21:26with laughter
01:21:26funny.
01:21:28And I don't know
01:21:28why it is exactly
01:21:29but there's something
01:21:30about,
01:21:31James, that's fairly true,
01:21:32right?
01:21:32Because we played
01:21:32a little bit,
01:21:33right?
01:21:34It's fairly true
01:21:35that it can be
01:21:35quite hilarious
01:21:36and all of that.
01:21:38And so
01:21:38I,
01:21:42maybe I'll,
01:21:43maybe I'll do that.
01:21:43You've never played D&D?
01:21:50Yeah,
01:21:50that's fine.
01:21:50That's fine.
01:21:51It would be something
01:21:52we would sort of record
01:21:53and maybe put out
01:21:55to donors
01:21:55because it can be
01:21:56just hilarious.
01:21:58Did you read this?
01:21:59A star Harvard
01:22:00business professor
01:22:01stripped of tenure
01:22:02fired for manipulating
01:22:03data in studies
01:22:04on dishonesty.
01:22:05Oh,
01:22:08that's wild.
01:22:12That's really wild.
01:22:18All right.
01:22:19Let me just,
01:22:20if there's any
01:22:21sort of last questions.
01:22:24Is this true?
01:22:25Billy Joel
01:22:26was diagnosed
01:22:26with a sudden
01:22:27onset brain disorder.
01:22:28Is that true?
01:22:29Billy Joel,
01:22:34a fully vaccinated
01:22:35musician,
01:22:36this is Grok,
01:22:37was diagnosed
01:22:37with normal pressure
01:22:39hydrocephalus.
01:22:41They say
01:22:42I got a lot of water
01:22:43in my brain.
01:22:45A rare brain disorder
01:22:46involving fluid buildup,
01:22:48which this post
01:22:49links to
01:22:49mRNA COVID vaccine,
01:22:50noting Joel's public
01:22:51support for the vaccine
01:22:52rollout during the pandemic.
01:22:53The post references
01:22:55a known side effect
01:22:56of mRNA vaccines
01:22:57supported by a 2021
01:22:58study in Clinical
01:23:00Immunology
01:23:00on 704,003
01:23:03Pfizer-BioNTech
01:23:04recipients in Mexico,
01:23:05which reported
01:23:06rare but severe
01:23:06neurological events
01:23:07like seizures
01:23:08and Guillain-Barre
01:23:09syndrome,
01:23:10though NPH
01:23:12specifically isn't
01:23:13mentioned,
01:23:13so I don't think
01:23:14it's a known side effect.
01:23:16Joel's rapid decline
01:23:17and tour cancellations
01:23:18through July 2026
01:23:20raised questions
01:23:21about vaccine-related
01:23:22neurological risks,
01:23:23blah, blah, blah.
01:23:25I like Billy Joel.
01:23:26Myself,
01:23:28for some reason,
01:23:29this is probably
01:23:31not particularly
01:23:31important for you,
01:23:32but I remember
01:23:34learning how to skate
01:23:35in Canada
01:23:36and
01:23:37I don't care
01:23:39what you say anymore,
01:23:40this is my life.
01:23:43Go ahead
01:23:44with your own life,
01:23:45leave me alone.
01:23:47I just remember
01:23:47listening to that
01:23:48and I also remember
01:23:49being,
01:23:51oh gosh,
01:23:52in my teens
01:23:53and listening
01:23:55in headphones
01:23:56to the song
01:23:56Vienna.
01:23:58Slow down,
01:23:59you crazy child,
01:24:00you're so ambitious
01:24:02for a juvenile,
01:24:03but then if you're
01:24:04so strong,
01:24:05tell me,
01:24:05why are you
01:24:06still so afraid?
01:24:08It's a beautiful song,
01:24:09beautiful song,
01:24:10but it's funny
01:24:12because he just
01:24:12kind of stopped
01:24:13recording any new songs.
01:24:15Big Man on Mulberry Street
01:24:16and that's it,
01:24:17man,
01:24:17that's it.
01:24:18And you've got
01:24:20to listen to
01:24:20the duet he did
01:24:21with Ray Charles,
01:24:23my baby,
01:24:23that baby grand
01:24:24and when he's
01:24:27in concert
01:24:28and he imitates
01:24:28Ray Charles,
01:24:29it's incredible.
01:24:30I mean,
01:24:31a really talented guy.
01:24:34A really talented guy.
01:24:36But yeah,
01:24:36he just stopped,
01:24:37but
01:24:37I don't care
01:24:41what you say anymore,
01:24:42this is my life.
01:24:43Absolutely great
01:24:43for me personally,
01:24:44he says.
01:24:45James,
01:24:46yeah.
01:24:47Don't get me wrong,
01:24:49I still belong
01:24:50and you can speak
01:24:51your mind,
01:24:52but not on my time.
01:24:58Oh yeah,
01:24:58it's a great,
01:24:59it's a great song.
01:24:59He's a great singer.
01:25:00My God,
01:25:01listen to him
01:25:01do Innocent Man.
01:25:03What an amazing vocalist.
01:25:06Tom Cruise
01:25:07is the same age
01:25:08as Ian McKellen
01:25:08was when he played Gandalf.
01:25:10I don't know what,
01:25:11we talked about this before,
01:25:12but it's Moving Out,
01:25:14it's a great song.
01:25:16Scenes from,
01:25:17Scenes from an Italian Restaurant,
01:25:19also a great song.
01:25:20And yeah,
01:25:21I just loved
01:25:22Billy Joel's stuff,
01:25:23it's great.
01:25:26Or even he said
01:25:27some of his songs
01:25:31are kind of like
01:25:32dental drills.
01:25:34The one about history,
01:25:35it's kind of like
01:25:36dental drills.
01:25:37all right.
01:25:46Any other last thoughts,
01:25:47questions,
01:25:48issues,
01:25:48comments?
01:25:52Yes,
01:25:52Moving Out
01:25:52is a great song.
01:25:54Some of Billy Joel's songs
01:25:55are survivable
01:25:55with a mediocre
01:25:56baritone approach
01:25:59such as I have.
01:26:00I can do a few things.
01:26:03Still rock and roll to me,
01:26:04that's vaguely handleable,
01:26:06you know,
01:26:06in the same way
01:26:07that most people
01:26:07can sing
01:26:08We Will Rock You
01:26:08by Queen,
01:26:09but God help you
01:26:09if you ever try
01:26:10to take on something
01:26:11like Somebody to Love
01:26:12or anything else.
01:26:14But yeah,
01:26:15it's a great,
01:26:15Billy Joel has some
01:26:17fantastic songs
01:26:18and just fantastic songs.
01:26:21All right,
01:26:22well,
01:26:22I will stop here.
01:26:23I really do appreciate
01:26:24everyone's time tonight.
01:26:25If you could help out
01:26:26the show,
01:26:27I would deeply
01:26:28and humbly
01:26:28and gratefully
01:26:29appreciate it
01:26:29so that I can care
01:26:31about you.
01:26:32freedomain.com
01:26:32slash donate
01:26:33if you want to set up
01:26:34a call-in show,
01:26:35private or public,
01:26:36happy to do it.
01:26:38We've got a real gift,
01:26:39and I really do
01:26:40thank the listener.
01:26:41He did a private
01:26:42call-in show.
01:26:43It's very powerful stuff
01:26:44because again,
01:26:44I'm pretty blunt
01:26:45and direct in the private
01:26:46call-in shows
01:26:47and he agreed
01:26:48or he offered
01:26:50to make it public
01:26:51and that's very kind.
01:26:52It's very generous.
01:26:53I hugely appreciate it.
01:26:54That's going to go out
01:26:55to donors over the weekend
01:26:56and then at some point
01:26:57in the future
01:26:57it will go out
01:26:58to the general stream,
01:26:59I think.
01:27:00But I guess
01:27:02that's a chance
01:27:03for you to listen
01:27:03into what a private
01:27:04call-in show is like
01:27:05and that's very kind
01:27:06of him
01:27:06and I appreciate that
01:27:07and so you can go
01:27:08to freedomain.com
01:27:09slash call
01:27:09to help out the show
01:27:11as far as that goes
01:27:12and have yourself
01:27:13an absolutely wonderful weekend.
01:27:14We will speak to you Sunday.
01:27:16Lots of love.
01:27:17Take care, my friends.
01:27:18Bye-bye.