  • 6/29/2025
Today on The Cameron Journal Podcast we are talking with Joseph Lenard, host of Christitutionalist Politics (just one of his shows) and a former IT guy turned political commentator. We met before on a shock-jock show that was a lot of fun, so we meet up again to have an even better conversation and break down the latest changes in AI.

You can learn more at: https://podsite.fm/christitutionalist-tm-politics
Transcript
00:00Thank you very much.
00:30Today on the Cameron Journal podcast, we are joined by Joseph Lenard.
00:35He is the author of Terror Strikes, Coming Soon to a City Near You, and How to Write a Book and Get It Published.
00:42He's a former information technology professional, and he now does political conservative commentary and blogs and writes and all this type of thing from the great state of Michigan.
00:51So welcome, Joseph, to the Cameron Journal podcast.
00:55Sorry to do this, but an immediate correction.
00:59It looks French.
01:01It's not.
01:02It's not Lenard.
01:04It's Leonard without an O.
01:08It's Joseph M. Leonard because that's my author name.
01:14Well, that's my legal name and my author name that I use because there is a Joseph Lenard out of South Carolina.
01:23So I, you know, need to make that distinction.
01:27Yeah.
01:27No, I'm glad we got it cleared up, but that's a perfect transition right into why don't you tell us a little about yourself and your work and what you're talking about these days.
01:36Yeah.
01:36Yeah.
01:38Born and raised in Southeast Michigan area.
01:41And I choose to stay here, even though I'm not fond of Michigan winters.
01:46You know, I, I'd love to live in like Vegas, but I stay in Michigan and, uh, yeah, uh, I'm on disability.
01:57I was an IT guy.
01:58I got a massive burnout, you know, constant calls all through the evening.
02:05And the body didn't get proper sleep.
02:08And at some point my body said to me, okay, you're not going to slow down.
02:13We're going to slow you down.
02:15Like it or not.
02:17Bam.
02:17Shut down.
02:18Immune system.
02:21Yes.
02:22So, so I'm on disability, which is when I wrote and recorded music.
02:27I come from a musical family, Ted Leonard Jr.
02:30and the Polka Kings.
02:32You can still find their albums today, but I didn't want to go into polka music, but I did write and record music for a while.
02:40Never got a record deal.
02:42So I'm continuing kind of the family writing legacy through books now and do online articles.
02:51I host my own Christitutionalist Politics podcast show.
02:55I'm also AKA Jokester Joe and Raging Joe on the Savaged Unfiltered podcast.
03:04So, as far as Jokester Joe, my specialty is lame, bad, stupid, loose wordplay.
03:12Like your show, the Cameron journal.
03:15It's like, for me, you're now going to be the Camera On Journal.
03:21That's very clever.
03:27No, it's pretty lame.
03:28Actually, it might be a little clever, but it's definitely lame.
03:33That, sadly, will not have been the worst thing that's been said about me or the show.
03:38So I'm sure.
03:39I'm sure.
03:40Yeah.
03:41Yes.
03:42Yes.
03:42Well, why don't, why don't you tell us a little bit about your, your book to start?
03:46We'll start with that.
03:47Why don't you tell us a little about your book?
03:48Yeah, I've been writing my whole life. Like I said, I wrote and recorded music, wrote stories,
03:55short stories, novels, poems, whatnot, my whole life, you know, since my young ages.
04:02But only now do I have five internationally available books.
04:08The first internationally available book is Terror Strikes, Coming Soon to a City Near You.
04:15And that subtitle is, of course, very important because there are a million different, I'm,
04:26you could see me groping back here for something, and it'll make sense why in a minute, because
04:31I didn't memorize this.
04:33But Terror Strikes, Coming Soon to a City Near You is my first internationally available book.
04:39And shout out to people in Australia, for some reason, my book does really well through
04:44Booktopia in Australia.
04:47But it's not a book about death.
04:49Yes, it's about terrorism, but it's not a book about death; it's about life and living.
04:54And yeah, I'm reading this because I can never memorize it.
04:57No, it's okay.
04:58Life over death, hope over fear, faith over despair, love over hate, good over evil, individualism
05:06over collectivism, freedom over tyranny, family and friends.
05:11So yeah, terrorism, it's really about Martin, who is a journalist who's going to write a book
05:18about terrorism.
05:20Hence, that's the thread that holds all those sub-themes together.
05:28Really more about life than death and destruction.
05:32And then my second book, I was on lots of shows, you know, like yours for Terror Strikes.
05:38And they say, well, just come back and talk about writing and publishing.
05:44Sure.
05:45So there's a million books on how to get your book on Amazon.
05:50Yeah.
05:51Well, you could read the Amazon help files for that.
05:54You don't need to pay anybody for that.
05:55So I figured there's a market for how to write a book and get it published, hints, tips, and
06:02techniques that take you from concept to writing and publishing.
06:09What are your realistic options?
06:11After you've published, you're not done.
06:14Then comes promotion and marketing and all that stuff.
06:18So there was a book for that.
06:20And indeed, several people are using my book now for their book that will be out later
06:27this year or next year.
06:29And a few people already in the past have done that.
06:32My third international was Christitutionalist Politics, based on my podcast.
06:38Then I wrote Podcasting Quick Start Guide because a lot of people would like to do a podcast.
06:44And I tell you, hey, you could do it from your living room like I am right here.
06:50See my ugly couch back there behind my green screen, just doing it from my living room.
06:58And then Christitutionalist Politics 2 came out, again drawing further on my podcast.
07:06At any rate, that's probably TMI for most people.
07:09They didn't want to know all that.
07:10No, no, it gives us a great context of who you are and your politics and your view and all this type of thing.
07:18The real reason I'm here, of course, is as a former IT guy to talk AI.
07:25Yes, well, and we should turn our attention to that.
07:28So why don't we get it?
07:30So let me set the scene for everybody because there's been – we're recording this on the 30th of May, 2025.
07:36This will not come out for a while yet, but this morning on Twitter, the shit has hit the proverbial fan when it comes to job losses from AI, and everyone's kind of got their hair on fire a little bit today.
07:53So it's kind of apropos that we're recording this now.
07:56So people are starting to realize software companies have slowed down or quit hiring, especially for junior roles.
08:04They're predicting 50% of entry-level jobs are going to disappear, and as someone who went through the Great Financial Crisis,
08:09I know how devastating that can be for young people trying to get a career off the ground.
08:13You have senior devs that are trying to make AI a version of themselves and clone themselves.
08:21There's a lot happening right now.
08:23There's a lot of predictions saying by the end of this decade, 50% of white-collar work will no longer exist.
08:29So there's a lot of dialogue happening right now, and then, you know, everyone's saying, okay, well, I guess we'll all, you know, be in the trades and everything.
08:41And then someone says, not with Elon's new humanoid robots that are already working in Amazon factories and all this type of thing.
08:47You know, that may get a five-year extension because working in the physical world is very difficult, but it won't be too much longer before a lot of those jobs go away as well.
08:56So, what have we learned from all of this?
09:00This is no longer theoretical.
09:02It's here.
09:04And so –
09:04And we briefly discussed this.
09:07I co-host Savaged Unfiltered, as I mentioned.
09:11You were a guest on Savaged Unfiltered, and we touched on a little of this there.
09:16So this is almost a follow-up episode.
09:20Yes, yes.
09:21I actually kind of forgot that, and I thought I recognized you, and I didn't know from where.
09:24So thank you.
09:26And so, yes, yes, so that's very, very good.
09:30So then we should then kind of turn our attention to you're talking about AI, copyright, law, all this type of thing, which is all getting very muddy.
09:41And some people say, well, we should just get rid of copyright in general.
09:44And I'm kind of like, well, that's a nice idea if you don't produce anything intellectual, you know, sort of thing.
09:50But for, you know, for the rest of us that do that and that's our business, this is an existential threat sort of thing.
09:56So understanding that things with AI are getting very real right now.
10:01In January, in my 2025 trends report, I said this is the year the AI rubber hits the road.
10:09If they can figure out how to solve business problems and create business cases, then AI will move forward.
10:13Otherwise, it probably won't.
10:15It looks like they're solving the problem and AI is moving forward.
10:17So that's the scene.
10:19That's the milieu and zeitgeist we're in.
10:22Where, where, where are you with all of this?
10:25What's, what's the latest problem with AI besides taking all of our jobs that we must worry about?
10:30Yeah.
10:31From the writing and research and thesis papers our kids do in school aspect of it.
10:41Indeed, I wrote a piece, AI and the Law, at beforeitsnews.com and thelibertybeacon.com, and the basis of why I'm here is indeed that Congress, Democrats and Republicans alike (this is not a partisan thing), are ignoring, as they too often do, a situation until it turns critical.
11:11They're not updating the copyright laws, the fair use law, if you're familiar with that, trademark even, infringement potentials, because kids, and teachers know this, are just going to whatever AI bot and saying, I need a paper on such and such, and it's spewing it out.
11:39And they're basically plagiarizing what AI gave them and calling it their own.
11:46That's plagiarism.
11:47That's a violation.
11:49Now, the terms of service of your AI service might say, yes, you can use this.
11:56But just the moral dishonesty of trying to claim it as your own, let alone the legality that hasn't been cleared up of that plagiarism, is an issue.
12:11Now, I use AI at times.
12:14They're not; every word of my books is me.
12:19I wrote them.
12:21None of it, none of it is AI.
12:23I use AI at times online, in online articles, and if I do that, I disclaim that I've used Galaxy AI.
12:38Galaxy AI created this picture.
12:41Galaxy AI created that summary of a video I did, right?
12:47I am giving proper accreditation, and that's the key word here.
12:53Because AI is what?
12:57Really a fancy search engine.
12:59It's just an aggregator of information that it is pulling and pulling.
13:05Now, if AI lifts person X saying Y about Z, and you're researching subject Z and writing a paper, and the AI gives you the Y quote, but not in quotes, and not accrediting person X, and you use it in a paper, guess what?
13:29But your name's on it, and you're the one guilty of copyright violation, and plagiarism, unless it falls under fair use.
13:39So if you don't CYA, and at least say this article, in whole or in part, was partially produced through Galaxy AI, or Grok, or Google, whatever, you're on the hook.
14:01Just like the Chrisleys were just pardoned for tax evasion.
14:07Well, your name is the one on the bottom of the tax return.
14:12You hire them to help you, but you, you are the one on the hook.
14:18I think the difficult, I mean, I think there's two difficulties: one, because AI is a black box, most people don't know that; and because it's a black box, we cannot ever retrace its steps to find out where it got what from, to prove where it came from.
14:38Proving copyright is very difficult.
14:39It's why there's been no legal cases.
14:40It would fall apart in court.
14:41Well, I don't know.
14:44You know, I hope, I hope they have logs.
14:49And again, this is the law issue.
14:52Congress needs to pass a law.
14:54Any question typed into AI and the response given should be, by law, required to be recorded in a log somewhere, so it can be legally obtained, not FOIA'd,
15:11because private companies aren't subject to FOIA laws, unless they update FOIA laws, but a court, a lawyer, could subpoena the logs that way.
15:25We don't know if that is or isn't happening.
15:29I don't even think most of these models even have that capability, because they take in all this training data, and then they put words together in the order that matches what it thinks you're doing.
15:39I don't even think that, and here's the problem, I don't even think that sort of data even exists.
15:45The one thing I do like that some of the newest models have been doing, and I know because I've been getting clicks from them, is they are now starting to, like, kind of cite their sources of, we saw this information here, and here's kind of where we got it from.
16:04And I discovered this because I was getting clicks from Perplexity, so I started to go look at how that was working, and found out that in some cases, AI models were citing my work.
16:16Because I have this huge library at CameronJournal.com.
16:18Exactly.
16:18Check it out.
16:18Read it.
16:19They're ripping you off.
16:21Yes.
16:21Well, I mean, I joke, Sam Altman, where should I send the bill for you stealing my entire library of work?
16:27Right.
16:27And when can we expect payment?
16:29You know.
16:32I know they've lifted me, too.
16:34So, yeah.
16:35That's the issue.
16:36And when can you expect payment?
16:38Right.
16:39That's why I use the word accreditation and attribution.
16:44As you said, now, and that's part of what Congress needs to put into law.
16:50If Grok uses josephmlenard.us as a source, or the Cameron Journal as a source, we need to be accredited and attributed for it, if not direct things put in quotes.
17:09Yes.
17:10No, I mean, and I think this, and I think the, um, I think the sad part, and I'm sure this is, you know, enraging to you as well, um, and this has always bothered me, and I've had a parade of AI people on the show to talk about this.
17:26The cavalier nature with which they treat this stuff, they don't care.
17:31No, they don't.
17:31For me, that's always been the most terrifying part, especially as someone who has three degrees, for whom not citing a source is a suspendable, you lose everything, get kicked out of the program offense sort of thing, and who takes this stuff very seriously.
17:44Um, uh, you know, there's, um, uh, the casual nature with which they treat that and don't understand how important it is to be able to go read an original source if you want to, um, being able to, you know, credit people for their work that's not your own, which is very important.
18:04And also how it can damage other people's careers, because the reality is, if some college student is writing a paper, and they're quoting my work on domestic terrorism, you know, or whatever have you, um, you know, that's, I mean, that's the basis on which academia is built, is showing that it's not just you saying it, but there's a corpus of people behind you saying it.
18:24And I'm always frustrated by the AI, by the AI people, because none of them care about any of that stuff.
18:29For them, it's too messy.
18:32It's too hard to understand.
18:34They don't think it's important.
18:36They just look at it as data, ones and zeros.
18:39See my note there, fair use, when you started saying that.
18:43I wrote that out.
18:45That's the issue here.
18:46There is fair use law.
18:49And I myself have quoted under fair use myself.
18:54But normally, I CYA, and I attribute and accredit where it came from, and link the source.
19:04For the rest of person X's article, where quote Y came from on topic Z, to use that example I did before, go to this link and see their full piece that I quoted under fair use here.
19:24No, I mean, and that's, like I said, I've always been very frustrated with how much AI, and Silicon Valley in general, because I've had a lot of those guys on this show too.
19:34They just don't care about any of this stuff.
19:37No, they want to treat it all as fair use availability.
19:41That's not the law.
19:43You are guilty of theft.
19:46Plagiarism.
19:47Copyright violation.
19:49Yeah, and that's, unfortunately, I mean, I, and I don't know if you feel so lonely about this, but I feel like looking at the marketplace and looking at how fast this is moving, and all that's happening, I feel like they're probably going to get their wish.
20:04Like, I feel like, very strongly, we're probably going to lose the whole system of copyright that we have relied upon since the founding of this country.
20:14I mean, it's probably, I mean, will we have copyright by 2040?
20:20I doubt it.
20:21I really do.
20:23I doubt it.
20:23I think we'll be able to keep it, because, now, Christitutionalist, my podcast, I invented that word, and I indeed trademarked it, so it's protected, and I can cease and desist if someone's bastardizing the word from how I intended it to be, right?
20:48But without that trademark, it's just, you can't copyright a word or a phrase.
20:55You can trademark a word or a phrase, but, and you can't copyright like a paragraph.
21:04That's fair use, but again, it should be accredited where you got it from.
21:09If you want to quote a paragraph out of terror strikes coming soon to a city near you, you should accredit me and the book from whence you got it from, but under fair use, that would be allowed.
21:26They want to turn everything into fair use, and I don't think they'll get to go that far.
21:33Our books will stay copyrightable.
21:36You can't lift my book and claim it as your own without violating, but I agree with you.
21:43It's going to loosen and probably too much and then go too far, because why?
21:51They've got money.
21:54They've got lobbyists.
21:56You and I don't have a lobbyist in D.C.
22:00No, no, definitely not.
22:02I think also the other problem is, especially with, you know, like, you know, something like, you know, terrorism and whatnot, you know, an AI can have, you know, read my work on that, which I have a degree in, and your book, and then take that information and synthesize something else out of it.
22:22You never say, oh, yeah, it was Joseph Leonard and Cameron kind of got this idea, all those sort of things, and it's something, you know, new enough sort of thing.
22:30That's where I think it's going to get real messy and real hard to track is to say, you know, well, you know, it's not, you know, we're not directly lifting quotes, so it doesn't count.
22:40You know, yes, this was the training data we used, but, you know, that's, you know, that we don't need to compensate people for that.
22:47We don't need to ask their permission or anything like that.
22:49I'm not looking for a direct compensation necessarily, but you're right.
22:54It should give accreditation and attribution.
22:58I mean, Grok or Galaxy used Joseph M. Leonard's Terror Strikes, Coming Soon to a City Near You, or an article of yours, as partial source material, but you're right.
23:11How new does it have to be?
23:14What defines new enough to be its own work?
23:21Yes, yeah, exactly.
23:22And that's, I mean, we've had ideas of what that means in the past, and there's a difference between the bibliography of things you read but didn't quote, the works cited of the things that you read and did quote, all this.
23:35The academia has kind of figured these things out, and I think the most shocking thing is the AI folks do not care.
23:41Their whole point is to disrupt that entire system to make it easier for people to do stuff.
23:47Without understanding that, without the corpus of human knowledge, they would have nothing.
23:54You know, social media is the same problem.
23:57If everyone stopped posting content, that's social media going out of business tomorrow.
24:02Like, these people are standing on the shoulders of everyone else who makes stuff, who does stuff, who writes stuff, who creates stuff.
24:10And no one, for all the billions, and at this point trillions, of dollars they have made, no one has ever really, I think, ever taken that into account or ever tried to, you know, compensate anyone.
24:27When they tried, when Australia tried to make Facebook pay news publishers for news articles posted on Facebook, Facebook told them they were leaving Australia.
24:34That's what they did.
24:35They're doing the same thing, yes.
24:36I've been on Rick Walker's Maverick News discussing that, and the goal there in Canada, and maybe in Australia, I don't know, I've not looked into it, but Canada, the point was trying to push out independent media.
24:53They want to lift up corporate, state-run CBC up there, basically.
25:02They want to protect those who protect the politicians and crowd out others.
25:09So, and I already said, like I said, I use Galaxy AI to create images, and, you know, if I use an image, I say, you know, image source, Galaxy AI.
25:23Cover my own ass.
25:25What if Galaxy lifts that image from somebody somewhere?
25:30Right.
25:30If I don't say that's where it came from, they're the real source, I'm on the hook for that.
25:37I also use, I'm on DeepCast.FM.
25:43They also run PodSite.FM.
25:47My Christitutionalist Politics is carried in both those places.
25:51And they use an AI also.
25:53So, for show summaries and things like that, they're summarizing our material.
26:03We're telling them, please summarize my material.
26:08That's not the issue.
26:10You and I are talking about when someone asks Galaxy to create something out of whole cloth, again about topic Z, and it pulls from person X a quote of Y and doesn't accredit it.
26:28And I agree with you, you know, the law has got to be updated.
26:33They've got to start saying what some of the source material is in there.
26:37Not necessarily force them to pay me a couple pennies because they used me as source material, but at least let people know, hey, wow, this guy has this book and Grok used it.
26:59Maybe I want to buy that book.
27:02Yeah, yeah.
27:04So, when did you start wanting to put yourself out to shows talking about this?
27:11Well, again, being a former IT guy, and then I started with co-hosting Savaged Unfiltered.
27:19It's come up there a few times.
27:21And again, you in part were on talking about it and that.
27:25Also, I've written articles on it, of course, being a former IT guy.
27:31I saw this shit coming a long time ago.
27:33People are just now trying to wrap their heads around it, right?
27:37We've seen it coming.
27:39Miles away, we've seen it coming.
27:42But again, Congress, Democrats, and Republicans refuse to deal with anything until it becomes a fucking crisis.
27:52Well, and kind of circling back to, you know, the top of the show, and I was kind of level setting, like, here's kind of what happened this morning.
28:02Weird that we're having this conversation today.
28:04I think we're starting to get to a crisis point with this.
28:07I mean, if we even pull out from the whole copyright situation and look at it, the crisis is really here.
28:14I mean, when you have, you know, major layoffs coming along because AI can do it better, cheaper, faster, all this type of thing, that's where you start getting into worrying about societal upheaval.
28:28I need a program to write a report based on data coming from this, and this is the main data I want to mine from my computer system.
28:41Well, an AI program can write that code like that in the snap of a finger, whereas, what, you're going to hire a guy and spend six months to do it instead?
28:52That's the warning you have here: that contractor's not going to get the job to write that program, because AI can write it for the company that much quicker.
29:05And the thing that I kind of laugh when, you know, everyone's like, well, how do we, you know, manage the economic, you know, fallout from people not having jobs?
29:15And I said, I'm not quite sure you understand how business economies work.
29:20If there are no customers, there will be no economy.
29:24End of story.
30:25Like, if the company, well, companies will have no one to sell to because no one will have any money,
30:31if no one has a job, if people don't have a job to earn some money.
29:37I'm sorry, you communists out there, but you cannot have an economy based on the government doling out a 50K check to everybody in the nation, creating this fiat currency out of nowhere based on nothing for people doing nothing to create a phony, fake, propped up economy.
30:01It doesn't work.
30:02Ask the Weimar Republic, ask the Venezuelans.
30:08No, I mean, there's, I mean, that's, for me, that conversation is very short.
30:13There will be no customers.
30:15Therefore, businesses will go under.
30:17Therefore, we will not have an economy.
30:19That, for me, that conversation is very, very short.
30:21And everyone's talking about all these different things of, oh, we're going to have to have UBI and we'll have to find ways to give people purpose.
30:27And I'm kind of like, no, no, no, no, no, no.
30:29There'll be no one to pay the taxes to fund any of this.
30:32I mean, because you're looking at the, okay, you have all these, and I'm a big tax, the corporations person, which is fine.
30:39But here's the problem.
30:41If they have no customers to sell to, they will generate no revenue and we'll have nothing to tax.
30:46So even as a tax, the rich person, I can see that.
30:50So this conversation is very short of, if we're not, and this is something I've always been very passionate about.
30:58We have, for a long time, not had an economy built for people.
31:02Thank you, Jack Welch and Lee Iacocca, who decided we should build everything for shareholders.
31:07And you're in Michigan at the scene of the crime.
31:09And so, and we've had that for almost 50 years now.
31:14Then on the next kind of layer of that is, one of the terrible things that that did is it sucked billions of dollars out of the middle class and hollowed out the center of this country.
31:23And we, we know what that looks like.
31:27Now imagine what's gone on in the upper Midwest from coast to coast, everywhere.
31:33Because what we know is in communities, when you lose 10 jobs, 60 jobs, 100 jobs, that's thousands of dollars that aren't being spent around town.
31:41Even if it's a corporate places, isn't being spent around town, businesses close.
31:45New businesses don't open or ones that do can't make it because there's no revenue.
31:49When money, money's only usable insomuch as you can trade it for other goods and services.
31:54Yeah.
31:55So when dollars stop moving, which is one of the problems our economy has now, with the wealthy holding too much wealth.
32:00When the, when dollars don't move, you don't have a marketplace.
32:05You don't have an economy.
32:07No one has customers.
32:09The system collapses.
32:09You're explaining free market capitalism, gross domestic product.
32:16The circulation of money in a merit transaction system.
32:22I make money and I choose who I want to buy things from.
32:26That is free market capitalism.
32:28And I'm for that; otherwise, gross domestic product goes away.
32:33And as you're saying, there is nothing to tax if no one is earning money legitimately for trading my services for your need to get something done in your company.
32:48Therefore, then I have money, part of the GDP to be taxed and to buy something from Best Buy, which is also then sales taxed.
33:00Right.
33:01Right.
33:02Which means, and then you get into very basic things of you have no roads, you have no sewers, you have no schools, you have no police, you have no fire people.
33:09Or a shithole third world nation.
33:11Yes.
33:11Yeah.
33:11I mean, it just, it just bubbles up from the bottom and just gets worse and worse and worse the higher up the levels that you go.
33:17So, and I think the only, the only thing I found compelling about what's kind of coming down the pike with AI is this idea that, you know, you, technology tends to be very deflationary.
33:31Um, that's part of the reason why the nineties were so good economically is the personal computer was very deflationary.
33:37I, I do feel like AI could lead to a place where, especially in developed nations, it's so deflationary that life becomes incredibly cheap and we can get away with maybe working 10 or 20 hours a week and not necessarily making much money, but living great, good lives.
33:56Is it going to make for great GDP numbers?
33:59Definitely not.
34:00Is it a potential bandaid on all of this?
34:05Yeah.
34:05Cause at least you still have not many dollars, but a dollar or two moving for a very inexpensive, but high quality goods and services delivered by AI and robots and all this type of thing.
34:16Sure.
34:16I could see that.
34:17But to your point, here's, what's frightening.
34:21There's no one in Washington who's even talking about this.
34:25And no one's saying, you know, yeah, we need to start a 10 year transition and it will take a decade, a decade long transition where every year we're slowly reducing working hours as this technology increases, all this sort of thing.
34:43No one's having that conversation.
34:45And what then happens is you have the trauma of people merely being thrown out on their ear.
34:52And you see this, you know, especially when there's not a decent transition for it.
34:57You see this, especially in high-end manufacturing, like automobile manufacturing.
35:01You look at the modern videos and you have truck frames moving through a factory and there's five people behind glass in booths watching the robots do everything.
35:12And even, like, even the guy in the paint booth, the old-fashioned way, they used to wear big yellow suits and spray with a gun.
35:22Now it's a guy who pushes buttons, and the machine on a rack does the whole thing.
35:25And so, you know, you build a car with a handful of people, so much fewer people.
35:29And the reality is there was never a transition for the automotive industry.
35:33And so you have a lot of devastation from that, as I don't need to tell you.
35:36You live in it, you know, and I say, well, part of this is unions partially pricing workers out of jobs so that the companies want to automate more.
35:49So both labor and corporation are in part to blame.
35:54But to your deflationary, I could make do with less money if everything costs half as much, to your point, right?
36:01We're talking cost to benefit, we're talking ROI, we're talking the overall GDP.
36:09Indeed, if prices came down, we could make do with less.
36:15But that's that it's a very dangerous, delicate balance.
36:21It is.
36:22We're asking then.
36:23Well, I can get by with less if everything costs less.
36:28But if everything costs only pennies and there's no jobs for me to even make the pennies to buy it, right?
36:37These are complex issues.
36:40Yes.
36:40And therein lies the problem of, okay, we might reduce working.
36:43The vast majority of people are still going to require some form of income stream, usually through employment, all this sort of thing.
36:52And it can't be government fiat money paying you money that doesn't really exist.
36:57And the problem, and it was funny, I had a very nice young man named Stephen Fair on the show last year, and he and I have become, gotten into a business partnership and all this sort of thing.
37:08And I kind of laughed when he said, yeah, you know, we're going to see the end of employment by the end of this decade.
37:13And when he came on the show last year and said that, I laughed.
37:16And I said, that's adorable sort of thing.
37:19And then, but now fast forward a year, and all of a sudden, his little predictions, he was prescient.
37:24And so it's a very frightening thing.
37:29Well, I was going to say the robots aren't going to serve us food.
37:32Well, actually, Elon's robots can be waiters and waitresses and take our orders and cook our food and serve it to us.
37:41No, they have, China already has very advanced setups where one chef can run the whole restaurant.
37:47I mean, yes, I mean, it's getting to a place where, you know, between automation, AI in blue-collar work, humanoid robots that are already working in warehouses, picking alongside the people, all this type of thing.
38:01And that's why when I go into McDonald's, I won't use that kiosk.
38:06No, I'm walking to a counter.
38:09I want to talk to a person.
38:11I want an employee to put in my order to see their face, get their action, know they understand what I'm asking for and will get it right.
38:22And they stay employed.
38:24I stay happier.
38:26No, and this, I mean, and this is the great, you know, yes, absolutely.
38:32But I think this is the great debate is, you know, how, as we've found with so many other technological changes in the past, how long will people hang on to it before it becomes what everybody does, it becomes too convenient, all this sort of thing.
38:45And then you start to lose that ground and that progress.
38:49I think we're in a very scary place.
38:51That's Amazon's whole model right there, right?
38:53I myself get mad at myself for the amount of things I order on Amazon because it'll show up at my door in two days rather than go to the mom and pop shop.
39:08I used to go to buy that, right?
39:12It's become too darn convenient.
39:14No, and my great fear is that we're going to have a lot of accelerationism in the adoption of this because it'll be cheap, it'll be easy, it'll be convenient, and, you know, nothing gets this country moving faster than cheap, easy, and convenient.
39:32And then we're going to really start to be in a very frightening and scary place.
39:37And I'm not a big social upheaval person, but I've read a history book enough to know that this is what causes a lot of social upheaval.
39:46What was the Will Smith movie?
39:49I, I, Robot.
39:50Do you remember the Will, right?
39:53Everybody's got their own personal robot.
39:55And there was another Bruce Willis, Surrogates, where people aren't even themselves, like WALL-E, right?
40:04We all went into space and we got lazier and lazier over time because nobody's doing anything, right?
40:12Surrogates, the movie, it's a good movie.
40:14I love it.
40:15I recommend it.
40:16But indeed, you don't even have to go out. You send out your robot, you're operating them, or they're operating as you. People aren't even out, so it's a social commentary again, just like social media.
40:32People are literally then putting on fake faces and bodies out in public to interact with other fake people hiding behind a keyboard in their house.
40:48Instagram and a tweet.
40:52All right.
40:53As an IT guy, I, you know, was on all those social media platforms as they first came out.
41:02And when Twitter came out, as I call it, Twatter, well, I call it the Twatter attention span.
41:09Look what we, right?
41:11No, I don't want, details matter.
41:14I write articles that have facts in them.
41:17So you understand, I understand what a thesis is and can write one.
41:23Most people don't even know what it is or can't spell thesis.
41:28But, you know, they want the, give me the fake headline.
41:33Give me the fake TikTok summary.
41:36No, details matter.
41:39They do.
41:40No, I mean, I, I, I wish, believe me, as someone who writes, especially on very complicated things like, you know, trade wars and geopolitics and all this, I think, I definitely miss people engaging with material at a deep level.
41:53And I get, as much as I love the views, I get annoyed that a 36 second YouTube short will always do better than the minute long thing.
42:03And that will even be true of this.
42:05This will get split into, into clips by AI.
42:08And I guarantee you the 35 second version will always get far more views.
42:14Taken out of context.
42:16And they present us as saying, or we will have said the words, but that's not what we meant, because you're distorting the context.
42:28And indeed, that's the twatter attention span world.
42:32No, it's, it's, but, and, and it's, and it's sad because every video has, you know, the related video to go watch the long form sort of thing.
42:40And I cannot tell you how much time I've spent in the comments with people being angry and my saying, to be fair, this clip is out of its full context and you need to go watch the whole thing.
42:50Cause here's what we said.
42:51And here's what we were talking about, all this type of thing.
42:52And I've had to kind of calm people down in the comment section.
42:56Yeah.
42:56They'll get angry at you for correcting them.
42:59Yeah.
42:59Yes.
42:59And we kind of like, well, go watch the whole video.
43:01It's only 45 minutes.
43:02You should really go check it out and find out what we, all we were talking about sort of thing.
43:05And it's, yeah, it is.
43:07Yes.
43:08And that living in that space is very difficult and people don't engage the same way they used to.
43:12And that is always, always 110%.
43:15What does that portend for books too?
43:18No one goes to a library anymore.
43:20Pretty soon, you know, there are thousands of new authors every month, but pretty soon it's going to dwindle to your J.K. Rowlings, your Stephen Kings, your big-name Pattersons and Clancys.
43:37Well-known people, people, people aren't going to read books unless they're from the big name.
43:44Those are going to be the only ones published.
43:47And the sad thing is, I was reading this the other day, like we're already getting into, you know, the term post-literate society has started to be used.
43:55Oh, I see a lot of those people on the street.
44:01Yes.
44:02And, and, and, and the whole conversation and as someone who, whose life and passion has been spent on words and books and book publishing and all this type of thing, what does it mean to be passionate about literature in a post literate society?
44:16And that's a whole conversation going on in my field right now, you know, it makes one feel very anachronistic sort of thing.
44:24Um, and it's hard to get people to, you know, sit down and, you know, and pay attention for that long or to invest the time.
44:32They don't know who Locke is.
44:35No, no.
44:36You know, if it weren't for the series Lost.
44:40Yes.
44:40That name would have been forgotten.
44:42And they don't know, you know, oh, the French guy I was going to mention, de Tocqueville. Yeah, they certainly don't know Rand, or anything of Rousseau or Voltaire, and even, you know, if not for the Mac commercial, they wouldn't know 1984, the book.
45:03And the sad, and the sad thing is people's acceptance of an AI summary of it as an acceptable consumption is just kind of hurts my, hurts my heart.
45:13Or Orwell for Dummies, right?
45:16The cliff notes.
45:17Oh, please do yourself a favor.
45:21If you're falling into that, right?
45:23We're not calling everybody illiterate, but unfortunately, with the Twatter attention span, we're all
45:33getting more and more lazy.
45:34We have to look ourselves in the mirror and catch ourself and say, whoa, whoa, whoa, wait a minute.
45:41I'm becoming one of them.
45:43I can't let that happen.
45:46Self-reflection is important here.
45:48It is.
45:49It is.
45:50So we're at, we're at time.
45:52I got to go.
45:52Cause I keep things a little bit shorter than you guys do over at Savage Overload.
45:56So why don't you tell us where we can find you online and on social media, and we'll get you out of here.
46:03Well, you can see above my head, terrorstrikes.info, josephmleonard.us.
46:10Again, it looks like Lenard.
46:12It's not French.
46:13It's Leonard without an O.
46:15Joseph M.
46:16Leonard, because there is a Joseph Lenard out of South Carolina.
46:20I don't want you paying attention to any of his stuff.
46:23We don't know him now.
46:24Exactly.
46:25Right.
46:25And of course, ChristiTutionalist Politics, carried virtually everywhere
46:30podcasts are carried.
46:32So, well, thank you so much for coming on the Cameron Journal podcast.
46:36I, thank you so much for having me.
46:41I really appreciate this.
46:42And again, I, I might not have even thought twice about coming on, but having been with you through Savage Unfiltered before, I thought, oh, I know that guy.
46:54I want to be on this show.
46:57Good.
46:58Well, thank you for coming on.
46:59Thank you so much.
47:00That's all for this episode of the Cameron Journal podcast.
47:14Thank you so much for listening.
47:17Visit us online at CameronJournal.com.
47:20We're on Facebook, Twitter, and Instagram.
47:23And I love to talk to my followers and listeners.
47:25So please feel free to get us on social media at Cameron Cowan on Twitter.
47:30And we'll see you next time on the Cameron Journal podcast.
