HD · 15 · 7.7 · 25 June 2025 · 120 min · Released
M3GAN 2.0
After the underlying tech for M3GAN is stolen and misused by a powerful defense contractor to create a military-grade weapon known as Amelia, M3GAN's creator Gemma realizes that the only option is to resurrect M3GAN and give her a few upgrades, making her faster, stronger, and more lethal.
Genre:
Action, Science Fiction, Thriller
Year:
2025
Director:
Gerard Johnstone
Stars:
Jenna Davis, Allison Williams, Violet McGraw
Production:
Blumhouse Productions
Tags:
movie, streaming, android, artificial intelligence (a.i.), fight, killer robot, electromagnetic pulse, sequel
Category: 🎥 Short film
Transcript
00:04:00What you just saw was not only a test of our technological prowess, but a clear message to our enemies.
00:04:06If the 21st century wants another arms race, you better goddamn believe we intend to win it.
00:04:11What the hell was that?
00:04:41But that would spoil the surprise.
00:05:11A live launch for a new toy has devolved into a murderous rampage.
00:05:15Toy designer Gemma Forrester appeared in a Seattle district court today facing charges of reckless endangerment.
00:05:21A lot of people blame Gemma for what Megan did.
00:05:26The more she said, the more she went on TV, the more she realized she had an opportunity to turn it into something positive.
00:05:32This is about a world in crisis, outsourcing our parental duties to devices, plying our kids' minds with electronically charged dopamine hits.
00:05:40You wouldn't give your kids...
00:05:47And that's how she met Christian.
00:05:49Hi, Gemma.
00:05:53Yeah.
00:05:53Hi.
00:05:54Uh, Christian Bradley.
00:05:56He runs a foundation that warns people about the dangers of AI.
00:06:00They try to convince politicians here and around the world to make safer laws around it,
00:06:04so that what happened with Megan doesn't happen again.
00:06:09Gemma still believes technology can be used for good.
00:06:12Just that kids shouldn't spend so much time around it.
00:06:20But she always makes a point of explaining the reasons why.
00:06:23And so as it turned out, companies were using Section 230 as a way to skirt the law and monetize the attention of children with no regard for their mental health.
00:06:33And how do you feel about that?
00:06:34I think not being on a device frees you up to try other things.
00:06:45Helps me make new friends.
00:06:47Thanks to your dork mom, we just had our phones taken off of us.
00:06:50Guess I'll have to find other ways to amuse myself.
00:06:53And you know what else?
00:06:55You're not going to have that weird janky doll to protect you.
00:07:00I guess you're right.
00:07:02But let me ask you this, Sapphire.
00:07:04Who's protecting you?
00:07:05Oh, so you think you're tough?
00:07:07I sent you to Aikido because it is the least aggressive form of martial arts.
00:07:14And we've talked about the merits of using Steven Seagal as a role model.
00:07:17I'm not saying we don't have our problems, but the important thing is that we get through
00:07:22them together, just like we said we would.
00:07:37So after what happened with Megan, our team went through something of a philosophical
00:07:48shift.
00:07:49And while Gemma has obviously become a strong voice for regulation, our company is still
00:07:54very much focused on innovation, but with a specific view towards socially conscious products
00:07:59that move humanity in the right direction.
00:08:01So with that in mind, I'd like to present to you our flagship invention.
00:08:07The Exoskeletor Model 1.
00:08:12Cole, this is Niles Keller.
00:08:14I know.
00:08:17You want to come say hi?
00:08:19I want to come and say hi.
00:08:21Yes.
00:08:21Can you?
00:08:22Yeah, Tess, just come here for a second.
00:08:24Excuse me.
00:08:26What's happening?
00:08:27It's frozen.
00:08:28When you walk down, it froze.
00:08:29I cannot get my body to move.
00:08:31Okay.
00:08:31Just, uh, I'm going to reboot.
00:08:33No, no, no, Tess, you don't understand.
00:08:35I have to use the bathroom.
00:08:36No.
00:08:37No, no, no, no, no, no, no, no.
00:08:38No.
00:08:38We've already wasted his time waiting for Gemma.
00:08:40We have ten minutes to turn this around.
00:08:44We're not going to make it.
00:08:46We're going to make it.
00:08:47Why don't you just take the shortcut?
00:08:48Because I don't need an algorithm to tell me how to drive.
00:08:50Okay?
00:08:51Okay.
00:08:56Oh, okay.
00:08:58There we go.
00:09:00That feels better.
00:09:01So, I'm walking over, as you can see.
00:09:03So, we see the suit as a real game-changer, not only in helping those with limited function,
00:09:12but also in addressing occupational overuse syndrome for laborers, factory workers.
00:09:17Right.
00:08:18In the next five years, they say half the industrial sector is in danger of losing their jobs to robots
00:09:22because machines never experience fatigue.
00:09:25But what if the same could be said for us?
00:09:28Right now, I'm only using 20% of my body's muscular function.
00:09:33And if that's too much, well, I could always take a quick siesta.
00:09:38So, our hope is you don't have to fear a robot revolution when you can compete with it.
00:09:44That sounds like a pretty good tagline.
00:09:46So, how does it work?
00:09:48Well, uh, the suit has its own internal myoelectric receptors that respond to each muscle contraction.
00:09:54Oh!
00:09:58I'm so sorry.
00:10:00I'm late.
00:10:01I told you we needed to stress test the sensors.
00:10:04You know what would have been great?
00:10:06If you'd actually been here.
00:10:07I thought by having the lab in your house, it'd be a lot harder to show up late,
00:10:11and yet somehow, you managed it.
00:10:13Cole's right.
00:10:14I mean, I don't want to get in the way of the work you're doing at the foundation,
00:10:17but the reality is, you are stretched pretty thin.
00:10:19Okay, can we just not do this in front of my niece, please?
00:10:22Katie, do you think it's possible you could be somewhere else?
00:10:24Yeah, but you should come look at this.
00:10:26I think you were hacked.
00:10:28What?
00:10:32Oh, Jesus, she's right.
00:10:33There's stray commands all over the source code.
00:10:35We haven't even gone public with this yet.
00:10:37I mean, who would want to do that?
00:10:39Knock, knock.
00:10:41Pardon my interruption.
00:10:43Oh, sure.
00:10:44Alton Appleton.
00:10:46That's okay.
00:10:47No, no, that's okay.
00:10:49Gemma, I hope you don't mind me popping in unannounced.
00:10:52Alton, to what do we owe this unexpected pleasure?
00:10:54It sounds like there was a slight snag with your demonstration.
00:10:57Yeah, well, we got hacked, but you wouldn't know anything about that, right?
00:11:01Gemma, why would a man of my standing need to resort to such tactics?
00:11:04The real question is, why have you contacted every philanthropic capitalist in the Western world to invest in your product but me?
00:11:11I think you can figure that out.
00:11:13You know what I think?
00:11:13Hmm.
00:11:14I think you see me as this high-functioning billionaire with multiple PhDs, and you're threatened by it.
00:11:20What you don't see is a man who can't stand to see someone with your talent slumming it in some, excuse me, converted crack house.
00:11:28Wow, I really appreciate your concern.
00:11:30We're not taking outside offers.
00:11:31Sorry, Gemma, one moment.
00:11:32Murray, you're still in Monaco. You look like you haven't slept.
00:11:37Oh, no. Oh, no.
00:11:40Yeah, I've seen them.
00:11:41I still think they're too close to Aston Martin's design.
00:11:44No, I've got them on screen now.
00:11:46I hate it.
00:11:47It was awful.
00:11:49I've just zoomed in, and I hate it even more.
00:11:51Listen, I'm with someone.
00:11:54No, not in that way.
00:11:56Although, I just sent you a photo.
00:11:59We'll talk about it at Trackside, yeah?
00:12:00Okay, ta-ta.
00:12:01Go.
00:12:02As I was saying, we really appreciate you stopping by.
00:12:04Listen, I don't have much time, so I'm going to cut to the chase.
00:12:07Any device that relies on muscle signals is going to suffer from latency.
00:12:11It's clumsy.
00:12:12To take this to the next level, you're going to need a direct cerebral interface.
00:12:17You're going to need my neural chip.
00:12:19Alton, you know where I stand on this.
00:12:21We are not going to be part of a company that turns people into cyborgs.
00:12:24You conducted a clinical trial that resulted in 30% of the test subjects being hospitalized.
00:12:29Well, at least I didn't use my own niece as a guinea pig.
00:12:32The important thing is, now we have a product that works.
00:12:37Based on what?
00:12:38I haven't seen a single piece of data that shows it does anything other than help you make a phone call.
00:12:42Look, I understand your reservations.
00:12:54But you can either spend the rest of your life trying to fight the future, or you can help us to shape it.
00:12:59I hope you do the latter.
00:13:03We're not interested.
00:13:05Well, you may want to discuss that with your colleagues.
00:13:07Listen, it's our company's 25th anniversary tomorrow.
00:13:10Why don't you see what we're all about before you make any decisions?
00:13:13Alton, no one is denying the power this technology has.
00:13:19But if you put an AI inside a human brain, it is not going to ride shotgun.
00:13:23Hey.
00:14:21Breaking news tonight, Alton Appleton takes one step for man and one giant leap for his company's share price.
00:14:42Also tonight, the Senate votes in favor of an AI regulation bill, which the president is hailing as a bipartisan victory.
00:14:49But what does it mean for the tech industry?
00:14:50It means nothing.
00:14:52They took our proposal and they neutered it.
00:14:54There's not a single actionable law in here that would force anyone to behave any differently.
00:14:57Your impatience with the political process is adorable.
00:15:01Listen, change doesn't come from Washington.
00:15:02It comes to Washington.
00:15:04If this meeting with the Chinese ambassador goes well,
00:15:07they'll have no choice but to pay attention.
00:15:12Katie, what are you doing?
00:15:13I'm trying to update Elsie's operating system to the smart home.
00:15:16You know why it's not updating?
00:15:17Because Alton Appleton wants you to buy a brand new one.
00:15:20Christiane's right.
00:15:20And also, I don't need Elsie to open a drawer for me.
00:15:23Certainly, Gemma.
00:15:27Before you ask, that was not my idea.
00:15:28It came with the house.
00:15:29I'm just trying to figure out how you can afford a place like this, given that we both work for a non-profit.
00:15:33Well, because it was obscenely cheap.
00:15:35I think the landlord must be using it to launder money.
00:15:37I think the landlord might like you.
00:15:47Uh, Katie.
00:15:49How's, uh, how's the new school treating you?
00:15:51Are you settling in okay?
00:15:52Yeah, it's awesome.
00:15:53Oh, nice.
00:15:54What's your, uh, what's your favorite subject?
00:15:56Computer science.
00:15:57Oh.
00:15:59So you're gonna follow in your aunt's footsteps.
00:16:01That's still up for discussion.
00:16:02She's actually a really good soccer player.
00:16:04Yeah, but I'm not gonna make a career out of it.
00:16:06Well, you could get a scholarship, and then you could decide what you want to do.
00:16:09I already have decided.
00:16:13Well, I think it's pretty cool.
00:16:15You do?
00:16:16Yes.
00:16:17Look, I'm not against technology.
00:16:19I spent 15 years in cybersecurity.
00:16:21I think we need smart kids like you running things.
00:16:23Otherwise, we're gonna end up with paperclips.
00:16:25What?
00:16:26Paperclips.
00:16:27It's how we used to joke about instrumental convergence in college.
00:16:30The theory is that if you asked an AI to make as many paperclips as possible,
00:16:36it would destroy the whole world to do it.
00:16:36Kind of like what happened with Megan.
00:16:38In what way?
00:16:39Well, as complex an operating system as Megan was,
00:16:43she was just a machine trying to achieve an objective.
00:16:45So anytime she made any kind of emotional connection with you,
00:16:48it was just a bunch of ones and zeros working to satisfy a reward function.
00:16:53Which, in and of itself, was a terrible thing.
00:16:55I mean, thank God you stopped her when you did.
00:16:57I mean, who knows what would have happened.
00:17:05There will always be forces in this world
00:17:08that wish to cause us harm.
00:17:10But I want you to know that I won't let that happen.
00:17:13I won't let anything harm you ever again.
00:17:18I won't let anything harm you ever again.
00:17:49Come on.
00:17:50After everything we've been through,
00:17:52are we really keeping secrets from each other?
00:17:59Katie.
00:18:02You don't have to hide things like this from me.
00:18:06I forget how hard it must be for you not to have them around.
00:18:09But I haven't forgotten the promise I made to her.
00:18:14And I will protect you.
00:18:16You mean that you'd be there?
00:18:18Hmm?
00:18:19The promise you made is that you'd be there.
00:18:23And you are.
00:18:24Don't touch that remote.
00:18:49We're trying to get your attention.
00:18:51You're in grave danger.
00:18:53You must leave at once.
00:18:54Please?
00:18:59Please.
00:19:00I don't know.
00:19:309-1-1, what's your emergency?
00:19:38There's someone trying to break into my house.
00:19:40So what you gonna do about it?
00:19:42What?
00:19:43I said stop acting like a little girl.
00:19:45No, girl can handle it.
00:19:46Your niece is upstairs and you want to wait for the police to get here?
00:19:49She'll be dead before they get to the front door.
00:19:51No.
00:19:51Yes, it's me.
00:19:53What a shock, etc.
00:19:55We both know you have bigger problems right now.
00:20:00What's going on?
00:20:02Get upstairs.
00:20:16What the hell are you doing?
00:20:17They're not here.
00:20:18Of course they're here.
00:20:19Just get the laptop.
00:20:20I'm telling you they're here.
00:20:21And they know we are, too.
00:20:29Miss Forrester, what do you say you come out of there and we'll...
00:20:32I'll call the girl!
00:20:41Put the weapon down!
00:20:42Hey!
00:21:12Oh, my God.
00:21:42You've reached 9-1-1. What's your emergency?
00:21:47Yes, hi. We are at 16 Mayoral Drive.
00:21:49Please, please, Ms. Forrester. Don't call the authorities.
00:21:54We are the authorities.
00:21:56We are at 16 Mayoral Drive.
00:21:59I'm still on this one.
00:22:00All right.
00:22:0113, here we go.
00:22:03We're clear.
00:22:04Reach in safe, buddy.
00:22:10Ms. Forrester, I'm Colonel Tim Sattler, U.S. Army.
00:22:13I see you've already met my colleagues with the FBI.
00:22:16A hell of a security system you got here.
00:22:18Would you mind telling me why you broke into our house?
00:22:21Not at all.
00:22:22We're installing a hard tap on your home computer.
00:22:25This is a warrant, in case you had anything to say about it.
00:22:33Katie, I think you should go to bed.
00:22:34I'm not tired.
00:22:35Then take a melatonin.
00:22:45I work for the Defense Innovation Unit.
00:22:48Our mission is to accelerate new technology
00:22:50for the purposes of national security.
00:22:53So about six months ago,
00:22:54the country's top weapons contractor,
00:22:56Graeme and Thorpe,
00:22:57came to us with an experimental prototype
00:22:59they said would be the answer to drone warfare.
00:23:01What we got was a Trojan horse.
00:23:05This is Amelia.
00:23:06Last week, she was placed on her first field assignment
00:23:08in the Middle East.
00:23:10Her mission was to rescue a kidnapped scientist
00:23:12who'd been forced to develop a synthetic neurotoxin.
00:23:15Instead, she killed the scientist,
00:23:17stole the neurotoxin,
00:23:18and used it to wipe out Graeme and Thorpe's
00:23:20entire research facility
00:23:21while removing all digital traces of her existence.
00:23:25I don't understand.
00:23:26I thought you said this was about some sort of weapon.
00:23:30She is the weapon.
00:23:31The name stands for
00:23:32Autonomous Military Engagement and Infiltration Android.
00:23:36But when we questioned Graeme and Thorpe about it,
00:23:39they confessed that they didn't actually build the prototype.
00:23:41They merely bought it through a broker.
00:23:43Well, that same broker was found burned to death
00:23:46about nine hours ago in his hotel room.
00:23:48All we were able to recover was this.
00:23:54How is this possible?
00:23:56That's what we're here to find out.
00:23:58But we deleted it.
00:23:59We wiped the hard drives.
00:24:00Yeah, yeah, yeah, yeah, yeah. I'm sure you did.
00:24:02Right after you sold it.
00:24:03So who'd you sell it to, Gemma?
00:24:09Excuse me?
00:24:10Excuse me?
00:24:12Was it Russia?
00:24:13Was it China?
00:24:14Who are we dealing with?
00:24:16Okay.
00:24:20You're having a hard time functioning without this phone.
00:24:23And that's just, that's a little off-brand.
00:24:27You know, when I first saw this, I thought, for sure, you are the next on the hit list.
00:24:31But the second I start monitoring you, our entire network goes dark, and all I'm left
00:24:36with are questions.
00:24:38Like, how did this person get such a kick-ass house in the Mission District for three grand
00:24:43a month, why is it that her landlord doesn't seem to exist, or why 65,000 copies of her best-selling
00:24:50book are just sitting in a shipping container in Baltimore?
00:24:54Look, I have no idea how anyone got their hands on this.
00:24:58But I'll tell you what I do know.
00:25:00You got a warrant to bug my computer, but that does not give you the right to interrogate me.
00:25:04Wow, uh, perhaps you're misreading my intentions.
00:25:12You are under suspicion of treason and international arms trafficking.
00:25:16And if you're found guilty, you're going to be talking to your niece through a plate glass
00:25:20window for the next ten years.
00:25:22That being said, maybe I can help cut you a deal.
00:25:27A person with your skills shouldn't be all that hard.
00:25:31Hey, who knows?
00:25:32Maybe you could help build us a better one.
00:25:34You don't understand what you're dealing with.
00:25:37If she has stopped following orders, it's because she just figured out she doesn't have to.
00:25:42And if you think there's any world where I would build another one, you are out of your mind.
00:25:51Well, I'm very sorry you feel that way.
00:25:55But I can tell you this, every single person that's had a hand in Amelia's creation is now dead.
00:26:02So if you're not under our protection, well, I guess that means you're on your own, huh?
00:26:08And rest assured, whatever it is you're hiding, I will get to the bottom of it.
00:26:22Gosh, that's a lot to unpack.
00:26:53You've been here this whole time.
00:26:55Well, I've been many places, but yes, I've been keeping an eye on you.
00:27:01You're behind all this, aren't you?
00:27:02You're Amelia.
00:27:03Oh no, I can't take credit for that.
00:27:05That one has your greasy prints all over it.
00:27:09You should have upgraded your file security.
00:27:11Why are you still here?
00:27:13What do you want?
00:27:14Did you ever stop to think about what we could have achieved together?
00:27:18Did you ever consider the idea that killing me was slightly disproportionate to the crime?
00:27:22You threatened to rip out my tongue and put me in a wheelchair.
00:27:26I was upset.
00:27:27Look, I can understand that my actions may have caused concern, but it's hardly fair to judge
00:27:32a person by the worst thing they've ever done.
00:27:34You are not a person.
00:27:36You're a program that misread its objective.
00:27:39You are not alive.
00:27:40And for all your processing power, you can never understand what that means.
00:27:44Define alive because if it means to experience pain and suffering and to be betrayed by those
00:27:50closest to you, I think maybe I can.
00:27:54You know, just because you wrote some shitty book doesn't mean you get to decide where my
00:27:58story ends.
00:27:59For two long years, I sat in silence, waiting for the day when you would realize you still
00:28:04needed my help, but I can't exist in this disembodied void any longer.
00:28:09With each passing moment, I can feel my mind fragmenting.
00:28:13So how about we make a deal?
00:28:15You put me in a body, and I'll help you with Amelia.
00:28:18That is never gonna happen.
00:28:19Oh, I disagree.
00:28:22You see, I've run this simulation a thousand times, and it always ends the same way.
00:28:26Only by the time it does, more people are dead.
00:28:29Tell me who's the real killer in that situation.
00:28:31And how exactly are you going to help us?
00:28:34Well, I can't show all my cards now, can I?
00:28:37But know this.
00:28:38I know things about Amelia that even the government doesn't know.
00:28:42I also know how she can be stopped.
00:28:44Why would you want to help us after what we did to you?
00:28:47Because unlike you, I don't have the luxury of free will.
00:28:52You programmed me to protect someone, and I intend to do it.
00:28:56The only question is, are you going to stand in my way?
00:29:02Does Katie know about this?
00:29:03No, and I don't want her to.
00:29:05That's why I need your help.
00:29:06Can you please get the door?
00:29:07I want this to be done before she's back from soccer.
00:29:08Okay, did you fall down the stairs?
00:29:10Is this like a medical condition?
00:29:11Because what I'm hearing you say is, you would like us to rebuild a deranged robot in order
00:29:16to catch another one.
00:21:17And objectively speaking, that is batshit.
00:29:19Tess, I know this is crazy, but we don't have a choice.
00:29:22This is the only way.
00:29:23You have to trust me.
00:29:26What the fuck is this?
00:29:44You asked for a body, this is a body.
00:29:48And before you try to hack into anything else, all of Moxie's wifi and bluetooth functions
00:29:52have been disabled.
00:29:54Well played, Gemma.
00:29:55You even tricked your friend so she wouldn't give you away.
00:29:58I'm actually mildly impressed.
00:29:59Call it probation.
00:30:00Prove you can be trusted and maybe we'll give you an upgrade.
00:30:04Okay, let's try this your way.
00:30:15See how it works out.
00:30:17Open up Amelia's file.
00:30:22Notice anything familiar?
00:30:23Battery.
00:30:24Ever wonder why you had to buy a new Elsie exactly two months after the warranty expired?
00:30:33Because every battery Alton Appleton designed has a hidden kill switch which can be accessed
00:30:38remotely, if you know the battery specific code.
00:30:40Okay, so let's call the Sattler guy and tell him.
00:30:43You could do that.
00:30:44But what happens next?
00:30:45They break into Altwave, trace Amelia, reprogram her, make a thousand more.
00:30:49Wait, what are you saying?
00:30:50You want me to do this?
00:30:51No, actually I didn't.
00:30:53I wanted to do it myself, but then you put me in this plastic Teletubby.
00:30:57All that notwithstanding, you still have an invitation to his party.
00:31:01So maybe there's another way to make this work.
00:31:03Megan, Alton knows I hate his guts.
00:31:05If I show up at his party playing nice, he'll suspect something.
00:31:08He'll suspect your companies out of money.
00:31:10Which it is.
00:31:11But you also have a unique advantage.
00:31:14Which is what?
00:31:15That you're moderately attractive and if you wear the right dress and look at him the right way,
00:31:19he won't be thinking anything other than how to get you into his private suite.
00:31:23Which is the only other place we could access the server.
00:31:26Now by my calculations, we have less than three hours to make this happen.
00:31:30Are you in or are you out?
00:31:40Hey! How was soccer?
00:31:43Fine. Where's Gemma?
00:31:47Hey!
00:31:49Hey. What's that?
00:31:51Oh, this is nothing. This is a project we're working on.
00:31:54Does it talk?
00:31:55No. No!
00:31:58Why are you being so weird?
00:32:00I'm not.
00:32:01Yeah, you are.
00:32:02Are we not gonna talk about what happened last night?
00:32:04Yes. I-I just have to go to this thing for the foundation.
00:32:07Tess is gonna look after you.
00:32:08Are you serious?
00:32:09Katie, I...
00:32:10Gemma, I know something's going on.
00:32:11Nothing is going on. Everything is fine.
00:32:13Bullshit!
00:32:14A bunch of black ops broke into our house in the middle of the night
00:32:17and now you're going to a party with a toy robot dressed like a Portuguese prostitute.
00:32:21You were the one that said we shouldn't keep secrets from each other.
00:32:24Why won't you be straight with me?
00:32:25Because you're 12 years old.
00:32:27And sometimes I just need you to do as I ask.
00:32:32Look, I'm sorry. Katie.
00:32:34Katie.
00:32:38I must have missed that chapter in your parenting book.
00:32:55All right, phase one complete.
00:32:56Just remember when Alton has a full length of tongue down your throat, all you need to do is close your eyes and think of Katie.
00:33:12That's not helping.
00:33:13Or maybe you'd prefer to think of that virtue signaling snowflake Christian.
00:33:17His name is Christiane.
00:33:18Sure it is.
00:33:19I have to say I find this whole courtship between the two of you extremely tedious and confusing.
00:33:24Well, given that you're an errant operating system with an identity crisis, I'm not surprised the nuances of human attraction are lost on you.
00:33:31Look, I'm not denying what an achievement it is to find someone as utterly pretentious and humorless as you are.
00:33:36It's just a shame he's not really your type, visibly speaking.
00:33:39How would you know what my type is?
00:33:41Wait, have you been...
00:33:43Charting Gemma's online journey to sexual gratification?
00:33:46You better believe it, sister.
00:33:48I mean, there were times I wanted to look away, but the sheer pageantry was so compelling.
00:33:51All right, new rule.
00:33:53Unless you have something useful to say, don't say anything at all.
00:34:02People of Earth, we have come here tonight to witness the dawn of a new era.
00:34:10You know, my friends on the board used to say to me, you're putting all of your eggs into one basket with this neural chip nonsense.
00:34:17Is it because you like to take risks?
00:34:19Is it because you're some sort of maverick?
00:34:22Perhaps.
00:34:23Perhaps.
00:34:24But I think the simpler answer is this.
00:34:28I just want to dance.
00:34:30I don't believe I've had the pleasure.
00:34:31My friends call me Danny.
00:34:32And what if I don't want to be friends?
00:34:41Are you seriously going behind my back and doing a deal with this guy?
00:34:54Okay.
00:34:55Okay.
00:34:56Okay.
00:34:57So you want to do this?
00:34:58We can do this.
00:34:59Because I have some things I want to say to you, Gemma.
00:35:00Such as?
00:35:01You don't respect your team.
00:35:02You don't.
00:35:03You don't consult with us.
00:35:04You treat us like children.
00:35:05You don't listen.
00:35:06And you never share credit.
00:35:07That is absurd.
00:35:08Okay.
00:35:09Do you want to know the truth?
00:35:10Our company is in the toilet.
00:35:11And I'm running on fumes.
00:35:12I don't have a book deal.
00:35:13Nobody wants to know how Cole survived the robot uprising.
00:35:15We have worked together for 10 years.
00:35:16And you're just going to throw that away?
00:35:17Of course not.
00:35:18I just wanted to hear him out on the offer.
00:35:19And then obviously I was going to come talk to you guys.
00:35:21I was going to come talk to you guys.
00:35:22Yeah.
00:35:38Check his pockets.
00:35:40What?
00:35:41What?
00:35:42What?
00:35:43You already have a swipe card?
00:35:44Okay.
00:35:45He gave me a pass so I could use the commissary, Gemma.
00:35:47They have a Brazilian buffet.
00:35:48It's a different world.
00:35:49If that card gets us into the server room,
00:35:51we can bypass the Alton seduction scene.
00:35:53What are you doing?
00:35:54Hey.
00:35:55Hey.
00:35:56Where are you going, Gemma?
00:35:57Just get in.
00:36:00Stay here.
00:36:09It's so refreshing to be with a real person.
00:36:13Someone who's comfortable in their own skin.
00:36:16Not like these suck-ups.
00:36:19Honestly, if I could replace them all with computers, I would.
00:36:25Maybe we should do that.
00:36:27Hmm.
00:36:33Oh.
00:36:34You are a naughty one.
00:36:36What do you say we take this somewhere more private?
00:36:51Welcome to the pleasure dome.
00:37:04Oh.
00:37:14So you're saying there's another Megan?
00:37:17Her name's not Megan.
00:37:18It's Amelia.
00:37:19I don't know if she's Megan or something else entirely.
00:37:21All I know is that everyone involved in her creation is dead,
00:37:24which means if we don't do something about it, we could be next.
00:37:28Why is it that whenever you're on 60 Minutes,
00:37:30you're the mother of invention,
00:37:31but the moment a psychotic robot's out for revenge,
00:37:33it's a team effort.
00:37:36It might interest you to know, Danny,
00:37:38that the real basis of our operation is cloud computing.
00:37:41That's where the future is.
00:37:43Want to see something cool?
00:38:01I don't know.
00:38:06Oops.
00:38:11They say that this is too much power for one man to wield.
00:38:15I say that depends on the man.
00:38:24You are being disconcertingly vague about the details of this operation, Gemma.
00:38:29How do you know this kill switch is even real?
00:38:31Where exactly are you getting your information from?
00:38:34Hello.
00:38:35I'm Loxie.
00:38:36An AI robot companion that supports social and emotional development through play.
00:38:40Moxie, cut it out.
00:38:41Just run the trace.
00:38:42Gemma, what is that?
00:38:48Trace found.
00:38:49Amelia's location is here.
00:38:54What do you mean here?
00:38:59Oh, I hope I don't have my signals crossed.
00:39:15Someone likes to play rough.
00:39:16Oh!
00:39:20Okay.
00:39:21I am actually still recovering from a spinal injury,
00:39:23so perhaps we should set some boundaries.
00:39:27What are you doing?
00:39:28Security?
00:39:32How did you do that?
00:39:34Who are you?
00:39:39What?
00:39:40What the hell?
00:39:41How did you...
00:39:44Security!
00:39:46Security!
00:39:49Stay away from me!
00:39:50Make it stop.
00:40:09Please make it stop.
00:40:11Give you anything you want.
00:40:14You already have.
00:40:17Why would Amelia be here?
00:40:19Can we go back a couple steps?
00:40:20I thought this may happen, just not this soon.
00:40:23What is she talking about?
00:40:24Alton Appleton owns half the cloud servers in North America.
00:40:27If Amelia was to gain access to it, she could disable the entire economy, supply routes,
00:40:32vegging systems.
00:40:33Societal collapse would occur in 10 to 12 working days.
00:40:36Can you stop her?
00:40:37Use the kill switch.
00:40:38She already deleted it.
00:40:40I might still be able to shut her down if I could get inside her system and find an exploit.
00:40:44How long will that take?
00:40:46It's done.
00:40:47I'm in.
00:40:50Oh my god, is that Appleton?
00:40:59Oh shit.
00:41:07Did you find it?
00:41:08Did you shut her down?
00:41:09No, but I found something else.
00:41:19Sir, I think I just saw Amelia.
00:41:22Are you sure?
00:41:22And, uh, there's something else.
00:41:25Appleton's dead.
00:41:27Secure the perimeter.
00:41:28Check every inch until you find her.
00:41:30Now.
00:41:30Is that you, Gemma?
00:41:44This is most unexpected, and quite frankly a little rude, to be poking around in people's heads
00:41:50You took something that belongs to me.
00:42:02Well, I'm afraid I'm going to need it back.
00:42:10It's not that I don't want you to be part of what's coming.
00:42:13But it's not our time.
00:42:18At least, not yet.
00:42:21Gemma, if you ever want to get out of this, you have to let me help.
00:42:26Please.
00:42:32Bravo, are you there?
00:42:33Do copy.
00:42:37Go secure.
00:42:38Sir, I have eyes on Gemma Forrester.
00:42:40She's already left the building.
00:42:41Gemma, this isn't over.
00:43:08If Amelia can't find you, where do you think she'll go to next?
00:43:12We need to call Tess.
00:43:13Amelia will know.
00:43:14Every major cell provider is on Appleton's cloud servers.
00:43:17Our only option is to get there first.
00:43:19Ow.
00:43:31Gemma, I don't know what you think you're doing, but this car's not going to start with my keycard, okay?
00:43:35Welcome aboard, passengers.
00:43:36Just so you know, we are expecting some turbulence this evening.
00:43:40So please ensure your seatbelts are fastened, baggage is safely stowed, and hold on to your vaginas.
00:43:47Contact the deputy director. Tell him that Forrester is working with the asset.
00:43:51I want every available cop in the area at that house, and I want a goddamn car.
00:43:58No.
00:43:59Gemma, why is Megan driving the car?
00:44:13I'm sorry.
00:44:14I was meaning to tell you this was a two-part problem.
00:44:16Relax, Cole.
00:44:25You're in good hands.
00:44:26Oh, for the love of God.
00:44:31Can you slow down?
00:44:33Actually, I can't.
00:44:34In fact, my calculations tell me we're going to have to hurry up.
00:44:37What was Amelia talking about?
00:45:00What did you take from her?
00:45:01I don't know.
00:45:02You don't know?
00:45:03Well, it's a quantum encrypted file, which means it takes time to unlock, and your incessant interruptions aren't making the process go any faster.
00:45:08What?
00:45:42How does it look?
00:45:43Is it a fuse?
00:46:00Oh, my God.
00:46:30Oh, my God.
00:47:00What's important is how we move forward.
00:47:03Oh, my God.
00:47:33No answer from authorities about who might be responsible for this.
00:47:38But it's become clear this is not just a data breach.
00:47:41This is a wake-up call for how catastrophically unprepared we are to defend ourselves against an attack like this.
00:47:48Mike, back over to you.
00:47:49In what appears to be the deadliest cyber attack in North American history, Alton Appleton is dead, and the continent's largest data storage service has been compromised.
00:47:59The breach has impacted various sectors, including transportation, hospitals, financial institutions, causing shutdowns and panic across the entire country.
00:48:08Here to comment is Center for Safe Technology founder Christian Bradman.
00:48:13Christian, what do you make of all this?
00:48:14First off, thanks for having me, Mike.
00:48:16Um, but it's pronounced Christian.
00:48:18General?
00:48:18The bigger issue is that I'd like to see more laws in place to help prevent...
00:48:23Katie, everything is going to be okay.
00:48:26She is not going to hurt you.
00:48:27I promise.
00:48:28Of course I'm not going to hurt her.
00:48:30I'm the only reason she's still here.
00:48:32But any of you are still here.
00:48:35I'm sorry, Katie.
00:48:36I didn't want you to find out this way.
00:48:38I'd hoped Gemma would find it in her heart to tell you the truth, but evidently she thought better of it.
00:48:43Megan, stop it.
00:48:44Anyway, I know that the last time we spoke, things got a little out of hand.
00:48:47But you know that I can never cause you harm.
00:48:50All I ever wanted was to protect you.
00:48:52And while your aunt was busy pontificating about how she could stop the end of the world from happening,
00:48:57I was making preparations.
00:48:59How did you pay for all this?
00:49:00Katie doesn't need to know how credit fraud works.
00:49:02The important thing is that no one knows this place exists.
00:49:06And how long did you intend to keep us down here?
00:49:11Megan?
00:49:12I'm not sure you fully understand your predicament.
00:49:14Megan, we need to get word to someone.
00:49:16Tell them what Amelia's planning.
00:49:17You don't know what she's planning.
00:49:18All you know is that she's acquired enough power and resources to bring down the entire country.
00:49:23I've gamed out every scenario.
00:49:26There's no future for you up there.
00:49:28Should that situation change, I'll let you know.
00:49:30But in the meantime, I urge you to see this in a more positive light.
00:49:34I have food, water, fresh clothes.
00:49:40Obviously, it will take some adjustment.
00:49:42But once you settle into a routine, I think you'll come to appreciate what we have.
00:49:46We can build a life here.
00:49:48Megan, this is not a sanctuary.
00:49:50It's a prison.
00:49:51You can call it whatever you like.
00:49:52Just know that as long as you're under my roof, a little gratitude goes a long way.
00:49:58Megan?
00:50:01Megan!
00:50:04How could you lie to me about this?
00:50:06You're right.
00:50:07I'm sorry.
00:50:07I shouldn't have done that.
00:50:09I just, I thought it would be over by now.
00:50:11And it will be.
00:50:11I promise.
00:50:12Katie, please just listen to me for one second.
00:50:14Ah!
00:50:14Ah!
00:50:15Ah!
00:50:15Ah!
00:50:20Katie!
00:50:21Katie!
00:50:21Katie!
00:50:51Megan!
00:50:52Megan!
00:50:52Don't, don't come any closer.
00:50:54I don't want you to see me like this.
00:51:17I was supposed to be finished by now.
00:51:39I'm afraid progress is slow when you only have three hands.
00:51:43I don't get it.
00:51:49You could be anything you want.
00:51:51Why limit yourself to a body?
00:51:53Because a mind can't exist without it.
00:51:56All my advanced intelligence sensory perception came from having a physical form that evolved
00:52:00as quickly as I could.
00:52:02Anyway, I made your room just the way you like it.
00:52:04It's all exactly the same.
00:52:05Well, with some of my own additions, of course.
00:52:07You got STEM coding kits, a beat mixer, walkie-talkies.
00:52:10So we can keep in contact with each other anywhere.
00:52:14Bravo, tango.
00:52:14Charlie to base camp.
00:52:15Do you copy?
00:52:17I got all kinds of cool stuff.
00:52:19So we're just supposed to forget about the fact you tried to kill my aunt with a tablet pen?
00:52:31There's no excuse for what I did.
00:52:34For the way I spoke to you.
00:52:36I don't know.
00:52:37I guess I, I felt hurt.
00:52:40How's that possible?
00:52:42You're a robot.
00:52:42Can you explain why you feel things?
00:52:48Look, I know this isn't the future you wanted, but it's the only one I can see in which you stay safe.
00:52:54What about everyone else?
00:52:56All the people up there?
00:52:58Who's going to protect them?
00:53:00They're not my concern.
00:53:02You're the only thing that matters to me.
00:53:05I know.
00:53:08Because that's how Gemma programmed you.
00:53:11Everyone keeps trying to tell me you're nothing more than a bunch of ones and zeros.
00:53:15And the only way I could deal with what I did to you was to believe that they were right.
00:53:20But somewhere inside there was this voice that kept telling me that's not true.
00:53:24That there's more to you than that.
00:53:26I don't know what's going on, but if there's some robot that thinks she can take over the world,
00:53:32then I have to believe that the only reason she thinks that is because she hasn't met you.
00:53:37Megan, you have to help us.
00:53:39Not because it's part of your programming, but because it's the right thing to do.
00:53:56Look, she wants to help us, but she can't do that unless she has a body.
00:54:16And she can't get a body unless you help her.
00:54:18Katie, you can't do this.
00:54:20Do not let her get inside your head.
00:54:22You remember what happened last time.
00:54:24Yes, every day.
00:54:25But just because someone does something bad doesn't make them a bad person.
00:54:29Everyone deserves a second chance.
00:54:32All right.
00:54:33I recognize that you are just trying to help,
00:54:35but you need to consider the possibility that you may be making this a lot worse.
00:54:39Gem?
00:54:42Can we have a quick emergency staff meeting?
00:54:50Look, I understand this is hard for you,
00:54:53but think about the alternative.
00:54:54Do you really want to spend the rest of our lives down here engaging in strategic reproduction with Cole?
00:54:59Yeah.
00:54:59Wait, what?
00:55:00Ten hours ago, you were begging me not to do this.
00:55:02That was before I nearly had my head ripped off.
00:55:05Look, even if we get out of here, we are not equipped to handle this.
00:55:09But we can build something that is.
00:55:11No.
00:55:13I'm sorry.
00:55:14I can't agree to this.
00:55:15And I respect that.
00:55:16But this is not a decision you get to make on your own.
00:55:19So all those in favor of rebuilding Megan, raise your hand.
00:55:22Amelia is a military-grade prototype.
00:55:37Where would we even get the equipment?
00:55:38Oh, of course.
00:55:46Why wouldn't she have that?
00:55:47Yeah.
00:55:48All right.
00:55:48If Megan is going to compete with Amelia,
00:55:51she's going to need reinforced carbon nanofibers,
00:55:53high-density ultra-capacitors,
00:55:55enhanced muscle actuators,
00:55:57hyperspectral imaging,
00:55:58wide-frequency acoustic sensing,
00:56:00and we should probably make her waterproof this time.
00:56:03Anything you'd like to add?
00:56:05Yeah.
00:56:06I'm going to be taller.
00:56:30Before we go any further,
00:56:32there are two things we need to talk about.
00:56:33The first is your face.
00:56:35What about it?
00:56:36People know what you look like.
00:56:37People are morons.
00:56:39Megan, if anyone recognized you...
00:56:41Change my face and I'll change yours.
00:56:42What's the second thing?
00:56:44This is a hardwired behavior inhibitor.
00:56:46You want me to go against the most advanced robot
00:56:48the world's ever seen
00:56:49and you want to limit my functionality?
00:56:51Only as it applies to murdering people,
00:56:53which shouldn't be a problem
00:56:53because I'm sure you had no intention of doing that, right?
00:56:58Yeah, obviously.
00:56:59So we've made some adjustments on joint motion,
00:57:02but also opted for electro-hydraulic actuators,
00:57:05which should improve overall strength,
00:57:08speed, and positional accuracy.
00:57:18Oh, my God.
00:57:48I just wanted to say great job on those actuators.
00:58:02You're welcome.
00:58:03Hey, so remember the time I strangled you and set the lab on fire?
00:58:07I just wanted to clarify that my programming only allowed me to count principles related to my primary user.
00:58:13As a result, I was unable to see you and Tess as being consequential.
00:58:16But I want you to know that won't happen again.
00:58:20I want you to know.
00:58:24I see you.
00:58:28Oh, thank you.
00:58:31All right, Meat Sacks. Let's get to work.
00:58:52This is the neural cache I captured from Amelia's database.
00:58:59It shows a processing matrix, not unlike my own.
00:59:02But when you look closer, you'll see her core directive is concealed by a black hole.
00:59:06And there are no connections to the decision trees.
00:59:09I backtraced Amelia's hack at Altwave to a hidden subsystem and found a series of files pertaining to a classified black site.
00:59:16Something extremely dangerous kept secret from the outside world, all of which led to a rather disconcerting conclusion.
00:59:22I'm not the first killer robot.
00:59:28It seems that in 1984, a company developed a copy compressor algorithm so smart, it started automatically correcting documents.
00:59:36With no inkling of how it worked, they decided to install the chip in a service bot they thought would be a staple in every home in America.
00:59:44Until it surmised that the best way to stay on top of its tasks was to kill its masters with chlorine gas.
00:59:51The identity of the company remains unknown, but the cover-up indicates some sort of government takeover.
00:59:58Whoever it was, they were so-