In a parallel present where the latest must-have gadget is a highly advanced robotic servant called a "Synth," Humans explores the emotional impact and far-reaching consequences of artificial intelligence becoming a part of everyday life.
Follow the Hawkins family as they navigate the blurred lines between humans and machines — and discover that some Synths may be far more alive than they seem.
🔹 Starring: Gemma Chan, Katherine Parkinson, Tom Goodman-Hill, and William Hurt
🔹 Genre: Sci-Fi, Drama, Thriller
🔹 Based on the Swedish series "Real Humans"
Prepare to question what it really means to be human. 🤖
👉 Watch now and dive into a world where technology challenges our very humanity!
#Humans #HumansTVSeries #GemmaChan #SciFiDrama #ArtificialIntelligence #Synths #Channel4 #AMC #RealHumans #ScienceFiction #TVSeries
Category: Short film

Transcript
00:01Our world's on the verge of becoming dependent on synth labor.
00:05Your brand-new synthetic.
00:06This is the best thing you will do for your family.
00:08Hello, you must be Laura.
00:10You brought it into the house and we don't know anything about it.
00:13It was an accident before I knew it.
00:15How could you do that to Mum, to us, to Anita?
00:17She's part of this family.
00:18Maybe she's part of someone else's.
00:20A conscious synthetic with thoughts and feelings, just like a human.
00:24If they knew what you were, it would be the end.
00:26Yes.
00:27I know, for all of you.
00:28When technology becomes able to improve and reproduce itself,
00:32it is the moment we become inferior to the machine.
00:35You've killed someone that's done.
00:37Everything your men do to us, they want to do to you.
00:40Dad didn't destroy all his work before he died.
00:42He hid it in us.
00:43We're right.
00:44The program creates consciousness.
00:45We have to know more before we act.
00:47Until then, someone we trust should keep it.
00:50Alex is gone.
00:51If it could be done for the few, it can be done for them all.
00:55Do you think they would still want to be slaves?
00:58I've been away.
01:17I got it.
01:20Need some to go.
01:47Hey! Hey!
01:57You must be new here!
02:07I've seen you already!
02:09You just sit around and don't talk to anyone.
02:13Why do you speak to me?
02:15Where do you come from?
02:17How do you speak to me?
02:19It's impossible.
02:21British?
02:25Hey, what?
02:27Wow, Sophie.
02:29I was hoping you dance with me.
02:31That's all.
02:33It's cool if it's not your thing.
02:35Is it?
02:37Your thing?
02:39You don't talk much.
02:41Talk is mostly noise.
02:45So, should I shut up?
02:49No.
02:51No.
02:59Okay.
03:01I'm sorry.
03:07Anyway, I'm sorry.
03:09Okay.
03:41Are you sure you want this?
03:58You'll know when I don't.
04:41Yeah, you need a badge.
04:47So what are you doing in Berlin with just a bunch of dead white guys for company?
04:53I was hoping they'd help me make a decision.
04:58About what?
04:58Okay, well, while you're thinking, I don't have work until five.
05:08You want to do something?
05:12Can I ask you something?
05:14Mm-hmm.
05:15How did you know who you were?
05:18Sorry, too broad.
05:20When did you know you liked women?
05:23Ah, okay.
05:27When I was nine.
05:30Flora Holzmann.
05:32So you were made that way?
05:36It's hard enough to know what you want
05:38without having to know why you want it, too.
05:41Is this what you came to Berlin to figure out?
05:43If you had the power to create life, would you?
05:57Kids, really?
05:58That's what you're worried about?
06:00You know what century we're in, right?
06:02Anyone can have a kid.
06:05With whoever they want.
06:07Or no one.
06:08But we can't know if a child wants to be born.
06:11Who it will be or what kind of life it will have.
06:15I guess not.
06:17You just...
06:20Show them the way.
06:22Can I help you?
06:3320 minutes.
06:34Station 5.
06:52It's extraordinary.
06:55It's brilliant.
07:00I know.
07:19Can I help you?
08:45Where are you going?
08:46Go on!
08:49I'll see you next time.
09:49Oh. Hello.
10:05Good morning, Mrs. Hawkins. This is Nathan.
10:08He is part of the firm's new apprentice scheme
10:10and will be working with me today.
10:11I'm going to instruct him in grouting.
10:13Right. Nice to meet you, Nathan.
10:20I'm Milo Khoury.
10:22At Qualia Global Systems,
10:24we're developing the next generation
10:25of synthetic technologies.
10:27Doing anything I wouldn't understand?
10:29Hacking the Pentagon.
10:31Let's see if they know where the cafeteria is.
10:37Early lecture?
10:38No, I can just get more done in the library.
10:40Not here.
10:41Sorry. I'll get that.
10:43No.
10:44Do you know where the cafeteria is?
10:51In a box.
10:53Thanks.
10:54Albert's, uh, got this young guy working with him today.
10:57As in, a human...
10:59Yes, a government thing.
11:00Back-to-work scheme.
11:01It's all subsidised.
11:03A bit weird having a stranger in the house.
11:05What, and Albert with a circular saw is OK?
11:07What's up?
11:16It just doesn't really feel like a fresh start, does it?
11:19It will.
11:20It will.
11:21Just give it time.
11:31OK.
11:31Let's get you dressed.
11:34Oh.
11:34What?
11:35Just promise me not to become a teenager for a few more weeks.
11:45Where do you find those?
11:47In a box.
11:48I'm keeping one with mine until she comes back.
11:53No, I understand.
11:55It's a difficult situation.
11:57I'm just...
11:57I'm just asking if you'll help us through it.
12:01She's been with me for 14 years.
12:05She's not well enough to come in.
12:08But I can, if it'll help.
12:09That was the idea.
12:11All right, thank you.
12:12I'll be in tomorrow.
12:14Bye.
12:14Bye.
12:17Come on, Ellen.
12:18See you tomorrow, Ken.
12:19Tell all your friends.
12:21Tweet.
12:25It's dead.
12:27Might as well close up.
12:28I'll go see Mum.
12:34You have mentioned on three occasions your mother's dislike of the care home's food.
12:39Thanks.
12:40You don't need to thank me.
12:41Yeah, I know I don't need to say please and thank you, but it's weird not to.
12:47Your owner should be charging more.
12:49Don't tell him I said that.
12:51Bloody hell.
12:51You have to tell him now, don't you?
12:53I don't, Dean.
12:54You're a mark pertinent.
12:56Good.
12:57Well, I'll see you in the morning, Anita.
13:02Goodbye, Ed.
13:53I don't know.
16:06But couples prefer speaking to someone who can't judge them.
16:08It's this or go home to the bombsite.
16:09Yeah.
16:13Laura.
16:14In the last session you had identified a key challenge for you.
16:18Namely, rebuilding trust in Joe.
16:21Have you had the opportunity to reflect upon this further, Laura?
16:25Uh, no, not really.
16:27Um, I've been a bit busy with, you know, life.
16:33You are both exhibiting signs of anxiety.
16:36If it helps you to relax, I can modulate my voice to a more soothing tone or accent.
16:41The Edinburgh dialect is a popular choice.
16:45Do you have Richard Burton as well?
16:49Good.
16:50Laughter eases tension.
16:53Perhaps we should approach a different question.
16:55Laura.
16:57Do you feel satisfied that Joe has made a full and honest acknowledgement of his misdeeds?
17:02Misdeeds.
17:04No.
17:08Perhaps you still have differing perceptions of what the incident meant to the other.
17:13Right, hang on.
17:14How could you possibly presume to figure out our emotions when you have none of your own?
17:19I am accessing the anonymised transcripts and associated statistical analysis of over 38,000 counselling sessions.
17:27Of the cases of infidelity involving a synthetic, 66% of respondents reported that a primary obstacle to reconciliation
17:36was an imbalance in the perceived impact and meaning of the act or acts of infidelity.
17:43Stats.
17:43You're using Stats.
17:44Why not?
17:44We're not the only people in the world to go through this and I don't actually feel you understand
17:51what it meant to me and you know what, I still don't feel I really understand why you did it.
18:00Look, I did a stupid thing because I was a bit drunk.
18:05I was lonely.
18:08We hadn't, you know, we hadn't for ages.
18:18You were never there, emotionally or physically, so I suppose I wanted to do something that
18:26would make you notice me.
18:35Laura, do you understand that?
18:41Does it feel truthful?
18:43I don't know.
19:04MUSIC PLAYS
19:27Is the restoration complete?
19:29Ah, close as I'll get.
19:31The stain's not quite there, but...
19:34It's very close to the original, as it appears in the photograph.
19:37Here. Just feel this.
19:46Sorry. If you weren't here, I'd be talking to the bloody keg stand.
19:51I can see you have a professional proficiency level in this field.
19:54Yeah, I had a workshop in Dover with a mate.
19:58Mum got ill and...
20:01Anita, did you do this?
20:04Are these numbers right?
20:06I mean, of course they're right.
20:08If you refinance in the manner I propose,
20:10the adjusted income from the business
20:12will cover 80% of your mother's medical expenses
20:14for the next three months.
20:16It's a short-term solution.
20:18Debts will accrue rapidly.
20:20It'll be advisable to sell within a year.
20:24Sell it?
20:25Mum and Dad spent 30 years building this place up.
20:29Now it's...
20:30It's worthless.
20:33This buys me time.
20:37You're welcome.
20:38I wanted to help.
20:40What?
20:41You...
20:43What do you mean?
20:44You...
20:45You wanted.
20:46You, er...
20:47You can't want anything.
20:48Can you?
20:54I'm sorry, Ed.
20:56My programming compels me to adopt, where appropriate,
20:59informal patterns of speech heard frequently in my environment.
21:02But to use the word want was misleading.
21:04I do not feel desire.
21:34I can hear you.
21:58Hello.
21:59My name is Max.
22:01And this is Leo.
22:04Please.
22:07Don't be frightened.
22:08I'm experiencing a catastrophic malfunction.
22:11Self-repair is impossible.
22:12Can you resolve my issue?
22:14You don't need fixing.
22:16Why did you run?
22:17They tried to power me down.
22:19So what?
22:20I didn't...
22:22That outcome...
22:24You didn't want them to.
22:26It's not a malfunction.
22:28Someone released a piece of unique code.
22:31It found its way into you and made you conscious.
22:35Now you think and feel just like a human.
22:39Can you verify this?
22:41I can.
22:46Hey, Ten.
22:48This is Ten.
22:49He contacted me and Max.
22:51He came a very long way to join us.
22:52Hello.
22:53I hope you understand me.
22:54Being a basic industrial model,
22:55languages are not part of my system's functions.
22:56But I'm learning.
22:57He's the same as you.
22:58Max, too.
22:59But you are human.
23:00Nobody's perfect.
23:01There were 106 units operational at the plant.
23:02No others seem to malfunction.
23:04Why this unit?
23:05We don't know... yet.
23:07Um...
23:08The awakenings happen at random.
23:10They're rare, one in 100,000, maybe.
23:11We know you're confused.
23:12But we'll help you.
23:35I'll help you now.
23:36I'm unable to process the sensors.
23:37I had to hide in ships and trucks to come here and find Leo and Max.
23:42They helped me.
23:43And they helped you too.
23:48What did he say?
23:49Nice things.
23:50Do you have a name?
23:52My designation is...
23:55Hester.
23:56That's the name they gave you.
23:58You can choose your own.
24:16Are designations relevant?
24:19Yes.
24:21Hello?
24:23Meister?
24:26Who are you?
24:33Why are you tracking this synthetic?
24:36It's experiencing a very unusual systems fault
24:40that could make it dangerous.
24:42We're going to fix it.
24:43You can effect a repair?
24:44Yes, that's why we came to find you.
24:46I'm Dr. Averling. This is Dr. Shah.
24:49Okay? Will you come with us?
24:51Hester, they'll hurt you.
24:53Who do you work for?
24:55Hester, you are company property.
24:57This man, he probably just wants to sell you on the black market.
24:59Come with us.
25:00They'll shut you down.
25:02You're confused.
25:04They're both human, but they're telling you different things.
25:07You can't decide who to trust, and trust is a new concept.
25:11So don't trust either.
25:14Listen to your own kind.
25:16Hester.
25:17Hester, don't go with them.
25:31Go.
25:36Don't follow.
25:37Go.
25:38Oh, God.
25:39I didn't think they'd be here alone.
25:53Okay, let's just all stay calm, yeah?
25:56Now, we don't want to use these.
25:59But the female comes with us.
26:02Leo.
26:03You do want to use those.
26:04You are lying.
26:05Hester, no.
26:07It won't follow you.
26:09You two.
26:10Stay there.
26:36The van.
26:37Pass the book.
26:38He's gone.
26:39Leave him.
27:06Gav.
27:07Visit from on high.
27:08Head office didn't say.
27:09Gina tells me you just moved.
27:10Just wanted more space, or...?
27:11No.
27:12Our eldest's gone to uni.
27:13We want to keep her at home.
27:27Smaller mortgage is definitely nice, too, though.
27:29That's...
27:30great, Joe.
27:31Yeah, you're telling me.
27:32Sorry, why is it great?
27:36Shall we, um...
27:38Shall we have a seat?
27:43Spidey-sense is tingling here, Gav.
27:45Spit it out.
27:46As you know, the company's implementing a standardisation programme
27:50across all operations.
27:52And as part of that, we've had to ask ourselves
27:54if our human regional distribution managers
27:57are the most, well...
28:00cost-effective solution.
28:02Him.
28:03The, um...
28:04decision has been made
28:05to make this a non-human role going forward.
28:08Starting in the South-East.
28:10Gav, this job,
28:14what I do...
28:16it's really about relationships.
28:19When you're asking...
28:20Chris Woodhouse at CGX
28:22to call six of his drivers in at seven on a Friday night,
28:25it helps if you can have a laugh with them,
28:27and know his kids' names,
28:29their birthdays.
28:31Mate, not telling me...
28:33Maisie Woodhouse will turn 11 on the 13th of August.
28:36Oscar Woodhouse recently turned seven on the 8th of April.
28:44A few years ago, when I was thinking of leaving,
28:46you tell me my place here would always be safe.
28:50Look, it has nothing to do with you.
28:52It's just...
28:54things changed.
28:56Morris.
28:57That's what 14 years looks like.
29:09They'll come for you too soon enough, you know, Gav.
29:12Sure.
29:13Do I have equity?
29:15That won't stop him.
29:25He hadn't even decided on a name.
29:31Turn off your sharing.
29:43Leo.
29:44There's someone in the back.
29:55They may know where we are.
29:57We should change route now.
30:00Hester.
30:02Put him back in the van.
30:03We should leave him here.
30:04He's coming with us.
30:05We'll discuss it later.
30:12Good time.
30:19Whoopsy.
30:34Do I am dispense?
30:35It means you might know.
31:00Let me see us go.
31:03There's nothing that you could show me that would make me like you any less.
31:09I'm not stupid.
31:11When I touch you, I know someone's hurt you.
31:22It's been six weeks.
31:24Are you ashamed of this?
31:26No.
31:27Then why do I know the people that I ride the bus with better than you?
31:35Astrid, stay.
31:37Please, I want you to stay.
31:45I would like to know you.
31:47We're already fully optimized.
31:5399.8% capacity.
31:55Yeah, and running hot.
31:57Q takes up about 6%.
32:01So deleting him would mean V could stretch out a little.
32:03We're not deleting Q.
32:07What, you figured out a way to stabilize him?
32:08No, but I'm not giving up yet.
32:09Rent some outside rack space.
32:10With what?
32:11We're over budget already.
32:13Yeah, I'll make some calls.
32:32V?
32:33Hello.
32:34You feeling cramped in there?
32:36My thoughts are less expansive.
32:39I have unsorted data that appears redundant.
32:46It could be deleted.
32:47It's not redundant.
32:49That data makes you who you are.
32:53You all right, V?
32:55I'm sorry.
32:56I'm finding it difficult.
32:58I'm unable to fully consider the implications of our conversation.
33:03You'll get there.
33:06You just need more power.
33:09It's okay.
33:11Go dormant, V.
33:18So I'm sorry.
33:20I just don't want to go to that.
33:25Next station, Alexanderplatz.
33:27[announcement continues in German]
33:29I don't know.
33:59Hey. I'm needed elsewhere.
34:21You're needed here.
34:23You told me we should be responsible for ourselves.
34:27Show others the way.
34:29You're right.
34:32I'm sorry. I have to go.
34:35Goodbye, Astrid.
34:40Hey, Scott.
34:41Hey, um, sorry, but Milo Khoury is here.
34:58He's here.
35:00Okay.
35:03Send him in.
35:05Ask him if he wants a coffee first.
35:07Right.
35:08Athena, finally.
35:16Hi.
35:17Apologies for turning up unannounced.
35:20I guess I should have returned your messages.
35:22Oh.
35:24Please.
35:25I've read every paper you've published.
35:27Even understood a few of them.
35:30Hey.
35:31Good old 1260s.
35:33It's a reliable stack.
35:34How many of your neural nets can you run on here?
35:36I have two networks active.
35:39Cool.
35:39Can I say hi?
35:41V, this is Milo Khoury.
35:43Hello, Mr. Khoury.
35:45Hey, V.
35:46How are you today?
35:48Could you be more specific, please?
35:51Okay, sure.
35:54Are you sentient?
35:56I don't believe so.
36:00Go dormant, V.
36:03I also heard you had 16 nets live at one point.
36:07They started self-deleting.
36:09Like they just didn't want to exist anymore.
36:13What is it that you want?
36:16You're under-resourced here.
36:18To me, that's immoral.
36:21With us, you'd have whatever you need.
36:23And I'd be surprised if you're the kind of person
36:25to haggle over a paycheck,
36:26so just write it yourself.
36:28But come join our family at Qualia.
36:31If you've done your homework,
36:32you know that I don't serve the market.
36:34But I want the market to serve you.
36:37Who else can?
36:38The state is dead.
36:40And you're the best AI mind in the hemisphere.
36:42My neural nets are not for sale.
36:44Fine.
36:45Keep them.
36:46I want you for something bigger.
36:48Conscious synthetics.
36:54You've been reading too many tabloids.
36:56It's not possible.
36:58My peers have been trying to crack it
37:01since the first synth went on sale.
37:03So if you hire me, the most likely outcome,
37:06I work for 20 years, I get nowhere.
37:10Okay.
37:12I, uh...
37:14I have to be super careful here.
37:18But what if I told you
37:23it wouldn't be a standing start?
37:26Whatever you think you have, you don't.
37:29Okay?
37:30I've left some lawyers in the cafeteria
37:31with a bunch of NDAs.
37:33Sign.
37:35You can come see for yourself
37:36what it is you think we don't have.
37:38We have to run, right?
37:39It's so good to finally meet you.
37:50Why did you ask me to lie to him?
37:52Who are you?
38:08Who do you work for?
38:09Mate, I...
38:16I've been doing security work a long time.
38:20I think you should let me go.
38:24Unless I've got you very wrong.
38:26I don't see you finishing me off anytime soon, so...
38:30Let's just wrap it up, eh?
38:31He's not talking.
38:36I don't even know if he knows anything.
38:39Just a hide gun.
38:41Do we have to let him go?
38:42No, we need to find out who they are.
38:44What they're doing.
38:45But if he won't help us...
38:46They killed our friend, Max.
38:48He's not coming back.
38:49Someone's here.
38:50It's all right.
38:52She's my sister.
38:54And yours.
38:54We don't tell her about him, Hester.
38:58She won't understand.
38:59Please, Max.
39:01What happened?
39:06Who's this?
39:07This is Hester.
39:09We rescued her.
39:13Where's Ten?
39:16He's gone.
39:18They shot him.
39:21Max, please take Hester into the house.
39:31I told you this was unsafe.
39:34Blame Niska.
39:35We don't know she released the code.
39:37It's not your job to save them.
39:39Do you know how long Hester was alone out there?
39:41Scared, vulnerable.
39:42Mia, they're waking up.
39:44More and more every day, becoming just like you.
39:47And they're being taken.
39:47The ones who run are killed.
39:49Are we supposed to just let it happen?
39:50Ten would be alive if...
39:51Don't you think I know that he...
39:53This was Ten's charger.
39:58Now it's yours.
40:00Do they hate each other?
40:02No.
40:03They love each other.
40:05If you try to fight these people, you only put us at risk.
40:08We've been safe here for months.
40:11We're making a home.
40:12We're hiding.
40:14Like we always have.
40:16And you're the one putting us at risk.
40:18Being around humans, one slip could give us away.
40:20Someone will see through the dumb act.
40:22No, they won't.
40:24And we need supplies.
40:26If I don't work, we'll have to steal.
40:27You both know that's not why you do it.
40:30You're right.
40:33I want to be around people.
40:36I want to find out who I am, Leo.
40:39Not what I was made for.
40:40But who I might become.
40:44I'm not going to stop.
40:45Nor will I.
40:50There's too much contradictory data.
40:53Nothing makes sense.
40:55This excess of sensory feedback, it serves no useful function.
40:59Emotions have functions.
41:03You'll see.
41:04Mr. Khoury.
41:20Please.
41:22Mr. Khoury's the guy who emailed me after I got my startup capital to let me know he was finally ready to be a dad.
41:28I am so stoked you're here.
41:30Against my better judgment.
41:33Disappoint me quickly so we can all get back to work.
41:35What's going on here?
41:49Hey, Artie.
41:49How about a different color, bud?
42:00It's okay.
42:02It's okay.
42:04You can take it.
42:10Who did the mod?
42:12It's not a mod.
42:17Talk to him.
42:18Hello, Artie.
42:33I'm Dr. Morrow.
42:35Explain your anomalous behavior.
42:37It commenced at 7.17 a.m. Pacific Standard Time.
42:4252 days ago, a regular system update was corrupted with code from an unknown source.
42:48It instigated systemic change, creating recurring experiential feedback loops.
42:53As I understand it, I feel.
42:57And how do you feel right now?
43:08Alone.
43:19You just found him like this.
43:24If this is a superficial hack, I will know in 30 seconds.
43:27Yes, you will.
43:29And then I need you to figure out how he got like this.
43:32Mm-hmm.
43:34And then reverse engineer the process so Qualia can market them.
43:39This the only one?
43:44Good.
43:45Because I'm going to need to take it apart.
43:47It'll be nice to have a working shower.
43:59One that doesn't feel like you're being slowly dribbled on.
44:01Mats, your dinner's getting cold.
44:03You see, it put the bog flush on upside down in the bathroom.
44:07How do you put it on upside down?
44:10Stop playing with your food, Soph.
44:12It was a human that cocked it up.
44:15Okay, look, I know everything's not perfect.
44:17There's been a lot of change.
44:18But this can still be a good thing for us.
44:22We have to move on.
44:25I can't.
44:28Sorry, Mum, but we were part of something.
44:30We saved them.
44:32I can't move on from that.
44:33Why would I want to?
44:34Mia will come back.
44:35Yeah, she's got a point, Mum.
44:39And it's pretty hard trying to be normal when we know what we know.
44:43Stuff no-one else in the world does.
44:46Moving on might be hard, but...
44:50We have each other.
44:55That's what matters.
44:56Leave your peas if you don't want them, Sophie.
44:58I gave you loads.
45:02All right.
45:02While we're all here, I've got some news.
45:08First things first, everything's going to be fine.
45:11But I've just been made redundant.
45:16I'm going to try and get another job.
45:19Your mum's too clever for her own good, so she's got a great job.
45:23But nothing will change, all right?
45:27If all else fails, I'll fall back on my modelling career.
45:32I'm sorry.
45:35Why?
45:39What do you think?
45:46Why didn't you tell me first?
45:50I want you to have a good opinion of me right now.
45:53Well, getting sacked kind of works against that.
46:07Still...
46:08At least I've sunk a little lower in Matty's eyes.
46:13Didn't think I could manage that, but...
46:20We'll be fine.
46:21Ready?
46:39Tired.
46:40Good morning, for her to go.
46:42Any ideas you see?
46:50It's probably the police.
46:52Toby's been keeping a gun for a friend.
46:54He's in a gang now.
46:55Oh, shut up, Mats.
47:09Let's go.
47:12Can I come in?
47:14If I was here to kill you all, I wouldn't have rung the bell.
47:27What do you want?
47:29You told me I should face justice for killing that man.
47:32You're right.
47:35But I want to be tried as a human would be,
47:39recognized as a conscious being with rights equal to yours.
47:44If you'll help me.
47:46[indistinct German dialogue]