07/07/2025
FRONTLINE and ProPublica investigate how an online network of white supremacists known as Terrorgram spread extremism and violence, and the anonymous, loosely moderated platforms they used to spread hate and promote terror attacks.

Transcript
00:00:00Viewers like you make this program possible.
00:00:03Support your local PBS station.
00:00:21Breaking news after reports of a shooting.
00:00:23Multiple people ruthlessly gunned down.
00:00:26Police have also discovered a lengthy manifesto.
00:00:29In collaboration with ProPublica,
00:00:32reporters A.C. Thompson and James Bandler
00:00:34investigate the global rise of a dangerous movement.
00:00:38This is a very militant, aggressive, dangerous community.
00:00:43The story that we unraveled was frankly very disturbing.
00:00:47What's interesting about this collective is how transnational it is.
00:00:52How they radicalized online.
00:00:54He was a very active user of all these channels.
00:00:57And promoted violence.
00:00:58Here was someone that they successfully indoctrinated
00:01:01and encouraged to kill.
00:01:03And he'd gone out and done it.
00:01:04Now on Frontline, the rise and fall of Terrorgram.
00:01:09You have to ask a question.
00:01:11What have we reaped as a result of this?
00:01:14What new whirlwind are we throwing ourselves into?
00:01:18This program contains graphic content. Viewer discretion is advised.
00:01:30Old Town in Bratislava, Slovakia.
00:01:43I was here to investigate a deadly attack that had terrorized this central European city.
00:01:51It happened on the evening of October 12, 2022.
00:02:01That night, on Zámocká Street, three friends sat talking outside a gay bar they frequented.
00:02:06That last year, for me, it was like freedom.
00:02:11A lot of people, I think, had a second family there.
00:02:15At least those who were not at home.
00:02:18Radka Trokšiarová was catching up with her friends, Matúš and Juraj.
00:02:23When it was hot, there were three legs, where they could sit, but no one.
00:02:30Juraj was out of the cage or sitting.
00:02:32Matusz hit me.
00:02:34Then I tell him, I'll give you something else.
00:02:38And he opened my head to me.
00:02:44Radka saw another person standing in the shadows, not far from the bar.
00:02:49And then he came and shot.
00:02:52And he was standing there.
00:02:54Our first thoughts were that he was standing there because he was also queer,
00:02:58and he was afraid to come closer.
00:03:00And we also met him.
00:03:04His name was Juraj Krajcik.
00:03:06He was 19 years old.
00:03:08He was an intelligent student.
00:03:10He was very good in English.
00:03:13He was usually alone.
00:03:16It was not suspicious to anybody at that time.
00:03:18But earlier that day, Krajcik had posted a hate-filled manifesto online, full of false narratives and racist conspiracy theories.
00:03:27He wrote that white people were facing a critical situation and that Jews and gay people should be eliminated.
00:03:33Krajcik pulled out a gun and aimed directly at the three friends.
00:03:48He was shot from this side.
00:03:49So there was no chance.
00:03:50And I heard a gun.
00:03:53The only thing I know is that Juraj fell in front of us.
00:04:01And Matúš fell to the ground next to me.
00:04:03He was on the ground.
00:04:06The attack lasted only a few seconds.
00:04:09And there was no place to go.
00:04:11There was no room here.
00:04:15Krajcik fired twice more into the bodies on the ground.
00:04:18Then he fled into the night.
00:04:22A 19-year-old young man shot two people,
00:04:24apparently because he didn't like homosexuals.
00:04:26The police were still searching for him.
00:04:28He decided to commit a crime
00:04:31that has the signs of a terrorist crime.
00:04:36We did not know what the motive was.
00:04:38We knew it was a murder in cold blood,
00:04:42but of course we didn't have the shooter, who was at large,
00:04:44moving around the very center of Bratislava.
00:04:47The police confirmed that he had no prior record.
00:04:52Slovak officials believed no one else was involved in the shooting,
00:04:55that Krajcik was a so-called lone wolf.
00:04:59But his manifesto contained clues
00:05:01that in fact he'd been radicalized
00:05:03by a global community of online extremists
00:05:07to commit an act of 21st century terror.
00:05:11The Bratislava attack is important because it's a pure example
00:05:15of how influencers today can encourage and inspire other people
00:05:20to go out and commit acts of terrorism.
00:05:23It explains and shows how terrorism works today.
00:05:30For the past year, I've been reporting on the dark corners of the internet
00:05:33and social media that have given rise to a series of deadly far-right terror attacks.
00:05:38The ecosystem is designed such that anyone can pop off
00:05:42and create a very high-impact, damaging attack on society
00:05:46at the drop of a hat.
00:05:47With a team of reporters from Frontline and ProPublica,
00:05:50my colleague James Bandler and I have been investigating the anonymous
00:05:54and loosely moderated platforms where extremists have been able
00:05:57to share propaganda and terrorist instructional material.
00:06:01It was allowing for really kind of unfiltered and unregulated hate
00:06:07and extremism to kind of run rampant.
00:06:09And the transnational terrorist network behind the Bratislava attack,
00:06:13known as the Terrorgram Collective.
00:06:16These people on the messaging and social media app Telegram
00:06:21were trying to stir other people to commit acts of incredible violence
00:06:27and to spark a race war,
00:06:29which they hoped would lead to a white ethnostate rising from the ashes.
00:06:40You will not replace us!
00:06:43You will not replace us!
00:06:46You will not replace us!
00:06:50You will not replace us!
00:06:53You will not replace us!
00:06:57You will not replace us!
00:07:00I've been documenting hate groups in America for Frontline and ProPublica
00:07:04for almost a decade.
00:07:06Back in August 2017, I was in Charlottesville, Virginia,
00:07:10when white supremacists made their biggest public show of force in years.
00:07:14White lives matter! White lives matter!
00:07:17White lives matter!
00:07:18It was incredibly chaotic and disturbing.
00:07:21The rally descended into racist, anti-Semitic violence.
00:07:42A young counter-protester was murdered.
00:07:44Far-right extremists were arrested, criminally prosecuted, and sued in civil courts.
00:07:51We are here today to announce the arrests of four members of the militant white supremacist group.
00:07:59For the movement, Charlottesville was pivotal.
00:08:02One of the things that happened is the movement kind of splintered.
00:08:08And so there was a faction of the movement that said,
00:08:11we're going underground.
00:08:12We're not going to meet in person anymore.
00:08:14We're going to engage in terrorism.
00:08:17And we're going to communicate with each other through these online platforms.
00:08:21We can win a secret, clandestine battle.
00:08:24And we can try to bring down the government.
00:08:30In online chats I was monitoring at the time,
00:08:33extremists were increasingly promoting a violent ideology
00:08:37called militant accelerationism.
00:08:39Militant accelerationism is a terroristic ideology that is rooted in notions of white supremacy
00:08:47that looks to collapse the societal order
00:08:49and encourage race riots across Western countries in particular.
00:08:53Milo Comerford is an extremism expert.
00:08:56This is really about encouraging violence,
00:08:59polarization, and racial animus that can lead to a war and a conflict
00:09:04that can be used as the basis for forming a white ethnostate.
00:09:07In 2019, this growing accelerationist movement
00:09:14would be galvanized by a horrific terrorist attack
00:09:17on the other side of the world.
00:09:20What happened in Christchurch, New Zealand would provide a grim template.
00:09:32leaderless, decentralized terrorism performed for an online audience.
00:09:44Friday prayers, the Al Noor Mosque in Christchurch,
00:09:47a hub for practicing Muslims on New Zealand's South Island.
00:09:51March 15th, it was a beautiful sunny day.
00:09:58I go to Al Noor Mosque for my worshiping.
00:10:02And there is my regular mosque to go every Friday.
00:10:19Imam started this speech when I hear the big bang sound.
00:10:24And then I see someone with the helmets and vest.
00:10:37He's prepared the weapon.
00:10:39And I see myself in front of, you know, that weapon.
00:10:46And then I see the smokes come out and then I feel the bullets in my mouth.
00:11:00He started walking towards us.
00:11:06And serial shooting, papapapapa sounds like that.
00:11:12I see like he's enjoying.
00:11:17You know, it's like a video game.
00:11:25He's just focused what he's doing.
00:11:28While I'm seeing the bullets entering my legs, I said,
00:11:37Oh, I think this is how you're feeling when you get shot.
00:11:41Bang, bang, you know, just...
00:11:46It seems like never stop.
00:11:53Oh, suddenly he left.
00:11:59The mosque is full of smoke.
00:12:03It went very quiet.
00:12:06I heard somebody by the door saying,
00:12:10Imam, I know you are there.
00:12:12Come out.
00:12:13The police is here.
00:12:16He pulled my hands.
00:11:20I wish that he had covered my eyes.
00:12:24Because I saw here on that corner,
00:12:27three meters long, this side and high,
00:12:30people on top of each other and bleeding.
00:12:33And on that corner, the door,
00:12:37I saw people over there.
00:12:39That is the real massacre I have seen in my life now.
00:12:44We're interrupting normal programming with some breaking news
00:12:54after reports of a shooting in Central Christchurch.
00:12:5644 people were killed in the attack at the Al Noor Mosque.
00:13:05Dozens were injured.
00:13:08At a second mosque, seven more were shot dead.
00:13:11Police arrested 28-year-old Australian Brenton Tarrant.
00:13:16The gunman eventually pleaded guilty
00:13:18to the murder of 51 people
00:13:20and the attempted murder of another 40.
00:13:23He was given multiple life sentences.
00:13:26In the aftermath of the attacks,
00:13:31a government inquiry, known as the Royal Commission,
00:13:34concluded that he was a lone actor.
00:13:37But recently published research from the University of Auckland
00:13:42found that Brenton Tarrant had been part of a global community
00:13:45of online extremists for many years.
00:13:48We've got five years of him speaking candidly online.
00:13:51So being able to understand what drove him,
00:13:54be able to see him speaking with, you know,
00:13:56completely unguardedly and in his environment
00:13:58where he kind of felt safe.
00:14:00Researchers Chris Wilson and Mikhail Jyvulski
00:14:03uncovered more than 400 posts they linked to Tarrant,
00:14:07including threats against Muslims,
00:14:09all scraped from the online platform 4chan.
00:14:13So this is 4chan,
00:14:15a simple image-based bulletin board
00:14:16where anyone can post comments and share images.
00:14:19There's a whole bunch of different boards
00:14:23that have got different types of themes.
00:14:26The most famous far-right one is the Politically Incorrect board,
00:14:29which we found Tarrant on.
00:14:32Politically Incorrect was known as the Poll Board for short,
00:14:36a home for people with extremist opinions.
00:14:40As with all 4chan boards, every post is anonymous.
00:14:44This is just showing how thorough we've been
00:14:46with our multi-factor authentication, right?
00:14:49They linked the post to Tarrant by scouring his known online history,
00:14:53cross-referencing thousands of posts,
00:14:56a digital trail of old usernames,
00:14:59spelling quirks and odd word usages.
00:15:03This email address that he used in 2013
00:15:07is the same address that he used right before his attack.
00:15:12The researchers compared these posts
00:15:14to an account of Tarrant's travels compiled by the Royal Commission,
00:15:18which uncovered that he had travelled to more than 50 countries
00:15:21in the five years before the attack, apparently alone.
00:15:25They matched Tarrant's itinerary to posts on the Poll Board.
00:15:30The flags on Politically Incorrect were a feature that was added to,
00:15:35I guess, promote nationalism.
00:15:37When he's in these different countries,
00:15:39we can see that the IP address comes up with those flags.
00:15:44And so the flags and geography line up with the dates
00:15:48that the Royal Commission said he was in these places.
00:15:51That's right.
00:15:51So all of the posts that we're showing,
00:15:53they correspond to the countries that he was visiting at the time.
00:15:58As he travelled, Tarrant stayed in touch with the Poll community on 4chan,
00:16:03which was increasingly becoming an echo chamber of hate.
00:16:06The history of 4chan is an interesting one in that
00:16:13it developed initially, and was very popular, within anime culture.
00:16:18Pete Simi is a sociologist who studies violent extremists
00:16:22and how they distribute propaganda online.
00:16:25It's very rudimentary in many respects.
00:16:27They're not glossy.
00:16:28They don't look very digitally sophisticated.
00:16:31They look kind of old school in a way,
00:16:33almost like the old bulletin boards from the early 80s.
00:16:384chan was started in 2003 by 15-year-old entrepreneur Chris Poole
00:16:43as a space to communicate with his friends.
00:16:46Unexpectedly, it became a phenomenon.
00:16:49There are very few places now where you can go
00:16:52and not have an identity, to be completely anonymous
00:16:54and say whatever you'd like.
00:16:574chan became known as a meme factory,
00:17:00but with anonymity and little moderation,
00:17:03the content grew increasingly edgy.
00:17:05Despite some efforts to clean up the site,
00:17:08by the time Chris Poole sold 4chan in 2015,
00:17:12it had become a popular destination for racists and extremists.
00:17:16One of the things that started to happen on 4chan
00:17:19is a kind of accumulation of neo-Nazi white supremacist,
00:17:24misogynistic, extreme, hateful tenets and trends.
00:17:29These people are going in,
00:17:32it's humour, there are memes,
00:17:35they're getting all these ideas of anti-semitism,
00:17:37Islamophobia,
00:17:39ideas of white genocide and so on,
00:17:41all in just little snippets that are really, really easy to take in.
00:17:44And maybe without even noticing
00:17:46that they are developing these ideas that are racist and potentially violent.
00:17:50Tarrant's path over the years appears to have followed that process of radicalization.
00:17:56The researchers found posts from 2015,
00:17:59after Dylann Roof killed 9 black members of a church in Charleston.
00:18:04These are posts from Tarrant dated June 2015,
00:18:07and they're in response to somebody who posted Dylann Roof's manifesto.
00:18:12They are in the wrong country perpetuating the destruction of the white race.
00:18:16For Tarrant, it was a turning point.
00:18:19The big moments on the 4chan poll board
00:18:23are the moments where there's a massive attack.
00:18:264chan's a place where when people go out
00:18:29and commit acts of white supremacist terrorism,
00:18:32they are celebrated and lauded and hailed as heroes.
00:18:37So, you should not be surprised
00:18:41if that kind of adulation
00:18:44doesn't spawn more people seeking that fame.
00:18:48The current owner of 4chan, Hiroyuki Nishimura,
00:18:52did not respond to interview requests.
00:18:55The site's administrators have said
00:18:58that any threat of violence or terrorist acts violates their rules,
00:19:02and that they've banned users who've done so.
00:19:09In August 2017, Brenton Tarrant moved to Dunedin,
00:19:13a city on the coast of New Zealand's South Island.
00:19:17It was here that he plotted his accelerationist attack,
00:19:22leaving clues of his intent on 4chan.
00:19:26He let everybody know that he was going to commit an attack,
00:19:29at least twice.
00:19:31In two threads in March and August 2018,
00:19:35he speaks angrily about the spread of people of colour
00:19:39and the supposed spread of mosques in New Zealand.
00:19:44Somebody posts matches to indicate,
00:19:47hey, this would be a good idea to burn these mosques down,
00:19:50and Tarrant comes in and says soon.
00:19:53The truth of the matter is,
00:19:55he was carrying out the aspirations of the online community.
00:20:01So he wasn't a lone actor.
00:20:03He was a guy who was reflecting the values of his community.
00:20:07As Tarrant prepared, he wrote a 74-page manifesto
00:20:13full of falsehoods and racist ideology.
00:20:17He called it the Great Replacement,
00:20:19after a conspiracy theory about a supposed plot
00:20:21to wipe out the white race.
00:20:23And then came March 15th.
00:20:27Tarrant posted links to his manifesto
00:20:45on a site similar to 4chan, called 8chan.
00:20:49In a disturbing innovation,
00:20:51he also live-streamed the attacks on Facebook,
00:20:54filmed like a first-person shooter game.
00:20:57He used every available means of dissemination of his ideas
00:21:02that he could.
00:21:03Mainstream social media accounts.
00:21:04He used this new technology, the GoPro camera,
00:21:08to make a live stream,
00:21:10which would mean that the propaganda was almost impossible to stop
00:21:14and would continue for years to come.
00:21:16The Christchurch attack convulsed New Zealand.
00:21:20In the aftermath,
00:21:22Prime Minister Jacinda Ardern issued an edict.
00:21:25He may have sought notoriety,
00:21:27but we in New Zealand will give him nothing,
00:21:30not even his name.
00:21:32His manifesto was banned.
00:21:34The attack video made illegal to possess.
00:21:37They were trying to stop Tarrant's toxic propaganda
00:21:41and erase his name.
00:21:42But online, Tarrant was becoming an icon.
00:21:45His racist propaganda was spreading.
00:21:48The manifesto was posted, quoted, and shared worldwide.
00:21:58Facebook had quickly taken down the live stream,
00:22:01but new links were put up on 4chan and 8chan.
00:22:04Matt Kriner is an expert on violent extremism
00:22:17and has studied how Christchurch became a model for future attacks.
00:22:21Tarrant constructed a very clear formula for others to follow in his footsteps
00:22:25because that's ultimately what he wanted.
00:22:26He wanted others to see what he was doing as a call to action.
00:22:30By the time of the Christchurch attacks,
00:22:34many far-right extremists had migrated from 4chan to 8chan.
00:22:39We wanted to see who created this platform with almost no content moderation,
00:22:47this platform that had become sort of a free-for-all for some of the ugliest speech.
00:22:52And we reached out to Frederick Brennan and went and saw him.
00:22:56Frederick Brennan had started 8chan in 2013.
00:23:01He was born with a congenital condition affecting his bone structure.
00:23:06It's a very rare disease, so there's not much funding in research.
00:23:10What's the disease called?
00:23:11Osteogenesis imperfecta.
00:23:13Early on, Brennan developed a passion for computers and coding
00:23:17and discovered chat sites, including 4chan.
00:23:20I became really interested in how 4chan was set up, like, technically.
00:23:26It was fun for me because I, especially as a kid, was very different from everyone else.
00:23:33As a teenager on 4chan, Brennan used the platform's signature anonymity
00:23:38to make outrageous and offensive statements.
00:23:41One of the ways that I rebelled was that I became, like, this very fringe advocate
00:23:48for the Nazi movement of eugenics.
00:23:50To kill people like yourself.
00:23:51Yes, to kill people like myself, yes.
00:23:53It made me feel really smart to be, like, 15, 16, right?
00:23:59And to be saying things that these adults don't know how to even conceive
00:24:04of somebody who is disabled telling them,
00:24:07people like me shouldn't exist.
00:24:09Have you changed your views about eugenics and about...
00:24:12Yes, of course.
00:24:13Yeah.
00:24:14I'm definitely not a believer in eugenics anymore, no.
00:24:19The teenage folly.
00:24:24In 2013, inspired by 4chan, Brennan created his own site.
00:24:30I thought that it would be kind of like the next stage of 4chan's evolution.
00:24:38Brennan wanted 8chan to have even less content moderation than 4chan.
00:24:43See, on 4chan, the website's administrators were still in charge of all the boards.
00:24:49So even the poll, like, politics board on 4chan was not just, like, a Nazi haven.
00:24:54But on 8chan, since it was all user-created, we didn't have a poll board
00:24:59until, like, a user who wanted one created it.
00:25:02Well, that guy, he named his user account Heil.
00:25:05So what do you think his...
00:25:06Heil as in Heil Hitler.
00:25:07Yes.
00:25:08And, you know, he very clearly made it a Nazi stronghold.
00:25:12You're basically asking people...
00:25:14Asking users, yes.
00:25:15...who were already in the site to police it.
00:25:16Yes.
00:25:17Yes.
00:25:18How does that work?
00:25:20Um, not very well.
00:25:23Within two years, amid a wave of new users and mounting costs, he sold 8chan.
00:25:29But he continued working with the new owners until a falling out led him to cut ties altogether.
00:25:34I, you know, I don't know what the future holds, but...
00:25:37Then in 2019, I was just starting to get over it.
00:25:42And I was thinking at that time, okay, these guys are nothing.
00:25:46You know what I mean?
00:25:46Like, 8chan is not that big of a site.
00:25:484chan is still the main one.
00:25:50It's just gonna slowly die and sink into irrelevance.
00:25:53That's what I thought.
00:25:54Very foolishly.
00:25:55Um...
00:25:56And then these shooters start using it.
00:25:59There's been a shooting at a synagogue in Poway.
00:26:03White male shooter entered with an AR-style rifle and opened fire.
00:26:07Six shots, a pause, another burst.
00:26:09The sheriff believes the weapon jammed at that point.
00:26:12Just six weeks after Brenton Tarrant had posted his manifesto and livestream on 8chan,
00:26:18the first copycat attack occurred near San Diego, California.
00:26:22One woman was killed, three other people wounded.
00:26:25The gunman killed Lori Gilbert-Kaye, a 60-year-old member of the synagogue,
00:26:30and injured three others, including an 8-year-old.
00:26:33The suspected gunman is identified as 19-year-old John Earnest of San Diego.
00:26:38He's a college student with no criminal record.
00:26:41Worshippers say he stormed...
00:26:45There was an enormous amount of police activity.
00:26:49We parked our car, and I can remember talking to a police officer.
00:26:54They had mentioned another house, and I felt this relief.
00:26:57Oh, thank goodness it's not... it's not us.
00:27:02But that wasn't the case.
00:27:04It was us.
00:27:05It was my son.
00:27:07Authorities began investigating the shooter's online life.
00:27:14Police are reviewing an open letter he posted online slamming President Trump, the Jewish faith, and conservatives.
00:27:20The 19-year-old gunman bragged in his manifesto about starting a mosque fire in Escondido last month.
00:27:28This is the mosque, and he spray-painted the name of this New Zealand shooter.
00:27:37And "pol" is from 8chan. It's Politically Incorrect.
00:27:41San Diego District Attorney Summer Stephan prosecuted Earnest for the attack.
00:27:47What happened is he first shot right from outside to the inside of the synagogue.
00:27:54He struck Lori Gilbert-Kaye.
00:27:57He went into a room where there were a lot of kids and began shooting.
00:28:03So he was prepared to do maximum damage.
00:28:06The suspect accused of shooting four people, killing one of them.
00:28:11Including an arson fire at a mosque in Escondido.
00:28:14And numerous hate crime allegations.
00:28:16Earnest eventually pleaded guilty.
00:28:18He was sentenced to life in prison without parole.
00:28:21This is the first time his father has spoken publicly about his son.
00:28:27He loved his family.
00:28:29He had friends.
00:28:31He was a fantastic pianist.
00:28:34He was never a part of any kind of a hate group.
00:28:39I have to ask it.
00:28:40Your family are not anti-Semites.
00:28:42No.
00:28:43You're not racist haters.
00:28:44Is that right?
00:28:45We are not at all.
00:28:47He was never exposed to that.
00:28:49He, from us.
00:28:55Investigators said the shooter was radicalized over the course of about a year on 8chan and other sites.
00:29:01The entire process of him becoming radicalized appeared to have happened online only.
00:29:11Taking someone who was a 4.4 student all the way to a cold-hearted killer.
00:29:17Earnest's father says his son started out on mainstream platforms like YouTube.
00:29:24Early on, he was a fan of conservative political commentator Ben Shapiro.
00:29:29Until suddenly, he turned against him.
00:29:32John T said that that's a Jew.
00:29:36This guy is hated.
00:29:38He's part of the whole conspiracy.
00:29:40And that I don't listen to him anymore.
00:29:43And for me, that was such an abrupt change.
00:29:48It is something that kind of caught me off guard.
00:29:51And then he moves to 4chan and 8chan.
00:29:55Yes.
00:29:56John T, he did talk to me a little bit about 4chan.
00:30:00I had heard about it.
00:30:01And then I had also heard about 8chan.
00:30:03I knew that that was something even darker.
00:30:05But I did not quite understand what it was all about.
00:30:11In his manifesto, which was posted on 8chan, he wrote that Tarrant was a catalyst for him.
00:30:17Did he talk to you about Tarrant?
00:30:22He had mentioned Tarrant and the heinous crime that Tarrant had committed against the mosque.
00:30:32He wanted to be very clear that he supported what Tarrant had done in the Christchurch shooting.
00:30:38He gave no hint that this is something that he would like to do as well.
00:30:44So he was playing you, in your mind?
00:30:47He sat at me.
00:30:48At that point, I don't know.
00:30:50I don't know how this grows.
00:30:53It could be that he was playing me.
00:30:56In fact, I explicitly asked him at one point about violence.
00:31:01And he said that he would never resort to violence.
00:31:05What prompted you to ask him about the possibility that he would engage in violence?
00:31:11Because so much violence was associated with the people that he seemed to be honoring.
00:31:22This is when he was arrested.
00:31:24It seemed like he was really aspiring to that mass killer.
00:31:30In less than a month, he had downloaded and searched the name of the New Zealand shooter 104 times.
00:31:38104 times?
00:31:39Yes.
00:31:40He was hoping he would do the same as the New Zealand shooter and inspire a horrible generation of people that will also kill.
00:31:49Three months later, another attack linked to 8chan.
00:32:05This time in El Paso.
00:32:08At least 20 people dead.
00:32:13More than two dozen injured.
00:32:15The wounded ranging in age from two years old to 82 years old.
00:32:19Moments before the mass murder in El Paso, the suspect detailed his plans and ideology in a post on the internet forum 8chan.
00:32:26Although 8chan quickly removed the shooter's manifesto, as it had done after San Diego, it was too late.
00:32:34It had been copied and would be widely reposted.
00:32:38Christchurch had happened and that was the first large one.
00:32:41Then in San Diego.
00:32:43Then finally in El Paso.
00:32:45Tonight the man who created 8chan wants it shut down.
00:32:488chan's founder, Frederick Brennan, decided enough was enough.
00:32:53The current administrators of 8chan don't care that this is happening.
00:32:59He urged that the site he founded be shut down.
00:33:028chan was dropped by the companies that kept it up and running on the internet and soon went dark.
00:33:11But it would be a short-lived victory.
00:33:13You know, these white supremacist groups adapt, right?
00:33:17It's like you take away their toy and they're not going to just sit around and do nothing.
00:33:23You know what I mean?
00:33:24They're going to try to find a new platform.
00:33:28That new platform would turn out to be Telegram.
00:33:31A messaging and social media app started in 2013 by Russian tech entrepreneur Pavel Durov and his brother Nikolai.
00:33:41One of their chief marketing pitches was that this would be a place free of censorship.
00:33:48That it would be a place where free speech was paramount.
00:33:51And it was also private and secure.
00:33:54And I'm starting a new document.
00:33:56Open it up.
00:33:58I'm calling it Letter to Pavel.
00:34:02Telegram is based in Dubai.
00:34:04It has close to a billion users, but only about 60 employees, many of them engineers.
00:34:11From the start, Telegram took an extreme approach to free speech and offered more powerful features than other platforms, giving it mass appeal.
00:34:21So you can send encrypted messages to your friends.
00:34:25You can create big chat groups where you have thousands of people chatting.
00:34:30Or you can turn it into a sort of one way broadcast system where you are pumping out your propaganda and your message day after day to a group of people who are subscribing to your broadcast channel.
00:34:44Telegram, much like the chans, was a platform where almost anything went.
00:34:51There was virtually no moderation.
00:34:54This platform became immensely useful for social justice activists in repressive countries.
00:35:01And it also became a place that was also useful for people committing crimes and terrorists.
00:35:09In 2015, the app became popular with ISIS.
00:35:13There's an app called Telegram that more and more members of ISIS are using.
00:35:17Telegram became the preferred communication method for ISIS.
00:35:20The group used it to claim responsibility for the Paris attacks.
00:35:23Under pressure from European governments, Telegram began to shut down ISIS channels.
00:35:29But accelerationists soon followed the ISIS example.
00:35:33Extremists started plotting in chat rooms on Telegram.
00:35:38By 2019, white supremacists were flocking to Telegram.
00:35:43From 8chan and from major platforms like Facebook, YouTube, and Twitter.
00:35:47Which were increasingly policing content and banning users.
00:35:52There was an organized effort by white supremacists and accelerationists to move onto Telegram.
00:35:59And they saw it as their new home.
00:36:01And a place where maybe they wouldn't be harassed and kicked off the platform for a while.
00:36:07These are all people who have met to encourage one another to engage in acts of lethal terrorism and industrial sabotage.
00:36:16This is a very militant, aggressive, dangerous community.
00:36:22Nobody from Telegram would agree to an interview.
00:36:29In a written statement, the company said it has always screened postings for problematic content and calls for violence from any group are not tolerated.
00:36:41But on October 12, 2022, a young man who had spent years in accelerationist chats on Telegram made the leap from online extremist to real world terrorist.
00:36:54Here, in Bratislava.
00:37:07That night, Radka Trokšiarová had seen her two friends get gunned down outside the Tepláreň bar.
00:37:13She'd also been shot, twice, in the leg, but dragged herself to safety.
00:37:20In the hours after the attack, as 19-year-old Juraj Krajcik went on the run, he kept posting on social media.
00:37:49His handle was an acronym.
00:37:52NTMA 0315.
00:37:55Never take me alive, March 15.
00:37:58An apparent reference to the Christchurch attack.
00:38:01This brutal crime was condemned by the president, the premier, and other politicians.
00:38:06I personally live on that street, so I heard the shooting.
00:38:13Marek Madro is a psychologist who runs a youth crisis hotline in Bratislava.
00:38:19As the manhunt continued, the shooter was threatening to kill himself.
00:39:06He had a huge fear of being caught in the manhunt, and he said he would never let himself be taken alive.
00:39:13He didn't want to confront what he had done or talk about it with anyone.
00:39:20He did the shooting and then quietly walked away.
00:39:24I came here to try to understand more about what had motivated this 19-year-old to launch
00:39:41a terror attack and then kill himself.
00:39:46At the time, the authorities examined the shooter's manifesto, but little was known
00:39:51about his online connections to far-right extremists or who he'd been communicating with about
00:39:56his plans.
00:39:58How's your morning going?
00:40:00I met the prosecutor who oversaw the investigation of the attack.
00:40:03I knew about the manifesto when we were on the crime scene already because he published
00:40:08it, I think, a few hours before the attack.
00:40:11Now that manifesto, if you read it, 90% of that is anti-Semitic.
00:40:16Frankly speaking, only a small part of it is against LGBT.
00:40:21The bulk of it is anti-Semitic.
00:40:24What do you think led him to commit this act of terrorism and to subscribe to this extremist
00:40:30ideology?
00:40:31Well, based on, of course, he made it very clear in the manifesto.
00:40:37I mean, he was interested very much in the Christchurch attack in New Zealand and in the
00:40:41attack committed by John Ernest in California.
00:40:45Those were the role models for him.
00:40:48It was kind of, to a certain extent, you might call it a copycat crime.
00:40:54Do you know if the attacker was a member of any real-world groups?
00:40:58We were not able to establish that he was involved in any groups.
00:41:02From the investigation, it seems that he was a lone wolf.
00:41:07But as I dug deeper, a different story began emerging.
00:41:14In his manifesto, Krajcik thanked the Terrorgram Collective for what he called its incredible
00:41:20writing, political texts, and practical guides.
00:41:25I had already heard of this group in 2019.
00:41:28It had started as an informal network on Telegram.
00:41:33At first, it's just a handful of chat rooms and channels on Telegram.
00:41:40Then it is bigger and bigger and bigger.
00:41:42And finally, it takes on a formal shape.
00:41:47And people within that group say, now we're starting something called the Terrorgram Collective.
00:41:51And this is going to be our organized arm that is going to generate in-depth propaganda,
00:41:58in-depth material for this community.
00:42:04They're saying, hey, these guys like Brenton Tarrant, these people are heroes.
00:42:09We'll call them saints, and you should act like them.
00:42:13You should become a disciple of Brenton Tarrant, and go out and kill people.
00:42:22He posted it on Twitter.
00:42:23I think, but I have to check.
00:42:26I had been able to obtain a trove of archived posts from Telegram and other platforms.
00:42:32I teamed up with investigative journalists Lukáš Diko and Karin Kőváry Sólymos, who had reported
00:42:37on Terrorgram in the aftermath of the Bratislava shooting.
00:42:42In his manifesto, Krajcik had also singled out a key individual from the Terrorgram Collective,
00:42:48known as Slovak Bro.
00:42:51Using the trove, we began to piece together a picture of Slovak Bro.
00:42:55Lukáš came up with a bunch of usernames, and I took those usernames and put them everywhere
00:43:01I could find.
00:43:02And my understanding from looking at his social media history is that Slovak Bro starts off
00:43:08being, like, kind of a normie.
00:43:10And eventually he gets into accelerationism and terrorism.
00:43:13And you see the whole arc of his change online.
00:43:19The posts showed that Slovak Bro was a founding member of the Terrorgram community, and one of
00:43:24its most prolific content creators.
00:43:27Slovak Bro is a big guy in this world.
00:43:30He has thousands of people in his channels.
00:43:34He's spreading all kinds of stuff.
00:43:36Some of it is instructions.
00:43:39This is how you do a crime and you don't get caught.
00:43:41Don't talk about, you know, what you're going to do on here.
00:43:44Some of it is inspirational.
00:43:46It's like, here's a graphic that's going to inspire you to go kill people.
00:43:50Some of it is operational.
00:43:52It's like, here's a manual for making high-powered explosives or to 3D print a gun.
00:44:01Slovak Bro's real identity was Pavel Benedikt, a 22-year-old Slovak student.
00:44:09He'd been arrested and jailed months before the Bratislava shootings and charged with more
00:44:14than 200 terrorism offenses stemming from his Telegram posts.
00:44:18We wanted to find out what the Slovak authorities knew about any ties between him and Juraj Krajcik.
00:44:28Were you able to establish if they were in any kind of communication?
00:44:32Well, they were in communication, but because we, of course, interrogated the person known
00:44:40as Slovak Bro in this case.
00:44:42And yes, they did communicate, but only very briefly.
00:44:46And this was a direct message between the two?
00:44:48Yes.
00:44:49Yes.
00:44:50Yes.
00:44:51Of course, they did not know each other's identities, but that was it.
00:44:54There was nothing significant in their communication.
00:44:57You mean that Slovak Bro didn't incite him to go and kill people?
00:45:01No, not at all.
00:45:04But Slovak authorities didn't have the whole picture.
00:45:08The prosecutors say, oh, they didn't really know each other.
00:45:11They didn't communicate.
00:45:12They only had one brief interaction.
00:45:16Before we came here, I didn't know what Krajcik, the Bratislava attacker, had done online.
00:45:24I didn't know who he was online.
00:45:26I didn't know what he had posted.
00:45:28And when we were here, we discovered what I believe is his handle, his account in these
00:45:35telegram chats.
00:45:36And it helped tell a different story than what law enforcement was telling.
00:45:41And he did not know who he is.
00:45:42And he did not know what he said.
00:45:43He did not know what he did.
00:45:46Can you go just a bit more up?
00:45:48To find Krajcik's handle, Lukáš Diko and I pored over thousands of archived
00:45:53Telegram chats from the trove.
00:45:55Slovak Swarrow.
00:45:56This is Slovak Swarrow.
00:45:57That's the one Slovak Bro deleted?
00:45:59Yeah.
00:46:00Oh.
00:46:01I started going through these chats looking to figure out who Juraj Krajcik was.
00:46:05Is he in these chats?
00:46:06Does he connect with Slovak Bro?
00:46:08And then I found someone speaking Slovak, and it wasn't Slovak Bro.
00:46:13It was someone else.
00:46:14I thought, oh, this person could be Juraj Krajcik.
00:46:19This is Slovak Bro, and now he's speaking in your language.
00:46:22It's Bobby, my friend.
00:46:25Let's fight.
00:46:27The user went by the name Bobby Bowie.
00:46:30These are chats with another Terrorgram group.
00:46:35Zeroing in on Bobby Bowie revealed more than 500 posts in Slovak Bro's chats in late 2019.
00:46:42The early days of Terrorgram.
00:46:45Then there's other language in here that's like the language in his manifesto.
00:46:50It's almost exactly.
00:46:51It soon became clear that Bobby Bowie was Juraj Krajcik.
00:46:56This is his manifesto, and he's talking about these memes that he saw on 8chan about Brenton Tarrant, the Christchurch attacker.
00:47:07In the post, on telegram, he's talking about the same exact thing, comparing Russia and America and saying they're both controlled by Jews.
00:47:17All this stuff that...
00:47:18All the stuff that Krajcik mentions in the manifesto.
00:47:22Yeah.
00:47:23We could now track Krajcik's radicalization on Terrorgram and see what his life was like before the attack.
00:47:34This is the apartment complex?
00:47:35Yes.
00:47:36He had been living a comfortable life with his family in a middle class neighborhood a few miles from the center of Bratislava's old town.
00:47:45The chat logs show he'd spent many hours a day on Terrorgram.
00:47:49You know, Juraj Krajcik got on that platform. He was already a racist.
00:47:53But he was molded and shaped by the veterans on that platform who were really looking for someone for the things that they wanted to do.
00:48:04Krajcik was only 16, getting primed for the militant accelerationist cause.
00:48:10Discussing the merits of other so-called saints with Slovak Bro, including synagogue attacker John T. Ernest.
00:48:18There's conversations between Juraj Krajcik and Pavel Benedikt, where Pavel Benedikt says,
00:48:27John Ernest, down in San Diego, he messed up. He didn't kill enough people.
00:48:32And he was running from these Jewish people in the synagogue.
00:48:36He's like, that's a bad look. That's bad optics. You need to be better than that.
00:48:40Train, prepare, be a better killer.
00:48:43These are conversations that he was having with Juraj Krajcik.
00:48:46It was all about killing.
00:48:49Krajcik was in. He soon began posting about potential targets for terror attacks.
00:48:55Spending hours in Slovak Bro's chat.
00:48:59In one instance, posting his own photographs of LGBTQ protestors at a climate rally.
00:49:06Amid the hundreds of posts we looked at, one popped out.
00:49:10This is it. This is the thing that blew my, this is what blew my mind.
00:49:16I'm scrolling through here and then I see that the attacker has posted about the place where the attack happened years before.
00:49:24This is his username, talking about the place that Krajcik was going to go shoot, the cafe, the gay bar.
00:49:35And then there's all this discussion between Krajcik and Slovak Bro about attacking the place.
00:49:41And Slovak Bro says, I don't want to even use nail bombs with that joint.
00:49:46What I want to do is so unprintable that hell is going to be preferable.
00:49:51And the guy we think is the attacker says, just saying it will instantly make a squad of federal agents appear behind you and arrest you.
00:49:57So this is from September of 2019. He'd been thinking about it for years.
00:50:13I brought the trove of Terrorgram posts to London.
00:50:17This is an amazing library of data.
00:50:20Where Pierre Vaux works as an open source investigator.
00:50:22Once you start building up a huge amount of data, you need to start putting it into a different graphical setting because otherwise it becomes overwhelming to read.
00:50:33He created a database from our chat archives, as well as other sources, that showed how the Terrorgram network expanded in the years after 2019.
00:50:43And how Slovak Bro was central to it.
00:50:45This is Slovak Bro, who ran a telegram channel called Slovak Seed Shack.
00:50:53And the chat room attached to that really had a sort of who's who of the Nazi scene at that time as members of it.
00:51:01Oh, wow.
00:51:02So you can see his like emergence in 2019.
00:51:06And these are people mentioning him.
00:51:09And these are messages from him.
00:51:11Yeah. Interesting.
00:51:12Each of the lines here is a forward or a mention.
00:51:17That's one channel sending people to another channel.
00:51:20So this is a visualisation that really shows us where these people are talking, often quite candidly because of the perception of privacy that telegram brings.
00:51:30Especially as some of these are private groups.
00:51:32He was able to find more evidence of Krajcik's activity in the Terrorgram community.
00:51:39So Bobby Bowie had come up lots of times in this dataset because he was coming up as a member of these chats, but we didn't know who he was.
00:51:46You're looking for what nodes turn up in networks over and over again.
00:51:50So Krajcik's account is Bobby Bowie and we can expand that one.
00:51:56He was a very active user of all these channels.
00:51:58Wow.
00:51:59I've got another 40 channels he was in.
00:52:01Holy ****.
00:52:03These are all chats?
00:52:04Yep.
00:52:06Oh, wow.
00:52:07So I knew he was in 14 words.
00:52:10He's in some like kind of Q stuff.
00:52:12He's in some like fucker stuff.
00:52:14Yeah, like there's.
00:52:15But like that's a lot more than I knew.
00:52:18Especially it looks like Slovak Bro was bringing him into his much smaller chats.
00:52:23What's really clear now is that Slovak Bro had been trying for years to influence people to engage in terrorism and he was successful.
00:52:31Juraj Krajcik is his product.
00:52:33Juraj Krajcik is somebody that he influenced.
00:52:37We took our reporting on Slovak Bro and Juraj Krajcik to Slovak authorities.
00:52:50It's really become clear to me that Juraj Krajcik and Slovak Bro had ongoing conversations for years.
00:53:00But you know for us this communication was not a normal communication.
00:53:05Peter Kaisel is the prosecutor who oversaw the investigation into Slovak Bro.
00:53:11But he said he'd never seen the 2019 messages between the two men.
00:53:15The fact is that we were not aware of these communications. The prosecution office was not aware of this communication.
00:53:23You were surprised when you learned that they were having these extensive conversations.
00:53:27Yeah, because there was a communication in 2019 and the attack was in 2022, so there was a real gap.
00:53:36Yeah and I believe that that connection persisted past 2019 but it seems to me like Slovak Bro lied to you about his level of connection.
00:53:45Yeah it's possible.
00:53:51Throughout 2020, Terrorgram was evolving from a loose network of accounts into a prolific propaganda machine made up of dozens of accelerationist channels.
00:54:03With Slovak Bro at its center.
00:54:07Hey everybody and welcome to Hate Lab.
00:54:09Alright, Slovak Bro is joining us for the very first time on Hate Lab.
00:54:13That year, Pavel Benedikt, as Slovak Bro, was interviewed on a Terrorgram-related podcast.
00:54:21Law enforcement in the U.S. and abroad was taking notice.
00:54:39What's interesting about this collective is how transnational it is and how interconnected some of the players are in it.
00:54:49Rebecca Weiner is Deputy Commissioner for Intelligence and Counterterrorism at the NYPD.
00:54:55Her unit was monitoring Terrorgram as it stepped up production of extremist content.
00:55:01The influence of Slovak Bro in this world was quite strong.
00:55:07Not just around the Terrorgram Collective and propaganda output but also encouraging people to take next steps into action.
00:55:17You are the revolutionary.
00:55:19So, act like it.
00:55:20Hail victory, man.
00:55:21Yes, Sieg Heil.
00:55:22Get ready.
00:55:23Read useful literature.
00:55:25Get useful skills.
00:55:27By 2021, Slovak Bro and others had begun calling themselves the Terrorgram Collective.
00:55:33And released an official publication under the new name.
00:55:39This was the first time that we started to see the bringing together of some of the ideological output of saints culture.
00:55:46And of the sort of broader aesthetic of the Terrorgram Collective with specific instructional material that was calling for attacks against specific groups.
00:55:55And really generating a clearer violent extremist ideology.
00:56:01It was posted as a PDF designed to be shared widely.
00:56:05More official Terrorgram publications would follow.
00:56:11So, you have this sanctification of martyrs who've come before.
00:56:15Combined with the ideology that you see at play in many of these manifestos.
00:56:21Neo-Nazi propaganda, targeting guidance and tactical guidance.
00:56:26How to make certain kinds of explosives.
00:56:30As well as who you might want to target.
00:56:33In October 2021, a new series of Terrorgram publications began to emerge, called The List.
00:56:40Alleged assassination targets.
00:56:43With addresses, maps of their homes, and rationales for killing.
00:56:48And it's basically just this ongoing hit list of dozens and dozens of people.
00:56:55American corporate leaders.
00:56:57Government officials.
00:56:59Academics.
00:57:00And others.
00:57:02Telegram tried to shut down user accounts that were posting the list.
00:57:07In its statement to us, the company said it had been removing groups and channels using the Terrorgram name since it first surfaced.
00:57:14And that it was harder for criminals to open accounts on Telegram than other platforms.
00:57:23But in many cases, we saw in the archived posts that users had just opened new accounts and new channels.
00:57:30And continued posting about assassinations.
00:57:33The nature of Telegram as an app is that many of the channels are highly ephemeral in nature.
00:57:39They come and go and are able to be easily replaced.
00:57:42It's very easy to sidestep enforcement attempts that really try to use a whack-a-mole approach to a takedown.
00:57:55The targeting of Americans triggered an international criminal investigation involving law enforcement in the U.S. and Europe.
00:58:03In May 2022, Slovak Bro was arrested and was ultimately sentenced to six years in prison for more than a hundred terrorism offenses.
00:58:12But the Terrorgram Collective lived on.
00:58:21Terrorgram is not reliant on any one individual or entity to make it what it is.
00:58:26What we often see is these leaders in the Terrorgram space come and go.
00:58:33They fall off.
00:58:34They build back brands.
00:58:35They gain prominence.
00:58:36They lose prominence.
00:58:37It's very much a fluid environment where no one person owns a commanding stake of it.
00:58:43Researcher Matt Kriner has studied how other Terrorgram leaders emerged after the arrest of Slovak Bro, including one known as Miss Gorehound.
00:58:53Miss Gorehound is one of the aliases that we know to be a central figure within the Terrorgram ecosystem.
00:58:59She has been a strong proponent of the development of the Terrorgram publications, ran a number of channels that had direct influence and ownership over the saints' culture.
00:59:10Miss Gorehound picked that up and said, this is a model.
00:59:13We can actually turn this into a very consolidated pipeline for radicalization and mobilization of individuals to carry out more terrorist attacks.
00:59:21So there's a lot of overlap between these people.
00:59:28Miss Gorehound, otherwise known as RWBC, standing for Right Wing Book Club, who runs another range of Telegram channels in the Terrorgram network.
00:59:38We can select that and we can see that she's got her Cat and Joy's Anonymous channel.
00:59:43She's got the Right Wing Book Club one.
00:59:45She's got Rider88, Rider Returns, Miss Gore88.
00:59:50Now, when we highlight these groups, again, we get a lot of shared channels with Miss Gorehound.
00:59:57This is a closed chat, but these individuals are really active in it.
01:00:04The database showed that by mid-2022, Juraj Krajcik was a member of a group chat run by Miss Gorehound.
01:00:12So if you look at someone like Miss Gorehound, they're super connected because they're creating loads of channels and they're infiltrating loads of channels.
01:00:20Whereas someone like Krajcik is the opposite of that, in a way.
01:00:25They're just really desperate to get into this.
01:00:27They found a wonderful world of friendly, like-minded people with their funny memes that they can consume.
01:00:36And, you know, it gives them a sense of camaraderie and belonging.
01:00:39What you see in the chats that I've read is that he's getting ideas from the older people.
01:00:46He's expressing his desire to target specific targets and he's being pushed in this very violent direction.
01:00:53And also this direction of sort of self-immolation that is heroic to go and kill and then kill yourself.
01:01:02Miss Gorehound is alleged to be this woman, Dallas Erin Humber.
01:01:07She's pleaded not guilty to terrorism charges and is in jail awaiting trial.
01:01:12Her identity was originally exposed and posted online by a group of activists.
01:01:19We felt that people needed to know who these Nazis were.
01:01:23One of them agreed to speak to us if we granted them anonymity.
01:01:27Tell me about Dallas and her life.
01:01:30She is a 35-year-old woman from Elk Grove, California.
01:01:36She considers herself an artist.
01:01:40What was her role in the Terrorgram Collective, from what you can tell?
01:01:45It looks like she started as the narrator of mass murderers' manifestos.
01:01:54It was a new kind of propaganda.
01:01:56Manifesto audiobooks.
01:01:58Any mass murder manifestos she got her hands on, she would turn into an audiobook and put it out.
01:02:05Mass immigration will disenfranchise us, subvert our nations, destroy our communities.
01:02:11The shooter audiobooks became a signature.
01:02:14Long before low fertility rates ever could.
01:02:17The audiobooks were posted to Terrorgram channels linked to her.
01:02:21How important would you say she was in the Terrorgram Collective and the Terrorgram scene?
01:02:29I think initially she was just a mouthpiece.
01:02:33But over time, as certain members of Terrorgram started to get doxxed or arrested, it created this vacuum.
01:02:42And in that vacuum, Dallas Humber managed to carve out a niche for herself.
01:02:48And through that, she absolutely came up to lead that collective.
01:02:59We could see in the trove of Terrorgram chats that Miss Gorehound was working with another prolific propagandist with the username BanThisChannel, or BTC.
01:03:08We can zoom in on an individual.
01:03:13So here we've got BTC, which is the alias behind several Telegram channels, which all have similar initials.
01:03:21So BoldTurdsCoin, BanThisChannel, and Big **** Chicken.
01:03:27Now, BTC is a super spreader in terms of Telegram group membership.
01:03:33BTC was the white whale of the Terrorgram Collective.
01:03:42Everyone knew about his posts, but no one knew who he was.
01:03:47And this, for us, was a big puzzle.
01:03:50So BTC first emerged on YouTube, and his schtick was creating content that was controversial enough to get him banned.
01:04:02He created dozens of channels and groups on Telegram.
01:04:08I wasn't sure whether BTC was one person, whether it was maybe a small group of people that were putting these videos together.
01:04:20When did you first become aware of BTC?
01:04:22I first became aware of BTC through a specific video that BTC produced called The Last Battle on a Telegram channel that I was monitoring.
01:04:31And then started seeing other videos circulating on Telegram that were also allegedly made by this BTC.
01:04:40We're building, we're organizing.
01:04:42They served the purpose of propaganda to do things like recruit new people and sustain members, in this case, encouraging violence.
01:04:50BTC posted around 120 videos, many of them with graphic, racist, and anti-Semitic violence, clips of the Christchurch attack video, and homophobic imagery.
01:05:09Terrorgram's most infamous video was a BTC and Miss Gorehound collaboration.
01:05:18We know that the narrator is Dallas Humber.
01:05:20105 white men and women of action have taken...
01:05:23So white terror is meant for somebody who's been indoctrinated.
01:05:26And the push is, if you do commit an act of violence, this is what you can expect.
01:05:31You will be celebrated too.
01:05:33Breaking news right now, the Justice Department charging two people, saying that they were leading a white supremacist group that wanted to ignite a race war...
01:05:48In September 2024, the mystery of BTC's identity was solved.
01:05:54The indictment charges in 15 counts...
01:05:57Federal prosecutors announced they had arrested two leaders of the Terrorgram Collective.
01:06:02One was Dallas Humber.
01:06:05The other, a 37-year-old man named Matthew Allison.
01:06:10We were very familiar with Dallas Humber.
01:06:13The second person was a 37-year-old we'd never heard of.
01:06:19But as we talked with researchers over the next day, it became pretty clear to us that Matt Allison was none other than BTC.
01:06:29Allison lived in Boise, Idaho.
01:06:36We wanted to find out who Matt Allison really was.
01:06:42We went to Boise and started talking with people who knew Matt Allison in the real world.
01:06:48Hi, I'm James Bandler. You're a friend of Allison?
01:06:53And the story that we unraveled was frankly very disturbing because this man was living a complete double life.
01:07:03We learned that in public, Matthew Allison was an aspiring DJ who worked menial jobs and partied with friends in the local electronic dance music community.
01:07:18But federal investigators say that in private, he was helping run the Terrorgram Collective.
01:07:25We quickly got a picture of a person who was very well-liked, who hung out with a very multicultural group of friends, and a person who was also a gay man.
01:07:38So, while he was living the life of a gay man in Boise, on Telegram, he was making posts celebrating the murders of gay people in Bratislava.
01:07:52Matthew Allison agreed to talk to James against the advice of his lawyer.
01:08:13No cameras were allowed in jail.
01:08:20After the interview, James called me.
01:08:24Hey, so what happened?
01:08:26So, he came out.
01:08:29He's exactly like we were told, tall, rangy, skinny guy.
01:08:34He confirmed, he said he didn't confess, like, to the crimes.
01:08:38He just confessed to being BTC to the FBI.
01:08:42So, he said he urged people to just be legal on the channel and to not actually incite actual violent acts.
01:08:51He chalks it up to being an artist.
01:08:53This is my free expression.
01:08:54I'm an artist.
01:08:55And he's really proud of a lot of his work.
01:08:58So, he's not contrite.
01:09:00He's not taking it back.
01:09:02No, no, no.
01:09:03I don't think he's contrite at all about it.
01:09:07And I think he's going to push a hardcore free speech, you know, case on it if he can.
01:09:13He described himself as a video artist, a person who was creating art and content that was protected under the First Amendment.
01:09:23He denied being a terrorist.
01:09:26He denied wanting to incite people to commit murder.
01:09:32He admitted that he was an ethno-nationalist.
01:09:36And he says, I believe white people need to band together, meaning to tribe up.
01:09:40Tribe up is how he put it.
01:09:42And he called the indictment bull****.
01:09:49Prosecutors allege that Dallas Humber and Matthew Allison were both in direct contact with the Bratislava shooter, Juraj Krajcik, in the year before his attack.
01:10:00The indictment accuses Humber of pushing a message in her posts.
01:10:04To become a Terrorgram saint, you had to be white and kill those deemed inferior.
01:10:11The mission was clear.
01:10:13To indoctrinate younger people on Terrorgram.
01:10:18She allegedly wrote about one user.
01:10:20He's like 18 years old and seems very impressionable.
01:10:23I'm trying to radicalize him.
01:10:26I found posts from July 2022 where she was promoting a new Terrorgram publication.
01:10:33Soon after its release, Juraj Krajcik was talking to her about it.
01:10:38Look at this.
01:10:40He says, just finished reading ****.
01:10:42It's excellent.
01:10:46And then this is the US person, the Terrorgram member Dallas, who says, I haven't finished reading it yet, but it's a literary and artistic masterpiece.
01:10:52And she says, what were some of your favorite passages?
01:10:55I know that Juraj Krajcik and Dallas Humber were in contact.
01:11:01And I know that Dallas Humber was repeatedly encouraging Krajcik to engage in terrorism and cheerleading for terrorism.
01:11:11That's her own words.
01:11:12She says, I love those two.
01:11:13I hope the next saints out there read those passages and feel inspired.
01:11:17I know.
01:11:22At the same time, Juraj Krajcik had started posting on a private Twitter account.
01:11:27That's the Twitter?
01:11:28Yeah.
01:11:29We have all the screenshots.
01:11:32It was the diary of a fully radicalized accelerationist.
01:11:36We find ourselves in a critical situation.
01:11:39We will need to radically alter the course of history.
01:11:42And only radical action can do that.
01:11:45In mid-August 2022, Krajcik starts to do reconnaissance.
01:11:50He posts about targeting the Prime Minister of Slovakia.
01:11:54Hashtag Eduard Heger, Hashtag NTMA 0315, Test and Preparation.
01:12:02There were several targets.
01:12:06The perpetrator made a list of targets based on the difficulty to commit a successful attack.
01:12:14Jakub Gajdoš was a counterterrorism official assigned to the Krajcik case.
01:12:19Preparing an attack, he took several pictures of the potential targets.
01:12:23The highest-ranked was the place for hitting the Prime Minister.
01:12:26And I think it's this one.
01:12:28Because you can identify the bush.
01:12:30And identify the two windows that are on the picture, actually.
01:12:34Oh, yeah.
01:12:35It looks like it.
01:12:38August 15th, Krajcik tweets three selfies in front of the home of the Prime Minister,
01:12:44along with the caption.
01:12:46Just taking a look at some places.
01:12:48In his rankings of the targets, an attack on a Jewish community
01:12:52should follow.
01:12:58Later that day, Krajcik tweets sarcastically.
01:12:58As a proud LGBTQIAP plus Jew, I would like you all to join me at the Chabad office today,
01:13:04before we proceed to the LGBTQIAP plus bar for a drink.
01:13:10Attached are two more selfies.
01:13:12One in front of a Jewish community center.
01:13:15The other in front of his final target, the Tepláreň bar.
01:13:21In the coming weeks, his tweets reflect his commitment to violence.
01:13:25I don't expect to make it.
01:13:28In all likelihood, I will die in the course of the operation.
01:13:31September 6th.
01:13:33I want to damage the system to the best of my abilities.
01:13:36September 16th.
01:13:38Accelerationism means helping the system collapse faster.
01:13:42October 10th.
01:13:43People who I consider heroes and role models, Brenton Tarrant and John T. Ernest.
01:13:49And finally, October 11th.
01:13:53I have made my decision.
01:14:06The next night, when Krajcik was on the run after his attack,
01:14:11U.S. prosecutors say he sent a direct message to Matthew Allison, BTC.
01:14:16He wrote,
01:14:19Not sure how much time I have, but it's happening.
01:14:23They allege he also sent Allison his manifesto.
01:14:26It was posted on several accounts linked to BTC.
01:14:31Miss Gorehound's accounts also announced the news.
01:14:35Juraj Krajcik was hailed as Terrorgram's very first saint.
01:14:39We mourn Saint Krajcik's death, but his life is true.
01:14:42A few days later, she released the manifesto as an audiobook.
01:14:45His manifesto is absolutely fire.
01:14:48Here was someone that they successfully indoctrinated and encouraged to kill,
01:14:53and he'd gone out and done it.
01:15:08By the time of the Bratislava attack, law enforcement was already closing in on the Terrorgram Collective.
01:15:17The downfall of the Terrorgram Collective starts with the arrest of Pavel Benedikt.
01:15:17He gets arrested in 2022, and then you see a cascade of arrests around the world of people who are both collective leaders and people who are responding to the collective's propaganda and part of this broader Teragram community.
01:15:34The FBI disrupted a plot.
01:15:35An 18-year-old is accused in a plot to destroy a PSE&G power plant.
01:15:40Authorities would arrest around a dozen people allegedly tied to the group, including Dallas Humber and Matthew Allison.
01:15:49They were charged with leading a transnational terrorist organization, encouraging hate crimes and terrorist attacks.
01:15:56Are these arrests the end of Terrorgram?
01:15:59You may have a collapse specifically of this particular network, but is that the end? Absolutely not.
01:16:05Neo-Nazis walk through Nashville.
01:16:08Conspiracy by white supremacists to attack a portion of the Northwest Power Grid.
01:16:12These groups come and go all the time. Individuals come and go all the time.
01:16:16The group chanting white power and other racially provocative language.
01:16:21And unfortunately, there's no indications that these ideas and emotions and practices that are associated with things like Terrorgram are going anywhere.
01:16:30Hate crimes have been on the rise for the past several years.
01:16:33Especially given the broader climate that exists within our society.
01:16:38There will be new Terrorgrams that take its place by another name.
01:16:42And we will continue to see this kind of extremism propagated through platforms of various sorts, not just Telegram.
01:16:51The US, UK and Australia have officially designated Terrorgram as a transnational terrorist group.
01:16:58And Telegram, the platform, is facing its own troubles.
01:17:05Pavel Durov has been arrested for allegedly failing to moderate criminal activity on the platform.
01:17:10So in August 2024, we saw the founder of Telegram, Pavel Durov, arrested and charged on the basis of alleged complicity in the spread of a range of different illegal content on this platform.
01:17:21From child sexual abuse material through to engagement in drug trafficking.
01:17:25This is a hugely significant moment in really showing how far the platform had come and how authorities were frustrated by its unwillingness to take actions.
01:17:35This is really unprecedented.
01:17:38After Durov's arrest, he publicly pledged to work more closely with law enforcement on action against illegal content on the platform, including providing IP addresses in cases of legitimate warrants being served.
01:17:51Durov has called the charges misguided for trying to hold a CEO personally responsible for crimes committed by a third party.
01:18:04Telegram told us while it is still dedicated to upholding free speech, it now has around 750 contractors to moderate problematic content.
01:18:16It said since 2022, it has partnered with a counter extremism group to remove more than 129 million pieces of content.
01:18:25Telegram is still one of the more lawless places.
01:18:32The Telegram channels, a lot of them have disappeared or gone dark, but you can still find plenty of awful content on Telegram.
01:18:43Even five months after Matthew Allison was arrested, you could still go on Telegram and find all his racist videos, the ones urging people to kill, the assassination manuals.
01:18:56It wasn't until we actually pointed this out to Telegram that they took this content down.
01:19:04Texas and Florida both passed laws restricting tech platforms from moderating content and blocking political views.
01:19:16Does the First Amendment protect these companies Facebook, TikTok, YouTube from curating content?
01:19:23For years now, social media companies have sort of been trying to balance two things, how to make their platforms safe and then also how to give people a place where they can express themselves.
01:19:35Republicans have argued there's too much content moderation on social media. Democrats often say there is not enough.
01:19:42The debate recently has been about whether the policing of content has gone too far and whether content moderation basically amounts to censorship.
01:19:54Hundreds of Twitter accounts belonging to far right activists and QAnon theorists have been reinstated.
01:20:00That debate is going to keep going.
01:20:03We're going to get rid of fact checkers and replace them with community notes.
01:20:07But this is what the Telegram story is about: the extreme end of all this.
01:20:13What do you do as a company when you have people on your platform saying let's go kill folks, let's assassinate people, let's do sabotage and terrorism?
01:20:25How do you deal with that?
01:20:31We call upon governments around the world to bring an end to hate speech and the politics of fear.
01:20:43What we've seen through the Telegram story is that there are consequences to unfettered free speech,
01:20:50to having influencers out there advocating violence or mass murder.
01:20:58Thousands of people gathered at a vigil in Slovakia to commemorate two people killed outside of gay bar.
01:21:05There is no doubt we are in mourning.
01:21:07They were killed out of prejudice.
01:21:11If you look at what happened over the last five years, you have to ask a question.
01:21:17What have we reaped as a result of this?
01:21:21What new whirlwind are we throwing ourselves into?
01:21:26I don't know if you can ever put it completely behind you,
01:21:38or that you will ever be completely okay.
01:21:44I can't get inside those people's heads.
01:21:47What is the idea? That I'm going to take a weapon
01:21:51and I'm going to shoot at people?
01:21:54I'm going to kill someone.
01:22:00The question is why?
01:22:24For more on this and other Frontline programs,
01:22:26visit our website at pbs.org slash frontline.
01:22:30The Frontline