How to stay safe on social media by the Head of Safety at Meta
Brut America
9/24/2023
Online harassment, bullying … How do social networks protect us from it all?
Brut spoke to Antigone Davis, Vice President and Global Head of Safety at Meta (Facebook, Instagram, WhatsApp), about what goes on behind the scenes at her job ...
Category: 🤖 Tech
Transcript
00:00
Take a teen: you're at school, and people have been making fun of your outfit all day long.
00:05
You come home, someone hits you up on one of our apps, and they say, "Nice outfit."
00:10
We can't know that that is bullying, but you do, and it hurts.
00:15
And so we have a filter called Hidden Words, where you can put in
00:19
words that are particular to what someone might be doing that could be harmful, or that you
00:24
might feel sensitive about, and those comments will be filtered out of
00:28
your experience.
00:29
So what we want to do is create the right rules, but also give you the tools so you can personalize
00:34
that experience and make it a fun, positive, and safe place for you.
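The Hidden Words feature described above is, at its core, a per-user word filter applied to incoming comments. Here is a minimal sketch of how such a filter could work, assuming a simple whole-word match against a user-maintained list; the class and method names (HiddenWordsFilter, add_word, filter_comments) are invented for illustration and are not Meta's implementation.

```python
import re
from dataclasses import dataclass, field


@dataclass
class HiddenWordsFilter:
    """Per-user comment filter: hides comments containing any user-chosen word.

    Illustrative sketch only, not Meta's actual Hidden Words implementation.
    """
    hidden_words: set[str] = field(default_factory=set)

    def add_word(self, word: str) -> None:
        # Store words case-insensitively so "Outfit" and "outfit" both match.
        self.hidden_words.add(word.strip().lower())

    def is_hidden(self, comment: str) -> bool:
        # Match whole words so "class" does not also hide "classic".
        tokens = re.findall(r"[a-z0-9']+", comment.lower())
        return any(token in self.hidden_words for token in tokens)

    def filter_comments(self, comments: list[str]) -> list[str]:
        # Return only the comments the user should still see.
        return [c for c in comments if not self.is_hidden(c)]


# Example: a teen who is sensitive about remarks on their outfit.
f = HiddenWordsFilter()
f.add_word("outfit")
print(f.filter_comments(["nice outfit", "great goal today!"]))  # ['great goal today!']
```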
00:38
How does it work behind the scenes?
00:40
Beyond AI, are there people, humans, who sit behind screens and sift through
00:46
these reports, or has it all become automated?
00:49
No, it's a combination of artificial intelligence and human review
00:56
that we use to moderate the platform and create a safe space.
01:00
There's some content that is obviously going to be
01:04
violating our policies, and some content where it may not be as obvious.
01:08
The artificial intelligence helps us find the content that clearly violates our
01:13
policies and remove it, but there are instances where it's not clear, where the only
01:17
way to know for sure is for a human to review it and take a little bit more time.
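The split she describes, AI for the clear-cut cases and humans for the ambiguous ones, is commonly implemented as confidence-threshold triage. The sketch below assumes a classifier that returns a policy-violation score between 0 and 1; the thresholds and function names are invented for illustration, not Meta's actual pipeline.

```python
from typing import Callable

# Thresholds are invented for illustration; a real system would tune them per policy.
AUTO_REMOVE_THRESHOLD = 0.95   # clearly violating: remove automatically
HUMAN_REVIEW_THRESHOLD = 0.60  # uncertain: send to a human reviewer


def triage(content: str, score_fn: Callable[[str], float]) -> str:
    """Route content based on a model's policy-violation score.

    Returns one of "auto_remove", "human_review", or "keep".
    """
    score = score_fn(content)
    if score >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"   # AI handles the obvious violations at scale
    if score >= HUMAN_REVIEW_THRESHOLD:
        return "human_review"  # borderline cases get a human's judgment
    return "keep"


# Example with a stand-in scoring function.
fake_scores = {"obvious abuse": 0.99, "ambiguous joke": 0.70, "nice photo": 0.05}
for text in fake_scores:
    print(text, "->", triage(text, fake_scores.get))
```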
01:22
Correct me if I'm wrong, but I think sometimes we can have the impression that Meta and other
01:26
platforms act in response to things that are happening rather than in a proactive way.
01:34
Would you agree or disagree and how is Meta trying to proactively tackle any issues that
01:41
users might have on their platforms?
01:44
What I would say is that we are constantly looking at our platform to see how we can
01:49
create a safe and positive experience.
01:52
We are also taking feedback and responding to people's experiences, and their experiences
01:57
are evolving as the apps are evolving, and so, as a result, we evolve.
02:04
Sometimes that can take time, which is why I think people sometimes think that we may
02:07
be reactive.
02:09
When we build a tool, we put a lot into building that tool.
02:13
We talk to experts, we talk to users, and we may test the tool to see what makes the most sense
02:20
or is easiest to use.
02:21
How does the testing work?
02:23
I'll give you an example.
02:24
When we were building our parental supervision tools, we sat down with parents and with teens,
02:30
and we wanted to understand from parents what would be useful to them.
02:35
We heard things like: I want to be able to manage my teen's time, I want to have a sense
02:39
of who they're engaging with, who's following them.
02:45
That was the guidance that we heard from them.
02:46
We also talked to experts about how to build these in a way that teens are going
02:50
to want to use them and not push them away.
02:54
How do we create the right balance?
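The guidance she mentions, managing a teen's time and seeing who follows them, maps onto a small set of supervision settings. The sketch below is purely illustrative: the field names (daily_limit_minutes, can_view_followers) are invented and do not reflect Meta's actual supervision features or data model.

```python
from dataclasses import dataclass


@dataclass
class SupervisionSettings:
    """What a parent can see and set, per the guidance described above.

    Illustrative only; field names are invented, not Meta's schema.
    """
    daily_limit_minutes: int = 60     # cap on the teen's daily time in the app
    can_view_followers: bool = True   # whether the parent can see the follower list
    minutes_used_today: int = 0


def time_remaining(settings: SupervisionSettings) -> int:
    # Minutes the teen has left today under the parent's limit.
    return max(0, settings.daily_limit_minutes - settings.minutes_used_today)


def followers_visible_to_parent(settings: SupervisionSettings, followers: list[str]) -> list[str]:
    # Share the follower list with the parent only if that has been agreed to.
    return followers if settings.can_view_followers else []


s = SupervisionSettings(daily_limit_minutes=90, minutes_used_today=75)
print(time_remaining(s))                                    # 15
print(followers_visible_to_parent(s, ["friend_a", "friend_b"]))
```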
02:56
Taking all of that information in, we then started developing our tools, and what we often
03:02
do is launch a tool out to a smaller group of the population, test it, see how
03:07
people respond, and maybe make some changes to ensure that we get the tool
03:13
to a place where people find it valuable.
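Launching a tool to a smaller group of the population first is typically done with deterministic percentage bucketing. This is a generic sketch of that rollout technique, assuming a stable user ID; the function name and percentages are invented and say nothing about how Meta actually runs its experiments.

```python
import hashlib


def in_rollout(user_id: str, feature: str, percent: float) -> bool:
    """Deterministically place a user in (or out of) a feature's test group.

    Hashing user_id together with the feature name yields a stable bucket in
    [0, 100), so the same user always gets the same answer for that feature.
    """
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 10000 / 100.0  # 0.00 .. 99.99
    return bucket < percent


# Example: enable a hypothetical "parental_supervision_v2" for roughly 5% of
# users first, watch how they respond, then raise the percentage.
users = ["user_1", "user_2", "user_3", "user_4", "user_5"]
print([u for u in users if in_rollout(u, "parental_supervision_v2", 5.0)])
```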
03:16
How can people find out about all of these tools?
03:20
Where concretely can they go on the platforms to find out what they can use to limit their
03:25
time or block out certain kinds of content?
03:29
If you can sum it up in two sentences, where can they go to find the tools they need to
03:35
stay safe?
03:36
First and foremost, go to your settings.
03:39
You're going to find a lot there.
03:41
The second place that I would say to go to is the Family Center.
03:45
Particularly for parents, you will find tips there.
03:48
You'll find explanations for all our tools there.
03:51
I would say go to your settings, go to our Family Center, and you can always go to our
03:58
Safety Center.
03:59
Those three things should really give you a full picture of the things that are out
04:02
there.
04:03
How then do you explain that, despite all of these tools, there are still so many people
04:07
who feel like they are being harassed, or can be harassed, on Meta's platforms?
04:15
The internet is a very big place.
04:19
Just like in the offline world, you are going to have people who are going to try to do
04:24
bad things on the platform.
04:26
There's no chance that you'll get to a place where there's zero; just like in the offline
04:31
world,
04:32
there's no chance we're going to get to a place where there's zero opportunity for somebody
04:36
to do something that would bother or harm another person.
04:41
That said, we are using artificial intelligence to find things at scale.
04:46
We're using human review to find things that are specific.
04:50
We're giving people controls so they can personalize their experience where we may not
04:54
have the information.
04:56
We're fundamentally committed to creating that safe and positive experience and learning
05:01
from our users where we can be doing better and evolving.
05:04
Many studies have shown that spending too much time online can have a negative impact
05:09
on our mental health, on self-esteem, etc.
05:13
So would Meta be in favor of users spending less time on social networks?
05:20
What is in it for Meta to encourage users to spend less time on its platforms?
05:25
It's one of the, I think, bigger misconceptions about what we want for our users.
05:32
We want people to have a positive experience when they're
05:38
on the app, but also, when they walk away from the app, to feel like they had a positive experience,
05:41
that they haven't spent too much time on the app, and to be able to manage their time.
05:46
We have different tools for that.
05:48
So Take a Break, for example, is something we'll use where, if someone, a teen, has been on
05:52
for a while, we'll say, hey, would you like to take a break?
05:56
We have another tool called Quiet Mode.
05:59
To really encourage teens, if it's nighttime and they want to turn things off,
06:05
Quiet Mode is something they can use.
06:07
It allows them to send an auto-reply to people, letting them know that they're taking a break
06:12
and are not on their device, but it also lets them, when they go back, see who may have
06:18
been in touch, so they don't have that fear of missing out on what was happening
06:23
online.
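Both tools she describes, the break prompt and Quiet Mode, come down to time-based rules: nudge after a long session, and during quiet hours send an auto-reply while remembering who reached out. This sketch illustrates those mechanics with invented thresholds, hours, and message text; it is not Meta's implementation.

```python
from datetime import datetime, time

BREAK_NUDGE_MINUTES = 20  # invented threshold for the "take a break" prompt


def should_suggest_break(minutes_in_session: int) -> bool:
    # After a long continuous session, prompt: "Would you like to take a break?"
    return minutes_in_session >= BREAK_NUDGE_MINUTES


def quiet_mode_active(now: datetime, start: time = time(22, 0), end: time = time(7, 0)) -> bool:
    # Quiet hours that span midnight, e.g. 22:00 through 07:00.
    t = now.time()
    return t >= start or t < end


def handle_incoming_message(sender: str, now: datetime, pending: list[str]) -> str | None:
    """If quiet mode is on, auto-reply and remember the sender for later."""
    if quiet_mode_active(now):
        pending.append(sender)  # so the teen can see who reached out when they return
        return "I'm taking a break right now and will reply when I'm back."
    return None  # deliver normally, no auto-reply


pending: list[str] = []
print(should_suggest_break(25))  # True
print(handle_incoming_message("friend_42", datetime(2023, 9, 24, 23, 15), pending), pending)
```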
06:24
We've done that knowing that it would reduce the time, and in many cases it has reduced
06:29
the amount of time that people are spending on the platform.
06:32
The idea is not to keep you here for as much time as is humanly possible; it's actually to have
06:37
a positive time, to have a positive relationship with our app.
06:41
Enjoy the time that you're on, build the community, discover the things that interest you.
06:46
But walk away, go explore, take that out into the offline world too.
06:50
We really want to give people that opportunity to kind of manage and feel good about their
06:54
time online.