Grace Tame gives further insight after today's national roundtable in Parliament House, where child safety advocates discussed the threat artificial intelligence poses to child safety. Tame is an advocate for survivors of child sexual abuse and a former Australian of the Year.

Transcript
00:00The nature of it, the prevalence, is expanding exponentially.
00:06We're seeing things like automated grooming, the creation of synthetic child sexual abuse
00:12material which still causes harm to children and to the general public because data sets
00:17are trained on pictures of real children.
00:21We're seeing offenders using chatbots that instruct them, if they are caught by
00:31law enforcement, how to evade justice and things like that.
00:37And we're seeing that encrypted tools are enabling undetectable crime.
00:42Take me through some of the risks posed by AI tools in this space and how widespread it
00:48is.
00:49Are there figures of 24 million visits to websites like this?
00:55In the last year alone, there's been a 1,325% increase in CyberTip reports of child sexual
01:05abuse material because of things like generative AI tools that are so readily available that
01:11offenders can actually download from the web onto their high-speed devices and they can
01:17create and distribute child sexual abuse material undetected.
01:22As I said, we're seeing things like automated grooming, which expedites the process of coercing
01:29a child and exposing them to exploitation and things like that.
01:35We're also seeing, as I said, the instantaneous and prolific creation of very
01:42real-looking child sexual abuse material, which can also derail investigations
01:47that law enforcement are trying to conduct into real child sexual abuse material.
01:53And it's very hard to tell the difference even to a trained eye between some of this synthetic
01:58material and real child sexual abuse material.
02:01Yeah, take me through some of the limitations that are there in terms of police and law enforcement
02:07also using AI in their own investigations.
02:11Are you calling for changes there too when it comes to how they investigate child abuse
02:16material?
02:17There are limitations because of privacy laws and similar policies.
02:23Unlike the Australian jurisdiction specifically, other jurisdictions around the globe
02:28are empowered with victim identification tools that expedite the process of investigations
02:34and are able to rescue children far quicker than we can here in Australia,
02:39because we're limited in what tools law enforcement can actually use in those processes.
02:44So there is scope for governments to enable law enforcement, specialised law enforcement,
02:50not all law enforcement, but specialised law enforcement to use these tools for the purpose
02:56of protecting and rescuing children.
02:59And what kind of responsibility do tech companies carry here?
03:03And what exactly are the changes that you would like to see by the government?
03:07Tech companies themselves have a huge responsibility.
03:12They have largely avoided accountability for any of the harms created by their platforms
03:18since the internet first became a publicly available domain.
03:23There needs to be a duty of care on the tech products themselves.
03:26It shouldn't be up to individuals to detect and report their own abuse.
03:32It is possible to build safeguards into online platforms.
03:38And, you know, we need to see safety by design.
03:42There's a huge onus that needs to be put on the tech sector.
