Nicola Henry is a gender-based violence legal expert at RMIT and says better guardrails are needed surrounding AI tools.

Transcript
00:00So these purpose-built tools, there are different ways in which
00:06sex offenders can access them. Some of the tools are freely available on legitimate websites
00:13that host open source software. So on the surface, they're designed for harmless or creative uses to
00:19generate new content, but with a few extra steps, people can tweak or they can fine tune
00:24those tools to generate illegal content, including child abuse content. And then there's also freely
00:30available AI nudify tools that aren't actually illegal when they're marketed for adults,
00:38but they are problematic because they can be used to generate non-consensual
00:42intimate image content as well as child sexual abuse material. So these tools can be used to
00:47basically transplant a person's face onto an existing image to create abuse imagery.
00:54So we just heard there from independent MP Kate Chaney, this bill will create new offences
00:59targeting the misuse of these emerging technologies. Child safety experts say this
01:05addresses an urgent gap here. Can you tell us about that gap?
01:09So the gap is that currently in Australia, as the package just outlined, it is a criminal
01:14offence to access, possess, solicit or share child sexual abuse material. That includes fictional
01:20fake images, so drawings and cartoons, as well as images that are generated using AI, all
01:25criminalised under the Federal Criminal Code Act in Australia with a maximum penalty of 15
01:31years. Kate Chaney's bill would make it a criminal offence to download, access, supply or offer access
01:37to tools where the dominant or the sole purpose is the creation of child abuse material. And the bill
01:45would also make it a criminal offence to collect, scrape or distribute data to train the technology to create
01:52child sexual abuse material. There is a limited defence where law enforcement are investigating, so there are
01:59defences in place. And this really follows what's happening in the UK with the introduction of their bill earlier this
02:05year to criminalise AI tools as well.
02:09And you just addressed a moment ago image-based abuse of adults, you know, such as non-consensual
02:14pornography. We know that's also on the rise. Should those tools also be criminalised?
02:21I think they should. I think that these tools should be criminalised. In some countries, like the UK,
02:28even some of the nudify tools for creating adult content have been banned.
02:36So there is a case to be made to have them banned here as well. I think that they can easily be circumvented
02:45to create child sexual abuse material, as well as non-consensual intimate images
02:51of adults as well. As to this legislation that is being put forward in this bill though,
02:58has overseas experience informed that, do you think?
03:02Yes, overseas. The UK, as I mentioned, has introduced a bill earlier this year.
03:08They also proposed to criminalise the possession of instruction manuals, also known as paedophile manuals,
03:18which can result in up to three years jail time. That's what's proposed in the UK.
03:23The UK also targets website operators facilitating grooming or sharing, with up to 10 years' imprisonment,
03:30and allows border forces to inspect devices suspected of harbouring such content. But also internationally as well,
03:37we have seen reports from organisations such as the Internet Watch Foundation,
03:42which reported a 380% increase in AI-generated child sexual abuse material in 2024, compared to the
03:52previous year. So we are seeing rises in the numbers of reports. But there's also a lot of unknown
03:58information here as well. We don't know what the scope is of non-consensual
04:08image generation and the use of nudify tools, because a lot of it happens behind closed doors.
04:15And aside from those limitations we discussed a moment ago of adult material, are there any
04:20limitations here or unintended consequences that you would have any concerns about?
04:24The thing that I'm most concerned about is the criminalisation of children. So young people,
04:30for example, experimenting with these tools which are freely available and creating their own content
04:36or creating content of their classmates, because they think it's funny or because they've got a crush
04:41on somebody. Of course, there are real harms when they do that, and I'm not meaning to undermine
04:47those harms. But I think we do really need to be careful about criminalising young people who are
04:51experimenting with the tools. And that's another reason why I think we could ban these tools.
04:56The other thing I do worry about, I guess, is that sometimes these nudify tools can be used by
05:02consenting adults where everything is consensual. And that's why it's really important to have the
05:07guardrails in place so that these tools can't be used to create non-consensual images of adults
05:14or child sexual abuse material of children.
