The Australian Federal Police has teamed up with Monash University to develop an artificial intelligence tool that can help detect child sexual abuse material online. The use of AI would be a significant breakthrough for investigators, but they are calling on Australians to submit photographs of themselves in their youth to train the program.

Transcript
00:00 So, developing AI tools is really important.
00:04 Emerging technologies are out there being used by criminals, and the AFP, partnering
00:09 with Monash University, wants to stay a step ahead.
00:12 But we want to develop ethically sourced AI tools, so we're asking adults
00:17 to give us their childhood photos, because they can consent to provide those photos to
00:22 the project, in all different kinds of circumstances.
00:25 And hopefully those images will train these tools so that we can combat child exploitation.
00:29 They can be submitted through the mypicturesmatters.org website, and those images will
00:36 be used to train the algorithm to detect other children in other circumstances.
00:43 We need about 100,000 images to train the tools effectively.
00:49 It's really important to have children in different settings, different backgrounds,
00:53 different skin tones.
00:54 So anything that adults want to consent to give us, it will be helpful.
00:58 What it comes down to is the tool's ability to look at different scenarios
01:05 of images, skin tones and circumstances, and for that the tool needs to be trained.
01:11 It's a bit like putting everything into a database, or like statistics:
01:16 the more data you have, the better the outcome.
01:18 And that's what the tool will do.
01:19 It will hold all these images in a really safe way at Monash University, and then
01:25 be able to scour other seized material that police have.
01:29 And we're getting volumes of it, like hundreds of thousands of images.
01:33 So this tool will be able to scan all of that and tell us, with a degree of
01:38 certainty, whether it might be child abuse material.
