It was a nightmare made real when Mathilda Huang's photos were taken without consent, doctored into sexual imagery, and shared online.

This counts as image-based sexual abuse, or IBSA, which last year was the top concern among youth surveyed about online harms.

Mathilda shares her experience as a victim of IBSA and Natalie Chia of SG Her Empowerment gives insights into what victims can do to deal with perpetrators, and what more has to be done to combat such abuse.

This video was independently created by, and is owned by, The Straits Times as part of the Singapore YouTube Creators for Impact program.

The views, ideas, and opinions expressed therein are The Straits Times' own and are not endorsed by, or representative of, YouTube or Google in any way.

WATCH MORE: https://thestartv.com/c/news
SUBSCRIBE: https://cutt.ly/TheStar
LIKE: https://fb.com/TheStarOnline
Transcript
00:00 In August 2023, I received a DM on my Instagram account where this person reached out to me and
00:06 said, hey Mathilda, sorry if I'm asking you this, but I think someone misused your picture as a nude,
00:11 saying that it's from your OnlyFans, and posted it online. So I found the link myself and I saw
00:17 over a hundred images that were posted, two of which appeared to show me nude,
00:23 which I am not in the originals. So this was my first ever brush with IBSA.
00:30 Image-based sexual abuse, or IBSA, is the non-consensual creation,
00:38 obtainment and/or distribution of sexual images and videos, and also includes threats
00:43 to carry out the above. If you don't do what I say, I'll send this to everyone.
00:53 These can apply to content taken with consent, like privately owned photos and videos, or those
00:58 taken without consent, such as via upskirting or hidden cameras. The non-consensual sharing of
01:04 your own sexual photos and videos with others is also abuse. I was in pure shock because someone
01:11 has obviously messed with the photo that I originally posted and made me appear nude.
01:18 It was ridiculous. It was very stupid, because why would I be naked at a swimming pool, or why would
01:24 I be naked in a restaurant? It does not make sense. But I realised that there is a shock factor.
01:31 There is a, oh, that is her naked, and that could potentially be real regardless of the background.
01:37 There were a lot of mixed feelings, like confusion, anger, disgust. Why is this happening to me?
01:43 Like, what did I do wrong? In a 2023 study by SG Her Empowerment, or SHE, 48% of youth
01:50 polled ranked IBSA as their top concern among online harms. Image-based sexual abuse was one of the
01:55 top harms that we saw among our clients at the SHECARES@SCWO Centre, especially our female
02:02 Gen Zs. About 62% of them actually listed IBSA as one of their top three concerns among online harms.
02:08 So locally we've also seen a rise in convictions relating to sexual offences associated with
02:13 voyeurism. Firstly, there is dealing with the immediate emotional impact of the situation,
02:18 especially with how easy it actually is to do an image search on the internet these days.
02:25 That was scary, because I didn't know who else might see these photos. It could be my mom, for
02:30 all we know, and that would be very hurtful for me if she ever saw those kinds of photos. So
02:36 that's when I decided, okay, something needs to be done about this situation.
02:42 Understandably, it can be quite challenging for survivors to figure out what the next steps are.
02:46 Many don't even know where they can go to seek help. I didn't think, at that point in time
02:51 in 2023, that there were many resources. Even up till now, in 2024, we hear more about other cases,
02:58 actual private photos being leaked. So to me this was like, oh, you're kind of in between, because
03:03 it is a nude but it's not me, it's not my boobs, it's not my private parts.
03:08 So that to me was so weird, like how do we go about navigating this situation? So my immediate
03:15 reaction was to clarify, hey, I don't have an OnlyFans and I hope nobody got scammed by this,
03:20 and to work out how I could take down the post to minimise the impact and the number of eyes that could see
03:24 this specific post. So I decided to look at Google resources, because I realised that, hey, you know,
03:30 if I can't stop the website, I can technically try and stop people from searching for something
03:38 similar to this. So that's when I went into a deep dive and I managed to find a Google Search help
03:44 page titled support resources for removing explicit or intimate personal images. What they
03:50 can do is that if someone tries to search for that link, I think it will not turn up. But to be
03:55 honest, I haven't actually checked it myself to see whether it works, because it's an image burned
04:00 into my mind. I just want it to be gone. If we think about, you know, the challenges that the
04:05 internet brings, namely, one, the speed at which content is shared virally, for example, or
04:11 disseminated very rapidly, also the fact that content can be shared across multiple platforms,
04:17 and this is also exacerbated by the anonymity that many perpetrators hide behind,
04:21 using anonymous accounts. They're also quickly discarding these accounts, you know, once reports
04:27 are made, and then subsequently, you know, opening new ones to disseminate the content
04:31 further. This leads to a lot of distress for our survivors. We have a support centre, the
04:37 SHECARES@SCWO Centre, which offers counselling and legal advice. We also offer support with reporting to
04:44 the internet platforms and police, all in one location. So this year I received a new DM from
04:51 the same person. Very consistent, and it's a new link this time round. So when I went on to that
04:57 link, I saw an advertisement for deepfake AI nudes, where upon the click of a button you can
05:04 remove the person's clothes and a body will surface. So that was very disturbing to come
05:10 across, because I didn't know that such technologies existed. So what Singapore is grappling with now
05:16 is something that many other countries are trying to combat as well. We have seen, for example, that in
05:21 South Korea they're facing a deepfake crisis. In the US, Taylor Swift's images have also been
05:27 doctored using generative AI. I think that really gives us a sense of the global scale that
05:33 everyone is grappling with. Image-based sexual abuse and other online harms are also evolving at
05:38 such a rapid rate that we do need to consider how our legal measures can be updated and reviewed
05:45 regularly to ensure that they are meeting the needs of survivors. And I think the recent
05:50 arrest of Telegram's CEO actually brings some of these issues to the fore, relating to the
05:55 responsibilities of internet platforms. But that's really only one piece of a bigger puzzle. We do
06:00 have to consider how we can encourage help-seeking behaviour as a whole society, and also
06:05 encourage more positive and constructive norms of online engagement. There is always a narrative that
06:11 as a content creator you are putting out content for people to consume and do whatever they wish
06:16 with. So they will say, well, if you didn't like it, why did you post it in the first place, right?
06:21 And I take full responsibility for all the photos that I put out, but I will also make sure I
06:27 set boundaries and say, like, this is just wrong. So I did become a bit reclusive and I tried not to
06:34 share as much on social media. I think up till now, more than a year later, I've realised that
06:40 subconsciously I've been focusing more on creating content that is more comedic, or where I'm fully dressed,
06:47 so there's no way that they could ever misconstrue the image anymore. I've always been someone who
06:54 would wear whatever the hell I wanted, but I realised that on social media it's a bit different
06:59 because of these kinds of harms, and I just didn't want to put myself in a position where
07:04 I could potentially be edited in a different way which is undesirable.
07:10 How can we, you know, as everyday internet users, protect ourselves against IBSA? I would actually
07:16 turn that question around and think about how, as a society, we can work upstream to ensure that
07:21 this kind of negative conduct does not happen. I want to be known for me, who I am as a person,
07:26 someone who has been through life struggles. I'm not just a body to you, I'm not just an object,
07:31 I'm a human at the end of the day. I'm just like your sister, or like your
07:35 girl best friend, you know. Would you ever want them to be treated that way? No, I'm sure you wouldn't.
