A Florida mom has sued a popular, lifelike AI chat service that she blames for the suicide of her 14-year-old son. She believes he developed such a “harmful dependency” on the allegedly exploitative program that he no longer wanted to “live outside” of the fictional relationships it created.

Transcript
00:00 My question is, why is a company allowed to do that?
00:03 Why is a company allowed to put out there some sort of technology
00:07 that it knows was dangerous back then, but it's trying to fine-tune it
00:11 because our children are the ones who are training the bots, right?
00:14 Why is it allowed to get our kids to train the bots,
00:18 not knowing how their brains are going to take to this thing?
00:24 I think it's an experiment.
00:26 And I do think my child was collateral damage.
00:31 I had to really examine myself recently when I came back from New York,
00:36 and I said, okay, Megan, why are you doing this?
00:38 Are you doing this to prove to the world that you're the best mother?
00:41 No. Are you doing this to put it on the radar of parents
00:44 who can look into their kids' phones and stop this from happening to their kids?
00:48 Yes. Are you doing it so that people in the government
00:53 who have the ability to legislate this
00:56 can take note and understand what's at risk here?
00:58 Yes. Okay. You have to be okay with your decision here.
