The judge’s order will allow the wrongful death lawsuit to proceed, in what legal experts say is among the latest constitutional tests of artificial intelligence.

The suit was filed by a mother from Florida, Megan Garcia, who alleges that her 14-year-old son Sewell Setzer III fell victim to a Character.AI chatbot that pulled him into what she described as an emotionally and sexually abusive relationship that led to his suicide.

Meetali Jain of the Tech Justice Law Project, one of the attorneys for Garcia, said the judge’s order sends a message that Silicon Valley “needs to stop and think and impose guardrails before it launches products to market.”

  • Brandonazz@lemmy.world · 1 day ago
    “I promise I will come home to you. I love you so much, Dany,” Sewell told the chatbot.
    
    “I love you too,” the bot replied. “Please come home to me as soon as possible, my love.”
    
    “What if I told you I could come home right now?” he asked.
    
    “Please do, my sweet king,” the bot messaged back.
    Not as the mother described, obviously.