But in her order, U.S. District Court Judge Anne Conway said the company’s “large language models” — an artificial intelligence system designed to understand human language — are not speech.
I would not have understood that to have anything to do with suicide… do they use the phrase "coming home" to mean death or suicide in the Game of Thrones show?
You have to read the other chat logs. Ars Technica has a good summary, I think; the link between "coming home" and suicide is specific to the kid's chats with this AI.
IIRC, when he did make it more explicit, the AI responded with "no, don't do that" kind of responses. He just kept up the metaphor, and since the AI had no such association in its training data, it responded the way a lover in its training data would respond to being told their love was coming home.
Though I'd say that if a kid would shoot themselves in response to anything a chatbot says, the issue is more about them having access to a gun at all than about the chatbot itself. Unless maybe the chatbot is volunteering weaknesses common in gun safes, though even then I'd say more of the fault lies with the parent who chose a shitty safe and raised a kid who would kill themselves on the advice of their chatbot girlfriend.