Pro@programming.dev to Technology@lemmy.world · English · 11 hours ago
Anthropic apologizes after one of its expert witnesses cited a fake article hallucinated by Claude in the company's legal battle with music publishers
chatgptiseatingtheworld.com · 21 comments
dohpaz42@lemmy.world · 8 hours ago
If a human does not know the answer to a question, yet they make some shit up instead of saying “I don’t know”, what would you call that?
ramirezmike@programming.dev · 7 hours ago
That’s a lie. They knowingly made something up. The AI doesn’t know what it’s saying, so it’s not lying. “Hallucinating” isn’t a perfect word, but it’s much more accurate than “lying.”
Bullshit.