Pretty ingrained vocabulary at this point. “Lies” implies intent. I would have preferred “errors.”
Also, for the record, this is the most dystopian headline I’ve come across to date.
If a human does not know an answer to a question, yet they make some shit up instead of saying “I don’t know”, what would you call that?
If you train a parrot to say “I can do calculus!” and then you ask it if it can do calculus, it’ll say “I can do calculus!”. It can’t actually do calculus, so would you say the parrot is lying?
That’s a lie. They knowingly made something up. The AI doesn’t know what it’s saying, so it’s not lying. “Hallucinating” isn’t a perfect word, but it’s much more accurate than “lying.”
Bullshit.
This is what I’ve been calling it. Not as a pejorative term, just descriptive. It has no concept of truth or not-truth, it just tells good-sounding stories. It’s just bullshitting. It’s a bullshit engine.