• Imgonnatrythis@sh.itjust.works · 18 hours ago

    Pretty ingrained vocabulary at this point. “Lies” implies intent; I would have preferred “errors”.

    Also, for the record, this is the most dystopian headline I’ve come across to date.

    • dohpaz42@lemmy.world · 17 hours ago

      If a human does not know an answer to a question, yet they make some shit up instead of saying “I don’t know”, what would you call that?

      • JuxtaposedJaguar@lemmy.ml · 4 hours ago

        If you train a parrot to say “I can do calculus!” and then you ask it if it can do calculus, it’ll say “I can do calculus!”. It can’t actually do calculus, so would you say the parrot is lying?

      • ramirezmike@programming.dev · 16 hours ago

        That’s a lie. They knowingly made something up. The AI doesn’t know what it’s saying, so it’s not lying. “Hallucinating” isn’t a perfect word, but it’s much more accurate than “lying.”

        • 5too@lemmy.world · 2 hours ago (edited)

          This is what I’ve been calling it. Not as a pejorative term, just descriptive: it has no concept of truth or untruth, it just tells good-sounding stories. It’s just bullshitting. It’s a bullshit engine.