• BossDj@lemm.ee · 17 hours ago

    I like “fabrication” going forward. Clearly made up, doesn’t imply intent.

    • ToastedRavioli@midwest.social · 13 hours ago

      The word “hallucination” has zero implication of intent whatsoever. Last time I checked, a hallucination is an entirely involuntary experience, regardless of the context the word is used in.

      They are called hallucinations in computer science not “to romanticize” it. It is called that because the output is totally random from the perspective of the input. If there is no logical path from the input to the output, it is similar to a human hallucinating. A human who hallucinates a dragon isn’t responding to any actual visual stimulus that produces it; the input from their eyes has no bearing on what they imagine is there.

      This is different from “fabrication” in that the AI intentionally creating fake info based on your input request would not be a hallucination, because there would be a relationship between input and output.

      While you say you prefer “fabrication”, the word fabrication actually implies some intent that is absent from what we are referring to as AI hallucinations.

      • BossDj@lemm.ee · 4 hours ago

        I meant that fabrication doesn’t imply intent as “lies” would.

        It seems like you use the term “hallucination” correctly: when the output has no relation to the input.

        In this case, as in many others, the AI took the input “cite a source” and, as output, cited a source as requested, but invented the content of the source. It fabricated, which means to make up, to create.

        Fabricate does not imply an intent to deceive, whereas “lie” does.

        I will agree that if the output is purely unrelated to the input, “hallucination” is still fine, but it is absolutely a romanticized term when we’re referring to this computer-generated code… It’s literally personification.

        • jungle@lemmy.world · 51 minutes ago

          Everything an LLM outputs is hallucinated. That’s how it works. Sometimes the hallucination matches reality, sometimes it doesn’t.
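
          Roughly, the idea looks like this toy sketch (purely illustrative; the vocabulary, probabilities, and function names below are made up for this comment, not any real model’s API): every token is sampled from a probability distribution over possible next tokens, and nothing in the loop ever checks whether the result matches reality.

              # Toy autoregressive "LLM" loop: every token is sampled from a
              # probability distribution, with no truth check anywhere.
              # The vocabulary and probabilities are hard-coded stand-ins for a real model.
              import random

              VOCAB = ["The", "sky", "is", "blue", "green", "."]

              def next_token_probs(context):
                  # A real model computes these from learned weights; here they are fixed.
                  if context and context[-1] == "is":
                      return {"blue": 0.7, "green": 0.3}  # plausible and wrong are both just samples
                  return {tok: 1.0 / len(VOCAB) for tok in VOCAB}

              def generate(prompt, max_tokens=5):
                  tokens = prompt.split()
                  for _ in range(max_tokens):
                      probs = next_token_probs(tokens)
                      tokens.append(random.choices(list(probs), weights=list(probs.values()))[0])
                  return " ".join(tokens)

              print(generate("The sky is"))  # sometimes "blue", sometimes "green" -- same mechanism either way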