This is peak laziness. It seems that the reading list’s author used autoplag to extrude the entire 60 page supplemental insert. The author also super-promises this has never happened before.

  • paraphrand
    823 days ago
    AI assistants such as ChatGPT are well known for creating plausible-sounding errors known as confabulations, especially when lacking detailed information on a particular topic.

    No, they are hallucinations or bullshit. I won’t accept any other terms.

    • @o7___o7 OP
      edited, 22 days ago

      If it makes you feel better, I’ve heard good folks like Emily Bender of Stochastic Parrots fame suggest that confabulation is the better term. “Hallucination” implies that LLMs have qualia and are accidentally sprinkling falsehoods over an otherwise true story. “Confabulation” better conveys that the model is producing a bullshit milkshake from its training data, one that can only be correct by accident.

      • paraphrand
        22 days ago
        You’ve swayed me. I’m now down with all three. Thanks for the explanation.
