• Lodespawn@aussie.zone
    3 days ago

    Make up answers about it. The answers might be right, or they might be wrong; you won’t know unless you read the actual data. So helpful …

    • tomiant@piefed.social
      3 days ago

      “Give me the line numbers corresponding to the Saudi sheik saying he liked the torture videos”

      Are you trying to be obstinate on purpose?

      • roofuskit@lemmy.world
        3 days ago

        If we are talking about LLMs, the other commenter is entirely right about how they function. But I’m not sure you two are talking about the same technology.

        • tomiant@piefed.social
          3 days ago

          Can an LLM provide me the information I want from a search term, if it was trained on that dataset? Yes. That is all.

          • balsoft@lemmy.ml
            2 days ago

            It can provide you some information that looks similar to what you’d want. Whether it is correct is another question.

            RAG can help to a degree, but hallucinations still happen quite a bit.
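
To make the RAG point above concrete: the retrieval half is just "find the passages most relevant to the query and hand them to the model." A minimal sketch of that step, using plain keyword overlap instead of real embeddings (the documents and query here are made up for illustration; in a full pipeline the retrieved text would then be fed to an LLM, and that generation step is where hallucinations can still slip in):

```python
import re

def tokenize(text):
    # Lowercase and split on word characters so punctuation doesn't block matches.
    return set(re.findall(r"\w+", text.lower()))

def retrieve(query, documents, k=1):
    """Return the k documents sharing the most words with the query."""
    q = tokenize(query)
    scored = sorted(documents, key=lambda d: len(q & tokenize(d)), reverse=True)
    return scored[:k]

# Toy corpus standing in for the "actual data" being argued about.
docs = [
    "Line 412: the sheik discusses the recordings.",
    "Line 7: weather report for the region.",
    "Line 99: unrelated shipping manifest.",
]

print(retrieve("which line mentions the recordings", docs))
```

Retrieval like this can only surface text that exists in the corpus, which is why RAG reduces fabrication; but the model summarizing the retrieved passage can still misstate what it says, which is why it doesn't eliminate hallucination.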