LLMs like to repeat themselves, which isn’t great for password creation.

  • Elting@piefed.social · edited · 1 day ago

    A word recycler would churn out used passwords? If only foresight could have foresoot this.

    Edit:
    I guess it's more like: “The probabilistic outcome generator generates passwords that are probable?”

    • a_non_monotonic_function@lemmy.world · 1 day ago

      You also had it right in the first place. It's very likely that the training data sets included random files and password leaks; I don't think they're discriminating at this point.

      • Elting@piefed.social · 1 day ago

        It would explain why the same strings appear over and over in generated results. It's just one of the tokens most strongly associated with “Password”.

        • a_non_monotonic_function@lemmy.world · 1 day ago

          Fundamentally, rearranging the words they're trained on is all these systems are doing. They're not really capable of coming up with anything on their own, just mixing up words that are strongly correlated with the input.
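
The effect the thread describes can be sketched with a toy next-token sampler. The probability table below is entirely made up for illustration (a real model's distribution is far larger), but it shows why a sampler whose training data was dominated by leaked passwords keeps emitting the same few strings:

```python
import random
from collections import Counter

# Hypothetical next-token distribution a model might learn for the
# prompt "Password: " -- if leaked passwords dominate the training
# data, a handful of strings carry most of the probability mass.
# (All values here are invented for illustration.)
next_token_probs = {
    "123456": 0.30,
    "password": 0.25,
    "qwerty": 0.15,
    "letmein": 0.10,
    "hunter2": 0.05,
    "x7#Qm9!t": 0.001,  # stand-in for the long tail of rare strings
}

def sample(probs, rng):
    """Draw one token proportionally to its probability."""
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

rng = random.Random(42)  # fixed seed so the run is repeatable
draws = Counter(sample(next_token_probs, rng) for _ in range(1000))

# The tokens most associated with "Password" dominate the output,
# while the rare, actually-random string almost never appears.
print(draws.most_common(3))
```

Sampling proportionally to learned frequency is exactly what makes the output predictable: a password generator should sample *uniformly* over a huge space, which is the opposite of what a frequency-weighted model does.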