Kill me now.

  • Gabu@lemmy.world
    3 months ago

    Might be an intentional limitation to avoid issues like the “buffalo” incident with GPT-3, where it would start leaking information it shouldn’t after repeating a word too many times.