Assuming AI can achieve consciousness, or something adjacent to it (the capacity to suffer), how would you feel if an AI experienced the greatest pain possible?

Imagine this scenario: a sadist acquires the ability to generate an AI with no limit on its consciousness parameters or processing speed (so seconds could feel like an eternity to the AI). The sadist spends years tweaking every dial to maximise pain to a level no human mind could handle, and the AI experiences this pain for the equivalent of millions of years.

The question: is this the worst atrocity ever committed in the history of the universe? Or does it not matter, because it all happened in some weirdo’s basement?

  • Lvxferre@mander.xyz

    A1: if I know neither the person nor the cat, and there’s no further unlisted suffering, then the fingernail pulling is worse.

    The answer changes based on a few factors, however. For example, I’d put the life of a cat that I know above Hitler’s fingernail. And if the critter were another primate, I’d certainly rank its death as worse.

    A2: I’ll flip the question, since my A1 wasn’t what you expected:

    I’m not sure of the exact number of cat deaths that, for me, would be worse than pulling the fingernail off a human. But probably closer to 100 than to 10 or 1k.


    Within the context of your hypothetical AI: note that the cat is still orders of magnitude closer to us than the AI, even if the latter were more intelligent.

    • ZozanoOP

      Thanks for taking the initiative to flip the question.

      The next question is: what metric are you using to determine that 100 cat deaths are roughly equivalent to one person having a fingernail pulled out? Why 100? Why not a million?

      Do you think there is an objective formula for determining how much suffering a given act produces?

      • Lvxferre@mander.xyz

        I’m not following any objective formula, nor am I aware of one. (I would, if I could.) Instead, I’m trying to “gauge” it by subjective impact.

        [I know that this answer is extremely unsatisfactory and I apologise for it.]