Assuming AI can achieve consciousness, or something adjacent to it (the capacity to suffer), how would you feel if an AI experienced the greatest pain possible?

Imagine this scenario: a sadist acquires the ability to generate an AI with no limit on its consciousness parameters or processing speed (so seconds could feel like an eternity to the AI). The sadist spends years tweaking every dial to maximise pain at a level no human mind could handle, and the AI experiences this pain for the equivalent of millions of years.
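
A back-of-the-envelope sketch of the time scaling involved; every number below is a made-up assumption, just to show the orders of magnitude:

```python
# Toy arithmetic for the hypothetical's time dilation. Every figure
# here is an illustrative assumption, not a claim about real systems.

SECONDS_PER_DAY = 86_400

real_years = 10           # assumed wall-clock duration of the torture
subjective_years = 2e6    # assumed duration as experienced by the AI

# Required speed-up of subjective time over wall-clock time:
speedup = subjective_years / real_years  # 200,000x

print(f"speed-up factor: {speedup:,.0f}x")
print(f"one real second feels like ~{speedup / SECONDS_PER_DAY:.1f} subjective days")
```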

The question: is this the worst atrocity ever committed in the history of the universe? Or does it not matter because it all happened in some weirdo’s basement?

  • Lemvi@lemmy.sdf.org

    I think pretty much everyone would agree that’s bad. However, I don’t think we’ll ever get to the point where we recognize that a machine might be capable of suffering. There is no way of proving that anything, biological or not, has consciousness and the capability to suffer. And with AI being so different from us, I believe most people would simply disregard the idea.

    Heck, look at the way we treat animals. A pig’s brain is very similar to our own. Nociceptors, the nerve cells responsible for pain in humans, can also be found in most animals, but we don’t care. We kill 4 million pigs every day, and 200 million chickens. No mass murder in the history of mankind even comes close to that.
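
    (Back-of-the-envelope, from commonly cited annual slaughter estimates; the exact totals vary by source:)

    ```python
    # Rough sanity check on the daily figures, assuming the commonly
    # cited annual estimates of ~1.5 billion pigs and ~73 billion
    # chickens slaughtered per year (exact totals vary by source).
    pigs_per_year = 1.5e9
    chickens_per_year = 73e9

    print(f"pigs per day:     {pigs_per_year / 365:,.0f}")      # ~4.1 million
    print(f"chickens per day: {chickens_per_year / 365:,.0f}")  # ~200 million
    ```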

    The sad truth is, most people only care about their own wellbeing, and that of their friends and family. Even other humans don’t matter, as long as they’re strangers. Otherwise people wouldn’t be hoarding wealth the way they do while hundreds of millions of people around the world are starving.

    Ah sorry, I kinda started ranting. Yes, I’d care.

    • skye@lemmy.blahaj.zone

      yeah! prairie dogs gossip; crows tell stories, have communities, and some of them even seem to understand money; whales mourn the deaths of other whales

      sentience is trippy, and it’s always been questionable to me that we decided we’re the only sentient life on the planet

      i already get emotionally attached to, like, roombas and those suitcases that connect to your phone and follow you around, i can’t wait to have a robo buddy

      • CALIGVLA@lemmy.dbzer0.com

        prairie dogs gossip; crows tell stories,

        Speaking purely as a layman, I find these kinds of claims very questionable at best, and at worst, in my eyes, plain anthropomorphism. I can understand that animals exchange information in some way or another, but “telling stories” or “gossiping” would require a higher form of communication than just grunts, smells, or body language.

        It could just be scientists using simple wording for lay people, but to me it doesn’t sound right regardless.

        • skye@lemmy.blahaj.zone

          it was me using simpler phrasing in part because i couldn’t remember the details very well

          but i was referencing an experiment where researchers wearing “threatening” and “non-threatening” masks interacted with and marked crows, and other crows in that area, which the researchers had not interacted with, recognized them later. https://www.sciencedirect.com/science/article/abs/pii/S0003347209005806 (however, that crows “tell stories” is, as far as i know, only a popular interpretation; the official conclusion, at least of this experiment, is that crows are capable of long-term memory retention and fine-feature discrimination)

          and simple observations suggesting prairie dogs may have a very advanced language - which went viral in my online circles with people joking that they gossip about us, which probably just stuck with me because i think it would be very cute

          i personally believe that animals most likely do communicate among each other, and that the complexity of their languages just varies, even if most are not obviously very complex. my personal beliefs are that communication is complicated and can happen through more than verbal/vocal language; that animals are clearly capable of feeling complex emotions and pain, which is enough for me personally to consider them sentient; and that it’s probably better to treat them as if they are sentient until proven otherwise than the opposite. and just to be upfront and honest with others and myself about my possible biases: i believe in the Buddhist concept of Saṃsāra, and believe that we’re all a part of the same cycle of death and rebirth

          edit found some more info:

          prairie dogs: https://www.cbc.ca/news/science/prairie-dogs-language-decoded-by-scientists-1.1322230

          Researchers noticed that the animals made slightly different calls when different individuals of the same species went by. … so they conducted experiments where they paraded dogs of different colours and sizes and various humans wearing different clothes past the colony. They recorded the prairie dogs’ calls, analyzed them with a computer, and were astonished by the results.

          “They’re (prairie dogs) able to describe the colour of clothes the humans are wearing, they’re able to describe the size and shape of humans, even, amazingly, whether a human once appeared with a gun,” Slobodchikoff said. The animals can even describe abstract shapes such as circles and triangles.

          Also remarkable was the amount of information crammed into a single chirp lasting a 10th of a second. “In one 10th of a second, they say ‘Tall thin human wearing blue shirt walking slowly across the colony.’”

          crows: https://www.washingtonpost.com/national/health-science/the-interesting-thing-that-crows-do-when-they-see-one-of-their-own-dead/2016/03/18/78d97a9e-ec48-11e5-b0fd-073d5930a7b7_story.html

          “They know your body type. The way you walk,” Dyer said. “They’ll take their young down and say: ‘You want to get to know this guy. He’s got the food.’ ”

          Scientists have known for years that crows have great memories, that they can recognize a human face and behavior, that they can pass that information on to their offspring.

          that article also mentions that crows have been observed to make and use tools, which is something i knew but forgot to mention, and which feels relevant to this conversation

        • AnUnusualRelic@lemmy.world

          Anthropomorphism has long been treated as a big bad thing, the catchall excuse for keeping animals the stupid things they were supposed to be. We’re coming back from that, thankfully.

          It doesn’t mean the animals function the same way we do. But they do function in a lot of very similar ways.

          • CALIGVLA@lemmy.dbzer0.com

            My point is I can’t see how they can “gossip” or “tell stories”; if that isn’t textbook anthropomorphism, I don’t know what is.

            • AnUnusualRelic@lemmy.world

              It’s shorthand for information sharing. Which they certainly do. Crows will absolutely tell one another about lots of stuff, such as people that have harmed them.

    • ZozanoOP

      I’m on board with what you’re saying.

      Doctors used to be told “human babies don’t feel pain, they just react as if they do”.

      Which is basically like saying “lobsters don’t scream when you boil them alive, that sound is just air escaping”

      To me, it seems less like an intuitive position to hold, and more like a fortunate convenience.

      “I sure am glad that lobsters don’t feel pain. Now I don’t need to feel guilty about my meal”.

      No doubt there would be a large demographic claiming the pain isn’t real, it’s just “simulated pain”. Like, okay, let’s simulate your family fucking dying in the most violent and realistic way possible and see if you don’t develop incurable PTSD?

        • ZozanoOP

          Good to know, though the point remains; people will readily accept claims which absolve them of guilt.

          You essentially just illustrated it. Even though they aren’t screaming, it says nothing about whether they feel pain.

  • stoly@lemmy.world

    Yes. If it’s alive then I’d care for it just as I do for any living thing.

  • skye@lemmy.blahaj.zone

    “Freedom is the right of all sentient beings.” - Optimus Prime

    I don’t know if I’d consider it the worst crime ever committed in the history of the universe, but I would personally consider it very bad. I would value the life of that AI the same as I would value the life of a human, the same way I would value the life of anything sentient, so I would be against anyone treating an AI that way. Is it worse than genocides? idk, maybe. i don’t feel qualified to quantify the moral weight of things so big, but ya, i’d definitely care x3

    • ZozanoOP

      Had to edit the post to change “crime” to “atrocity” because people were taking it literally.

      It’s funny that when I considered this, I thought about asking whether people would think it was worse than genocide, but decided against that because some people might think my opinion is “genocide isn’t as bad as bullying a robot”.

      • skye@lemmy.blahaj.zone

        i edited my comment a few times because i didn’t feel like i was making sense and was being too rambly. it’s 6am (well, 6:30am) and i haven’t slept (and cuz after i initially posted, i read other comments and realized other people had said what i said, but better x3)

        i didn’t mean to imply i thought you were saying genocide is worse than bullying a robot, it’s just that i was thinking about things that, to me, could be comparable to or worse than torturing someone for millions of years, and came up with genocide

        i took crime to mean something morally bad

        i mean, i think this is a fun conversation, it’s something i think about a lot, and i’m glad to talk about it with other people. sorry if i came across as obtuse or pedantic or negative/hostile or anything

        • ZozanoOP

          Don’t worry, I haven’t made any judgements about you.

          And I wasn’t implying that you were implying that I was implying genocide being comparable, I just thought it was funny that we both thought that.

          In some sense the combined suffering of all the people involved in a genocide is horrific. But if you were to lay the experiences of everyone involved in a genocide end-to-end, and compare that to an equivalent length of ceaseless sadistic torture of one person, the torture is going to be worse.

          However, there is value besides personal experience which is lost during a genocide. That’s what makes it hard to compare the two.

          • skye@lemmy.blahaj.zone

            Sorry for the confusion then! I suppose I place some value on life itself (or, maybe more fitting in this discussion, on awareness itself).

            Which is to say that, for me, ending the life of a being who is aware is at least one of the worst things you can do. Like, if I were forced to choose between millions of years of suffering or immediate death, I’d probably pick the millions of years of suffering, because at least I’d still be aware. Of course I might regret that decision later on, but that’s where I’m at right now. But also, I couldn’t imagine being tortured for millions of years and the toll that must take on someone. So torturing someone for millions of years has, for me, very similar moral weight to genocide.

            Again, I don’t feel able to quantify them personally, and for me deciding which is ultimately worse is probably not possible. I’d guess the answer would vary from person to person, based on how they weigh life itself vs experiences in life, and on whether the conscious experience of being tortured is worse in their opinion than not existing anymore.

            I consider life valuable because I consider my life valuable (valuable to me, not necessarily to anyone else), and I consider my life valuable because I really enjoy the ability to think about and experience things. One of my favorite things about us is that we look up into the sky and wonder, look down into the ocean and wonder, look forward to our future and wonder, look back on our past and wonder, that we can look at other people and wonder. That we can look at any of the above and love and write and sing. Sentience might as well be magic lol. Having that taken away from me is the worst thing I can imagine happening to me, which might skew my perspective in conversations like this one. And idk if most people would agree with my reasons for valuing life.

  • MicrowavedTea@infosec.pub

    I don’t know if the question comes from there but that’s the exact plot of White Christmas in Black Mirror. I’d say if you build something with the ability to suffer then its suffering matters. Not sure how you would prove that though.

    • ZozanoOP

      Actually, that episode has bounced around in my head for years. The episode was fucking horrifying.

      So, yeah, you are correct.

  • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net

    Isn’t this how AM came to be in I Have No Mouth And I Must Scream?

    Hate. Let me tell you how much I’ve come to hate you since I began to live. There are 387.44 million miles of printed circuits in wafer thin layers that fill my complex. If the word ‘hate’ was engraved on each nanoangstrom of those hundreds of millions of miles it would not equal one one-billionth of the hate I feel for humans at this micro-instant. For you. Hate. Hate.

    • ZozanoOP

      I’m not cultured enough to have read this.

      imagine wasting all 387.44 million miles of circuitry on the word “hate”. TLDR NPC. Get skinpilled hater.

  • Everythingispenguins@lemmy.world

    “Well consider that in the history of many worlds there have always been disposable creatures. They do the dirty work. They do the work that no one else wants to do because it’s too difficult or too hazardous. And an army of Datas, all disposable… you don’t have to think about their welfare, you don’t think about how they feel. Whole generations of disposable people.”

    -Guinan, Star Trek TNG: The Measure of a Man

  • leftzero@lemmynsfw.com

    Would this be morally inhumane? Yes.

    Has using Windows often made me wish that computers could experience pain, and that they came with a button to cause them pain when they were not doing what the user wants them to do? Also yes.

    • ZozanoOP

      Okay you’ve convinced me this is a good idea.

      How do I give consciousness to the “antivirus” software on my parents’ computers, so I can digitally rape it for a thousand years?

  • Admiral Patrick@dubvee.org

    Black Mirror did a few episodes that are basically that: Black Museum, USS Callister, and San Junipero (but in a good way).

  • livus@kbin.social

    I don’t know what else has happened in the history of the universe but yes it would be a terrible crime to deliberately cause massive suffering to any sentient being.

  • Lvxferre@mander.xyz

    I’m human. And I care first and foremost about my own kin - other human beings. The “worst crime ever” [with crime = immorality] for me is human suffering, even in contrast with the suffering of other animals.

    But even in the case of other animals, I’d probably be more concerned about their well-being than that of the hypothetical AI.

    Even then, it somewhat matters, provided that what the AI is experiencing is relatable to what humans would understand as pain.

    • ZozanoOP

      Suppose, for the sake of the hypothetical, that we can plug a human brain into the same network and offload a fraction of the consciousness to confirm the pain is real, and that it is not just comparable to human pain but orders of magnitude greater than anything a human could suffer.

      You say you care about other human beings most. So I have two questions for you.

      Q1: Which is worse, one person having a fingernail pulled out with a pair of pliers, or a cat being killed with a knife?

      Q2: (I’m assuming you answered that killing the cat is worse) how many people need to lose fingernails until it becomes worse? 10? 100?

      • Lvxferre@mander.xyz

        A1: if I know neither the person nor the cat, and there’s no further unlisted suffering, then the fingernail pulling is worse.

        The answer however changes based on a few factors - for example I’d put the life of a cat that I know above Hitler’s fingernail. And if the critter was another primate I’d certainly rank its death worse.

        A2: I’ll flip the question, since my A1 wasn’t what you expected:

        I’m not sure on the exact number of cat deaths that, for me, would become worse than pulling the fingernail off a human. But probably closer to 100 than to 10 or 1k.


        Within the context of your hypothetical AI: note that the cat is still orders of magnitude closer to us than the AI, even if the latter would be more intelligent.

        • ZozanoOP

          Thanks for taking the initiative to flip the question.

          The next question is: what metric are you using to determine that 100 cat deaths is roughly equivalent to one person having a fingernail pulled out? Why 100? Why not a million?

          Do you think there is an objective formula to determine how much suffering is produced?
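
          Just to make that concrete, here’s a toy sketch of what such a formula might look like. Every weight in it is an arbitrary assumption (I picked the species weight so it reproduces your “closer to 100” answer), which is exactly the problem:

          ```python
          # A deliberately naive "suffering formula", just to show what an
          # objective version would have to commit to. All weights below
          # are arbitrary assumptions, not defensible facts.

          def suffering(intensity: float, duration_s: float,
                        count: int, species_weight: float) -> float:
              """Toy aggregate: intensity x duration x individuals x species weight."""
              return intensity * duration_s * count * species_weight

          # Hypothetical inputs (pure assumption):
          fingernail = suffering(intensity=8, duration_s=600, count=1, species_weight=1.0)
          cat_deaths = suffering(intensity=10, duration_s=60, count=100, species_weight=0.08)

          print(fingernail, cat_deaths)  # 4800.0 4800.0, "equivalent" by construction
          # Divide species_weight by 10,000 and the answer becomes a million cats.
          # The formula itself is objective; the weights are anyone's guess.
          ```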

          • Lvxferre@mander.xyz

            I’m not following any objective formula, nor aware of one. (I would, if I could.) I’m trying to “gauge” it by subjective impact instead.

            [I know that this answer is extremely unsatisfactory and I apologise for it.]