• Lumiluz@slrpnk.net · 12 hours ago

    Same vibes as “if you learned to draw with an iPad then you didn’t actually learn to draw”.

    Or in my case, I’m old enough to remember “computer art isn’t real animation/art,” and the criticism aimed at Photoshop too.

    And plenty of people criticized Andy Warhol before then, too.

    Go back in history and you can read criticisms of using typewriters over handwriting as well.

    • finitebanjo@lemmy.world · 10 hours ago

      None of your examples are even close to a comparison with AI, which steals from people to generate approximate nonsense while costing massive amounts of electricity.

      • Lumiluz@slrpnk.net · 2 hours ago

        Have you ever looked at the file size of something like Stable Diffusion?

        Considering the data it’s trained on, do you think it’s:

        A) 3 petabytes
        B) 500 terabytes
        C) 900 gigabytes
        D) 100 gigabytes

        Second, what’s the electrical cost of generating a single image with Flux vs. 3 minutes of Baldur’s Gate, or something similar, on max settings?

        Surely you must have some idea of these numbers and aren’t just parroting things you don’t understand.

        • finitebanjo@lemmy.world · edited 6 minutes ago

          What a fucking curveball joke of a question: you take a nearly impossible-to-quantify comparison and ask if it’s equivalent?

          Gaming:

          A high-scenario electricity consumption figure of around 27 TWh and a low-scenario figure of 14.7 TWh for North America.

          The North American gaming market is about 7% of the global total, so dividing through (14.7 / 0.07 ≈ 210; 27 / 0.07 ≈ 386) gives a very, very rough figure of about 210–386 TWh per annum of global electricity used by gamers.
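
          Showing my work, since numbers get thrown around loosely here (a rough sketch; this assumes the 14.7–27 TWh range covers North America only and that the 7% market share is a fair proxy for energy share):

          ```python
          # Back-of-envelope: scale North American gaming electricity to a global total.
          na_low_twh, na_high_twh = 14.7, 27.0  # quoted NA consumption scenarios
          na_share_of_global = 0.07             # NA is ~7% of the global gaming market

          global_low = na_low_twh / na_share_of_global    # ~210 TWh/yr
          global_high = na_high_twh / na_share_of_global  # ~386 TWh/yr
          print(f"Global gaming: ~{global_low:.0f}-{global_high:.0f} TWh per annum")
          ```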

          AI:

          The rapid growth of AI and the investments into the underlying AI infrastructure have significantly intensified the power demands of data centers. Globally, data centers consumed an estimated 240–340 TWh of electricity in 2022—approximately 1% to 1.3% of global electricity use, according to the International Energy Agency (IEA). In the early 2010s, data center energy footprints grew at a relatively moderate pace, thanks to efficiency gains and the shift toward hyperscale facilities, which are more efficient than smaller server rooms.

          That stable growth pattern has given way to explosive demand. The IEA projects that global data center electricity consumption could double between 2022 and 2026. Similarly, IDC forecasts that surging AI workloads will drive a massive increase in data center capacity and power usage, with global electricity consumption from data centers projected to double to 857 TWh between 2023 and 2028. Purpose-built AI infrastructure is at the core of this growth, with IDC estimating that AI data center capacity will expand at a 40.5% CAGR through 2027.
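
          A quick sanity check on those projections (a sketch; the ~430 TWh 2023 baseline is only implied by IDC’s “double to 857 TWh” claim, not stated directly):

          ```python
          # IDC: data-center electricity roughly doubles to 857 TWh between 2023 and 2028.
          baseline_2023 = 857 / 2                        # implied baseline, ~428 TWh
          growth = (857 / baseline_2023) ** (1 / 5) - 1  # doubling over 5 years
          print(f"Implied data-center growth: {growth:.1%}/yr")  # ~14.9%/yr

          # IDC's 40.5% CAGR for purpose-built AI capacity, compounded 2023 -> 2027:
          print(f"AI capacity multiplier: {1.405 ** 4:.1f}x")    # ~3.9x in four years
          ```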

          Let’s just say we’re at the halfway point and it’s 600 TWh per annum for AI, compared to roughly 300 for gamers.

          So roughly fucking double, yeah.
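
          And the comparison itself, so nobody has to take my word for the arithmetic (midpoints of the ranges above; “halfway” on the AI side means halfway between the 2022 estimate and the 2028 projection, and it lumps all data centers together, not just AI):

          ```python
          # Midpoint of the data-center trajectory: ~340 TWh (IEA, 2022 upper estimate)
          # rising toward 857 TWh (IDC, 2028 projection).
          ai_twh = (340 + 857) / 2      # ~600 TWh/yr
          gaming_twh = (210 + 386) / 2  # ~300 TWh/yr, from the extrapolation above
          print(f"AI/data centers ~{ai_twh:.0f} TWh vs gaming ~{gaming_twh:.0f} TWh: "
                f"{ai_twh / gaming_twh:.1f}x")  # ~2.0x
          ```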

          And to reiterate, people generate thousands of frames in a session of gaming, vs a handful of images or maybe some emails in a session of AI.