“Jensen sir, 50 series is too hot”

“Easy fix with my massive unparalleled intellect. Just turn off the sensor”

If you needed any more proof that Nvidia is continuing to enshittify its monopoly and milk consumers, here it is. Hey, let's remove one of the critical things that lets you diagnose a bad card and catch bad situations that might leave a GPU at death's door! Don't need that shit, just buy new ones every 2 years, you poors!

If you buy a Nvidia GPU, you are part of the problem here.

  • empireOfLove2@lemmy.dbzer0.com (OP) · edited · 2 days ago

    Unlikely, as the hotspot sensors and detection logic are baked into the chip silicon and its microcode. AIBs can only change the PCB around the die. I'd almost guarantee the thermal sensors are still present to avoid fires, but if Nvidia has turned off external reporting outside the chip itself (beyond telling the driver that the thermal limit has been reached), I doubt AIBs are going to be able to crack it either.

    Also, the way Nvidia operates, if an AIB deviates from Nvidia's mandatory process, they'll get blackballed and put out of business. So they won't. Daddy Jensen knows best!

      • empireOfLove2@lemmy.dbzer0.com (OP) · 1 day ago

        Oh, I'm sure the lower cards will run cool and fine for average die temps. The 5090 is very much a halo product with that ridiculous 600 W TBP. But as with any physical product, things decay over time or get assembled incorrectly, and that's exactly what hotspot temperature reporting helps diagnose.

        • Rav Sha'ul@discuss.tchncs.de · 1 day ago

          Isn't a GPU that pulls 600 watts in whackjob territory?

          The engineers need to get the 6090 down to 400 watts. That would be a huge PR win that wouldn't need any marketing spin to sell.

          • alphabethunter@lemmy.world · 1 day ago

            It's not a node shrink, just a more AI-focused architecture on the same node as the 4090. To get more performance they need more powah. I've seen reviews citing a ~25% increase in raw performance at the cost of ~20% more powah than the 4090.
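
            Taking those review figures at face value (rough numbers from reviews, not official specs), the generational perf-per-watt gain works out to only a few percent:

            ```python
            # Back-of-the-envelope perf-per-watt check for the figures quoted above:
            # ~25% more raw performance for ~20% more power draw than the 4090.
            perf_ratio = 1.25   # assumed ~25% performance uplift
            power_ratio = 1.20  # assumed ~20% power increase

            efficiency_change = (perf_ratio / power_ratio - 1) * 100
            print(f"perf/watt change vs 4090: {efficiency_change:+.1f}%")  # ~ +4.2%
            ```

            In other words, same node plus more power gets you most of that 25%, with only a marginal efficiency improvement.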