- cross-posted to:
- gpu@programming.dev
“Jensen sir, 50 series is too hot”
“Easy fix with my massive unparalleled intellect. Just turn off the sensor”
If you needed any more proof that Nvidia is continuing to enshittify their monopoly and milk consumers, here it is. Hey, let's remove one of the critical things that lets you diagnose a bad card and catch the bad situations before they kill the GPU! Don't need that shit, just buy new ones every 2 years, you poors!
If you buy an Nvidia GPU, you are part of the problem here.
Oh I’m sure the lower cards will run cool and fine for average die temps. The 5090 is very much a halo product with that ridiculous 600 W TBP. But as with any physical product, things decay over time or get assembled incorrectly, and that’s exactly what hotspot temp reporting helps diagnose.
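For anyone who wants to keep an eye on their card themselves, here's a minimal sketch using Python and the nvidia-ml-py (pynvml) bindings. Note this only reads the averaged die temperature and board power that NVML exposes, not the per-sensor hotspot value the thread is about, and the device index and poll interval are just assumptions for illustration.

```python
# Minimal sketch: poll the averaged GPU die temperature and board power via NVML.
# Assumes nvidia-ml-py (pynvml) is installed and device index 0 is the card in question.
# NVML only reports the averaged die temp; the separate hotspot sensor discussed
# above is not exposed here, which is the whole problem.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # NVML reports milliwatts
        print(f"die temp: {temp_c} °C, board power: {power_w:.0f} W")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```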
Isn’t a GPU that pulls 600 watts in the whackjob territory?
The engineers need to get the 6090 to use 400 watts. That would be a very big PR win that does not need any marketing spin to sell.
It’s not a node shrink, just a more AI-focused architecture on the same node as the 4090. To get more performance they need more powah. I’ve seen reviews reporting a ~25% increase in raw performance at the cost of ~20% more power than the 4090.
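Which works out to barely any efficiency gain. A quick back-of-the-envelope check on those review figures:

```python
# Rough perf-per-watt check using the ~25% performance / ~20% power numbers above.
perf_gain = 1.25   # ~25% more raw performance than the 4090
power_gain = 1.20  # at ~20% more power draw
print(f"perf/watt change: {perf_gain / power_gain - 1:+.1%}")  # roughly +4%
```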