Can we talk about how absolutely terrifying that 600W figure is? The primary use case isn't transcoding or generating slop, it's playing computer games. What was wrong with previous-generation graphics that we still need to push for more raw performance, rather than reducing power draw?
What was “wrong” is that enough people are willing to pay exorbitant prices for the highest-end gear that Nvidia can do most anything they want as long as their products have the best numbers.
Other companies do make products with lower power draw — Apple in particular has some good stuff in this space for people who need it for AI and not gaming. And even in the gaming space, you have many options for good products — but people who apparently have money to burn want the best at any cost.
We must be thinking about very different types of games, because even though I’m completely bought into the Apple ecosystem and love my M3 MacBook Pro and Mac mini, I have a Windows gaming PC sitting in the corner because very few titles I’d want to play are available on the Mac.
Perhaps I phrased it poorly, but I was trying to separate out GPU workloads for AI and gaming. The Apple ecosystem is very poor for gaming overall, but for ML and LLM workloads it delivers very good performance at a fraction of the power draw of a modern Nvidia card.
So the point being, Nvidia is optimizing for gamers who are willing to throw top dollar at the best gear, regardless of power draw. But it’s a choice, and other manufacturers can make different tradeoffs.
Is gaming even the primary use case for the *090 series anymore? The 5070, which is probably the most popular gaming card, is 250W. If I recall correctly it can push 4k @ 60fps in most games.
But yes, I do agree that TDPs for GPUs are getting ridiculous.
4k 60Hz is still largely unachievable even for top-of-the-line cards when testing recent games with effects like raytracing turned up. For example, an RTX 4090 can run Cyberpunk 2077 at 4k at over 60fps with the Ray Tracing Low preset, but not with any of the higher presets.
However, it's easy to be misled into thinking that 4k60 gaming is readily achieved by more mainstream hardware, because games these days cheat by default, using upscaling and frame interpolation to artificially inflate the reported resolution and frame rate without actually achieving the image quality that those numbers imply.
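As a rough illustration of how far apart "reported" and "rendered" can get (a back-of-the-envelope sketch, assuming a Performance-style upscaler that renders at 1920x1080 internally for a 3840x2160 output, plus 2x frame generation; the exact factors vary by game and setting):

    # Back-of-the-envelope: pixels actually rendered per second for "4k 60fps"
    # when aggressive upscaling and frame interpolation are enabled.
    # Assumptions: 1920x1080 internal render for a 3840x2160 output, and
    # every other displayed frame interpolated rather than rendered.
    native = 3840 * 2160 * 60           # ~498M rendered pixels/s
    with_tricks = 1920 * 1080 * 30      # ~62M rendered pixels/s
    print(f"native 4k60:  {native / 1e6:.0f} Mpix/s rendered")
    print(f"with tricks:  {with_tricks / 1e6:.0f} Mpix/s rendered "
          f"({native / with_tricks:.0f}x less)")

Under those assumptions the GPU renders roughly 8x fewer pixels than the frame counter implies.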
Gaming is still a class of workloads where the demand for more GPU performance is effectively unlimited, and there's no nearby threshold of "good enough" beyond which further quality improvements would be imperceptible to humans. It's not like audio where we've long since passed the limits of human perception.
4k@60 isn't all that impressive a bar today, and a 5070 can only hit it in modern games with reduced graphics settings.
x90 cards, IMO, are bought either by people who absolutely need them (yay market segmentation) or by people who simply can (affording them is another story) and want the best of the latest.
This generation seems to be getting its performance from more power and more cores. It's not really an architectural change, just packing more things into the chip that require more power.
Too true. I've been looking to replace my 1080. It was a beast in 2016, but the only way I can get a more performant card these days is to double the power draw. That's not really progress.
Then get a modern GPU and limit the power to what your 1080 draws. It will still be significantly faster. GPU power is out of control these days; if you knock 10% off the power budget, you generally only lose a few percentage points of performance.
Cutting the 5090 down from 575w to 400w is a 10% perf decrease.
The 5090 was just an example; the same approach applies to lower-tier GPUs that don't require extra power cables. E.g. a 3080 with the same power budget as a 1080 would run circles around it (a 1080 at its default 180W power limit scores approx 7000 in TimeSpy, a 3090 limited to 150W scores approx 11500). Limiting the power budget is very simple with tools such as MSI Afterburner and others in the same space.
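If you'd rather script it than click through Afterburner, here's a minimal sketch using the NVML Python bindings (pip install nvidia-ml-py); setting the limit generally needs admin/root, and the 150W target below is just an illustrative value, not a recommendation for any particular card:

    # Minimal sketch: query a GPU's allowed power-limit range and cap it via NVML.
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    # Limits are reported in milliwatts.
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
    current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
    print(f"allowed {min_mw // 1000}-{max_mw // 1000} W, current {current_mw // 1000} W")

    target_mw = 150 * 1000                         # illustrative 150 W cap
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, max(min_mw, min(target_mw, max_mw)))

    pynvml.nvmlShutdown()

(nvidia-smi -pl <watts> does roughly the same thing from the command line.)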
Because previous-generation graphics didn't include ray/path tracing or DLSS technologies. They had baked-in lighting and shaders that required much less compute to generate. Now that games do include those, they require more computing power, which (we assume) Nvidia hasn't been able to deliver through efficiency improvements, but only by pushing more power through the card.
It's what Intel has been grappling with: their CPUs are drawing more and more wattage at the top end.
1. People want their desktop computers to be fast. These are not made to be portable battery sippers. Moar powa!!!
2. People have a power point at the wall to plug their appliances into.
Ergo, desktop computers will tend towards 2000w+ devices.
"Insane!" you may cry. But a look at the history of car manufacture suggests that the market will dictate the trend. And in similar fashion, you will be able to buy your overpowered beast of a machine, and idle it to do what you need day to day.
Well exactly my point. I'm "still" using an M1 Mac mini as my daily driver. 6W idle. In a desktop. It is crazy fast compared to the Intel Macs of the year before, but the writing was already on the wall: this is the new low-end, the entry level.
Still? It runs Baldur's Gate 3. Not smoothly, but it's playable. I don't have an M4 Pro Max Ultra Plus around to compare the apples to apples, but I'd expect both perf and perf per watt to be even better.
If one trillion-dollar company can manage this, why can't the other?
I imagine it's using more than 6W to play Baldur's Gate 3, but still, I get that it is far more efficient for the work being done. I'm a bit irked that my desktop idles at 35W. But then I recall growing up with 60W light bulbs as the default room lighting...
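For a rough sense of scale (back-of-the-envelope, assuming the machine sits idle around the clock and an illustrative $0.30/kWh rate):

    # Back-of-the-envelope annual energy for a machine idling 24/7.
    RATE_PER_KWH = 0.30  # illustrative electricity price, not anyone's real tariff
    for label, watts in [("M1 Mac mini idle", 6), ("my desktop idle", 35), ("one 60W bulb", 60)]:
        kwh_per_year = watts * 24 * 365 / 1000
        print(f"{label}: {kwh_per_year:.0f} kWh/yr, ~${kwh_per_year * RATE_PER_KWH:.0f}/yr")

So the 35W desktop idles through roughly 300 kWh a year, about six times what the 6W box uses.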
But other people will look at that and say "Not smooth = unplayable. If you can do so much with 100w or less, then let's dial that up to 2000w and make my eyes bleed!"
We're not the ones pushing the limits of the market it seems.
Is your argument that computer games don't merit better performance (e.g. pushing further into 4K), that they shouldn't expand beyond the current crop, and/or that we should give up on better VR/AR?