Nvidia GeForce RTX 5090 Review

You guys can't be serious! 80/100 for a $2,000 GPU that barely gets 27% more performance?
While consuming 100W more than a 4090?

And offering the same cost-per-frame value as a 4090 from two years ago?
Flagship or not, this is horrible. Not to mention the worst generational uplift Nvidia has ever delivered... 27%...
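
To put rough numbers on that cost-per-frame point, a quick back-of-the-envelope sketch (the MSRPs and the 27% uplift here are assumed figures for illustration, not taken from the review's data tables):

```python
# Back-of-the-envelope cost per frame. Launch MSRPs ($1,599 / $1,999)
# and the ~27% average uplift are assumptions for illustration.
msrp_4090, msrp_5090 = 1599, 1999
fps_4090 = 100                     # normalized baseline
fps_5090 = fps_4090 * 1.27

print(f"4090: ${msrp_4090 / fps_4090:.2f} per fps")   # ~$15.99
print(f"5090: ${msrp_5090 / fps_5090:.2f} per fps")   # ~$15.74
# Essentially the same dollars-per-frame, two years later.
```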

reality.jpg
 
Great review as always... it is what it is, whether anybody wants it or not.

What surprises me a little is how good the 7900 XTX is in comparison, unless you are using RT, where it sucks.

In my country it was literally half the price of the RTX 4090 last year... on the second-hand market it is almost a third the price of a used 4090! Crazy...
 
The most disappointing aspect is that the 5000 series did not become more efficient at ray tracing.

If the performance hit for RT were significantly smaller than on the 4000 series, it would have increased the value of the new cards (particularly the still-unreleased ones).
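
A minimal sketch of what "more efficient at RT" means here, with made-up frame rates purely for illustration: if the relative hit stays the same, the RT hardware hasn't really improved.

```python
# The RT "hit" is the fraction of raster performance lost with RT enabled.
# All frame rates below are hypothetical.
def rt_hit(fps_raster: float, fps_rt: float) -> float:
    return 1 - fps_rt / fps_raster

print(rt_hit(100, 55))   # 0.45  -> hypothetical 4090-class card
print(rt_hit(127, 70))   # ~0.45 -> hypothetical 5090-class card
# Higher absolute fps, same ~45% penalty: no RT efficiency gain.
```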

 
They can't all be winners. As much as I like to hate on nVidia, I had no desire for this card to be as disappointing as it is. If you factor in the increased die space, it's even less impressive: the performance per mm^2 has actually gone down in some ways.
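
Roughly, using approximate public die sizes (AD102 ~609 mm², GB202 ~750 mm²; treat these as assumptions):

```python
# Perf-per-area check with approximate public die sizes.
die_4090, die_5090 = 609, 750      # mm^2, approximate
uplift_4k = 1.27                   # ~27% at 4K

area_ratio = die_5090 / die_4090           # ~1.23x the silicon
print(f"{uplift_4k / area_ratio:.2f}x perf/mm^2 at 4K")   # ~1.03x
# Barely above parity at 4K; at 1440p, where the uplift is smaller,
# perf per mm^2 falls below the 4090's.
```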

Maybe this gives AMD the opportunity to take some much-needed market share. Let's hope AMD's UDNA flagship in 2026 brings some excitement back to the GPU space.
 
Well, now it seems like the 5070 will be a major failure.

I am not sure it will be able to beat the 4070 SUPER.
 
It was never going to be worth the money, and it was never going to matter. People with money to burn who just want the fastest card will buy it. That's it.

The 5070 Ti is the card I want to see. Maybe AMD can throw a spanner in the works with the 9070 XT if it comes in near a 4070 Ti, leaving plenty of choices under $600.
 
Is it sad that the only thing I appreciate and am impressed by about the 5090 FE is that it manages to keep ~575W of board power under control with a 2-slot cooler design, as opposed to the 3-slot design of the 4090 FE (and plenty of partner boards) and the 4-slot monstrosities (read: monstrosities) out there?

non-edit: same as what harby said.
 
I sold my 7900XTX for a 4080S and am incredibly glad I did so.

The 7900XTX looks good if you squint at the graphs, but the horrific RT performance doesn't make sense on a flagship card when RT is such a strong visual upgrade in many titles. With the 7900XTX, RT just wasn't an option in many of them, whereas I can enjoy both RT and fluidity with my 4080S.

On top of that, the presented raster results assume that FSR2/3 are equivalent to DLSS, which is simply not the case (and the gap will soon widen with DLSS 4). With FSR I had to check each game for artifacts to see whether it had a good implementation at the various quality levels. On my 4080S I just set DLSS Quality by default and drop to Balanced as needed without much worry (at 1440p).

This is on top of some nasty driver issues/crashes in newly released games. You really get a lot for the Nvidia premium.
 
The 5090 is built on a similar process node to the previous generation, so the performance increase could only come from increased die size and power. Nvidia did what could be done with the process available. The extra 8GB of RAM alone adds about 20W to total power consumption.
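
That lines up with the published specs, as a rough check (the core counts and board power below are the commonly cited figures; treat them as approximate):

```python
# On a near-identical node, performance scales mostly with silicon and power.
cores_4090, tdp_4090 = 16384, 450    # CUDA cores, board power (W)
cores_5090, tdp_5090 = 21760, 575

print(f"cores: +{cores_5090 / cores_4090 - 1:.0%}")   # +33%
print(f"power: +{tdp_5090 / tdp_4090 - 1:.0%}")       # +28%
# A ~27% real-world uplift is about what brute-force scaling predicts,
# with no obvious architectural efficiency gain on top.
```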
 
It's silly to protest the price of halo products that are meant for "price is no object" buyers. Let's see where the 5080 lands.

It's also odd to ding multi-frame-gen for not improving responsiveness when it is explicitly about motion fluidity. As someone who hates blur this is an exciting development, and I've been using 2xFG on demanding games without issue to improve presentation.
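
The fluidity-vs-responsiveness distinction is easy to show with a toy calculation (numbers assumed):

```python
# Frame generation raises presented fps, but input is still sampled
# at the base render rate. Illustrative numbers only.
base_fps = 60
fg_factor = 4                               # 4x multi-frame generation

presented_fps = base_fps * fg_factor        # 240 fps on screen
print(f"frame-to-frame: {1000 / presented_fps:.1f} ms")  # ~4.2 ms (smoother)
print(f"input sampling: {1000 / base_fps:.1f} ms")       # ~16.7 ms (unchanged)
```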
 
This is the new order: a ~20% improvement for each card, positioning it one model higher than the previous gen, with similar price bumps between them:

5090: AI GPU / halo product (now irrelevant for positioning the rest of the stack)
5080: +20% over the 4080, $1000
5070 Ti: +20% over the 4070 Ti (~4080), $750
5070: +20% over the 4070 Super (~4070 Ti), $550
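
Expressed against a 4090 = 100 baseline, that speculation looks something like this (the prior-gen relative numbers are my rough assumptions, and the +20% is a guess, not a measurement):

```python
# Hypothetical stack positioning; prior-gen relative performance is assumed.
prev_gen = {"4080": 80, "4070 Ti": 68, "4070 Super": 62}   # 4090 = 100
speculation = [("5080", "4080", 1000),
               ("5070 Ti", "4070 Ti", 750),
               ("5070", "4070 Super", 550)]

for card, base, price in speculation:
    print(f"{card}: ~{prev_gen[base] * 1.20:.0f} (4090 = 100), ${price}")
# 5080 ~96, 5070 Ti ~82 (about a 4080), 5070 ~74 (about a 4070 Ti).
```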
 
Two things

1. 4K is no longer sufficient as the top benchmark resolution. I think it's time to start looking at 5K, 6K, or 8K for the top resolution, ESPECIALLY with DLSS 4. And let's face it, 8K TVs have been available at brick-and-mortar retailers for a while now. Time to rethink the high end: keep 1080p since it's the most common resolution, but 1440p or 4K has to go if we can only have three tested resolutions (see the pixel-count sketch after this list).

2. RT, measured as the performance penalty versus raster, has basically stagnated. While the absolute numbers are higher, the relative hit appears roughly the same as the prior generation's. That's disappointing given that this is now the 4th generation of RT hardware from nVidia, and Intel is now showing up as a serious RT contender too.
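
On point 1, the raw pixel math makes the case (standard resolutions assumed):

```python
# Relative pixel counts: shading load scales with resolution.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440),
               "4K": (3840, 2160), "5K": (5120, 2880), "8K": (7680, 4320)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} MP, {w * h / base:.1f}x 1080p")
# 8K is 16x the pixels of 1080p and 4x those of 4K, which is why it
# could separate flagship cards even where 4K starts to look CPU-bound.
```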
 
I'm getting a 5090 (and a 9800X3D), but I'm also upgrading from my 2019 rig (2080 Ti/2950X). While the cost is higher, after you adjust for the ~25% of inflation we've had since 2019, the increase isn't that dramatic. If I keep this new system for another 5-6 years, I'm paying less than a grand a year to have flagship parts, which isn't bad considering how much time I spend using it.
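
The math, roughly (the ~25% cumulative inflation figure, the 2080 Ti launch price, and the build cost below are all assumptions):

```python
# Rough inflation adjustment and per-year cost. All figures assumed.
price_2080ti = 1199                    # USD, approximate launch price
inflation = 1.25                       # ~25% cumulative since 2019
print(f"2080 Ti in today's dollars: ~${price_2080ti * inflation:.0f}")  # ~$1,499

build_cost, years = 5000, 6            # hypothetical flagship build
print(f"~${build_cost / years:.0f} per year")   # ~$833 -> "less than a grand"
```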

I for one am excited about the extra, faster VRAM in the 5090, which will be extremely useful in the local AI workloads I run (and in games probably 4-5 years from now), but I agree that this does feel more like a 4090 Ti than an all-new generation of cards.
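
A rough rule of thumb for why the 32GB matters in local AI (the overhead factor and model sizes are assumptions; real usage also adds KV cache on top):

```python
# Approximate weight memory for local LLMs: params * bytes-per-weight,
# plus ~20% overhead. Rule of thumb only.
def model_vram_gb(params_b: float, bits: int, overhead: float = 1.2) -> float:
    return params_b * bits / 8 * overhead

for params_b, bits in [(13, 16), (24, 8), (48, 4)]:
    print(f"{params_b}B @ {bits}-bit: ~{model_vram_gb(params_b, bits):.0f} GB")
# ~31, ~29, and ~29 GB: each fits in the 5090's 32GB but not the 4090's 24GB.
```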

I am curious though, other sites have stated that there is a process node improvement for the 5090, specifically that it uses 4NP (compared to the 4090's 4N). Not a full generational leap, but an optimization. Still, with the measured power increases, I wouldn't have noticed.
 
On the 8K suggestion: I plan to upgrade to 8K as soon as it becomes viable, but it's pointless on screens of ~32 inches or smaller. Perhaps if you want a 60" monitor, 8K makes sense. Even then, it won't make sense until we see 8K120 displays. The other major issue, and the reason I haven't upgraded to an 8K60 display yet, is that the onboard processing increases LATENCY. If you look at input lag on the same display going from 1080p to 4K, it rises from ~2ms to ~8-10ms depending on the manufacturer. On 8K displays, the latency is between 70-100ms.
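
To put those latency figures in frame terms (using the numbers above, with 120Hz assumed as the target refresh rate):

```python
# Processing latency expressed as frames of delay at 120 Hz.
def frames_behind(latency_ms: float, hz: int = 120) -> float:
    return latency_ms / (1000 / hz)

print(f"4K @ ~9 ms:  {frames_behind(9):.1f} frames")    # ~1.1 frames
print(f"8K @ ~85 ms: {frames_behind(85):.1f} frames")   # ~10.2 frames
# At 8K the display alone lags by roughly ten full frames,
# which no GPU can make up for.
```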

8K displays have a long way to go before they are game-capable. For now, they're only good for watching movies.

I almost bought the Samsung QN800C last year. Best Buy let me connect my laptop to it and test it out. Even in game mode, the latency was TERRIBLE.

 
27% more performance for 30% more power? That doesn't seem like a new generation; it's just a bigger version of the last one. Of course there are improvements on the software side (plenty of them), but it also costs $400 more.
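
The efficiency side of that, in one line (using the 27% and 30% figures as given):

```python
# Performance per watt from the figures quoted above.
perf_gain, power_gain = 1.27, 1.30
print(f"perf/W vs 4090: {perf_gain / power_gain:.3f}x")   # ~0.977x
# Slightly worse efficiency than the 4090: scaling, not a new architecture.
```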
 