Nvidia's GeForce RTX 5090 is here, bringing next-gen power with a $2,000 price tag. Featuring a 23% larger die, 33% more cores, and blazing-fast GDDR7 memory, it's built to dominate – but is it worth it?
Frankly, TechSpot should just get rid of the number score in their reviews. It's often contentious, applied inconsistently across different products, and means absolutely nothing to the readers.
You guys can't be serious!
80/100 for a $2,000 GPU that barely gets 27% more performance?
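To put that 27% figure in context, here's a rough value calculation. The RTX 4090's $1,599 launch MSRP is assumed as the baseline; the 27% uplift is the figure quoted in this thread, so treat the result as a ballpark, not a benchmark.

```python
# Rough price/performance comparison, assuming the 4090's $1,599
# launch MSRP and the ~27% average uplift quoted in this thread.
msrp_4090 = 1599
msrp_5090 = 2000
perf_uplift = 1.27  # 5090 performance relative to the 4090

price_increase = msrp_5090 / msrp_4090          # ~1.25x the price
perf_per_dollar = perf_uplift / price_increase  # ~1.02x the value

print(f"Price increase: {price_increase:.1%}")
print(f"Perf-per-dollar vs 4090: {perf_per_dollar:.1%}")
```

By this math the 5090 is only about 1-2% better value per dollar than the card it replaces, which is why the uplift feels underwhelming at this price.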
I sold my 7900 XTX for a 4080S and am incredibly glad I did so.
Great review as always... it is what it is, whether anybody wants it or not.
What surprises me a little is how good the 7900 XTX is in comparison, unless you are using RT where it sucks.
In my country it was literally half the price of the RTX 4090 last year... on the 2nd hand market it is almost 1/3 the price of a used 4090! crazy...
Well, now it seems like the 5070 will be a major failure.
I am not sure it will be able to beat the 4070 SUPER.
It is built on a similar process node to the previous generation, so the performance increase could only come from increased die size and power. Nvidia did what could be done with the available process. The extra 8GB of RAM alone adds about 20W to total power consumption.
They can't all be winners. As much as I like to hate on nVidia, I had no desire for this card to be as disappointing as it is. If you factor in the increased die space, it's even less impressive. The performance per mm^2 has actually gone down in some ways.
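The per-area and per-core scaling can be sketched from the figures cited in this thread (23% larger die, 33% more cores, ~27% average uplift) — these are the thread's numbers, not measured data, so this is illustrative arithmetic only:

```python
# Scaling sketch using the figures quoted in this thread:
# 23% larger die, 33% more cores, ~27% average performance uplift.
die_growth = 1.23
core_growth = 1.33
perf_uplift = 1.27

perf_per_mm2 = perf_uplift / die_growth    # ~1.03: roughly flat per die area
perf_per_core = perf_uplift / core_growth  # ~0.95: down ~5% per core

print(f"Perf per mm^2 vs 4090:  {perf_per_mm2:.2f}x")
print(f"Perf per core vs 4090:  {perf_per_core:.2f}x")
```

So per unit of die area the gain is barely above break-even, and per core it actually regresses, which is consistent with the uplift coming from a bigger chip rather than a better one.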
Maybe this gives AMD the opportunity to take some much-needed market share. Let's hope AMD's UDNA flagship in 2026 will bring some excitement back to the GPU space.
This is the new order: 20% improvements to each card so it lands one model higher than the previous gen, with similar price bumps between them.
As someone who plans to upgrade to 8K as soon as it becomes viable: it's pointless on screens below ~32 inches. Perhaps if you want a 60" monitor then 8K makes sense, but it won't make sense until we see 8K120 displays. The other major issue, and the reason I haven't upgraded to an 8K60 display yet, is that the onboard processing increases LATENCY. If you look at input lag on the same display going from 1080p to 4K, it goes from ~2ms to ~8-10ms depending on the manufacturer. On 8K displays, the latency is between 70-100ms.
Two things:
1. 4K is no longer sufficient to benchmark as the top resolution. I think it's time to start looking at 5K, 6K, or 8K for the top resolution, ESPECIALLY with DLSS 4. And let's face it, 8K TVs have been available at B&M retailers for a while now. Time to rethink the high end. Keep 1080p since it's the most common resolution, but either 1440p or 4K has to go if we can only have three tested resolutions.
2. RT, when looking at it as a performance penalty compared to the 40 series, has basically stagnated. While numbers are overall higher, they appear to still have roughly the same performance penalty compared to the prior generation. That's disappointing given how this is now the 4th generation of RT hardware from nVidia. Intel is now showing up as a serious RT contender too.
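The 8K latency figures quoted above can be put in frame terms to show why they matter. The ~8-10ms (4K) and 70-100ms (8K60) numbers are the ones from the comment, not measurements of any specific display:

```python
# Convert display processing latency into refresh intervals,
# using the figures quoted in the comment above (~10 ms at 4K,
# up to ~100 ms on current 8K60 sets).
def frames_of_lag(latency_ms: float, refresh_hz: float) -> float:
    """Number of refresh intervals a given processing delay costs."""
    frame_time_ms = 1000 / refresh_hz
    return latency_ms / frame_time_ms

print(frames_of_lag(10, 144))   # ~1.4 frames behind at 4K144
print(frames_of_lag(100, 60))   # 6 full frames behind at 8K60
```

Being a full six frames behind the GPU's output is why high onboard latency, not just pixel count, keeps 8K gaming impractical for now.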