RTX 5090 early benchmarks show underwhelming performance uplift over the RTX 4090

That is BECAUSE:
Blackwell is not designed/engineered for games the way RDNA is.
The Blackwell architecture and the chip the RTX 5090 is based on (GB202-300) is not even the full die. The RTX 5090 is a cut-down of a PRO card (the full die) that sells for $3,499.

CUDA is not for gaming. Nvidia just overmarkets its lack of raw power behind gimmicks to sell to gamers who do not do creative content and do not work in Enterprise.

The RTX 5080 is going to illustrate that^ even more, because it is an even smaller chip with even less gaming die space than the 5090.


If you are a GAMER, then RTX is not for you; that is why Sony, Microsoft, Valve, etc. all chose RDNA as the architecture for their gaming hardware. AMD is for gaming.
Don't forget that AMD is going two-pronged: chiplet and monolithic.

AMD is about to humiliate Jensen for trying to push so much of their non-gaming hardware off on consumers instead of prosumers.

The RX 9080 is going to show the power of RDNA and the price/performance/power supremacy of their gaming architecture.

And then later this year, AMD will announce their top-tier chiplet architecture using AMD's prosumer XDNA, allowing AMD to compete directly with nVidia's $3k "gaming card", while offering custom chiplet designs catering to individual needs.

If you need more AI, then choose the RX chiplet with more tensor/XDNA cores, etc. If you want all raster, then pick the chiplet that has your best interests at heart.

Blackwell is a joke for gaming!
Yeah... except... AMD hasn't rivalled Nvidia for years in the high end - and has already admitted they are never going to.

AMD sells low-performance, cost-effective cards... but they haven't been able to hold a candle to Nvidia in ages.

Blackwell might not have been designed SOLELY for gaming - but it still games WAY better than any AMD card you can buy - or will be able to buy in the near future.

Interesting take I hadn't considered, thank you. I haven't owned an AMD card since my dual HD 5670s in crossfire... and now I feel old.
Don't be too interested... he's an AMD shill... make sure you fact check everything he posts...
 
33% more cores, a 512-bit vs 384-bit bus, GDDR7 for a massive bandwidth increase, and a much higher TDP; if the 4090 had these specs, it would have jumped 100% over its predecessor. This is an absolutely pitiful generational improvement that relies on brute force and DLSS BS to blind the rusted-on fanboys. Nvidia's RDNA3 moment.
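For what it's worth, the bandwidth part of that spec list checks out on paper. A quick back-of-the-envelope sketch, assuming the commonly cited per-pin rates (21 Gbps GDDR6X on the 4090, 28 Gbps GDDR7 on the 5090):

```python
# Peak memory bandwidth = (bus width in bits / 8 bits-per-byte) * per-pin data rate.
# Per-pin rates below are the commonly cited launch specs, not measured values.

def bandwidth_gbs(bus_bits: int, rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and per-pin rate."""
    return bus_bits / 8 * rate_gbps

bw_4090 = bandwidth_gbs(384, 21.0)   # 384-bit GDDR6X @ 21 Gbps
bw_5090 = bandwidth_gbs(512, 28.0)   # 512-bit GDDR7 @ 28 Gbps
print(f"4090: {bw_4090:.0f} GB/s, 5090: {bw_5090:.0f} GB/s "
      f"(+{(bw_5090 / bw_4090 - 1) * 100:.0f}%)")
```

So the memory subsystem alone grew far more than the ~30% average uplift the benchmarks show.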

Now we hear AMD is delaying the 9070 to late March to cut Nvidia a break.
 
Tensor Cores come into play for AI-accelerated denoising of the noisy ray-traced image.
They don't. That's only a thing in games that have DLSS ray reconstruction, where AI-accelerated denoising is the point of that feature.
If denoising required tensor cores, AMD GPUs and the consoles wouldn't support ray tracing as a feature.
 
Yeah... except... AMD hasn't rivalled Nvidia for years in the high end - and has already admitted they are never going to.

AMD sells low-performance, cost-effective cards... but they haven't been able to hold a candle to Nvidia in ages.

Blackwell might not have been designed SOLELY for gaming - but it still games WAY better than any AMD card you can buy - or will be able to buy in the near future.


Don't be too interested... he's an AMD shill... make sure you fact check everything he posts...
dude, you're talking about literally 1% of the market. Everyone who just wants to get to work in the morning doesn't care that Ferrari is charging $500,000 for a car. And the idea that people are having difficulty seeing that is frustrating.

Most people just want to play games at an "acceptable" level. nVidia has been pushing the games industry in this super-high-fidelity direction and now they haven't been able to follow it up with the 50 series. For the 90% of the industry that plays on a 70-class card or below, no one cares if AMD isn't competing at the high end. Frankly, I see the most exciting products of the last several years coming from the lowest end of the market in the form of APUs and small form factor PCs.

That isn't even beginning to go over how poor development has made graphics WORSE than they were on the 8th-gen consoles. All people are really asking for is PS4-level graphics at 120+ Hz. The PS5 is basically a 60 Hz PS4.

One of the coolest products right now, at least to me, is the Minisforum G7 Pt. It's a mini PC with a 7945HX and a 7600M XT in it for $1000. All in an 85-watt package that can fit in your pocket. That is way more exciting to me as a product than a 600-watt AI server chip that didn't make the cut to be sold to a data center for $50k. And if you look at all the hyperscalers canceling their orders for Blackwell, they aren't too happy with NV right now, either.

They wasted die space on more AI features that no one was asking for. The real tragedy here is that the 5090 COULD have been another awesome card like the 4090, but they wasted their die space. Now they soft-locked DLSS 4 to the 50 series because people wouldn't upgrade if they let DLSS 4 run on the 4090s. But, honestly, I don't think many people who bought 4090s are going to buy 5090s just to use DLSS 4, because a 20% performance boost is not worth $2000, probably much more than that after "price adjustments" from stores and board partner cards that are going to be selling for $200-300 over MSRP.

Unless you REALLY NEED 4x framegen (which, if nVidia has their way, you won't be able to play games without), there is no sensible reason to upgrade to a 5090 unless you're still on a 3090 and need an upgrade after 4 years.
 
dude, you're talking about literally 1% of the market. Everyone who just wants to get to work in the morning doesn't care that Ferrari is charging $500,000 for a car. And the idea that people are having difficulty seeing that is frustrating.

Most people just want to play games at an "acceptable" level. nVidia has been pushing the games industry in this super-high-fidelity direction and now they haven't been able to follow it up with the 50 series. For the 90% of the industry that plays on a 70-class card or below, no one cares if AMD isn't competing at the high end. Frankly, I see the most exciting products of the last several years coming from the lowest end of the market in the form of APUs and small form factor PCs.

That isn't even beginning to go over how poor development has made graphics WORSE than they were on the 8th-gen consoles. All people are really asking for is PS4-level graphics at 120+ Hz. The PS5 is basically a 60 Hz PS4.

One of the coolest products right now, at least to me, is the Minisforum G7 Pt. It's a mini PC with a 7945HX and a 7600M XT in it for $1000. All in an 85-watt package that can fit in your pocket. That is way more exciting to me as a product than a 600-watt AI server chip that didn't make the cut to be sold to a data center for $50k. And if you look at all the hyperscalers canceling their orders for Blackwell, they aren't too happy with NV right now, either.

They wasted die space on more AI features that no one was asking for. The real tragedy here is that the 5090 COULD have been another awesome card like the 4090, but they wasted their die space. Now they soft-locked DLSS 4 to the 50 series because people wouldn't upgrade if they let DLSS 4 run on the 4090s. But, honestly, I don't think many people who bought 4090s are going to buy 5090s just to use DLSS 4, because a 20% performance boost is not worth $2000, probably much more than that after "price adjustments" from stores and board partner cards that are going to be selling for $200-300 over MSRP.

Unless you REALLY NEED 4x framegen (which, if nVidia has their way, you won't be able to play games without), there is no sensible reason to upgrade to a 5090 unless you're still on a 3090 and need an upgrade after 4 years.
Not gonna argue about cost/performance... Nvidia clearly doesn't give a F** about that - and that certainly annoys me... but Nvidia has about 80% of the discrete GPU market for a reason. They're superior to AMD... I really wish they weren't as I resent being charged $1000 (or $2000) for a GPU....
 
Moore's law: transistor density will double every 2 years.

Jensen's law: number of fake/subprime frames generation per real frame will triple every 2 years.
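Taken literally, the quip is easy to put in numbers: with 4x multi-frame generation, three of every four presented frames are AI-generated, so the presented frame rate quadruples while the simulation rate doesn't. A toy sketch (the 30 fps base is just an example):

```python
# With 4x multi-frame generation, each rendered frame is followed by
# 3 generated frames, so presented fps = 4x real fps.
real_fps = 30
gen_per_real = 3  # 4x MFG: 3 generated frames per real frame

presented_fps = real_fps * (1 + gen_per_real)
generated_share = gen_per_real / (1 + gen_per_real)
print(f"{real_fps} real fps -> {presented_fps} presented fps "
      f"({generated_share:.0%} of frames are generated)")
```

Input latency still tracks the real frame rate, which is the whole objection to counting those frames.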
 
Not gonna argue about cost/performance... Nvidia clearly doesn't give a F** about that - and that certainly annoys me... but Nvidia has about 80% of the discrete GPU market for a reason. They're superior to AMD... I really wish they weren't as I resent being charged $1000 (or $2000) for a GPU....
90% of the news is about the 5090,
but 90% of the purchases are $500-ish GPUs.

And most games will be designed for 1080p 30 fps on last-gen... 60-class GPUs.
 
Don't forget that AMD is going two-pronged: chiplet and monolithic.

AMD is about to humiliate Jensen for trying to push so much of their non-gaming hardware off on consumers instead of prosumers.

The RX 9080 is going to show the power of RDNA and the price/performance/power supremacy of their gaming architecture.

And then later this year, AMD will announce their top-tier chiplet architecture using AMD's prosumer XDNA, allowing AMD to compete directly with nVidia's $3k "gaming card", while offering custom chiplet designs catering to individual needs.

If you need more AI, then choose the RX chiplet with more tensor/XDNA cores, etc. If you want all raster, then pick the chiplet that has your best interests at heart.

Blackwell is a joke for gaming!
You live in lalaland. AMD said there was not going to be a $1000 GPU this gen. They focused on creating the best mid-tier GPU. That's the reason it is monolithic this time: to cut cost and latency issues.

They are creating a 70-class card and it will retail at 70-class pricing.

Let's hope they price it at their historical 70 XT-class MSRPs and not the ones from Nvidia. They need to undercut the Nvidia 5070 by $50, so $499 is the max they can ask to really shock the market.

At $480 it will be a major disruptor, and at $450 they will literally take half of the 5070's market share.
 
The 30% is in line with the game benchmarks posted in the press release that did not use frame generation: Far Cry 6 and A Plague Tale: Requiem.

Without a real process upgrade, there is not much "free" performance increase. The 30% performance increase comes from increasing the die size from 600 to 750 mm^2 and the power consumption from 450 to 575 W. Like it or not, with the available process and a maximum chip size of about 800 square millimeters, the 5090 is the maximum possible.
Remove those tensor and RT cores and you would be able to get 50% better raster performance.
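The quoted figures can be sanity-checked with quick arithmetic: the ~30% uplift roughly tracks the growth in silicon and power rather than any per-transistor gain (numbers are the approximate ones cited above):

```python
# Comparing the claimed ~30% performance uplift against the growth in
# die area and power draw, using the approximate figures quoted above.
die_4090, die_5090 = 600, 750    # mm^2
tdp_4090, tdp_5090 = 450, 575    # watts

area_gain = die_5090 / die_4090 - 1    # fractional increase in silicon
power_gain = tdp_5090 / tdp_4090 - 1   # fractional increase in power
print(f"die area: +{area_gain:.0%}, power: +{power_gain:.0%}, claimed perf: +30%")
```

In other words, performance per watt and per square millimeter barely moved, which is what you'd expect when staying on essentially the same process node.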
 
No shocker. Anyone with half a brain realizes this is an interim series... the last large leap was the 4090, and you aren't getting quantum leaps in two consecutive generations.
The only reason that leap happened was that they went from Samsung 8N, which was an awful node, to TSMC 4N, which was the industry's best.
 
Yeah... except... AMD hasn't rivalled Nvidia for years in the high end - and has already admitted they are never going to.

AMD sells low-performance, cost-effective cards... but they haven't been able to hold a candle to Nvidia in ages.

Blackwell might not have been designed SOLELY for gaming - but it still games WAY better than any AMD card you can buy - or will be able to buy in the near future.


Don't be too interested... he's an AMD shill... make sure you fact check everything he posts...
No it is not, when the 9070 XT, a 64-CU card, is beating a 5070 and almost goes up against the 5070 Ti at 300 W.

If AMD is able to price this at a $480 MSRP, like the 6700 XT was, then it is game over for Nvidia. The only relevant GPU they will have left is the 5090, not because of its price but because it is the strongest you can get.

Buying a 5080 at $1000+ for barely a 20-25% performance increase over a 9070 XT that will cost half as much would feel downright insane.
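To put that value argument in numbers, here is a quick sketch; note that the $480 price and the 20-25% gap are this comment's speculation, not confirmed figures:

```python
# Hypothetical perf-per-dollar comparison using this comment's speculated numbers:
# a $480 9070 XT sitting ~20-25% (midpoint 22%) behind a $1000 5080.
price_9070xt, price_5080 = 480, 1000
perf_9070xt, perf_5080 = 100, 122   # relative performance index

value_9070xt = perf_9070xt / price_9070xt   # performance per dollar
value_5080 = perf_5080 / price_5080
print(f"9070 XT: {value_9070xt:.3f} perf/$, 5080: {value_5080:.3f} perf/$ "
      f"({value_9070xt / value_5080:.1f}x better value)")
```

Under those assumptions the cheaper card delivers roughly 1.7x the performance per dollar, which is the "downright insane" gap the comment is pointing at; at a $600 street price the gap narrows considerably.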
 
But... but... but... It's Nvidia! My hero, Leather-Jacket Man, wouldn't do this to me... Please... please... please... charge what you want, just take my money... I love you... LOL
 
No it is not, when the 9070 XT, a 64-CU card, is beating a 5070 and almost goes up against the 5070 Ti at 300 W.

If AMD is able to price this at a $480 MSRP, like the 6700 XT was, then it is game over for Nvidia. The only relevant GPU they will have left is the 5090, not because of its price but because it is the strongest you can get.

Buying a 5080 at $1000+ for barely a 20-25% performance increase over a 9070 XT that will cost half as much would feel downright insane.
I like your if, but it is a big IF, because AMD's shareholders would kill them if they were to sell the 9070 at that price. I think it will be $600, and the AIB cards will be around $650 to $750.
 
90% of the news is about the 5090,
but 90% of the purchases are $500-ish GPUs.

And most games will be designed for 1080p 30 fps on last-gen... 60-class GPUs.
Nvidia has 80% of the discrete GPU market... that includes $500 GPUs as well... they smoke AMD in every category...
 
DISCRETE GPUs… read please…
Yeah? And Nvidia's share in the different categories is?

First you talk about discrete units, then it's suddenly EVERY category. Too bad for you, every category includes more than just discrete ones.

Also, feel free to post the share by discrete GPU price category too.
 
Yeah? And Nvidia's share in the different categories is?

First you talk about discrete units, then it's suddenly EVERY category. Too bad for you, every category includes more than just discrete ones.

Also, feel free to post the share by discrete GPU price category too.
You failed to see the CONTEXT of my comment - I clearly meant Nvidia dominates AMD in every DISCRETE category... and actually, it's not 80% but 90% of the Discrete GPU market...

And while Intel has 65% of the non-discrete GPU market, AMD still trails Nvidia there as well (but it's a lot closer, 18% to 17%).

I understand you champion AMD - we see your posts all the time... and while they are justifiably praiseworthy when it comes to their CPU division, their discrete GPU division is sorely lacking.
 
Cannot see any categories there.
I always wonder, when arguing with you, whether you are being deliberately obtuse or just don't understand...

I'm done with you for now. The point of this article is Nvidia's 5090's benchmarks - which AMD has already stated they won't (aka CAN'T) compete with - and you've moved the goalposts far enough already...

Nvidia dominates AMD in the discrete GPU market - it's not even close. I gave you a source - feel free to Google others... but there is NO source that will show you that AMD is out-selling Nvidia in any discrete GPU category.
 
Buying a 5080 at 1000$+ for barely 20-25% performance increase over a 9070XT that will cost half of it would feel downright insane.
Yes it would be, but it wouldn't be the first time NVIDIA offers a far worse product for the money and still outsells AMD 2 or even 3 to 1.
And now they can always claim CUDA as a selling point as well.
(And possibly energy efficiency? Haven't read anything yet on how RTX 5xxx is supposed to compare to AMD RX 9xxx when it comes to that).
 
I always wonder, when arguing with you, whether you are being deliberately obtuse or just don't understand...

I'm done with you for now. The point of this article is Nvidia's 5090's benchmarks - which AMD has already stated they won't (aka CAN'T) compete with - and you've moved the goalposts far enough already...

Nvidia dominates AMD in the discrete GPU market - it's not even close. I gave you a source - feel free to Google others... but there is NO source that will show you that AMD is out-selling Nvidia in any discrete GPU category.
Bwahaha. Now you say the point of the article is the 5090 benchmarks. But it was YOU who claimed Nvidia smokes AMD in EVERY category.

And when you are asked for a source for the "Nvidia smokes AMD in every category" claim, you say there is no source that AMD is outselling Nvidia in any category. However, you failed to provide a source that Nvidia outsells AMD in every category. Or even in one.
 
Not gonna argue about cost/performance... Nvidia clearly doesn't give a F** about that - and that certainly annoys me... but Nvidia has about 80% of the discrete GPU market for a reason. They're superior to AMD... I really wish they weren't as I resent being charged $1000 (or $2000) for a GPU....

Yeah, that reason is MARKETING.

Extremely gullible people have been buying Nvidia GPUs for the last 3-4 years. It takes an old-school clan with tons of disposable income and tons of hardware to explain all-out BS marketing to you kids. Reality hits, and in the end, the fact that people who have exceptional gaming rigs and racing/fighter pits are NOT using Nvidia's RTX gimmicks <--- is all you need to know.

Nobody in a racing sim is using CUDA to work on content creation or pseudo-games. They are going for straight performance and no gimmicks. (Why waste heat and energy on Nvidia gimmicks?)

How is Nvidia's architecture superior if they cannot match AMD's price/performance/efficiency? AMD Radeon cards beat Nvidia's at gaming from $200-$1000 in performance. Nvidia's HALO card, the RTX 5090, is not even a gaming card; it is a binned/cut-down version of a $3,400 Pro card.

If you are planning on dropping $2,000+ on a GPU, then you run a business or are part of a high-end clan/guild/org.

If you drop $2k on an Nvidia GPU and you do not do creative content and do not need CUDA, then AMD has just the $1,500 gaming card for you coming later this year, in their new UDNA chiplet GPU.
 