RTX 5090 early benchmarks show underwhelming performance uplift over the RTX 4090

Looking at the tech specs and the marketing, nobody should be surprised that the 5090 is a relatively nominal upgrade over the previous generation. Pretty similar to how the RTX 2000 generation wasn't the huge uplift over the GTX 1000 generation that people thought it was going to be.

Now whether it's worth it or not depends on what you are upgrading from. RTX 4090 owners IMHO should be very happy to keep their money in their wallets. For someone like me, though, coming from the RTX 2080 Super (shudder), who got shut out of the 3080 and 4090 by the pandemic, cryptomining, and scalping, and who felt the 4080 just wasn't worth it, the 5090 is the right future-proofing upgrade at the right time. Though if the 4090 were still available at a reasonable price, I wouldn't have hesitated to pick it up instead.

Interestingly, the turmoil we PC gamers have all been through these last few years taught me that I can in fact sit out successive GPU generations and don't always have to get the latest and greatest. That was fully my plan when I went from my GTX 1080 to the RTX 2080 Super (no RTX 2080 because, again, cryptomining), and I have been doing just fine gaming at 4K/60 with the right game settings and a touch of upscaling. We are onto the third generation now, though, and I am overdue for a future-proofing upgrade. I suspect that after what I expect to be an expensive CPU-and-GPU refresh, I will be skipping the next two generations again.
 
You live in la-la land. AMD said there would be no $1,000 GPU this gen. They're focused on creating the best mid-tier GPU. That's the reason it is monolithic this time: to cut cost and latency issues.

They are creating a 70-class card that will retail at 70-class pricing.

Let's hope they price it at their historical 70 XT-class MSRPs and not Nvidia's. They need to undercut the Nvidia 5070 by $50, so $499 is the max they can ask to really shock the market.

At $480 it would be a major disruptor, and at $450 they would literally take half of the 5070's market share.
No, you are leaning on old news.

- RDNA4 is monolithic dies (~$1,000) coming out over the next few months (i.e. the RX 9070 XT).
- The RDNA5 chiplet design has been changed to a UDNA chiplet design ($1,300-$2,500), utilizing the Radeon PRO cards' HBM3 chiplet design.

AMD is going to do the EXACT same thing NVIDIA is doing: selling a PRO card as a gamer card, but actually customizing the architecture for gamers, leveraging AMD's chiplet design.


Not one card for everyone, but one design for everyone.
 
Yeah, that reason is MARKETING.

Extremely gullible people have been buying Nvidia GPUs for the last 3-4 years. It takes an old-school clan with tons of disposable income and tons of hardware to explain all-out BS marketing to you kids. Reality hits, and in the end, the fact that people who have exceptional gaming rigs and racing/fighter pits are NOT using Nvidia's RTX gimmicks <--- is all you need to know.

Nobody in a racing sim is using CUDA to work on content creation or pseudo-games. They are going for straight performance and no gimmicks. (Why waste heat and energy on Nvidia gimmicks?)

How is Nvidia's architecture superior if they cannot match AMD's price/performance/efficiency? AMD Radeon cards beat Nvidia's at gaming from $200 to $1,000. Nvidia's halo card, the RTX 5090, is not even a gaming card; it is a binned, cut-down version of a $3,400 Pro card.

If you are planning on dropping $2,000+ on a GPU, then you run a business, or are part of a high-end clan/guild/org.


If you drop $2K on an Nvidia GPU and you don't do creative content and don't need CUDA, then AMD has just the $1,500 gaming card for you coming later this year in their new UDNA chiplet GPU.
Ok… so it’s only marketing that Nvidia has a 90% market share?
So you’re claiming that 90% of GPU buyers are gullible fools?
Well, gullible or not, Nvidia customers speak with their wallets…

Don’t get me wrong - it would be really nice to see competition in the GPU market as Nvidia can charge whatever they want and they don’t really have to innovate… we are seeing the same with AMD’s 9000 CPU line as Intel can’t compete and AMD has been sitting on their laurels…
 
Ok… so it’s only marketing that Nvidia has a 90% market share?
So you’re claiming that 90% of GPU buyers are gullible fools?
Well, gullible or not, Nvidia customers speak with their wallets…

Don’t get me wrong - it would be really nice to see competition in the GPU market as Nvidia can charge whatever they want and they don’t really have to innovate… we are seeing the same with AMD’s 9000 CPU line as Intel can’t compete and AMD has been sitting on their laurels…
Who cares about market share if they still have a shitty product? Most people don't build their own PCs. Prebuilts still far outsell custom builds, and if you look at the prebuilt market, they almost all exclusively use Nvidia cards.

Most people type "gaming PC" into Google, find one that looks cool within their budget, and click buy. I encountered this problem when looking for a laptop: I wanted an AMD laptop with JUST an APU, but for the longest time all I could find were Ryzen APUs paired with a 4050 or 4060 mobile. While hunting for my "perfect laptop" I also noticed a ton of prebuilts did the same thing; they were all Nvidia. Do you think console gamers know or care that their consoles have AMD GPUs in them? Probably not.

Without getting too much further into a tangent: Nvidia didn't take the dGPU market because every PC buyer made a conscious decision to buy an NV graphics card. Most just bought what came in whatever PC they could afford. And if they did buy an NV card to install in their own system, it's probably because some streamer they know has a 4090 and they wanted to be cool like some influencer.

They didn't take that much market share because they make a better product; they got that much market share by offering system builders incentives and by marketing through streamers.

Anyone with any sense who didn't need a 4090 should have bought literally any other AMD card.

AMD beat NV at every price point, with most people citing DLSS as the reason they went NV. Now look where we are: everyone is shouting into the sky about how they want more raw performance, not DLSS fake frames.
 
36% faster in Blender? Great, now I can stare at the render progress bar for 4 hours instead of 6 while questioning if I should’ve just stuck to stick figure animations.
 
I do not usually watch videos about graphics technology in such detail, but this one was interesting even for me. $2,000 GPUs won't save us if optimization is a forgotten technology.
 
Who cares about market share if they still have a shitty product?
Well... I'd say Nvidia does? And I bet AMD does too... because no matter how good your product is, if no one is buying it, then you wasted your time...

Hey, I'm not arguing about whether AMD is more cost effective - just cold hard facts... they are being outsold 90-10...

Oh, and as for integrated GPUs... they only make up 25% of GPU sales... so no, Intel doesn't DOMINATE GPU manufacturing anymore... they have 65% of integrated GPU market share - but 65% of 25% is less than 90% of 75%... thanks, math :)
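For what it's worth, that last bit of arithmetic checks out. A quick sketch in Python (the 25%/65%/90%/75% figures are the poster's claims, taken at face value, not independently verified):

```python
# Quick sanity check of the market-share arithmetic in the comment above.
# All four percentages are the poster's claims, taken at face value.
igpu_share_of_sales = 0.25    # integrated GPUs as a share of all GPU sales
intel_share_of_igpu = 0.65    # Intel's share within integrated GPUs
dgpu_share_of_sales = 0.75    # discrete GPUs as a share of all GPU sales
nvidia_share_of_dgpu = 0.90   # Nvidia's share within discrete GPUs

intel_overall = intel_share_of_igpu * igpu_share_of_sales    # 0.1625
nvidia_overall = nvidia_share_of_dgpu * dgpu_share_of_sales  # 0.675

print(f"Intel overall: {intel_overall:.2%}, Nvidia overall: {nvidia_overall:.2%}")
```

So under those assumed shares, Intel ends up around 16% of all GPUs sold versus roughly 68% for Nvidia.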

 
Because I see some confusion about which hardware components do what in Nvidia GPUs, here is what I found on the internet. Maybe it will be helpful.

1. CUDA Cores
What They Are: General-purpose parallel processing units for traditional graphics and compute workloads.
What They Do:
- Handle shaders for rendering textures, lighting, and effects.
- Perform rasterization tasks for rendering 3D scenes on 2D displays.
- Support physics simulations (e.g., fluid dynamics, particle effects).
- Execute general-purpose GPU (GPGPU) tasks like simulations, computational modeling, and data parallelism.
- Aid in path tracing by computing shading, material interactions, and sampling.

2. Tensor Cores
What They Are: Specialized units optimized for AI and deep learning computations.
What They Do:
- Perform high-speed matrix multiplications, vital for neural network training and inference.
- Drive DLSS (Deep Learning Super Sampling), upscaling images with AI to boost resolution and frame rates.
- Accelerate denoising algorithms in ray tracing and path tracing for cleaner visuals.
- Support mixed-precision calculations (e.g., FP8, FP16, INT8) for flexibility and performance in AI workloads.
- Enhance AI-based tools and applications in content creation, research, and gaming.

3. RT (Ray Tracing) Cores
What They Are: Dedicated hardware units for real-time ray tracing computations.
What They Do:
- Perform Bounding Volume Hierarchy (BVH) traversal to identify which parts of a scene rays interact with.
- Execute ray-triangle intersection tests to calculate where rays hit objects.
- Enable global illumination, soft shadows, ambient occlusion, and reflections for photorealistic rendering.
- Support advanced techniques like path tracing by handling secondary ray bounces efficiently.
- Introduce new features in the 40xx series, such as Opacity Micro-Maps and Displaced Micro-Meshes, improving ray tracing performance and visual accuracy.

4. Optical Flow Accelerator
What It Is: A specialized hardware block for motion estimation and frame interpolation.
What It Does:
- Analyzes pixel motion between two frames to predict movement in dynamic scenes.
- Powers DLSS 3 Frame Generation, creating intermediate frames to boost frame rates in gaming.
- Helps reduce perceived motion blur and improves the smoothness of animations in real-time rendering.
- Useful for virtual reality (VR) and video applications where motion prediction is critical for performance and quality.
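One small addition to section 2: the "matrix multiplications" tensor cores accelerate are fused multiply-accumulate operations of the form D = A×B + C, applied to small tiles. Here's a toy pure-Python sketch of that operation, purely for illustration (real tensor cores do this in hardware on small FP16/FP8/INT8 tiles, not in Python):

```python
# Toy illustration of the matrix multiply-accumulate (MMA) operation
# that tensor cores perform in hardware: D = A @ B + C on small tiles.
def mma(A, B, C):
    n, k, m = len(A), len(B), len(B[0])
    return [
        [sum(A[i][x] * B[x][j] for x in range(k)) + C[i][j] for j in range(m)]
        for i in range(n)
    ]

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[5.0, 6.0], [7.0, 8.0]]
C = [[0.5, 0.5], [0.5, 0.5]]
print(mma(A, B, C))  # [[19.5, 22.5], [43.5, 50.5]]
```

A DLSS-style neural network is essentially millions of these small multiply-accumulates per frame, which is why dedicating silicon to them pays off.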
 
Ok… so it’s only marketing that Nvidia has a 90% market share?
So you’re claiming that 90% of GPU buyers are gullible fools?
Well, gullible or not, Nvidia customers speak with their wallets…

Don’t get me wrong - it would be really nice to see competition in the GPU market as Nvidia can charge whatever they want and they don’t really have to innovate… we are seeing the same with AMD’s 9000 CPU line as Intel can’t compete and AMD has been sitting on their laurels…
Anyone can use words and numbers, but they have to mean something. McDonald's outsells Five Guys... therefore McDonald's is the best hamburger, because you do not care about the ingredients and the actual meat of the story.


Market share..?

What does that even mean to a gamer, or even a clan of gamers? Why are you so overly vocal for those little kids whose dads pick up a holiday special so their kid has a laptop to game on?

Market share? If you are gaming on a laptop, are you really a gamer? Do you actually care about your squad mates, or about dedicating a room to your competitive e-sports and night-long competitions, etc.? Most people are given Nvidia; they don't buy it.

See? Market share doesn't mean anything to someone who is buying a GPU to game on. So please stop with your stock market analysis... I was mining Bitcoin in 2009 and already have investments.



No, Nvidia can NOT charge whatever they want. How much can you charge for a defunct 4080 when the XTX beats it in most/all competitive games?

Why was the 4080 discontinued and replaced by the more powerful SUPER, just to compete with the RDNA price/performance/efficiency that Ada Lovelace cannot match? And yet the XTX still beats the 4080's older brother, the 4090, in a few games...

RDNA is that^ powerful.


Towards the end of this year, AMD is going to scale their GPU technology up into heavy-wattage gaming Pro cards, so that people like you (who are stuck on only the top numbers) will see UDNA vs Blackwell, watt for watt.

More game developers are designing games for RDNA than for CUDA. That is why Nvidia has to use marketing and pay game developers to use their tech.

More people game on their couch (using RDNA) than own RTX cards.



Note: Nearly 1/8th of all RTX cards are used outside of gaming and an equal percentage end up in China for nefarious reasons, not gaming.

Market share doesn't mean best at gaming, like you keep regurgitating.
 
Being "underwhelmed" is an opinion and a clear indication of a bias against Nvidia. The competition doesn't get this fast, so there is no comparison. Surely if a competitor's flagship product is now slower than a 5090, the author should be even more underwhelmed by that product? If not, you are holding the manufacturers to different standards. And I'm not interested in hearing "but it's better value for money"; these are flagship GPUs, and you shouldn't be going anywhere near them if value for money is your priority. I doubt the buyers of these things care about the cost per frame.

Personally, I am not underwhelmed. It's exactly what I expected it to be. I doubt I'll pay the money Nvidia wants for it. But that doesn't matter; as long as enough people do, this product will be a success for Nvidia. And I have a feeling this product will be a success. Despite the author being "underwhelmed".

 
Anyone can use words and numbers, but they have to mean something. McDonald's outsells Five Guys... therefore McDonald's is the best hamburger, because you do not care about the ingredients and the actual meat of the story.


Market share..?

What does that even mean to a gamer, or even a clan of gamers? Why are you so overly vocal for those little kids whose dads pick up a holiday special so their kid has a laptop to game on?

Market share? If you are gaming on a laptop, are you really a gamer? Do you actually care about your squad mates, or about dedicating a room to your competitive e-sports and night-long competitions, etc.? Most people are given Nvidia; they don't buy it.

See? Market share doesn't mean anything to someone who is buying a GPU to game on. So please stop with your stock market analysis... I was mining Bitcoin in 2009 and already have investments.



No, Nvidia can NOT charge whatever they want. How much can you charge for a defunct 4080 when the XTX beats it in most/all competitive games?

Why was the 4080 discontinued and replaced by the more powerful SUPER, just to compete with the RDNA price/performance/efficiency that Ada Lovelace cannot match? And yet the XTX still beats the 4080's older brother, the 4090, in a few games...

RDNA is that^ powerful.


Towards the end of this year, AMD is going to scale their GPU technology up into heavy-wattage gaming Pro cards, so that people like you (who are stuck on only the top numbers) will see UDNA vs Blackwell, watt for watt.

More game developers are designing games for RDNA than for CUDA. That is why Nvidia has to use marketing and pay game developers to use their tech.

More people game on their couch (using RDNA) than own RTX cards.



Note: Nearly 1/8th of all RTX cards are used outside of gaming and an equal percentage end up in China for nefarious reasons, not gaming.

Market share doesn't mean best at gaming, like you keep regurgitating.
I’d love to see some evidence of your hogwash… as for market share not meaning anything… I’d love for you to have that discussion with the people at Nvidia and AMD - they’d laugh you out of the building.

If McDonalds outsells a restaurant that COMPETES with them - then yes, they’re better! Not necessarily their food - their marketing, service or location might be the clincher… but it means they’re doing something better than their competitors!!

Nvidia is selling more GPUs - and making more profits from them - than AMD.
This means that Nvidia is BETTER than AMD. Not necessarily their cards (but their high end ones can’t be beaten) but the whole package: availability, quality, performance, marketing, etc…
Whether you think people are dumb or ignorant or gullible is irrelevant - they’re the ones who make Nvidia money.
 
I’d love to see some evidence of your hogwash… as for market share not meaning anything… I’d love for you to have that discussion with the people at Nvidia and AMD - they’d laugh you out of the building.

If McDonalds outsells a restaurant that COMPETES with them - then yes, they’re better! Not necessarily their food - their marketing, service or location might be the clincher… but it means they’re doing something better than their competitors!!

Nvidia is selling more GPUs - and making more profits from them - than AMD.
This means that Nvidia is BETTER than AMD. Not necessarily their cards (but their high end ones can’t be beaten) but the whole package: availability, quality, performance, marketing, etc…
Whether you think people are dumb or ignorant or gullible is irrelevant - they’re the ones who make Nvidia money.
Intel makes more money selling CPUs than AMD. Intel also has much higher market share. That means Intel is better than AMD? That means Intel has better CPUs than AMD?

That is because stupid customers buy Intel. Just like stupid customers buy Nvidia. Those same people that believe Jensen who claims RTX 5070 equals RTX 4090. In other words, anyone that buys RTX 4090 now is an ***** or Jensen is talking BS. Make your choice. We'll see later which option is right. In any case, that tells much about Nvidia.
 
Intel makes more money selling CPUs than AMD. Intel also has much higher market share. That means Intel is better than AMD? That means Intel has better CPUs than AMD?

That is because stupid customers buy Intel. Just like stupid customers buy Nvidia. Those same people that believe Jensen who claims RTX 5070 equals RTX 4090. In other words, anyone that buys RTX 4090 now is an ***** or Jensen is talking BS. Make your choice. We'll see later which option is right. In any case, that tells much about Nvidia.
Many people have as much memory as a fish. When Nvidia launched the 4070, they claimed performance similar to the 3090. I remember the discussion here on TechSpot. In the end it performed 6% behind the 3080, with a $100 lower MSRP. But I must admit, Nvidia herds its flock successfully every time :D

And today I see the same people cheering for the 5070 and 5090. Why don't we wait for benchmarks first?

If we keep supporting Nvidia, some people are going to be applauding an xx90-tier card with a $4,000 MSRP five years from now.
 
AMD has just the $1,500 gaming card for you coming later this year in their new UDNA chiplet GPU.
You need to stop lying, you couldn't point me to any sources that even rumour this, let alone any credible sources.

Even a simple Google of "AMD UDNA release date" turns up only the faintest rumours saying "a 2026 release is plausible", with nothing to back up those claims either...
 
Intel makes more money selling CPUs than AMD. Intel also has much higher market share. That means Intel is better than AMD? That means Intel has better CPUs than AMD?

That is because stupid customers buy Intel. Just like stupid customers buy Nvidia. Those same people that believe Jensen who claims RTX 5070 equals RTX 4090. In other words, anyone that buys RTX 4090 now is an ***** or Jensen is talking BS. Make your choice. We'll see later which option is right. In any case, that tells much about Nvidia.
Intel WAS better than AMD for years…. And its market share used to be vastly higher. It takes time for market share numbers to move - but they ARE!
AMD now has around 28% of the CPU market - WAY higher than it used to be - and it will continue to climb unless Intel can get their act in gear.

AMD and Nvidia used to be much closer in the GPU market - but Nvidia has surged ahead as AMD drops the ball generation after generation.
If they don't do something soon, their market share will drop even lower, approaching 0…
 
Well... I'd say Nvidia does? And I bet AMD does too... because no matter how good your product is, if no one is buying it, then you wasted your time...

Hey, I'm not arguing about whether AMD is more cost effective - just cold hard facts... they are being outsold 90-10...

Oh, and as for integrated GPUs... they only make up 25% of GPU sales... so no, Intel doesn't DOMINATE GPU manufacturing anymore... they have 65% of integrated GPU market share - but 65% of 25% is less than 90% of 75%... thanks, math :)

I never said anything about Intel's market share of the iGPU market.
 
You need to stop lying, you couldn't point me to any sources that even rumour this, let alone any credible sources.

Even a simple Google of "AMD UDNA release date" turns up only the faintest rumours saying "a 2026 release is plausible", with nothing to back up those claims either...

Oh.........you don't think perhaps that he was just joking?
 
You failed to see the CONTEXT of my comment - I clearly meant Nvidia dominates AMD in every DISCRETE category... and actually, it's not 80% but 90% of the Discrete GPU market...

And while Intel has 65% of the non-discrete GPU market, AMD still trails Nvidia there as well (but it's a lot closer, 18% to 17%).

I understand you champion AMD - we see your posts all the time... and while they are justifiably praiseworthy when it comes to their CPU division, their discrete GPU division is sorely lacking.

AMD is sorely lacking? Hmmm........ I had Nvidia for many years. Last year I bought a Radeon 6750 XT. It runs every single game of mine (DOOM/DOOM ETERNAL/SKYRIM/FALLOUT 4... etc.) perfectly at 1080p on top graphics settings, with NO glitches. They are doing just great by my estimation........
 
AMD is sorely lacking? Hmmm........ I had Nvidia for many years. Last year I bought a Radeon 6750 XT. It runs every single game of mine (DOOM/DOOM ETERNAL/SKYRIM/FALLOUT 4... etc.) perfectly at 1080p on top graphics settings, with NO glitches. They are doing just great by my estimation........
Good for you… unfortunately for AMD, 9 out of 10 people use Nvidia instead
 
Oh.........you don't think perhaps that he was just joking?
He argued with me for ages in another thread (which got locked because of it), insisting AMD had some large GPU coming out this year, but he could not link me to any sources of any kind; he just argued that he knows best, apparently...
 
The first leaked Geekbench test of the 5090 on x.com was with a 12900K and DDR4 RAM. I searched a bit and found two comparable scores, but there are slight differences in the RAM specs, and of course the scores are also much older (early 2023), so there are certainly further differences in Windows/drivers etc. Still, it's interesting:

OpenCL Scores (results linked to the geekbench profile):

RTX 5090, 12900K, Z690, 64GB DDR4-3600: 367740 (the leaked 5090)
RTX 4090, 12900K, Z690, 32GB DDR4-3200: 326086 (4090 with slightly slower RAM)
RTX 4090, 12900K, Z690, 32GB DDR5-6400: 332618 (4090 with DDR5)

The second one is almost exactly the same rig with the 4090 and slightly slower memory => approx. 12.8% higher OpenCL score going from the 4090 to the 5090 on the same old system.
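For transparency, here is the uplift computed directly from the two quoted scores:

```python
# Relative OpenCL uplift computed from the two Geekbench scores quoted above.
score_5090 = 367740  # leaked RTX 5090 (12900K, DDR4-3600)
score_4090 = 326086  # RTX 4090 (12900K, DDR4-3200)

uplift = score_5090 / score_4090 - 1
print(f"{uplift:.1%}")  # prints 12.8%
```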

If that leak is true (no one knows at this point), that kind of result is not optimal. At least not with a 12900k or comparable system. And considering the higher price of the 5090, the better specs and last but not least the much higher power consumption, it's pretty disappointing.

I don't know to what extent the CPU and RAM slow down the 5090 in that result (they definitely do here). With a top CPU and system, things would certainly look better (although I personally still doubt a 30%+ average performance gain can be achieved with a top-notch CPU based on these results - a 12900K isn't so weak that it would allow that).

I guess, we will know in a couple of days.
 
Intel WAS better than AMD for years…. And its market share used to be vastly higher. It takes time for market share numbers to move - but they ARE!
AMD now has around 28% of the CPU market - WAY higher than it used to be - and it will continue to climb unless Intel can get their act in gear.

AMD and Nvidia used to be much closer in the GPU market - but Nvidia has surged ahead as AMD drops the ball generation after generation.
If they don't do something soon, their market share will drop even lower, approaching 0…
AMD has 28% market share despite being better for years? TechSpot agreed AMD went ahead of Intel with Zen 3. That was 11/2020, that is, over 4 years ago. AMD being better for 4 years and only having 28% share - that's pretty pathetic, to be honest, if you look only at the market share number.

Nvidia started to pull ahead when AMD did indeed have the better GPU product stack. Then AMD decided that the market buys Nvidia anyway, so AMD lost interest. Well, the market got what it wanted; hope they are happy now.
 
I’d love to see some evidence of your hogwash… as for market share not meaning anything… I’d love for you to have that discussion with the people at Nvidia and AMD - they’d laugh you out of the building.

If McDonalds outsells a restaurant that COMPETES with them - then yes, they’re better! Not necessarily their food - their marketing, service or location might be the clincher… but it means they’re doing something better than their competitors!!

Nvidia is selling more GPUs - and making more profits from them - than AMD.
This means that Nvidia is BETTER than AMD. Not necessarily their cards (but their high end ones can’t be beaten) but the whole package: availability, quality, performance, marketing, etc…
Whether you think people are dumb or ignorant or gullible is irrelevant - they’re the ones who make Nvidia money.
Yes, I am sorry you believe that McDonald's has the best burgers... and 100% completely dismiss the idea that they have the best locations (i.e. marketing).

It's obvious that you are here as an investor, not as a consumer or end user.
You said it yourself:
"If McDonalds outsells a restaurant that COMPETES with them - then yes, they’re better! Not necessarily their food"

You should try posting over at ZeroHedge if you are concerned with connecting with investors and talking about financial statements, etc.

Gamers and gaming orgs do not care about Nvidia's portfolio. We are talking about computer games, frames, and the people playing them. Not investors!
 
You need to stop lying, you couldn't point me to any sources that even rumour this, let alone any credible sources.

Even a simple Google of "AMD UDNA release date" turns up only the faintest rumours saying "a 2026 release is plausible", with nothing to back up those claims either...
You need to stop beclowning yourself.
You did "research" and all you could find was the truth... and 2026!

How long does it take to go from tape-out to ES? When was the RDNA5 "change" announced (7 months ago)? And when is AMD's chiplet PRO card coming into production?

What if I told you that AMD right now has an (ad hoc) XTX chiplet prototype using essentially two AMD APUs (think PS6 APU x 2), and there is no need for a computer (GPU/CPU card)? Now imagine if they used all the CPU space for just raster and released this as a discrete card.

That^ is what engineers do.

So what you are really saying to me is that you agree 100% with me, just that it won't happen in 2025. I am telling you that Dr. Lisa Su is playing her hand wisely and is ready to release Pro gaming cards for this year's holiday season.
 