Intel Arc B570 Review: Decent Performance, Not Great Value

I was not aware of the CPU bottleneck here. It is surprising to see something like this for such a weak GPU.

Good one Steve for finding this one. I don't recall any other reviewers mentioning this.
 
I think it's dishonest to test such weak GPUs at 1440p; any more recent Unreal Engine game will bring them to their knees.

What's going on with HUB? lol
 
If you can't squeeze out the extra $30 for the B580 - then wait until you can. The price differential is ridiculous. Same as the last gen, the lower tiers aren't worth buying.
 
Steven's comments on the AU prices are spot on...
I'm kinda dreading what I'll have to pay ($AU) this year to upgrade my 3060Ti...

Perhaps my secondary thoughts of upgrading my CPU (Ryzen 5 3600), mobo (Gigabyte B550M DS3H) & RAM (DDR4-3200 2x16GB) first might be a better decision... I've tried picking up a decently priced 5800X3D to put that upgrade off for a bit, but they're as rare as hen's teeth now; it seems I left it too late, dammit.
I got a sweet 49" 5120x1440 super ultrawide curved monitor with all the bells and whistles for my birthday back in August 24, which is why I'm looking at a new graphics card this year, but I'm not so sure that's my best option anymore...
 
I got a sweet 49" 5120x1440 super ultrawide curved monitor with all the bells and whistles for my birthday back in August 24, which is why I'm looking at a new graphics card this year, but I'm not so sure that's my best option anymore...
A belated "Happy Birthday"! Let me set the stage here. I don't game myself, but I do follow reviews and such. 1440p in some games is a hurdle that a 4060 Ti barely clears. Despite my best efforts to avoid it, I have learned to count, which led me to the conclusion that your birthday present is the equivalent of TWO 1440p monitors rolled into one (5120/2 = 2560, and 1440p is 2560 x 1440).

So, and this is where any interested party can dive in to correct or assist me, aren't you going to need a VGA at least on the order of an RTX 4070 Ti to make your "birthday gaming experience" truly happy? If you're considering using ray tracing, that could put you (ostensibly, or perhaps uncomfortably?) close to 4090 land. (Confession: I know less than squat about AMD's VGA performance.)

The sum total of what I got for my birthday, was a card from my district's rep in the federal house of representatives, Joanna McClinton. I was damned grateful for that. Plus, I had to be a registered "libtard", to even get that.

B'day screen, 7372800 (pixels) = 89% of 8294400 (4K)
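The arithmetic in the post above can be double-checked with a quick script (resolutions as stated in the thread):

```python
# Pixel-count comparison for the resolutions discussed above.
def pixels(width, height):
    return width * height

super_ultrawide = pixels(5120, 1440)  # 49" 32:9 super ultrawide
qhd = pixels(2560, 1440)              # standard 1440p panel
uhd = pixels(3840, 2160)              # 4K UHD

print(super_ultrawide)                      # 7372800
print(super_ultrawide / qhd)                # 2.0 -> exactly two 1440p panels side by side
print(round(100 * super_ultrawide / uhd))   # 89 -> roughly 89% of 4K
```

So the "two 1440p monitors rolled into one" claim checks out exactly, and the 89%-of-4K figure holds too.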
 
I think it's dishonest to test such weak GPUs at 1440p; any more recent Unreal Engine game will bring them to their knees.

What's going on with HUB? lol
IIRC, Gamers Nexus reviewed the Intel B580 and reported that it did better (or "lost less", if you prefer) than some other cards when making the jump from 1080p to 1440p.

 
IIRC, Gamers Nexus reviewed the Intel B580 and reported that it did better (or "lost less", if you prefer) than some other cards when making the jump from 1080p to 1440p.

It makes sense since competitors are working with much less bandwidth. But it doesn't change the fact that performance is insufficient for 1440p.
 
It makes sense since competitors are working with much less bandwidth. But it doesn't change the fact that performance is insufficient for 1440p.
Well, it's relative. "Performance" is somewhat of a sliding scale, which bases itself on the user's hopes, dreams, vanity, expectations, ego, and other factors too numerous to mention.

With that said, my intent was to show that someone with mediocre to average expectations, who still wanted to game at 1440p, is better off with the Intel than with other cards in the same price category.

"Performance" for those with much higher expectations, will never actually allow them to be content with whatever is available. Got 4K? Oh man, that's yesterday's news. Gotta have 8K.

For example, a 500-ish Hz monitor was released, quickly followed by a 600 Hz one from another maker. I'm not a gamer, and can only sit back in lurid fascination and wonder how many >actual, not DLSS< FPS are needed to finally stop the whimpering about screen tearing, artifacts, motion blur, stuttering, and so forth. Or is such a level of "performance" even attainable?

If not, what then, seppuku? Or maybe some will throw down their joysticks and game controllers, and say, "goddammit, I'm fed up with all this fake killing and gore", I wantz me da real thin'..! Then join the army and volunteer for the upcoming invasion of Panama, or is it Greenland, or maybe, dare I even say it, Canada. :rolleyes: 🤣

All of that was terribly insensitive, I know. But it's late, I was bored, and it seemed like a good idea at the time.
 
I think it's dishonest to test such weak GPUs at 1440p; any more recent Unreal Engine game will bring them to their knees.

What's going on with HUB? lol

Intel has recommended and claimed these are 1440p cards, so no idea why you would think or even suggest testing at 1440p is 'dishonest' :S Moreover, we included 1440p upscaling, which is essentially 1080p!
 
A belated "Happy Birthday"! Let me set the stage here. I don't game myself, but I do follow reviews and such. 1440p in some games is a hurdle that a 4060 Ti barely clears. Despite my best efforts to avoid it, I have learned to count, which led me to the conclusion that your birthday present is the equivalent of TWO 1440p monitors rolled into one (5120/2 = 2560, and 1440p is 2560 x 1440).

So, and this is where any interested party can dive in to correct or assist me, aren't you going to need a VGA at least on the order of an RTX 4070 Ti to make your "birthday gaming experience" truly happy? If you're considering using ray tracing, that could put you (ostensibly, or perhaps uncomfortably?) close to 4090 land. (Confession: I know less than squat about AMD's VGA performance.)

The sum total of what I got for my birthday, was a card from my district's rep in the federal house of representatives, Joanna McClinton. I was damned grateful for that. Plus, I had to be a registered "libtard", to even get that.

B'day screen, 7372800 (pixels) = 89% of 8294400 (4K)
Thank you for the reply, and yes, those points have been considered. However, my games consist of older titles and are not FPS intensive, so that "softens" my needs a little (heavily modded Skyrim and Fallout 4, mostly for visual fidelity purposes; Elder Scrolls Online; Baldur's Gate 3; and shortly adding a modded Starfield to my rotation). So none of my games really have RT, but I may need it in the future. I DO need more VRAM for the huge mods I use (8GB struggles in Skyrim and Fallout 4).
Based on current information, I'm considering around 5070 Ti-type rasterisation performance levels... Waiting to see what the red team have up their sleeves (as it seems the blue team let us down a little bit this round...)
Thank you for your feedback, it's appreciated....
 
Waiting to see what the red team have up their sleeves (as it seems the blue team let us down a little bit this round...)
Thank you for your feedback, it's appreciated....
Minor correction here. "team blue" is Intel. But it's true that they've pretty much "let us down", for quite a while, with the possible exception of Gen 12.

As for, "team green", (being Nvidia) if you're using "letting us down", as a metaphor for, "bending us over", then you're 100% correct.

And you're quite welcome. (y) (Y)
 
Everyone will have their own situation to consider. I'm on a 4K 60FPS monitor; I'd likely be quite satisfied with 1440p upscaled to 4K, but I want reasonably steady frame rates around 60FPS. The B580 largely achieves that, while the B570 is far more marginal, and likely more so over time (notwithstanding devs targeting 10GB VRAM on the Xbox Series X).

That said, I like small form factor, too, and truly fitting in two slots has something to be said for it there. But that's a less crucial factor, like idle power usage (realistically I might use it to run BOINC work units anyway).
 
It will be interesting to see how these cards sell when excitement settles and scalpers get their share.
In any case, these cards are close to that exciting price range where they will sell no matter how well or not so well the drivers work.
 
these cards are close to that exciting price range where they will sell no matter how well or not so well the drivers work.
Judging by what Gamers Nexus has said, and the number of games in Steve's testing, it seems that Intel has put enough work into their drivers that someone who isn't an experienced, tech-savvy "enthusiast" needn't fear buying one of them.

Although, with some of the erratic results from game to game compared to the other brands, I'd say that Intel still has a way to go towards ** "full optimization".

** Assuming that such a thing is possible.

My question is, does AMD deserve its reputation for sketchy drivers, or is it just warring fanboy clanz with brand prejudice?
 
They need to lower the price to $199 before next-gen entry-level cards come out.
Nvidia tried something akin to this with their GTX 1630 4 GB. They released it at a whopping $190.00, and it did not fare well at all. The fact that they had the ballz to label it as a "gaming card" amplified the chorus, nay crescendo, of boos from the community.

While this isn't directly comparable, it does illustrate Intel's need to drop the price of this card to about a deuce. (The GTX 1630 was down to about $130.00 last time I checked, which is a lot closer to what it's actually worth.)
 
I'd choose the 4060 over the 7600 and the B570. Nvidia's extra features and better software support are worth the small premium, for me at least. Value is important, but it's not the deciding factor for me. Graphics cards are luxury items that I can do without if I have to.

But it does appear that the Intel card is competitive; it is the cheapest of the three. And everything would change if you just cut $20 off the Intel card. That's how close it is. I think it's a positive thing overall. I mean, it's only the second go at it for Intel. If Intel's third attempt improves by as much again, it will beat its Nvidia and AMD competition.

Also, I find it amusing that people are genuinely criticizing Steve for testing a card marketed to 1440p gamers at 1440p.
 
Super review. I'm not going to buy one of these myself, but I'm really excited at the direction Intel is going with their gaming GPU products.

It's kind of hard to talk about specific prices, as we're right at the start of the biennial GPU silly season: low-quantity staggered launches alongside an overexcited tech media. It whips up the prices of both new and enthusiast-grade products to absurd levels.

The market price of Intel's new GPUs should calm down by the summer.
 