AAA video games struggle to keep up with the skyrocketing costs of realistic graphics

Skye Jacobs

Recap: Gaming studios face increasingly difficult decisions as they grapple with the escalating costs of creating cutting-edge visuals and the diminishing returns these visuals often provide. The challenge lies in striking a balance between visual spectacle, engaging gameplay, and sustainable development practices.

For decades, giants in the gaming industry like Sony and Microsoft banked on realistic visuals to captivate audiences. Studios like Naughty Dog (The Last of Us, Uncharted series), CD Projekt Red (The Witcher 3, Cyberpunk 2077), Rockstar Games (Red Dead Redemption 2, Grand Theft Auto V), and Guerrilla Games (Horizon Zero Dawn, Horizon Forbidden West) have consistently prioritized visual excellence, and this approach has led to critical acclaim and commercial success.

This strategy has transformed once-flat pixelated worlds into immersive experiences that rival cinematic productions. However, the cost of achieving such realism has skyrocketed, leading to a reevaluation of priorities within the gaming sector.

Marvel's Spider-Man 2, released in 2023, is one example. Leveraging the PlayStation 5's processing power, developers at Insomniac Games crafted a visually stunning New York City. Peter Parker's iconic suits were rendered with intricate textures, while skyscrapers reflected sunlight with remarkable accuracy.

This level of detail, however, came at a steep price. The game's development reportedly cost around $300 million, more than triple the budget of its predecessor from just five years earlier.

Another example of the industry's technical prowess can be seen in a particularly noteworthy scene in The Last of Us Part II, when the protagonist Ellie removes her shirt, revealing bruises and scrapes on her back. This moment unfolds without any graphical glitches, demonstrating the painstaking attention to detail that has become a hallmark of high-budget game development.

While these visuals are impressive, they raise questions about the sustainability of such investments. Despite Spider-Man 2's commercial success, with over 11 million copies sold, Sony announced 900 layoffs in February 2024, which affected the game's developers at Insomniac.

In short, the financial returns on these investments are diminishing. At the same time, audience preferences are shifting as well. Jacob Navok, a former executive at Square Enix, told The New York Times that high-fidelity visuals primarily appeal to a specific demographic of gamers in their 40s and 50s.

Meanwhile, younger generations are gravitating towards games with simpler graphics but robust social features, such as Minecraft, Roblox, and Fortnite.

For many young gamers, "playing is an excuse for hanging out with other people," said Joost van Dreunen, a market analyst and professor at New York University. This social aspect has become a driving force in game design and popularity.

As development costs soar and player preferences evolve, some studios are exploring alternative approaches. The live service model, which prioritizes regular content updates over graphical fidelity, has gained traction. Games like Genshin Impact have found tremendous success, generating billions in revenue primarily through mobile platforms.

However, this model has risks. High-profile failures like WB Discovery's Suicide Squad: Kill the Justice League and Sony's short-lived Concord demonstrate the challenges of entering the competitive live service market.

Industry professionals are divided on the path forward. Some, like David Reitman of PricewaterhouseCoopers, see potential in artificial intelligence to reduce the costs associated with high-end graphics. Others, like independent developer Rami Ismail, are skeptical of quick technological fixes and worry about the industry's current trajectory.


 
A way to cut costs is to focus more on the creative side and create a game with a freaking great storyline, rich characters, and complex missions with multiple options to complete them. If the game is compelling enough, I would play a text-based game a la "Zork" or "The Hitchhiker's Guide to the Galaxy".
 

I still play a MUD game in this day and age. I find it more entertaining than most games that come out at times. Gemstone IV, been a part of the game since the AOL days.....now I feel old.
 
Well, maybe the solution is to make games people want to play? If you want sales, you have to cater to the majority of the market. 80% of the market has 60-class cards or below. That number goes up if you include console sales.
 
Interesting. I would've thought with the advent of Unreal Engine 5 that high quality graphics would become much cheaper and more streamlined...so this is not the case?

At any rate, cloud gaming is on the horizon. Just pay $20/mth and you'd have an xx80-class GPU rendering on Wi-Fi 7 w/ any cheap laptop connected to whatever monitor you prefer, or skip the laptop and render on your smart TV directly. (I'm doing this now, but w/ a LAN connection.) So at least on the gamers' end, it will no longer be cost prohibitive. If the 5080 is indeed gonna be $1500, then $20/mth to render on it through GeForce Now is a steal.
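A quick back-of-envelope check of that trade-off; both figures are the speculative ones from this comment (the rumored $1500 card and the $20/month tier), and the math ignores everything else about either setup:

```python
# Rough break-even between buying the GPU and renting one via the cloud.
# Both prices are the commenter's speculative figures, not confirmed ones.
gpu_price = 1500     # USD, rumored xx80-class card
monthly_sub = 20     # USD per month, assumed cloud-gaming tier

break_even_months = gpu_price / monthly_sub
print(f"Subscription cost matches the GPU after {break_even_months:.0f} months "
      f"(~{break_even_months / 12:.1f} years)")
# -> 75 months (~6.2 years), ignoring the rest of the PC, electricity,
#    resale value, and any price changes on either side.
```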
 
Gaming studios face increasingly difficult decisions as they grapple with the escalating costs of creating cutting-edge visuals and the diminishing returns these visuals often provide.

Ah, the pricey visuals. Only Boomers want that. It must be true if the gaming experts at PwC come to that conclusion (thankfully they have the solution ready right away - so they can continue to advise producers and studios on how to implement AI).

And here some of us thought the problem was the dwindling creativity in game design, buried in dozens and dozens of sequels, prequels, remastered editions and clones of successful games or game formulas. Some studios sell the same game for years (Ubisoft) and then wonder why no one wants to shell out $60 for the next clone.

Not only are the visuals pricey; apparently new ideas are expensive too.
 
Lol, I am running as fast as I effin' can AWAY from games with social features, as random people suck so much arse it's not even funny.

-Yep.

When I was younger I loved the social element of gaming, but as I've gotten older I've come to really crave strong single-player, narrative-driven games, preferably "short" ones of 15-20 hours or so.

I got a lot of stuff on my plate and I really enjoy games that don't waste my time.
 
If The Witcher 3 had looked far worse, it wouldn't have sold that much worse.
It's got some of the best storytelling of any game ever, and your choices matter.

That it had some of the best graphics to boot at the time was a nice bonus.

If they want to save costs, stop hiring DEI consultants. They cause both Hollywood and the (western) game industry to release flop after flop, and they limit creativity. Be bold, take some risks: the modern watered-down writing, afraid to upset anyone, doesn't lead to intrigue; it just makes everything predictable. It's a tiny minority on X that makes a lot of noise wanting that stuff, but it turns out they don't buy games; in fact, they're often quoted as hating gamers.

Also, hire people who want to make games. Don't have the higher-ups decide to make a certain type of game because that's what's projected to make the most profit, then hire people who are there just to listen and cash their paycheck, then hire a DEI consultancy to cater to an audience that doesn't exist, and then release a game that no one asked for and no one wants.
I'd be surprised if the Suicide Squad game wasn't made using exactly that formula. I'm one of the few people who tried it (got it gifted), but after 92 minutes I gave up. Such generic, uninteresting slop.
You have a rich cast of characters, and what do you do for an hour and a half? Shoot no-name enemies with guns. If I play as Harley Quinn, I want to move around like a gymnast and hit things with a bat - preferably with some witty dialogue whilst doing so. Not shoot a fricking gun like the other 3 in the group.

Have fewer studios like Ubisoft and more studios like Larian (Baldur's Gate 3).
 
Graphics quality has only been getting worse. The constant focus on lens flare, RT, and upscaling has left games looking unfinished. Playing through W40K: Darktide, this is evident everywhere on default settings.

There are plenty of games from previous generations that are WAY more fun to play: OG Battlefront 1/2, Empire at War, Battlefield 1943, the Ace Combat games, etc.

Stop focusing so much on "dem graphics" and focus more on making fun games with good writing. Like, with Skull and Bones, did Black Flag really look THAT bad? Render Black Flag at 4K and most people would be perfectly fine with it.
 
Oh it's the DEI now. What happened to the woke?
syn·o·nym (noun): a word or phrase that means exactly or nearly the same as another word or phrase in the same language; for example, shut is a synonym of close.

Hope This Helps.
Interesting. I would've thought with the advent of Unreal Engine 5 that high quality graphics would become much cheaper and more streamlined...so this is not the case?

At any rate, cloud gaming is on the horizon. Just pay $20/mth and you'd have an xx80-class GPU rendering on Wi-Fi 7 w/ any cheap laptop connected to whatever monitor you prefer, or skip the laptop and render on your smart TV directly. (I'm doing this now, but w/ a LAN connection.) So at least on the gamers' end, it will no longer be cost prohibitive. If the 5080 is indeed gonna be $1500, then $20/mth to render on it through GeForce Now is a steal.
LOLno. Please look up the bandwidth required for 4k60 on a monitor, then compare it to the bandwidth of an internet connection. Hint, that "4k" you get from streaming is highly compressed, with both audio and visual quality loss, not to mention the horrid input lag inherent to any "cloud" service.

Y'all have been preaching that cloud gaming is "right around the corner" for 15 years now. Remember OnLive? Stadia? PlayStation Now?
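For a rough sense of the gap being described, here is a back-of-envelope sketch; the 75 Mbit/s streaming bitrate is an assumption, roughly what high-end cloud tiers advertise:

```python
# Uncompressed 4K60 video bandwidth vs. a typical streaming bitrate.
# Assumes 8-bit RGB (24 bits per pixel); real pipelines add chroma
# subsampling and heavy compression, which is the commenter's point.
width, height = 3840, 2160
bits_per_pixel = 24
fps = 60

raw_bps = width * height * bits_per_pixel * fps   # bits per second, uncompressed
print(f"Uncompressed 4K60: ~{raw_bps / 1e9:.1f} Gbit/s")   # ~11.9 Gbit/s

stream_mbps = 75   # assumed cloud-streaming bitrate, not a measured figure
ratio = raw_bps / (stream_mbps * 1e6)
print(f"A {stream_mbps} Mbit/s stream carries ~{ratio:.0f}x less data than the raw signal")
```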
 
I'd like to pull up GeForce Now as a good example of game streaming, but they've changed it recently, so now it's pretty crap even with the most expensive plan.
 
All I'm reading is crying and excuses from developers about how their wholly incorrect graphics-first approach to gaming is biting them in the butt. Make a fun video game. If that seems impossible, then make something stupid with minimal gameplay to sell for 10 dollars and start paying streamers to play it.
 
Let the community help... I play heavily modded versions of Skyrim and Fallout 4, with the predominant mods being for visual fidelity... My Skyrim and F4 are often photo-realistic, and yet they're both games running on very old engines...

(My dream game would be an MMO with the size, scope, and freedom of Elder Scrolls Online (including biannually released new zones, stories, and content) BUT with the D&D 5E mechanics of Baldur's Gate 3. For visual fidelity, I'd be happy with ESO-on-max-settings kinda levels - my 3060 Ti can run ESO at 4K on max graphics settings at between 90 and 120 FPS, and it looks surprisingly awesome!)
 
Ah, the pricey visuals. Only Boomers want that.

Do you have any idea what “boomers” means? Clearly not. The vast majority of boomers never grew up on computers and don't care about fancy graphics. The most they would do is play a card game.

You clearly have an issue with your age based on this stupid comment. So let's go through the generations and their game expectations.

All generalisations:
Boomers - little to no interest in computers.
GenX - grew up on text and very basic games, where a big game had to have a story because there were no decent graphics. Hercules, monochrome, CGA, and EGA were the big graphics standards after text-based games. GenX still lived in a physical reality.
GenY - these guys grew up in the VGA reality and expect good graphics, so some are also in the 40-year-old category. But they also grew up in an evolving computer game world where massive adventures could be told with “stunning” graphics for the time. These guys have families, and the vast majority are busy making ends meet with children of their own. They aren't demanding super graphics.
GenZ - these guys have lived with expectations of good graphics and are now in the workforce with decent-paying jobs, so clearly these are the guys? Well, this generation is also busy trying to buy a house and have kids, so there's probably not a big demand for cutting-edge graphics if they even have time for a game.

So this demand must be from a younger generation, at least 4 generations off boomers. Some generation that demands everything without putting any effort into anything themselves, a type of generation that blames others for their issues.

What generation are you?
 
I'm old enough to remember when photo-realistic textures were supposed to be just around the corner for gamers. Yeah, that didn't work out well, did it? Twenty years later and we're still waiting. It seems to me that game dev studios are slacking off, relying on the likes of DLSS, FSR, or XeSS to make their games run semi-decently, and then shoveling RT into everything. I've seen textures in CP2077 that look like they were ripped straight out of a Quake III map.
 
Here's the problem: graphics don't sell consoles and they haven't for like 10 years.

It's not 2007 anymore. "Can it play Crysis" is now literally just a meme at this point, rather than a serious line of inquiry as to whether or not your GPU can play a 17-year-old game at Ultra. The Steam Deck, a now 4-year-old handheld, can play the original maxed-out (I'm pretty sure, but someone feel free to correct me).

We have long since slammed into the fidelity ceiling, meaning that characters are, more or less, now fully realized: there are very few AAA game characters with obvious polygonal models; what was once a technical limitation is now a stylistic decision. Most circles are actually circles, rather than many-sided polygons, and curved surfaces are quite smooth. It took us a while to get here, but now that we've reached it, there are really no more triangles you can add to a character without it being a redundant waste of the budget.

So, now, games have to be actual games, not just expensive straight-to-consumer tech demos. Those days are over. There are only two things left to "scale up": frame rate and image size. Right now, we're in the 4K 120Hz era. Soon, we'll probably be in the 8K 240Hz era... and that's it. We'll just keep making the image bigger and bigger, increasing the refresh rate as we go, but even the refresh rate has a limit, because the eye can only process so much before extra refresh cycles become redundant too. Then, I guess we'll just keep making the resolution bigger and bigger still until the end of time, or until VR becomes a thing again.
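For a rough sense of what that scaling means in raw pixel throughput (the resolutions and refresh rates are the ones mentioned above; the work per pixel is assumed constant):

```python
# Raw pixel throughput at the resolutions/refresh rates the comment mentions.
# Purely illustrative scaling, not a claim about any particular GPU.
resolutions = {"4K": (3840, 2160), "8K": (7680, 4320)}
refresh_rates = [120, 240]

for name, (w, h) in resolutions.items():
    for hz in refresh_rates:
        print(f"{name} @ {hz} Hz: {w * h * hz / 1e9:.1f} billion pixels/s")
# 8K @ 240 Hz pushes ~8x the pixels of 4K @ 120 Hz, so each step up costs far
# more GPU work for a smaller and smaller perceptual gain.
```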

I don't know, it's not exciting. I guess the only way to make games "cool" again is to make lighting better. But, I mean, the 4090 was already a chonker of a GPU, and the 5090 is probably going to be larger still. How long until the graphics budget demands a desktop-sized GPU that costs what you could get a whole rig for five years ago?

Would anyone even bother playing games anymore, if Raytracing/Pathtracing eventually becomes mandatory for every AAA game going forward, but it requires you to have a $10K graphics card, the size of a small car because, "well, you want the best, right?"
 
DEI is the new woke. All these game companies need to hire only old white men with no college education and pay them high salaries with full benefits, because this is America! Problem solved.
Kashim, you are angry and uninformed.
 
I think Doom had this problem. Darn, I need four more MB of RAM so I won't have to edit my autoexec and config files. What's new is old. Keeping up is only essential when it comes to securing countries. Keeping up with the Joneses, or whatever you want to insert for Jones, isn't important.
 
It makes sense that those in their 40s and 50s are driving the need for constantly-improving visuals. Those of us in that age group saw the progression from Atari 2600 graphics to 4, 16, and then 256-color “photorealistic” VGA graphics within a single decade. And we got addicted to the improvements. I still remember eagerly awaiting the first Test Drive game since it promised to put you into the cockpits of those cars more than any other game could’ve at that time. The differences were stunning until things seemed to level off over the years to now when you have to sit uncomfortably close to your TV to notice all the differences between the PS5 and PS5 Pro. The old low-res games are still fun to play for some reason, and now devices made to run older games are starting to have OLED screens, fast CPUs, and ways of boosting resolution well beyond what the older games were designed for.
 
Do you have any idea what “boomers” means? Clearly not. The vast majority of boomers never grew up on computers and don't care about fancy graphics. The most they would do is play a card game.
I tried to make a point by ironically upgrading the 50+ Gen X crowd to boomers, which refers to a part of the article. Did you even read it?

This is the quote:
Jacob Navok, a former executive at Square Enix, told The New York Times that high-fidelity visuals primarily appeal to a specific demographic of gamers in their 40s and 50s.
It's funny, even ironic, that we share the same opinion (older gamers don't primarily care about gfx), yet you completely missed that.
 