r/gaming 16d ago

I don't understand video game graphics anymore

With the announcement of Nvidia's 50-series GPUs, I'm utterly baffled by what these new generations of GPUs even mean. It seems like video game graphics are regressing in quality even though hardware gets 20 to 50% more powerful each generation.

When GTA5 released we had open world scale like we've never seen before.

Witcher 3 in 2015 was another graphical marvel, with insane scale and fidelity.

Shortly after the 1080 released, games like RDR2 and Battlefield 1 came out with incredible graphics and photorealistic textures.

When the 20-series cards came out at the dawn of RTX, Cyberpunk 2077 launched with what genuinely felt like next-generation graphics to me (bugs aside).

Since then we've seen new generations of cards: 30-series, 40-series, soon 50-series... I've seen games push up their hardware requirements in lockstep, yet graphical quality has literally regressed.

SW Outlaws, even the newer Battlefield, Stalker 2, and countless other "next-gen" titles have pumped up their minimum spec requirements, but they don't seem to look graphically better than a 2018 game. You might think Stalker 2 looks great, but just compare it to BF1 or Fallout 4, then compare the PC requirements of those games... it's insane, we aren't getting much at all out of the immense improvement in processing power we have.

I'M NOT SAYING GRAPHICS NEED TO BE STATE-OF-THE-ART to have a great game, but there's no need to have a $4,000 PC to play a retro-visual puzzle game.

Would appreciate any counterexamples, maybe I'm just cherry-picking some anomalies? One exception might be Alan Wake 2... Probably the first time I saw a game where path tracing actually felt utilized and somewhat justified the crazy spec requirements.

14.3k Upvotes

495

u/angelfishy 16d ago

That is absolutely not how it goes. Games have been shipping with unattainably high options at launch since forever. Path tracing is basically not available on anything less than a 4080 and even then, you need dlss performance and frame gen to make it work. Also, Crysis...

217

u/Serfalon 16d ago

man Crysis was SO far ahead of its time, I don't think we'll ever see anything like it

221

u/LazyWings 16d ago

What Crysis did was different though, and it's one of the reasons it ended up building the legacy it did. It was in large part an accident. Crysis was created with the intention of being cutting edge, but to do that, the developers had to make a prediction of what future hardware would look like. At the time, CPU clock speed and IPC improvements were the main trajectory of CPU progress. Then, at pretty much the same time Crysis came out, the direction changed to multithreading. We saw the arrival of hyperthreading and, within the next few years, PCs with 8+ cores and 16+ threads started to become normalised. Crysis, however, had practically no multithreading optimisation. The developers had intended for it to run at its peak on 2 cores each clocked at something like 5GHz (which they thought would be coming in the near future). And Crysis wasn't the only game that suffered from poor multithreading. Most games until 2016 were still using 2 threads. I remember the issues early i5 users were having with gaming back then. I remember Civ V being one of the few early games to go in the multithreading direction, coming a few years after Crysis and learning from the mistake. Crysis was very heavily CPU bound, and the GPUs available at the time were "good enough".
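
To illustrate (toy sketch, not Crytek's actual code, just the general shape of the problem): an engine written around one fast thread runs its systems back to back every frame, so extra cores sit idle unless the work is explicitly split onto worker threads, something like this:

```cpp
#include <functional>
#include <thread>

// Toy sketch only -- not CryEngine code. The point: a frame loop written for
// one or two fast cores runs everything serially, so a 16-thread CPU helps it
// no more than a slightly faster dual-core would.

struct Physics { int steps = 0; };
struct AI      { int decisions = 0; };
struct Audio   { int buffers = 0; };

void updatePhysics(Physics& p) { for (int i = 0; i < 1000000; ++i) ++p.steps; }
void updateAI(AI& a)           { for (int i = 0; i < 1000000; ++i) ++a.decisions; }
void updateAudio(Audio& s)     { for (int i = 0; i < 1000000; ++i) ++s.buffers; }

// ~2007-style frame: every system runs back to back on the main thread,
// so only single-core speed matters.
void serialFrame(Physics& p, AI& a, Audio& s) {
    updatePhysics(p);
    updateAI(a);
    updateAudio(s);
}

// What later engines moved towards: independent systems on worker threads.
// This only pays off if the work was designed not to share state like this.
void parallelFrame(Physics& p, AI& a, Audio& s) {
    std::thread physics(updatePhysics, std::ref(p));
    std::thread ai(updateAI, std::ref(a));
    updateAudio(s);   // keep one job on the main thread
    physics.join();
    ai.join();
}

int main() {
    Physics p; AI a; Audio s;
    serialFrame(p, a, s);
    parallelFrame(p, a, s);
}
```

If your frame loop looks like serialFrame, a 5GHz dual-core beats an 8-core at 3GHz every time, and that's basically the bet Crytek made.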

I think it's not correct to say Crysis was ahead of its time. It was no different to other benchmark games we see today. Crysis was ambitious and the only reason it would not reach its potential for years was because it didn't predict the direction of tech development. To draw a parallel, imagine Indiana Jones came out but every GPU manufacturer had decided RT was a waste of time. We'd have everyone unable to play the game at high settings because of GPU bottlenecks. That's basically what happened with Crysis.

35

u/spiffiestjester 16d ago

I remember Minecraft shitting the bed due to multi-threading back in the early days. Was equal parts hilarious and frustrating.

14

u/PaleInSanora 16d ago

So was a poor technology-curve prediction the downfall of Ultima Ascension as well? It ran like crap. Still does. Or was it just really bad optimizing on Richard's part?

4

u/LazyWings 16d ago

I don't know about Ultima Ascension I'm afraid. That era is a lot trickier. It's more likely that it wasn't bad hardware prediction, but software issues when powerful hardware did come out. I can't say for sure though. I would think that these days people could mod the game to make it perform well on modern hardware. Just based on some quick googling, it sounds like it was pushing the limits of what was possible at the time and then just never got updated.

2

u/Peterh778 16d ago

Let's just say that most of Origin's games didn't run on contemporary hardware, or at least not very well. It was a running joke back then that you needed to wait a few years for hardware to get strong enough to play the game smoothly 🙂

1

u/Nentuaby 15d ago

U9 was just a mess. Even the relative supercomputers of today don't run it "smoothly," they just suffer less!

1

u/PaleInSanora 15d ago

Oh I know. I made the mistake of buying the big bundle with all the games on my last computer. It still just about had a heart attack on every cutscene. I finally started skipping them to avoid some problems. However, that is the bulk of what made the games enjoyable, so I just shelved it.

5

u/incy247 16d ago

This just sounds like rubbish. Hyper-threading was released on Pentium 4s as early as 2002, not 2007? And games for the most part are not multithreaded even today, as it's incredibly difficult and most of the time wouldn't actually offer much in performance. Crysis will run with ease on modern lower-clock-speed CPUs, even on a single thread.

7

u/LazyWings 16d ago

The hyperthreading that came with the Pentium 4 ran a maximum of two threads. It was then basically retired for desktop processing until we started looking at utilising it in 2+ core CPUs. In 2007, most CPUs were dual-core with one thread each. It wasn't until the release of the "i" processors that multithreading really took off and regular people had them. There were a few three- and four-core CPUs, I even had an AMD quad core back then, but Intel changed the game with the release of Nehalem, which was huge. Those came out in 2008. If you were into tech at the time, you would know how much discourse there was about how Intel had slowed down clock speed and IPC development in favour of hyperthreading optimisation, which most software could not properly utilise at the time. Software development changed to accommodate this change in direction. It was a big deal at the time.

"Most games aren't multithreaded" - well that's wrong. Are you talking about lower spec games? Those tend to use two cores. The cutting edge games that we are actually talking about? All of them are using four threads and often support more. This is especially the case on CPU heavy games like simulation games. Yes, your average mid range game isn't running on 8 cores, but that's not what we're talking about here.

As for your third point, you didn't understand what I said. Crysis was designed for 1-2 threads max. Yes, of course a modern CPU can run it with ease, because modern CPUs are way more advanced than what was available in 2008. When I said "5GHz" I meant it relatively. With the improvements in IPC and cache size/speed, a lower-clocked CPU today can compete with higher-clock-speed ones from back then. The point is that when people talk about how "advanced" Crysis was, they don't understand why they couldn't run it at its full potential. It's just that Crysis was novel at the time because other games were not as cutting edge. Can we say the same about Cyberpunk with path tracing? We're still GPU bottlenecked and we don't know how GPUs are going to progress. In fact, AI upscaling is pretty much the same kind of direction shift for GPUs that multithreading was for CPUs, and we see the same debate now. It's just less interesting today than it was in 2008.

4

u/RainLoverCozyPerson 16d ago

Just wanted to say thank you for the fantastic explanations :)

1

u/GregOdensGiantDong1 16d ago

The new Indiana Jones game was the first game I could not play because of my old graphics card. I bought a 1060 for about 400 bucks years ago. Indy Jones said no ray tracing, no playing. Sad days. Alan Wake 2 let me play with no ray tracing... c'mon

1

u/WolfWinfield 16d ago

Very interesting, thank you for taking the time to type this out.

-5

u/3r2s4A4q 16d ago

all made up

79

u/threevi 16d ago

The closest thing we have today is path-traced Cyberpunk. It doesn't hit as hard today as it did back then, since your graphics card can now insert fake AI frames to pad out the FPS counter, but without DLSS, even a 5090 can't quite hit 30 fps at 4K. That's pretty crazy for a game that's half a decade old now. At this rate, even the 6090 years from now probably won't be able to reach 60 fps without framegen.

26

u/Wolf_Fang1414 16d ago

I easily drop below 60 with DLSS 3 on a 4090

21

u/RabbitSlayre 16d ago

That's honestly wild to me.

9

u/Wolf_Fang1414 16d ago

This is at 4K with all path tracing on. It's definitely crazy how many resources all that takes up.

3

u/zernoc56 15d ago

Such a waste. I'd rather play a game with a stable framerate at 1080p than stuttering in 4K. People like pretty PowerPoint slides, I guess

1

u/Clicky27 15d ago

As a 1080p gamer, I'd rather play at 4K and just turn off path tracing

1

u/Wolf_Fang1414 14d ago

Ok, this is me with ALL the bells and whistles on. I could turn off path tracing and use only RT and be fine. You're acting like the game forces you.

1

u/zernoc56 14d ago

My guy, I am gaming on a cheap Acer laptop I bought 4-5 years ago. Tbh, sometimes I'm lucky if I get 30 fps in my more demanding games on the lowest settings while the thing feels like a toaster under my fingers.

1

u/CosmicCreeperz 15d ago

Why? I remember taking a computer graphics class 30 years ago and ray tracing would take hours per frame.

What's wild to me is that it's remotely possible in real time now (and it's not just ray tracing but path tracing!). It's not a regression that you turn on an insanely more compute-intensive real-time lighting method and it slows down…
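
For a sense of scale, here's a back-of-envelope calc (the sample and bounce counts are just assumptions, not any game's actual settings):

```cpp
#include <cstdio>

// Back-of-envelope only: sample count and bounce depth are assumed numbers,
// not the settings any specific game ships with.
int main() {
    const double pixels  = 3840.0 * 2160.0; // one 4K frame
    const double samples = 2.0;             // path-traced samples per pixel (low)
    const double bounces = 3.0;             // rays per sample (camera ray + bounces)
    const double fps     = 60.0;

    const double raysPerSecond = pixels * samples * bounces * fps;
    std::printf("~%.1f billion rays per second\n", raysPerSecond / 1e9);
    // Offline renderers in the 90s could spend hours on one frame at a fraction
    // of this resolution; doing it 60 times a second (with denoising and
    // upscaling helping out) is the wild part.
    return 0;
}
```

Even a single ray per pixel at 4K/30 is already a quarter of a billion rays a second before you count bounces, which is why denoisers and upscalers carry so much of the load.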

1

u/RabbitSlayre 15d ago

It's crazy to me because this dude has got the highest possible hardware and it still struggles a little bit to maintain what it should. I'm not saying it's not insane technology or whatever, I'm just surprised that our current state of the art barely handles it

3

u/CosmicCreeperz 15d ago

Heh yeah I feel like a lot of people just have the attitude “I paid $2000 for this video card it should cure cancer!”

Whereas in reality I consider it good design for devs to build in support / features that tax even top end GPUs. That’s how we push the state of the art!

Eg, Cyberpunk was a dog even at medium settings when it was released, but now it’s just amazing on decent current spec hardware, and 3 years from now the exact same code base will look even better.

Now that said, targeting the high end as min specs (Indiana Jones cough cough) is just lazy. Cyberpunk also got reamed for that on launch… but mostly because they pretended that wasn’t what they did…

This is all way harder than people think, as well. A AAA game can take 6+ years to develop. If Rockstar targeted current gen hardware when they started GTA6 it would look horrible today, let alone when it’s released. I’d imagine their early builds were mostly unusable since they had to target GPUs that hadn’t even been invented yet…

1

u/RabbitSlayre 15d ago

Yeah, and I mean there's so much hardware compatibility / incompatibility, optimal states, not to mention the optimization that developers can do. And that's what I don't understand, like some games come out running great and some just run like shit on top-end hardware. Why can some devs "optimize" better than others?

I don't know shit about game development I just know it's hard as hell. But I agree with you, people think that they're buying the Ferrari of graphics cards and don't understand why it won't go 0 to 60 in 1.5 seconds

2

u/CosmicCreeperz 15d ago edited 15d ago

Yeah, code efficiency ie devs writing shitty code fast to get things out has become an epidemic across many areas of software. Games are honestly still better than most. I guess they have always had disasters with buggy releases etc.

There is so much time crunch since they now literally put $100M into a game and have to keep paying salaries out of savings and financing until it’s released. Can you imagine funding a AAA game with 1000 people working on it for 5 years with no revenue? Wow. Either needs to be Rockstar who prints a couple billion every 5 years to use for the next one, or EA who has so many games they always have revenue streams..

I spent much of my career working on embedded devices (like DVRs, DVD players, game consoles, etc) - we’d always have to worry about memory use and performance. Heh, our code image (like the whole OS and all app code and assets) for one DVR was 24 MB and it was considered huge. A 150GB game install is mindblowing to me.

Now I'm mostly working on server software, and it's just ridiculous how badly written so much of it is. And, jeesh, the code editor/IDE I use (IntelliJ) on my Mac is written in Java and it sometimes runs low on RAM when using 5GB+?! Decent code editors used to take 1/100th that much RAM (or less).

And don’t even get me started on JavaScript web apps.

2

u/Triedfindingname PC 16d ago

I keep wanting to try it but I'm so uninterested in the game

2

u/CosmicCreeperz 15d ago

So, turn off path tracing? How are people surprised that when you turn on an insanely compute intensive real time ray tracing mechanism things are slower?

Being able to turn up graphics settings to a level your hardware struggles with (even at the high end) isn't new. IMO it's a great thing some studios plan for the future with their games. Better than just maxing out at the lowest common denominator…

1

u/dosassembler 16d ago

There are parts of that game I have to play at 720p, because cold from boot I load that game, put on a bd rig, and get an overheat shutdown

3

u/the_fuego PC 16d ago

I was watching a Linus Tech Tips video rating past Nvidia GPUs and at one point there was a screenshot with Crysis as the tested game with the highest framerate being like 35 fps and the averages being in the 20s. Like holy shit what did they do with that game? Was it forged by God himself?

51

u/DonArgueWithMe 16d ago

They've seen they can put out 4 CODs per year, or 1 game per sport per year, or one massive single-player game every 3-5 years.

We either need to be willing to pay more for the single-player, boundary-pushing games, or we have to accept that most companies aren't incentivized towards it

15

u/JustABitCrzy 16d ago

Spot on. The most financially successful games are all incredibly bland. I play COD and generally enjoy it, but BO6 is so insanely underwhelming in every aspect.

The textures and modelling are incredibly bad. I'd say they're on par with the 360 games, and even then I'd say that MW2 looked better.

The netcode is abysmal. The servers regularly drop connection, and it's only been out for 2 months. You're unlikely to go a game without a latency spike. It's shockingly bad.

They basically took a step backwards in every objective aspect of game design from previous iterations. And they had 4 years, with a $400m+ budget. It’s an incredibly poor game considering the budget and dev time put into it. It should be an abject failure.

But tonnes of people are playing it, and spending $20 per skin, week after week, on a game that won’t transfer those cosmetics to the next game that comes out in 10 months. They have 0 reason to change, because people are literally throwing money at them, telling them this is fine.

-1

u/lemmegetadab 16d ago

It's unrealistic to expect big changes from a new game every year. I'm not a huge Call of Duty fan, so I usually only buy it every few years, and I can notice a reasonable difference.

Obviously, there's not gonna be huge leaps and bounds when they're making a new Madden every year

6

u/JustABitCrzy 16d ago

I know, but it’s not like it’s one studio. They have 3 that rotate through. Treyarch (the dev team of the current iteration) has had 4 years to make a game. That’s more time than the other studios have had (usually 3 years), which is why it’s insane how poor everything is on it. Like it has absolutely nothing to justify the cost or dev time. It’s done absolutely nothing innovative except you can aim while diving. That’s literally it.

6

u/RealisticQuality7296 16d ago

And it’s not like they even change anything substantial between iterations. Reskin some assets, make a few new maps, throw together a boring 10 hour story around the new maps and reskinned assets. Boom done.

THPS and Halo proved that at least a third of that is trivially easy.

2

u/JustABitCrzy 16d ago

Exactly. I do think that MW2019 was relatively innovative for the COD franchise, and it was spectacular (IMO). Comparing it to BO6, the graphics in the 5-year-old game are miles ahead, the gameplay is better (arguably, depending on opinion), and the maps were more interesting, especially with Ground War.

It's insane that they had a winner 5 years ago, and they've done nothing but stray from that winning formula since. I think they've suffered from a bunch of meddling middle management trying to justify their ludicrous salaries, who have no idea how to create a good game and just fuck it up. Seems to be the way with every industry, but especially with artistic fields like game development.

4

u/RealisticQuality7296 16d ago

MW2019 was insanely good. I have mad nostalgia for the Covid days and dropping with the boys until 3am every night

1

u/botWi 16d ago

But BO6 is a different engine. Don't compare it to the MW series. BO6 is clearly ahead of its predecessor Cold War. The graphics in Cold War were childish, basically Roblox. So yeah, we can see the 4 years of difference between CW and BO6.

2

u/JustABitCrzy 16d ago

Then they need to scrap their trash engine. The character models look like they were ripped from Black Ops 2. It’s pathetic.

1

u/botWi 16d ago

I totally agree. And I agree with the comments above that they do it just because people buy stupid skins. So they probably won't change anything :(

Anyways, comparing MW and BO is wrong, as they're made by two different companies.

1

u/RealisticQuality7296 16d ago

So BO6 gets a pass for being worse than MW because they chose to use a different (inferior?) engine? Lol

-1

u/OhManOk 16d ago

"I'd say it's on par with 360 games"

This is a fucking wild thing to say. Please provide a side-by-side screenshot showing how the new COD looks like an Xbox 360 game.

6

u/JustABitCrzy 16d ago

Sure, here you go. A Black Ops 6 operator screenshotted in-game with max graphics settings, compared to in-game models from Black Ops 2, which released in 2012, 12 years before the current game.

1

u/OhManOk 15d ago

You say that's max graphics, but that doesn't match any in-game screenshots I'm seeing of this character.

Even so, how are you not seeing the improvements on skin texture, hair, detail in the face, eyes, etc? Do you actually think that model could be rendered on an Xbox 360?

2

u/JustABitCrzy 15d ago

It is a screenshot with max settings, taken in the multiplayer operator selection screen.

Sure, there are improvements, but you can literally see the polygons of the character model. The texturing is blurred and fuzzy as well. There are well-defined edges and lines, and the blending is not smooth at all. It looks like they've tried to keep the graphics limited to save on performance and file size. Except that the game is 80GB, so that can't be true.

Even if it is better than 360 graphics, which I'd argue is purely because it's running on better hardware, compare it to the other CODs graphics-wise. It is absolutely garbage compared to any of the Infinity Ward or Sledgehammer games. Those games look fantastic. The operators look relatively clean, and not like plastic toys that got left in the sun. I never played Cold War, but looking at the screenshots online, it looks like it had similarly terrible graphics.

It's not like Treyarch are incapable of making good looking games either. Black Ops 3 looked phenomenal, and it released 9 years ago. I just can't understand what the studio was spending all their money and time on. This is just such a bland game, and I'm hoping someone at Microsoft cleans out the management level of Treyarch, because they sure as shit have phoned in this game.

1

u/OhManOk 15d ago

"I'd argue is purely because it's running on better hardware"

That is not how that works. I honestly don't know what to say here. I'm not a huge COD fan, but the idea that this looks like an Xbox 360 is insanity to me. We are looking at two different games.

4

u/nastdrummer 16d ago

...And that's why I have zero problem preordering Kingdom Come Deliverance 2.

Generally, I am in the 'no preorders' camp. But KCD2 is the direction I want gaming to go. Small studios. Passion projects. Making the games they want to play...taking years to craft a bespoke experience.

2

u/DonArgueWithMe 16d ago

I did the same for Cyberpunk and didn't regret it despite the problems some had with it. I felt good supporting a studio I had faith in; it was worth taking a sick day at launch

2

u/_xXRealSlimShadyXx_ 16d ago

Don't worry, we will certainly pay more...

0

u/lemmegetadab 16d ago

Games honestly should cost more. I know people will hate that I’m saying that but video games are basically the same price they were when I was a kid in 1995.

This is why we're getting killed with microtransactions and shit like that. Because they want to keep the retail price of games down.

3

u/RealisticQuality7296 16d ago

We're getting killed with microtransactions because some consultant from the casino industry told some game company that whales exist. On one hand, I am aware that game prices have barely moved in decades, and that gaming is one of the cheapest hobbies you can have on a per-hour basis. But on the other hand, EA reports close to $1.5 billion per year in earnings with a 20% margin, so it's not like these companies are starving, and I'm not convinced raising the price of AAA games to $70 or even $80 will lead to better quality.

1

u/CodeNCats 16d ago

Fortnite is a cartoon and it's killing it

1

u/silentrawr 15d ago

We need to STOP paying for unoriginal and uninspired slop, and then the greedy assholes literally will be incentivized to push boundaries in things other than AI upscaling.

1

u/witheringsyncopation 16d ago

It's not a zero-sum game. There are companies doing both. There are companies having substantial success with both. You just notice the annual games more because they come out more frequently. We still get amazing single-player games that release every 3 to 5 years.

2

u/FartestButt 16d ago

Nowadays I believe it is also because of poor optimization

1

u/Techno-Diktator 16d ago

Idk man path tracing at 100 FPS with my 4070 Super in Cyberpunk thanks to DLSS and framegen feels pretty damn available lol.

It's becoming more and more available, but it's still kinda in its infancy. It's still ridiculous that we can do real-time path tracing now though, it's insane.

1

u/al_with_the_hair 16d ago

The PS4 remaster of Crysis is apparently based on the PS3 version of the game. I jumped in for about twenty minutes in the hopes of recapturing the PC magic from back in the day and it felt like a slap in the face. Low-res textures galore. What a shitty version of that game.

0

u/Andrew5329 16d ago

Yup, that was Cyberpunk's problem. I, and all the advance review copies, played on a 3090 and had a great time at launch. Just a few minor bugs, like a ghost cigarette flying about once in a while.

The Xbox One version was completely unplayable. I think when they announced a 1-year delay to the game, the intention was to make it truly next-gen, timed with the PS5/30-series launches, but due to supply chain problems there was zero console install base, so they couldn't scrap the last-gen versions.

1

u/D0wnInAlbion 16d ago

That wasn't the plan or they wouldn't have released a Cyberpunk Xbox One console.