PC Gaming in the Early 2020s
Post by anayo on Jan 13, 2024 13:13:10 GMT -5
I got into PC gaming in 2018. It is now 2023. During those 5 years, I’ve hopped around different hardware ranging from low end to high end.
There is a lot of hype in PC gaming today. Some of it I think is well-deserved. Some of it is stupid and overblown.
I don’t stand to gain anything by selling people stuff they don’t need, so I think I can help sort out some of the BS from the stuff that’s actually worth caring about. I’ll start with a list of reasons to get a gaming PC, arranged from best to lamest.
1 High Frame Rates
High frame rates make you feel more firmly “in control” of your game. This is an excellent feature. If I were to get a gaming PC for no other reason, it would be for this.
Video games do not actually move. They present a series of static images so quickly that it tricks the human brain into seeing movement.
Cinematic single player games on a home console such as the PlayStation 4 try to show the player 30 frames per second. This way, the game can look as beautiful as possible without getting so choppy that it irritates the player.
A gaming PC can be several times more powerful than a PlayStation 4, so a 30 fps console game can run at a higher frame rate on PC. Maybe the PC can achieve 60 fps, 120 fps, or even more. This gives the game smoother motion.
Originally, I thought smoother motion was supposed to make the graphics look nicer. I was wrong. High frame rates make the game feel better. It’s a tactile improvement, not a visual one.
To get across what I mean, I have to bring up something so flagrantly wrong that I hardly know where to start setting it straight. This excerpt is from the September 1998 premiere issue of Maximum PC magazine:
These people do not know what they're talking about. I don't completely blame them, since 100+ Hz monitors probably weren't widely available in 1998. In 2020 I had a 144 Hz monitor, but I was playing Apex Legends on a Radeon HD 7870 (an aging card from 2012) at 40-50 fps.
Later, somebody on Facebook Marketplace gave me a free GTX 1050 Ti (a budget card from 2016). Suddenly I could play at about 100 fps. Right away, my kill count went up, and I was winning more matches than before. The fluid motion made me feel like Neo from The Matrix, bobbing and weaving around bullets as they harmlessly flew past me.
The experience reminded me of NiGHTS into Dreams (Sega Saturn, 1996), which I played with a digital pad at first before upgrading to the Saturn 3D pad with its analog stick. This was a breakthrough that skyrocketed my scores right away. The analog stick subdivided the pad's rigid directions into much smaller “pie slices,” letting me go exactly where I wanted. I was no longer locked to 4 directions, or even binary combinations of those four.
High frame rates have the same effect as analog controls, just applied to the dimension of time instead of space. Dividing each second of gameplay into smaller slices lets me move exactly when I want. Conversely, low frame rates delay my inputs, making the game feel unresponsive and “heavy”.
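To put rough numbers on those slices of time, here's a quick back-of-the-envelope sketch (a simplified model; real input pipelines have more stages than this):

```python
# Each frame rate slices one second into equal pieces. An input can land
# anywhere inside a slice, so it can wait up to a full slice before the
# game even samples it. Bigger slices, mushier feel.
for fps in (30, 60, 120, 144):
    frame_ms = 1000 / fps
    print(f"{fps:>3} fps -> {frame_ms:5.1f} ms per frame, "
          f"up to ~{frame_ms:.1f} ms of extra input delay")
```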
Even in non-multiplayer games where you're not competing with anybody else, high frame rates are still amazing. Star Wars Jedi: Fallen Order's single player campaign felt incredible at triple-digit frame rates on my RTX 3070. Even though I played on an Xbox 360 controller instead of a mouse, I could still feel the camera picking up every minute jitter of my thumb, snapping exactly where I told it to go with laser precision.
I now insist on at least 120 fps for online competitive shooters and fast-paced single player games like Doom Eternal or Ratchet & Clank: Rift Apart. The effect is less noticeable in slow-moving games, so I find 60 fps acceptable in Red Dead Redemption 2, which is more languid and graphically beautiful. In Dark Souls, combat was so slow and lumbering that 60 fps felt fine.
30 fps is out of the question on a modern PC. That feels like playing Super Mario 64 with a d-pad. I will only settle for 30 frames on the lowly Nintendo Switch, or vintage games built from the ground up to run at a low frame rate.
2 AI Image Reconstruction (DLSS 2.0)
DLSS makes frame rates higher. In exchange, you lose some picture clarity. But the cool thing about DLSS is that it reduces picture clarity in an artful way that's often hard to notice, while the increase in frame rate is very noticeable. DLSS is great. It lets me sacrifice a little and get a lot in return. I recommend getting an Nvidia RTX GPU just for this feature.
There is a news interview with Nvidia CEO Jensen Huang where he talks about having a graphics card render one pixel in a game, then having AI “guess” the other 3 pixels. I’m pretty sure he was talking about DLSS on the “performance” setting when he said this.
Apparently, when they have AI “guess” how a 1920 x 1080 image would appear at 3840 x 2160, this guesswork is faster than actually rendering it, and the AI-upscaled image appears mostly as you would expect it to. Today they use that process to make PC games run faster. It's basically cutting corners, but cutting corners in ways that are usually hard for me to notice. Well, usually.
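The arithmetic behind Huang's one-real-pixel-in-four claim is easy to check:

```python
# DLSS Performance-style upscaling renders at 1920x1080 and outputs at
# 3840x2160. Compare the pixel counts.
rendered = 1920 * 1080       # 2,073,600 pixels the GPU actually draws
displayed = 3840 * 2160      # 8,294,400 pixels that reach the screen
print(displayed / rendered)  # 4.0 -> one real pixel per four shown
```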
In F.I.S.T. Forged in Shadow Torch, I needed DLSS to play with the settings I wanted. It convincingly touched up the picture to look crisp and detailed. But it created strange, dark outlines around my character's head when he went outdoors. I didn't like that very much.
Fortnite did this, too, though only at the very beginning of a match, when the battle bus carries everyone to the island. My camera would look at the bus over the ocean, but the outline around the bus was not ocean-colored. I wondered if the AI was “filling in the blanks” with the earthy color of dry land. It was quite odd.
For the most part, though, DLSS just works. In God of War, I found DLSS Quality to be virtually indistinguishable from native 4K. All I could really perceive as the end user was that it raised my frame rate from 90 to 120.
In Ratchet & Clank: Rift Apart, DLSS Performance did make the image somewhat soft and shimmery, but the pace of the game was so frantic that it was nearly impossible to dwell on this. So it was worth trading a fuzzier picture for smoother motion.
DLSS isn't without its drawbacks, and I don't recommend turning it on if your game is already running well. I've grown fond of it, though. I think it's here to stay. It gets a solid recommendation from me.
3 High Resolutions
PCs can put more pixels on screen than a console, making the image sharper. While I don't think this matters as much as high frame rates, it is still cool. My verdict is that 1080p (1920 x 1080 pixels) is for people who want to save money, 1440p (2560 x 1440 pixels) is for everyone else, and 4K (3840 x 2160 pixels) is for crazy people.
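For context, here are the raw pixel counts (a crude proxy for how much harder each resolution is to drive; real-world scaling varies by game and GPU):

```python
# Pixels per frame at each common gaming resolution, relative to 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:>9,} pixels ({px / base:.2f}x 1080p)")
```

1440p is about 1.78x the pixels of 1080p, while 4K is a full 4x, which goes a long way toward explaining the cost cliff between the two.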
Today, 1080p is a budget resolution. Suitable computer components are cheap and abundant. We even have 1080p handheld gaming PCs now.
1080p games look okay to me on my parents' 60 inch 4K living room TV, probably since I'm viewing it from across the room. When seated up close at a computer desk, I found 1080p to be rather blurry in Apex Legends. I wondered if it was making me worse at the game because I couldn't clearly see threats from far away.
1440p is a welcome bump in visual detail. Triple-digit frame rates are a little more costly to reach at this resolution, but the cost to get there is still mostly reasonable. It's easy to recommend 1440p as the “sweet spot” for most people.
4K gaming is insane. GPUs capable of 4K 60 fps are available, but currently cost about $500. GPUs intended for 4K at triple-digit frame rates in the latest games still cost over a thousand dollars at retail.
Good 4K gaming monitors cost about as much as those GPUs. I did not pay anywhere near that for my 4K equipment (an RTX 4080 graphics card and an LG UltraGear 48GQ900-B). Local people sold them to me for less than half of MSRP. If not for that, I'd still be gaming at 1440p on an RTX 3070. This gear is amazing, but personally I'm not sure it elevates the gaming experience enough to justify what it would cost brand new.
When gaming at 4K, there are other practical considerations aside from cost, like desk space. I'm not even sure true 4K is perceptible on a monitor any smaller than 36”. And you need to be fairly close to the monitor, which is probably why I think 1080p looks okay on my mom and dad's living room TV from afar. That's why I think 4K could be a waste under the wrong conditions.
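There's a rough way to sanity-check that intuition using the common one-arcminute rule of thumb for visual acuity (a sketch with illustrative sizes, not a vision-science claim):

```python
import math

# Estimate the distance beyond which individual pixels blend together
# for a typical eye (~1 arcminute of resolving power). Assumes 16:9.
def ppi(diagonal_in, w_px, h_px):
    return math.hypot(w_px, h_px) / diagonal_in

def blend_distance_ft(ppi_val):
    # distance at which one pixel subtends 1/60 of a degree
    return 1 / (ppi_val * math.tan(math.radians(1 / 60))) / 12

for name, diag, w, h in [("27in 1440p", 27, 2560, 1440),
                         ("48in 4K", 48, 3840, 2160),
                         ("60in TV showing 1080p", 60, 1920, 1080)]:
    d = blend_distance_ft(ppi(diag, w, h))
    print(f"{name}: pixels blend beyond ~{d:.1f} ft")
```

By this estimate, 1080p pixels on a 60-inch TV stop being resolvable around 8 feet out, which is about where a couch sits, while a 48-inch 4K monitor only pays off within roughly 3 feet.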
If your computer is powerful enough for 4K but your desk is too small for a large monitor, it might make more sense to render games at 4K on a smaller 1440p monitor. “Supersampling” is the generic term for this. I used supersampling while playing Dark Souls, which “squeezed” 4K gameplay onto my 1440p monitor. It looked far nicer than native 1440p.
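In spirit, supersampling is just averaging: render more pixels than the screen has, then blend neighboring ones down to the output grid. A minimal sketch of the idea (using a clean integer 2x ratio for clarity; my 4K-onto-1440p case is a messier 1.5x, and real drivers use fancier filters):

```python
import numpy as np

# Toy supersampling: average each 2x2 block of a 4K frame down to one
# 1080p output pixel. The random array is a stand-in for a rendered frame.
hi = np.random.rand(2160, 3840, 3)                      # "4K" frame
lo = hi.reshape(1080, 2, 1920, 2, 3).mean(axis=(1, 3))  # -> "1080p"
print(lo.shape)  # (1080, 1920, 3)
```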
If you still insist on true 4K: I found the experience overwhelming on my LG UltraGear monitor at first. It put me into sensory overload, so I enabled the 3840 x 1600 ultrawide mode, which added black letterbox bars to the top and bottom of the screen. This softened the blow while I reprogrammed my brain to take in so much visual stimulation at once.
Once I got my bearings, the Resident Evil 4 Remake demo in 4K made a striking impression on me. I hesitate to say it was like I could reach out and touch the game world, but 4K definitely made it more intense and visceral. I daresay it was scarier, which is perfect for a horror game.
I didn't really like Wolfenstein II as a game, but in 4K it was quite an experience. The words that kept coming to me were, “Larger than life.” The picture was vast enough for me to get lost in it.
In Apex Legends, the titanic 48” screen size made my shots more accurate. There is a weapon in Apex called the “wingman”. It's basically a hand cannon that chomps 40% health out of enemy players, but has a low rate of fire. Formerly I would avoid the wingman because I could never hit anyone with it on my smaller monitor. But on a larger screen, the wingman became far more viable and deadly. Basically, I tightened my shot group by physically enlarging my target.
Another time in Apex, my squad and I were traveling across the map when out of nowhere I alerted them, squawking into my headset, “I see enemies there.” My teammate said, “What are you talking about? I don't see anyth-” Then suddenly the enemies shot at us. He said, “How did you even see them?”
I don't think there's any use in going beyond 4K. That's it. You are colliding with human perceptual limitations at that point. I even tried supersampling Doom Eternal at 8K because it ran too well at 4K. It didn't look any better. My frame rate just jumped off a cliff while the picture looked basically the same as before.
4 Raytracing
Raytracing is intended to make graphics look better by improving the lighting. In exchange, you lose some of your frame rate, since raytracing needs much more processing power than rasterization.
I foresee that raytracing will be very impressive in the near future. But currently, raytracing effects can be subtle or hard to notice. Truly breathtaking, transformative raytracing is either far from widespread adoption, or calls for crazy powerful hardware almost nobody can afford. People who want a peek at the future of video game graphics will appreciate raytracing. Everyone else can probably ignore it for now.
For the past 10 years, video game graphics have looked great. We're no longer stuck with the PlayStation and N64's silly-looking origami graphics. But video games still look like video games.
To illustrate what I mean by that, one time I was listening to some movie critics on YouTube lambast a bad movie. While complaining about the movie’s lousy CGI effects, one YouTuber said that the effects in one scene “looked like a video game.”
This remark struck a chord with me, because it pointed out that even though PS4 graphics look great, we wouldn’t settle for PS4 graphics in a Hollywood movie. Somehow PS4 still falls short of real life.
There are many reasons for this, but the way lighting works in video games is partially responsible. Games on PlayStation 4 look beautiful, but they do not simulate light in a physically accurate way. Instead they use rasterization.
My understanding is that rasterization is mostly dependent on where your in-game camera is aiming. In a rasterized world, if a tree falls in the woods and nobody’s around to see it, then in terms of lighting it didn’t really happen.
In a raytraced world, lighting phenomena still happen regardless of whether the camera directly sees them. This enables effects that are hard or downright impossible with rasterization, like a mirror reflecting things you cannot directly see, or light coming in through a window, hitting the floor, and pinballing off the walls and furniture. In some games this looks amazing.
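Here's a tiny runnable illustration of that difference (the scene values are made up; this is a sketch of the principle, not anyone's actual renderer):

```python
import numpy as np

# One camera ray hits a mirror wall; the bounced ray is traced back into
# the scene and strikes a sphere sitting BEHIND the camera. A rasterizer
# that only processes geometry in front of the camera knows nothing
# about that sphere, so it cannot put it in the reflection.

def hit_sphere(origin, direction, center, radius):
    # smallest t > 0 with |origin + t*direction - center| = radius
    oc = origin - center
    b = 2 * np.dot(oc, direction)
    c = np.dot(oc, oc) - radius**2
    disc = b * b - 4 * c          # direction is unit length, so a = 1
    if disc < 0:
        return None
    t = (-b - np.sqrt(disc)) / 2
    return t if t > 0 else None

camera = np.array([0.0, 0.0, 0.0])
ray = np.array([0.0, 0.0, -1.0])                    # looking down -z at a mirror
mirror_z, normal = -5.0, np.array([0.0, 0.0, 1.0])

hit_point = camera + ray * (mirror_z - camera[2]) / ray[2]
reflected = ray - 2 * np.dot(ray, normal) * normal  # bounces back toward +z

sphere_center = np.array([0.0, 0.0, 3.0])           # behind the camera!
t = hit_sphere(hit_point, reflected, sphere_center, 1.0)
print("reflection shows the off-screen sphere:", t is not None)  # True
```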
The problem with raytracing is that in the wrong hands, it can cut frame rates in half while barely looking any better than rasterized graphics. “More physically accurate” doesn’t always mean “better looking”.
Mechwarrior 5’s raytracing stood out as especially unimpressive to me. I don’t understand why it tanks the frame rate so badly for what I’m seeing on screen.
I can’t even tell what the raytracing is doing in Capcom games. Resident Evil Village looks good, but I truly have no idea how its raytracing options differ from rasterized graphics.
The raytracing in F.I.S.T. Forged in Shadow Torch did look nice, but it was very subtle. It’s almost like I had to know what I was looking for in order to notice it.
Control has very nice-looking raytracing. When I showed it to my brother, who is not a graphics nerd, even he appreciated the difference. It cut the frame rate in half, though. I had to use DLSS for smooth gameplay.
Control's most dramatic raytracing effect has to be the reflections. In one scene, two security guards were pursuing me. I peered down the hallway to check if the coast was clear. While I couldn't see the guards themselves, their figures were visible in a pane of glass, inching closer toward me with their weapons drawn. It was so strange to perceive a threat by such indirect means in a video game.
Doom Eternal's raytraced reflections look really nice, since the game is full of metallic, industrial surfaces. There are weird consequences, though. You can see your Doom Slayer's reflection, but his arms and gun can poke through the reflective surface in front of you, resulting in a reflection where his arms are chopped off at the wrists. It's a bit surreal, to say the least.
In one part of Doom Eternal, I was in a room full of demons hurling fireballs at me. As I fled, I peered down and saw a glass vial on the floor containing a health potion. A specular gleam was traveling across the curvature of the vial. I realized this gleam was a fireball passing above my head. I watched it pass harmlessly behind me, fully visible even though I was looking down at the ground.
In Metro Exodus, raytraced light can bounce off of solid surfaces more than once. This makes it possible for a lone beam of sunshine to pour into a room, splash off the floor, and flow in all directions, painting the walls with a gentle glow. This effect is called raytraced global illumination, and it looks incredible.
It did make the graphics look rather lopsided, though. The lighting was closer to what I'd expect from pre-rendered CGI, while the models and assets looked like they were originally meant for a PS4 game. These two aesthetics clashed in my brain somewhat.
Fortnite's raytracing is among the most striking I've seen yet. Unreal Engine 5's raytraced lighting system, called “Lumen”, was only available in Fortnite at first. After I got used to playing with Lumen, I turned it off, and suddenly the scene looked virtual and fake. I believe Lumen uses some form of raytraced global illumination. I hope it isn't long before this technology shows up in Unreal Engine 5 games other than Fortnite.
All the raytracing I've described up to now is half rasterized, half raytraced. But the holy grail of raytracing is pathtracing. Pathtracing is 0% rasterized and 100% raytraced. It turns the rendering pipeline into one gestalt light simulation. At the time of writing, I am only aware of 4 official pathtraced games. I have played 2 of them: Portal with RTX and Cyberpunk 2077.
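Stripped of all geometry, the statistical core of a path tracer is surprisingly small: follow many random light paths per pixel, attenuate by the surface color at each bounce, and average. A toy sketch with made-up constants (nothing here is any shipping renderer):

```python
import random

# Follow random light paths, attenuating energy at each bounce, and
# record how much light each bounce depth contributes on average.
random.seed(0)
ALBEDO = 0.6          # fraction of light a surface reflects (made up)
HIT_LIGHT = 0.3       # chance a bounce happens to reach a light (made up)
MAX_BOUNCES, PATHS = 8, 100_000
contribution = [0.0] * MAX_BOUNCES

for _ in range(PATHS):
    energy = 1.0
    for depth in range(MAX_BOUNCES):
        energy *= ALBEDO                  # surface absorbs some light
        if random.random() < HIT_LIGHT:   # path reached a light source
            contribution[depth] += energy
            break

for depth, total in enumerate(contribution, start=1):
    print(f"bounce {depth}: {total / PATHS:.4f}")
```

Each extra bounce contributes less, but the first few matter a lot, which is why multi-bounce lighting like Metro Exodus' global illumination looks so different from single-bounce approximations.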
Portal with RTX left me with the profound impression that I was experiencing a leap forward in game graphics that simply wasn't possible before. It was a sense of awe comparable to seeing Samus Aran set foot on the pirate space station on my GameCube at age 13, or emerging from Fallout 3's Vault 101 at age 19. It's been too long since game graphics felt truly “next gen” to me.
Portal with RTX brought my poor RTX 3070 graphics card to its knees. Very aggressive DLSS was required for a playable frame rate. I guess it looked okay enough for me to finish the game, but it was often quite fuzzy, like Chell could have used a pair of glasses.
Cyberpunk 2077 with pathtracing wasn’t even worth playing on the RTX 3070. If the frame rate wasn’t obscenely low, then the DLSS was so intense that it looked like my in-game character would have bombed that test in the optometrist’s office where they ask you to read the letters on the wall.
Later, I got an RTX 4080, which is far more costly and powerful than an RTX 3070. I tried Cyberpunk 2077's pathtraced mode again, tinkering with the settings before landing on 50-60 fps at 3840 x 1600 with DLSS Performance. The picture quality was mostly okay, but there were often strange artifacts, like out-of-place dark outlines around female characters' hairdos.
The picture tended to “squiggle”, like a living Vincent van Gogh painting. My impression was that the AI was struggling to infer what was actually supposed to be on screen from limited real information. I would have loved to give the AI more real pixels to work with, but 60 frames was already barely within my grasp; I couldn't ease up on DLSS any further without tanking the frame rate. Pathtraced Cyberpunk is barely possible even on the mighty RTX 4080, which crushes nearly every other game I throw at it.
Even though today's high-end hardware can barely run it, something about Cyberpunk’s pathtracing made the world look more grounded, I daresay tangible. It was hard to quantify when I turned it on, but I sure noticed once I turned it off. Without pathtracing, Night City’s inhabitants sometimes appeared flat and plastic, reminding me somewhat of vintage CGI.
Overall, I’d say raytracing in 2023 is like gaming with a 3D accelerator card in 1997. The experience is nicer, and the technology is really cool. However, it still costs too much to do it properly, I'm not sure today's raytracing cards are going to age very gracefully, and I can’t think of anything earth-shattering that requires it yet. If you want an early peek at the future, then go for it. But I wouldn't blame you if you bought a GPU that was worse at or even completely incapable of raytracing to save money.
5 AI Frame Generation (DLSS 3.0)
Frame generation is disappointing. Only Nvidia’s latest RTX 40 series cards officially support AI frame gen. I do not endorse buying a 40 series card just for this feature.
The purpose of frame generation is to make frame rates higher without actually rendering more frames. To accomplish this, frame gen inserts AI-generated frames in between “real” frames. So, you play your game with half “real” frames, half AI frames.
What this fails to address is why I care about high frame rates in the first place: the improvement to the feel of the game, not its aesthetic appearance. AI frames are not interactive, so they don't actually make the game respond any faster.
If you frame gen 60 fps to 120 fps, it looks silky smooth, but still feels like 60 fps. If you frame gen 30 fps to 60 fps, it feels terrible. It's like aiming a laser pointer that trails behind by a fraction of a second. How is this so different from using an analog joystick to play a game that only supports digital controls?
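Putting rough numbers on it (a simplified latency model; real pipelines have more stages):

```python
# Frame generation doubles the displayed rate, but input is still
# sampled only on real frames, and interpolation has to hold back one
# real frame so it has two endpoints to blend between.
for real_fps in (30, 60):
    real_ms = 1000 / real_fps
    print(f"{real_fps} fps base -> displays {real_fps * 2} fps, but input "
          f"is still sampled every {real_ms:.1f} ms, "
          f"plus up to ~{real_ms:.1f} ms of hold-back delay")
```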
AI frame generation just isn't for me. Maybe “AI-assisted motion blur” would be the most charitable way to look at it.
Now I will recommend 4 different PCs targeting 4 different price points.
1 – the $300-ish 1080p PC
You can't walk into Microcenter and buy one of these. This price is based on cherry-picked second-hand prices I've seen in my area. It will require some effort to find this hardware for this price. But it can be done.
A PC like this will not support raytracing, DLSS, or smooth 4K. But it will support high frame rates at 1080p in games from the PS4 generation. High frame rates are not a superficial or cosmetic improvement. The purpose of the extra frames is to supercharge the player's kinesthetic connection to the game. I found the impact of this just as profound as going from keyboard-only controls in Doom and Duke Nukem 3D to the now-ubiquitous keyboard-and-mouse combo introduced with Quake. So, if I had to build a gaming PC that could do nothing else, at the absolute minimum I would make sure it could play breakneck action games as close to 120 fps as possible.
For this tier of PC I recommend a pre-owned GTX 1080 graphics card ($100), which was high-end in 2016, and a 1080p 144 Hz monitor ($100). I have seen at least one Facebook Marketplace listing for a B450 motherboard + Ryzen 5 2600 CPU + 16 GB DDR4 combo ($100). With a 600 watt power supply and a case, this hardware would be worthwhile for graphically lightweight games at triple-digit frame rates.
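For the record, those cherry-picked listing prices add up like so (local used prices, not retail; power supply and case come on top):

```python
# Tally of the quoted second-hand prices for the budget build.
parts = {
    "GTX 1080 (used)": 100,
    "1080p 144 Hz monitor (used)": 100,
    "B450 + Ryzen 5 2600 + 16 GB DDR4 combo (used)": 100,
}
print(f"${sum(parts.values())} before power supply and case")  # $300
```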
2 – the $500-ish PC
Do not get a PC at this price. Get a PS5. My view is that no PC can compete with a PS5 for this amount of money.
3 – the $1000-ish 1440p PC
I think this is the sweet spot: a PC appreciably better than a PS5 without going off the deep end. With pre-owned parts, I'd get an RTX 3080 ($450), a 144 Hz 1440p monitor ($200), and some kind of AM4 socket motherboard ($70) with a Ryzen 7 5800X ($150). 16 GB of DDR4 RAM ($50) and a 750 watt power supply ($75) should round it out.
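Those quoted used prices land right on target:

```python
# Tally of the quoted second-hand prices for the 1440p build.
parts = {
    "RTX 3080": 450, "144 Hz 1440p monitor": 200, "AM4 motherboard": 70,
    "Ryzen 7 5800X": 150, "16 GB DDR4": 50, "750 W PSU": 75,
}
print(f"${sum(parts.values())}")  # $995 -> right at the $1000-ish mark
```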
4 – the 4K PC for crazy people (easily multiple thousands of dollars)
I do not think a PC like this is worth the asking price. I do have one, but I fished through dumpsters and sniped insane deals on Craigslist and Facebook Marketplace to assemble it. If I had bought it all brand new, I would have felt gross.
Some General Thoughts on Modern PC Gaming
I took a break from almost all mainstream modern gaming from roughly 2012 to 2018. Getting back into it, I was apprehensive that modern games had all melted into a homogenous glop of semi-interactive movies: visually beautiful but mechanically braindead QTEs.
I walked away with a more positive outlook. Certain modern tropes still irk me somewhat, like the inexplicable need for every game to have a skill tree, or that part from Uncharted where Nathan Drake climbs a rock wall showing up again and again and again in every game ever. But there are still modern gaming experiences I sincerely enjoy. I wholeheartedly recommend these modern single player games on PC:
Star Wars Jedi: Fallen Order
Ratchet & Clank: Rift Apart
Doom Eternal
God of War
Titanfall 2
That being said, I think my heart still lies with 90s PC gaming. Tastes developed during the formative years of my upbringing do factor into this. But I also think certain economic realities have permanently changed the PC gaming landscape.
For one, the skyrocketing cost of making big budget games has left game creators increasingly risk-averse, reluctant to inconvenience the player, and incentivized to target the widest possible audience. This likely accounts for all the homogeneity and recurring tropes, whereas indie games have been more vibrant and willing to try new things. From 2012 to 2018 I went nowhere near modern AAA games, but quite a few indie releases caught my interest.
It's possible the “dumbing down” of modern gaming touches on societal changes as well. While owning and using your own PC became a lot easier in the 90s than it had been before, having a computer of your own was still a novel concept for most households, and the bar of technical proficiency expected of users was higher than it is today. So, it's possible that the top PC game genres of the 90s reflected this, with standout genres such as real time strategy and simulation.
Maybe 90s PC owners just tended to be more cerebral people, and the market at the time catered to that. I would love it if modern games combined state-of-the-art visuals with the combat simulation of Comanche 3, the strategy of Dungeon Keeper, or the rich interactivity of Thief. Maybe it's my fault and I haven't looked hard enough for modern experiences like this. But PC games like that don't seem so prominent anymore.
Corporate consolidation in the 2010s wiped out many all-star teams from the 90s. Monolith, Blizzard, Bethesda, and BioWare still exist in the 2020s, but they aren't the same thing. Thankfully, I do sense a genuine creative soul in games from studios like Respawn Entertainment and Insomniac. I'm also really into whatever FromSoftware is doing with Dark Souls and its spinoffs. Instead of trying to appeal to everybody, they're intentionally pissing people off, and the craziest thing of all is that it's working. But it's also hard to escape the feeling that PC gaming in the 2020s is a high school reunion where most of the old familiar faces are gone.
There is also talk of Moore's Law slowing down. So, maybe it's no longer economically feasible to develop forward-looking games targeting tomorrow's hardware. Some aren't even sure that tomorrow's hardware will offer that much more processing power than today's. Nvidia claims this is why they're pivoting so strongly toward AI technology like DLSS and frame generation.
The most noticeable consequence of this is that there is seemingly no such thing as Crysis anymore. We do have Cyberpunk 2077 with a pathtraced graphics mode, but Cyberpunk came out on PS4, too. It's rough and doesn't perform very smoothly, but it's not like you need a monster PC just to play the game itself. I miss how the 90s had unique games designed especially for the PC with no concern as to whether it would ever run on anything else.