Funnily enough, I'm using a 4K TV, but don't actually have any 4K content for it. It's nice for looking at high-resolution photos, but for games, I don't see it making a dramatic leap in quality.
Occasionally we watch 4K movies on our 4K TV, but that's about it. Usually it's 1080p movies and shows.
I agree that ultra-high resolutions bring quickly diminishing returns when it comes to gaming. Wow, the ray tracing on that droplet of water really looks defined on my protagonist's eyebrow. Who gives a shit. Give me game design that has had as much work put into it as the graphics, then we're talking.
I've become a sucker for the resolution bump. I kind of double dipped on PC over the years for most of the FPSes I loved on the 360, and playing them again at better resolutions and framerates really improves the games even more. It's insane to me that I originally played stuff like FEAR on the 360, which seemed to run at like 15 FPS; it didn't even feel like 30 FPS. I've become picky about it too with this genre; anything FPS-wise this gen that isn't 60 FPS annoys me now and seems unacceptable.
I might get an RTX 3070 when they drop this year as I want to build a monster rig, I guess I could probably target 1440p. Monitor shopping seems like a serious headache though.
I still don't have a 4K TV around or anything with HDR. Maybe I'd eat my words, but I just wonder if the difference is really that big for current consoles and what they can actually push... personally I'm more interested in waiting for OLEDs to keep dropping in price. I hear they do blacks and colors really well, and I'd love to see that again; I haven't since the CRT days.
Yes many times. It's easy to do. Just use the "Submit" option.
Also thanks, had a hunch I was missing something stupidly simple.
Trophies come in handy here... I forgot to jot down the day I beat the DLC, but that will be logged on the trophy for beating the final boss.
personally I'm more interested in waiting for OLEDs to keep dropping in price
Now that is technology I will endorse. I've only got one OLED screen, my original Vita, but man is it pretty. The blacks are indeed deeper, but the color vibrancy is also much richer than LCD. I'm looking forward to the day when OLED screens are as inexpensive to produce as LCDs. (Hopefully that day will come.)
Thanks for finding out and summarizing the more in-depth explanation, Ex.
I appreciated your input as well. I'm glad there are enough knowledgeable folks hanging around HRG now that we can talk about these things at an appreciable technical level.
I think I was conflating TSAA with something like bilinear filtering, which just blends the color values of neighboring pixels and softens the sawtooth edges between them. I knew better than that, but failed to question this incorrect assumption. My weak understanding of TSAA is probably preventing me from 100% understanding the underpinnings of DLSS. I can at least understand from your explanation that the former seems to be a prerequisite to the latter.
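Just to pin down what I mean by that kind of blending, here's a toy sketch I threw together (my own scratch code, not anything a real driver actually does; proper bilinear filtering weights the four nearest pixels by distance):

```python
# Toy sketch (my own, not any real driver's code): bilinear sampling blends the
# nearest pixels, weighted by distance, so it smooths edges without inventing detail.
def bilinear_sample(image, x, y):
    """Sample a grayscale image (list of rows) at a fractional coordinate."""
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, len(image[0]) - 1), min(y0 + 1, len(image) - 1)
    fx, fy = x - x0, y - y0
    top = image[y0][x0] * (1 - fx) + image[y0][x1] * fx      # blend along x, top row
    bottom = image[y1][x0] * (1 - fx) + image[y1][x1] * fx   # blend along x, bottom row
    return top * (1 - fy) + bottom * fy                      # blend the two rows along y

# "Upscaling" a 2x2 image to 4x4 only interpolates between values already there:
img = [[0.0, 1.0],
       [1.0, 0.0]]
upscaled = [[bilinear_sample(img, x / 3, y / 3) for x in range(4)] for y in range(4)]
print(upscaled)
```

The point being: it only averages values that are already there, so it can soften edges but never add detail, which is exactly where my mental model of DLSS fell apart.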
My understanding of modern AI-driven neural networks is not very robust. But if pressed for an explanation, I would say they use many iterations of trial and error to reach a desired outcome. Somehow this drives modern software such as facial recognition, deepfakes, AI-generated human faces, and other as-yet-unseen use cases.
I did not think that DLSS was somehow creating new detail out of a vacuum. Even in cases where AI can seemingly create something from nothing, like synthetic portraits of people who don't actually exist, the neural network first has to be taught what human faces look like. I believed this to be accomplished by feeding the neural network many, many sample human faces for the AI to form a “concept” of how someone's face should appear.
In DLSS’s case, I thought it was using externally rendered high resolution frames of the target computer game to derive a “model” of how that game’s assets should look at the desired high resolution. Then it did something to the relatively low resolution rendered frame on your GPU to make those frames more closely match the externally rendered high-res ones. Is this process the same as simply eliminating cosmetic artifacts? (I’m not asking rhetorically, I honestly don’t know.)
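To make my mental model concrete, here's the kind of toy experiment I'm picturing (pure illustration on fake data, nothing like Nvidia's actual training pipeline; the whole "model" is a single learned sharpening knob rather than a deep network):

```python
import random

# Pure illustration (not Nvidia's real pipeline): learn, from low-res / high-res
# pairs, how to nudge a naive upscale closer to the high-res reference.

def downsample(frame):            # average neighbouring samples -> the cheap "1080p" render
    return [(frame[i] + frame[i + 1]) / 2 for i in range(0, len(frame), 2)]

def naive_upscale(frame):         # nearest-neighbour style doubling back to "4K"
    return [v for v in frame for _ in range(2)]

def sharpen(frame, amount):       # crude unsharp mask: push values away from the local average
    out = []
    for i, v in enumerate(frame):
        left = frame[max(i - 1, 0)]
        right = frame[min(i + 1, len(frame) - 1)]
        out.append(v + amount * (v - (left + right) / 2))
    return out

# Fake "externally rendered high-res frames" standing in for the training references
references = [[random.random() for _ in range(16)] for _ in range(200)]

amount, step = 0.0, 0.05
for _ in range(300):                                    # trial-and-error loop
    ref = random.choice(references)
    low = downsample(ref)                               # pretend this is the GPU's low-res render
    err = sum((g - r) ** 2 for g, r in zip(sharpen(naive_upscale(low), amount), ref))
    err_nudged = sum((g - r) ** 2 for g, r in zip(sharpen(naive_upscale(low), amount + 1e-3), ref))
    amount -= step * (err_nudged - err) / 1e-3          # move in the direction that lowers the error

print("learned sharpening amount:", round(amount, 3))
```

Swap that single knob for a network with millions of weights and a mountain of real rendered frames, and that's roughly the shape of what I imagined was going on.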
In DLSS’s case, I thought it was using externally rendered high resolution frames of the target computer game to derive a “model” of how that game’s assets should look at the desired high resolution. Then it did something to the relatively low resolution rendered frame on your GPU to make those frames more closely match the externally rendered high-res ones.
That is in essence what is happening. What I was trying to explain before was how this happened, along with which visual effects were being used to upscale that 1080p render into a convincing 4K image. I probably didn't do a good job of explaining the process as I now understand it, because I was too busy trying to prove that my initial assumptions about how the visual tricks were performed were correct. (One must first nurture one's childish pride, you see.) I was still wrong, though, in that machine learning and AI-driven rendering are indeed part of the process. DLSS is a marriage of traditional graphics sharpening methods and an intelligent, learned algorithmic process. You were not wrong about that part; I was.
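For anyone following along, here's a bare-bones sketch of the temporal accumulation step that TAA-style techniques lean on (a generic illustration I wrote, not DLSS itself): each new frame gets blended with the previous, re-projected result, so detail builds up across several frames instead of coming from a single render.

```python
# Generic TAA-style accumulation, just to illustrate the "temporal" ingredient
# that DLSS builds on (this is NOT DLSS itself). The history frame is re-projected
# with per-pixel motion offsets, then blended with the newly rendered frame.
def accumulate(history, current, motion_x, blend=0.9):
    """history/current: pixel values for one scanline.
    motion_x: per-pixel horizontal offsets saying where each pixel came from last frame."""
    out = []
    for i, cur in enumerate(current):
        src = min(max(i + motion_x[i], 0), len(history) - 1)  # reproject into the history buffer
        out.append(blend * history[src] + (1 - blend) * cur)   # exponential blend of old and new
    return out

frame0 = [0.0, 0.2, 0.4, 0.6]
frame1 = [0.1, 0.3, 0.5, 0.7]
motion = [0, 0, -1, -1]          # the last two pixels moved one pixel to the right this frame
print(accumulate(frame0, frame1, motion))
```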
I must say that Nvidia is really kicking ass right now. They have captured the GPU market in a way they never have before. This kinda makes me sad, because I was always an ATI guy. Nostalgic GPU war memories aside, the Nvidia of today is not the Nvidia of 15 years ago, so fair's fair. ATI got fat and lazy, while Nvidia studied the blade.
Tom's Hardware just recently published an article about the Ampere Architecture you may find interesting:
I still want to stress one thing though. True internal 4K rendering that is output at true 4K resolution will always be superior to 1080p internal rendering that uses tricks to output at 4K resolution. As end users, we should be pushing our chipset manufacturers to target true 4K rendering as the endgame goal, rather than clever tricks to make us think we're seeing actual 4K content. The amount of money that has gone into DLSS... could it have instead been used to develop true 4K processing that is efficient in and of itself? Cost-effective and hyper-optimized 4K GPU rendering, I mean? If one follows the money, the answer must be NO; otherwise Nvidia would have done so instead of going all in on the Tensor Core stuff. All the same, I'd be more impressed if, say, ATI came up with true 4K rendering GPUs that were relatively inexpensive and super streamlined. Because if they released a cost-competitive product line that produced actual 4K at blistering speeds, that'd make all of Nvidia's work on their neural networks and convoluted algorithms a moot effort. Maybe that's ATI's plan.
Speaking of video cards and Nvidia kicking butt... they revealed the 3000 series yesterday. While I'm sure there needs to be more verification of their claims, it appears that the "mid-range" 3070 is going to be more powerful than the $1200 2080 Ti, which is their premium gaming card. And they're doing it at a price of only $500. Holy cow, y'all, I'm glad I waited on buying a new machine. And the 3080 is going to be $700, which for a top-line card from them is astounding. Of course, they have a ridiculous 3090 that sports an insane 24 GB of memory (!) and retails for $1500, but if the specs are true, it might actually be worth that premium pricing for those who need it.
Well Sarge whenever you do build your new gaming PC, I hope it's everything you dreamed it'd be.
-
Despite building many in years past, personally I have zero interest in building a powerful PC gaming desktop these days. Not when consoles are the lead development platforms for the releases that utilize expensive graphics. Paying double or triple what a PS5 or XSX costs just so I can play those self-same games at a marginally higher resolution or slightly faster frame rates isn't worth the investment, IMO. I do think it's worth having a relatively modern laptop for PC-only indie releases though. Those indie games tend to have low system requirements, so a relatively recent laptop does just fine there.