|
Post by Ex on Jun 22, 2019 22:24:46 GMT -5
"there's a flicker filter/softening mode that a lot of devs used in their games"
Wow, I didn't know that, but it sure makes sense in retrospect. I mean damn, the Wii already looked bad enough on 16:9 HDTVs thanks to no HDMI, why smear additional Vaseline on the works? I have the same qualm with the original Xbox's built-in anti-aliasing filter; it can be awful as well, and lots of devs (ab)used it. Good to know you can hex edit the filter OFF in Wii games.
|
|
|
Post by Sarge on Jun 22, 2019 22:55:47 GMT -5
Well, I've been testing it with mixed results. I tried a bit with RE4, and it certainly changes things (menus look very clean), but it also has a sort of scanline appearance similar to that of Twilight Princess. Apparently that was one of the standard video modes on the Wii, since I've seen it crop up elsewhere as well, like the other title I tried, Arc Rise Fantasia. There the changes apparently did nothing, but I think that's because I found no occurrences of one of the strings and only some of the other. Chances are it has the flicker filter for the interlaced modes but nothing for the progressive modes, so the image is as good as it's going to get with that game.
But yes, that softening mode when using progressive scan is just dumb.
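For anyone curious, the hex edit basically boils down to a search-and-replace over the game's executable. Here's a rough Python sketch; the byte patterns are made-up placeholders, not the real flicker-filter values, so treat it as an illustration of the process rather than a working patch:

```python
# Sketch of a "find the filter bytes, overwrite them" patch.
# FILTER_ON / FILTER_OFF are hypothetical placeholders, not real values.
from pathlib import Path

FILTER_ON  = bytes.fromhex("DEADBEEF")  # placeholder "softening filter enabled" pattern
FILTER_OFF = bytes.fromhex("DEAD0000")  # placeholder "filter disabled" replacement

def patch_flicker_filter(dol_path: str) -> int:
    """Replace every occurrence of FILTER_ON with FILTER_OFF; return the hit count."""
    data = Path(dol_path).read_bytes()
    hits = data.count(FILTER_ON)
    if hits:
        Path(dol_path).write_bytes(data.replace(FILTER_ON, FILTER_OFF))
    return hits

# Zero hits is the Arc Rise Fantasia situation: the string just isn't there to patch.
print(patch_flicker_filter("main.dol"), "occurrence(s) patched")
```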
|
|
|
Post by Sarge on Jun 23, 2019 18:29:26 GMT -5
I think I'm getting a better handle on the games that actually benefit and those that don't. Wii visuals still won't get razor sharp, and a lot of the really blurry games are using a mode similar to that of Twilight Princess, so turning on that softening filter gets rid of the "scanlines". I'm pretty sure that's a sort of fake progressive scan mode, but I haven't looked.
Another factor that makes Wii visuals worse is that widescreen mode isn't true widescreen. It's anamorphic widescreen, which really just stretches the image out to fill 16:9 without an accompanying boost in resolution. So you're not looking at 800x480, you're looking at probably a standard 640/720x480, with your overscan and whatnot. Not great! I'm actually going to swap to 4:3 and see if it boosts image quality even more. I think I might be willing to trade off aspect ratio for clarity.
EDIT: Survey says... yes. Not a lot, but yes. I suspect some games benefit more than others. I'm starting to think the best way to play is on a CRT, though; I remember games having quite sharp visuals through S-Video.
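To put rough numbers on that stretch (this assumes a 640x480 framebuffer; some games render slightly wider, so it's just back-of-the-envelope math):

```python
# How much the 16:9 mode stretches a 4:3-sized framebuffer horizontally.
fb_w, fb_h = 640, 480                # assumed framebuffer size

stretch = (16 / 9) / (4 / 3)         # ~1.33x horizontal stretch in widescreen mode
true_wide_w = round(fb_h * 16 / 9)   # ~853 columns needed for "real" 16:9 at 480 lines

print(f"horizontal stretch in 16:9 mode: {stretch:.2f}x")
print(f"columns a true 16:9 image would need: {true_wide_w}, columns actually rendered: {fb_w}")
```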
|
|
|
Post by Xeogred on Jun 23, 2019 20:05:55 GMT -5
All that basically sums up why I was bummed when Eiji Aonuma recently said a Skyward Sword remaster will probably never happen at this point. The Wii not being a legit HD system is really wack and those games look jaggy as heck. I don't really have issues with how my PS2 upscaler works, but even through the Wii U, a lot of Wii games still leave a lot to be desired on an HDTV. It probably is best left for CRT gaming or emulation.
|
|
|
Post by Ex on Jun 23, 2019 20:16:00 GMT -5
"I'm starting to think the best way to play is on a CRT, though"
"It probably is best left for CRT gaming or emulation."
Without a doubt, a 4:3 CRT is what the Wii's creators had in mind. I think Nintendo grossly underestimated how quickly 16:9 HDTVs would take over the consumer market. So their cost-cutting measure (not providing true HD output or HDMI) ended up making the Wii's already lower-powered graphics look even shoddier on the next gen of TVs. That said, there are a few Wii games that look really crisp over component on a 16:9 HDTV. The Super Mario Galaxy games come to mind.
|
|
|
Post by Sarge on Jun 23, 2019 20:35:29 GMT -5
Yeah, I noticed it varies wildly from game to game. SMG looks really nice. Smash with the filter off looks great, too, and it's nice that it gives you the option right in the menu.
The PS2 is in the same boat. Until now, I hadn't seen a single HDTV that handled that content like I wanted, but somehow it looks fantastic on my 4K set. As fantastic as that can look on a flatscreen, anyway.
|
|
|
Post by Ex on Jun 23, 2019 20:43:24 GMT -5
"Until now, I hadn't seen a single HDTV that handled that content like I wanted, but somehow it looks fantastic on my 4K set."
I don't recall what max resolution your HDTV supports. I assume, though, that the higher the panel resolution, the cleaner the multiplier works out when the TV upscales, so a 4K HDTV would probably handle it more cleanly than, say, a 720p HDTV would. That said, I'm no expert on this stuff; I don't have the requisite free time to be as informed as the My Life In Gaming dudes are.
|
|
|
Post by Sarge on Jun 23, 2019 20:46:23 GMT -5
Yeah, having more pixels to throw at it helps a ton! I also think Samsung has a pretty solid de-interlacing routine in there. And 720p sets are definitely problematic, not least because they're nowhere near a clean multiple of an SD resolution, and most of them aren't even true 720p panels to begin with. 1366x768, whee!
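Quick numbers on that, using plain line-count division (actual TVs do fancier scaling, so take it as a sketch):

```python
# Scale factor from 480 source lines to common panel heights.
sd_lines = 480
for panel, lines in [("720p-class panel", 768), ("1080p", 1080), ("4K", 2160)]:
    factor = lines / sd_lines
    print(f"{panel}: {lines} / {sd_lines} = {factor:.2f}x")
# 1.60x, 2.25x, 4.50x -- none are clean integers, but the bigger the factor,
# the less visible the per-line rounding ends up being.
```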
|
|
|
Post by Xeogred on Jun 23, 2019 21:08:41 GMT -5
Huh, never knew that about 4K TVs! How is your PS2 hooked up to this 4K set?
It still doesn't sound like current stuff (games, systems, movies, etc.) supports 4K enough for me to make the jump, as much as I'd like to, though they're getting really affordable now that they're becoming the standard. I guess I never considered that they might be better at upscaling old stuff, since they're pushing everything up to a higher resolution with more pixels to work with. Uh... I'll just say, like Ex, that I'm definitely no expert on this. Just have a 1080p TV and monitor right now.
|
|
|
Post by Sarge on Jun 23, 2019 21:23:37 GMT -5
Depends on your brand. I suspect a 1080p set still works out a lot better for SD content than a 720p one would, especially if it has a decent de-interlacer. And while it's fewer pixels, at least it's more than double an SD signal.
I'm using a component cable for my PS2. This TV, like most modern sets, doesn't have an S-Video jack; it's component, composite, or HDMI. And clearly the de-interlacer in this thing is pretty darn solid: I rarely get combing artifacts (although they can show up from time to time), and the image is very clear.
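For the curious, here's the basic idea behind combing, sketched as a toy "weave" vs. "bob" comparison with numpy; a real TV's de-interlacer is far more sophisticated than either of these:

```python
import numpy as np

def weave(top_field: np.ndarray, bottom_field: np.ndarray) -> np.ndarray:
    """Interleave two fields into one frame. The fields were captured ~16 ms
    apart, so anything that moved between them shows up as 'combing'."""
    h, w = top_field.shape
    frame = np.empty((h * 2, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # even lines from the top field
    frame[1::2] = bottom_field   # odd lines from the bottom field
    return frame

def bob(field: np.ndarray) -> np.ndarray:
    """Line-double a single field: no combing, but only half the vertical detail.
    Good de-interlacers blend between weave and bob per pixel based on motion."""
    return np.repeat(field, 2, axis=0)
```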
|
|