
Upscalers, CRTs, PVMs & RGB: Retro gaming done right!

Status
Not open for further replies.
So, since I discovered that over component on a CRT the PS3 doesn't display 240p games as 240p.
Yeah, Xbox 1, PS3, X360, Wii U, PS4 and XBone are not able to go as low.

The Wii was the last home console capable of feeding 240p "unpadded" to a TV.
I continued playing Arc The Lad 2 (for about 13 hours) upscaled to 720p on my 768p TV with the smoothing disabled. This looked great except for some weird scrolling artifacts and motion blur from the TV, both of which I'm assuming the TV is responsible for at this point.
That's normal. 768p TVs never pull a perfect 720p, 480p or 240p checkerboard pattern.

You should mostly be unable to see it without a checkerboard pattern going on though. It just won't be uniform if you unleash it, no.


Try setting the PS3 to 480p, as it is sometimes better.

720p is only 50% taller than 480p [(480/2)*3 = 720] so, depending on how the PS3 is doing it - it's either taking 240p and doing a 3:1 (1 pixel = 3x3) conversion directly to 720p, or taking the 240p image padded all the way to 480p (2:1, or 2x1) and then scaling it that extra 50%?

If so, 480p might be better, more native. And one conversion less. I don't know if I was very clear in the explanation, but yeah. Always try every resolution available if either fails to deliver.

The trick is scaling as little as possible, in as few separate steps as possible. Your TV already takes 720p and splatters it all over 768p, so that's one conversion you can't undo; now if only you can keep the pipeline from doing too much resizing overwork.
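The point above about scale factors can be sketched numerically: an integer 3x step (240 to 720) keeps every source pixel uniformly wide, while a fractional 1.5x step (480 to 720) can't avoid alternating pixel widths. A toy nearest-neighbour illustration (purely illustrative - not how any console's scaler is actually implemented):

```python
def nn_scale(row, out_len):
    """Nearest-neighbour resample of a row of pixel values."""
    in_len = len(row)
    return [row[i * in_len // out_len] for i in range(out_len)]

def run_lengths(row):
    """How many output samples each distinct source pixel ends up occupying."""
    runs, count = [], 1
    for prev, cur in zip(row, row[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return runs

# 240 -> 720 is a clean 3x: every pixel ends up exactly 3 samples wide.
print(set(run_lengths(nn_scale(list(range(240)), 720))))   # {3}

# 480 -> 720 is 1.5x: pixel widths alternate between 1 and 2 samples.
print(set(run_lengths(nn_scale(list(range(480)), 720))))   # {1, 2}
```

Whether any given console or TV takes the one-step or two-step route is exactly the open question in this discussion; the sketch only shows why the number and size of the steps matter.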
For a 60 FPS game, Arc does have some laggy controls though, even on original hardware. Obviously not really a problem for such a game.
I can imagine playing it on the PS3 being a chore then.
 

Sixfortyfive

He who pursues two rabbits gets two rabbits.
Despite my previous musings that I could probably do without a dedicated lag tester and use the various equipment that I already have to test my displays, I realized that Bodnar's device would come in handy greatly when it comes to fine-tuning a display's options to see what does and does not make a significant difference in lag, so I picked one up in the end. Before I give it a spin on an HDTV that I'm planning to pick up (the first I'm ever going to buy, actually), I thought I'd try it out on my two existing gaming displays.

The monitor that I've used for all of my HD gaming for the past 3 years is an Asus VH236H, chosen because it's the standard in fighting game tournaments, which I'm pretty active in (be it livestreaming them, assisting in setup and operation, or occasionally still competing in them). The picture on this thing isn't anything to write home about, and it's a rather obsolete model at this point, but in terms of affordability, convenience, and responsiveness, it's still a well-performing display.

lag_asus_vh236h_hdmi_1080p.png


A few things I noticed when using the lag tester:

After calibrating each individual preset mode to the best of my ability for brightness, contrast, sharpness, saturation, etc., the mode that actually produced the best result was the "Night View" mode, which beats out "Game" mode by a minuscule but still perceptible and consistent difference of 0.1 to 0.2 ms. Contrast seems to be better across the board in Night View mode, and I'd wager that has something to do with how the tester performs.

Cranking up the Overdrive setting to maximum on the VH236H improves the results by another slight 0.1 to 0.2 ms. I do recall reading some Asus marketing materials that mentioned this feature slightly improving response time, so maybe it wasn't just marketing fluff after all, as minuscule a difference as it might be.

I did a second test on this monitor, this time using an HDFury 2 to convert from HDMI to VGA:

lag_asus_vh236h_vga_1080p_hdfury2.png


This returns a result 0.1 ms slower, but given some of the fluctuations, I can probably just write that off as the margin of error. The HDFury 2 is effectively lagless, which is relevant for a third test that I wanted to run:

lag_dell_e773c_vga_1080p_hdfury2_xcapture1.png


This time I'm testing my Dell E773c CRT monitor, which is the main display I've been using for SD gaming ever since I picked up my XRGB-3. It's a pretty powerful combo, not only having the best quality picture of any display I've ever owned, but also one that I've noted previously for actually beating out some CRT TVs when it comes to display lag. In the above picture, I'm using the HDFury 2 to convert the lag tester's HDMI signal to VGA, and an XCAPTURE-1 to bridge that signal to the monitor (since the monitor's built-in VGA cable and the VGA dongle on the HDFury both have male end connectors).

From what I understand, the theoretical minimums for the lag tester are 0 ms for the very top of the screen, 8.3 ms for the middle, and 16.6 ms for the very bottom, if the 3 markers are lined up perfectly in those exact positions. Both CRT and LCD progressive scan monitors read in a new frame from top to bottom, and at 60 fps, it takes about 16.7 ms to display a full frame, hence the difference in the readings for those 3 positions. (I believe plasma screens display a picture in a different manner, and that the entire picture is updated at once on them, but I'm not totally sure.)
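Those theoretical minimums fall straight out of the 60 Hz top-to-bottom scanout just described; a quick back-of-envelope check (assuming an exactly 60 Hz progressive scan):

```python
FRAME_MS = 1000 / 60   # ~16.7 ms to scan one full frame at 60 Hz

def scanout_delay_ms(y_fraction):
    """Delay from the start of a frame until a point y_fraction of the way
    down the screen is drawn, for a top-to-bottom progressive scan."""
    return y_fraction * FRAME_MS

for name, y in [("top", 0.0), ("middle", 0.5), ("bottom", 1.0)]:
    print(f"{name}: {scanout_delay_ms(y):.1f} ms")
# top: 0.0 ms, middle: 8.3 ms, bottom: 16.7 ms
```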

The E773c then would appear to hit those minimums, at least to the best of my ability to measure it. My best explanation for the bottom reading falling under 16 ms and the top reading going over 0 ms is that the viewable area of the screen is only part of the total physical screen, and that there are some lines above and below the borders that are still scanned as part of a single 16.7 ms frame.

Gamers who place an importance on input lag, myself included, often judge the worthiness of a flatscreen display on "how much it lags behind a (processing-free) CRT," so this test is pretty useful for providing a baseline when I get around to testing HDTVs. My personal threshold has always been "no more than a 1 frame difference from a CRT," so all I'd have to do is make sure that the display I'm considering isn't more than 16 ms above the E773c's numbers.
 
I can't get a photo of them, but I'm seeing two horizontal lines on my PVM-14L2 spaced evenly in about 1/3rds the screen height apart down the screen. I did some digging around, could these be the stabilizing wires in the aperture grill? Once I noticed them they've been driving me crazy. They're mostly visible on lighter screens with large areas of a single color (I first noticed them on the red carpet in Hyrule castle in ALTTP).
 

antibolo

Banned
I can't get a photo of them, but I'm seeing two horizontal lines on my PVM-14L2 spaced evenly in about 1/3rds the screen height apart down the screen. I did some digging around, could these be the stabilizing wires in the aperture grill? Once I noticed them they've been driving me crazy. They're mostly visible on lighter screens with large areas of a single color (I first noticed them on the red carpet in Hyrule castle in ALTTP).

It's a Trinitron thing, I've seen it often on computer monitors that used Trinitron tech back in the day, on some of them they were VERY noticeable. Never noticed it on my PVM though, but maybe I haven't looked close enough.

ADDENDUM:
The Wikipedia article on Trinitron has a section about it: http://en.wikipedia.org/wiki/Trinitron#Visible_support_wires
The effect is less visible on low resolution tubes (TVs and PVMs), but on computer monitors it was pretty intense, you could immediately tell that a CRT was a Trinitron just by looking at the picture.
 

BONKERS

Member
If so, 480p might be better, more native
Not really to be honest. TV doesn't offer a mode equivalent to 4:4:4 with processing turned off/etc for standard EDTV 480p signal. So even after adjusting it, it still looks mediocre.

Yeah, Xbox 1, PS3, X360, Wii U, PS4 and XBone are not able to go as low.

The Wii was the last home console capable of feeding 240p "unpadded" to a TV.

This is true. And something I now recall/notice with the original Xbox, which for emulation or even game collections can be an issue. Come to think of it, I wonder how Mega Man Collection fares over component in 480i on the Xbox? I have it, haven't played it yet though. Bought it because it's supposed to be the most accurate version of the three.

...but then again, the PS2 version - didn't it improperly run line-doubled to 480i only?
Uck

That's normal. 768p TVs never pull a perfect 720p, 480p or 240p checkerboard pattern.

You should mostly be unable to see it without a checkerboard pattern going on though. It just won't be uniform if you unleash it, no.

Oh yes, I'm quite aware, 1366x768 is not even a perfect 16:9 ratio (1.777777778 vs 1.778645833). A perfect 16:9 would actually be 1360x765. Quite a dumb decision, basing 720p TVs on the XGA/SVGA/VESA standards of the time.

The scrolling artifact I mention is, I think, actually an overdrive issue with the TV panel itself, due to the signal it's being sent. When smoothing is used, the issue disappears because the content is no longer 100% high frequency; it's been poorly averaged out.


FWIW the same artifact also happened on this display in 480p as well.

And I'd also like to mention that I haven't actually tried other games on this specific TV on the PS3 yet, to see if the same issue occurs (i.e. whether it's TV-specific). I previously played PS1 games on the PS3 on a CRT or a different LCD set and the issue didn't occur, as far as I recall.

720p is only 50% taller than 480p [(480/2)*3 = 720] so, depending on how the PS3 is doing it - it's either taking 240p and doing a 3:1 (1 pixel = 3x3) conversion directly to 720p, or taking the 240p image padded all the way to 480p (2:1, or 2x1) and then scaling it that extra 50%?

This is actually a very well thought out inquiry. Unfortunately something we will never be able to really tell for sure, unless there can be some kind of practical test that can be devised.

The trick is scaling as little as possible, in as few separate steps as possible. Your TV already takes 720p and splatters it all over 768p, so that's one conversion you can't undo; now if only you can keep the pipeline from doing too much resizing overwork.

I agree for sure. I've gotten used to the 720p>768p conversion by now, and it's not as much of an issue as you'd imagine on this specific set, to be completely honest. (This is also coming from having played several games at 768p native in recent months as well.) I have this TV almost entirely for playing lower-resolution PC games and 720p console games.

I can imagine playing it on the PS3 being a chore then.
The difference is definitely there when such laggy controls are there to begin with, but it's not that much of a difference since it's already fairly laggy. So the experience is much the same, and it makes no impact on gameplay since the game is a turn-based strategy RPG.
 
Not really to be honest. TV doesn't offer a mode equivalent to 4:4:4 with processing turned off/etc for standard EDTV 480p signal. So even after adjusting it, it still looks mediocre.
I see you've tested it.

Well, it doesn't work every time, but 768p TVs are pretty weird so it's always worth a try.
This is true. And something I now recall/notice with the original Xbox, which for emulation or even game collections can be an issue. Come to think of it, I wonder how Mega Man Collection fares over component in 480i on the Xbox? I have it, haven't played it yet though. Bought it because it's supposed to be the most accurate version of the three.

...but then again, the PS2 version - didn't it improperly run line-doubled to 480i only?
Uck
I don't suppose Xbox 1 in 480i will be much better than the PS2 version if at all, sadly.

The PS2's big fault when it came to retro collections was that they never supported 240p nor 480p, just 480i.

Right now, and for a CRT, the PS2 version might even be trickable into being the better one, via GS Mode Selector and forced 240p. One can't force the Xbox that way.
Oh yes, I'm quite aware, 1366x768 is not even a perfect 16:9 ratio (1.777777778 vs 1.778645833). A perfect 16:9 would actually be 1360x765. Quite a dumb decision, basing 720p TVs on the XGA/SVGA/VESA standards of the time.
Despite that, even if they had just hit that resolution, checkerboard patterns wouldn't be perfect; it's a ~7% resolution increase, and no scaling algorithm can run far with that. And then you have 1024x768 16:9 plasmas.

Doesn't bother me that it's not a perfect scale ratio - seeing as (1366/16)*9 = 768.375, it's less than half a pixel miss - what bothers me is the resolution itself.

It's like the whole 852, 853 or 854 pixels in panoramic 480p. What's the official value? Doesn't really matter if you ask me.

768p is never that predictable in regards to what the hell are they doing there.

And its use on LCDs is always because it's cheaper than both 720p and 1080p 1:1 screens and still 16:9 (close enough), not something 16:10.
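For what it's worth, the aspect-ratio arithmetic being traded here checks out (an exact-fraction check, nothing TV-specific):

```python
from fractions import Fraction

# 1366x768 misses 16:9 by a hair...
print(float(Fraction(1366, 768)))              # 1.7786458333... vs 16/9 = 1.7777...

# ...while 1360x765 hits it exactly.
print(Fraction(1360, 765) == Fraction(16, 9))  # True

# And the "less than half a pixel" miss: a 16:9 frame 1366 wide wants 768.375 lines.
print(1366 / 16 * 9)                           # 768.375
```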
The scrolling artifact I mention is, I think, actually an overdrive issue with the TV panel itself, due to the signal it's being sent. When smoothing is used, the issue disappears because the content is no longer 100% high frequency; it's been poorly averaged out.


FWIW the same artifact also happened on this display in 480p as well.
Yeah, deinterlacing and upscaling of really small resolutions (like 240p) often adds extra lag, but if it happens at 480p then it's usually the panel or the rendering engine altogether. Same for 720p and 1080p.
This is actually a very well thought out inquiry. Unfortunately something we will never be able to really tell for sure, unless there can be some kind of practical test that can be devised.
True, someone in the know who tested it might know or suspect, but I don't own a PS3 yet - thus, no clue.

I know that the PS3 lacks a vertical scaler though, so the jump from 480p to 720p or 1080p vertically has to be done in software (which is why lots of games don't do 1080p on it). Horizontal scaling can just pull a call to the GPU, so it's for sure 640x720 before it is 1280x720.
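If that horizontal-only hardware scaler claim is right, the upscale would look something like this sketch - each row widened while the line count is left alone (purely illustrative; 640x720 and 1280x720 are the figures from the post, the code itself is mine):

```python
def h_scale_row(row, out_w):
    """Nearest-neighbour horizontal stretch of one scanline."""
    in_w = len(row)
    return [row[x * in_w // out_w] for x in range(out_w)]

# A hypothetical 640x720 framebuffer, stretched horizontally to 1280x720;
# the vertical axis is never resampled.
framebuffer = [[(x, y) for x in range(640)] for y in range(720)]
scaled = [h_scale_row(row, 1280) for row in framebuffer]

print(len(scaled), len(scaled[0]))   # 720 1280: height unchanged, width doubled
```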
The difference is definitely there when such laggy controls are there to begin with, but it's not that much of a difference since it's already fairly laggy. So the experience is much the same, and it makes no impact on gameplay since the game is a turn-based strategy RPG.
I understand, since it lagged by default it never relied on timing.
 

antibolo

Banned
Also, to add to the list of reasons why 1366x768 is the stupidest thing ever: 1366 is not a multiple of 8, so sometimes GPUs can't even render that resolution natively; you have to use 1360 or 1368.
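The divisibility claim, and where those two alternative widths come from, is easy to check:

```python
width = 1366
print(width % 8)              # 6 -> 1366 is not a multiple of 8
print(width // 8 * 8)         # 1360, the nearest multiple of 8 below
print((width + 7) // 8 * 8)   # 1368, the nearest multiple of 8 above
```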
 

D.Lo

Member
Yeah, Xbox 1, PS3, X360, Wii U, PS4 and XBone are not able to go as low.

The Wii was the last home console capable of feeding 240p "unpadded" to a TV.
I remember trying to work out why 3rd Strike on Dreamcast looked worse than 2nd Impact, until I found out about the Start+Z trick (2nd Impact is 240p by default; 3rd Strike needs to be forced into 240p, otherwise it's 480i).

And then 3rd Strike on the Xbox looked so blurry - PAL Xbox never did 480p, only 480i. And even when modded to NTSC it only upscaled the 480i to 480p, no 240p. Didn't help that the Xbox's component out was also pretty shit (washed out and dark) compared to the Gamecube's amazing component or the Dreamcast's RGB or VGA.

It's a good one!
Mine has poor shielding, or poor isolation, I get audio buzz on white screens. I'm going over it to try and make it better. Maybe I just got a dud unit.
 

Sixfortyfive

He who pursues two rabbits gets two rabbits.
Any of those quality SCART switch boxes that work fine with JP-21 cables?

I've given up trying to find a SELECTY21 for reasonable prices.
 

antibolo

Banned
Any of those quality SCART switch boxes that work fine with JP-21 cables?

I've given up trying to find a SELECTY21 for reasonable prices.

I don't see any reason why the manual version of the Bandridge switch wouldn't work. The LED may be problematic but many people remove it anyway.
 
Mine has poor shielding, or poor isolation, I get audio buzz on white screens. I'm going over it to try and make it better. Maybe I just got a dud unit.

What cables are you using? Planning to order a decent quality cable between my display and the switch and all my console cables are official.
 

D.Lo

Member
What cables are you using? Planning to order a decent quality cable between my display and the switch and all my console cables are official.
Yes it may be the male/male scart cable, and I've ordered another 'high quality' one.

It's definitely none of my console cables, as there is no buzz when they're plugged directly into the Framemeister or PVM.

I might actually make my own mini-DIN to male SCART adapter to bridge the output to the XRGB. Someone needs to make one of those commercially, with a 2-foot cable.
 

BONKERS

Member
I remember trying to work out why 3rd Strike on Dreamcast looked worse than 2nd Impact, until I found out about the Start+Z trick (2nd Impact is 240p by default; 3rd Strike needs to be forced into 240p, otherwise it's 480i).

And then 3rd Strike on the Xbox looked so blurry - PAL Xbox never did 480p, only 480i. And even when modded to NTSC it only upscaled the 480i to 480p, no 240p. Didn't help that the Xbox's component out was also pretty shit (washed out and dark) compared to the Gamecube's amazing component or the Dreamcast's RGB or VGA.


Mine has poor shielding, or poor isolation, I get audio buzz on white screens. I'm going over it to try and make it better. Maybe I just got a dud unit.

So I'm not the only person who has noticed this?


Speaking of Component, I've been needing to get a Component video switch for ages.

like http://www.amazon.com/dp/B001TK9SEE/?tag=neogaf0e-20

But I'm deathly afraid of any kind of issues that the switch might induce on its own, like quality loss and other stuff.
 

Lettuce

Member
Has anyone managed to get a Dreamcast working on the XRGB-Mini via RGB SCART? I get audio but no picture, and the 'Input' light isn't illuminated on the Mini.
 

D.Lo

Member
So I'm not the only person who has noticed this?
Always infuriated me back in the day as a GCN fan that Xbox versions of games would get extra review points due to the superior audio output when its entire video output was bad.

Gamecube had exceptional quality video output with those component cables. The Wii was worse!
 

BocoDragon

or, How I Learned to Stop Worrying and Realize This Assgrab is Delicious
Gamecube had exceptional quality video output with those component cables. The Wii was worse!

It is so weird that Nintendo engineered this system with very good video output... then doomed the component cable to its obscure online store.

Yeah, I know it was the early 2000s... but this was also a time when pushing PS2 and Xbox component cables was a thing. Go to Future Shop or a boutique home theater store circa ~2001-2005, and they'd pimp you the Monster Cables.

Blame Nintendo's design of requiring a custom chip in the cable itself, I guess.
 

D.Lo

Member
It is so weird that Nintendo engineered this system with very good video output... then doomed the component cable to its obscure online store.

Yeah, I know it was the early 2000s... but this was also a time when pushing PS2 and Xbox component cables was a thing. Go to Future Shop or a boutique home theater store circa ~2001-2005, and they'd pimp you the Monster Cables.

Blame Nintendo's design of requiring a custom chip in the cable itself, I guess.
They were commercially available in Japan I believe.

It's actually a very good design decision - save the expense of a quality DAC for those who have a display that can use it. But evidently 3rd party accessory companies never decided it was worth it to make their own DAC.

Terrible positioning overall by Nintendo for the poor old GC. It should have been the machine used as showpieces for AV equipment (say with Rogue Squadron III - the game with the highest polygon count of any game on any system that generation) yet most of the public thought it was less powerful than the PS2.
 

SegaShack

Member
What do you do to cut off the power to the LED light on the manual Bandridge?
I did this recently. You actually aren't cutting the LED lead but the path that draws power from the consoles and puts it into input 5. There are six screw covers underneath the Bandridge that you need a knife to pop off. I almost cut through my entire board before realizing that the LED actually isn't the issue. Just cut the trace and the LED will still be on, but that's OK, as long as it isn't transferring power to the console in slot 5.

SA5ALyu.jpg
 

SegaShack

Member
Weird, it has the option for those cables at that site now. I'm gonna choose composite sync when I get my cables from them. And speaking of that, when it comes to the Genesis/Mega Drive, it's been said composite sync is preferable to composite video; I know it gets rid of jailbars and such.

Don't buy from retrogamingcables. They don't shield their cables.

'bout to pull the trigger on a second PVM 20l5 for $160.

The pics he took look good.

aeadkoJ.jpg

rQlygKA.jpg

cLOrlI5.jpg


Do I pull the trigger, GAF?!
My PVM 20L5 I paid $100 for. Definitely worth it for $160 too I'd say. If you have a spare component expansion card I would love to buy one off you.

Seriously, where do I have to move to find all these PVMs?
Southern CA. I got my PVM just a few days after I decided I wanted one and had a lot of options too. We were once the TV and film capital of the country, before a movie star became governor and raised taxes and permit costs on them, pushing them away.
 

Lettuce

Member
I did this recently. You actually aren't cutting the LED lead but the path that draws power from the consoles and puts it into input 5. There are six screw covers underneath the Bandridge that you need a knife to pop off. I almost cut through my entire board before realizing that the LED actually isn't the issue. Just cut the trace and the LED will still be on, but that's OK, as long as it isn't transferring power to the console in slot 5.

SA5ALyu.jpg

Damn, I made a video of this issue I was having last week; it scared the crap out of me...

https://www.youtube.com/watch?v=pHAWQSNHGtU&list=UUpi68UhqF_G5amIzFdwqrOA

So cutting that trace in the picture will stop my MD2 from powering up??
 

Gunsmithx

Member
Southern CA. I got my PVM just a few days after I decided I wanted one and had a lot of options too. We were once the TV and film capital of the country, before a movie star became governor and raised taxes and permit costs on them, pushing them away.

I'm now tempted but I doubt the wife would find pvms a good reason to move :/ hmm.. it's only 4 or so days to drive cross country....
 

Jamix012

Member
Would it affect the picture/sound if I use a SCART to component converter? (Sorry, I'm new to this.)

You mean if they had no shielding, or in general? I don't know too much about the physical structure of cables, so I can't say anything about shielding, but even the very best SCART to component converter will have some loss of picture quality. In theory a perfect converter could probably get this to be damn near the same, but component will never quite match the original RGB signal.
 

Khaz

Member
Would it affect the picture/sound if I use a SCART to component converter? (Sorry, I'm new to this.)

There is interference in the cable because it's not properly screened. The audio will keep buzzing regardless of what you do with the signal.
 

Timu

Member
You mean if they had no shielding, or in general? I don't know too much about the physical structure of cables, so I can't say anything about shielding, but even the very best SCART to component converter will have some loss of picture quality. In theory a perfect converter could probably get this to be damn near the same, but component will never quite match the original RGB signal.
Everyone says that, but I've never seen a comparison. =O

I've seen people say the same thing for quite some years, but have yet to actually see any comparisons
Same, but I may put an end to that if I ever plan on buying this within some months.

This is one of the ways to get true RGB on a capture card. I want some big differences.

There is interference in the cable because it's not properly screened. The audio will keep buzzing regardless of what you do with the signal.
That's not good, guess I'll go with retro_console_accessories again.
 

BONKERS

Member
I see you've tested it.

Some 768p TVs can have a 1:1 scaling mode for 720p, where they just leave the empty space black. This TV doesn't, sadly.

However, with a PC you can do this via drivers. So for games that require 720p resolution for AA to work (e.g. House of the Dead: Overkill, Sonic 4 Episode I) you can achieve perfect scaling.
 

SegaShack

Member
Damn, I made a video of this issue I was having last week; it scared the crap out of me...

https://www.youtube.com/watch?v=pHAWQSNHGtU&list=UUpi68UhqF_G5amIzFdwqrOA

So cutting that trace in the picture will stop my MD2 from powering up??

Yep, it's that easy. I almost murdered my board though, because I thought it was about removing power to the LED, so I kept seeing the light on and would cut deeper and deeper until I reread the thread on the shmups forum, which said it was about ensuring power doesn't get drawn from one console to AV5. I did this and have no power issues whatsoever. Just note that you have to dig the knife in hard to pop those screw covers off, but it's worth it. If you need a picture of where they are, let me know.

Is AV5 the only one affected by the power thing?

I only really plan to use AV1 - AV3 myself.

Yes it is, but keep in mind it also means 1-4 are having power taken out of them to an extent.
 

SegaShack

Member
Is there any sort of component/RGB, or just BNC, switch box? It's annoying to swap out these cables when I want to switch from component to RGB or vice versa. Unfortunately my PVM 20L5 lacks the expansion module.

Also, would it be stupid to hook up a PS3 or 360 to my PVM, since it goes up to 1080i?
 

Lettuce

Member
I actually have retrogamingcables bookmarked for future reference, but now certainly having second thoughts.

My Megadrive 2 scart cable from retrogamingcables......


Only had it a week, and the first time I removed it from my SCART switcher the metal shielding came away from the plastic housing and almost ripped all the wires from the SCART pins; the second time, the above happened and it just fell apart. Have sent them an annoyed email.

Thing is in the UK ordering from retro cable accessories can get expensive when you add in the shipping costs
 

Lettuce

Member
Anyone know where I can source a quality Dreamcast SCART cable from? Have tried 2 from eBay and neither works with my XRGB-Mini.
 

Borman

Member
The Xbox component output isn't bad at all; I don't know why people would think that. Perfect? No. But compared to the PS2, or even the PS3, it is great.

26wj59.png
 

bitoriginal

Member
Does anyone know the best way to display an OG PAL Xbox on an HDTV? I've soft-modded mine, set it to NTSC to display at 720p, and I'm using an RGB component cable, but it still doesn't look great for games outputting at 480p. Is there a better way to display 480p games?
 

Jamix012

Member
Does anyone know the best way to display an OG PAL Xbox on an HDTV? I've soft-modded mine, set it to NTSC to display at 720p, and I'm using an RGB component cable, but it still doesn't look great for games outputting at 480p. Is there a better way to display 480p games?

SCART doesn't carry a 480p signal. It's outputting 480i to your TV, so no wonder it looks awful. Get a component cable.

Edit: You initially said SCART and then RGB component, so I assumed SCART, as component cables aren't RGB. Well, the component output on the OG Xbox is not amazing, so that's about as good as you're going to get without resorting to either an upscaler or using the 360's BC in the case of some games.
 