
NT Analogue announces Super NT

  • Thread starter Deleted member 20415

dark10x

Digital Foundry pixel pusher
I'm not arguing that the C8 specifically isn't fine. Just because you can list one display that scales up well doesn't mean that most of them do. A lot of displays are absolutely terrible at rendering non-native resolutions.
I think most recent 4K TVs do a solid job with scaling and latency, though. Just looking through RTINGS data, even most mid to lower-end TVs produce the exact same latency in both 4K and 1080p modes. It's quite rare that 1080p exhibits more latency than 4K.

Either way, if you're serious about perfection then you should own the best displays as well! Even if you can't afford that, the lower tier displays handle 1080p beautifully so it's a non-issue.

Do you have a specific TV in mind then? I feel like you must have encountered a shitty 4K TV with a poor scaler and high latency to make that comment.
 

dirthead

Banned
I think most recent 4K TVs do a solid job with scaling and latency, though. Just looking through RTINGS data, even most mid to lower-end TVs produce the exact same latency in both 4K and 1080p modes. It's quite rare that 1080p exhibits more latency than 4K.

Either way, if you're serious about perfection then you should own the best displays as well! Even if you can't afford that, the lower tier displays handle 1080p beautifully so it's a non-issue.

Do you have a specific TV in mind then? I feel like you must have encountered a shitty 4K TV with a poor scaler and high latency to make that comment.

The Samsung MU6300 for example will apply filtering to the image even if it could be scaled up with an integer multiplier. There are a lot of them out there. I agree that they're not all going to be the latest top of the line TVs, but still, it's an issue.

If you really want the best setup for emulation, though, you need a CRT or variable refresh flat panel, which immediately excludes any of the LG OLEDs.
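
To be concrete about what "integer multiplier" scaling means - pure pixel replication, no filtering - here's a minimal software sketch with illustrative numbers (256x224 is the SNES active picture; nothing here is from any real TV firmware):

```python
import numpy as np

def max_integer_scale(src_w, src_h, dst_w, dst_h):
    """Largest whole-number factor that still fits the target panel."""
    return min(dst_w // src_w, dst_h // src_h)

def integer_upscale(frame, s):
    """Repeat each pixel s times on both axes (pure nearest-neighbour)."""
    return frame.repeat(s, axis=0).repeat(s, axis=1)

snes_frame = np.zeros((224, 256, 3), dtype=np.uint8)  # one SNES-sized frame
s = max_integer_scale(256, 224, 1920, 1080)           # -> 4 on a 1080p panel
scaled = integer_upscale(snes_frame, s)               # 896x1024, centred with borders by the TV
```

A TV that runs a filtering scaler instead of something equivalent to this softens every pixel edge, which is the complaint here.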
 

dark10x

Digital Foundry pixel pusher
The Samsung MU6300 for example will apply filtering to the image even if it could be scaled up with an integer multiplier. There are a lot of them out there. I agree that they're not all going to be the latest top of the line TVs, but still, it's an issue.

If you really want the best setup for emulation, though, you need a CRT or variable refresh flat panel, which immediately excludes any of the LG OLEDs.
Well, sure, that's why I have a CRT in addition to the LG (well, and a VRR PC monitor just in case).

The Samsung MU6300 seems pretty lousy, from what I'm seeing, so yeah...I'd be annoyed with that one. I hate when aggressive filtering is applied to an upscaled image. The LGs do a fantastic job with this in comparison, while Panasonic's OLEDs are even better with the option for full nearest neighbor scaling.

Sadly, there is no perfect display out there, especially in the PC space.

I'm using the LG C8 as my primary screen with a Sony PVM 20L4 for retro games. I also have a second gen Pioneer Kuro Elite in another room to scratch that plasma itch. On the PC side, I'm using the LG 38UC99 only because I really wanted a large, higher resolution display and the market is terrible for large PC monitors. PC monitors with the best panels and features all tend to be small 27" displays and lower resolution. What I really want is a 38" 3840x1600 or 40" 3840x2160 monitor with FALD and HDR support + G-Sync but it just doesn't exist.

Which displays are you using at the moment BTW?
 

dirthead

Banned
Well, sure, that's why I have a CRT in addition to the LG (well, and a VRR PC monitor just in case).

The Samsung MU6300 seems pretty lousy, from what I'm seeing, so yeah...I'd be annoyed with that one. I hate when aggressive filtering is applied to an upscaled image. The LGs do a fantastic job with this in comparison, while Panasonic's OLEDs are even better with the option for full nearest neighbor scaling.

Sadly, there is no perfect display out there, especially in the PC space.

I'm using the LG C8 as my primary screen with a Sony PVM 20L4 for retro games. I also have a second gen Pioneer Kuro Elite in another room to scratch that plasma itch. On the PC side, I'm using the LG 38UC99 only because I really wanted a large, higher resolution display and the market is terrible for large PC monitors. PC monitors with the best panels and features all tend to be small 27" displays and lower resolution. What I really want is a 38" 3840x1600 or 40" 3840x2160 monitor with FALD and HDR support + G-Sync but it just doesn't exist.

Which displays are you using at the moment BTW?

I have an older LG OLED B6, an Asus 27" G-Sync, a Sony 27" Trinitron CRT, an HP ZR30W, and an Acer 28" 4K G-Sync. Next year there are 32.5" variants of the 4K 144Hz G-Sync monitors coming out from AU Optronics. Those are the first PC ones that might finally scratch the larger display itch.

Another really interesting option that I wish existed was a 1:1 variable refresh display.

A low input lag variable refresh variant of this monitor would be amazing for emulation because you could run both 3:4 and 4:3 games on it seamlessly.

 

dark10x

Digital Foundry pixel pusher
I have an older LG OLED B6, an Asus 27" G-Sync, a Sony 27" Trinitron CRT, an HP ZR30W, and an Acer 28" 4K G-Sync. Next year there are 32.5" variants of the 4K 144Hz G-Sync monitors coming out from AU Optronics. Those are the first PC ones that might finally scratch the larger display itch.

Another really interesting option that I wish existed was a 1:1 variable refresh display.

A low input lag variable refresh variant of this monitor would be amazing for emulation because you could run both 3:4 and 4:3 games on it seamlessly.


Ah yeah, I had a B6 as well. Great overall but it's true that the scaling from 1080p to 4K is quite blurry. The much cleaner/sharper scaling of the C8 combined with the addition of black frame insertion (+brighter HDR game mode and lower latency) pushed me to upgrade. Plus I wanted to move up to 65".

Good call on the 27" Sony. Those are always great TVs.

Yes, I agree on the 1:1 variable refresh display. It would be especially cool to see something like that with built-in circuitry designed to line double lower resolution content - basically something like integrating an OSSC into a display but at a lower level. Something like that.
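
For anyone unfamiliar, line doubling in the OSSC sense is about the simplest video processing there is - each incoming scanline is simply emitted twice, so 240p becomes 480p with zero filtering. A rough software sketch of the idea (real hardware does this line-by-line with almost no buffering):

```python
import numpy as np

def line_double(frame):
    """Emit every scanline twice: 240 lines in, 480 lines out, no filtering."""
    return frame.repeat(2, axis=0)

field = np.zeros((240, 320, 3), dtype=np.uint8)  # a 240p frame
doubled = line_double(field)                     # 480 lines for a 480p-capable input
```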

What I *REALLY* want to see, though, is the return of variable resolution support - basically like a CRT monitor. Would require a completely different type of display technology but it would solve so many issues.
 

Redneckerz

Those long posts don't cover that red neck boy
I don't think so. It looks absolutely gorgeous on a good 4K display. Using an LG C8, the pixels are scaled beautifully with no visible softening. It's an excellent machine.

The fact is, 4K would have been impossible right now due to the requirements of the FPGA chip that would be necessary to reach it. It's simply not reasonable. There's a reason why nobody has yet released a proper external 4K scaler or line doubler for retro gaming. This isn't just a matter of cost right now.

In a few years, this should be possible but it wasn't possible when the Super Nt was designed. 4K would have been nice but it wouldn't have influenced the experience in a major way unless your TV is poor at scaling 1080p.

The point is that this is a silent, all-in-one device with a gamepad driven menu. Higan is an amazing emulator but it's not a great experience in the living room.

I know you love to shit on developers but I don't think you're giving the engineering efforts of Kevin enough credit here. You can bet your ass he thought about 4K output but discovered that it simply wasn't feasible yet.

Honestly, though, the preferred way to play Super NES is still an original system paired with a PVM.
I just realized that the Vampire V4, like the NT here, uses a Cyclone V FPGA. It too only supports up to 1080p (but enhances the existing Amiga AGA architecture by including a new CPU core and a new GPU, called SAGA). Perhaps there aren't enough logic gates to make 4K support functional without massively breaking the bank? Even for the Vampire, a Cyclone V is a substantial cost upgrade compared to the Cyclone IIIs they used prior.
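
Some back-of-the-envelope numbers on why 4K is such a jump for an FPGA video design (standard video-timing figures, not anything Analogue or the Vampire team have published):

```python
def pixel_rate_mhz(w, h, hz, blanking=1.25):
    """Rough pixel clock including ~25% blanking overhead (illustrative)."""
    return w * h * hz * blanking / 1e6

print(pixel_rate_mhz(1920, 1080, 60))  # ~156 MHz (the standard figure is 148.5)
print(pixel_rate_mhz(3840, 2160, 60))  # ~622 MHz (standard figure 594) - 4x the throughput

# Internal line buffers grow too: a 4K line is twice as wide, with twice as many lines.
print(1920 * 3, "bytes/line at 1080p vs", 3840 * 3, "bytes/line at 4K")
```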

Don't say I love to shit on developers when you don't know anything about me. Thanks.
Whilst we don't know you personally, your post history gives weight to Dark's statement. You do love to view gaming through a very distorted, one-dimensional lens - where few game developers or console platforms even survive your view/opinion. You can be rattled by that, but it's you who gave rise to that negative image of yourself, not everyone else.
 

DunDunDunpachi

Patient MembeR
I have an older LG OLED B6, an Asus 27" G-Sync, a Sony 27" Trinitron CRT, an HP ZR30W, and an Acer 28" 4k G-Sync. Next year there are 32.5" variants of the 4k 144hz G-Sync monitors coming out from AU Optronics. Those are the first PC ones that might finally scratch the larger display itch.

Another really interesting option that I wish existed was a 1:1 variable refresh display.

A low input lag variable refresh variant of this monitor would be amazing for emulation because you could run both 3:4 and 4:3 games on it seamlessly.


Those old Pioneer PDP-V402 displays are probably the closest we'll ever get to what you're describing, sadly.

By the time enough gamers start caring about low latency (at least to the point of it influencing manufacturers), run-ahead in emulators and the lack of personal experience with CRTs will result in gamers settling for less. Retro enthusiasts/collectors who opt for all the scalers, Retron5s, Polymega, etc etc can't tell the difference now and won't be able to tell the difference then. The ones who care about latency are the ones busy improving their scores and beating more games, not the people filling shelves upon shelves with old games and making Youtube videos about it.

If original hardware is not possible, people should be pumping these games out of a PC and downscaling to a CRT using an Extron (or something equivalent).

The mad-scramble for upscaling is completely backwards, in my opinion.
 

dirthead

Banned
Those old Pioneer PDP-V402 displays are probably the closest we'll ever get to what you're describing, sadly.

By the time enough gamers start caring about low latency (at least to the point of it influencing manufacturers), run-ahead in emulators and the lack of personal experience with CRTs will result in gamers settling for less. Retro enthusiasts/collectors who opt for all the scalers, Retron5s, Polymega, etc etc can't tell the difference now and won't be able to tell the difference then. The ones who care about latency are the ones busy improving their scores and beating more games, not the people filling shelves upon shelves with old games and making Youtube videos about it.

If original hardware is not possible, people should be pumping these games out of a PC and downscaling to a CRT using an Extron (or something equivalent).

The mad-scramble for upscaling is completely backwards, in my opinion.

I'm really not that pessimistic about displays in the long run. MicroLED displays will have all the advantages of the displays so far and none of the downsides. I don't really romanticize any of the previous display technologies. They all have crippling problems (geometry on CRTs is a joke, CRTs can't scale up in size without getting ludicrously deep, OLEDs have image retention, etc.).

An HDMI 2.1 MicroLED screen with some kind of strobing or black frame insertion will be better than anything a CRT was capable of. We just need to raise people's standards for displays so that the push to get there accelerates. A lot of the problem is that because people accept just any shit, the industry just keeps releasing the same crap every year without moving forward.
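
For the unfamiliar, black frame insertion itself is conceptually trivial - drive the panel at twice the content rate and blank every other refresh, halving how long each image is held. A sketch, where `display` is a hypothetical 120 Hz output device (not a real API):

```python
import numpy as np

def present_with_bfi(display, frame):
    """Show the image for half the refresh window, then black for the rest."""
    display.show(frame)                 # first 8.3 ms of a 120 Hz cycle: the image
    display.show(np.zeros_like(frame))  # second 8.3 ms: black -> half the persistence
```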
 

DunDunDunpachi

Patient MembeR
I'm really not that pessimistic about displays in the long run. MicroLED displays will have all the advantages of the displays so far and none of the downsides. I don't really romanticize any of the previous display technologies. They all have crippling problems (geometry on CRTs is a joke, CRTs can't scale up in size without getting ludicrously deep, OLEDs have image retention, etc.).

An HDMI 2.1 MicroLED screen with some kind of strobing or black frame insertion will be better than anything a CRT was capable of. We just need to raise people's standards for displays so that the push to get there accelerates. A lot of the problem is that because people accept just any shit, the industry just keeps releasing the same crap every year without moving forward.
Well here's the rub: what will push people to support monitors with strobing or black-frame insertion? I just don't see that having much utility except for retro gamers, and like I pointed out I don't see "the masses" of retro gamers actually caring so much about proper display capabilities.

Setting aside the characteristics of each display type, it's the input lag that bothers me the most. I don't see this getting much better. Emulation has made huge strides here (the aforementioned run-ahead) but I hardly hear a peep about it outside of ultra-nerdy niche gaming groups I'm a part of. People seem to be happy with fake scanlines and "pixel perfect" emulation with little to no regard for the frames of lag they're suffering through.
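
For readers who haven't met run-ahead: the emulator speculatively runs the core into the future with your current input and shows you that frame, hiding the game's own internal input delay. A rough sketch of one tick, assuming a generic core with savestates (`core.run_frame`, `save_state`, etc. are hypothetical names, not a real emulator API):

```python
def run_ahead_frame(core, pad, lookahead=1):
    """Display the frame the game would show `lookahead` frames from now,
    computed with this tick's input."""
    buttons = pad.poll()
    saved = core.save_state()                  # remember the true present
    frame = None
    for i in range(lookahead):
        frame = core.run_frame(buttons, render=(i == lookahead - 1))
    core.load_state(saved)                     # rewind the speculation
    core.run_frame(buttons, render=False)      # advance the real state one frame
    return frame                               # show the future frame now
```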
 

dirthead

Banned
Well here's the rub: what will push people to support monitors with strobing or black-frame insertion? I just don't see that having much utility except for retro gamers, and like I pointed out I don't see "the masses" of retro gamers actually caring so much about proper display capabilities.

Setting aside the characteristics of each display type, it's the input lag that bothers me the most. I don't see this getting much better. Emulation has made huge strides here (the aforementioned run-ahead) but I hardly hear a peep about it outside of ultra-nerdy niche gaming groups I'm a part of. People seem to be happy with fake scanlines and "pixel perfect" emulation with little to no regard for the frames of lag they're suffering through.

Getting rid of motion blur has implications for every game, not just retro games. What a lot of people don't seem to realize is how much games have slowed down over the years. Games don't move as fast as they used to, and they actually feel less intense in a lot of ways. I think shitty, slow, blurry monitors are partially to blame for this. You eliminate motion blur and suddenly there isn't such a huge penalty for moving faster.

I think the phrase "motion resolution" needs to get spread a bit more, too. So we have 4K displays. They look awesome when you take a screenshot. While you're actually playing a game and moving, though, everything's a blurry fucking mess. What's the point of all this resolution when it's instantly lost the second anything moves? The higher detail things get, the more low persistence matters so that you can see those details while moving.
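
To put rough numbers on "motion resolution": on a sample-and-hold panel, an object your eye is tracking smears across (speed x persistence) pixels. Illustrative figures only:

```python
def persistence_blur_px(speed_px_per_s, persistence_ms):
    """Smear width for an eye-tracked object on a sample-and-hold display."""
    return speed_px_per_s * persistence_ms / 1000.0

print(persistence_blur_px(1920, 16.7))  # full-persistence 60 Hz: ~32 px of smear
print(persistence_blur_px(1920, 2.0))   # 2 ms strobe/BFI: ~3.8 px
```

So a pan moving at one screen-width per second throws away detail far below 4K no matter how sharp the stills are.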
 

DunDunDunpachi

Patient MembeR
Getting rid of motion blur has implications for every game, not just retro games. What a lot of people don't seem to realize is how much games have slowed down over the years. Games don't move as fast as they used to, and they actually feel less intense in a lot of ways. I think shitty, slow, blurry monitors are partially to blame for this. You eliminate motion blur and suddenly there isn't such a huge penalty for moving faster.
Right, I think if you compared FPSs from the late 90s (Quake, Unreal Tournament, Starsiege Tribes) and today, modern FPSs are significantly slower. The blur and auto-aiming reduce the need for image clarity, too.

I think the phrase "motion resolution" needs to get spread a bit more, too. So we have 4K displays. They look awesome when you take a screenshot. While you're actually playing a game and moving, though, everything's a blurry fucking mess. What's the point of all this resolution when it's instantly lost the second anything moves? The higher detail things get, the more low persistence matters so that you can see those details while moving.
Doesn't make a hoot of difference to the Madden or Cawadooty player who gets their 4K display on sale during Super Bowl season. And PC gamers aren't much better: they blow hundreds extra on <1ms displays even though that is the rating for individual pixel response time, not the input lag from the PC to the screen.

I'm not disagreeing. I just think we're a long ways away from anyone caring about motion blur when even the hardware-obsessed PC crowd is barely aware of the underlying issue.
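
To illustrate the "<1ms" point with a made-up but realistically shaped latency budget (none of these are measurements) - pixel response is only one small term in the chain a player actually feels:

```python
latency_budget_ms = {
    "input polling":       1.0,  # 1000 Hz USB device
    "game simulation":    16.7,  # one 60 Hz frame
    "display processing": 10.0,  # scaler / picture pipeline
    "pixel response":      1.0,  # the "<1ms" printed on the box
}
print(sum(latency_budget_ms.values()), "ms end-to-end")  # ~28.7 ms
```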
 

dirthead

Banned
Right, I think if you compared FPSs from the late 90s (Quake, Unreal Tournament, Starsiege Tribes) and today, modern FPSs are significantly slower. The blur and auto-aiming reduce the need for image clarity, too.


Doesn't make a hoot of difference to the Madden or Cawadooty player who gets their 4K display on sale during Super Bowl season. And PC gamers aren't much better: they blow hundreds extra on <1ms displays even though that is the rating for individual pixel response time, not the input lag from the PC to the screen.

I'm not disagreeing. I just think we're a long ways away from anyone caring about motion blur when even the hardware-obsessed PC crowd is barely aware of the underlying issue.

I agree. The problem is that it's easy to slap on some gimmick feature (3D whatever), and actually hard to make a good display that addresses fundamental quality issues when displaying images.
 

DunDunDunpachi

Patient MembeR
I agree. The problem is that it's easy to slap on some gimmick feature (3D whatever), and actually hard to make a good display that addresses fundamental quality issues when displaying images.
Competitive PC gaming should be leading the charge (as I could see the value of lower blur in MOBAs, shooters, RTSs, etc) but that doesn't seem to be happening. Instead, they'll endorse panel companies that may or may not actually make good panels for "pro gaming".

So it's all kinds of messed up. That's why I usually just throw up my hands and recommend a decent CRT.
 

dirthead

Banned
Competitive PC gaming should be leading the charge (as I could see the value of lower blur in MOBAs, shooters, RTSs, etc) but that doesn't seem to be happening. Instead, they'll endorse panel companies that may or may not actually make good panels for "pro gaming".

So it's all kinds of messed up. That's why I usually just throw up my hands and recommend a decent CRT.

I don't have a lot of respect for "pro gaming." We both know it's just people shilling. Put a Mad Catz bumper sticker on your forehead and play Street Fighter. Like it isn't completely transparent. What's a little sickening is that I get the feeling that the majority of pro gamers don't even like playing video games and only do so as a revenue stream.
 

DunDunDunpachi

Patient MembeR
I don't have a lot of respect for "pro gaming." We both know it's just people shilling. Put a Mad Catz bumper sticker on your forehead and play Street Fighter. Like it isn't completely transparent. What's a little sickening is that I get the feeling that the majority of pro gamers don't even like playing video games and only do so as a revenue stream.
I wouldn't take it that far (that the majority don't even like playing video games), but on the whole I do understand your point.

I know gamers in far-less-popular genres who would qualify as "pro" (as in, they are regularly putting up national-tier and world-tier records) but they like to keep to themselves. Speed-runners were once that way but I think accidental popularity kind of poisoned that scene (we'll see if it can recover).

Pro circuits that allow players to make a name for themselves, grandstand on stage, and play on Twitch all day definitely pave the way for people who are more interested in the lifestyle instead of the game itself. At that point, it is up to the community (especially commentators who also profit from the growth of the scene) to police itself and push for higher-quality play, but alas, many of the people watching know less about gaming than the people streaming. "He who makes use of fools has to put up with them", as the saying goes.
 

dolabla

Member
It is! I've been playing Donkey Kong Country on my Super NT, and it looks gorgeous. I'm a bit saddened by the fact that 4.5 will probably be the last firmware update it gets and we'll miss out on MS, GB and other cores; but it is what it is.

I think I will start collecting a few of the special chip games, mostly the ones that don't require a lot of reading and I can get cheap from Japan.

Yeah, I was really looking forward to seeing what Kevtris was going to do with the scanlines for 1080p, but it doesn't look like that's going to happen anymore :(. Still an amazing system.

What settings are you using?
 

dark10x

Digital Foundry pixel pusher
An HDMI 2.1 MicroLED screen with some kind of strobing or black frame insertion will be better than anything a CRT was capable of. We just need to raise people's standards for displays so that the push to get there accelerates. A lot of the problem is that because people accept just any shit, the industry just keeps releasing the same crap every year without moving forward.
That's a rather pessimistic view. Why so negative? We've seen HUGE improvements in display technology in the last few years. OLED, while still relatively new, only became an affordable mainstream display recently and even in the last two years it has improved (hugely improved scaling, black frame insertion, faster input response, brighter picture, 120Hz input support and more). I dislike LCD technology but even that has yielded massive improvements. Overall picture quality has improved by leaps and bounds in the last few years. There WAS a period of time when it felt pretty stagnant but those days are behind us. The introduction of OLED has basically forced companies to work on contrast and black levels while HDR means a brighter picture. Input latency has never been lower in TVs and lots of new game friendly features are becoming the norm.

Now, MicroLED could be interesting as it seems to solve the remaining flaws that OLED suffers but fundamentally it doesn't solve things like the resolution issue (variable resolution support like a CRT) as it is still fixed pixel. That would be a game changer. Resolutions like 1280x720 and lower can still look absolutely beautiful on a CRT monitor as scaling is not required due to how the image is drawn.

Beyond that, strobing/BFI isn't perfect either though it DOES produce vastly better motion resolution. I've found that BFI produces results closer to a plasma TV but without the yellow trails. Sonic Mania is so much more playable with BFI enabled, I'll say that much. It's a very good feature especially for side scrolling games.

I just realized that the Vampire V4, like the NT here, uses a Cyclone V FPGA. It too only supports up to 1080p (but enhances the existing Amiga AGA architecture by including a new CPU core and a new GPU, called SAGA). Perhaps there aren't enough logic gates to make 4K support functional without massively breaking the bank? Even for the Vampire, a Cyclone V is a substantial cost upgrade compared to the Cyclone IIIs they used prior.
That's right. It's an issue of size. I'm not sure what the availability is on an FPGA core that could support 4K scaling but it would be huge and super expensive.

It's definitely a case where you'd want to choose your TV carefully with this type of usage in mind, though. OSSC, Super Nt and Framemeister all look dramatically better on the C8 versus the B6 OLED I moved from due to the way scaling is handled + the introduction of black frame insertion.
 

dirthead

Banned
That's a rather pessimistic view. Why so negative? We've seen HUGE improvements in display technology in the last few years. OLED, while still relatively new, only became an affordable mainstream display recently and even in the last two years it has improved (hugely improved scaling, black frame insertion, faster input response, brighter picture, 120Hz input support and more). I dislike LCD technology but even that has yielded massive improvements. Overall picture quality has improved by leaps and bounds in the last few years. There WAS a period of time when it felt pretty stagnant but those days are behind us. The introduction of OLED has basically forced companies to work on contrast and black levels while HDR means a brighter picture. Input latency has never been lower in TVs and lots of new game friendly features are becoming the norm.

Now, MicroLED could be interesting as it seems to solve the remaining flaws that OLED suffers but fundamentally it doesn't solve things like the resolution issue (variable resolution support like a CRT) as it is still fixed pixel. That would be a game changer. Resolutions like 1280x720 and lower can still look absolutely beautiful on a CRT monitor as scaling is not required due to how the image is drawn.

Beyond that, strobing/BFI isn't perfect either though it DOES produce vastly better motion resolution. I've found that BFI produces results closer to a plasma TV but without the yellow trails. Sonic Mania is so much more playable with BFI enabled, I'll say that much. It's a very good feature especially for side scrolling games.


That's right. It's an issue of size. I'm not sure what the availability is on an FPGA core that could support 4K scaling but it would be huge and super expensive.

It's definitely a case where you'd want to choose your TV carefully with this type of usage in mind, though. OSSC, Super Nt and Framemeister all look dramatically better on the C8 versus the B6 OLED I moved from due to the way scaling is handled + the introduction of black frame insertion.

Why do you feel that variable resolution is so important? I'm not seeing that. Even if you just integer scale everything and have overscan, it's not much different than the overscan you'd get on most CRTs anyway. To me, fixed resolution is at the bottom of the list in terms of things to "fix."
 

dark10x

Digital Foundry pixel pusher
Why do you feel that variable resolution is so important? I'm not seeing that. Even if you just integer scale everything and have overscan, it's not much different than the overscan you'd get on most CRTs anyway. To me, fixed resolution is at the bottom of the list in terms of things to "fix."
It would basically eliminate the need to target these specific resolutions. You could render at any arbitrary resolution and achieve great results. It would be a huge win for image quality.

It’s also something that isn’t close to being solved.

Motion? This is nearly solved. We’re getting very close to eliminating blur entirely.

Black levels? Solved.

Variable refresh? Basically solved.

The thing is - variable resolution would eliminate the need for TVs to scale. Integer scaling would be ok (though much more limited since only certain resolutions would benefit) but few TV manufacturers allow this. Samsung is working on MicroLED right? Their TVs are typically ineffective when it comes to displaying low res content. Lots of upscale blur.

That’s the only reason why there is ever an issue with resolutions these days. People complain about lack of native 4K and previously native 1080p only because of display limitations. 1600x900 would look perfect on a display like this and the difference would be minor. You could do any oddball res even.

So that’s my reasoning. It’s just something I’ve dreamt of seeing return. Unlikely to ever happen but hey!
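
The fixed-pixel problem in one loop: on a 2160-line panel, only a few source heights divide cleanly, and everything else forces the scaler to filter (blur) or unevenly duplicate lines (shimmer). A CRT just draws however many lines it is handed:

```python
for w, h in [(1920, 1080), (1600, 900), (1280, 720)]:
    factor = 2160 / h
    verdict = "integer: clean pixel repeat" if factor.is_integer() else "non-integer: must filter"
    print(f"{w}x{h} -> x{factor:.2f} ({verdict})")
```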
 

dirthead

Banned
It would basically eliminate the need to target these specific resolutions. You could render at any arbitrary resolution and achieve great results. It would be a huge win for image quality.

It’s also something that isn’t close to being solved.

Motion? This is nearly solved. We’re getting very close to eliminating blur entirely.

Black levels? Solved.

Variable refresh? Basically solved.

The thing is - variable resolution would eliminate the need for TVs to scale. Integer scaling would be ok (though much more limited since only certain resolutions would benefit) but few TV manufacturers allow this. Samsung is working on MicroLED right? Their TVs are typically ineffective when it comes to displaying low res content. Lots of upscale blur.

That’s the only reason why there is ever an issue with resolutions these days. People complain about lack of native 4K and previously native 1080p only because of display limitations. 1600x900 would look perfect on a display like this and the difference would be minor. You could do any oddball res even.

So that’s my reasoning. It’s just something I’ve dreamt of seeing return. Unlikely to ever happen but hey!

I'm not sure motion is nearly solved because there's still the issue of getting variable refresh to work with strobing/black frame insertion/etc. The anti-blur solutions seem to really only work with fixed refresh rates currently.
 

dark10x

Digital Foundry pixel pusher
I'm not sure motion is nearly solved because there's still the issue of getting variable refresh to work with strobing/black frame insertion/etc. The anti-blur solutions seem to really only work with fixed refresh rates currently.
Sure, those are two separate problems but it also highlights the inherent limitations of the technology - blur can only be eliminated by 'cheating', basically. Inserting black frames or rapidly flashing the image.

So, you're right, a real solution would involve creating a panel that could natively display things without blur more like a CRT (which, while not perfect, is still the reference - no strobing or BFI implementation is a perfect match there yet but we're getting closer). A different approach to drawing the image would be required.

That said, while VRR is nice, it's honestly not THAT important to me personally at this point. To my eyes, it's not as effective as I would have liked and I prefer a locked frame-rate. Of course, it's light years beyond what you get when running into slowdown without VRR. So it's fantastic tech but only necessary to compensate for unstable performance. That's all just my opinion after spending time with various G-Sync and Freesync displays, though.

Which is why, for the moment, I'd rather use BFI or strobing at a fixed refresh to achieve clean motion over VRR. I'd always prefer VRR if it were an option but not at the expense of motion clarity. If we could eventually combine the two properly, however, that WOULD be amazing and it would be a welcome addition.

...but yeah, motion isn't solved per se but we're getting closer. It's a problem that's being worked on.
 

dirthead

Banned
Sure, those are two separate problems but it also highlights the inherent limitations of the technology - blur can only be eliminated by 'cheating', basically. Inserting black frames or rapidly flashing the image.

So, you're right, a real solution would involve creating a panel that could natively display things without blur more like a CRT (which, while not perfect, is still the reference - no strobing or BFI implementation is a perfect match there yet but we're getting closer). A different approach to drawing the image would be required.

That said, while VRR is nice, it's honestly not THAT important to me personally at this point. To my eyes, it's not as effective as I would have liked and I prefer a locked frame-rate. Of course, it's light years beyond what you get when running into slowdown without VRR. So it's fantastic tech but only necessary to compensate for unstable performance. That's all just my opinion after spending time with various G-Sync and Freesync displays, though.

Which is why, for the moment, I'd rather use BFI or strobing at a fixed refresh to achieve clean motion over VRR. I'd always prefer VRR if it were an option but not at the expense of motion clarity. If we could eventually combine the two properly, however, that WOULD be amazing and it would be a welcome addition.

...but yeah, motion isn't solved per se but we're getting closer. It's a problem that's being worked on.

Well, that's not exactly true. CRTs aren't variable refresh. So it's not like CRT is any better in that regard. Both technologies get rid of the blur the same way (flickering basically). Variable refresh with no tearing + low persistence is a new requirement that no CRT ever handled.
 

dark10x

Digital Foundry pixel pusher
Well, that's not exactly true. CRTs aren't variable refresh. So it's not like CRT is any better in that regard. Both technologies get rid of the blur the same way (flickering basically). Variable refresh with no tearing + low persistence is a new requirement that no CRT ever handled.
That's true.

Still, CRTs remain superior at handling motion at fixed refresh rates.
 

dirthead

Banned
That's true.

Still, CRTs remain superior at handling motion at fixed refresh rates.

I agree in the near-term future, but I think that will change soon. Pre-HDR flat panels don't get quite as bright as a flickering CRT when strobing, but new HDR flat panels are pretty absurdly bright. Brightness was the only real advantage CRTs ever had in motion handling. MicroLED will be an across-the-board upgrade there. They'll be able to make MicroLEDs ridiculously bright, which should address the dark/washed-out problem with flickering/strobing/black frame insertion.
 

Redneckerz

Those long posts don't cover that red neck boy
That's right. It's an issue of size. I'm not sure what the availability is on an FPGA core that could support 4K scaling but it would be huge and super expensive.
I don't think it would be Cyclone anymore but definitely something higher up - impossible to put a reasonable price tag on it. For what it is worth, the NT using an FPGA means it can be upgraded over time, if one permits it. The Vampire Team does this with multiple cores.

It would be interesting to see how much you can abuse the FPGA yourself...
 
The jailbroken firmware is great, but there are still some things it cannot do...

[image attachments]

... and I really like the artwork for this.



[image attachment]

I've finished Donkey Kong Country on my Super Nt and am on my way to the last mana seed in Secret of Mana. I might try to do Yoshi's Island or Super Metroid after that one. I've been playing the Super Nt almost every day since getting it, and I am still amazed by how beautiful the games look and sound.
 

Chittagong

Gold Member
Sorry for the late reply but I'm wondering:
Does this mean a random millionaire could mass-produce, say, a Wii U, PS2 or PS3 and, as long as the brand is different, profit from it?

The key thing to understand about pre-N64 consoles is that they were collections of simple, mostly off-the-shelf components not owned by the platform company. The only thing "Nintendo" about a NES, or "Sega" about a Genesis, is the choice of components and the order they were laid out in. Given that the components are not owned by Nintendo or Sega and were well documented, they could be easily reverse engineered and reproduced in an FPGA chip.

The later platforms are different in that they have custom silicon and custom OSes that would need to be reverse engineered, with little to no documentation, and that would have to be done carefully to avoid copyright infringement.
 

baphomet

Member
Sorry for the late reply but I'm wondering:
Does this mean a random millionaire could mass-produce, say, a Wii U, PS2 or PS3 and, as long as the brand is different, profit from it?

If they're not using anyone's copyrighted code, of course they could.
 

lengmandx

Neo Member
Apologies for the double post (can't edit posts yet). Figured it out. It wasn't the Super NT at all. I had to disable some of the "advanced" features of my Samsung TV in order to stop the dimming.

Hi! Just stumbled upon your post while searching for a solution to the very same problem. I am also running the Super NT connected to a Samsung TV and experience that annoying dimming when the background is scrolling. Would you be so kind as to tell me which settings you changed on your TV?

Many thanks in advance!
 

_Justinian_

Gold Member
I no longer own that TV, but if I recall correctly, I believe it was a setting that turns off dynamic contrast or energy saving mode. I'm trying to research it now to find out. Let me know if that helps!
 

lengmandx

Neo Member
I no longer own that TV, but if I recall correctly, I believe it was a setting that turns off dynamic contrast or energy saving mode. I'm trying to research it now to find out. Let me know if that helps!

Many thanks for your response! I already tried to fiddle around with those settings, and everything fancy is turned off now, but I am still facing that shimmering problem. Based on your reply, I suppose you only adjusted the "regular settings" and did not enter the (hidden) service menu on your TV?
 

_Justinian_

Gold Member
No. I didn't have to go into any hidden menu. I do recall turning off one of the motion features like auto motion plus or something similar. Have you already tried changing from game mode to PC or another mode?
 

lengmandx

Neo Member
No. I didn't have to go into any hidden menu. I do recall turning off one of the motion features like auto motion plus or something similar. Have you already tried changing from game mode to PC or another mode?

Yes, I have already tried PC mode as well as Game mode, and turning all picture-enhancing features off manually. Still facing that shimmering issue; I guess it has something to do with a wrong aspect ratio, however I have no clue how to fix that...

Anyways, thanks for your input ;)
 

_Justinian_

Gold Member
One last thing before I give up entirely, lol. Have you tried going into the system settings of the Super NT itself and using the "No scaler" setting along with disabling the H and V interpolation options?
 

lengmandx

Neo Member
One last thing before I give up entirely, lol. Have you tried going into the system settings of the Super NT itself and using the "No scaler" setting along with disabling the H and V interpolation options?

Yes, I basically tried any possible setting combination without any success so far. The shimmering still occurs.
I guess the problem is still on the TV's end of the HDMI cable.

Thank you anyhow for your support ;)
 

ksdixon

Member
Sorry for the necro bump, but I just got my Super NT from eBay today. I have a black screen when I press the power button; no Analogue menu pops up.

When turning it on, the light goes red, then flashes green, blue, yellow and finally settles on white.

I've already swapped the HDMI and power leads with the ones that work fine on my Analogue Mega SG, but there was no joy.
 