
RETROARCH - The all-in-one emulator dreams are made of, son

kaioshade

Member
Does anyone here have any experience with the OS X port of RetroArch? I am attempting to use my DualShock 3 or 4 controller, but for some reason the d-pad is not recognized; all other buttons and the analog sticks work. I'm not sure if this is a limitation of apple_input or not.
 
Shaders depend 100% on your graphics driver, and integrated won't cut it for some of the crazy ones or stacks of like 4-5 shaders.

makes sense. Emulation itself is mostly CPU heavy right?

Makes sense: the games run fine (the CPU isn't bad) but shaders can slow down (the graphics suck)
 

Rich!

Member
makes sense. Emulation itself is mostly CPU heavy right?

Makes sense: the games run fine (the CPU isn't bad) but shaders can slow down (the graphics suck)

Yeah that's it really

Same as how I can run games like F-Zero GX and Kirby Dreamland Wii at full speed with no dips at 720p and native res in Dolphin via my overclocked i7, but playing at higher rendering resolutions with AA results in slowdown because of my outdated Radeon graphics card.
 

Lettuce

Member
Shaders depend 100% on your graphics driver, and integrated won't cut it for some of the crazy ones or stacks of like 4-5 shaders.


makes sense. Emulation itself is mostly CPU heavy right?

Makes sense: the games run fine (the CPU isn't bad) but shaders can slow down (the graphics suck)


That makes it a bit clearer for the new Gigabyte Brix system I'm looking at getting to replace my old one: it has a slightly slower CPU clock speed (1.7 GHz vs 1.9 GHz) but a better GPU (Intel® HD 4400 graphics vs Intel® HD 4000 graphics).

I'm not sure if these will cancel each other out or give me better performance when processing shaders!
 

Icelight

Member
Been having a bit of trouble with per-core configs lately, and I'm wondering if anyone can provide any pointers here.

Put simply: RetroArch is no longer loading per-core configs, and just loads retroarch.cfg every time.

I can save per-core configs just fine; they are saved into the correct folder (which mirrors the folder defined in retroarch.cfg for configs), and I can load per-core configs manually just fine. But every time I launch RetroArch, either via the exe directly (and then manually loading a core) or via the command line with the -L arg to launch with a specific core, it only loads retroarch.cfg automatically.

I have tried both a couple of different nightly builds, as well as the latest stable build, and I see the same thing happening everywhere. I have no idea what could be wrong, but any assistance would be greatly appreciated!

-Edit- Never mind, I'm an idiot and didn't notice that there's a config entry for "Per Core Config". I was scratching my head *trying* to figure out what had changed since I upgraded RetroArch, and I kept missing that settings option, lol. Figures that I solve it the moment I post this :p
 
I love RetroArch; I've been using it for Game Boy games, but sometimes I find issues with emulating: input lag, graphics clipping, etc.

Because of that I'm trying (where possible) to use real hardware for videos I make. However, I love the shaders RA has and have been trying to emulate them in Premiere Pro.

Here's what I have so far with a SNES:
https://www.youtube.com/watch?v=HD10UGZOdBc

What do you guys think? Any changes? I still need to sort out a border, and get the width right. I know that some people like the wide stretched look from how they played it, even though it should be more 4:3. The upscaler I'm using gives me a 1080p widescreen video from the SNES.

I'm also looking to do other consoles. I really like some of the CRT shaders, ones that give that RGB pixel look from a TV, but I'm not really sure how to do them, hence why I'm showing you guys and asking for help.

So, any ideas how I can recreate some of these shaders in Premiere?


(Here's an example of a Game Boy video, Super Mario Land GB.)
 

Rich!

Member
I love RetroArch; I've been using it for Game Boy games, but sometimes I find issues with emulating: input lag, graphics clipping, etc.

Because of that I'm trying (where possible) to use real hardware for videos I make. However, I love the shaders RA has and have been trying to emulate them in Premiere Pro.

Here's what I have so far with a SNES:

https://www.youtube.com/watch?v=HD10UGZOdBc

What do you guys think? Any changes? I still need to sort out a border, and get the width right. I know that some people like the wide stretched look from how they played it, even though it should be more 4:3. The upscaler I'm using gives me a 1080p widescreen video from the SNES.

I'm also looking to do other consoles. I really like some of the CRT shaders, ones that give that RGB pixel look from a TV, but I'm not really sure how to do them, hence why I'm showing you guys and asking for help.

So, any ideas how I can recreate some of these shaders in Premiere?


(Here's an example of a Game Boy video, Super Mario Land GB.)

I guess if you want to replicate a SNES being output as a stretched image via composite to an HDTV then you've done a good job.

But personally, I uh, yeah. Not my kind of look...sorry.
 

BONKERS

Member
I love RetroArch; I've been using it for Game Boy games, but sometimes I find issues with emulating: input lag, graphics clipping, etc.

Because of that I'm trying (where possible) to use real hardware for videos I make. However, I love the shaders RA has and have been trying to emulate them in Premiere Pro.

Here's what I have so far with a SNES:

https://www.youtube.com/watch?v=HD10UGZOdBc

What do you guys think? Any changes? I still need to sort out a border, and get the width right. I know that some people like the wide stretched look from how they played it, even though it should be more 4:3. The upscaler I'm using gives me a 1080p widescreen video from the SNES.

I'm also looking to do other consoles. I really like some of the CRT shaders, ones that give that RGB pixel look from a TV, but I'm not really sure how to do them, hence why I'm showing you guys and asking for help.

So, any ideas how I can recreate some of these shaders in Premiere?


(Here's an example of a Game Boy video, Super Mario Land GB.)

Doesn't look so bad to me. It seems some of the scanlines are applied to the wrong lines in parts, or maybe it's just me.
 
What are the best basic/simple shaders for emulating a CRT-type look on an HD TV? I'm not concerned about messing with configs to find the "perfect look".

I'm not a stickler for super accuracy; I just want something that makes the games look authentic but still pretty good (I do like some of the upgrades emulation offers).

Mostly looking for PSX, Super Nintendo, and GBA (this is the most important, because I know there are issues with contrast since the original GBA wasn't backlit).
 
If you want to do no work and get a good result: crt-easymode. It's right in the name!

haha I didn't see that one but it looks good to me

Like I said, I know there is something weird with the GBA and GB because of their LCDs. I see some of the LCD shaders that come with RetroArch, but everything is so poorly named and I don't know what half the differences I'm seeing are.
 

Lettuce

Member
It'd be nice if EasyMode could make his shader work with the Raspberry Pi 2, as at the moment it's a tad too demanding... not sure if it could be streamlined at all?
 

EasyMode

Member
I'm also looking to do other consoles, I really like some of the CRT shaders, ones that give that rgb pixel look from a TV.
But I'm not really sure how to do them, hence why I'm showing you guys and asking for help

so, any ideas how can I recreate some of these shaders in premiere?

The effect you mention is just alternating vertical lines tinted green and magenta (255, 0, 255), or red, green, and blue.

I imagine applying filters to video capture would have considerably more input lag than RetroArch, though. Have you tried turning on GPU Hard Sync in RA's video settings? It reduces input lag, but is more demanding.
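A quick sketch of the mask described above (a hypothetical helper using NumPy, not anything from RetroArch itself): tint alternating vertical pixel columns, then multiply the mask over the upscaled frame to fake the phosphor pattern.

```python
import numpy as np

def grille_mask(width, height, phases=3):
    """Build an RGB tint mask of alternating vertical columns.

    phases=2 gives the green/magenta pattern; phases=3 gives
    separate red, green, and blue columns.
    """
    tints = {
        2: [(0.0, 1.0, 0.0), (1.0, 0.0, 1.0)],                    # green, magenta
        3: [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)],   # R, G, B
    }[phases]
    mask = np.zeros((height, width, 3))
    for x in range(width):
        mask[:, x, :] = tints[x % phases]  # tint whole column by its phase
    return mask

mask = grille_mask(6, 2, phases=3)
# column 0 is red, column 1 green, column 2 blue, then the pattern repeats
```

In Premiere the equivalent would be overlaying such a mask image at screen resolution with a multiply blend mode.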

It'd be nice if EasyMode could make his shader work with the Raspberry Pi 2, as at the moment it's a tad too demanding... not sure if it could be streamlined at all?

It's already pretty optimized. You can get a performance boost by setting #define ENABLE_LANCZOS 0, but the scaling might not look as nice. I don't have a Pi, can you tell me if any CRT shaders run full speed? crt-hyllian-lq is the fastest one, IIRC.
 
Like I said, I know there is something weird with the GBA and GB because of their LCDs. I see some of the LCD shaders that come with RetroArch, but everything is so poorly named and I don't know what half the differences I'm seeing are.

I spent (too much) time getting some shaders all linked up in the main directory so I can quickly switch between them, so I'm gonna do a comparison gallery sometime soon just in case anyone wants to get a good idea how they stack up to each other.
 
That's really odd... I'm on the latest nightly on an i5 2500K @ 4.2 GHz and I can play higan accuracy with video sync, audio sync and hard GPU sync 0. Some special chip games still have slowdown, but nothing really noticeable.

BTW, one thing worth knowing about bsnes: for 98% of games Snes9x is just as good, and even if you want bsnes, the balanced profile is the way to go in most cases.

Anyway, usual suspects when you have bad performance:

- rewind, if you really want rewind increase your granularity, it's quite demanding
- hard gpu sync 0
- uncalibrated refresh rate
- multi-pass shaders; most will not tax RA enough to cause noticeable slowdown, but shaders like royale can
- audio latency set too low for your system
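For reference, that checklist maps to entries in retroarch.cfg. A hedged sketch (key names from memory, so double-check them against a config your build has actually saved):

```
rewind_enable = "false"          # or keep rewind and raise rewind_granularity
rewind_granularity = "6"         # rewind every 6 frames instead of every frame
video_hard_sync = "true"         # hard GPU sync
video_hard_sync_frames = "0"
video_refresh_rate = "59.95"     # use the calibrated value from video settings
audio_latency = "64"             # raise this if audio crackles
```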

I switched to the latest nightly from the stable and my problem went away. Thanks for the suggestions!
 

petran79

Banned
Regarding audio latency: on Ubuntu there was crackling sound when I set it to 60 ms. I had to update the audio driver from the ALSA daily builds PPA to fix the problem.
 

Toad King

Banned
Hard GPU sync reduces input lag, right?

The main purpose of hard GPU sync is to force the video driver to wait until all function calls are done and complete before continuing. Sometimes, even if you wait for v-sync, you can still miss frames because the previous frame's data isn't actually fully processed when it gets more data. When you play some games and they appear to run at sub-60 FPS even though they're full speed (really obvious in games with smooth-scrolling parts, like room transitions in Super Metroid, or 60 Hz sprite flicker effects), this is why. GPU sync objects try to fix this. Mostly it'll give the illusion of better input lag, because frames that were getting skipped before are now being displayed.

The best ways to get lower input lag are disabling v-sync, a faster monitor, or G-Sync/FreeSync. Also controllers with a higher poll rate than the USB-standard 125 Hz, although I'm not sure if any PC controllers support that without hacked drivers.
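Back-of-envelope on that last point: on average, polling adds half the polling interval of latency, so the stock 125 Hz USB rate costs about 4 ms while 1000 Hz polling costs about 0.5 ms.

```python
# Average input latency added by controller polling: an input arrives at a
# random point in the poll interval, so on average it waits half an interval.
def avg_poll_latency_ms(poll_hz):
    return 1000.0 / poll_hz / 2.0

print(avg_poll_latency_ms(125))   # standard USB HID poll rate: 4.0 ms
print(avg_poll_latency_ms(1000))  # 1000 Hz polling: 0.5 ms
```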
 

petran79

Banned
The main purpose of hard GPU sync is to force the video driver to wait until all function calls are done and complete before continuing. Sometimes, even if you wait for v-sync, you can still miss frames because the previous frame's data isn't actually fully processed when it gets more data. When you play some games and they appear to run at sub-60 FPS even though they're full speed (really obvious in games with smooth-scrolling parts, like room transitions in Super Metroid, or 60 Hz sprite flicker effects), this is why. GPU sync objects try to fix this. Mostly it'll give the illusion of better input lag, because frames that were getting skipped before are now being displayed.

The best ways to get lower input lag are disabling v-sync, a faster monitor, or G-Sync/FreeSync. Also controllers with a higher poll rate than the USB-standard 125 Hz, although I'm not sure if any PC controllers support that without hacked drivers.

Doesn't Windows XP have lower input lag than Windows 7?
 

EasyMode

Member
For anyone wondering, I threw together some albums showcasing some of the different CRT and LCD shaders from common-shaders. There are thirteen images in each gallery below, so if anyone is looking to pick one without going through all the work of comparing them, well... here you go.

That looks like the first version of my shader, which used linear scaling and no dot mask. The latest version looks pretty different.
 

EasyMode

Member
Nothing changes; maybe it's normal.

You could compare it to a standalone emulator to see if it looks any different. Are we talking about a consistent judder or like an occasional dropped frame?

Another thing to try is going into video settings and letting the Estimated Monitor FPS do its thing until it reaches 0% / 2048 samples, and then pressing the confirm button so it saves that refresh rate.

Personally, I do notice more judder in retro games, but I think it's due to their low resolution.
 

Lettuce

Member
For anyone wondering, I threw together some albums showcasing some of the different CRT and LCD shaders from common-shaders. There are thirteen images in each gallery below, so if anyone is looking to pick one without going through all the work of comparing them, well... here you go.


http://imgur.com/a/pXZFz

http://imgur.com/a/LuXSo

http://imgur.com/a/qzMlp

How come you didn't include crt-geom? Along with easymode, these two are the only shaders that have correct scaling for scanlines without the need for integer scaling!

General question about these CRT shaders: what settings need to be adjusted to help cut down on the sharpness of the scanlines on pure white screens, for example the Konami logo screen? With bright white areas of the screen, the scanlines always appear narrower and sharper than in other parts of the screen. Why is this?
 

Rick74

Neo Member
Using crt-lottes' halation... scanlines are also correct with integer scaling off.

Make sure you're using true fullscreen, not windowed fullscreen.
 

EasyMode

Member
Using crt-lottes' halation... scanlines are also correct with integer scaling off.

Make sure you're using true fullscreen, not windowed fullscreen.

Yes, crt-lottes has excellent-looking scanlines at non-integer scales.

General question about these CRT shaders: what settings need to be adjusted to help cut down on the sharpness of the scanlines on pure white screens, for example the Konami logo screen? With bright white areas of the screen, the scanlines always appear narrower and sharper than in other parts of the screen. Why is this?

There are a few reasons for this. In the case of my shader, it's caused by the brightness boost parameter. If you lower it to 1.0, that issue goes away, but the reduced brightness might not be worth it.

Any kind of scanline or mask emulation will cause a reduction in brightness, so it's common for CRT shaders to push the colors beyond the maximum displayable value. As a result, brightness is recovered, but portions of scanlines in brighter colors will get clipped.

Another reason is that it's sometimes intentional, and the author wanted the line thickness to vary like on a real CRT:

[image: CRT close-up from the Libretro blog]

That pic was taken from the Libretro blog. Notice the difference in line thickness depending on the color's brightness? My shader supports this with the beam width min/max parameters, but I set them equal by default to minimize scanline issues at 1080p.
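A tiny numeric illustration of the clipping described above (the profile values are made up): boost a white scanline's cross-section past 1.0 and the display clips its peak flat, so the lit area looks wider and the dark gap narrower, while a darker color keeps the full beam shape.

```python
import numpy as np

# Cross-section of one scanline: dark gap -> bright beam center -> dark gap.
profile = np.array([0.2, 0.9, 1.0, 0.9, 0.2])

def displayed(color_level, boost=1.2):
    # Shader output is clamped to the displayable range [0, 1].
    return np.clip(profile * color_level * boost, 0.0, 1.0)

gray = displayed(0.5)   # nothing reaches 1.0, so the beam shape is preserved
white = displayed(1.0)  # the three center samples clip to 1.0: a flat top
```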
 

Toad King

Banned
Doesn't Windows XP have lower input lag than Windows 7?

In windowed/borderless windowed mode it does, with Windows compositing enabled; with that disabled they're basically the same. Exclusive fullscreen is basically the same as well.

@KainXVIII: Try exclusive fullscreen mode. That + Hard GPU Sync should help.
 
That looks like the first version of my shader, which used linear scaling and no dot mask. The latest version looks pretty different.

Oh, it might be. I've futzed around with my shaders directory a lot and might have accidentally reverted to an older version. I'll check tonight.

Yes, crt-lottes has excellent-looking scanlines at non-integer scales.

Is there a specific benefit to turning integer scaling off?
 
You can fill the entire screen at resolutions that aren't an integer multiple of the original game resolution.
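Concretely (a hypothetical 240-line source on a 1080-line display): integer scaling can only use a whole multiple of the source height, leaving the rest as black borders.

```python
src_lines, screen_lines = 240, 1080

int_scale = screen_lines // src_lines   # largest whole multiple: 4
picture = int_scale * src_lines         # 960 lines of picture
borders = screen_lines - picture        # 120 lines of black bars
free_scale = screen_lines / src_lines   # 4.5x fills the screen with it off

print(int_scale, picture, borders, free_scale)
```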

I meant, is there any way it's specifically better for the CRT shaders? I think I turned on integer scaling up front and never turned it off, so it didn't even occur to me that I should try with it off.
 

Radius4

Member
Some scanline shaders look uneven without it. It depends on your screen res; on my QHD monitor everything looks good with non-integer scaling, but I think it might be because many systems scale perfectly to this vertical res.
 
How come you didn't include crt-geom? Along with easymode, these two are the only shaders that have correct scaling for scanlines without the need for integer scaling!

That looks like the first version of my shader, which used linear scaling and no dot mask. The latest version looks pretty different.

Alright, I updated my shaders and ran through again on two more games (on two more systems, heh) with a few changes.

The more up-to-date crt-easymode:
http://imgur.com/a/MkIl5

And crt-geom (customized to turn off the curvature which I just personally can't stand):
http://imgur.com/a/Grw72
 

omaroni

Member
Alright, I updated my shaders and ran through again on two more games (on two more systems, heh) with a few changes.

The more up-to-date crt-easymode:

http://imgur.com/a/MkIl5

And crt-geom (customized to turn off the curvature which I just personally can't stand):

http://imgur.com/a/Grw72

Could you please tell me how to turn the curvature off with geom? I tried changing the settings around, but I always end up with a broken black screen.
 

EasyMode

Member
Alright, I updated my shaders and ran through again on two more games (on two more systems, heh) with a few changes.

Sweet. Thanks for taking the time to do all that.

Could you please tell me how to turn the curvature off with geom? I tried changing the settings around, but I always end up with a broken black screen.

Are you using a recent build? Go under shader parameters and set Curvature Toggle and Corner Size to 0.
 
Sweet. Thanks for taking the time to do all that.

Thanks for taking the time to keep improving the shader!

Are you using a recent build? Go under shader parameters and set Curvature Toggle and Corner Size to 0.

Or change the first number in lines like

#pragma parameter CURVATURE "CRTGeom Curvature Toggle" 1.0 0.0 1.0 1.0

to 0 in the actual shader files.
 

Lettuce

Member
Anyone on here know how to compile the VBA-M core for Linux? I have tried the following commands...

export CFLAGS="-mcpu=cortex-a7 -mfpu=neon-vfpv4"
git clone https://github.com/libretro/vbam-libretro.git
cd vba-libretro
make -f Makefile.libretro platform="armv neon hardfloat" -j4

but I'm getting an error message...

make: Makefile.libretro: No such file or directory
make: *** No rule to make target 'Makefile.libretro'. Stop.

Any ideas?
 
Doesn't look so bad to me. It seems some of the scanlines are applied to the wrong lines in parts, or maybe it's just me.

Yes, you're right. I need to add them via an image over the video; in that test they were an effect on the video.

The effect you mention is just alternating vertical lines tinted green and magenta (255, 0, 255), or red, green, and blue.

I imagine applying filters to video capture would have considerably more input lag than RetroArch, though. Have you tried turning on GPU Hard Sync in RA's video settings? It reduces input lag, but is more demanding.

It's already pretty optimized. You can get a performance boost by setting #define ENABLE_LANCZOS 0, but the scaling might not look as nice. I don't have a Pi, can you tell me if any CRT shaders run full speed? crt-hyllian-lq is the fastest one, IIRC.

Thanks for the reply!
I'm not actually playing with the video effects on; I'm playing on a CRT TV with the video split off into a SCART-to-HDMI scaler. That gives me a video that looks "normal", not like you're seeing it on a TV screen, so the stuff I'm doing is post-processing in Premiere to give it the look that shaders mimic.

No input lag for me, as I'm playing on original hardware on a normal TV. Make sense?
http://imgur.com/a/dQOku#0

So what I'm looking for are tips on how to recreate the TV / shader effect on video.
 

Radius4

Member
Anyone on here know how to compile the VBA-M core for Linux? I have tried the following commands...

export CFLAGS="-mcpu=cortex-a7 -mfpu=neon-vfpv4"
git clone https://github.com/libretro/vbam-libretro.git
cd vba-libretro
make -f Makefile.libretro platform="armv neon hardfloat" -j4

but I'm getting an error message...

make: Makefile.libretro: No such file or directory
make: *** No rule to make target 'Makefile.libretro'. Stop.

Any ideas?

Well, that means there is no Makefile.libretro in that folder...

cd src/libretro
make platform="armv neon hardfloat" -j4
 