It's about time studios went back to making their own engines, fuck Unreal.
I don't even know how to respond to a statement so far beyond reason like this.
so, lens flare, DoF and fog are the new bloom and brown.
No, pretty sure dynamic real-time global illumination is the new 'bloom and brown'.
I, for one, welcome our new, dynamically-illuminated overlords!
Really? Because all those games in the OP look the same, and look like garbage in terms of art direction.
Source 2 from Valve is theoretically in the works as well.
4chan: So, tell us about Ricochet 2... is Valve potentially already working on a new engine? Source 2, could or could not be..?
Gabe: We've been working on Valve's new engine stuff for a while ...
4chan: Is it going to be more than an add-on engine for Source? Like, is it an entirely new engine?
Gabe: Yeah.
Real time too...hnnnnng
Frostbite 2
Again, we've already seen current-gen iterations of this engine in games like BF3 and NFS, but we won't truly know its potential until next gen.
-Takes "full advantage of the DirectX 11 API and 64-bit processors"
-Detailed destruction using "Destruction 3.0"
-MLAA; real-time subsurface scattering
-"Quasi-realtime radiosity using Enlighten from Geomerics"
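For anyone curious what "radiosity" actually means computationally, here's a toy sketch. This has nothing to do with Enlighten's actual implementation; the patches, emission, reflectance, and form-factor numbers are all invented for illustration. The core idea is just iterating the bounce equation B = E + rho * (F @ B) until the light settles:

```python
import numpy as np

# Toy radiosity solver: B = E + rho * (F @ B), iterated to convergence.
# E = emitted light, rho = surface reflectance, F = form factors
# (how much each patch "sees" of every other patch). All values here
# are made up; real engines precompute F from scene geometry.
E = np.array([1.0, 0.0, 0.0])          # patch 0 is the light source
rho = np.array([0.0, 0.5, 0.8])        # reflectances of the 3 patches
F = np.array([[0.0, 0.4, 0.4],
              [0.4, 0.0, 0.3],
              [0.4, 0.3, 0.0]])

B = E.copy()
for _ in range(50):                     # Jacobi-style bounce iteration
    B = E + rho * (F @ B)

print(B)  # patches 1 and 2 end up lit purely by indirect light
```

Patches 1 and 2 emit nothing themselves, yet converge to nonzero brightness — that bounced light is exactly what the "dynamic real-time GI" hype in this thread is about.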
Mirror's Edge was UE3, wasn't it?
Source 2 > the rest
*Wonders what the folks at Unity3D are up to.*
Dear Sony, why haven't you released Phyre Engine to the public?
The screenshot in the OP isn't Mirror's Edge itself, but a Mirror's Edge-inspired example (or, if you're particularly optimistic, Mirror's Edge 2) showing real-time radiosity (see here).
Oh right, that demo.
Btw, for next gen, will there be measurable improvements pertaining to:
- Anti-Aliasing
- Anisotropic Filtering
- Low resolution alpha texturing (for reference, look at the wipers: http://www.youtube.com/watch?v=KuViKgCO3PA)
- Quality of real time shadows cast
But surely with no AI or 'real' game code running, right? I mean, it's impressive, but it's not under the same pressures (may be the wrong word) as a full game.
To be frank, those things are so insignificant relative to the graphics power that they're immaterial. If you were to measure out what takes the most raw amps to run, the visuals are by far the biggest culprit.
- Anti-Aliasing
This is the biggest one for me. Aliasing kills my immersion very, very fast. TXAA and SMAA 2x/4x are looking very promising. Good performance (well, good relative to supersampling) and excellent image quality! I love it.
That's probably what they've been doing behind the scenes.
I think we are approaching the end of the graphics grind. Is this the end of the road, fellas?
Today's games don't look close to CG. We still have a long ways to go.
Actually, fun fact: Today's games look pretty close to the CG of a decade past. All you have to do is apply lots of super sampling (it's the reason those CG movies look so clean - they sample every pixel on the screen dozens of times). Well, that and some motion blur would help. It's amazing how much motion blur helps with the believability of a video game.
Thing is, CG is a moving target. It improves with the technology and research just like video games do. By the very nature of CG (using render farms to render frames that take minutes at a time to render instead of milliseconds), it is literally impossible for our video games to catch up.
There will be a point where we hit returns that are so diminished where you can't even distinguish between a CG clip and real-time gameplay, but that won't happen until we well and truly solve the aliasing problem. And by that I mean actually rendering at 8x our current resolutions.
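The supersampling point above is easy to demonstrate with a toy example. This sketch (the "scene", edge position, and sample counts are all arbitrary) renders a 1D row of pixels across a hard edge, once with 1 sample per pixel and once with 64, the way offline CG averages dozens of sub-samples per pixel:

```python
# Toy demonstration of why supersampling cleans up edges: a 1D "scene"
# with a hard edge at x = 0.37, rendered at 10 pixels wide.
def scene(x):
    return 1.0 if x < 0.37 else 0.0     # hard geometric edge

def render(num_pixels, spp):
    img = []
    for p in range(num_pixels):
        # evenly spaced sub-samples inside pixel p, averaged together
        samples = [scene((p + (s + 0.5) / spp) / num_pixels)
                   for s in range(spp)]
        img.append(sum(samples) / spp)
    return img

aliased = render(10, 1)    # edge pixel snaps to 0 or 1 ("jaggies")
smooth  = render(10, 64)   # edge pixel gets its true partial coverage
print(aliased[3], smooth[3])
```

With 1 sample the pixel straddling the edge comes out pure white; with 64 it lands near its true 70% coverage, which is the smooth gradient film renderers get for free by brute force.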
Out of all the mentioned engines, I am most interested to see an actual game running on the Luminous Engine.
But as a SEGA fan, I can't wait to see what an advanced Hedgehog Engine would look like... I think it's a promising engine and really, really underrated.
And of course I am mostly looking forward to seeing AM2's engine for the next VF game.
Yes, the IQ in the tech demos is beautiful, but you won't see that kind of IQ in the actual games, so don't hype yourselves up. At least not on next-gen consoles.
I'm mainly talking details in assets. Luminous seems very close.
Well, those polygon counts are obviously unreachable due to the distance between real-time and offline rendering. Even so, when it comes to raw detail, we've already matched that and then some.
Toy Story
Actually, it was 8xMSAA + FXAA.
FXAA is cheap for next gen. You're going to get that IQ at least.
AA don't make a game "CGI-like". Ray tracing and a REYES renderer will do... in the future.
Strictly speaking, ray-tracing is not necessary as long as you can do a good enough job of faking it. And today's games do a very good job of 'faking it'.
Many games already use an approximation of ambient occlusion (done entirely in screen space). It's not a huge leap from there to add indirect lighting (hence why all the new engines are focusing on it).
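To make the screen-space AO idea concrete, here's a very rough sketch: darken pixels whose neighbors in the depth buffer are significantly closer to the camera. The depth values, radius, and bias are invented for illustration; real SSAO samples in 3D around the reconstructed view-space position rather than on a flat pixel grid:

```python
import numpy as np

# Tiny depth buffer: a box (depth 2) sitting in front of a wall (depth 5).
depth = np.array([
    [5.0, 5.0, 5.0, 5.0],
    [5.0, 2.0, 2.0, 5.0],
    [5.0, 2.0, 2.0, 5.0],
    [5.0, 5.0, 5.0, 5.0],
])

def ssao(depth, radius=1, bias=0.5):
    """Occlusion = fraction of neighbors noticeably closer than this pixel."""
    h, w = depth.shape
    ao = np.ones((h, w))
    for y in range(h):
        for x in range(w):
            occluded, total = 0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and (dy or dx):
                        total += 1
                        if depth[ny, nx] < depth[y, x] - bias:
                            occluded += 1   # neighbor blocks ambient light
            ao[y, x] = 1.0 - occluded / total
    return ao

print(ssao(depth))  # wall pixels bordering the box come out darkened
```

The wall pixels touching the box get an AO value below 1 (the soft contact shadow you see around objects in SSAO games), while the box's own flat top stays fully lit. Screen-space GI extends the same gather loop to also pull in the neighbors' color, not just their occlusion.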
What I want to see, more than ten million polygons per model or whatever, is better, more natural animations and facial expressions. I've always thought that's THE biggest problem with computer-generated graphics. Characters always have unnatural movements and horrible facial expressions. Now, I know that creating a system that enables technology to emulate living beings is one of the largest problems in computer science, but, impressive as fuck as CG has become, it will always seem to me as if they leave that element of CG unattended and just keep on patching over it.
Some of the best animations I've seen lately are actually in the indie game Overgrowth (still in alpha).