
Agni's Philosophy runs at 60FPS on a GTX 680, uses 1.8GB VRAM. Can next-gen run it?

Spongebob

Banned
To be fair, that FFXIII trailer was a target render of what they believed they could achieve with PS3. Then the PS3 hardware didn't quite turn out to be what was originally intended, and they couldn't quite get there (not that they necessarily would have even if cuts hadn't been made to the PS3).

This is stuff that is running real-time on actual hardware that should be about as powerful as the next-gen consoles.

NO!

High end PC CPU vs 8 jaguar cores (tablet level cpu cores)

GTX680 vs 7770-7850

I'm really hoping my i5 3570K and 7970 GHz Edition will give me max settings for a couple more years. I spent a lot of cash on this build. It could at least run Agni's Philosophy, so that's something.

Anyway, I am confident the other next-gen consoles will be able to get at least to Agni's Philosophy level. I don't see why they wouldn't be able to; never underestimate a closed system.

You are set for next-gen.
 

Famassu

Member
NO!

High end PC CPU vs 8 jaguar cores (tablet level cpu cores)

GTX680 vs 7770-7850
Yeah, I didn't properly think that sentence through, of course they aren't the same, power-wise. What I basically meant was the optimization part in my next post. Once properly optimized, Orbis and/or Durango should be able to run this, or at least that's what Square Enix believes. Maybe not @ 60fps, but perhaps 30fps.
 
1440x1080p 30fps and with SMAA T2x or S4x. And that's being hopeful based on what Spongebob said. Magical console optimization is just a myth!
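For scale, here's a quick sketch of the pixel-count math behind that guess, assuming a 1440x1080 framebuffer scaled out to a 1920x1080 display (illustrative arithmetic only, not anything Square Enix has stated):

```python
# Pixel counts for the resolutions being discussed.
# 1440x1080 is a common sub-native framebuffer scaled to full HD output.
full_hd = 1920 * 1080      # native 1080p
sub_native = 1440 * 1080   # the guessed framebuffer

print(full_hd)              # 2073600
print(sub_native)           # 1555200
print(sub_native / full_hd) # 0.75 -> 25% fewer pixels to shade
```

So the guess amounts to shading three quarters of a full 1080p frame each refresh, with the AA pass cleaning up the upscale.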
 
Those videos are from the new 4gamer article


http://www.4gamer.net/games/032/G003263/20130112002/
 

i-Lo

Member
1440x1080p 30fps and with SMAA T2x or S4x. And that's being hopeful based on what Spongebob said. Magical console optimization is just a myth!

And yet we have seen games like Halo 4, TLoU, Uncharted 3, GoWA etc. on consoles that are over 7 years old, with comparatively meagre hardware resources. Console optimization is not a myth if you can understand its basic principles, at which point it stops being "magical". Of course, there are obvious limits to it, and perhaps the resolution, framerate and AA options you have stated are optimistic as well, but that doesn't remove the need to understand why the term "console optimization" exists and why and how it works (even superficially).
 

SapientWolf

Trucker Sexologist
And yet we have seen games like Halo 4, TLoU, Uncharted 3, GoWA etc. on consoles that are over 7 years old, with comparatively meagre hardware resources. Console optimization is not a myth if you can understand its basic principles, at which point it stops being "magical". Of course, there are obvious limits to it, and perhaps the resolution, framerate and AA options you have stated are optimistic as well, but that doesn't remove the need to understand why the term "console optimization" exists and why and how it works (even superficially).
It's not a myth, but IQ is one of the first things to be optimized out. Which is why I roll my eyes whenever I hear someone say that 1080p/60fps will be the standard next gen.
 

nib95

Banned
1440x1080p 30fps and with SMAA T2x or S4x. And that's being hopeful based on what Spongebob said. Magical console optimization is just a myth!

Lol. Yeah, sure. You try playing Crysis 1 or 2 using a 7800 GT (PS3) or an X1800 XT (360) and get back to me. Hint: even at medium or low settings, at 720p or lower resolutions, you'll be getting awful frame rates.

http://www.youtube.com/watch?v=klgVCk178OI


I've read forum posts about people using a 7800 GTX with Crysis not even breaking 20fps on low settings at 1024.

http://forums.anandtech.com/showthread.php?t=148590


Yet on consoles we have this.

http://www.eurogamer.net/articles/digitalfoundry-crysis-face-off
 

SapientWolf

Trucker Sexologist
Lol. Yeah, sure. You try playing Crysis 1 or 2 using a 7800 GT (PS3) or an X1800 XT (360) and get back to me. Hint: even at medium or low settings, at 720p or lower resolutions, you'll be getting awful frame rates.

http://www.youtube.com/watch?v=klgVCk178OI


I've read forum posts about people using a 7800 GTX with Crysis not even breaking 20fps on low settings at 1024.

http://forums.anandtech.com/showthread.php?t=148590


Yet on consoles we have this.

http://www.eurogamer.net/articles/digitalfoundry-crysis-face-off
The X1800 XT is actually pretty close to the X1950 Pro. It just had a weird paper launch at the end of '05, so no one really bought it.

Here's the x1950 playing Crysis 2.

http://www.youtube.com/watch?v=jHWPGmf_A_0

360 in comparison:

http://www.youtube.com/watch?v=kF67XFXq_ys

Console Crysis is based on the amazingly optimized CryEngine 3 instead of the horribly optimized engine they used for Crysis.
 

Vaporak

Member
Lol. Yeah, sure. You try playing Crysis 1 or 2 using a 7800 GT (PS3) or an X1800 XT (360) and get back to me. Hint: even at medium or low settings, at 720p or lower resolutions, you'll be getting awful frame rates.

http://www.youtube.com/watch?v=klgVCk178OI


I've read forum posts about people using a 7800 GTX with Crysis not even breaking 20fps on low settings at 1024.

http://forums.anandtech.com/showthread.php?t=148590


Yet on consoles we have this.

http://www.eurogamer.net/articles/digitalfoundry-crysis-face-off

Here's Crysis 2 on an X1950 Pro, which is similar in performance to an X1800 XT. I'm not really noticing it being blown away by the Xbox build of the game. Console "optimization" mostly takes the form of lowered RAM usage and paring back graphical features to a "good enough" level, not actual optimization, which requires identical output. The fact of the matter is that old PCs with tech from that era can handle multi-platform games just fine. The console optimization myth is the idea that you need significantly more powerful PC hardware to get the same results the consoles get. It's just an internet meme, and even some people who should know better fall for it.
 

nib95

Banned
Here's Crysis 2 on an X1950 Pro, which is similar in performance to an X1800 XT.

Not exactly; it has more than double the pixel pipelines. But who knows, perhaps despite that they're quite similar?

X1800 XT
600MHz
1500MHz GDDR3 on a 256-bit bus
16 pixel pipelines
8 vertex pipelines

X1950 Pro
580MHz
1400MHz GDDR3 on a 256-bit bus
36 pixel pipelines
8 vertex pipelines
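Taking those listed specs at face value, here's a rough sketch of theoretical pixel-shading throughput (clock x pixel pipelines; a deliberate oversimplification that ignores texture units, memory bandwidth and architecture differences):

```python
def gops_per_sec(core_mhz, pixel_pipes):
    """Naive theoretical shading rate: clock * pipelines, in Gops/s."""
    return core_mhz * 1e6 * pixel_pipes / 1e9

x1800_xt = gops_per_sec(600, 16)   # ~9.6 Gops/s
x1950_pro = gops_per_sec(580, 36)  # ~20.9 Gops/s

print(x1800_xt, x1950_pro, x1950_pro / x1800_xt)  # ratio is roughly 2.2x
```

By this crude measure the X1950 Pro has over twice the shading grunt, which is exactly why the "quite similar" claim is worth questioning.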
 

Vaporak

Member
Not exactly; it has more than double the pixel pipelines. But who knows, perhaps despite that they're quite similar?

X1800 XT
600MHz
1500MHz GDDR3 on a 256-bit bus
16 pixel pipelines
8 vertex pipelines

X1950 Pro
580MHz
1400MHz GDDR3 on a 256-bit bus
36 pixel pipelines
8 vertex pipelines

Yes, that's basically the only architectural change. In the X1800 there is a 1:1 ratio between texture units and pixel shading units. For the X1900 line that changed to more than one pixel shading unit per texture unit. Beyond that they are almost identical, except for being manufactured on different processes.
 

Durante

Member
It's not a myth, but IQ is one of the first things to be optimized out. Which is why I roll my eyes whenever I hear someone say that 1080p/60fps will be the standard next gen.
I really dislike that the flat-out reduction of IQ is often called "optimization". That's not optimization. You just made it render faster by making it render worse.
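To illustrate that strict sense of the word: a true optimization changes the work, not the output. A minimal made-up example (not anyone's actual renderer) is hoisting a constant computation out of a per-pixel loop; the two versions produce bit-identical results:

```python
def box_blur_naive(row):
    # 1D box filter that recomputes the kernel weight for every
    # pixel: wasteful, but correct.
    out = []
    for i in range(1, len(row) - 1):
        weight = 1.0 / 3.0  # recomputed every iteration
        out.append((row[i - 1] + row[i] + row[i + 1]) * weight)
    return out

def box_blur_hoisted(row):
    # Same math, same output; the weight is computed once.
    weight = 1.0 / 3.0
    return [(row[i - 1] + row[i] + row[i + 1]) * weight
            for i in range(1, len(row) - 1)]

row = [10.0, 20.0, 30.0, 40.0, 50.0]
assert box_blur_naive(row) == box_blur_hoisted(row)  # identical output
```

Swapping MSAA for a cheaper AA method fails exactly this test: the frames it produces are different, so it's a quality trade, not an optimization.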
 

Perkel

Banned
Considering SE moved over to using UE4 from what I read. I doubt it :p

And where did you hear that?

I really dislike that the flat-out reduction of IQ is often called "optimization". That's not optimization. You just made it render faster by making it render worse.

Often, yes, but it isn't a lie. God of War III is an example of that: the developers optimized IQ by using MLAA instead of MSAA. If I remember correctly this gave them almost a 60% advantage over MSAA, and the results were far above 4xMSAA in some places.

Optimization is a vague term, but it can apply to things like I described above. Yes, reducing framerate or resolution is not optimization.

edit: completely misread your post.
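A hedged back-of-envelope look at why a post-process AA like MLAA can be so much cheaper than MSAA (illustrative numbers only; the real GoW III figures aren't in this thread). MSAA has to store and resolve extra color/depth samples per pixel, while MLAA is roughly one full-screen edge-detect-and-blend pass over the finished image:

```python
# Rough sample-traffic comparison at a 720p framebuffer.
w, h = 1280, 720
msaa_factor = 4

msaa_samples = w * h * msaa_factor  # samples to store and resolve with 4xMSAA
mlaa_pixels = w * h                 # pixels touched by the post-process pass

print(msaa_samples)                 # 3686400
print(mlaa_pixels)                  # 921600
print(msaa_samples // mlaa_pixels)  # 4x the sample traffic, at minimum
```

The catch, as pointed out above, is that the two techniques don't produce the same image, so whether this counts as "optimization" is exactly the argument here.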
 

Vaporak

Member
Often, yes, but it isn't a lie. God of War III is an example of that: the developers optimized IQ by using MLAA instead of MSAA. If I remember correctly this gave them almost a 60% advantage over MSAA, and the results were far above 4xMSAA in some places.

Optimization is a vague term, but it can apply to things like I described above. Yes, reducing framerate or resolution is not optimization.

edit: completely misread your post.

That's not what the word optimization means in a programming context. It's only actually optimized rendering when you compare it to a different renderer with identical output. That is factually not the case with, for example, MLAA vs MSAA. This is an excellent example of the console optimization myth.
 

Perkel

Banned
That's not what the word optimization means in a programming context. It's only actually optimized rendering when you compare it to a different renderer with identical output. That is factually not the case with, for example, MLAA vs MSAA. This is an excellent example of the console optimization myth.

I know, but the word gets used for both macro and micro situations. You described the micro case, optimizing a single program or piece of code; I gave a macro example, optimizing the AA output, where the choice of AA technique is part of the optimization.
 
That's not what the word optimization means in a programming context. It's only actually optimized rendering when you compare it to a different renderer with identical output. That is factually not the case with, for example, MLAA vs MSAA. This is an excellent example of the console optimization myth.
Rendering is programming. I would consider using a more efficient AA over another to be optimizing, and I'm sure plenty of devs agree.
 

RoboPlato

I'd be in the dick
Keep in mind, when talking about Agni on consoles, that the demo was running at 60fps with 8x MSAA and FXAA. 30fps and a reduction to 4x MSAA would still look awesome, and should be easily doable next gen at 1080p.
 

Vaporak

Member
Rendering is programming. I would consider using a more effecient AA over another optimizing and I'm sure plenty of devs agree.

It can't be more efficient if it's doing something different; that's the point. You should just be up front about it and say that you're using a less taxing rendering technique with lower fidelity, because hardware limitations keep you from using a more taxing one.
 

SapientWolf

Trucker Sexologist
It can't be more efficient if it's doing something different; that's the point. You should just be up front about it and say that you're using a less taxing rendering technique with lower fidelity, because hardware limitations keep you from using a more taxing one.
Welcome to real time graphics programming.
 
It can't be more efficient if it's doing something different; that's the point. You should just be up front about it and say that you're using a less taxing rendering technique with lower fidelity, because hardware limitations keep you from using a more taxing one.

I think you're reaching way too far into semantics to try and support your point. AA has a very basic purpose, the type of AA used is just a different way to the same goal.
 
I wonder if the CPU requirements will change significantly when the effects are being done dynamically, based on actual gameplay? It makes sense to me that the demo isn't CPU intensive, since nothing is being done dynamically and all of the special effects are calculated in advance; the big issue is rendering it and maintaining the image quality...
 

Perkel

Banned
It can't be more efficient if it's doing something different; that's the point. You should just be up front about it and say that you're using a less taxing rendering technique with lower fidelity, because hardware limitations keep you from using a more taxing one.

Macro and micro. At the micro scale, your goal is better performance or a better result from a particular technique, like MSAA. At the macro level, your goal is better performance or a better result from AA as a whole. The techniques used to achieve that goal are the optimization; the end goal defines the optimization, not the change itself.

GoW III's AA was a case of optimization. Their goal was better and faster AA, and that was part of this so-called "magical" optimization. They had the same hardware to work with, but their game looked crystal clear, with the absolute best IQ this gen.
 

Krabardaf

Member
Yeah, I'm not following you either, Vaporak. AA is AA. Some solutions have better visual fidelity, others better performance, but the various techniques aren't fundamentally "doing something different".

Also, this :
Welcome to real time graphics programming.

Realtime rendering has always been about tricks and faking things, more so than any other kind of digital imaging.

edit :
To answer the OP: yes, we'll see similar results on next gen, though maybe not at 60FPS. Remember, they aren't doing a benchmark; these demos serve a purpose and feature technology they plan to use.
 

Trickster

Member
And where did you hear that ?

Pretty sure it was via a topic on NeoGAF.

Anyway, just google Square Enix Unreal Engine 4 and you will get lots of links to articles like this one right -> here

And someone has already pointed out that licensing the engine doesn't mean they will use it for all their games.
 

Perkel

Banned
Pretty sure it was via a topic on NeoGAF.

Anyway, just google Square Enix Unreal Engine 4 and you will get lots of links to articles like this one right -> here

And someone has already pointed out that licensing the engine doesn't mean they will use it for all their games.

Then why did you post that? Licensing is a common thing, especially since they have Eidos under their wing now. It doesn't really mean that much, and it would be totally idiotic to use someone else's tech (as their main engine) when they have their own, which by the looks of it is amazing.
 

Trickster

Member
Then why did you post that? Licensing is a common thing, especially since they have Eidos under their wing now. It doesn't really mean that much, and it would be totally idiotic to use someone else's tech (as their main engine) when they have their own, which by the looks of it is amazing.

Because the guy pointing it out was responding to my original post.

My post wasn't really meant to stir up discussion. It was probably a bit incorrect, let's just leave it there.
 

Pooya

Member
I don't know, that hair doesn't look realistic exactly; it looks "cool" though.

It doesn't give the impression of individual hair strands moving. Maybe the textures they are using for the beard aren't the best.
 
I think you're reaching way too far into semantics to try and support your point. AA has a very basic purpose, the type of AA used is just a different way to the same goal.

Absolutely not. "Console optimization", in the true sense of the word, means that you were able to achieve the same image quality with fewer CPU and GPU resources by coding for a specific configuration, therefore optimizing your code and getting more out of the existing hardware.

Lowering image quality in order to achieve a better framerate is not optimization; it's the same thing PC gamers have been doing for decades by adjusting graphics settings! If I play Deus Ex: Human Revolution and I use FXAA instead of the more taxing (and better-looking) MSAA, does that mean the increased framerate is due to "PC optimizations"? No one can possibly think that.
 

Durante

Member
Yeah, I'm not following you either, Vaporak. AA is AA. Some solutions have better visual fidelity, others better performance, but the various techniques aren't fundamentally "doing something different".
But using a different AA method, that provides a different output, is not optimization. Software optimization is about using different methods to achieve the same output.
 

MaLDo

Member
But using a different AA method, that provides a different output, is not optimization. Software optimization is about using different methods to achieve the same output.


Clever coding, then. Sometimes you can achieve 90% of the result with 50% of the power. If you can keep the missing 10% away from the player's eyes, it's a good trade overall.

Object LODs are an "optimization" (let's call it a "trick"), although you're not seeing the same result as you would without LODs.
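A minimal sketch of that kind of trick, distance-based LOD selection (the thresholds and mesh names here are made up for illustration):

```python
# Pick a mesh level-of-detail by camera distance: cheaper meshes are used
# far away, where the difference is hard to see. Thresholds are arbitrary.
LOD_THRESHOLDS = [
    (10.0, "lod0_full"),     # close up: full-detail mesh
    (50.0, "lod1_half"),     # mid range: half the triangles
    (200.0, "lod2_quarter"), # far: quarter detail
]

def select_lod(distance):
    for max_dist, mesh in LOD_THRESHOLDS:
        if distance <= max_dist:
            return mesh
    return "lod3_billboard"  # beyond all thresholds: flat impostor

print(select_lod(5.0))    # lod0_full
print(select_lod(120.0))  # lod2_quarter
print(select_lod(500.0))  # lod3_billboard
```

The output visibly differs from always rendering the full mesh, which is exactly why it's a trick rather than an optimization in the strict sense.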
 

MaLDo

Member
It may be a good tradeoff, sure, but it's not optimization.

Yes, I've edited my message to use the word "trick". Usually, a lot of developers working without in-house tech are used to optimizing the final result with those kinds of tricks. The two ways of optimizing are compatible: the strictly programmer-side kind, with a fixed target output, and the "partly on the artists' shoulders" optimization tricks.
 

Sentenza

Member
I really dislike that the flat-out reduction of IQ is often called "optimization". That's not optimization. You just made it render faster by making it render worse.
Well, you probably know this better than me, but I'd like to point out that this is actually how most "optimizations" work, even outside the cosmetic department.
An "optimized physics engine" is usually an engine that makes more approximations and is less accurate, but works faster.
 

kinggroin

Banned
I really dislike that the flat-out reduction of IQ is often called "optimization". That's not optimization. You just made it render faster by making it render worse.


Not to mention it's completely glossed over as some insubstantial option PC gamers have; I'm talking about higher resolutions and IQ-enhancing features.

Simply ignoring the dramatic decrease in image quality and comparing only assets... is, well, pretty damn disingenuous.

I'd love to see that same Crysis video with a PC port on the new engine. Oh, and have the PC version run at 1080p for starters.
 