
Agni's Philosophy runs at 60FPS on a GTX 680, uses 1.8GB VRAM. Can next-gen run it?

Krilekk

Banned
And yet we have seen games like Halo 4, TLoU, Uncharted 3, GoWA etc. on consoles that are over 7 years old, with comparatively meagre hardware resources. Console optimization is not a myth if you understand its basic principles, at which point it stops being "magical". Of course, there are obvious limits to it, and perhaps the resolution, framerate and AA options you have stated are optimistic as well, but that does not remove the need to understand why the term "console optimization" exists and why and how it works (even superficially).

Console optimization this gen was basically to limit FOV to ridiculously low levels, make games as linear as possible and throw away any AI achievements we had.
 

EvB

Member
Console optimization this gen was basically to limit FOV to ridiculously low levels, make games as linear as possible and throw away any AI achievements we had.

Not the old field of view thing again?!

Do you have giant wide-angle lenses attached to all of your windows?
 

Massa

Member
But using a different AA method that provides a different output is not optimization. Software optimization is about using different methods to achieve the same output.

That's not always the case. Optimization is about being more efficient for the specific case you're targeting, and often that involves different output.
 
... 32GB of RAM?

I'm not a spec boffin, but there's no way in hell it'll run in its current form on the Orbis, Durango, or Wii U.

The Wii U is outclassed by that PC by a hell of a lot, and even the Orbis and Durango are a couple of generations behind it.
 

Durante

Member
Well, you probably know it better than me, but I'd like to point out that that's actually how most "optimizations" work, even outside the cosmetic department.
I wouldn't say that. In general, "software optimizations" are transformations that maintain semantics while improving non-functional parameters (e.g. reducing execution time).
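To put that concretely, here's a minimal sketch (hypothetical C++, not taken from any game discussed here) of an optimization in that sense: both functions return exactly the same result, the second just hoists a loop-invariant computation out of the loop so it does less work.

#include <cmath>
#include <vector>

// Naive version: recomputes the same square root on every iteration.
double sumScaledNaive(const std::vector<double>& values, double base) {
    double sum = 0.0;
    for (double v : values)
        sum += v * std::sqrt(base);   // sqrt(base) never changes inside the loop
    return sum;
}

// Optimized version: identical output, far fewer sqrt calls.
double sumScaledFast(const std::vector<double>& values, double base) {
    const double scale = std::sqrt(base);  // loop-invariant work hoisted out
    double sum = 0.0;
    for (double v : values)
        sum += v * scale;
    return sum;
}

Same semantics, better non-functional behaviour; lowering output quality to gain speed is a different kind of trade.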

Can we use the word 'compromization' then?
I'd be fine with that :p

This is pretty much nonsense. It provides the same output. AA is AA.
Are you trying to imply that e.g. 4xSGSSAA and FXAA are functionally equivalent?
 
Absolutely not. "Console optimization", in the true sense of the word, means that you were able to achieve the same image quality with fewer CPU and GPU resources by coding for a specific configuration, therefore optimizing your code and getting more out of existing hardware.

Lowering image quality in order to achieve a better framerate is not optimization; it's the same thing PC gamers have been doing for decades by adjusting graphics settings! If I play Deus Ex: Human Revolution and use FXAA instead of the more taxing (and better-looking) MSAA, does that mean that the increased framerate is due to "PC optimizations"? No one can possibly think that.

Lol, it's not lowering IQ. AA at its base is an enhancement. The natural rendering pipeline produces aliasing. So to combat that, developers implement an AA algorithm into the pipe. There's no standard algorithm. There are dozens to choose from. GOW3 happened to optimize one based on MLAA and tweak it to their needs. The end.

There's no such thing as "console optimization", it's just optimization. There are various paths you can optimize on, but it's all one general concept.

In computer science, program optimization or software optimization is the process of modifying a software system to make some aspect of it work more efficiently or use fewer resources.[1] In general, a computer program may be optimized so that it executes more rapidly, or is capable of operating with less memory storage or other resources, or draw less power.
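As a toy illustration of that definition (hypothetical C++, nothing to do with Luminous): both functions below return exactly the same answer; one is tuned for minimal memory use, the other trades a small lookup table for speed.

#include <array>
#include <cstdint>

// Memory-light version: no tables, just walks the bits.
int popcountSmall(std::uint32_t x) {
    int n = 0;
    while (x) { n += x & 1u; x >>= 1; }
    return n;
}

// Speed-tuned version: spends 256 bytes on a table to do 4 lookups
// instead of up to 32 loop iterations.
int popcountFast(std::uint32_t x) {
    static const std::array<std::uint8_t, 256> table = []() -> std::array<std::uint8_t, 256> {
        std::array<std::uint8_t, 256> t{};
        for (int i = 0; i < 256; ++i)
            t[i] = static_cast<std::uint8_t>(popcountSmall(static_cast<std::uint32_t>(i)));
        return t;
    }();
    return table[x & 0xFF] + table[(x >> 8) & 0xFF]
         + table[(x >> 16) & 0xFF] + table[x >> 24];
}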
 
Hair has never had so much beauty...

Did we decide whether or not this is possible on consoles?

There would probably be trade-offs. I would think so, though. Especially in a turn-based RPG, which wouldn't benefit that much from 60FPS, for example.

Square will probably spend a lot of money and wow us with a couple of games.
 

LastNac

Member
There would probably be trade-offs. I would think so, though. Especially in a turn-based RPG, which wouldn't benefit that much from 60FPS, for example.

Square will probably spend a lot of money and wow us with a couple of games.

Does everything next gen really need to be 60FPS anyway?
 

TronLight

Everybody is Mikkelsexual
Are these new?

http://www.mmoculture.com/2013/01/luminous-studio-new-tech-demos-get-all-hairy/


Why doesn't SQEX want to license this? :(
It'll end up like Crystal Tools, used by four titles through the whole generation.
 
Why doesn't SQEX want to license this? :(
It'll end up like Crystal Tools, used by four titles through the whole generation.

Crystal Tools was an unmitigated disaster - that's why it was used so little. I'm sure we'll see FF/Japanese games use it, but probably also some Eidos studios if this turns out well.

The big question as ever with this engine is how much will be lost once AI etc is running over the top; this stuff alone is exciting but has a ton of caveats attached.
 
You can't have a 680 and sell the console at a $400 price point and make a profit. The card alone retails for $450.

Sure, if you buy ONE...
If you buy, say, 500,000 graphics cards paying CASH from Intel/ATI, I doubt the price will be even REMOTELY close to retail :)
I'd cut it to 2/3 max, possibly even less :p
 

Durante

Member
Again, AA is an enhancement. It's not part of the rendering pipeline, it is added.
I'm sorry, that's wrong. Real anti-aliasing absolutely is part of the rendering pipeline. The only "AA" methods that are added afterwards are pure post-processing solutions, with some inherent issues.

And even if it were "added", how does this alter my point of inferior AA methods affecting IQ?
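To sketch the difference in plain OpenGL terms (illustrative only; drawScene, fxaaShader and sceneColorTex are hypothetical placeholders, but the GL calls themselves are standard): MSAA is requested as part of the framebuffer and rasterizer state the scene is rendered into, while a post-process filter like FXAA is an extra full-screen pass run over the already-finished image.

#include <GL/glew.h>   // assumes an extension loader and a current GL context

// Hypothetical inputs assumed to exist elsewhere: a scene draw call,
// a compiled FXAA shader program, and the colour texture of the finished frame.
void renderMsaaVersusFxaa(int width, int height,
                          GLuint fxaaShader, GLuint sceneColorTex) {
    // --- MSAA: baked into the render target / rasterizer state the scene is drawn with ---
    GLuint msaaColor = 0, msaaFbo = 0;
    glGenRenderbuffers(1, &msaaColor);
    glBindRenderbuffer(GL_RENDERBUFFER, msaaColor);
    glRenderbufferStorageMultisample(GL_RENDERBUFFER, 4, GL_RGBA8, width, height); // 4x MSAA
    glGenFramebuffers(1, &msaaFbo);
    glBindFramebuffer(GL_FRAMEBUFFER, msaaFbo);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                              GL_RENDERBUFFER, msaaColor);
    glEnable(GL_MULTISAMPLE);
    // drawScene();  // geometry is rasterized here with per-sample coverage

    // --- FXAA-style post-process: a separate full-screen pass over the finished image ---
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glUseProgram(fxaaShader);                     // hypothetical full-screen FXAA program
    glBindTexture(GL_TEXTURE_2D, sceneColorTex);  // the already-rendered (aliased) frame
    glDrawArrays(GL_TRIANGLES, 0, 3);             // one big triangle covering the screen
}

The first block changes how the geometry is sampled; the second only filters pixels after the fact, which is exactly why the two can produce visibly different results.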
 

TronLight

Everybody is Mikkelsexual
Crystal Tools was an unmitigated disaster - that's why it was used so little. I'm sure we'll see FF/Japanese games use it, but probably also some Eidos studios if this turns out well.

The big question as ever with this engine is how much will be lost once AI etc is running over the top; this stuff alone is exciting but has a ton of caveats attached.

Let's hope for the best then. :D
 
I have a pretty average 560Ti and an older i7 870, and I generally play everything at 60fps, 1920x1200 or 1680x1050. Many games I pretty much max out, others require a bit of tuning.
 
I'm sorry, that's wrong. Real anti-aliasing absolutely is part of the rendering pipeline. The only "AA" methods that are added afterwards are pure post-processing solutions, with some inherent issues.

And even if it were "added", how does this alter my point of inferior AA methods affecting IQ?

SIGH.

Please tell me which part of a standard vanilla DX or OpenGL pipe AA resides in. Don't worry, I'll wait.
 

Kagari

Crystal Bearer
They should probably fully move Versus XIII to this engine at this point. They had the lighting technology as of the end of 2011 anyway.
 

Durante

Member
SIGH.

Please tell me which part of a standard vanilla DX or OpenGL pipe AA resides in. Don't worry, I'll wait.
Why would you be talking about the rendering pipeline in terms of APIs? That just obfuscates the issue. In any case, I'm not your 3D programming teacher.

You don't seem interested in genuine discussion, just in going further and further away from the original issue, which is your (plainly incorrect) statement that choosing a less effective AA implementation does not lower IQ. Do you still maintain that this is the case?
 

Eideka

Banned
WAT?

Do current High-End PC games run in 60fps? On PCs that are not Ultra Extreme High-End?

As long as you have the right hardware, pretty much any game can run at 60fps. For obvious reasons, you will need some serious horsepower to run the best-looking games at 60fps.
 
WAT?

Do current High-End PC games run in 60fps? On PCs that are not Ultra Extreme High-End?

For sure. It all depends on what you want it to look like. If we wanted our games to run at 200fps, we would turn it down to less than 720p with no AA and shitty textures. So we up the ante till we get to an fps that is good enough to play but pretty enough to jack off to.
 

i-Lo

Member
They should probably fully move Versus XIII to this engine at this point. They had the lighting technology as of the end of 2011 anyway.

Provided they haven't scrapped that project yet. I imagine it would be a daunting task to update the numerous assets to next-gen fidelity. It is the gift that keeps on giving.
 
As long as you have the right hardware, pretty much any game can run at 60fps. For obvious reasons, you will need some serious horsepower to run the best-looking games at 60fps.

Yeah, I know, but he was talking about "current High-End PC games", which in my understanding means top of the line (like Crysis 3, for example, or Assassin's Creed 3). You don't, or only barely, get 60fps on machines that are equal to what we know about the next-gen consoles.

So the next-gen consoles might still have some tricks up their sleeves, but I do not expect many games to run at 60fps.
 

Reiko

Banned
Yeah, I know, but he was talking about "current High-End PC games", which in my understanding means top of the line (like Crysis 3, for example, or Assassin's Creed 3). You don't, or only barely, get 60fps on machines that are equal to what we know about the next-gen consoles.

So the next-gen consoles might still have some tricks up their sleeves, but I do not expect many games to run at 60fps.

If they make the game with 60fps in mind, like they did in the DC/PS2/XBOX/GC era, then it's possible.
 

Waaghals

Member
I got 60fps at 1080p in Crysis 2 using maximum details last year on my 570.
Upper mid-range hardware is enough for 1080p @ 60fps in most games, even with ultra details.
 

Reiko

Banned
I got 60fps at 1080p in Crysis 2 using maximum details last year on my 570.
Upper mid-range hardware is enough for 1080p @ 60fps in most games, even with ultra details.

Crysis 2, despite how good it looks, is last gen.

The graphical ceiling goes up with next-generation games; what was easy to run at 60fps before won't be so easy without a serious hardware upgrade.
 
Why would you be talking about the rendering pipeline in terms of APIs? That just obfuscates the issue. In any case, I'm not your 3D programming teacher.

You don't seem interested in genuine discussion, just in going further and further away from the original issue, which is your (plainly incorrect) statement that choosing a less effective AA implementation does not lower IQ. Do you still maintain that this is the case?

Because some 98% of computer graphics reside in these APIs, in some form? The rendering pipeline resides mostly in the APIs, Mr. 3D Programming Professor. There are distinct fundamental differences between OGL and D3D. That was always the original issue.

As far as the AA thing goes, lower IQ compared to what? Lower IQ than the raw graphics output, or lower IQ than a more robust form of AA?
 

Kagari

Crystal Bearer
Provided they haven't scrapped that project yet. I imagine it would be a daunting task to update the numerous assets to next-gen fidelity. It is the gift that keeps on giving.

I have said this before, but it wouldn't be the first time they've scrapped entire parts of the game. In fact, around March last year Nomura admitted they scrapped some real time work in favor of switching them to CGI scenes.
 

Durante

Member
I'll stop the "pipeline" tangent here, since clearly you're not interested in learning, and I'm not interested in having my qualifications "cleverly" questioned by you.

As far as the AA thing goes, lower IQ compared to what? Lower IQ than the raw graphics output, or lower IQ than a more robust form of AA?
Of course compared to another AA method.

If I may quote the (utterly wrong, but presented with much conviction, like most of your posts) original statement that sparked the debate:
iamshadowlark said:
This is pretty much nonsense. It provides the same output. AA is AA.
 

i-Lo

Member
I have said this before, but it wouldn't be the first time they've scrapped entire parts of the game. In fact, around March last year Nomura admitted they scrapped some real time work in favor of switching them to CGI scenes.

Ah, I was unaware of this. Thanks. It will be intriguing nonetheless to see whether the first Final Fantasy title for next gen will be Versus or not, and if so, whether it will be based on the Luminous Engine or one of the external engines they have licensed.
 
Yeah, I know, but he was talking about "current High-End PC games", which in my understanding means top of the line (like Crysis 3, for example, or Assassin's Creed 3). You don't, or only barely, get 60fps on machines that are equal to what we know about the next-gen consoles.

So the next-gen consoles might still have some tricks up their sleeves, but I do not expect many games to run at 60fps.
I don't expect many games to run at 60fps either, but I think your comparison to PCs is irrelevant. If they can do really pretty things at 30fps and responsiveness is a slightly lower priority (most action games, most RPGs, most 'cinematic experience' games), they will do 30. If responsiveness is a higher priority (sims, most competitive shooters, most sports games), they'll do 60. Simple as that.
 
I'll stop the "pipeline" tangent here, since clearly you're not interested in learning, and I'm not interested in having my qualifications "cleverly" questioned by you.

Of course compared to another AA method.

If I may quote the (utterly wrong, but presented with much conviction, like most of your posts) original statement that sparked the debate:

Maybe I should have said a little more than "it's the same output", but I think I laid my point out here:

Lol, it's not lowering IQ. AA at its base is an enhancement. The natural rendering pipeline produces aliasing. So to combat that, developers implement an AA algorithm into the pipe. There's no standard algorithm. There are dozens to choose from. GOW3 happened to optimize one based on MLAA and tweak it to their needs. The end.
 