The Star Wars demo wasn't running on 3 GTX 680s; that's been debunked numerous times. -_-
I've been posting about this a few times. I think people often overestimate the performance impact of the OS on a modern gaming PC -- it's negligible.

I have to wonder how much of a benefit the closed system designs are going to be in the next generation.
After all, at this point, with 4.5GHz quad-core CPUs and the absurd power inherent in a GTX 680, the OS occupies a very, very tiny proportion of a high-end PC's total power.
Of course there are more optimizations you can do when you're programming directly for the specific hardware, but it seems to me like the returns from that just don't outweigh the massive differences in power between a console and a high-end PC.
GI is baked, it's not real time -- see "Baked GI" in my post with the screenshot on the last page.
60 FPS on a 680 is really impressive.
I really hope they either (a) manage to keep the IQ high when running it on consoles or (b) make a PC version, since it would be a crime to waste those assets on a shitty IQ render.
I've been posting about this a few times. I think people often overestimate the performance impact of the OS on a modern gaming PC -- it's negligible.
Not sure how this bodes for next gen, considering how long asset creation at that fidelity would take. Even if they could, I don't think they'd go balls-out like they did with that trailer for a 60-hour game.
Running at 60fps on one 680 is crazy. I didn't think it was being run on a single-GPU config.
If that's true, then why are PC games getting such bad performance/graphics? Take games like AC3 or FC3, which, considering Agni's Philosophy runs at 60FPS, should be running at something like 240FPS -- but when you look at benchmarks they're getting around 40FPS, which is worse than what this demo achieves despite it looking 10x better. Considering that, do you really believe that the massive performance benefits of console optimisation are near an end?

60 FPS on a 680 is really impressive.
RPG Site: Do you feel that when this tech starts making it into games there'll still be a place for pre-rendered CGI? This is getting pretty close.
Hashimoto: We think so, yeah. I think there'll be a big technology leap to allow this sort of quality to be used in a real-time game - this is essentially a movie - a cutscene, after all, so CG can still accomplish things we can't. I can express my opinion - but it may be better to ask someone who is responsible for real-time CG that question.
The strength of Square Enix is that we have an excellent team that is very highly capable of creating excellent CG - and also we now have a real-time engine. So using the assets that we already have and combining the two, we are in a very good position to create very interesting games.
RPG Site: So you're saying assets created for CG can be brought straight down into the game, and the reverse, in this new engine?
Hashimoto: Yeah, that's what we're expecting to do. For the backgrounds used in this - the mountains, the houses - we are using exactly the same assets as are used in the Visual Works CG version.
Of course, it's too massive a data set to use in a game as-is, but I think the look and feel will probably remain. If we'd had time, we could've compressed the data even smaller. We didn't have time to do that, so we just used the same master data - but it can definitely be reduced.
RPG Site: Do you think that disc space is going to be an issue, then, even on Blu-ray?
Hashimoto: Yeah, that could be a challenge. There's a possibility that just one Blu-ray may not be sufficient.
RPG Site: Back to the PS1 days!
Hashimoto: [laughs] Yeah. We have to really consider the mechanism of compressing the data carefully.
So if the PS4 is rumored to have a 7970, I assume the X720 is gonna have a 680? Or are they using it because it's the main reference GPU of this generation?

Agni's Philosophy demo - i7, 32gb of ram, GTX 680
UE4 Elemental demo - i7, 16gb of ram, GTX 680
Star Wars 1313 - i7, 16gb of ram, 3x GTX 680 [but they used a modified UE3 and were doing incredible stuff by porting "depth compositing" from offline workstations into a realtime environment]
Watch Dogs - Unknown. Nothing revolutionary in that engine, just more detailed assets [clearly a game designed to run on current-gen hardware, with HQ options for PC/next-gen. Same as BF3.]
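For anyone wondering what "depth compositing" actually means: it merges two separately rendered layers by comparing per-pixel depth and keeping whichever fragment is closer to the camera. A toy sketch of the idea (hypothetical data layout, not the actual 1313 pipeline):

```python
# Depth compositing: merge two render layers per pixel, keeping the
# fragment nearer to the camera (smaller depth value wins).
def depth_composite(colors_a, depths_a, colors_b, depths_b):
    out_color, out_depth = [], []
    for ca, da, cb, db in zip(colors_a, depths_a, colors_b, depths_b):
        if da <= db:              # layer A is closer at this pixel
            out_color.append(ca)
            out_depth.append(da)
        else:                     # layer B is closer
            out_color.append(cb)
            out_depth.append(db)
    return out_color, out_depth

# Two 4-pixel "images": a character layer and a background layer.
char_rgb = ["char0", "char1", "char2", "char3"]
char_z   = [0.2,     0.9,     0.5,     1.0]      # 1.0 = far plane (empty)
bg_rgb   = ["bg0",   "bg1",   "bg2",   "bg3"]
bg_z     = [0.6,     0.4,     0.7,     0.8]

rgb, z = depth_composite(char_rgb, char_z, bg_rgb, bg_z)
print(rgb)  # ['char0', 'bg1', 'char2', 'bg3']
```

The point of doing this offline is that each layer can be rendered with a different tool or at a different quality; bringing it into a realtime engine means the layers interleave correctly in 3D instead of one being pasted flat over the other.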
Not really needed; the Vita has its own CPU/GPU and works well as a cross controller with the PS3. They just need to optimize Remote Play a bit and enable it for all games.

If Sony also rips off Nintendo with a tablet-style controller, maybe the APU can handle the graphics processing for it.
Nothing's magic; I've got one and I'm still getting some fps drops in Hitman.
I feel like PC games are surpassing hardware limitations. SLI was kind of a ridiculous idea before, and now it's very much needed sometimes.
Anyway, if anything I feel that next-gen consoles could reach the equivalent of a 680 at most, really, considering I'm sure they'll stick to 30fps.
If that's true, then why are PC games getting such bad performance/graphics? Take games like AC3 or FC3, which, considering Agni's Philosophy runs at 60FPS, should be running at something like 240FPS -- but when you look at benchmarks they're getting around 40FPS, which is worse than what this demo achieves despite it looking 10x better. Considering that, do you really believe that the massive performance benefits of console optimisation are near an end?
BF3 and Witcher 2 still don't run at 60FPS on a GTX 680, and they don't look anywhere near as good as Agni's.

Because FC3 seems rushed (in terms of optimisation) and AC3 is a badly done console port.
BF3, The Witcher 2 etc look better than those games and run brilliantly on PC hardware.
Hitman is poorly coded for PC; there is nothing in that game that is that impressive ....... it's just a poorly coded console port.
BF3 and Witcher 2 still don't run at 60FPS on a GTX 680 and they don't look anywhere near as good as Agni's.
I'm pretty sure a 670 can't run BF3 60FPS locked with HBAO and 4xMSAA.

BF3 runs at 60fps on a 680 no problem ....... what are you talking about? ....... I have a 670 on stock clocks running at 1080p, ultra, SSAO, vsynced at 60fps.
As does the Witcher 2.
I was referring to you pointing out a badly done console port (AC3) running badly as not being indicative of the ability of current PC hardware.
A well-coded and optimised engine like the one behind Agni can run that well and look that good on a card that has around 2 teraflops of performance.
BF3 and Witcher 2 still don't run at 60FPS on a GTX 680 and they don't look anywhere near as good as Agni's.
I'm pretty sure a 670 can't run BF3 60FPS locked with HBAO and 4xMSAA.
I thought that the opposite was true.

Damn, that's pretty impressively optimized already if true. Not sure if I buy 8xMSAA though... that seems a little nuts for 60fps, even on that hardware.
Looking forward to running Square games with SMAA injectors!
What? A 680 can't run BF3 at 60FPS locked absolutely maxed out -- that was my point. You can't just take effects out of the equation and still say it does.

Can't run what? ...... BF3?
HBAO and 4xMSAA were not the point; rather, you stated a 680 could not run BF3 at 60fps ....... which is obviously wrong.
Do we have an actual number on how much of that it was actually using? Seems kind of weird that a game would need that much.

Not sure -- 32gb of ram is kinda scary.
Can probably be fixed by a good streaming engine.
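For the curious: the basic idea of a streaming engine is to keep only the assets near the player resident in memory and evict the least recently used ones, so the working set stays far below the full data size. A toy sketch under assumed names (nothing Square Enix has actually described):

```python
from collections import OrderedDict

class AssetStreamer:
    """Toy LRU asset cache: caps resident memory, evicts least recently used."""
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()    # asset name -> size in MB

    def request(self, name, size_mb):
        if name in self.resident:        # already streamed in; mark as fresh
            self.resident.move_to_end(name)
            return
        # Evict oldest assets until the new one fits the memory budget.
        while sum(self.resident.values()) + size_mb > self.budget_mb:
            evicted, _ = self.resident.popitem(last=False)
            print(f"evict {evicted}")
        self.resident[name] = size_mb    # pretend we loaded it from disk

# A 2 GB budget instead of the demo's 32 GB of raw assets.
streamer = AssetStreamer(budget_mb=2048)
for asset in ["village", "mountains", "village", "cathedral"]:
    streamer.request(asset, size_mb=900)
print(list(streamer.resident))  # ['village', 'cathedral']
```

The 32GB figure only matters if everything has to be resident at once; with streaming, the console needs enough RAM for what's on screen plus a prefetch margin, not for the whole dataset.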
Agni's Philosophy demo - i7, 32gb of ram, GTX 680
UE4 Elemental demo - i7, 16gb of ram, GTX 680
Star Wars 1313 - i7, 16gb of ram, 3x GTX 680 [but they used a modified UE3 and were doing incredible stuff by porting "depth compositing" from offline workstations into a realtime environment]
Watch Dogs - Unknown. Nothing revolutionary in that engine, just more detailed assets [clearly a game designed to run on current-gen hardware, with HQ options for PC/next-gen. Same as BF3.]
I'm worried about development cost.
That's one year of work for 4 minutes. Sure, the engine is pretty much ready, and maybe it only took 4 months to produce, but it's still just 4 minutes.
Now imagine a 6 hour game
Now imagine a 20 hour game
Now imagine Skyrim looking like that
The thing that makes that demo so impressive to me is the directing, the facial animations, the whole post-processing to make it look like a movie, and especially the cloth simulation.
But not that camera, hoo boy...
Yeah, that's the one that got dismissed. I'll look for a link.
You guys think Wii U could run this?
Nope. I don't think 720/PS4 will run it either without major compromises though.
What we'll likely get on 720/PS4 is something that looks kinda like this in 720p from far away, then when you start nitpicking you notice stuff that got axed to make it work for an actual game.
32 GB of ram? My laptop whimpers.
Didn't Square confirm it's very scalable? Didn't they also want to bring it to Wii and 3DS?
http://sknygy.blogspot.ca/2011/08/square-enixs-luminous-engine-wii-u3ds.html
32GB OF RAM
christ
Sonic All Stars Racing Transformed runs on 3DS. I would hardly call it a facsimile of the PS3/360/Wii U version. If Sony and MS are going for 2.5 teraflop systems, it will be a very difficult task to get the Wii U version up to snuff.
You can't have a 680 and sell the console at a $400 price point and make a profit. The card alone retails for $450.
Retail prices are irrelevant to console costs. A GTX 680 costs Nvidia around $130-150 to produce, and that's including the expensive GDDR5 and cooling.