
EuroGamer: More details on the BALANCE of XB1

MS must genuinely be freaking out about the negative PR coming out of their '20% but 50% less' game console considering how much they're trying to get out ahead of this, but the numbers don't lie, and the gap is pretty substantial.

Or people had questions about the tech specs so they addressed those questions.
 

As the second part of my post said there are a few.

60fps in MP in KZ:SF must be proven, because for now it's 30-40 most of the time. And the latest footage of the single-player had drops into the 20s in some heavier scenes.

LIST

I am sorry, but res is more important than half those things on that list.
It's easy to do a whole bunch more effects if your game is rendering 44% fewer pixels.
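(For the number crunchers: a quick sketch of the pixel math. Note the 44% figure strictly runs the other way - 1080p is ~44% more pixels than 900p, which makes 900p about 31% fewer.)

```python
# Pixel counts behind the 1080p vs. 900p numbers thrown around in this thread.
px_1080p = 1920 * 1080  # 2,073,600
px_900p = 1600 * 900    # 1,440,000

print(f"1080p vs 900p: {px_1080p / px_900p - 1:.1%} more pixels")   # ~44.0%
print(f"900p vs 1080p: {1 - px_900p / px_1080p:.1%} fewer pixels")  # ~30.6%
```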

Just in case, since it was the last post on the page.
 
Want me to get you a soccer ball? Gonna be pretty lonely on the island you just put yourself on.

I'll join him on this island then. I don't see much difference between those Crysis shots either. Of course I can under close scrutiny, but at couch distance it would be negligible, and I'm very sensitive to PQ.
 
A twist: defending their box means it's weak. Not many hoops need to be jumped through at all. It's actually quite easy to see how the Xbox One is the weaker system, but not nearly as weak as people try to make it seem, and the games shown so far make that case a whole lot easier to make.



If these are the kinds of differences you guys are talking about seeing this gen, you're in for a long ride. Keep those numbers handy; you might need them to remind yourselves of what you think you should be seeing at that rate. Seriously, we know the PS4 is stronger. The problem, I think, is people not agreeing that the Xbox One is significantly weaker in comparison. That's a fair argument to make, but I've really said enough on this matter.

Well, besides just higher resolutions, the PS4 will also be able to render twice as fast as the X1, with a ton more data to process at a time. 32 ROPs vs. 16 ROPs is perhaps the biggest difference in my eyes. When you have twice the rendering capability, as well as so much more information being pulled from RAM concurrently at a much faster rate, well... you get the picture.

So take the picture shown, pretty much cut it in half, lower the resolution, and take out some of those leaves and such, and you might have a better picture of the differences.
 
Welcome to this thread.

I have a question: when are you going to detail your approach to asynchronous GPU compute?

Oh yes been following with interest, as you can imagine.

I don't know re: follow-up. I wasn't on the call (I was at TGS), but my understanding was that it was covered. I think DF mentioned they may cover it in another article.

I'm asking a serious question just so I know how to parse out some of the comments in my mind:

How many people on the thread have seen the games running personally? For the people saying Sony's games are "obviously" better graphically, is that interpretation from screenshots or from seeing actual games at one of the shows (E3, Gamescom, PAX or TGS)?
 

Chobel

Member
I'll join him on this island then. I don't see much difference between those Crysis shots either. Of course I can under close scrutiny, but at couch distance it would be negligible, and I'm very sensitive to PQ.

So we should go 720p and be done with it?
 

badb0y

Member
So they just confirmed Xbox One uses Bonaire, yet Digital Foundry has the audacity to refer back to their shitty PC comparison where they used Pitcairn as the GPU for Xbox One.

The GPU in the Xbox One has 14 CUs with 2 CUs disabled and Bonaire has 14 CUs fully enabled in HD 7790.

That's the only new thing I got out of the article; all the other stuff about balance was PR nonsense.

So Bonaire confirmed? So much for GCN 2.0 tech being incorporated in the GPU.
 
So wait a minute, all of a sudden 1080p doesn't matter?

Wtf am I reading?

1080p still matters. But if there was no side by side comparison and you sat me on a couch and you told me the 900p shot upscaled to 1080p was legit 1080p, I wouldn't question it. Actual gameplay might be a different story though.
 

nib95

Banned
Bolded. But I don't expect to get anywhere with you.

60fps in MP in KZ:SF must be proven, because for now it's 30-40 most of the time. And the latest footage of the single-player had drops into the 20s in some heavier scenes.

Actually, it was running closer to 50+fps on average according to Blim himself, who took most of the footage. Ryse also suffers drops and tearing, but you conveniently neglect to mention these.

Lighting? Read about the solutions in those games. Ryse is completely dynamic in terms of lighting, and it also has a much better shadowing system than both titles combined.

I completely disagree. Add to that, Ryse's shadowing in places is actually pretty poorly implemented. As others have mentioned, it is currently far too high contrast, with some areas that are simply too strong and jet black where they shouldn't be, even from a minor or more distant object.


Character models? Read about that in both games too, because Ryse wins here: not only do character models have more polys, there are more of them on screen.

Poly count paper specs don't change the fact that the character models currently do not look as detailed or well textured as some of SF's. Also, how are there more of them on screen? Everything we've seen so far shows very few characters on screen. We've seen a screenshot that has many characters in it (photoshopped in), as well as some earlier single-player footage that also had quite a few characters, but that particular build was not running on a dev kit, and the final character count remains to be seen. Right now what is known is that Killzone's maps are considerably larger and have open-world dynamics to them. So it's like comparing a linear game with a semi-open-world one.


Animation? Technically the animations in Ryse are very detailed, probably the most detailed of all the games shown; they just don't have transitions.

No. The animations without the blending range from very good to poor. Some look realistic, whilst others look janky and lack the realistic weight and look/feel of the others. There's also very poor animation blending, with animations sometimes stitched together in a very awkward and haphazard way.


Background geometry? We haven't seen too much, but the city in KZ:SF's background has less than 500k polys, which isn't much, and there is no other game with grass like Ryse's: that grass spans through to the horizon, and all of it is geometry with physics.

SF has considerably more going on in the background. The fact that you're trying to argue this by again simply going back to paper-spec numbers is pretty amusing.

Cloth physics and body physics are best in Ryse

Based on what? The small amount of cloth physics we've seen from SF look on par if not better.

Motion blur and Bokeh are best in Ryse

This I agree on. Though I do think DoF is being heavily overused and overdone in Ryse. They need to tone it down several notches.

We can't really talk about particles, because all the games are different, but all of them are affected by wind, and all of them are lit by light sources and shadowed by objects.

Of course you'd dismiss the things that SF clearly has an advantage in, but not afford the latter the same luxury. Classic.

We don't know anything about the water in Infamous and KZ:SF, but in Ryse it's tessellated, it's FFT-based, and it generates real-time caustics.

It can be tessellated or fully dynamic for all I care, it still looks awful. Gloopy, overly large rippling, weird physics consistency etc.

Etc. So no, Ryse is not technically inferior in any way.

Yes it is. It is clearly technically inferior. Shadow Fall is pushing 44% more pixels and DOUBLE the frame rate.

===
 

Vizzeh

Banned
In the article the technical fellows say:
"Every one of the Xbox One dev kits actually has 14 CUs on the silicon. Two of those CUs are reserved for redundancy in manufacturing, but we could go and do the experiment - if we were actually at 14 CUs what kind of performance benefit would we get versus 12? And if we raised the GPU clock what sort of performance advantage would we get? And we actually saw on the launch titles - we looked at a lot of titles in a lot of depth - we found that going to 14 CUs wasn't as effective as the 6.6 per cent clock upgrade that we did."

12 CUs @ 853MHz is perhaps better than 14 CUs @ 800MHz because the X1 doesn't have the ROPs or bandwidth to feed the 14 CUs. I.e., to use the car analogies MS likes, it's like a 6-litre race car without the fuel pump/fuel injectors to feed it. But yes, perhaps the clock increase suits their system more.

That section seems to be an underhanded dig at Sony's 18 CUs - but their story is different: they have the ROPs and bandwidth to feed their 18 CUs.
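(For reference, the TFLOPS figures in this exchange fall straight out of the GCN layout - 64 shaders per CU, 2 FLOPs per shader per clock. A quick sketch:)

```python
def gcn_tflops(cus, clock_mhz):
    """Peak single-precision TFLOPS for a GCN GPU: CUs x 64 shaders x 2 FLOPs/clock."""
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

print(gcn_tflops(12, 853))  # XB1 as shipped: ~1.31
print(gcn_tflops(14, 800))  # hypothetical 14 CUs @ 800MHz: ~1.43
print(gcn_tflops(18, 800))  # PS4: ~1.84
```

On paper the two extra CUs would be worth +16.7% FLOPS vs. +6.6% for the upclock, which is exactly why MS's "the upclock won" claim has to be about real-world bottlenecks (ROPs, bandwidth) rather than peak FLOPS.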
 
Albert, why is your engineer talking out of both sides of his mouth?

Not to be rude, but what part of these statements is contradictory?

Interestingly, the biggest source of your frame-rate drops actually comes from the CPU, not the GPU

We've done things on the GPU side as well

"the biggest" =/= "the only", and "as well" by definition would mean we focused on improving framerate bottelnecks across the board, the exact definition of balance.
 
Oh shit, we're comparing KZ:SF visuals with Ryse now? I guess some people are delusional... Ryse looks good - the best looking X1 game. But KZ:SF is a league ahead. It trumps everything else we've seen.

I think X1 fans shouldn't focus on the graphics; they're gonna lose that battle. Instead, focus on the game itself. And Ryse looks like an awesome game so far.
 
Do you guys think it's fair to look at things like the iPhone 5S's A7 processor - the fact that it only has one gig of RAM yet blows away the competition in benchmarks as far as phone performance goes - and consider that maybe the Xbox One could perform well next to the PS4? Better than expected, even?

Just a thought.
 

Skeff

Member
Not to be rude, but what part of these statements is contradictory?

Interestingly, the biggest source of your frame-rate drops actually comes from the CPU, not the GPU

We've done things on the GPU side as well

"the biggest" =/= "the only", and "as well" by definition would mean we focused on improving framerate bottelnecks across the board, the exact definition of balance.

Nice to see you in here. Since you consider the XB1 to be "more balanced than the PS4", would you be able to clarify something?

The PS4 GPU is noticeably stronger "on paper", and the thing a lot of people talk about is the 1.8 vs 1.3 TFLOPS figure. What is often overlooked is the number of ROPs on each of the GPUs. Do you feel that 16 ROPs is balanced in your system?

It appears that by saying you got a greater improvement in framerates from upping the clock speed rather than adding CUs, you imply that the XB1 may be fill-rate limited, which would be the only real explanation for an upclock giving better performance than more CUs.
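(Putting numbers on that fill-rate hypothesis - a sketch using the widely reported ROP counts and clocks:)

```python
# Peak pixel fill rate = ROPs x core clock (pixels written per second).
xb1_gpix = 16 * 853e6 / 1e9  # ~13.6 Gpixels/s
ps4_gpix = 32 * 800e6 / 1e9  # ~25.6 Gpixels/s
print(f"XB1 {xb1_gpix:.1f} vs PS4 {ps4_gpix:.1f} Gpix/s ({ps4_gpix / xb1_gpix:.2f}x)")

# Extra CUs add shader ALUs but no ROPs; an upclock raises ALU *and* ROP
# throughput together. A fill-rate-limited GPU would therefore gain more
# from +6.6% clock than from +2 CUs, which matches MS's own experiment.
```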
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
Not to be rude, but what part of these statements is contradictory?

Interestingly, the biggest source of your frame-rate drops actually comes from the CPU, not the GPU

We've done things on the GPU side as well

"the biggest" =/= "the only", and "as well" by definition would mean we focused on improving framerate bottelnecks across the board, the exact definition of balance.

So if you are having frame-rate issues everywhere, you are balanced? You do realize the Wii U could be perfectly balanced too? Balance doesn't equate to high performance.
 

frizby

Member
I don't believe for one second that a 16.7% GPU throughput bump is less of a real-world performance boost than a 6.7% CPU clock bump. Remember, games consoles. What do you do on games consoles when your performance isn't where you want it? Reduce resolution. Hundreds of games did this in the current gen. And GPU performance is the only thing that alleviates this. Again, I don't believe for one second that the benefits of a 16.7% GPU bump would be so negligible that it isn't worth doing, while a 6.7% CPU bump somehow is worth doing.

Re the "shill" thing, people who say BS face headwind, mockery and disrespect. That's the perk of a societal species. Nothing extraordinary.

Re the quote: the most likely truth behind all of this is that they could not, at this stage, alter the silicon itself. The design was nailed down and is ramping up production; it would set them back months to add/remove/etc. functional blocks, let alone all the money already spent on no-longer-useful silicon, and the additional money to dump into that process.

Another thing they couldn't do was enable the 2 CUs built in for redundancy. They need them to get the yields into a comfortable range. That's their purpose, and if they just opened them all up, they'd lose that, making their yields suck and driving the cost per functional chip through the roof. Just like Sony never enabled the last SPE in the PS3's Cell - they needed the redundancy. Microsoft needs the redundancy.

They also couldn't up the GPU clocks, most likely due to the GPU already drawing a majority chunk of the overall system power. Remember, upping the clock on a chip usually also requires increased voltage. It's not unusual to see power requirements scaling quadratically with clock, i.e. a 10% clock bump can easily mean 20% more power draw.

What they could do, however, was up the clock on the CPU block, because that's not a hot area of the chip, and it's a relatively small area. There was headroom there that they couldn't find anywhere else. And so they did it, just to have something to tout in public as a step forward.
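(A rough sketch of the power argument above - the quadratic-or-worse figure assumes voltage has to rise roughly in step with frequency, since dynamic CMOS power scales as f x V^2:)

```python
# Dynamic CMOS power ~ f * V^2. If a clock bump also needs a voltage bump,
# power grows much faster than the clock does.
def power_increase(clock_bump, voltage_bump):
    return (1 + clock_bump) * (1 + voltage_bump) ** 2 - 1

print(f"{power_increase(0.10, 0.05):.1%}")   # 10% clock + 5% voltage -> ~21.3% more power
print(f"{power_increase(0.066, 0.00):.1%}")  # 6.6% clock at the same voltage -> ~6.6%
```

Which is roughly the "10% clock can mean 20% more power" rule of thumb quoted above.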

Thanks for that...all I was really looking for. :)

It's one thing to say that people are untrustworthy because of what they do for a living, but it's another entirely to subject their actual words to some scrutiny.
 

onQ123

Member
eh. it was more the cost benefit between upclock or utilizing 2 additional cu's. (no idea if that is true or not)

Regardless, AMD will be laughing all the way to the bank.

I was talking about this quote from Richard Leadbetter.

Assuming level scaling of compute power with the addition of two extra CUs, the maths may not sound right here, but as our recent analysis - not to mention PC benchmarks - reveals, AMD compute units don't scale in a linear fashion. There's a law of diminishing returns.

It's pretty funny that he would try to use that excuse in this situation when it's only 12 CUs.

And of course a higher clock is going to make the 12 CUs perform better, but it would also make 14 CUs perform better.
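(onQ123's objection can be made concrete with a toy model: assume effective throughput is the minimum of shader peak and a fill-rate/bandwidth ceiling that scales with clock. The ceiling value here is invented purely for illustration:)

```python
def effective_tflops(cus, clock_mhz, ceiling_at_800=1.30):
    alu = cus * 64 * 2 * clock_mhz * 1e6 / 1e12   # GCN shader peak, TFLOPS
    ceiling = ceiling_at_800 * clock_mhz / 800.0  # ROP/bandwidth cap scales with clock
    return min(alu, ceiling)

base = effective_tflops(12, 800)
for cus, mhz in [(14, 800), (12, 853), (14, 853)]:
    gain = effective_tflops(cus, mhz) / base - 1
    print(f"{cus} CUs @ {mhz}MHz: +{gain:.1%}")
# 14 @ 800: +5.8%  (the 2 extra CUs mostly hit the ceiling)
# 12 @ 853: +6.6%  (the upclock lifts the ceiling too, so it wins)
# 14 @ 853: +12.8% (and of course 14 CUs *plus* the upclock beats both)
```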
 

KKRT00

Member
Bolded. But I don't expect to get anywhere with you.

"Actually, it was running closer to 50+fps on average according to Blim himself who took most of the footage. Ryse also suffers drops and tearing but you conveniently omit to mention these."
Blim didnt record MP footage, what are You talking about? And we have direct feed from conference that does not run even close to 50.

----
Character models - the E3 demo was playable and was running on dev kits; it's confirmed by multiple sources, and even by some GAFfers who were messing around with the Xbone menu while playing the game, and it had 50+ characters on screen.
http://www.youtube.com/watch?feature=player_detailpage&v=dTWOU9qOG34#t=114
Count them if you really need to.
And 'photoshopped in', lol, that's next-level fanboyism.

==
"I completely disagree. Add to that, Ryse's shadowing in places is actually pretty poorly implemented. As others have mentioned, currently it is far too high contrast with some areas that are simply too strong and jet black where they shouldn't be, even from a minor or more distant object."
Rendering Shadows do not have anything in common how black they are in cooperation with GI, and judging contrast from off-screen footage is not very smart.
The fact is that shadows are rendered on everything, even on high distance and have smooth LoD transition, where in both Infamous and KZ they are rendered on dynamic objects only for 10 meters and static geometry has very harsh lod transitions.

"No. The animations without the blending are very good to poor. Some look realistic, whilst others look janky and lack the realistic weight and look/feel of the others. There's also very poor animation blending with animations sometimes stitched together in a very awkward and hap hazard way."
And all of that has nothing to do with rendering performance, which was the point here.

"SF has considerably more going on with the background. Fact that you're trying to argue this by again simply going back to paper spec numbers is pretty amusing."
It can have even elephants flying around, but it doesnt change the fact thats mostly art and have nothing to do with performance. Geometry is poor in the distance, thats a fact.

"Based on what? The small amount of cloth physics we've seen from SF look on par if not better."
It's very fast and has the smoothest rig of any we've seen. Plus there are several cloth and armor pieces per character model that move independently.

"Of course you'd dismiss the things that SF clearly has an advantage in, but not afford the latter the same luxury. Classic."
What advantage? Have you seen the amount of smoke in the Ryse E3 demo? Why do you think those particles are worse? Also, in comparison to C3 - whose tech, to be precise, worked on current-gen consoles - the particles in Infamous and KZ:SF do not react to explosions from other sources.

"It can be tessellated or fully dynamic for all I care, it still looks awful. Gloopy, overly large rippling, weird physics consistency etc."
Crysis 3 water looks awful, thanks for the info.

"Yes it is. It is clearly technically inferior. Shadow Fall is pushing 44% more pixels and DOUBLE the frame rate."
I have not seen that doubled framerate yet.
 

badb0y

Member
Do you guys think it's fair to look at things like the iPhone 5S's A7 processor - the fact that it only has one gig of RAM yet blows away the competition in benchmarks as far as phone performance goes - and consider that maybe the Xbox One could perform well next to the PS4? Better than expected, even?

Just a thought.

You need to do some research, but since you brought it up... why is it that the Apple A7 is better than other processors on the mobile market?

Apple's A7 is a custom chip designed in-house using an ARM licence, similar to how Qualcomm makes its Snapdragon, as opposed to what Samsung does (licensing ARM's core designs). Apple has been hiring highly qualified chip designers for a while, and the effect of that is that Apple's SoCs are efficient and powerful compared to other companies' in the market. RAM has nothing to do with why the A7 is a beast of an SoC.

On the other hand, both the PS4 and Xbox One use the 8-core Jaguar CPU design from AMD and use GPUs from the same family, just from different points in the product stack. Essentially both the PS4 and Xbox One have the same underlying architecture; the PS4 just has more GPU resources to use.
 

Skeff

Member
"Actually, it was running closer to 50+fps on average according to Blim himself who took most of the footage. Ryse also suffers drops and tearing but you conveniently omit to mention these."
Blim didnt record MP footage, what are You talking about? And we have direct feed from conference that does not run even close to 50.

----
Character models - E3 demo was playable and was running on dev-kits, its confirmed by multiple sources that confirmed, with some Gaffers too that was messing around Xbone menu while playing the game ,and it had 50+ characters on screen.
http://www.youtube.com/watch?feature=player_detailpage&v=dTWOU9qOG34#t=114
Count them if You really need to.
And 'photoshopped in' lol, thats next level of fanboyism.

==
"I completely disagree. Add to that, Ryse's shadowing in places is actually pretty poorly implemented. As others have mentioned, currently it is far too high contrast with some areas that are simply too strong and jet black where they shouldn't be, even from a minor or more distant object."
Rendering Shadows do not have anything how black they in cooperation with GI, and judging contrast from off-sceren footage is not very smart.
The fact is that shadows are rendered on everything, even on high distance and have smooth LoD transition, where in both Infamous and KZ they are rendered on dynamic object only for 10 meters and static geometry has very harsh lod transitions.

"No. The animations without the blending are very good to poor. Some look realistic, whilst others look janky and lack the realistic weight and look/feel of the others. There's also very poor animation blending with animations sometimes stitched together in a very awkward and hap hazard way."
And all of that has nothing to do with rendering performance, which was point here.

"SF has considerably more going on with the background. Fact that you're trying to argue this by again simply going back to paper spec numbers is pretty amusing."
It can have even elephants flying around, but it doesnt change the fact thats mostly an art and have nothing to do with performance. Geometry is poor in the distance, thats a fact.

"Based on what? The small amount of cloth physics we've seen from SF look on par if not better."
Its very fast and has the smoothest rig from all they've seen. Plus there is several cloths and armor pieces per character model that move independently.

"Of course you'd dismiss the things that SF clearly has an advantage in, but not afford the latter the same luxury. Classic."
What advantage? Have You seen amount smoke in Ryse E3 demo? Why do You think those particles are worse? Also in comparison to C3, which tech worked on current gen consoles to be precised, particles in Infamous and KZ:SF do not react to explosions from other sources.

"It can be tessellated or fully dynamic for all I care, it still looks awful. Gloopy, overly large rippling, weird physics consistency etc."
Crysis 3 water looks awful, thanks for the info.

"Yes it is. It is clearly technically inferior. Shadow Fall is pushing 44% more pixels and DOUBLE the frame rate."
I have not seen those double the framerates yet.

http://www.neogaf.com/forum/showthread.php?t=678889

C'mon, there was a thread on very badly Photoshopped Ryse screenshots on xbox.com. They painted over the armor clipping with a solid grey chunk. It was terrible; my 12-year-old cousin could have done it better.
 

Vivie

Banned
The Crysis screens in the article show a clear difference.


I think these side-by-side graphics comparisons basically sum up how pointless the argument over visuals is at this point. Getting into the minutiae of a pixel or two, slight color and resolution differences, etc. to pick apart fine details is very boring.
Any medium can only approximate reality (wake me up when every leaf on that tree is unique and modelled in 3D with individual physics, and every brick in the wall too). Looking at a photo and an interactive, dynamic video game environment are not close to the same thing. If you play a game and try to scrutinize every detail of how it's put together, well, you're not really playing the game at all.
I'd like to see them continue to push the boundaries of tech, of course - let's see how close we can get - but placing great importance on it is pointless imo.
 

Piggus

Member
Do you guys think it's fair to look at things like the iPhone 5S's A7 processor - the fact that it only has one gig of RAM yet blows away the competition in benchmarks as far as phone performance goes - and consider that maybe the Xbox One could perform well next to the PS4? Better than expected, even?

Just a thought.

No.

The A7 beats the shit out of other mobile CPUs because of its architecture and monster GPU, not RAM.

The PS4 and Bone have essentially the same architectures with minor differences. But one of them has significantly more raw GPU power, faster and easier to use memory, and more customizations with regard to GPU compute. That system is the PS4.

The only "advantage" the Xbox has is the tiny CPU clock speed advantage (assuming ps4s CPU is still 1.6 ghz, which we don't know for sure.)
 

nib95

Banned
KKRT00, you're just being disingenuous now. You HAVE seen double the frame rate, because the last time we had this discussion I linked you to it - the off-screen footage with the 60fps UI, where your response was that it was indeed running at 60fps.

No idea why you're pretending otherwise now based on a few rare drops in a 30fps-encoded B-roll trailer.
 

AlphaDump

Gold Member
I was talking about this quote from Richard Leadbetter.


And of course a higher clock is going to make the 12 CUs perform better, but it would also make 14 CUs perform better.

Lol, that is true. Maybe it is in reference to the finite amount of bandwidth available - diminishing returns for the CUs in that respect if the count were to scale upward. However, it would still be linear... Leadbetter's lost his mind.
 

Cidd

Member
Well, let's see who is moving to 4K in a few years. Not everybody builds a new house because of a new TV.
And I can understand that you are tired of muddy textures, but please buy a PC if you want games running at 1080p *and* decent AA, because 1080p still requires AA.

For your info, I already have a gaming PC, and thanks, but I don't think I need advice from you on my buying habits, no offense. I've been 1080p gaming for years, so forgive me if I think 900p in this day and age is absolute garbage.
 
The only thing that's "crazy" about this discussion is your opinion that Ryse looks better than KZ:SF or InFamous, not Ryse's graphics themselves. What exactly is so amazing about Ryse's visuals?

Ignoring the massive downgrade in resolution compared to those two titles (900p vs. 1080p), let alone framerate (60fps for KZ:SF's multiplayer), the lighting, character models, action, background geometry, animation, etc. look noticeably worse than in those two titles.

Is this argument about "art style", which is more subjective? Because that's the only thing I can think of right now that may put Ryse ahead for some. It's the same kind of crazy argument we heard this gen, with Mario Galaxy being claimed to look better than some of the best PS360 titles.

See, from reading this, I could tell you were just spouting off things at random. Ryse's character models (and certainly the main character) actually seem quite a bit more detailed than anything on display in either of those two games. That goes double for the main character in Ryse.



And this is beyond stupid, because I hate doing comparisons like this, but I'll do it just for the sake of the silly game some of you are attempting to play. I'll show you why some of the ways you guys choose to compare games are nonsensical, because every game is doing its own thing. It's almost outrageous to compare games based on specific metrics for that reason, but here goes anyway, just to make a point...

-- Highest poly count in Killzone SF is 40k.

-- The main character in Ryse is 150k polys alone. It would take 4 of Killzone's characters combined at their highest possible LOD just to eclipse the poly count in just Ryse's single main character.

-- There's 8 bone influences per vertex in Killzone.

-- Dude... there's 260 deforming joints just in the Ryse main character's face alone.
-- 500 deforming joints total on just the main character (including the 260 facial).
-- Greater than 770 joints in general total on just the main character.
-- There's 230 corrective facial blendshapes alone, on just the main character.

Video proof:

http://www.youtube.com/watch?v=Nm4aIHxkMek

In fact, no point searching through the video since it might take too long, here's the picture.

[screenshot: the character model stats from the video above]


Care to come again about character models?

The main character in Ryse is sporting an incredible level of detail that, as much as I think Infamous and Killzone look utterly fantastic (I'm not going so far as to claim there's nothing special about either game visually, like you're bogusly doing with Ryse), neither of those two games is matching, and that's just speaking the truth. The cutscene model in Ryse IS the gameplay model in every possible way. There's an unbelievable amount of detail in that character model, with a number of independently modeled portions of his armor that are made of real geometry and each have their own unique physics simulation.

The full performance capture during actual gameplay in Ryse hasn't been matched by anything I've seen in either of the other two games. It even carries over to enemies, not just the main character.

You're talking about lighting? Have you even seen how pitch-perfect that overcast lighting is in the Ryse SP on that Stonehenge level? I had to do a double take just to believe it. And you think Ryse doesn't have amazing lighting? You're kidding yourself. I'm not suggesting the other two games don't, but suggesting Ryse doesn't have great lighting is just not factual. The game even appears to be running real-time caustics. Even the blood splatter reacts realistically to the lighting conditions in the game.

Taking a step away from the Ryse main character, even the enemy models sport an undeniably impressive degree of detail.

You talk about geometry, but geometry is the last thing this game has a problem with. Have you seen the quality of the grass in the game - how tall it is, how far out it stretches, the way it reacts to both the main character's and enemies' movements, even reacting realistically to the different wind conditions? Do you see how amazing the motion blur is in this game? How about that real-time burning boat segment, with the amazing physical simulation on display and the ropes flying all over the place? There's a plume of smoke on that initial E3 level that, according to Crytek, requires more computational load than the entire Xbox 360 could produce even if you dedicated 100% of its power to trying to accomplish that same thing.

And it's useless, yes, useless, attempting to compare two very different styles of games based on framerate, but, sure, go ahead. Never mind that the Killzone SP is 30fps, and the multi is back and forth between 30ish and 60fps (nib says 60fps more often than not; I'll choose to believe that, as I don't think he'd make it up). I don't see how that makes Ryse look any less amazing, though. Whether you compare it to Killzone SP or MP, Ryse looks incredible both ways.

Killzone looks downright mind-blowing too, and I can admit that, but you're not going to get anywhere claiming Ryse has lackluster graphics just to make a point. I think Killzone, Infamous and Ryse are the best looking games shown so far that have shown so much real gameplay. Of the three games in question, Infamous is likely to be the best game of them all, and so far it definitely looks the most fun. I can say that without hesitation, because I give credit where credit is due. Killzone graphically went from not impressing me all that much to absolutely blowing my mind, convincing me to buy it JUST for the graphics alone at launch. I understand Ryse is on a different platform, and so there's a need for full-scale warfare, but it looks amazing regardless of the platform, and if you can't admit that much, then nothing else you say matters.
 

maeh2k

Member
This is bona fide BS.

I don't believe for one second that a 16.7% GPU throughput bump is less of a real-world performance boost than a 6.7% CPU clock bump.

.
.
.

They also couldn't up the GPU clocks, most likely due to the GPU already drawing a majority chunk of the overall system power. Remember, upping the clock on a chip usually also requires increased voltage. It's not unusual to see power requirements scaling quadratically with clock, i.e. a 10% clock bump can easily mean 20% more power draw.

You are wrong. They actually did up both the GPU and the CPU clocks. The ~6.6% boost comes from raising the GPU clock from 800MHz to 853MHz. Unlike adding two CUs, the raised clock also had an impact on other areas, such as the ESRAM.
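(The arithmetic checks out - a sketch, where the 102.4GB/s base figure assumes the commonly cited 128 bytes per GPU cycle for the ESRAM, counting one direction only:)

```python
old, new = 800e6, 853e6
bump = new / old - 1
print(f"GPU clock bump: {bump:.2%}")  # ~6.63%

# Everything clocked off the GPU rises in lockstep, e.g. ESRAM bandwidth:
esram_old = 128 * old / 1e9           # 102.4 GB/s at 800MHz
print(f"ESRAM: {esram_old:.1f} -> {esram_old * new / old:.1f} GB/s")  # ~109.2

# Two extra CUs at a fixed clock would add only shader throughput:
print(f"+2 CUs: {14/12 - 1:.1%} more ALUs, 0% more bandwidth")
```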
 

dr_rus

Member
"Balance" isn't something that can be right or wrong. It's a gamble that a h/w maker makes when designing a new h/w. MS thinks that they have the right balance but that doesn't mean that it's right or that PS4's is wrong. Who've got the right balance will be seen in a couple of years - if PS4 titles will look significantly better than XBO's then it'll be clear that PS4's h/w balance is better.
 

c0de

Member
For your info, I already have a gaming PC, and thanks, but I don't think I need advice from you on my buying habits, no offense. I've been 1080p gaming for years, so forgive me if I think 900p in this day and age is absolute garbage.

Why so serious? :) It was meant in a nice way, no offense from my side.
 

Bossofman

Neo Member
I am going to drop my XBO preorder and only go with the PS4 until MS cuts the price by $100. Both being the same price, with the MS system being a little weaker but having Kinect, would sound fair; as it is NOW, the XBO is a rip-off.
 

GameSeeker

Member
See, from reading this, I could tell you were just spouting off things at random. Ryse's character models (and certainly the main character) actually seem quite a bit more detailed than anything on display in either of those two games. That goes double for the main character in Ryse.

-- Highest poly count in Killzone SF is 40k.

-- The main character in Ryse is 150k polys alone. It would take 4 of Killzone's characters combined at their highest possible LOD just to eclipse the poly count in just Ryse's single main character.

No, you are spouting facts at random and drawing incorrect conclusions.

You compare 40K POLYGONS in KZ:SF to 150K TRIANGLES in Ryse. If you don't understand the difference between polygons and triangles, please don't be making comparisons.
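(If that's right - i.e. the KZ:SF figure counts polygons that are largely quads while Crytek's figure counts triangles - the unit conversion alone shrinks the gap a lot. A sketch under that assumption, using the standard 1 quad = 2 triangles split:)

```python
# Assumption from the post above: 40K "polygons" (mostly quads) vs. 150K triangles.
kzsf_polys = 40_000
kzsf_tris = kzsf_polys * 2  # each quad tessellates into 2 triangles -> up to 80K
ryse_tris = 150_000

print(f"naive ratio:  {ryse_tris / kzsf_polys:.2f}x")  # 3.75x
print(f"as triangles: {ryse_tris / kzsf_tris:.2f}x")   # 1.88x
```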
 

Y2Kev

TLG Fan Caretaker Est. 2009
I run my games in 480p but I sit on a couch 500 feet away from my TV

Sometimes it is hard to see but I was able to beat Prince of Persia 2008 without actually being able to see the screen
 

coolasj19

Why are you reading my tag instead of the title of my post?
I had something more eloquent typed up but it's gone now.

What do the RAM capabilities and allocations mean for the part of the OS that the user interacts with? Features, speed, stuff like that.
 
Lmao at some of the crazy bedroom wall calculations going on.




Hmm... so basically still just the PC footage then? The reason it looked no different from the "PS4 version" was probably because both were the exact same piece of PC code.

The game isn't going to be running on actual hardware for either of these machines yet.
 

Bundy

Banned
Well, let's see who is moving to 4K in a few years. Not everybody builds a new house because of a new TV.
And I can understand that you are tired of muddy textures, but please buy a PC if you want games running at 1080p *and* decent AA, because 1080p still requires AA.
Come on, c0de ;)
You can do better than this.
 

onQ123

Member
Some questions I would have liked answered by Microsoft

Does their graphics chip support DX11.2?

According to this, it doesn't:

images.eurogamer.net/2013/articles//a/1/6/1/2/9/1/9/77.png/EG11/resize/1200x-1

DX11.2 supports a lot of fancy stuff that will aid next gen - things like scalable frame buffers and tiled resources. Tiled resources will be particularly useful: they allow textures in excess of 10GB to be STREAMED through a tile pool as small as 16MB stored in RAM. That saves memory because distant textures are seen at low quality; as you get closer, those textures INCREASE in quality.

All Radeon 7000-series cards with DX11.1 can be updated via drivers to DX11.2, BUT not all the features are present - e.g. tiled resources Tier 2 (Tier 1 is available in software), as Tier 2 requires hardware support.

The PS4 supports DX11.2+ (from the GDC Sony developer briefing: http://www.gdcvault.com/play/1019252/PlayStation-Shading-Language-for)

Modern GPU
- DirectX 11.2+/OpenGL 4.4 feature set
- With custom SCE features
- Asynchronous compute architecture
- 800MHz clock, 1.843 TFLOPS
- Greatly expanded shader pipeline compared to PS3™

So can we safely assume the X1 supports DX11.1 with a SOFTWARE upgrade to DX11.2,

whereas Sony explicitly states 11.2+, meaning they will support the hardware variant allowing for Tier 2 tiled resources and a few other goodies explained here:

http://msdn.microsoft.com/en-us/library/windows/apps/bg182880.aspx (good read)
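(The tile-pool arithmetic behind that "textures in excess of 10GB streamed through 16MB" claim above is easy to sketch. D3D11.2 tiled resources use 64KB tiles; which tiles actually need to be resident depends on what's visible, which is the whole point:)

```python
TILE = 64 * 1024              # D3D11.2 tiled resources use 64KB tiles

virtual_bytes = 10 * 1024**3  # a 10GB virtual (sparse) texture...
pool_bytes = 16 * 1024**2     # ...backed by a 16MB physical tile pool

print(virtual_bytes // TILE)  # 163,840 virtual tiles in total
print(pool_bytes // TILE)     # only 256 of them resident at any moment

# Texture fetches are page-translated to resident tiles: distant surfaces
# sample low-res mip tiles, and higher-res tiles stream in as you approach.
```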

I also have been trying to get to the bottom of this.

Xbox One: DirectX 11.1+ AMD GPU; PS4: DirectX 11.2+ AMD GPU - what's the difference?

But I guess we will find out what's in DirectX 11.2 compatible hardware on the 25th.
 

c0de

Member
No, you are spouting facts at random and drawing incorrect conclusions.

You compare 40K POLYGONS in KZ:SF to 150K TRIANGLES in Ryse. If you don't understand the difference between polygons and triangles, please don't be making comparisons.

So please explain to us the difference between polygons and triangles.
 
This polygons-per-character talk to prove which is "best" is the dumbest thing I've read in a while.

You guys do realize that either of the dev teams can use however many polygons they want for their characters, right? It's not exactly something complicated to do.

What matters is what the engine is rendering as a whole, not just individual characters.
 