
Ars Technica: Penello's XB1 Numbers, Ars "Sniff Test" Analysis

I actually repeated what I had heard to a high-level Sony guy, and he basically confirmed it. But he also wanted to make sure I hadn't actually heard it from someone at Sony, because they didn't want it getting back to them.

I don't think this particular Sony guy would lie or exaggerate, but obviously they would have something to gain if they did. So I don't know.

Wow, that adds more fuel to the fire then.

Honestly, I think the PS4 dev time and FPS might be close to reality then.

Clearly it's not a console that's difficult to develop on.

Good news.

XB1 sounds less promising development-wise, but I'm sure they'll get a solid FPS for all those Xbox fans.
 
While I don't doubt Ravidrath, it's prudent to remember that the PS4 was built around having a very short "time to triangle". That said, I'm sure you can drag decent performance out of the X1; it just requires more work.
 

Kagari

Crystal Bearer
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

FWIW a Japanese developer was having similar issues apparently. They will remain nameless for obvious reasons.
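
To make "manually fill and flush" a bit more concrete, here's a rough sketch of what that kind of scratchpad juggling looks like. Purely illustrative, not actual XDK code: the buffer names, sizes, and helpers are all made up, and the real ESRAM is managed through MS's APIs, not plain memcpy.

```c
/* Illustrative only: a tiny fast pool (stand-in for the 32 MB ESRAM)
   next to a big slower pool (stand-in for DDR3). Everything here is a
   made-up simulation on ordinary host memory. */
#include <stdint.h>
#include <string.h>
#include <stdio.h>

#define ESRAM_SIZE (32u * 1024u * 1024u)   /* 32 MB scratchpad          */
#define TILE_BYTES (4u * 1024u * 1024u)    /* one render-target tile    */

static uint8_t esram[ESRAM_SIZE];          /* fast, tiny                */
static uint8_t ddr3[64u * 1024u * 1024u];  /* big, slower               */

/* "Fill": copy the working set into the fast pool before rendering. */
static void esram_fill(size_t esram_off, size_t ddr3_off, size_t n) {
    memcpy(esram + esram_off, ddr3 + ddr3_off, n);
}

/* "Flush": copy results back out so the next pass can see them. */
static void esram_flush(size_t esram_off, size_t ddr3_off, size_t n) {
    memcpy(ddr3 + ddr3_off, esram + esram_off, n);
}

int main(void) {
    /* A frame's render targets rarely all fit in 32 MB, so you stream
       them through in tiles and schedule every copy yourself. */
    for (size_t tile = 0; tile < 8; ++tile) {
        size_t src = tile * TILE_BYTES;
        esram_fill(0, src, TILE_BYTES);
        /* ... draw into esram[0 .. TILE_BYTES) here ... */
        esram_flush(0, src, TILE_BYTES);
    }
    puts("8 tiles streamed through the scratchpad");
    return 0;
}
```

The point being: if the APIs don't do that scheduling for you, the copies and the bookkeeping land on the engine team, and that's where the extra porting time goes.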
 

Skeff

Member
If you believe that's true, you must think Turn 10 and Crytek are goddamn programming savants.

Not really, it's SOP to see low FPS when an engine is first ported over; I would assume the 15 FPS is before any optimizations. I've seen engines ported over running at 1 FPS and getting to a locked 30 FPS by release.
 

twobear

sputum-flecked apoplexy
Not really, it's SOP to see low FPS when an engine is first ported over; I would assume the 15 FPS is before any optimizations. I've seen engines ported over running at 1 FPS and getting to a locked 30 FPS by release.
So the PS4 version of Ghosts will run at like 500fps?
 

Proelite

Member
If you believe that's true, you must think Turn 10 and Crytek are goddamn programming savants.

Or that Infinity Ward are terrible at it.

However, I am surprised that Infinity Ward would have had the Xbox One dev kits for 4 months already by E3.

As far as I know, third parties were still on PC alpha kits that had no ESRAM in Feb/March.

Just my two cents.

If they were having ESRAM issues, it would be from March onwards. Meaning that studios were still having perf issues as recently as this July.

FYI, the fact that we STILL haven't seen Ghosts footage from Xbox One means that the Xbox One version is still unpresentable.
 

Coiote

Member

I can't believe this is entirely true. I hope the game doesn't get Bayonetta'ed on X1, as I would feel sorry for their fans. No one deserves this.

Isn't it dangerous to post this information, though? Apparently you are an indie dev, and getting blacklisted won't help you. Be careful. I love Skullgirls and I don't want to see you guys getting into trouble. :)
 
If you believe that's true, you must think Turn 10 and Crytek are goddamn programming savants.

Well, let's be fair: both probably had much more time with the hardware than the typical multiplatform dev. Not only that, but Ryse started off as a 360 game, so we have no idea how close that was to completion. For all we know it could mostly be a port job.
 

Caronte

Member
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

Holy shit.
 

szaromir

Banned
Yeah, I mean drastic in both directions: the HIGH FPS and the LOW.
I think a Radeon 7850 would comfortably run a Call of Duty game at 90 FPS. It's not like Ghosts looks miles apart from the previous games. On Xbone there was probably some bottleneck; it wouldn't be unusual.
 
Surface and Windows Phone 8 show that even extensive marketing can't save an undesirable product.

Luckily the Xbox One is still one, even in its original form.

More desirable than the PS4? Time will tell.
Yes, the people who keep talking about all of Microsoft's money forget that they have many other products to prop up in the marketplace as well. They can't and won't blow all of their reserves on one product.
 

vpance

Member
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

Goddamn.
 
Or that Infinity Ward are terrible at it.

However, I am surprised that Infinity Ward would have had the Xbox One dev kits for 4 months already by E3.

As far as I know, third parties were still on PC alpha kits that had no ESRAM in Feb/March.

Just my two cents.

If they were having ESRAM issues, it would be from March onwards.

Only 2 cents I need. Thanks for chiming in.
 
We've known for over three months that the PS4 is the more powerful system; I don't know why it keeps coming up, lol.


Keep in mind, that is not to say MS will have crap games. What they have is still a leap over current gen.
 

Nafai1123

Banned
This one is categorically true. And Albert's mention that neither did Sony is also categorically true. Albert even went so far as to proclaim that high-end PC gamers would agree on this. And looking at the specs of both, how can you deny that claim?

I don't deny the claim, I question the message. Stick to your guns and show the value instead of spreading FUD. There's literally nothing to gain from downplaying the competition when both systems will be released in a few months and the differences will be proven one way or another.
 

RedAssedApe

Banned
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

Interesting... this is the first time I've seen this posted. Has there been discussion about this before? Or was it just discounted as rumor due to being secondhand?
 

EagleEyes

Member
Man, these consoles can't release fast enough so we can be done with these silly he-said, she-said rumors. At this point somebody ought to come up with a rumor claiming that multiplats are running better on Wii U than on the Xbox One.
 
I can't believe this is entirely true. I hope the game doesn't get Bayonetta'ed on X1, as I would feel sorry for their fans. No one deserves this.

Isn't it dangerous to post this information, though? Apparently you are an indie dev, and getting blacklisted won't help you. Be careful. I love Skullgirls and I don't want to see you guys getting into trouble. :)

Won't matter if PS4 dominates next gen... :p

Really though, this was unoptimised for both consoles, I assume? There'd obviously be a difference, but I feel it'd just take a bit more work on XBOne to get it going properly because of the ESRAM+DDR3 split.
 

Cuth

Member
I would pay WAY more attention to ERP's words than to Ars' ones, given that he seems to have direct experience with the PS4 and that he worked as a game dev even in the Genesis/SNES era (and I don't know, maybe even earlier).

I'd also guess that, when he talks about being "limited by bandwidth" in certain situations, he's referring to how fast it's possible to bring data to the GPU, and he doesn't forget that this depends on the combination of bandwidth and latency, something a few "expert" people here seem to forget.
 

Ravidrath

Member
Isn't it dangerous to post this information, though? Apparently you are an indie dev, and getting blacklisted won't help you.

But Indie devs are already blacklisted on Xbone, effectively...!

For serious, though - it's not my story, just an anecdote I heard at E3.

I cannot verify its authenticity, but I felt it could add to the discussion, because I think all the talk about GB/s is pointless if your APIs don't actually make it relevant.
 
Man, these consoles can't release fast enough so we can be done with these silly he-said, she-said rumors. At this point somebody ought to come up with a rumor claiming that multiplats are running better on Wii U than on the Xbox One.

In this case the rumors are all saying the same thing with no one saying otherwise.
 
Or that Infinity Ward are terrible at it.

However, I am surprised that Infinity Ward would have had the Xbox One dev kits for 4 months already by E3.

As far as I know, third parties were still on PC alpha kits that had no ESRAM in Feb/March.

Just my two cents.

If they were having ESRAM issues, it would be from March onwards. Meaning that studios were still having perf issues as recently as this July.

Welp. It didn't take long for people to start saying Infinity Ward are terrible developers because the PS4 version runs better.
 

Derrick01

Banned
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

It probably wasn't THAT bad, and I'm sure things have improved a bit on XB1 since then, but honestly I'm not surprised if there is a fairly decent difference there. Cerny did quite a few talks where he emphasized how important it was to have unified memory and to make things as simple as possible for devs. He didn't want them doing any extra tricks to get their games running.
 

twobear

sputum-flecked apoplexy
All I can say is that if that story is true, MS have fucked up in ways that we will not even begin to fathom for years to come.

The infinitely powerful supercomputers of the future will puzzle until the heat death of the universe over how MS managed to so thoroughly fuck up.
 

Schrade

Member
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

Holy steaming fucksauce!

If that is true... oh boy.

Please please let it be true. I would love for developers to not be limited by the actual API/SDK and coding part of things. If they can write stuff and have it run at an acceptable speed on the first go 'round... fantastic.
 

gruenel

Member
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

What the hell. Can it really be that bad? I thought devs were already used to the eDRAM on the X360. Is the eSRAM on X1 really that different from a dev perspective?
 
It shouldn't be their main concern. They can't win over you, me, or the enthusiast crowd; they've already lost us, and all they do by arguing specs is make it more apparent that their machine is considerably weaker.
Their biggest mistake since the reveal has been not simply going all in with "specs don't matter". By ignoring the tech gap entirely and focusing on the games and experience, they could have avoided most of the gaffes.

Of course, I still think they would have been better off going all digital in the first place and sticking with their initial vision.
 

Skeff

Member
Hopefully not when it comes to Japan, I guess.

I would imagine Japanese devs are releasing Japan-oriented games that sell big in Japan, and there's going to be a large percentage of Japanese console gamers on PS4. I can't imagine much need for parity in this case.
 

jayu26

Member
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

Interesting...

FWIW a Japanese developer was having similar issues apparently. They will remain nameless for obvious reasons.

Beautiful, we have confirmation.

I'm half joking. The numbers are clearly exaggerated.
 

TheD

The Detective
I would pay WAY more attention to ERP's words than to Ars' ones, given that he seems to have direct experience with the PS4 and that he worked as a game dev even in the Genesis/SNES era (and I don't know, maybe even earlier).

I'd also guess that, when he talks about being "limited by bandwidth" in certain situations, he's referring to how fast it's possible to bring data to the GPU, and he doesn't forget that this depends on the combination of bandwidth and latency, something a few "expert" people here seem to forget.

ERP not only misunderstood what was said about the CUs, but even then he points out how much more fillrate and bandwidth will help (things the PS4 has a lot more of than the XB1).

GPUs are also heavily latency-tolerant! Bandwidth is far, far more important to them!
Attacking people for not talking about latency in regard to the GPU proves you are way out of your depth!
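
Back of the envelope, with made-up numbers (Little's law, nothing console-specific): the reason GPUs shrug off latency is that they keep enough requests in flight to cover it, so throughput ends up bounded by bandwidth.

```c
/* Little's-law sketch: how much outstanding work hides memory latency.
   All figures are assumptions for illustration, not any real GPU's specs. */
#include <stdio.h>

int main(void) {
    double latency_ns        = 400.0;   /* assumed DRAM latency           */
    double bandwidth_gbps    = 176.0;   /* assumed peak bandwidth (GB/s)  */
    double bytes_per_request = 64.0;    /* one cache line                 */

    /* Requests that must be in flight to keep the bus saturated:
       concurrency = bandwidth x latency / request size.
       (1 GB/s is 1 byte/ns, so the units reduce to a plain count.) */
    double in_flight = (bandwidth_gbps * latency_ns) / bytes_per_request;

    printf("~%.0f outstanding 64-byte requests hide %.0f ns of latency\n",
           in_flight, latency_ns);
    /* A GPU with tens of thousands of resident threads supplies that
       concurrency easily, so extra latency barely dents throughput,
       while less bandwidth hits it directly. */
    return 0;
}
```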
 
Albert has put himself in such a tricky predicament... What can he really say now?

He can't really admit he was wrong and say the PS4 actually IS more powerful. He and Major Nelson have denied that for far too long.

He can't say the Microsoft "Technical Fellow" was wrong, because he played him up as one of the smartest men around, with one of the rarest and most prestigious jobs at Microsoft.

All he can really do is stand by his and the "Technical Fellow's" original claims and specs, basically say some of the users on GAF don't know what they are talking about, and say the Ars article is completely false.

Two nights ago, when people told him straight up that his math was completely wrong and that you cannot add the numbers that way, he told us all "Yes you can." Now that Ars is also saying you cannot add the numbers that way, he will HAVE to keep saying "Yes you can." Otherwise he is admitting that the Technical Fellow, one of the smartest men at Microsoft, is wrong.

Don't expect him to come clean or admit he or anyone else was wrong. He simply can't at this point; he's in too deep...
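
For anyone who missed that argument, here's the arithmetic in question, using the bandwidth figures that were being quoted publicly at the time (68 GB/s DDR3, 204 GB/s ESRAM). The traffic-split model below is my own simplification for illustration, not anything from Microsoft or Ars:

```c
/* Why summing peak bandwidths is misleading. Peak figures are the
   publicly quoted ones; the traffic split is a made-up assumption. */
#include <stdio.h>

int main(void) {
    double ddr3_peak  = 68.0;    /* GB/s                              */
    double esram_peak = 204.0;   /* GB/s, combined read+write peak    */

    /* The disputed claim: just add the peaks together. */
    printf("sum of peaks: %.0f GB/s\n", ddr3_peak + esram_peak);

    /* That sum is only reachable if both buses are saturated at the
       same time, i.e. this fraction of all traffic hits the 32 MB
       ESRAM while the rest exactly saturates DDR3:                  */
    double share = esram_peak / (ddr3_peak + esram_peak);
    printf("ESRAM share of traffic needed: %.0f%%\n", 100.0 * share);

    /* If accesses to the two pools don't overlap (worst case) and,
       say, 40% of bytes moved fit the ESRAM working set:            */
    double f = 0.40;
    double worst = 1.0 / (f / esram_peak + (1.0 - f) / ddr3_peak);
    printf("non-overlapped estimate at 40%% ESRAM traffic: ~%.0f GB/s\n",
           worst);
    return 0;
}
```

In other words, the 272 GB/s figure is a ceiling you'd only graze under a very specific traffic mix with both buses never idling, not a number you can quote as "the" bandwidth.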
 