
PopCap dodges whether PvZ: Garden Warfare is 1080p on Xbox One (update: it's 900p)

RoboPlato

I'd be in the dick
so 900p

[image]


wonder what it looks like on 360.

This quote is one of the most embarrassing things MS has said and the fact that I still see people acting like it's valid pisses me off to no end.

The ESRAM funnel needs to be significantly narrower than the GDDR5 pipe for this to be accurate.
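
For anyone who wants numbers behind the funnel/pipe analogy, here's a rough back-of-envelope in Python using the commonly quoted spec-sheet figures (peak numbers only; real sustained throughput is lower):

```python
# Peak bandwidth ~= (bus width in bytes) x (effective transfer rate).
# Spec-sheet figures; sustained real-world throughput is meaningfully lower.
def peak_gb_per_s(bus_bits: int, mega_transfers_per_s: int) -> float:
    return bus_bits / 8 * mega_transfers_per_s / 1000

xb1_ddr3 = peak_gb_per_s(256, 2133)    # Xbox One main RAM -> ~68 GB/s
ps4_gddr5 = peak_gb_per_s(256, 5500)   # PS4 unified RAM   -> ~176 GB/s
xb1_esram = 109                        # ~GB/s each way, commonly cited after the 853 MHz upclock

print(f"XB1 DDR3:  {xb1_ddr3:.1f} GB/s (8 GB)")
print(f"PS4 GDDR5: {ps4_gddr5:.1f} GB/s (8 GB)")
print(f"XB1 ESRAM: ~{xb1_esram} GB/s each way (32 MB)")
```

Even if you take MS's combined read/write ESRAM peak (~204 GB/s) at face value, it only applies to whatever you can squeeze into 32MB; everything else lives in the ~68 GB/s DDR3 pool.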
 

Biker19

Banned
I swear their tools are not done, and even if they were, it takes months for developers to integrate new SDK drops into their games with confidence. You won't see the fruits of whatever rendering work MS has been doing in released Xbox One games until September at the earliest. Time pressure and engine restructuring for any game in development is the issue.

Or maybe it's because the hardware inside the Xbox One is too weak to output native 1080p properly, let alone native 1080p at 60 FPS.
 

stryke

Member
Wow, this whole 1080p-or-not-1080p stuff is getting out of hand. It's like resolution and an extra 30fps matter more than gameplay. I own all three of the new consoles and am tired of this place having so many threads about resolution and frame rates.

Man, it must really hurt that you have to click on a thread you don't like, read it, and respond to it.
 
I see this all the time, but is this an accurate metaphor or not?

Honestly from what I've heard, the ESRAM is a bottleneck in itself (it just isn't enough to do much with and complicates matters) and the DDR3 is a definite bottleneck to reaching parity with the PS4 in cross-platform games.

AKA Microsoft fucked up bad when they could've just thrown GDDR5 in the thing.
 

TyrantII

Member
I see this all the time, but is this an accurate metaphor or not?

Yes. The system's main unified memory is DDR3. It's absolutely a huge bottleneck, and the reason eSRAM is in there to sort of help the system out. Problem is, there's only 32MB of it.

Not even the 360 used DDR3. They used GDDR3 with a higher bus speed. You need a fast memory bus to pipe in huge next-gen assets.

DDR3 < GDDR3 < PCIe < GDDR5 in terms of speed and bandwidth.
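
To put the 32MB in perspective, here's a quick sketch of the render-target footprint for a hypothetical deferred renderer (four 32-bit colour targets plus a 32-bit depth buffer; no specific engine implied):

```python
# Render-target footprint: 5 buffers x 4 bytes per pixel (hypothetical G-buffer layout).
ESRAM_MB = 32

def gbuffer_mb(width: int, height: int, targets: int = 5, bytes_per_pixel: int = 4) -> float:
    return width * height * targets * bytes_per_pixel / (1024 ** 2)

for w, h in [(1920, 1080), (1600, 900), (1280, 720)]:
    size = gbuffer_mb(w, h)
    verdict = "fits" if size <= ESRAM_MB else "does NOT fit"
    print(f"{w}x{h}: ~{size:.1f} MB -> {verdict} in {ESRAM_MB} MB of ESRAM")
```

Under those assumptions, 1080p needs ~40MB of targets while 900p needs ~27MB, which is one plausible reason 900p keeps turning up on Xbox One.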
 

jelly

Member
I wonder who is to blame for this at Microsoft, because I can't imagine any engineer worth their salt agreeing to this mess, so it has to be the suits chopping them off at the knees.

It's frankly embarrassing performance for your money, even if Microsoft knock $100 off.
 

Lord Panda

The Sea is Always Right
I wonder who is to blame for this at Microsoft, because I can't imagine any engineer worth their salt agreeing to this mess, so it has to be the suits chopping them off at the knees.

It's frankly embarrassing performance for your money, even if Microsoft knock $100 off.

Mattrick and Ballmer? *shrugs*
 

PhatSaqs

Banned
Yes. The system's main unified memory is DDR3. It's absolutely a huge bottleneck, and the reason eSRAM is in there to sort of help the system out. Problem is, there's only 32MB of it.

Not even the 360 used DDR3. They used GDDR3 with a higher bus speed. You need a fast memory bus to pipe in huge next-gen assets.

DDR3 < GDDR3 < PCIe < GDDR5 in terms of speed and bandwidth.
Why did MS choose to downgrade from GDDR to DDR?
 
Yes. The system's main unified memory is DDR3. It's absolutely a huge bottleneck, and the reason eSRAM is in there to sort of help the system out. Problem is, there's only 32MB of it.

Not even the 360 used DDR3. They used GDDR3 with a higher bus speed. You need a fast memory bus to pipe in huge next-gen assets.

DDR3 < GDDR3 < PCIe < GDDR5 in terms of speed and bandwidth.

So, this is gonna be a thing for the next few years, then. At least until the devs figure out how to better work with it.
 

Philly40

Member
so 900p

[image]


wonder what it looks like on 360.


Just seeing this image made me check out what flavour of idiocy @PNF4LYFE was currently spewing out, but it seems he's been banned from Twitter.

It really takes a special amount of effort to get yourself banned on Twitter.
 

alr1ght

bish gets all the credit :)
Just seeing this image made me check out what flavour of idiocy @PNF4LYFE was currently spewing out, but it seems he's been banned from Twitter.

It really takes a special amount of effort to get yourself banned on Twitter.

Strangely, he called him out on the comment but seems to have deleted the post. I guess a ban deletes all of your posts.
[image]
 

PJV3

Member
Just seeing this image made me check out what flavour of idiocy @PNF4LYFE was currently spewing out, but it seems he's been banned from Twitter.

It really takes a special amount of effort to get yourself banned on Twitter.

He was pissing people off at Ready at Dawn last time I heard.
 

TyrantII

Member
Honestly from what I've heard, the ESRAM is a bottleneck in itself (it just isn't enough to do much with and complicates matters) and the DDR3 is a definite bottleneck to reaching parity with the PS4 in cross-platform games.

AKA Microsoft fucked up bad when they could've just thrown GDDR5 in the thing.

They couldn't, because they needed 8GB minimum for Apps/Snap/3OS/Kinect. DDR3 was the only cost-effective option when they were designing it.

They can't have their cake and eat it too. They chose this. Period. Stop the parity BS, because you didn't design and plan for it.

Stop lying about the power, and put up or shut up about that "vision" that was the reason in the first place. Visions are nice, but let's see implementations that make gamers go, "sure, I get it, it's worth it". Execute!
 

d9b

Banned
900p? Oh come on... They can't even run this game at 1080p 60fps. Please tell me this is a joke....
 
Yes. The system's main unified memory is DDR3. It's absolutely a huge bottleneck, and the reason eSRAM is in there to sort of help the system out. Problem is, there's only 32MB of it.

Not even the 360 used DDR3. They used GDDR3 with a higher bus speed. You need a fast memory bus to pipe in huge next-gen assets.

DDR3 < GDDR3 < PCIe < GDDR5 in terms of speed and bandwidth.

Except that the ESRAM isn't the culprit behind sub-1080p games. It isn't a bottleneck. The ROPs are. I invite you to the B3D forums for the proof if you want it.
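
For reference, the ROP argument in rough numbers (peak theoretical fillrate, one pixel per ROP per clock; actual throughput also depends on blending, formats, and bandwidth):

```python
# Peak fillrate = ROP count x core clock, assuming one pixel per ROP per clock.
xb1_gpix = 16 * 0.853   # 16 ROPs @ 853 MHz -> ~13.6 Gpix/s
ps4_gpix = 32 * 0.800   # 32 ROPs @ 800 MHz -> ~25.6 Gpix/s
print(f"XB1 ~{xb1_gpix:.1f} Gpix/s vs PS4 ~{ps4_gpix:.1f} Gpix/s ({ps4_gpix / xb1_gpix:.1f}x)")

# 1080p has ~1.44x the pixels of 900p -- comfortably inside that ~1.9x gap.
print(f"1080p/900p pixel ratio: {1920 * 1080 / (1600 * 900):.2f}")
```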
 
This is becoming embarrassing. Microsoft needs to admit it made the wrong decisions, or developers need to step the fuck up.

Yup, it was incredibly refreshing to see Kojima be completely straight and put the differences out there. More devs should do exactly that.

At 60fps, I can see it; 1080p will be 30fps games.

Some 1080p games, though I think 720/60 or 900/30 will be far more common. Thief is a 30FPS game running at 900p, and I believe TR also dips to 900p in certain cutscenes.
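
Raw pixel throughput for those combinations, as a crude proxy for per-pixel GPU load (it ignores CPU, geometry, and everything else):

```python
# Pixels per second for the resolution/framerate trade-offs being discussed.
for w, h, fps in [(1280, 720, 60), (1600, 900, 30),
                  (1920, 1080, 30), (1920, 1080, 60)]:
    print(f"{w}x{h} @ {fps}fps = {w * h * fps / 1e6:6.1f} Mpix/s")
```

By that measure 900/30 is actually the cheapest of the bunch, and 1080/60 costs roughly double 1080/30.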
 
I swear their tools are not done, and even if they were, it takes months for developers to integrate new SDK drops into their games with confidence. You won't see the fruits of whatever rendering work MS has been doing in released Xbox One games until September at the earliest. Time pressure and engine restructuring for any game in development is the issue.

Dude, it's not the tools.

The hardware is terrible, simple as that.
 

TyrantII

Member
Why did MS choose to downgrade from GDDR to DDR?

My guess is that with ATI and Nvidia moving exclusively to GDDR5 on most cards, demand and production are winding down. That means costs will be rising, and eventually it won't be commercially available.

DDR3 will eventually be phased out for DDR4, but that's a long way off. DDR3 is also very cheap at this point in its life cycle and widely available.

For reference, the PS4 was planned with as little as 2GB of GDDR5, settled on 4GB, and got 8GB through the planets aligning just right (higher-density GDDR5 chips becoming commercially viable in early 2013). Since they had planned on that memory type since 2010 (most likely), no hardware changes were needed.
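
Those 2/4/8GB steps fall straight out of chip arithmetic, assuming standard 32-bit GDDR5 parts on the PS4's 256-bit bus (a sketch; the exact parts Sony used are their business):

```python
# GDDR5 chips have 32-bit interfaces: a 256-bit bus takes 8 chips,
# or 16 in clamshell mode. Capacity = chip count x chip density.
for chips, mb_per_chip, note in [(8, 256, "2Gb parts"),
                                 (16, 256, "2Gb parts, clamshell"),
                                 (16, 512, "4Gb parts, clamshell")]:
    print(f"{chips:2d} x {mb_per_chip} MB ({note}) = {chips * mb_per_chip // 1024} GB")
```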

Strangely enough, it's also the first time a Sony console hasn't had a convoluted and slower memory architecture. Devs yelled from the rooftops, and Cerny apparently complied. Now, with a little luck, those dividends are paying off.
 