
PopCap dodges whether PvZ: Garden Warfare is 1080p on Xbox One (update: it's 900p)


It looks nice and rather detailed... but nobody is going to mistake it for Battlefield 4 visually. Unfortunate it couldn't hit 1080p, but 60fps is more important. If you MUST choose one for an FPS, 60 is the right choice.
 
Yes. The system's main unified memory is DDR3. It's absolutely a huge bottleneck, and the reason the eSRAM is in there to help the system out. Problem is, there's only 32MB of it.

Not even the 360 used DDR3. It used GDDR3 with a higher bus speed. You need a fast memory bus to pipe in huge next-gen assets.

DDR3 < GDDR3 < PCIe < GDDR5 in terms of speed and bandwidth.

Xbone's DDR3 has 3 times the speed of 360's GDDR3 bus.

Also, the picture isn't correct, at least not for ideal usage: the point of the eSRAM is to isolate the biggest bandwidth consumers, not to bypass them to main RAM.
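For what it's worth, the "3 times the speed" figure roughly checks out against the commonly cited specs. A quick sketch (peak theoretical bandwidth only; the exact clocks are the usual published numbers, so treat them as approximate):

```python
# Peak memory bandwidth = bus width (bytes) * transfer rate.
# Specs are the commonly cited figures for each console's main RAM.
def bandwidth_gb_s(bus_bits, transfers_mt_s):
    """Theoretical peak bandwidth in GB/s for a memory bus."""
    return (bus_bits / 8) * transfers_mt_s / 1000

x360_gddr3 = bandwidth_gb_s(128, 1400)  # 360: 128-bit GDDR3 @ 700 MHz (1.4 GT/s) ≈ 22.4 GB/s
xb1_ddr3 = bandwidth_gb_s(256, 2133)    # XB1: 256-bit DDR3-2133 ≈ 68.3 GB/s

print(xb1_ddr3 / x360_gddr3)  # ≈ 3.05, hence "3 times the speed"
```

Real-world throughput is lower than these peaks, but the ratio between the two buses is about right.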
 

TyrantII

Member
Dude it's not the tools

The hardware is terrible, simple as that.

Nah.

The marketing is terrible. The execution of the vision they designed the hardware around is terrible. The lack of software and experiences to back up those choices is terrible. Their misreading of early adopters and what gamers want is terrible.

The hardware itself is fine for what it was supposed to do.
 

enzo_gt

tagged by Blackace
I'd take 900p or 720p on the PS4 version if it had splitscreen. Forget about the resolution, EA, make the changes where they matter!
And the PC version too
 

zod18

Banned
They lost me on this game when they announced the Xbox-exclusive local multiplayer... yeah f*** you too, Popcap.
 

rdrr gnr

Member
I would bet in 2 years time they are all 1080p, they will just have worse fps than the ps4 and crappier AA solutions.
Unlikely, IMO. There seem to be issues at the hardware level, and those issues may be insurmountable (and third parties may not care). Especially because console titles tend to emphasize on-screen goodies over IQ or performance. Inferior framerates are already part of the equation due to the processing-power disparity; I don't think they have that much flexibility in the first place. And what happens late-gen, when everyone starts chipping away at the 1080p standard the way they did the 720p standard this gen? It's not looking good. I think it's almost guaranteed this disparity will exist, and possibly worsen, for the entirety of this gen.
 

TyrantII

Member
Xbone's DDR3 has 3 times the speed of 360's GDDR3 bus.

Also, the picture is not correct, not on an ideal usage anyway, the point of esram is to isolate the biggest bandwidth consumers, not bypass it to the main ram.

Yup, you're right. I didn't notice the absurd clock speed of the 360's RAM. Funnily enough, when looking it up I came across a blog post by Major Nelson claiming real-world bandwidth is everything [360 vs PS3].


The picture is a simplification, but it's right. Most people are just using it as a frame buffer atm (how it was mostly used on the 360). Eventually it'll be used better, to store frequently requested data, but 32MB is slim pickings in a 1080p60 HD world with advanced shaders and terribly large assets.
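Back-of-envelope math shows why 32MB is slim pickings at 1080p (the buffer formats here are illustrative assumptions about a typical renderer, not any engine's actual layout):

```python
# Size of one 1080p render target at a given bytes-per-pixel.
def buffer_mb(width, height, bytes_per_pixel):
    """Render target size in MB (1 MB = 1024*1024 bytes)."""
    return width * height * bytes_per_pixel / (1024 * 1024)

color = buffer_mb(1920, 1080, 4)  # 32-bit RGBA color target ≈ 7.9 MB
depth = buffer_mb(1920, 1080, 4)  # 32-bit depth/stencil ≈ 7.9 MB

# Double-buffered color plus a depth buffer already eats ~23.7 MB of
# the 32 MB, before any extra G-buffer targets for deferred shading.
print(2 * color + depth)  # ≈ 23.7
```

Add the multiple fat G-buffer targets a deferred renderer wants and you can see why devs drop to 900p or juggle what lives in the eSRAM.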

Some smart people will do some amazing things with it, but outside of first-party titles I actually expect the gulf to widen. 3rd party devs: ain't nobody got time for that s**t.

(unless they build something into their own engines and dev tools to leverage across multiple projects)
 

d9b

Banned
Yup, it was incredibly refreshing to see Kojima be completely straight and put the differences out there. More devs should do exactly that.

Kojima is a superstar of video games; if any other developer did that, Microsoft would have their head on a plate.
 
And people still say the eSRAM isn't causing any difficulties. Personally, I would think the XB1's GPU would be capable of outputting this game at 1080p/60fps, although it's been a while since I've looked at the performance of equivalent PC cards.
 
Kojima is a superstar of video games; if any other developer did that, Microsoft would have their head on a plate.

I don't see why that'd be the case, and if MS tried anything, word would get out and there'd just be even bigger backlash. At least, in my opinion.
 

TheCloser

Banned
I would bet in 2 years time they are all 1080p, they will just have worse fps than the ps4 and crappier AA solutions.

Nope, not possible. If that were the case, they would need to cut down significantly on the effects, the texture resolution, the complexity of the geometry, etc. It's really not worth it.
 

TyrantII

Member
Except that the esram isn't the culprit of sub 1080 games. It isn't a bottleneck. The rops are. I invite you to the b3d forums for the proof if you want it.

From what I've read, sticking a better GPU in it would do nothing, since it would be bus-starved, waiting around with nothing to do. ROPs matter, true, but they'd also need to increase the bus bandwidth, so you'd end up with a system that looks less like it does now, and is much less cost-effective.

If they went with a better GPU, then they'd have to use GDDR5. If they did that, they couldn't budget for 8GB, since it costs too much. And if that's the case, it might not fly at all, and would throw a wrench into their "living room/app/media" vision, which was the overriding driver behind the hardware choices in the first place.
 