
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

Biker19

Banned
It's the wrong tool for the job. It was interesting when it first came out, but that time has passed. It's not useful for gaming, and if they wanted it for voice recognition and voice control, there are better options. The Kinect never needed to improve. All you need Kinect for is to manage the living room experience. You need a camera for Skype and a mic with good voice recognition software. You can give verbal commands to the Xbox, and there is no need for gestures. A complete waste of money.

I agree. Nobody's gonna care about Kinect 2, especially when they've already "been there & done that" with the Xbox 360.

MS seem to be the ones talking about specs and console performance. The Eurogamer article was such a strange move for MS, comparing their console to the PS4 when they clearly have the weaker hardware.

They shouldn't have made the console mostly an "all entertainment/media hub" in the first place, instead of making a console with gaming in mind first.

When you put most of your efforts towards TV, Kinect, etc., & less towards gaming, then greater specs are gonna be put on the backburner. You can't have both; it has to be one or the other.
 

Skeff

Member
It's the wrong tool for the job. It was interesting when it first came out, but that time has passed. It's not useful for gaming, and if they wanted it for voice recognition and voice control, there are better options. The Kinect never needed to improve. All you need Kinect for is to manage the living room experience. You need a camera for Skype and a mic with good voice recognition software. You can give verbal commands to the Xbox, and there is no need for gestures. A complete waste of money.

I think you're right. If there were a four-microphone array built into the XB1 itself and a simple IR blaster, they could achieve the vast majority of the XB1's useful features at a fraction of the cost.
 

CLEEK

Member
Penello referenced the early rumours/reports that the PS4 was "balanced" for 14 CUs. Does that mean the PS4 can only efficiently use 14 CUs and is better off using the additional 4 CUs for GPGPU stuff?

In short: No. The 14+4 was just an early rumour, one that Cerny said wasn't true. More to the point, as far as I'm aware, this is not how GPGPU works. The more CUs the better, for both rendering and GPGPU. There is no 'balance' or optimal number of CUs.

I answered this in detail in another thread.

http://www.neogaf.com/forum/showpost.php?p=83392041&postcount=1586
 
Not into stocks, but if I'm not mistaken, past results don't influence stock prices; only current and future results do.

This is exactly right. Stock prices are a reflection of future revenues; what happened last year or two years back is irrelevant.

When someone wants to "fire Ballmer and sell the Xbox division", that's because the Xbox division as it stands is profitable, and Microsoft is better off taking the cash for it now than modest returns over the next several years. This is... debatable. MS doesn't really NEED the Xbox division, but it isn't gaining OR losing them significant amounts of money each year. IIRC they lost more on Bing in the last year or so than they've lost on the Xbox, ever.

Ballmer himself is just viewed as a lead anchor on Microsoft as a whole, consistently missing the boat on every significant trend. Getting rid of him creates the assumption that someone more suitable will lead Microsoft to growth, spiking the stock price - which is exactly what happened when Ballmer retired.

That being said, shareholders are very much focused on long-term gains, and if there is a history of losses, that will matter in both companies' cases.

Not at all. The typical investor holds on to a given stock for about 11 months. VERY few investors are concentrating on long-term gains, which is kind of the problem with Wall Street right now.
 

CLEEK

Member
They have officially written off the original Xbox era as a loss. Adding that to the current Xbone/Xbox figures doesn't make any sense. Currently it is important for them to make money, and so far the Xbox 360, and probably the Xbone, will keep making money.

Shareholders don't give a shit about historic losses. And MS have far deeper pockets than most, so the fact that the original Xbox made them haemorrhage money is something they could deal with. It's not like it burned up all their cash reserves.

All that matters is the current results, how things are tracking for this financial year, and how they look in the short-term future. The fact that the Xbox has been profitable for the past few years means that no-one, shareholders or execs, wants to see losses again.
 

nib95

Banned
I know Bkillian is an ex-Microsoft guy who worked on the Xbox One's design, and is pretty invested in it (99% of his posts are about the XO and its design), but just something to add some sauce to the thread...

Bkillian said:
Adding cores never scales linearly. For their statement to be true on a GFLOPS basis (which it probably isn't), a scaling efficiency of 95.6% per core added would provide 12CUs at 853MHz with the same horsepower as 14CUs at 800. At that (bogus) scaling factor, 18 CUs at 800MHz would only be 7.4% faster than 12CUs at 853MHz.

The real scaling factor is guaranteed to be higher than that, but it will still be <1, and the scaling factor probably goes down with increasing cores, since the bottlenecks become even more important. You're going to have a lot more contention for a 256 bit memory bus at 12 cores than at 2. What this means is that in real terms, PS4 does not have, and never had, a 50% GPU advantage. What the actual advantage is, we may never know.

http://beyond3d.com/showpost.php?p=1787463&postcount=6534

Thoughts?
 

JaggedSac

Member
Report: Microsoft's Xbox division has lost nearly $3 billion in 10 years.


There is no Xbox division. There is an "Entertainment and Devices Division" that includes Xbox, Windows Phone, Surface, etc. Now consider the fact that the division is making a profit with those other things in there.
 

CLEEK

Member
Thoughts?


Part of what he's saying is true. You don't get a 1:1 increase in throughput by adding more CUs.

But it is an undeniable fact that the PS4 has 50% more theoretical performance than the Xbone. That's all theoretical performance is: just the product of CU count, ALU width and clock, giving the FLOPS measurement.
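To put rough numbers on that (using the widely reported spec figures - 18 CUs at 800MHz for the PS4, 12 CUs at 853MHz for the post-upclock Xbone - and assuming GCN's 64 shader lanes per CU with 2 ops per cycle via FMA):

```python
# Theoretical peak throughput for a GCN GPU, as a quick sanity check.
# Peak GFLOPS = CUs x 64 lanes x 2 ops/cycle (fused multiply-add) x clock.
def peak_gflops(cus, clock_mhz, lanes_per_cu=64, ops_per_cycle=2):
    return cus * lanes_per_cu * ops_per_cycle * clock_mhz / 1000.0

ps4 = peak_gflops(18, 800)   # 1843.2 GFLOPS
xb1 = peak_gflops(12, 853)   # ~1310.2 GFLOPS
print(ps4 / xb1)             # ~1.41 after the upclock
```

At matched 800MHz clocks the ratio is exactly 18/12 = 1.5; the upclock trims it to roughly 41%.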

There are weasel words in his justification as to why he feels the PS4 doesn't have 50% advantage. "probably goes down" is not the same as "does go down" or "here is the maths to show how it goes down".

As far as Bkillian goes, when he writes something about the Xbone itself, I trust what he says completely. When he starts making comparisons to the PS4, I usually wince, as he moves from having objective knowledge about something to just having a subjective opinion, influenced no doubt by the fact that he worked on the Xbone.
 

nib95

Banned
I think artist's posts and findings belong here too.

It is linear, at least where it's relevant.

untitledycq6e.png


untitledm0f61.png


This scaling-beyond-14CU nonsense is getting laughable.


Trying to get the discussion back on topic: I went around mapping out the GCN family GPUs and their resources, trying to see if there were any balanced (or unbalanced) GPUs;

(Edited with XO/PS4 figures, Esram based and DDR3 only)


untitled5rfqo.png


untitledxtogz.png


From the above you can see that prim rate doesn't really affect overall performance as much - see how the 7790 has more triangle output than the 7850; the same goes for the 7870 over the 7950/7970.

Second, an excess of pixel fill (ROPs) isn't of tangible benefit either - the 7870 has 25% more pixel fill than the 7950, yet that doesn't translate well in games because it's lacking in other areas - compute/texel fill.

These are the two areas that Microsoft decided to strengthen by going with the upclock, and the two areas they gave away were texel fill and compute.

How again is this a more balanced design? Oh yeah, those numbers that they ran on current titles, ones which will never be public domain. ¬_¬

*Performance source


Yeah.

Although I think the new graph you've made does a better job and is more useful, since it shows a lot of other metrics too and factors clock rate into compute performance.


Just looking at the 7850 and 7750 on your new graph, for instance, where all the other metrics are also increased proportionally, game performance essentially scales perfectly (with compute). Compare that with the 7750 and the 7790, where an increase in compute performance of (eyeballing it) around 120% only nets about a 60% performance benefit, presumably because the other metrics like pixel fill rate and memory bandwidth haven't kept up. The 7850 vs the 7970 seems to be a similar case, etc.

Very interesting.
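For what it's worth, that eyeballed ~120% figure checks out against the published card specs (assumed here: HD 7750 = 8 CUs @ 800MHz, HD 7790 = 14 CUs @ 1000MHz, both GCN with 64 lanes per CU):

```python
# Peak compute from CU count and clock; same GCN formula used elsewhere in the thread.
def gflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz / 1000.0

hd7750 = gflops(8, 800)      # ~819 GFLOPS
hd7790 = gflops(14, 1000)    # ~1792 GFLOPS
print(hd7790 / hd7750 - 1)   # ~1.19, i.e. the roughly 120% compute jump eyeballed above
```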


Yup.

In terms of scaling linearly, the weighting of the different metrics translating into performance across games looks like:
1. Compute
2. Texel Fill
3. Pixel Fill/Mem Bandwidth
4. Prim Rate
 
I know Bkillian is an ex-Microsoft guy who worked on the Xbox One's design, and is pretty invested in it (99% of his posts are about the XO and its design), but just something to add some sauce to the thread...



http://beyond3d.com/showpost.php?p=1787463&postcount=6534

Thoughts?
This may be true for launch games to a certain degree, but as developers come to know the bandwidth limitations, they will start to use the CU caches more efficiently and the CUs will run at close to peak performance. Developers hate to leave hardware idling.
 
I know Bkillian is an ex-Microsoft guy who worked on the Xbox One's design, and is pretty invested in it (99% of his posts are about the XO and its design), but just something to add some sauce to the thread...



http://beyond3d.com/showpost.php?p=1787463&postcount=6534

Thoughts?

I might have bad reading comprehension, but it looks like he's saying that yes, CUs don't always scale completely linearly, because you have tons of aspects to take into account, but the math Microsoft used to reach their conclusion makes no sense. Yes, it's obvious that the real-life scaling factor is not exactly 100%, but what he's saying is that if you extrapolate performance from Microsoft's claim that 12CU@853MHz is similar to or better than 14CU@800MHz, then 18CU@800MHz is only 7.4% better. Which, he states, makes absolutely no sense, as the real-life scaling factor will be far better than that.

The only thing he really defends the Xbox One with is that the PS4's GPU is definitely not a clean 50% more powerful, which I'm inclined to believe. Really, he's not saying anything new here.
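His arithmetic does reproduce, for what it's worth. If you model each added CU as contributing with a constant efficiency e, so perf(n) = clock * n * e^n, then equating 12 CUs @ 853MHz with 14 CUs @ 800MHz pins down e, and under that (bogus, as he says himself) model 18 CUs at 800MHz come out only ~7.4% ahead:

```python
import math

# Bkillian's implied scaling model (deliberately bogus, per his own post):
# each added CU contributes with constant efficiency e, so perf(n) = clock * n * e**n.
# Solving 853 * 12 * e**12 == 800 * 14 * e**14 for e:
e = math.sqrt((853 * 12) / (800 * 14))

def perf(cus, clock_mhz):
    return clock_mhz * cus * e**cus

gain = perf(18, 800) / perf(12, 853) - 1
print(round(e, 3), round(gain * 100, 1))  # 0.956 and 7.4 - both match his figures
```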
 

artist

Banned
I know Bkillian is an ex-Microsoft guy who worked on the Xbox One's design, and is pretty invested in it (99% of his posts are about the XO and its design), but just something to add some sauce to the thread...



http://beyond3d.com/showpost.php?p=1787463&postcount=6534

Thoughts?
What he's saying is true, but it's only relevant when the CU count is beyond a certain threshold. See:

untitledycq6e.png


Performance scales pretty nicely up to 20 CUs (with a certain amount of tex/pix/mem bw resources). Beyond that, for example, adding 40% more ALUs results in only a ~12% increase because of bottlenecks.
 
What he's saying is true but is only relevant when the CU count is beyond a certain threshold. See;

untitledycq6e.png


Performance scales pretty nicely until 20CUs (with a certain amount of tex/pix/mem bw resources). Beyond that for example, adding 40% ALUs results in ~12% increase because of bottlenecks.

Not to be nitpicky but wouldn't it make more sense to label the y-axis "CU Count" and have the scale go by tens?

I.E. 10 20 30 etc.

Of course this is coming from someone too lazy to make a graph in the first place
 

onQ123

Member
I might have bad reading comprehension, but it looks like he's saying that yes, CUs don't always scale completely linearly, because you have tons of aspects to take into account, but the math Microsoft used to reach their conclusion makes no sense. Yes, it's obvious that the real-life scaling factor is not exactly 100%, but what he's saying is that if you extrapolate performance from Microsoft's claim that 12CU@853MHz is similar to or better than 14CU@800MHz, then 18CU@800MHz is only 7.4% better. Which, he states, makes absolutely no sense, as the real-life scaling factor will be far better than that.

The only thing he really defends the Xbox One with is that the PS4's GPU is definitely not a clean 50% more powerful, which I'm inclined to believe. Really, he's not saying anything new here.

Pretty much!


But the fact is the devs have seen a 50% advantage (maybe even after the Xbox One upclock).

My thinking is that if this was after the upclock there must be something else going on.

Maybe it's because of the 10% of the GPU reserved for the Xbox OS, or it could be that the PS4 chip is just designed better, or it could be that the PS4 is more powerful than we think.
 

Biker19

Banned
Probably not. And investors of MS want high margins.

Yeah, if the Xbox One doesn't turn a significant amount of profit for their investors/shareholders (& they mean BIG profits, not the small ones they made on the Xbox 360 from 2008 to present), they'll force Microsoft to close down the Xbox division. They're already getting pretty fed up as it is with the Xbox brand racking up a lot of losses across both the original Xbox & the Xbox 360.
 

mjswooosh

Banned
Eh, I meant it as an advantage in the sense that the Xbox has that extra functionality... how it plays out mass-market remains to be seen... same as how the PS4's better graphics will play out mass-market too... just trying to keep my posts even-keeled, cause while I clearly feel the PS4 is stronger, I do understand part of the reason the Xbox is weaker is their focus on Kinect and TV integration... different approaches... I prefer the PS4 but some may prefer the Xbox One.

Fair enough. However, the way I've viewed this topic in bullet points is that features are only an advantage if said functionality is enjoyed/embraced by a majority of the target market, thereby validating said features. ;)

I suppose you could still argue Betamax "over-delivered value" vs. VHS (to borrow a Mattrick-ism)... but the market said, "uh, more expensive? No thanks". Ultimately, if the games on Kinect 2.0 are a marked improvement, then it may gain enough traction to continue justifying its pack-in status. But my money is on MS being forced to release a SKU without Kinect to keep long-term sales going after the launch hype dies down. A feature is only an advantage if it offers something truly compelling that the competition doesn't offer. It remains to be seen if anyone beyond early adopters, and casuals who have moved on to tablet gaming, really cares, or if most people will view the Kinect's extra $100 over a PS4 as a non-value-added expense.
 

mrklaw

MrArseFace
I know Bkillian is an ex-Microsoft guy who worked on the Xbox One's design, and is pretty invested in it (99% of his posts are about the XO and its design), but just something to add some sauce to the thread...



http://beyond3d.com/showpost.php?p=1787463&postcount=6534

Thoughts?

He's right in that 50% more CUs doesn't mean 50% more power. Sony's own comments about a 'not 100% round' architecture, and their encouragement of GPGPU use, also back that up - diminishing returns, etc.

However, if you take all the other factors into account - memory bandwidth, number of CUs, number of TMUs, number of ROPs - I think it might not be unreasonable for some devs to find the overall hardware '50% faster' by whatever measure they prefer.
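Putting the commonly reported spec figures side by side (assumptions, not official numbers: PS4 with 18 CUs / 72 TMUs / 32 ROPs / 176GB/s GDDR5; XB1 with 12 CUs / 48 TMUs / 16 ROPs / 68GB/s DDR3 plus the 32MB ESRAM, which this crude ratio ignores):

```python
# Resource ratios behind the '50% faster' intuition; ESRAM deliberately ignored here.
ps4 = {"CUs": 18, "TMUs": 72, "ROPs": 32, "main_bw_GBps": 176}
xb1 = {"CUs": 12, "TMUs": 48, "ROPs": 16, "main_bw_GBps": 68}

for key in ps4:
    print(key, round(ps4[key] / xb1[key], 2))
# CUs 1.5, TMUs 1.5, ROPs 2.0, main_bw_GBps ~2.59
```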
 

RoboPlato

I'd be in the dick
He's right in that 50% more CUs doesn't mean 50% more power. Sony's own comments about 'not 100% round' architecture and encouraging use of GPGPU also backs that up - diminishing returns etc.

However, if you take all other factors into account - memory bandwidth, number of CUs, number of TMUs, number of ROPs, I think it might not be unreasonable for some devs to find the overall hardware being '50% faster' by whatever measure they prefer.

Exactly what I was going to say. It could also be that the PS4's more mature API is making it easier to get more performance as well.
 

antic604

Banned
Man, MS is really trying very hard! So - in their understanding - the PS4 is not balanced because it has some additional GPU grunt to be used in the future for GPGPU? I believe it was also already discussed that increasing the Xbox's CU count could indeed return disproportionately small performance gains, because they're limited by ROPs, ACEs and maybe b/w as well?

But in the end, I really can't fault MS engineers for being proud of their design. Given the requirements they were handed for the system (a large pool of RAM, seamless multitasking across 3 OSes, integrated Kinect support, etc.), they really seem to have created an architecture that does its best to tick all the boxes within the given $$ and die-size budgets, with a minimum number of trade-offs, and with optimizations that reduce the negative impact of those trade-offs. In that sense I can see why some might say that the PS4 uses - despite a number of clever tweaks - 'brute force' while the Xbox is more 'balanced'.

The problem is that these consoles will be on the market for at least 5 years, and building a machine based on 'profiling' current-gen games can prove insufficient. For example, MS seems to hint at a different approach to GPGPU than the one Sony took; in particular, they said that it's more important to have low-latency access to memory than additional ALUs. That's definitely true for some algorithms that work on small amounts of data, but most likely not all of them. I think game development over the last 8 years was really held back by X360/PS3 memory constraints and outdated GPU architectures, and now devs will finally be able to 'spread their wings' in terms of new ideas and algorithms, even if the limitations compared to high-end PCs are still there.

If that's the case, it may happen that the Xbox won't be able to do some of those things unless graphics quality takes a further hit by reserving some of the CUs for GPGPU (provided the limited number of ACEs is enough to manage the load). Alternatively, multiplat games will again be limited to only the types of GPGPU that can be handled by both consoles, while only 1st parties push the new stuff.

Having said that, I'm looking forward to the next DF article: "Microsoft's approach to asynchronous GPU compute is somewhat different to Sony's - something we'll track back on at a later date."
 
He's right in that 50% more CUs doesn't mean 50% more power. Sony's own comments about 'not 100% round' architecture and encouraging use of GPGPU also backs that up - diminishing returns etc.

However, if you take all other factors into account - memory bandwidth, number of CUs, number of TMUs, number of ROPs, I think it might not be unreasonable for some devs to find the overall hardware being '50% faster' by whatever measure they prefer.


How about hUMA? Is that still a factor?
 

gofreak

GAF's Bob Woodward
He's right in that 50% more CUs doesn't mean 50% more power. Sony's own comments about 'not 100% round' architecture and encouraging use of GPGPU also backs that up - diminishing returns etc.

I think there's a slight blur of two different things in his comment too. The benefit of additional CUs in the XB1 GPU and the benefit of additional CUs in the PS4 GPU.

On XB1, to see more CUs reflected in software performance, you would need a higher skew of demand towards ALU in your software than you would need on PS4.

PS4 has more of simply everything along the pipeline, bar prim setup I guess.

His comments are also generalising where you really need software to provide a context. The value of a given resource, or more of a given resource, depends entirely on a piece of software's demand on that resource vs others, and that naturally varies. Saying 'PS4's GPU was/is never 50% better than XB1' is meaningless. In some software it won't be, in some it may well be, in some it might be more than that even (theoretically, if a piece of software had demanding bw or output characteristics).

And finally, it ignores the potential for console targeted software to shape itself against hardware for better utilisation. And for all the talk of 'efficiency' and 'balance' being bandied about, I see potential hurdles to optimum utilisation on XB1 in its 'customisations' while Sony's are pretty logical extensions based on experiences and challenges in the PC space. This is not at all a case of 'elegance' vs 'brute force'.
 

Riky

$MSFT
Just got the latest edition of Edge, they seem to be coming around to X1.

Here, right now, Microsoft has an edge. The £80 price disparity between the two consoles will be a decision maker for many, but for others, Microsoft's exclusives are more convincing than anything Sony has shown on PS4 to date. Killzone is no Halo; Driveclub doesn't look like measuring up to Forza; and Dead Rising 3 now looks like a smarter bet for open world mayhem than Second Son.

Even a second-tier title like Ryse makes a stronger case for its host hardware's graphical capabilities, at least, than anything set for PS4's launch day, while no next-generation multiplayer game can match Respawn's work on Titanfall - an exclusive secured with Microsoft's financial backing, and a game utterly reliant on the company's cloud to synchronise its AI and provide dedicated servers as and when needed. Microsoft's own studios and its willingness to open its chequebook has ensured that Xbox One has 12 months' worth of exclusives that stand against the best ever seen in a launch year.

There are also a couple of lines about the memory difference, but one quote is from Ken Lobb, so I expect he would say that since he works there. The other is from Anton Yudintsev of Gaijin Entertainment:

PS4's GDDR5 is basically 50 per cent more powerful than DDR3. But the memory write is bigger on Xbox One. So it depends on what you're doing. PS4 is more powerful, but you can't just write to memory, you need to read sometimes.

Presume this is something to do with the ESRAM double-bandwidth claim?
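For reference, here's where the headline main-memory bandwidth figures come from, assuming the commonly cited bus widths and data rates (256-bit buses on both consoles, 5500MT/s GDDR5 vs 2133MT/s DDR3). The 'double bandwidth' ESRAM talk comes from counting simultaneous read and write on the 32MB ESRAM, not from this main-memory figure:

```python
# Peak main-memory bandwidth = (bus width in bytes) * (transfers per second).
def bandwidth_gbps(bus_bits, mega_transfers_per_s):
    return (bus_bits / 8) * mega_transfers_per_s / 1000.0

print(bandwidth_gbps(256, 5500))  # 176.0 GB/s - PS4's GDDR5
print(bandwidth_gbps(256, 2133))  # ~68.3 GB/s - XB1's DDR3
```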
 

antic604

Banned
And finally, it ignores the potential for console targeted software to shape itself against hardware for better utilisation. And for all the talk of 'efficiency' and 'balance' being bandied about, I see potential hurdles to optimum utilisation on XB1 in its 'customisations' while Sony's are pretty logical extensions based on experiences and challenges in the PC space. This is not at all a case of 'elegance' vs 'brute force'.

I still believe it can be treated as such - the PS4's 'brute force' (I know it sounds ridiculous to PC GAF...) will let you do everything with good-enough performance, while the Xbox's approach is more elegant/balanced for very specific tasks (e.g. latency-dependent GPGPU, certain types of h/w-aided sound processing, tiled rendering) because it can use less raw power to produce the same result - that can undoubtedly be labelled 'elegant'.

Obviously, the question is how easy those Xbox customizations are for devs to use, how 'fixed' they are in their function, and how right MS was about future game-development paradigms. If the answers to all of those are negative, the 'elegance' of the Xbox might well turn against it, in particular for 3rd-party titles.

I'd really love Sony to come out and reveal all their customizations, because apart from the increased ACE count, the Garlic/Onion+ buses, the 'volatile' flag and the 2nd SoC for background processing, we don't know a lot...
 

gofreak

GAF's Bob Woodward
I still believe it can be treated as such - PS4's 'brute force' (I know it sounds ridiculous to PC GAF...) will let you do everything with good/enough performance, while Xbox' approach is more elegant/balanced for very specific tasks (e.g. latency-dependent GPGPU, certain types of h/w-aided sound processing, tiled rendering) because it can use less raw power to produce the same result - that undoubtedly can be labelled 'elegant'.

With optimisation, either can punch above its weight. Can 'use less raw power to produce the same result', to put it in simplistic terms. Both have hardware features different from what is typical in the PC space that provide opportunities to do more in certain cases. And I'd actually argue that PS4's vectors for optimisation are more widely applicable than XB1's.

But all of that is not 'elegance' IMO.

Out of the box, pre significant optimisation, I would wager PS4 will do better, perhaps significantly better, with more shapes of software than XB1. That's flexibility. That's 'elegance'. That's 'balance'.

Microsoft's conversation about balance here is focussed on one thing. I'd bet there are more than a few multiplat devs crunching for launch on both consoles smiling wryly at the idea that Microsoft's solution is more elegant or balanced than PS4.
 

B.O.O.M

Member
"Killzone is no Halo; Driveclub doesn't look like measuring up to Forza; and Dead Rising 3 now looks like a smarter bet for open world mayhem than Second Son.

Even a second-tier title like Ryse makes a stronger case for its host hardware's graphical capabilities, at least, than anything set for PS4's launch day"

LOL wtf am I reading
 
Why was the guy talking about elegance banned?

Something of an aside regarding Ryse, but it would look much less uncanny valley if they fixed their art direction with regard to where the characters are looking. In pretty much every screen I've seen of it the characters are all staring vacantly at nothing.
 
I still believe it can be treated as such - PS4's 'brute force' (I know it sounds ridiculous to PC GAF...) will let you do everything with good/enough performance, while Xbox' approach is more elegant/balanced for very specific tasks (e.g. latency-dependent GPGPU, certain types of h/w-aided sound processing, tiled rendering) because it can use less raw power to produce the same result - that undoubtedly can be labelled 'elegant'.

Obviously, the question is how easy are those Xbox' customizations to use for the devs, how 'fixed' they are in their function and how right was MS about future game development paradigms. If answers are negative to all of those, the 'elegance' of Xbox might well turn against it, in particular for 3rd party titles.

I'd really love Sony to come out and reveal all their customizations, because apart from increased ACEs, Garlic/Onion/+ busses, 'volatile' flag and 2nd SoC for background processing we don't know a lot...
Not quite sure I follow the logic here. The Xbox is balanced, so it will be good at 'very specific tasks'? The notion of balance surely is that it is capable of doing all things in equal measure. Being very good at a few things and average at others isn't balanced.
 

Raydeen

Member
I suppose it's like a Nissan GT-R vs a Bugatti Veyron.

The Bugatti would smoke anything on a straight road, but the GT-R may be nippier and more useful in certain scenarios (racing around a tight-cornered city circuit).
 

nib95

Banned
I suppose it's like a Nissan GT-R vs a Bugatti Veyron.

The Bugatti would smoke anything on a straight road, but the GT-R may be more nippy and useful in certain scenarios (racing around a tight corner city).

But that's not a fair comparison. A better one is that the Xbox One is a Ford Mustang GT whilst the PS4 is the Nissan GT-R Black Edition. In other words, the PS4 has more performance and is better balanced (more customisations, creative bandwidth management, faster bandwidth, more forward thinking GPGPU features, secondary custom chip etc), and has less weight (Kinect, snap etc) to worry about.
 
Page 12 - it's an Edge article, "Power Of The Crowd".

The front cover has a quote

"XBOX ONE: FROM CRISIS TO CONTENDER"

I'm a fan of Edge because they tend to base reviews on a game's own merits and give scores to the individual game. Dead Rising versus inFamous sounds like a very bad comparison coming from a "fanboy"; it sounds even worse coming from Edge.

Maybe new-generation fumes are getting to them?
 

Mastperf

Member
I'm a fan of Edge because they tend to base reviews on a game's own merits and give scores to the individual game. Dead Rising versus inFamous sounds like a very bad comparison coming from a "fanboy"; it sounds even worse coming from Edge.

Maybe new-generation fumes are getting to them?
Probably just making sure they stay the topic of conversation on forums.
 

TechnicPuppet

Nothing! I said nothing!
I'm a fan of Edge because they tend to base reviews on a game's own merits and give scores to the individual game. Dead Rising versus inFamous sounds like a very bad comparison coming from a "fanboy"; it sounds even worse coming from Edge.

Maybe new-generation fumes are getting to them?

Maybe they have played both of them and prefer Dead Rising.
 
I'm a fan of Edge because they tend to base reviews on a game's own merits and give scores to the individual game. Dead Rising versus inFamous sounds like a very bad comparison coming from a "fanboy"; it sounds even worse coming from Edge.

Maybe new-generation fumes are getting to them?
Yeah, even I agree with that. Even if they played both, that sounds all-around kind of suspect. Anybody else get Edge to see if this is true?
 

iceatcs

Junior Member
I'm a fan of Edge because they tend to base reviews on a game's own merits and give scores to the individual game. Dead Rising versus inFamous sounds like a very bad comparison coming from a "fanboy"; it sounds even worse coming from Edge.

Maybe new-generation fumes are getting to them?
I think they needed to balance things out, having been told off over the past few months - they went crazy with the MS hating for a while.
Also, they met the MS team; that's something they should take professionally, not personally.
 
Maybe they have played both of them and prefer Dead Rising.

It barely makes any sense. The two aren't even remotely comparable. Beyond the fact that they're both third-person open-world games, there's nothing similar about them.

I mean, I guess the EDGE writer can come to that conclusion, but it's a strange conclusion to make.
 

Mastperf

Member
It barely makes any sense. The two aren't even remotely comparable. Besides that they're both third person open world games, there's nothing similar about the two.

I mean I guess the EDGE writer can come to that conclusion but its a strange conclusion to make.
Yes, both open-world games that try to achieve very different things.
 

foxbeldin

Member
It barely makes any sense. The two aren't even remotely comparable. Besides that they're both third person open world games, there's nothing similar about the two.

I mean I guess the EDGE writer can come to that conclusion but its a strange conclusion to make.

I kind of agree with that. Plus, the Killzone/Halo comparison is weird, since we haven't seen Halo in-game.
And while Driveclub and Forza are both racing games, they're quite different ones and hard to compare, in gameplay and in artistic direction. Plus, Evolution seems to be cooking up more and more crow every time they show new footage.
 