
Microsoft respond to the EDGE article

Status
Not open for further replies.

ryano9929

Banned
I've owned both consoles since launch and have always found Xbox Live to be the better service of the two. I like the free games Sony gives out to Plus subscribers, but Xbox Live is still the better overall.


Technology is advancing much quicker these days. I can't see the next generation lasting as long as this one, as sales have dropped significantly.
 

Vizzeh

Banned
No games so far suggest that the PS4 is more powerful, especially not 50% more powerful.
Really? Hasn't everyone gone over this already, from brute-force coding, to the launch-window timescale, to games being built for PC and ported to both platforms while the final X1/PS4 hardware was unknown? Even the mods here have tried to highlight that. We will see the difference, like in every other console launch, show up after year one and grow from there.
 

TechnicPuppet

Nothing! I said nothing!
Folks get way too hung up on percentage differences, like whether it's 20, 30, 40, or 50%, and end up having this notion that at some arbitrary percentage point, the power difference somehow becomes "significant." Really, folks are blowing the numbers out of proportion and getting too hung up on specific numbers.

To illustrate this point, let's compare some PC GPUs that actually have a significant gap in performance, where there would be a huge leap in resolution, IQ, and fidelity between them. For this example, take the fairly pitiful HD7670, with a meager 768 GFLOPS, and compare it to the comparatively monstrous HD7990, still today a beast of a card with a whopping 3891x2 GFLOPS under its belt.

This represents a truly massive power differential. This is where you'd expect to drop resolution, IQ, fidelity, everything just to get a decent framerate on the weaker hardware. And it makes sense: in this case, the HD7990 has roughly ten times (over 900% more) the GFLOP capability of the HD7670. That's a level of difference where you can expect to see something that arguably looks like a generational leap, something that looks like Xbox vs. PS2, maybe more.

Now look back at the Xbone vs. PS4. 50%, what does that amount to again? What does 20, 30, 40% actually result in? Probably not as much as folks have in their minds. Yeah, it's a difference; yeah, first party titles will show the difference when they fully exploit each system's power. But multiplats? If there's a difference, don't get yourself worked up into believing it's going to be this huge gulf. Some parts could get trimmed down going from PS4 to Xbone, sure, but getting all worked up over how much will actually come of a ~50% or less difference starts to get silly.

Just a general observation I'm throwing out there, take it for what y'all will.

I think the real debate is notable difference to the masses vs unnoticeable difference to the masses.

I think it will be the latter.
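For anyone who wants to sanity-check the ratios quoted above, here is a minimal sketch of the arithmetic. The PC card figures come from the quoted post; the console figures are the ballpark TFLOPS numbers that were circulating at the time and are an assumption for illustration only, not something stated in this thread:

```python
# Back-of-the-envelope check of the GFLOP ratios quoted above.
# Console figures are assumed ballpark numbers, not official specs.

def gap(weaker_gflops, stronger_gflops):
    """Return the ratio and the percentage by which the stronger part exceeds the weaker."""
    ratio = stronger_gflops / weaker_gflops
    return ratio, (ratio - 1.0) * 100.0

hd7670 = 768           # GFLOPS, as quoted in the post
hd7990 = 2 * 3891      # dual-GPU board, per the post
xb1    = 1310          # ~1.31 TFLOPS, assumed ballpark
ps4    = 1840          # ~1.84 TFLOPS, assumed ballpark

for label, weak, strong in [("HD7670 vs HD7990", hd7670, hd7990),
                            ("XB1 vs PS4", xb1, ps4)]:
    ratio, pct_more = gap(weak, strong)
    print(f"{label}: {ratio:.2f}x ({pct_more:.0f}% more)")

# Prints roughly:
#   HD7670 vs HD7990: 10.13x (913% more)
#   XB1 vs PS4: 1.40x (40% more)
```

The point of the comparison is simply that a roughly 10x gap and a roughly 1.4-1.5x gap are very different orders of difference.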
 
[Image: Fable Legends screenshot]


I am deeply impressed with Fable Legends' graphics, so I tend to believe they do have some aces up their sleeve, no matter the raw hardware spec numbers.

TBH I find Legends' graphics far more detailed and spectacular than the new Deep Down pics (that is, if the game actually ends up looking that way and the shots weren't promotional fakes like the initial Deep Down video).

Fable Legends trailer was not real time, it was CGI. Did you actually think it was gonna look like that?

Deep Down looks very impressive during gameplay:
[GIF: Deep Down gameplay]
 

vcc

Member
Most of you people are shortsighted and blinded. No console in history has had the kind of advantage that we are talking about here for Sony.

That's not exactly true. The Saturn and PS1 went head to head around the same time (Japan, 1994). The PS1 had a significant edge in 3D performance, while the Saturn was optimized more for 2D with 3D as a late addition. The PS1 also came out cheaper, with more power, in the US.

In the DC vs PS2 matchup, the PS2 had an edge, but the Sony signature 'stupidly hard to utilize' started here, and PS2 games only showed better image quality and performance AFTER the DC was more or less dead. The PS2 didn't hold up against the newer GameCube or Xbox, but it had a real edge on the DC.

The PS3 vs 360 era was the peak of the Sony 'stupidly hard to utilize'. The PS3 does have enough in it to compare favorably to the 360 when developers optimize for its odd setup. The GPUs on those machines were only about 5% apart, in the 360's favor, and the silly inflated floating-point performance of Cell was rarely a significant factor because of how much work you had to put in to get even a tiny fraction of it out.

things are never this simple, and to assume that they are is the height of stupidity.

You can only work with what you know. Perhaps it's as straightforward as MS being focused on a different fight, while the foe they thought they had vanquished still had some fight left.
 

Finalizer

Member
I think the real debate is notable difference to the masses vs unnoticeable difference to the masses.

I think it will be the latter.

Hell, the post right before mine quotes someone saying "especially not 50% more powerful." That's precisely the kind of arguing over essentially nothing I was talking about in my post.

Whether potential differences will be noticeable is an entirely different conversation, and one I'm not criticizing anyone for having.
 
I've owned both consoles since launch and have always found Xbox Live to be the better service of the two. I like the free games Sony gives out to Plus subscribers, but Xbox Live is still the better overall.


Technology is advancing much quicker these days. I can't see the next generation lasting as long as this one, as sales have dropped significantly.

To be honest, the jumps in CPU and GPU tech have really slowed and become much more incremental in recent years. If anything, I think it could be another long gen.
 

Vizzeh

Banned
Folks get way too hung up on percentage differences, like whether it's 20, 30, 40, or 50%, and end up having this notion that at some arbitrary percentage point, the power difference somehow becomes "significant." Really, folks are blowing the numbers out of proportion and getting too hung up on specific numbers.

To illustrate this point, let's compare some PC GPUs that actually have a significant gap in performance, where there would be a huge leap in resolution, IQ, and fidelity between them. For this example, take the fairly pitiful HD7670, with a meager 768 GFLOPS, and compare it to the comparatively monstrous HD7990, still today a beast of a card with a whopping 3891x2 GFLOPS under its belt.

This represents a truly massive power differential. This is where you'd expect to drop resolution, IQ, fidelity, everything just to get a decent framerate on the weaker hardware. And it makes sense: in this case, the HD7990 has roughly ten times (over 900% more) the GFLOP capability of the HD7670. That's a level of difference where you can expect to see something that arguably looks like a generational leap, something that looks like Xbox vs. PS2, maybe more.

Now look back at the Xbone vs. PS4. 50%, what does that amount to again? What does 20, 30, 40% actually result in? Probably not as much as folks have in their minds. Yeah, it's a difference; yeah, first party titles will show the difference when they fully exploit each system's power. But multiplats? If there's a difference, don't get yourself worked up into believing it's going to be this huge gulf. Some parts could get trimmed down going from PS4 to Xbone, sure, but getting all worked up over how much will actually come of a ~50% or less difference starts to get silly.

Just a general observation I'm throwing out there, take it for what y'all will.

I disagree. I can't wrap my head around why 50% would be unnoticeable. The hardware design will also play a part as a negative, with the X1 doing a PS3 and complicating its hardware with eSRAM etc.; that alone, I believe, is a big factor. There seem to be more articles and comments filtering out by the day to prove that. As for multiplats, we saw 'some' difference this gen based on the PS3's poor design, and I expect something similar this gen, vice versa, in some cases on a larger scale given the increased difference in hardware.
 

Nicktendo86

Member
Honestly, this is getting ridiculous. I simply cannot believe a major corporation that did so much right with the 360 could manage to fuck up so wildly with just about everything related to the XBone. It's absolutely mind-bogglingly dumb. These shills just make it look somehow even worse.
Oh, it started with the 360 though, when they fucked up the dashboard, released Kinect, and turned Rare into a Kinect-only shit show.
 

Finalizer

Member
I disagree. I can't wrap my head around why 50% would be unnoticeable. The hardware design will also play a part as a negative, with the X1 doing a PS3 and complicating its hardware with eSRAM etc.; that alone, I believe, is a big factor. There seem to be more articles and comments filtering out by the day to prove that. As for multiplats, we saw 'some' difference this gen based on the PS3's poor design, and I expect something similar this gen, vice versa, in some cases on a larger scale given the increased difference in hardware.

My question to you, then, is what will actually manifest from this 50% performance difference?

If we say 900p vs 1080p across the board as an example, then we're now discussing something tangible. I would certainly notice the difference, and I'd assume most folks on this site would as well. But is your average cowadoody player going to see it? Is he going to give two shits? That discussion is a lot more nuanced, and more interesting, than bickering over some ambiguous percentage point that really doesn't mean anything.

EDIT: "Inherently subjective" was really not the right word for something like resolution, lol. Should think before I post...
 
Microsoft pushed their online-required vision so hard partly so they could hide the shortcomings of their hardware by touting the power of the cloud.

They figured that, coming off a very successful last gen with the 360, they might be able to convince everyone, and if they had gotten away with it, a normal consumer might not have noticed much difference and would have assumed the cloud was giving the console more power.

But MS's online plans backfired completely, they started losing fans AND pre-orders (if I'm not mistaken), and they had to fall back on what was working and wasn't broken. After that we never heard about the cloud again, not a single mention of it anywhere.

Which just leads me to believe MS was ready, willing, and able to feed consumers BS.

Their PR is making things worse, there are rumours of turmoil inside MS, and every time MS is in the spotlight for a comparison, an article pops up from journalists defending them. They themselves said that they didn't purposefully target high-end graphics.

And on the other end, Sony's work is all being done for them without lifting a finger. They're back to their roots. I loved Ken Kutaragi and his vision for the Cell processor; it was really sad to see devs have such a hard time with it, because in its day it was the single most powerful chip. Cerny brought Sony back to its feet, and he should be proud.

Just my summary that I wanted to get out.
 

artist

Banned
My question to you, then, is what will actually manifest from this 50% performance difference?

If we say 900p vs 1080p across the board as an example, then we're now discussing something inherently subjective. I would certainly notice the difference, and I'd assume most folks on this site would as well. But is your average cowadoody player going to see it? Is he going to give two shits? That discussion is a lot more nuanced, and more interesting, than bickering over some ambiguous percentage point that really doesn't mean anything.
This is a tangential topic from the one at hand. I bet most of those cowadoody players wouldn't be able to differentiate between current-gen and next-gen (not cross-gen) titles either.

The point is: is there a difference, and by how much?
 

Finalizer

Member
The point is: is there a difference, and by how much?

That's what I mean, that kind of question is more suited for the discussion between the consoles, and my only point is to say that applying these, in my eyes, arbitrary and nearly meaningless percentage numbers - and on top of that, creating some further arbitrary threshold of significance at some percentage point - to me comes across as silly. Hence my example of some actual graphical difference.

I dunno, I'm probably just splitting hairs here.
 

Vizzeh

Banned
That's what I mean, that kind of question is more suited for the discussion between the consoles, and my only point is to say that applying these, in my eyes, arbitrary and nearly meaningless percentage numbers - and on top of that, creating some further arbitrary threshold of significance at some percentage point - to me comes across as silly. Hence my example of some actual graphical difference.

I dunno, I'm probably just splitting hairs here.

Tbh I agree that 50% 'sounds' a lot worse than what we will likely see on screen; it may even be harsh on the X1, even though the hardware specs indicate it. It's a sort of roundabout topic I'm not sure will be answered properly for a couple of years. The average casual Joe, as you said, probably won't notice the graphical difference or smoothness on multiplats unless they're running side by side... and let's be fair, most of their games are probably seen in TV ads, so it will likely be the wow factor of any first-party title that stands out.
 

maverick40

Junior Member
Look, all the talk about the PS4 being more powerful is 100 percent true at this stage, and it is obvious from this PR response. The problem is that devs like SSM and Naughty Dog are going to smoke anything Microsoft can throw at them.
 
My question to you, then, is what will actually manifest from this 50% performance difference?

If we say 900p vs 1080p across the board as an example, then we're now discussing something inherently subjective.

The objective differences, assuming we see them, will show up in things like poly count, texture detail, and visual effects such as real-time reflections.
 
I'm not saying this is what was intended by the remark, but ... for me, reading a figure of "50% more powerful", I would assume it means 50% of a generational leap, where 100% would be another generation entirely.
 

Socky

Member
I'm not saying this is what was intended by the remark, but ... for me, reading a figure of "50% more powerful", I would assume it means 50% of a generational leap, where 100% would be another generation entirely.

If the PS4 is 10x the PS3, then a generational leap would be roughly 900% more powerful (or thereabouts), not 100%. That should put 50% more into perspective, although it's still a noticeable amount to my mind. The comparison quoted directly from a developer in the article suggests a plausible level to consider, based on direct experience with the respective hardware.
 

Mandoric

Banned
I'm not saying this is what was intended by the remark, but ... for me, reading a figure of "50% more powerful", I would assume it means 50% of a generational leap, where 100% would be another generation entirely.

It's not. Console generational gaps are typically in the 800% range.
A 50% gap is (extremely roughly - don't draw the conclusion that we're talking DC-PS2, or Genesis-SNES, or 360-PS3 here!) around a year's worth of progress at the same price point, cocktail-napkin math.
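As a rough illustration of that cocktail-napkin math, here is a sketch that converts a raw performance ratio into "years of progress", assuming GPU throughput roughly doubles every two years; the doubling period is my assumption, not something stated in the thread:

```python
import math

def years_of_progress(ratio, doubling_period_years=2.0):
    """Years of hardware progress implied by a performance ratio,
    assuming throughput doubles every `doubling_period_years`."""
    return math.log2(ratio) * doubling_period_years

print(years_of_progress(1.5))   # ~1.2 years for a 50% gap
print(years_of_progress(9.0))   # ~6.3 years for an ~800-900% generational gap
```

Under that assumption, a 50% gap is indeed on the order of a year's worth of progress, while a generational gap is several times that.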
 

twobear

sputum-flecked apoplexy
Fable Legends trailer was not real time, it was CGI. Did you actually think it was gonna look like that?

Deep Down looks very impressive during gameplay:
[GIF: Deep Down gameplay]
I think the Fable Legends screenshots were in-game; you can clearly see polygons and lower-res texture work. The trailer was CGI though. As I said at the time, the screenshots look roughly like a PS3 game (such as TLOU) ported up to 1080p to me, not really all that impressive.

This is the other point that hasn't been made, which is that MS developers are either less talented or don't care as much about pushing the hardware, with the major exception of Corrinne Yu's team at 343i. Halo 4 is in a league of its own amongst 360 exclusives. This will also skew the visuals, so that on top of having a more powerful console, SWWS teams are also spending more time and effort on visuals.
 

Thrakier

Member
Ten years ago, you could argue that a console’s power was summed up in terms of a few of its specs, but Xbox One is designed as a powerful machine to deliver the best blockbuster games today and for the next decade.

What are they even trying to say with that sentence? That tech isn't important anymore? Why are they even talking about tech then?

Xbox One architecture is much more complex than what any single figure can convey. It was designed with balanced performance in mind, and we think the games we continue to show running on near-final hardware demonstrate that performance. In the end, we’ll let the consoles and their games speak for themselves.”

With "balanced performance" - what is that supposed to mean? What are they balancing? Their investment cost and forcing kinect on players in regards to overall peformance?

The consoles and games speak for themselves, yes. XB1 will be far behind in 1 or 2 years, because it's not even on par when it releases. To think that they really want to establish a 1,4tfop machine for a ten year decade makes me shudder. It's insane. Every smartphone will be 10times faster and with an attached holodeck than that big black box by then.
 

Caronte

Member
My question to you, then, is what will actually manifest from this 50% performance difference?

If we say 900p vs 1080p across the board as an example, then we're now discussing something tangible. I would certainly notice the difference, and I'd assume most folks on this site would as well. But is your average cowadoody player going to see it? Is he going to give two shits? That discussion is a lot more nuanced, and more interesting, than bickering over some ambiguous percentage point that really doesn't mean anything.

EDIT: "Inherently subjective" was really not the right word for something like resolution, lol. Should think before I post...

Why do you assume the only difference will be in resolution and not framerate? It happened this generation, it could happen again.
 

DeviantBoi

Member
But, Microsoft, if games are going to look the same on both consoles, why would I ever buy the more expensive one?

For Kinect? No, thanks.
 
It doesn't matter what guff MS come out with regarding the performance difference, because the message that the PS4 is more powerful is out there in the public consciousness; the toothpaste has been squeezed out of the tube, and MS can't put it back.
 
This is the other point that hasn't been made, which is that MS developers are either less talented or don't care as much about pushing the hardware, with the major exception of Corrinne Yu's team at 343i. Halo 4 is in a league of its own amongst 360 exclusives. This will also skew the visuals, so that on top of having a more powerful console, SWWS teams are also spending more time and effort on visuals.

You also have to remember that MS exclusives like Halo or Gears feature local split screen and online campaign co-op, while Sony exclusives like God of War, Uncharted, TLoU, Infamous, etc are single player only. I wouldn't say MS didn't care about visuals, they just focused on other things.
 

Finalizer

Member
Why do you assume the only difference will be in resolution and not framerate? It happened this generation, it could happen again.

That was just an example. My point was, getting down to what the hardware differences could represent is a better discussion than 30% vs 40% vs 50% because those percentages don't really mean anything.
 

dsigns1

Member
That's off topic, but the world doesn't work like that. I personally feel the PS4 is cheap rather than the XB1 is expensive.

Why does an iPad mini cost more than the much better specced Nexus 7?

You just made my point for me. Apple sells their products at a premium. People have looked at their quality, ecosystem, and apps and have been satisfied, so they're willing to pay more. Microsoft hasn't shown anything with their system or with Kinect that tells me I should pay the extra $100 for their product.
 
Dat balance. Coincidence that MS is using the same terminology as some of the more vocal fanboys / shills here on GAF? Even Leadbetter was talking about balance in one of his recent ramblings.

Sony has the easier to develop platform, better dev tools and more raw power. Balanced enough I would think.
 
Dat balance. Coincidence that MS is using the same terminology as some of the more vocal fanboys / shills here on GAF? Even Leadbetter was talking about balance in one of his recent ramblings.

Sony has the easier to develop platform, better dev tools and more raw power. Balanced enough I would think.
Exactly. My washing machine is balanced. Doesn't mean I want to play games on it.
 
I wanna know what isn't balanced about the PS4. Does Microsoft just mean their GPU/memory bandwidth is in line with the low-clocked CPU?
 

Ranger X

Member
Well, this is pretty interesting, because if Microsoft feel the need to "damage control" an Edge article, it's because it touched a part of the truth. I mean, if the Xbone weren't disadvantaged, the article would not exist and Microsoft wouldn't feel the need to damage control.

So yeah, thanks Microsoft for confirming your weaker machine. We'll indeed let the games speak for themselves.
 

Caronte

Member
That was just an example. My point was, getting down to what the hardware differences could represent is a better discussion than 30% vs 40% vs 50% because those percentages don't really mean anything.

You said that the average player would not notice a lower resolution, so the difference in specs wouldn't matter. My point was that the difference sometimes might affect gameplay (framerate) and of course they would notice and care about that.
 

Jedi2016

Member
Well, this is pretty interesting, because if Microsoft feel the need to "damage control" an Edge article, it's because it touched a part of the truth. I mean, if the Xbone weren't disadvantaged, the article would not exist and Microsoft wouldn't feel the need to damage control.

So yeah, thanks Microsoft for confirming your weaker machine. We'll indeed let the games speak for themselves.
I was quick to notice that they're not disputing the numbers, they're just trying to downplay the difference, saying "it won't matter". Which means that the numbers are probably accurate. What remains to be seen is whether or not the visual differences will end up being that noticeable.
 

Snubbers

Member
Folks get way too hung up on percentage differences, like whether it's 20, 30, 40, or 50%, and end up having this notion that at some arbitrary percentage point, the power difference somehow becomes "significant." Really, folks are blowing the numbers out of proportion and getting too hung up on specific numbers.

To illustrate this point, let's compare some PC GPUs that actually have a significant gap in performance, where there would be a huge leap in resolution, IQ, and fidelity between them. For this example, take the fairly pitiful HD7670, with a meager 768 GFLOPS, and compare it to the comparatively monstrous HD7990, still today a beast of a card with a whopping 3891x2 GFLOPS under its belt.

This represents a truly massive power differential. This is where you'd expect to drop resolution, IQ, fidelity, everything just to get a decent framerate on the weaker hardware. And it makes sense: in this case, the HD7990 has roughly ten times (over 900% more) the GFLOP capability of the HD7670. That's a level of difference where you can expect to see something that arguably looks like a generational leap, something that looks like Xbox vs. PS2, maybe more.

Now look back at the Xbone vs. PS4. 50%, what does that amount to again? What does 20, 30, 40% actually result in? Probably not as much as folks have in their minds. Yeah, it's a difference; yeah, first party titles will show the difference when they fully exploit each system's power. But multiplats? If there's a difference, don't get yourself worked up into believing it's going to be this huge gulf. Some parts could get trimmed down going from PS4 to Xbone, sure, but getting all worked up over how much will actually come of a ~50% or less difference starts to get silly.

Just a general observation I'm throwing out there, take it for what y'all will.

Edge went all out to say that the PS4 is 100% more powerful than the XB1, judging by their sources' headline-grabbing stats:
PS4 = 1920 * 1080 @ 30 = 62,208,000 pixels/sec
XB1 = 1600 * 900 @ 20 = 28,800,000 pixels/sec

That's definitely in the 'moderately noticeable' range.

IMO, Edge's article is just clickbait, and people are lapping it up even though numbers at that level, stripped of context, are about as useful as a chocolate fireguard. But looking at the threads about it, they clearly know there's a rather substantial bandwagon that'll respond to such silly misrepresentation.

I could be totally wrong. I don't really care too much which is more powerful; I expended zero effort lambasting Sony for their underperforming multiplats, and I'll do the same for MS.
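The pixel-throughput figures in that post can be reproduced in a couple of lines; this just restates the poster's arithmetic, where "pixels per second" means resolution times framerate and nothing more:

```python
# Resolution x framerate, as used in the post above.
ps4 = 1920 * 1080 * 30   # 62,208,000 pixels/sec
xb1 = 1600 * 900 * 20    # 28,800,000 pixels/sec

print(f"PS4: {ps4:,} px/s, XB1: {xb1:,} px/s, ratio: {ps4 / xb1:.2f}x")
# -> PS4: 62,208,000 px/s, XB1: 28,800,000 px/s, ratio: 2.16x
```

That 2.16x ratio is where the "roughly 100% more powerful" framing comes from, though it bundles resolution and framerate into a single number.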
 

Finalizer

Member
You said that the average player would not notice a lower resolution, so the difference in specs wouldn't matter. My point was that the difference sometimes might affect gameplay (framerate) and of course they would notice and care about that.

No, no, you're misunderstanding me. That whole thing was merely an example of discussion; the whole point was to say "there's more to discuss when you start talking about feasible graphical differences you can realistically expect from the hardware gap than when you quibble over percentage numbers" (e.g. "Well, there might be a 50% difference in hardware but it'll only amount to 30% in games." What the hell does that even mean?).
 
I've said it before, but I'll say it again. Some people are going to be extremely disappointed when the Xbox ends up not being as weak as they want it to be.

Based on the games shown so far, the Xbox One has very little to worry about. Forza 5 looks incredible, Ryse looks incredible, Killer Instinct looks incredible. Dead Rising 3 is pretty much a carry over from a last gen game, and even that looks pretty impressive and is doing a lot of nice things technically. Quantum Break clearly looks like it's going to be a stunner also. Well, you get the idea. :)
 

The Flash

Banned
I've said it before, but I'll say it again. Some people are going to be extremely disappointed when the Xbox ends up not being as weak as they want it to be.

Based on the games shown so far, the Xbox One has very little to worry about. Forza 5 looks incredible, Ryse looks incredible, Killer Instinct looks incredible. Dead Rising 3 is pretty much a carry over from a last gen game, and even that looks pretty impressive and is doing a lot of nice things technically. Quantum Break clearly looks like it's going to be a stunner also. Well, you get the idea. :)

I agree. The games will look fine. I think at this point people are just arguing over HOW good the games will look.
 