
Microsoft: "We purposefully did not target the highest end graphics"

jaypah

Member
#TeamCG wins.

That said, isn't it the same thing most games do today anyway for non-interactive cutscenes?

Sure, but as a consumer expected to drop hundreds of bucks on this stuff, you expect to see realtime graphics. It's not like I'm paying with "in engine" money; that's my realtime money that I worked semi-hard for.

I know, E3 will have realtime stuff but still.
 

gaming_noob

Member
Comparing Forza 5 to GT6 is not even apples to apples, as you have a car game from a next-gen system versus a current-gen system. A better comparison would be Drive Club vs. Forza 5, and in that comparison Drive Club looks better.

Way too early to say which of Forza 5 or Drive Club looks better.
 

Brashnir

Member
But offloading things like parts of environments or peripheral players, NPCs and whatnot that are not instantly game-result critical can leverage that power, can it not, in your estimation?

If that means even a 10-15% increase over what is possible with the hardware alone, would that not make the system more powerful in general when utilizing that compute power?

I'm not sure you'd get a whole lot from that stuff, but yeah, it might help some.

It will also cause your game to run like complete shit if your internet connection isn't there or you hit a crappy backbone somewhere on the internet.
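For what it's worth, that kind of offload only works for work that can miss its deadline gracefully. A minimal sketch of the idea, assuming a hypothetical cloud call and local fallback (the function names and the 250ms budget are made up purely for illustration):

```python
import concurrent.futures

def cloud_npc_plan(world_state):
    """Stand-in for a network call to a cloud compute service (hypothetical)."""
    raise TimeoutError("no connection")  # simulate the connection not being there

def cheap_local_plan(world_state):
    """Cheap local fallback so the frame never stalls on the network."""
    return {"npc_behaviour": "simple patrol"}

def plan_npcs(world_state, budget_seconds=0.25):
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(cloud_npc_plan, world_state)
        try:
            # Only wait as long as the game can tolerate for non-critical work.
            return future.result(timeout=budget_seconds)
        except (TimeoutError, concurrent.futures.TimeoutError, OSError):
            # Connection missing or too slow: degrade gracefully instead of stalling.
            return cheap_local_plan(world_state)

print(plan_npcs({"players": 4}))
```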
 

8GB GDDR5

Neo Member
This post is just... lol.

Man, you know so much; looks like Sony should have hired you instead of a myriad of engineers across multiple disciplines to develop for the PS4.

Clutch at dem straws.

It's a fact. RAM isn't going to help you push more polygons; that's still dependent on the GPU.
 

USC-fan

Banned
But offloading things like parts of environments or peripheral players, NPCs and whatnot that are not instantly game-result critical can leverage that power, can it not, in your estimation?

If that means even a 10-15% increase over what is possible with the hardware alone, would that not make the system more powerful in general when utilizing that compute power?

The debate is just silly. Using Gaikai you can offload the whole game to the cloud.
 

8GB GDDR5

Neo Member
The specs are all already confirmed.

18CU vs 12CU at the same clock is a 50% advantage.

That gap is a bit larger than 7770 to 7850.

I believe the 7770 actually runs at 1000MHz, so I wouldn't be surprised if it's just a 7770 underclocked. If it is, the PS4 will be much more powerful. A 7770 struggles to max games at 1080p, while a 7850 will max almost every game at 1080p fairly easily. This is on PC of course; consoles usually have less overhead.
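As a rough sanity check on that 50% figure, here's the usual GCN back-of-the-envelope math; the 800MHz clock is an assumption taken from the leaks, not a confirmed spec:

```python
# Theoretical single-precision throughput for a GCN-style GPU:
# CUs * 64 shaders per CU * 2 ops per clock (FMA) * clock rate.
def gcn_tflops(compute_units, clock_mhz):
    return compute_units * 64 * 2 * clock_mhz * 1e6 / 1e12

ps4 = gcn_tflops(18, 800)  # ~1.84 TFLOPS
xb1 = gcn_tflops(12, 800)  # ~1.23 TFLOPS
print(f"PS4 ~{ps4:.2f} TF, XB1 ~{xb1:.2f} TF, ratio {ps4 / xb1:.2f}x")
```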
 

i-Lo

Member
It's a fact. Ram isn't going to help you push more polygons, that's still dependent on the GPU.

You should either:

1. Be more informed and think beyond just polygons (GPUs in both PS4 and XB1 are rumoured to have the same poly pushing limit)

or

2. Refrain from talking if you can't do number 1

or

3. Keep typing and we all will be entertained
 
OnLive also suffers from significant lag, and it doesn't even have to sync with a local machine doing most of the heavy lifting. Find me a single OnLive server that takes a control input, renders the next frame based on that, and returns the frame within 16.7ms. Hell, find one that does it in 33ms. I'll be waiting.

what are you talking about?

given that most HDTVs people play consoles on are loaded with picture-processing lag, OnLive on a low-latency TN panel PC monitor (e.g. a common one) could easily have less real lag than most people's console setups.

you're also saying Gaikai is useless, btw.

given the right setup, lag in a cloud gaming solution is manageable and playable. you have to keep in mind that when you press a button in Halo or Killzone, it's 100-133ms before the response, not 16.7ms (see Digital Foundry's testing for details), so hiding more latency in that framework is a different ballgame. and i think Nvidia has done a bunch of PR slides on how they can get it really low.
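To put some numbers on that budget, here's a rough tally of where the milliseconds could go in each setup; every individual figure below is an illustrative assumption, with only the ~100-133ms console total loosely anchored to the Digital Foundry numbers cited above:

```python
# Illustrative button-to-photon latency budgets in milliseconds.
# The individual entries are assumptions for the sketch, not measurements.
local_console_hdtv = {
    "controller/input polling": 8,
    "game logic + render (2-3 frames at 30fps)": 83,
    "HDTV picture processing": 33,
}
cloud_on_tn_monitor = {
    "controller/input polling": 8,
    "network round trip": 40,
    "server-side logic + render": 50,
    "video encode + decode": 20,
    "low-lag TN monitor": 5,
}
for name, budget in (("local console + HDTV", local_console_hdtv),
                     ("cloud + TN monitor", cloud_on_tn_monitor)):
    print(f"{name}: ~{sum(budget.values())} ms total")
```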
 

8GB GDDR5

Neo Member
You should either:

1. Be more informed and think beyond just polygons (GPUs in both PS4 and XB1 are rumoured to have the same poly pushing limit)

or

2. Refrain from talking if you can't do number 1

or

3. Keep typing and we all will be entertained

Please inform me then. How will 8GB GDDR5 help?
 

Klocker

Member
I'm not sure you'd get a whole lot from that stuff, but yeah, it might help some.

It will also cause your game to run like complete shit if your internet connection isn't there or you hit a crappy backbone somewhere on the internet.

I understand... bkilian, who was at MS during the planning (and quit), hinted at but never confirmed this cloud computing strategy. He's talking a bit today and firmly appears to believe that this is a factor for games, and that it's the main reason they were OK with the GPU they chose.

Also... it appears the ESRAM is possibly 6T as well. They mentioned 5 billion transistors and 200GB/s+ bandwidth, and doing the math, that was the only way they could get it to add up with the transistor count, which suggests the ESRAM is possibly faster than originally leaked.
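As a rough illustration of that math, assuming the leaked 32MB ESRAM figure (the size is not an official number):

```python
# Back-of-the-envelope transistor count for 32 MB of on-die memory.
esram_bits = 32 * 1024 * 1024 * 8

for transistors_per_bit, label in ((6, "6T SRAM"), (1, "1T eDRAM")):
    total = esram_bits * transistors_per_bit
    print(f"{label}: ~{total / 1e9:.2f} billion transistors")
# 6T SRAM comes out to ~1.61 billion transistors, a sizeable chunk of the
# ~5 billion total quoted for the whole chip.
```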


EDIT: Oh, and the cloud computing doesn't have to make a difference this year or even next, but three to four years from now it quite probably will increase the "power" of the box.

one of bkilian's comments
 

spannicus

Member
Forza really didn't look anywhere near mind-blowing at all. Yeah, no gameplay was shown, but it looked very current-gen to me. The best thing I saw at the event was the fish and coral scene from CoD Ghosts, and it looks like the Wii U can pull that off.
 

Brashnir

Member
what are you talking about?

given that most HDTVs people play consoles on are loaded with picture-processing lag, OnLive on a low-latency TN panel PC monitor (e.g. a common one) could easily have less real lag than most people's console setups.

you're also saying Gaikai is useless, btw.

given the right setup, lag in a cloud gaming solution is manageable and playable. you have to keep in mind that when you press a button in Halo or Killzone, it's 100-133ms before the response, not 16.7ms (see Digital Foundry's testing for details), so hiding more latency in that framework is a different ballgame. and i think Nvidia has done a bunch of PR slides on how they can get it really low.

I haven't played Gaikai, but OnLive is pretty useless, yeah. Looks like crap and it lags.
 
The specs are all already confirmed.

18CU vs 12CU at the same clock is a 50% advantage.

That gap is a bit larger than 7770 to 7850.



What the hell? That's insane. Here's hoping multiplatform games will be easily scalable and they won't enforce visual parity.

But maybe MS was already expecting their games to look different, hence the quote?
 

Brashnir

Member
I understand... bkilian, who was at MS during the planning (and quit), hinted at but never confirmed this cloud computing strategy. He's talking a bit today and firmly appears to believe that this is a factor for games, and that it's the main reason they were OK with the GPU they chose.

Also... it appears the ESRAM is possibly 6T as well. They mentioned 5 billion transistors and 200GB/s+ bandwidth, and doing the math, that was the only way they could get it to add up with the transistor count, which suggests the ESRAM is possibly faster than originally leaked.

That's possible. I'm pretty wary of anything that introduces internet lag into multiplayer games, though.

For what it's worth, I still don't see the power disparity between X1 and PS4 as a big deal. It's less than we've dealt with in other console generations (and I'm not talking about the Wii). I have much bigger problems with Microsoft's business strategy with this machine than I do with the hardware. The hardware is perfectly adequate to compete with the PS4.
 

nib95

Banned
Please inform me then. How will 8GB GDDR5 help?

It will help with more extreme graphics features and processes, even with more advanced forms of AA implementation (which require more bandwidth and more RAM), and it will actually allow you to access more of the total RAM per frame.
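For a rough sense of why AA eats memory and bandwidth, here is a back-of-the-envelope 1080p render-target calculation; the formats and sample counts below are illustrative assumptions, not any particular engine's setup:

```python
# Approximate 1080p render-target size with multisampling.
width, height = 1920, 1080
bytes_per_pixel = 4 + 4  # 32-bit colour + 32-bit depth/stencil

for samples in (1, 2, 4, 8):
    mb = width * height * bytes_per_pixel * samples / (1024 ** 2)
    print(f"{samples}x MSAA: ~{mb:.0f} MB for a single render target")
# At 8x that is ~127 MB before any extra G-buffers, and most of those bytes
# get read and written every frame, which is where the bandwidth goes.
```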
 

Godslay

Banned
It will help with more extreme graphics features and processes, even with more advanced forms of AA implementation (which require more bandwidth and more RAM), and it will actually allow you to access more of the total RAM per frame.

But will 3rd parties even care?
 

8GB GDDR5

Neo Member
It will help with more extreme graphics features and processes, even with more advanced forms of AA implementation (which require more bandwidth and more RAM), and it will actually allow you to access more of the total RAM per frame.

So just better AA? Hardly a killer feature. Anything over 4GB is overkill for gaming on PC, and I really doubt the PS4 will fundamentally change how devs make games. In fact it seems like PCs and consoles are getting closer and closer.
 

Seanspeed

Banned
Comparing Forza 5 to GT6 is not even apples to apples, as you have a car game from a next-gen system versus a current-gen system. A better comparison would be Drive Club vs. Forza 5, and in that comparison Drive Club looks better.
Yup, Drive Club is the one you want to compare it to. I disagree that Drive Club looks better at all, but what's worth taking away is that they are both distinctively next-gen. People here claiming that it's just Xbox 360 1.5 are kidding themselves. This is still a powerful machine.
 

Scoops

Banned
All of the big three are trying to get as many people as possible to buy their console. That's natural.

They all have different strategies though:

Sony: Have the best graphics and the most core-gamer-centric experience. Downside is price, and there are only so many "core gamers" out there.

Microsoft: Try to make the console more than just a gaming machine and attract people not necessarily interested in the gaming aspect but in the media aspect of the console. Downside is that few people want to pay the price of the console for the media side and, well, you alienate gamers, i.e. what GAF has been today.

Nintendo: Try to make more gamers, i.e. Wii Sports/Play/Fit bringing new people into the videogame marketplace such as seniors, soccer moms and little kids. Downside is that these new "gamers" tend not to buy too many games and aren't loyal customers.
 

nib95

Banned
So just better AA? Hardly a killer feature. Anything over 4GB is overkill for gaming on PC, and I really doubt the PS4 will fundamentally change how devs make games. In fact it seems like PCs and consoles are getting closer and closer.

Of course not just better AA. Every new next-gen graphical feature, either previously used or yet to be used, is going to be more bandwidth- and memory-intensive. And anything over 4GB is only overkill because PCs were largely just getting previous-gen ports, which were hardly pushing the boat out. You wait till we get into the swing of this new generation and then start telling me about how much RAM PC GPUs are hogging.
 

Lord Error

Insane For Sony
It will most likely be small things like AA, and maybe the PS4 version running at full 1080p while the X1 version is some messed up resolution like 1280x1080. Sort of like we saw with PS2 and Xbox, only the difference is smaller.
Stuff like that, yes, but even more likely a better framerate (more stable, fewer frame drops, not necessarily one running at 60FPS and the other at 30). I think a difference like that is practically inevitable unless games run at a locked framerate to begin with, which is rarely the case.
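For scale, a quick pixel-count comparison of the two example resolutions mentioned above:

```python
# Pixel counts for the example resolutions.
full_1080p = 1920 * 1080   # 2,073,600 pixels
squeezed = 1280 * 1080     # 1,382,400 pixels
print(f"1280x1080 is {squeezed / full_1080p:.0%} of full 1080p")
# Roughly a third less per-pixel work per frame.
```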
 

nib95

Banned
But will 3rd parties even care?

Define care? I think they'll use the added bandwidth and computational advantage of the PS4 to at least bump a few things up. AA, AF, SSAO, shadow quality, post processing, depth of field etc. Or perhaps even 720p vs 1080p, slightly better frame rates etc. There's no reason why they can't.
 