phosphor112
Banned
(08-02-2013, 09:42 PM)
phosphor112's Avatar
Thuway's and BruceLee's sources also said a 200MHz overclock.

They're getting mixed stuff.

CLEARLY neither of those is right, but both have had a decent track record. I wouldn't try to discredit them if I were you.
cammy2
Banned
(08-02-2013, 09:42 PM)
Cboat confirmed that the downclocking was true?

http://www.neogaf.com/forum/showpost...postcount=1296
Majanew
Banned
(08-02-2013, 09:42 PM)
Majanew's Avatar

Originally Posted by mckmas8808

Well can you link us lol?

http://www.neogaf.com/forum/showthre...ight=wii+u+gpu

The last few pages have them talking about it possibly being 176 GFLOPS
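
For context, the 176 GFLOPS figure comes from the common reading of the Latte die shots: 160 shader ALUs at 550 MHz, with each ALU doing one fused multiply-add (2 FLOPs) per cycle. None of those numbers are confirmed by Nintendo; a back-of-envelope sketch assuming them:

Code:

# Back-of-envelope peak-GFLOPS estimate for the Wii U "Latte" GPU,
# assuming the die-shot reading of 160 shader ALUs at 550 MHz
# (unconfirmed by Nintendo).
ALUS = 160
CLOCK_HZ = 550e6      # 550 MHz core clock
FLOPS_PER_ALU = 2     # one fused multiply-add per cycle

gflops = ALUS * CLOCK_HZ * FLOPS_PER_ALU / 1e9
print(f"{gflops:.0f} GFLOPS")  # -> 176 GFLOPS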
i-Lo
Member
(08-02-2013, 09:44 PM)
i-Lo's Avatar

Originally Posted by USC-fan

Nope. 176 gflops.

Wut?

I was assured by Nintendo fans that it was at least 600GFLOPs a few months back. Wut happond?
Andronicus
Member
(08-02-2013, 09:44 PM)
Andronicus's Avatar
So someone explain: Microsoft announced this because something negative is coming next week???? What are y'all talking about?
Vashetti
Member
(08-02-2013, 09:44 PM)
Vashetti's Avatar

Originally Posted by Majanew

Nope. 352 GFLOPS.

"next gen"
Minions
Member
(08-02-2013, 09:45 PM)
Minions's Avatar

Originally Posted by cammy2

Cboat confirmed that the downclocking was true?

http://www.neogaf.com/forum/showpost...postcount=1296

To my knowledge he only confirmed yield issues; you would have to ask the person who posted that directly if you want an answer.
phosphor112
Banned
(08-02-2013, 09:47 PM)
phosphor112's Avatar

Originally Posted by cammy2

Cboat confirmed that the downclocking was true?

http://www.neogaf.com/forum/showpost...postcount=1296

No, he confirmed eSRAM issues, NOT a downclock, and he NEVER has.

Get with it.
statham
be hot
be naughty
be Xbox
(08-02-2013, 09:47 PM)
statham's Avatar

Originally Posted by cammy2

Cboat confirmed that the downclocking was true?

http://www.neogaf.com/forum/showpost...postcount=1296

Off-site? PMs, I guess.
cammy2
Banned
(08-02-2013, 09:49 PM)

Originally Posted by phosphor112

No, he confirmed eSRAM issues, NOT a downclock, and he NEVER has.

Get with it.

Not according to what I linked. He confirmed it off-site, apparently.
BleachAndPepsi
Banned
(08-02-2013, 09:55 PM)

Originally Posted by Andronicus

So someone explain: Microsoft announced this because something negative is coming next week???? What are y'all talking about?

I haven't read all 29 pages (got through maybe 6 and skipped to this last one). Who said bad news is coming?
EuropeOG
Member
(08-02-2013, 09:55 PM)
EuropeOG's Avatar

Originally Posted by farisr

I don't know what gameplay vid you saw, but that's how they finished the E3 gameplay vid they put out: by doing that attack.

inFAMOUS Second Son Gameplay (timed just before he does the move)

Man, fuck the graphics, it's all about them smooth animations and transitions.

Originally Posted by USC-fan

It's from the Latte thread. It's from our best people.

That is worse than current gen. Seems like bullshit in that case.
Albert Penello
MS Director of Product Planning
"Now More Direct than ever!"
(08-02-2013, 10:04 PM)
Albert Penello's Avatar
Hey guys

Lots of interesting comments on the GPU upgrade.

Let me put this out there. I know there is doubt about the “truth” of what we say, so some of you will believe me, and others won’t. But here goes…

We set aggressive targets for reliability, performance, yields, and noise. Those things always have to be balanced. We want this box to have rock-solid reliability. We want it to be DEAD quiet (and let me tell you, X1 is quieter than the new Xbox 360 we just released). And we wanted killer game performance. But those targets are in conflict with each other.

What we’ve found through the development process is we were able to actually exceed our goals on the thermals and acoustics. This gave us headroom to increase the clock speed without any hit to noise, reliability, or heat, so we took the opportunity to bump the GPU. I get it’s only 6% or so, but that could translate to a few FPS in the real world.

I know there are many conspiracy theories out there about how and why we make decisions. I can tell you – this was something we were hoping to be able to do for a while. So we were prepared for this. Nobody should worry this puts us at any risk or people are scrambling at this decision.
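
For scale: the bump Penello is describing is the 800MHz to 853MHz change, about 6.6%. In the best case, a fully GPU-bound game's frame rate scales directly with the clock, which is where "a few FPS" comes from. A minimal sketch of that upper bound (the clock figures are from this thread; the GPU-bound assumption is just for illustration):

Code:

# Best-case frame-rate gain from the 800 MHz -> 853 MHz upclock,
# assuming a fully GPU-bound game whose frame time scales
# inversely with GPU clock (real games rarely do).
OLD_CLOCK_MHZ = 800.0
NEW_CLOCK_MHZ = 853.0

speedup = NEW_CLOCK_MHZ / OLD_CLOCK_MHZ  # ~1.066, i.e. ~6.6%
for base_fps in (30, 60):
    print(f"{base_fps} fps -> {base_fps * speedup:.1f} fps")
# 30 fps -> 32.0 fps
# 60 fps -> 64.0 fps

Actual gains would land at or below these numbers, since most games are bound by more than GPU clock alone.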
SpinningBirdKick
Member
(08-02-2013, 10:05 PM)
SpinningBirdKick's Avatar
I can see that this bump in specifications is going to be very beneficial for Microsoft's first-party titles.

Polished turds, lipstick on pigs, etc., etc.
Team Vernia
(08-02-2013, 10:05 PM)
Team Vernia's Avatar

Originally Posted by Albert Penello

Hey guys

Lots of interesting comments on the GPU upgrade. *snip*

Whisper quiet. So nice. Can't wait.
Dictator93
Member
(08-02-2013, 10:06 PM)
Dictator93's Avatar

Originally Posted by crazy buttocks on a train

yes ud lov that, wuld..nt uyou kekeke salti handegg cal me out f44 what? hold your L &

stay Mad like mikelsen


durn rite. 6day za go. ~~~200 pipdream fo4r mad men, l


sory no

HE HATH SPOKEN
SenjutsuSage
(08-02-2013, 10:06 PM)
SenjutsuSage's Avatar

Originally Posted by AgentP

Don't be naive; MS didn't disclose a history, just two numbers. As for eSRAM vs. GPU, people just blur the line: the rumor was about eSRAM yields, and the downclock talk was in the context of those yields. There was plenty of talk of >1000 MHz GPU clocks from wannabe insiders at B3D, so I guess this puts the final nail in that secret-sauce talk.

Fact is, we do have a history, whether you choose to acknowledge it or not. It went from 800MHz to 853MHz. Just because it comes from Microsoft doesn't somehow make it no good. They confirmed the initial clock speed was 800MHz, and then they said it was overclocked to 853MHz. A very simple and small history, but a history nonetheless.

The biggest talk in that thread was regarding a possible downclock of the GPU as a direct result of eSRAM yield issues. A GPU downclock was the "main event," as they say. No point trying to rewrite history at this point, because it's way too well documented. None of the 1000MHz GPU rumors got anywhere near the kind of attention that a potential downclock did. I agree with one thing, however: a final nail certainly went into something today, but it was a nail in the downclock rumors people thought were true, the same ones supposedly further confirmed by a Eurogamer article that did more to discredit any suggestion of a downclock than to lend credence to one.
ZiggyRoXx
Banned
(08-02-2013, 10:06 PM)
Sigh... If only MS hadn't wasted so much of the BOM budget on Kinect and instead used it to put a decent GPU in there to begin with.
statham
be hot
be naughty
be Xbox
(08-02-2013, 10:07 PM)
statham's Avatar

Originally Posted by Albert Penello

Hey guys

Lots of interesting comments on the GPU upgrade. *snip*

Thanks for posting.
Taiser
Member
(08-02-2013, 10:08 PM)
Taiser's Avatar
But it's still just a 1.2 TFLOP machine, given that 10% of the GPU is reserved for the OS.
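
The arithmetic behind that: the widely reported (not officially confirmed) configuration is 12 compute units, i.e. 768 ALUs, at the new 853 MHz, with the 10% OS reservation coming from rumor. A minimal sketch under those assumptions:

Code:

# Rough effective-TFLOPS estimate for the Xbox One GPU, assuming the
# widely reported 12 CUs (768 ALUs) at 853 MHz and the rumored 10%
# OS reservation; only the clock speed is officially confirmed.
ALUS = 12 * 64        # 12 compute units x 64 ALUs each
CLOCK_HZ = 853e6
FLOPS_PER_ALU = 2     # fused multiply-add per cycle

peak = ALUS * CLOCK_HZ * FLOPS_PER_ALU / 1e12   # ~1.31 TFLOPS
effective = peak * 0.9                          # ~1.18 TFLOPS
print(f"peak {peak:.2f} TFLOPS, ~{effective:.2f} TFLOPS after OS reserve")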
udiie
Member
(08-02-2013, 10:09 PM)
udiie's Avatar

Originally Posted by Albert Penello

Hey guys

Lots of interesting comments on the GPU upgrade. *snip*

Good to see you stopping by regardless of the criticism.
CambriaRising
Member
(08-02-2013, 10:10 PM)
CambriaRising's Avatar

Originally Posted by Albert Penello

Hey guys

*Very good post*

Thank you for coming here and posting this. Very insightful information.
USC-fan
aka Kbsmoker
(08-02-2013, 10:10 PM)

Originally Posted by Albert Penello

Hey guys

Lots of interesting comments on the GPU upgrade. *snip*

Thanks for the post.

Are the specs locked down now? Can you give us the final specs? Or is this the only spec change that's been announced? Is the RAM still 8GB?

Thanks again.
Last edited by USC-fan; 08-02-2013 at 10:15 PM.
Munish23
Member
(08-02-2013, 10:10 PM)
Munish23's Avatar

Originally Posted by Albert Penello

Hey guys

Lots of interesting comments on the GPU upgrade. *snip*

Awesome, thanks.
cchum
Member
(08-02-2013, 10:11 PM)
cchum's Avatar

Originally Posted by Dictator93

HE HATH SPOKEN

Hahahaha... love it. So MS was planning on 1GHz and didn't hit it? <-- I guess maybe not clock speed, but gigaflops? Six days for what?

Edit: Wait, what is going on with the man? I'm so confused. Is he talking gigaflops, then?
Last edited by cchum; 08-02-2013 at 10:29 PM.
cyberheater
PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 XBOX PS4 PS4
(08-02-2013, 10:15 PM)
cyberheater's Avatar

Originally Posted by cchum

Hahahaha... love it. So MS was planning on 1GHz and didn't hit it? Six days for what?

How could you possibly come to that conclusion?
cchum
Member
(08-02-2013, 10:16 PM)
cchum's Avatar

Originally Posted by cyberheater

How could you possibly come to that conclusion?

Did I misread the buttocks?
phosphor112
Banned
(08-02-2013, 10:16 PM)
phosphor112's Avatar

Originally Posted by cammy2

Not according to what I linked. He confirmed it off-site, apparently.

Missed that part I guess.
cyberheater
PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 XBOX PS4 PS4
(08-02-2013, 10:16 PM)
cyberheater's Avatar

Originally Posted by Taiser

But it's still just a 1.2 TFLOP machine, given that 10% of the GPU is reserved for the OS.

I'm not sure this is even true anymore.
i-Lo
Member
(08-02-2013, 10:16 PM)
i-Lo's Avatar

Originally Posted by Albert Penello

Hey guys

Lots of interesting comments on the GPU upgrade. *snip*

Thanks.

Pertaining to the 6%: something is better than nothing, and in time it raises the baseline regardless.

What bothers me is that decibel figures take precedence over performance. As an owner of a 360, the noise has never bothered me (post game install), so the only reason I can think of for this sort of conflict is that it's a media box. The majority of gamers who own a 360 or PS3 or both use them for more than gaming, but most likely none of them would trade performance for the sake of shaving some decibels off the top.

Anyway, the bed has been made and the bet hedged. It's now all up to the next half decade of software strategy.

EDIT: People use gaming devices for more than gaming. Mistyped earlier.
Last edited by i-Lo; 08-02-2013 at 10:19 PM.
cyberheater
PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 XBOX PS4 PS4
(08-02-2013, 10:17 PM)
cyberheater's Avatar

Originally Posted by cchum

Did I misread the buttocks?

I think some folks on here got PMs that hinted at a 200MHz upgrade. The info was false; it didn't come from MS.
phosphor112
Banned
(08-02-2013, 10:18 PM)
phosphor112's Avatar

Originally Posted by Albert Penello

Hey guys

Lots of interesting comments on the GPU upgrade. *snip*

Interesting.
henhowc
(08-02-2013, 10:19 PM)
henhowc's Avatar
Has MS ever even officially announced specs, outside of the super-general stuff they have listed on their website?
FranXico
Member
(08-02-2013, 10:20 PM)
FranXico's Avatar
DF has reported the upclock without too much hype, surprisingly.

EDIT: Now I see how he links to the old hype for the eSRAM "discovery". Well, at least there isn't any exaggeration this time.
miDnIghtEr20C
Banned
(08-02-2013, 10:21 PM)

Originally Posted by Albert Penello

Hey guys

Lots of interesting comments on the GPU upgrade. *snip*

Thanks for taking the time to post.
CouldBeWorse
Member
(08-02-2013, 10:21 PM)
CouldBeWorse's Avatar

Originally Posted by Albert Penello

Hey guys

Lots of interesting comments on the GPU upgrade. *snip*

It's awesome to get these details directly from the horse's mouth, just like it's great to hear similar details from Cerny. Now if you and your team could just do a little better about getting these clarifying details out ahead of the message, you could avoid somewhat toxic 30-page threads :)

Please communicate upwards what Sony already seems to mostly understand: getting details out in a direct manner helps keep the conversation focused on truths and helps control the message. You need to be here posting clarifications like this, and you need your community team.

I think it's a lesson that's been learned in a VERY hard way, but the progress is encouraging and makes me feel better about picking up an Xbone, which is frankly still the one I'd give up if I had to pick just one system (thankfully, I don't).
cyberheater
PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 XBOX PS4 PS4
(08-02-2013, 10:21 PM)
cyberheater's Avatar

Originally Posted by henhowc

Has MS ever even officially announced specs, outside of the super-general stuff they have listed on their website?

Nope.
henhowc
(08-02-2013, 10:22 PM)
henhowc's Avatar

Originally Posted by FranXico

DF has reported the upclock without too much hype, surprisingly. That is the correct way to do these things, RL.

Well, at least it's better than the other reporting they have been doing lately. :P
~~Hasan~~
Junior Member
(08-02-2013, 10:24 PM)
~~Hasan~~'s Avatar

Originally Posted by Albert Penello

Hey guys

Lots of interesting comments on the GPU upgrade. *snip*

Not to sound rude or anything, but... who are you?

I've just never seen your posts before, and it seems some people here know you.
henhowc
(08-02-2013, 10:24 PM)
henhowc's Avatar

Originally Posted by cyberheater

Nope.

That's what I thought... guess that's part of the problem. People start thinking everything is a reaction, when something like this would take tons of testing to make sure it runs stable. Otherwise you have another RROD on your hands, and then the shit will really hit the fan.

Originally Posted by ~~Hasan~~

Not to sound rude or anything, but... who are you?

I've just never seen your posts before, and it seems some people here know you.

Seriously? lol Just Google Albert Penello...
Xbudz
Member
(08-02-2013, 10:24 PM)
Xbudz's Avatar

Originally Posted by Albert Penello

Hey guys

Lots of interesting comments on the GPU upgrade. *snip*

Risk Breaker
Member
(08-02-2013, 10:26 PM)
Risk Breaker's Avatar

The Xbox One will have a custom AMD GPU that is supposedly comparable to the Radeon HD 7790 (according to Digital Foundry)

Hmmm.
SwiftDeath
Member
(08-02-2013, 10:26 PM)
SwiftDeath's Avatar
Now if we could only get the rest of the specs for both consoles.

I think we're still missing some from Sony's side, no?

Although significantly less, obviously.
ElektroDragon
Banned
(08-02-2013, 10:27 PM)

Originally Posted by jamesgriggs

My first computer ran at 25 MHz, and I could run Space Quest IV from its single-speed CD-ROM drive, but it was pretty choppy at times.

That's pretty funny, seeing as how the same game on my diskette-based Amiga 500 was super smooth at 7 MHz.
nick_622
Banned
(08-02-2013, 10:27 PM)

Originally Posted by Albert Penello

Hey guys

Lots of interesting comments on the GPU upgrade. *snip*

Albert... just wanted to give you a HUGE thumbs up for posting regularly and keeping us up to date. Hope there is more of this from MS as we move forward!!
Albert Penello
MS Director of Product Planning
"Now More Direct than ever!"
(08-02-2013, 10:28 PM)
Albert Penello's Avatar

Originally Posted by i-Lo


What bothers me is that decibel figures take precedence over performance.

That's an interesting point; let me say two things.

First: I wouldn't jump directly to that conclusion. I said these things had to be in balance. There are other ways to get good acoustic performance.

I'm not saying you don't make tradeoffs against those things, but it wouldn't be correct to assume we made noise a priority.

Second (and you corrected yourself): people use the box for a lot of media functions. I think it would take a beating if we were even close to the noise level of the 2005 360.

As I remind people sometimes: we have a console that is roughly 8x-10x the performance of last gen, depending on how you define it. It's in a case that is only ~10% larger than the launch 360's. And yet it's quieter than the 3rd major revision we did 7 years in.
Last edited by Albert Penello; 08-02-2013 at 10:30 PM. Reason: forgot to finish a sentence.
LeeFowler.CU
Banned
(08-02-2013, 10:29 PM)

Originally Posted by Taiser

but it´s still just a 1.2 Tflop machine given that 10% of the GPU are reserved for the OS.

Sony also reserves GPU and CPU for OS functions. Don't delude yourself like you did with the RAM situation.
GrizzledGrump
Banned
(08-02-2013, 10:30 PM)

Originally Posted by Albert Penello

Hey guys

Lots of interesting comments on the GPU upgrade. *snip*

Those of us with an ounce of common sense assumed everything you just said. But it's nice to have confirmation of the confirmation. :)
cyberheater
PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 XBOX PS4 PS4
(08-02-2013, 10:30 PM)
cyberheater's Avatar

Originally Posted by LeeFowler.CU

Sony also reserves GPU and CPU for OS functions. Don't delude yourself like you did with the RAM situation.

Have you got a link which shows that the PS4 also reserves a percentage of GPU power while a game is running?
phosphor112
Banned
(08-02-2013, 10:31 PM)
phosphor112's Avatar

Originally Posted by ~~Hasan~~

not to sound rude or anything. but.. who are you ?

i just never seen your posts before and it seems some people here know you.

Check his tag ;]
