Now that the bath is cold, what could Mark Cerny have done better designing the PS4?

If demonstrating a problem requires you to interact with a product in a way that normal use would never uncover, that isn't a design flaw. If you have to do something that nobody would actually do during normal use, then yes, you're a goofball and it's not shit design.

Business-grade laptops can lock up if you jam your hand into the docking station port while they're powered on, but who actually does that?

I can see how doing something stupid like that is analogous to expecting all four corners of a PS4 to touch the surface you put it on.
 
The Intel CPU was a standard mobile version of the Pentium III. It was completely off the shelf.



I suspect that whatever their plans were for the background ARM processor have not worked out as expected. Whether that is just a software challenge far larger than anticipated or there is actually a hardware flaw is impossible to tell.

Gonna quote myself:

I think Sony as a whole got caught out by when a reliable version of Suspend/Resume would be implemented in FreeBSD, and therefore by how easy it would be to port into Orbis.

https://wiki.freebsd.org/SuspendResume
 
Yes, though NV2A as a whole wasn't an off-the-shelf GPU. It was sort of sitting between a GeForce 3 and a GeForce 4. And even if we look at presence of individual modules, that vertex shader configuration (among other things) wasn't "off the shelf" at the time of launch.

That's true.
 
Hmm, I could list a few:

# Better design of the air vents
# Stop using that glossy plastic shit for the case that attracts dust
# A silent fan instead of the 747 engine they installed
# More than 2 USB ports
# An actual button instead of a touch button for power and eject
# A redesign of the DS4 parts so the thumbsticks don't fall apart and the triggers don't stick after a short amount of time
# The touchpad hasn't really added anything... get rid of it
# Either give us the option of using batteries or give us a larger-capacity battery
# No massive LED light on the controller... really annoying in the dark

On the OS side I only have one wish: tell the fucking coders to change the party notification pop-ups to fire when a friend comes online. I don't want to know every time one of my friends joins a party. It's a pretty basic feature, and it boggles me that it's been nearly a year and we don't have it.
 
Fitting that much tech into a $400 box that mostly works most of the time is a monumental achievement. Sony as a company still needs to do better though. They have taken the lead in this generation and are already behaving like they behaved when the PS2 was such a behemoth. It's going to bite them in the ass.
 
Did Cerny decide on the material for the analog sticks? Because whoever did that made a huge mistake on the most important part of the console, the controller.
 
Hmm, I could list a few:

# Better design of the air vents
# Stop using that glossy plastic shit for the case that attracts dust
# A silent fan instead of the 747 engine they installed
# More than 2 USB ports
# An actual button instead of a touch button for power and eject
# A redesign of the DS4 parts so the thumbsticks don't fall apart and the triggers don't stick after a short amount of time
# The touchpad hasn't really added anything... get rid of it
# Either give us the option of using batteries or give us a larger-capacity battery
# No massive LED light on the controller... really annoying in the dark

On the OS side I only have one wish: tell the fucking coders to change the party notification pop-ups to fire when a friend comes online. I don't want to know every time one of my friends joins a party. It's a pretty basic feature, and it boggles me that it's been nearly a year and we don't have it.

It's almost like an OP doesn't exist...

OT - For the budget, if possible, a better CPU or higher-clocked cores, since Jaguar is said to go upwards of 2.0 GHz. Also, I guess, a better GPU.
 
Not sure if he is responsible for the controllers, but someone needs to be fired for the awful battery life. Coming off the 12+ hours of the DS3 to THIS.

Also the lightbar.

Only complaint of the hardware
 
What would people do with extra USB ports? I'm perfectly happy with just one. I just use it for charging the controller anyway :P
 
Fitting that much tech into a $400 box that mostly works most of the time is a monumental achievement. Sony as a company still needs to do better though. They have taken the lead in this generation and are already behaving like they behaved when the PS2 was such a behemoth. It's going to bite them in the ass.

[image: VLHns2S.jpg]


?

Is this a new iteration of "arrogant Sony" or am I missing something?
 
I've seen this mentioned only 3 times in this thread so far, and while many posts address things that are truly bothersome - like loud fans, glossy plastic, the touchpad, etc. - this is something Cerny might actually have had some influence over.

I'd love to have seen a fast SATA III (6 Gb/s) interface instead of SATA II (3 Gb/s). We were already seeing bottleneck tendencies on launch day when using SSDs, and this does not bode well for the future.

There are some conflicting reports that there might actually be SATA III in the machine (we don't know for certain as of now), but we aren't seeing that reflected in testing. We should see much lower load times, but we don't. The little difference there is can easily be explained by SSDs' naturally faster seek times.

Imagine the standard PS4 with a built-in SSD and three times faster load times. Think of the difference it could make for games that need to load quickly or continuously swap in new content. Instead we're seeing last-gen load screens and more or less what we're used to, except with higher clarity. I'd love to see more than that. They chose disk space over speed.

Let's hope for a faster interface in the PlayStation 4 Slim, although games will never be designed with this in mind thanks to this original decision.
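To put rough numbers on the gap described above, here's a back-of-the-envelope sketch (Python just for illustration). The 80% factor is the 8b/10b encoding overhead both SATA generations carry; the 4 GB payload is a made-up example, not a measured game load:

```python
# Back-of-the-envelope: how long a 4 GB level load would take at the
# *theoretical* ceiling of each SATA generation. Real drives, protocol
# overhead, and game I/O patterns fall well short of these numbers.

SATA_EFFECTIVE_GBPS = {
    "SATA II": 3.0 * 0.8,   # 3 Gb/s line rate, ~80% usable after 8b/10b encoding
    "SATA III": 6.0 * 0.8,  # 6 Gb/s line rate, same encoding overhead
}

payload_gbits = 4 * 8  # hypothetical 4 GB payload, expressed in gigabits

for gen, gbps in SATA_EFFECTIVE_GBPS.items():
    seconds = payload_gbits / gbps
    print(f"{gen}: {gbps:.1f} Gb/s usable -> {seconds:.1f} s for 4 GB")
```

Even at theoretical ceilings, SATA II roughly doubles the transfer time, which matches the intuition that a fast SSD would end up interface-bound.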




Quoting this guy over on GameFAQs:

"who had the bright idea to go with sata 2?

that died out 8 years ago? it has crap bandwidth. why would they make such a powerful system with lightning fast video card memory as its main memory amongst other things and use freaking sata 2? everything uses sata 3 these days. heck, i bet your cable box does as well. i would have loved to be in that meeting.

this will limit performance with ssds and just limit streaming performance, period. does anyone know why they would do this? really the only thing i can think of that maybe makes a little sense is sata 3 with its better transfer speeds causes more heat?

i have no idea why this is not talked about more. everyone assumed (as they should) ps4 would be sata 3. i only found out today doing a google search on detailed ps4 specs."

Source
 
From what I have seen, I thought the PS4 was designed so the GPU and CPU don't bottleneck each other too much. Am I missing something? Even using PC hardware as an example, can you really make sure that the CPU and GPU never bottleneck each other?

I'm guessing certain games can be GPU- or CPU-intensive, which means you would need to upgrade one of your components. This can go on forever, really. I think it's down to game devs to make sure their games don't do this in the end. You have fixed hardware and you know the specs, so why would you design a game that would bottleneck the CPU or GPU?

I understand the first wave of games devs made may not have had locked-down spec info, so they could not design for it, but surely now there is no excuse. You can say the GPU or CPU will affect creative design, but hasn't every gen had this problem when the dev is too ambitious? So what kind of hardware would make sure ambitious devs never have a problem with any component? Then think of the price of said hardware.

Sorry to rant a bit; I feel it was OT. I would have happily paid more for better hardware, but there does need to be a price sweet spot for everyone, and from the sales we've seen so far, it seems it's at that.
For me, I have yet to see a problem from a hardware point of view with what Cerny did, going by the first-party games I have played.

All I think is that the controller is poorly made, but he didn't have anything to do with that. The WiFi for me has been OK; the speeds are up and down a bit, but I have a fast connection and most of the time the download times are decent.
Edit: forgot to say fuck glossy plastic too. Whoever thought it was a good idea to have it on a console should be made to have their entire house's floors made of it for the rest of their lives.
 
Heheh, Sony wasn't all that careful about establishing terms to keep their partners from selling Cell-related designs to their competitors.

Ah, I just saw your response.

I'm honestly not sure how or why Microsoft suddenly jumped on the same Jaguar architecture for the X1. MS started their R&D years after Sony did: effectively at the start of 2011, versus Sony starting R&D on the PS4 effectively in 2008.

PS4 R&D:
http://www.theguardian.com/technology/gamesblog/2013/jul/15/ps4-develop-2013-playstation-sony

http://www.gamasutra.com/view/news/189368/Inside_the_PlayStation_4_with_Mark_Cerny.php

X1 R&D:
http://www.gamespot.com/articles/xbox-one-development-began-in-late-2010/1100-6409070/

If you are doing an iterative development, then late 2010 would be understandable.

If you are doing a new development from scratch, then you need to get the project and budget approved and then get your team together after the budget allocation and that alone eats up at least 3 months at any large corporation.

So if you take MS's late start (regardless of whether they started in late 2010 or effectively 2011), why did they suddenly decide to jump on the same Jaguar SoC? The coincidence just feels rather unnatural, and really unnerving when you consider the corporate espionage shenanigans that MS pulled on the PS3.
 
So if you take MS's late start (regardless of whether they started in late 2010 or effectively 2011), why did they suddenly decide to jump on the same Jaguar SoC? The coincidence just feels rather unnatural, and really unnerving when you consider the corporate espionage shenanigans that MS pulled on the PS3.
Maybe. It was a situation where APUs were attractive for some fairly general reasons, and AMD was the only design house that could provide an APU suitable for a reasonably capable home console in 2014. So right from the start you have reason for a lot of the same people to work on both APUs. Then, Sony and Microsoft would have both been power-conscious and (following the lessons of the previous gen) interested in fairly "general-purpose" CPUs, so the most suitable AMD CPU building block would have been 4-core Jaguar clusters. As for the GPU, GCN is everything, so go figure.

I suppose it's sort of weird when you observe that the APU dies are almost the exact same size, but it all sort of makes sense. There's not necessarily any truly weird business happening, especially as neither Sony nor Microsoft have a stake in the base designs for the AMD modules.
 
Maybe. It was a situation where APUs were attractive for some fairly general reasons, and AMD was the only design house that could provide an APU suitable for a reasonably capable home console in 2014. So right from the start you have reason for a lot of the same people to work on both APUs. Then, Sony and Microsoft would have both been power-conscious and (following the lessons of the previous gen) interested in fairly "general-purpose" CPUs, so the most suitable AMD CPU building block would have been 4-core Jaguar clusters. As for the GPU, GCN is everything, so go figure.

I suppose it's sort of weird when you observe that the APU dies are almost the exact same size, but it all sort of makes sense. There's not necessarily any truly weird business happening, especially as neither Sony nor Microsoft have a stake in the base designs for the AMD modules.

The thing is that AMD isn't the only game in town with an APU design. Intel and Nvidia have one as well, known as Intel HD and Project Denver respectively. The Nvidia model not being anywhere near mature is understandable, but that still leaves Intel HD as a viable alternative.
https://en.wikipedia.org/wiki/Intel_HD_and_Iris_Graphics
 
For the price it launched at? Nothing, really, other than perhaps a couple of USB ports on the back. These could be used for external storage (like the Xbox One will allow in the future) or any potential peripherals, without cables dangling from the front.

Otherwise, perhaps if Cell was cheap enough, use that instead of the secondary processor and RAM they use for recording gameplay and background tasks.
 
An 8th-generation console with two poorly accessible USB ports on the front and none on the back. And there's clearly room for some USB ports on the back.

But here are my two personal pet peeves with the PS4:
  1. No proper TV style remote for media control. I mean, the PS2 had a great remote. The PS3 had two excellent remotes. The PS4? Nothing.
  2. Vertical stand is ugly as sin. It is clearly an afterthought and a year on the market has yet to produce an aesthetically pleasing vertical stand.
 
The thing is that AMD isn't the only game in town with an APU design. Intel and Nvidia have one as well, known as Intel HD and Project Denver respectively. The Nvidia model not being anywhere near mature is understandable, but that still leaves Intel HD as a viable alternative.
Sort of. In 2013 their iGPU showing was looking very strong for what it was, but it still hadn't been scaled to the level of the GPUs that Microsoft and Sony desired (Iris Pro 5200 is quite a ways behind a PS4), and Intel probably isn't as aggressively price-competitive in the console space as AMD.
 
Nothing in what you quoted remotely supports your conclusion. It was the opposite: it was all about GDDR5 being easier to develop for, and nothing more.

My apologies. I read your EDRAM at 1 GB/s statement and my brain violently rejected everything else in your post.

Even then, your attribution isn't entirely correct (you were 3 orders of magnitude off).

Aside from that, I believe that we are actually on the same page.

Sort of. In 2013 their iGPU showing was looking very strong for what it was, but it still hadn't been scaled to the level of the GPUs that Microsoft and Sony desired (Iris Pro 5200 is quite a ways behind a PS4), and Intel probably isn't as aggressively price-competitive in the console space as AMD.

True. Thing is, even without an APU solution, MS could have continued with a relatively standard configuration of separate CPU/GPUs. I don't understand what made them decide to invest in an APU solution instead of something more traditional. Did the whole RROD fiasco scare them witless from ever pursuing a sensibly designed traditional CPU/GPU configuration?
 
The reason they ditched eDRAM was that you would have had split memory, a smaller bus, lower bandwidth, and then it wouldn't be any different from a PC.

Show me a PC with 1 TB/s of bandwidth for the GPU.

The bandwidth would have been a potentially significant performance advantage.

I see the PS4 as a low risk design (complete opposite to PS3) and they may take more risk with PS5.
 
that still leaves Intel HD as being a viable alternative.
https://en.wikipedia.org/wiki/Intel_HD_and_Iris_Graphics

Nope. If you knew anything about the margins Intel commands, you wouldn't consider them. Also, Intel does not do custom designs, while AMD does.

True. Thing is, even without an APU solution, MS could have continued with a relatively standard configuration of separate CPU/GPUs. I don't understand what made them decide to invest in an APU solution instead of something more traditional.

Cheaper integration costs, cheaper component costs, easier process node transitions, easier console design.
 
A bigger case and fans. I thought my PS4 was broken; I got a new one and it's still loud as hell. I've got it in an open, stable space, tried it on its side and flat, all that.
 
Step 1: Don't have a fucking giant light that the player can't even see on the back of your controller.

It would be fine if you could turn the bloody thing off and not just dim it. I don't understand why they don't allow that option.

Also, can you turn the controller off during video playback? I've tried and can't seem to find a way to do it, as the controller is tied to your account, isn't it?

Anyway this is a bit off topic, sorry.
 
My apologies. I read your EDRAM at 1 GB/s statement and my brain violently rejected everything else in your post.

Even then, your attribution isn't entirely correct (you were 3 orders of magnitude off).

Aside from that, I believe that we are actually on the same page.



True. Thing is, even without an APU solution, MS could have continued with a relatively standard configuration of separate CPU/GPUs. I don't understand what made them decide to invest in an APU solution instead of something more traditional. Did the whole RROD fiasco scare them witless from ever pursuing a sensibly designed traditional CPU/GPU configuration?

Haha, my fault. It's supposed to be 1 TB/s, and for some reason I got stuck typing GB. My apologies for the confusion.
 
Show me a PC with 1 TB/s of bandwidth for the GPU.

The bandwidth would have been a potentially significant performance advantage.

Absolutely. A PS4 with eDRAM could easily run 1080p games with tons of dynamic light sources using deferred methods at 60 fps with no slowdown. It's a missed opportunity.
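For scale, a quick sketch comparing the two memory pools being discussed. The GDDR5 figure follows from the PS4's known 256-bit bus at 5.5 Gb/s per pin; the 1 TB/s eDRAM number is the hypothetical from the posts above, not a shipping part:

```python
# PS4 GDDR5 bandwidth from its published interface specs.
bus_width_bits = 256
per_pin_gbps = 5.5
gddr5_gbs = bus_width_bits * per_pin_gbps / 8  # 176 GB/s

# Hypothetical eDRAM pool from the discussion above (not a real part).
edram_gbs = 1000  # 1 TB/s

print(f"GDDR5: {gddr5_gbs:.0f} GB/s")
print(f"Hypothetical eDRAM: {edram_gbs} GB/s ({edram_gbs / gddr5_gbs:.1f}x)")
```

The trade-off the thread is circling: an eDRAM pool could be several times faster, but it would be small and split off from main memory, which is exactly what Sony said they wanted to avoid.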
 
Not having a simple IR port for remotes is inexcusable in the age of Harmony remotes, especially when they are pushing the "media box" aspect of the console.

Having to use the DS4 to watch a Blu-ray makes me feel like a caveman.
 
The GPU is great, the RAM is great, the CPU is awful. It would have made the PS4 much more expensive, but a stronger CPU would have been much appreciated.
 
It's a shame to think that the shitty barren dashboard is taking up 3 GB of RAM.

I don't think it does; IIRC Sony is sitting on 1 GB of RAM for future-proofing. Dunno if they're doing that even with the CPU, and I don't know if they can upclock it.
 
4 USB ports is the only reasonable issue I have with it. It could have had a better CPU, but probably not at that price point. Is my PC's CPU better? Yes. Did it cost 3/4 of what the PS4 costs? Yes. I'm sure Sony walked a reasonable balance on CPU/GPU. And considering the number of devs ignoring the much better GPU, I'm not sure a better CPU would have yielded much difference.

Why are you people even using the touch buttons? Use your controller to turn it off and on.
 
4x quad-core Jaguar clusters (instead of 2) at 2 GHz
24 compute units (instead of 18)
6 GB of GDDR5 free for games, or more, instead of the 5 or whatever they're using now.
 