
PS4 Rumors: APU code-named 'Liverpool', Radeon HD 7970 GPU, Steamroller CPU, 16GB Flash

Not to mention XDR2 doesn't actually exist. =P

I personally would love to see XDR2 in the PS4. It would be a logical step, because the PS2/PS3 used Rambus memory (RDRAM, XDR), and it is fast and efficient. Furthermore, it can be stacked, if I am not mistaken. Also, Samsung is bailing out of GDDR5 - rumours had new ATI GPUs using XDR2, and the possibility of a PS4 might provide enough production volume for Samsung or Hynix to license the product instead of producing GDDR5 modules. The only problems I see are that Rambus is not really liked in the industry, and the price - but with Sony going "all in", I think XDR2 would be the right way.
 

RoboPlato

I'd be in the dick
Remind me again why 1080P still won't be a standard and why we won't see more 60FPS games?

I bet 1080p will be more common than 720p was this generation, as will 60 FPS. Neither will be a standard though since devs will often want to push more effects.
 
720p@30fps. That will be the resolution for the majority of next-gen titles. That's a bet.

If the rumors are correct about the PS4 using a 7970-level GPU, then 60 FPS @ 1080p could easily be done for most games. Not many devs push the hardware; only a small few really aim to push the gfx envelope, and the power of today's GPUs is insane.
 
If the rumors are correct about the PS4 using a 7970-level GPU, then 60 FPS @ 1080p could easily be done for most games. Not many devs push the hardware; only a small few really aim to push the gfx envelope, and the power of today's GPUs is insane.

Because not enough people have an extremely powerful GPU. When new consoles arrive, millions will have very powerful hardware, and it suddenly makes sense to develop for them.
 

onQ123

Member
Was, but a back-end dev and certainly never went out of my way to involve myself in anything with Silverlight ;-)



Good points and fair enough, I see there is use for handling 4k size resolutions by the hardware even if the screen is not outputting at that res.

I was just pointing out that a 4K screen itself might not actually benefit many people if you think of the average size living room and how big the screen would have to be before the human eye could start to notice the difference. I guess that also depends on if you accept that graph too.

For gaming itself I would just be happy if the PS4 could output at full 1080p 60fps for all games.

I think that graph is what has caused a lot of confusion.

People actually believe that you can't see any benefit from 4K unless the TV is over 80". This is false.

Just to get a mental idea of what a 4K TV will be like, look at a 24" or 27" 1080p monitor & picture four of them put together to make a 48" or 54" 4K TV.


If you honestly think that you couldn't see any difference between stretching the 1080p image on your 27" screen to make it 54" vs. having 3 more 27" 1080p images added to the original 27" image to fill the screen, there is something wrong with you.
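For anyone who wants to sanity-check that mental picture, here is a minimal back-of-envelope sketch in Python - nothing here comes from the rumor itself, just the standard 1920x1080 and 3840x2160 pixel grids and the fact that a 2x2 tiling doubles both dimensions (and therefore the diagonal):

```python
# Back-of-envelope: four 1080p panels tiled 2x2 vs. one 4K UHD panel.
# Assumes the standard 1920x1080 and 3840x2160 pixel grids on 16:9 panels.

p1080 = 1920 * 1080            # 2,073,600 pixels
p4k   = 3840 * 2160            # 8,294,400 pixels

print(f"1080p pixels: {p1080:,}")
print(f"4K UHD pixels: {p4k:,} ({p4k // p1080}x 1080p)")

# A 2x2 tiling doubles width and height, so the diagonal doubles too:
for diag in (24, 27):
    print(f'Four {diag}" 1080p panels tile into one {2 * diag}" 4K screen')
```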
 

Zabka

Member
Next gen is going to be all over the place with different resolutions and forms of anti-aliasing, just like this gen.

Hopefully 720p is the bottom.
 

mclaren777

Member
I will be really surprised if 1080p isn't the minimum resolution of next-gen games, either because no dev attempts something lower or because of console certification.
 

i-Lo

Member
People really think 1080p won't be the norm next gen?

The reasons 720p (and even then, not every game this gen hit that resolution) and 30fps are the base standards:

  • It is an HD console, and 720p is the stated minimum.
  • 30fps is the minimum needed to ensure adequate response and a smooth gaming experience (although we've all played games that have dipped below that). For reference: movies are shot at 24fps.

Unlike PCs, consoles are closed boxes. To gain maximum fidelity, there has to be a compromise. 720p and 30fps are the lowest of the standards, and can therefore be leveraged for maximum graphical quality, with more bells and whistles than any higher resolution would allow. Also, unlike computer monitors, there are only two HD options for HDTVs right now, 720p and 1080p, and that's quite a significant jump in resolution (rough numbers in the sketch just after this post).

I am certain we'll see more titles at 1080p@30 and @60fps, but they won't account for the majority of titles released over the consoles' lifetime.
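To put rough numbers on "quite a significant jump", here is a quick sketch - my own arithmetic, assuming only the standard 1280x720 and 1920x1080 grids:

```python
# Pixels per frame and per second for the two HDTV resolutions.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080)}

for name, (w, h) in resolutions.items():
    px = w * h
    for fps in (30, 60):
        print(f"{name} @ {fps}fps: {px:,} px/frame, {px * fps / 1e6:.1f} Mpx/s")

print(f"1080p has {(1920 * 1080) / (1280 * 720):.2f}x the pixels of 720p")  # 2.25x
```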
 

SapientWolf

Trucker Sexologist
The reasons 720p (and even then, not every game this gen hit that resolution) and 30fps are the base standards:

  • It is an HD console, and 720p is the stated minimum.
  • 30fps is the minimum needed to ensure adequate response and a smooth gaming experience (although we've all played games that have dipped below that). For reference: movies are shot at 24fps.

Unlike PCs, consoles are closed boxes. To gain maximum fidelity, there has to be a compromise. 720p and 30fps are the lowest of the standards, and can therefore be leveraged for maximum graphical quality, with more bells and whistles than any higher resolution would allow. Also, unlike computer monitors, there are only two HD options for HDTVs right now, 720p and 1080p, and that's quite a significant jump in resolution.

I am certain we'll see more titles at 1080p@30 and @60fps, but they won't account for the majority of titles released over the consoles' lifetime.
30fps was used because it divides evenly into 60fps, but I wonder if we'll start seeing 40fps now that most new HDTVs are 120Hz. It would be nice, because drops below 30 quickly slide into the unplayable range. 40fps allows a bigger buffer.
 

i-Lo

Member
30fps was used because it divides evenly into 60fps, but I wonder if we'll start seeing 40fps now that most new HDTVs are 120Hz. It would be nice, because drops below 30 quickly slide into the unplayable range. 40fps allows a bigger buffer.

Once again, if 40fps were mandated, then perhaps. But I am certain that's not going to happen. 720p will stop being a part of HD games either when the lowest resolution for HDTVs is 1080p (I'd say the generation after next) or if console manufacturers mandate 1080p now (which, in all likelihood, they will not).
 

RaijinFY

Member
The only thing sure in terms of resolution output is that there won't be any more sub-HD games. I suspect the big boys will go 1080p. Can't wait to see what Kaz Yamauchi & his team could do with a modern GPU and much more RAM. Kind of scary to think they managed to pull off this game with so many limitations (1080p@60fps, 16 very detailed cars... with 256MB of VRAM???)
 

i-Lo

Member
The only thing sure in terms of resolution output is that there won't be any more sub-HD games. I suspect the big boys will go 1080p. Can't wait to see what Kaz Yamauchi & his team could do with a modern GPU and much more RAM. Kind of scary to think they managed to pull off this game with so many limitations (1080p@60fps, 16 very detailed cars... with 256MB of VRAM???)

I can already tell you how GT6 Prologue will look. The gameplay graphics should be very close to what the GT5 intro looks like.
 

Raistlin

Post Count: 9999
Very likely. The GPU only needs to be 4 times as fast as the PS3's. That should be achievable even if the launch price is in the Wii U range.
I'm not sure it's quite as simple as the GPU being 4x as fast. There's memory bandwidth, amongst other issues to consider. I don't think there's an automatic linear relationship between resolution and FLOPs? Though I hope I'm wrong on that.
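One crude way to see why the relationship isn't automatically linear: shader work scales roughly with pixel count, but framebuffer traffic also grows with resolution and can hit the bandwidth/ROP limit first. A rough sketch under deliberately simplified assumptions (RGBA8 colour plus 32-bit depth, an assumed overdraw factor, no texture reads or compression):

```python
# Very rough framebuffer-traffic estimate: bytes written per second for
# colour (4 B/px) + depth (4 B/px), times an assumed overdraw factor.
# Ignores texture reads, MSAA and compression - illustration only.

BYTES_PER_PIXEL = 4 + 4   # RGBA8 colour + 32-bit depth
OVERDRAW = 3              # assumed average overdraw, purely illustrative

def fb_traffic_gb_s(width, height, fps):
    return width * height * BYTES_PER_PIXEL * OVERDRAW * fps / 1e9

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    print(f"{name} @ 60fps: ~{fb_traffic_gb_s(w, h, 60):.1f} GB/s framebuffer traffic")

# The traffic grows 2.25x from 720p to 1080p; whether the GPU keeps up
# depends on memory bandwidth and ROPs, not on shader FLOPs alone.
```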




30fps was used because it divides evenly into 60fps, but I wonder if we'll start seeing 40fps now that most new HDTVs are 120Hz. It would be nice, because drops below 30 quickly slide into the unplayable range. 40fps allows a bigger buffer.
Even with 120Hz TVs, the conventional design is based on 60Hz (and 24Hz) refresh syncs. They don't actually accept input at higher frame rates (and there is no provision in current HDMI Tx/Rx and display processors to support it). Therefore you run up against the same issue we always have: 60/30/15.
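That constraint is easy to see by listing which frame rates pace evenly on a given refresh: a steady rate only looks smooth when every frame is held for a whole number of refreshes. A small illustrative sketch:

```python
# Frame rates that pace evenly on a given refresh rate: each rendered frame
# must be displayed for a whole number of refreshes to avoid judder.

def even_framerates(refresh_hz, minimum=10):
    return [refresh_hz // n for n in range(1, refresh_hz // minimum + 1)
            if refresh_hz % n == 0]

print("60 Hz input :", even_framerates(60))    # [60, 30, 20, 15, 12, 10]
print("120 Hz input:", even_framerates(120))   # adds 40 and 24, among others

# 40fps only divides 120 Hz evenly, which is why it was a non-starter while
# TVs (and the HDMI links in use) only accepted a 60 Hz game signal.
```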
 
The reasons 720p (and even then, not every game this gen hit that resolution) and 30fps are the base standards:

  • It is an HD console, and 720p is the stated minimum.
  • 30fps is the minimum needed to ensure adequate response and a smooth gaming experience (although we've all played games that have dipped below that). For reference: movies are shot at 24fps.

Unlike PCs, consoles are closed boxes. To gain maximum fidelity, there has to be a compromise. 720p and 30fps are the lowest of the standards, and can therefore be leveraged for maximum graphical quality, with more bells and whistles than any higher resolution would allow. Also, unlike computer monitors, there are only two HD options for HDTVs right now, 720p and 1080p, and that's quite a significant jump in resolution.

I am certain we'll see more titles at 1080p@30 and @60fps, but they won't account for the majority of titles released over the consoles' lifetime.

Are you talking about the Wii U or all 3? If Durango and Orbis are 720p/30, I don't want to know the resolution and framerate of next-gen Wii U games...

And a closed system can achieve better things because devs know exactly what they have. Can you run Battlefield 3 on a 2005 PC? I think you're looking at things the wrong way: if the Xbox 360 and PS3 can't run current games without compromises, it's not because they are "closed systems", it's because the current consoles have 2005 hardware.
 

KageMaru

Member
People really think 1080p won't be the norm next gen?

When most people can't tell the difference between 1080p and 720p, why restrict yourself to the former resolution? We'll see more 1080p games next gen, especially in the beginning IMO. However as the generation goes on, and developers try to squeeze more out of the boxes, resolution will be one of the first things sacrificed. I still think the majority of games, and even the best looking games next gen will be 720p.

Was 720p "the norm" this gen? Next gen the same will happen with 1080p.

Yes. While there are quite a few sub-HD games, the vast majority of games still render at 720p.

Edit:

I'm not sure it's quite as simple as the GPU being 4x as fast. There's memory bandwidth, amongst other issues to consider. I don't think there's an automatic linear relationship between resolution and FLOPs? Though I hope I'm wrong on that.

No, you're not wrong. The Wii-U is a perfect example of this. It has a faster GPU than the PS360, but if it's true that it only contains 8 ROPs, I don't expect the average resolution of games to increase at all for that system.

Are you talking about the Wii U or all 3? If Durango and Orbis are 720p/30, I don't want to know the resolution and framerate of next-gen Wii U games...

Aren't there already sub-HD games confirmed for the Wii-U?
 

i-Lo

Member
Are you talking about the Wii U or all 3? If Durango and Orbis are 720p/30, I don't want to know the resolution and framerate of next-gen Wii U games...

And a closed system can achieve better things because devs know exactly what they have. Can you run Battlefield 3 on a 2005 PC? I think you're looking at things the wrong way: if the Xbox 360 and PS3 can't run current games without compromises, it's not because they are "closed systems", it's because the current consoles have 2005 hardware.

I am talking about PS4/XB3. Unless MS and Sony mandate 1080p or 60fps or a combination of both, the majority of games will still be 720p.

Just ask yourself this simple question: if you lower the resolution, can you add more graphical fidelity (you know, AA, AF, particle effects, and so on)? If the answer is yes (which is the only logical answer), then I have proven my point.

Like I said, I even made a bet.
 
If Durango and Orbis are 720p/30, I don't want to know the resolution and framerate of next-gen Wii U games...

Good point. MS and Sony demanding 1080P as the benchmark would be exactly what Nintendo wants as it's more likely that the WiiU could get a lot of 720P ports without sacrificing much else in the game.

If D and O games are starting at 720P, then I guess the games themselves would have to be gimped in some way in order to make the transition.
 

-COOLIO-

The Everyman
Good point. MS and Sony demanding 1080P as the benchmark would be exactly what Nintendo wants as it's more likely that the WiiU could get a lot of 720P ports without sacrificing much else in the game.

If D and O games are starting at 720P, then I guess the games themselves would have to be gimped in some way in order to make the transition.

Hadn't thought of that, but it's true.
 

Raistlin

Post Count: 9999
I am talking about PS4/XB3. Unless MS and Sony mandate 1080p or 60fps or a combination of both, the majority of games will still be 720p.

Just ask yourself this simple question: if you lower the resolution, can you add more graphical fidelity (you know, AA, AF, particle effects, and so on)? If the answer is yes (which is the only logical answer), then I have proven my point.

Like I said, I even made a bet.
My hope is that more devs will bite on what some of the top games have offered this generation - selectable resolution, i.e. 720p with full effects, 1080p with lower levels of AA/AF.


That's a nice compromise, and allows the user to eke out the best image quality for their particular setup. Your TV's resolution, size, and your viewing distance dictate which will provide the best results. And with some forethought, it's obviously not difficult to implement within an engine.
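As a purely hypothetical illustration (made-up names, not any real engine's API) of how little is involved once an engine already has scalable settings:

```python
# Hypothetical sketch of user-selectable output modes trading resolution
# against AA/AF. All names are made up; real engines expose this differently.

PRESETS = {
    "favor_effects":    {"resolution": (1280, 720),  "msaa": 4, "aniso": 16},
    "favor_resolution": {"resolution": (1920, 1080), "msaa": 0, "aniso": 4},
}

def apply_preset(name):
    p = PRESETS[name]
    w, h = p["resolution"]
    # A real engine would push these values into its renderer config here.
    print(f"Rendering at {w}x{h}, {p['msaa']}x MSAA, {p['aniso']}x AF")

apply_preset("favor_effects")     # 720p with full effects
apply_preset("favor_resolution")  # 1080p with reduced AA/AF
```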
 

i-Lo

Member
People who are so hung up about MS and Sony mandating 1080p should remember a couple of things:

Epic stated that running Samaritan at 1080p and 30fps requires roughly 2.5 TFLOPS. The current PS4 rumour doesn't suggest a GPU that capable. Add to the woe that Nvidia FLOPS aren't the same as AMD FLOPS (Nvidia cards tend to deliver better performance from lower overall theoretical FLOPS than AMD cards) and you can see we are far from it (there's a rough back-of-envelope sketch after this post).

The latest showing of Star Wars 1313, as presented here by PC Gamer, apparently held just about steady at 30fps with not one, nor two, but THREE GTX 680s. No way a next-gen console will have such raw horsepower.

Add in the fact that people are already bitching about how the PS4 might be DOA if it's not priced the same as the Wii U *once again shakes head*, and it's plain as day why it's unrealistic to expect next-gen effects at 1080p for the majority of titles.

My hope is that more devs will bite on what some of the top games have offered this generation - selectable resolution, i.e. 720p with full effects, 1080p with lower levels of AA/AF.


That's a nice compromise, and allows the user to eke out the best image quality for their particular setup. Your TV's resolution, size, and your viewing distance dictate which will provide the best results.

Not going to happen, due to fidelity-consistency issues. Devs are looking to eke out the max they can, on their own terms, on a closed system. That's why the only options console owners ever get with regard to visual changes are gamma and brightness.
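Taking Epic's ~2.5 TFLOPS figure at face value, a quick back-of-envelope (my own arithmetic, not from the thread) shows what that works out to per pixel and how it scales back down to 720p:

```python
# Back-of-envelope around Epic's quoted ~2.5 TFLOPS for Samaritan at
# 1080p/30fps (taking that figure at face value).

SAMARITAN_TFLOPS = 2.5
px_1080p = 1920 * 1080
px_720p  = 1280 * 720

flops_per_pixel = SAMARITAN_TFLOPS * 1e12 / (px_1080p * 30)
print(f"~{flops_per_pixel / 1e3:.0f} kFLOP of theoretical throughput per pixel per frame")

# Same per-pixel budget, but at 720p/30:
print(f"~{flops_per_pixel * px_720p * 30 / 1e12:.2f} TFLOPS needed for 720p/30")
```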
 
I am talking about PS4/XB3. Unless MS and Sony mandate 1080p or 60fps or a combination of both, the majority of games will still be 720p.

Just ask yourself this simple question: if you lower the resolution, can you add more graphical fidelity (you know, AA, AF, particle effects, and so on)? If the answer is yes (which is the only logical answer), then I have proven my point.

Like I said, I even made a bet.

Have you seen any game@480p on Xbox 360 or PS3?
 

Raistlin

Post Count: 9999
Not going to happen, due to fidelity-consistency issues. Devs are looking to eke out the max they can, on their own terms, on a closed system. That's why the only options console owners ever get with regard to visual changes are gamma and brightness.
Except some of the top games already do offer what I'm talking about.


I'm not stating I expect it will be the norm (though that would be great) ... I'm simply saying I hope more offer it. And given how close they will be to PC titles in the early going, coupled with the then-current install base of 1080p displays versus earlier this gen, I wouldn't be surprised if we do see it more.


What was your bet? I might want in :D
 

i-Lo

Member
Except some of the top games already do offer what I'm talking about.


I'm not stating I expect it will be the norm (though that would be great) ... I'm simply saying I hope more offer it. And given how close they will be to PC titles in the early going, I wouldn't be surprised if we do see it more.


What was your bet? I might want in :D

Yes, actually they would give it to you if the only differences between the two resolution builds were based solely on AA and/or AF. That said, it would mean that the game would have to be designed with 1080p in mind from the get-go, with features added later to justify 720p as an option.
 

Raistlin

Post Count: 9999
Yes, actually they would give it to you if the only differences between the two resolution builds were based solely on AA and/or AF. That said, it would mean that the game would have to be designed with 1080p in mind from the get-go, with features added later to justify 720p.
Obviously it will depend on what sorts of bottlenecks may exist in the architectures ... but I do expect we'll see it at least as much as this gen. Most likely more.
 

SapientWolf

Trucker Sexologist
Have you seen any game@480p on Xbox 360 or PS3?
Is the benefit of going from 480p to 720p equal to the benefit of going from 720p to 1080p? It has a heavy performance cost, so the payoff has to be worth it.

IMHO, it's not worth it in the living room. 1080p HDTVs can scale 720p content just fine. Might as well use the extra grunt to tighten up the graphics on level 3. On the PC there's no way. I would rather upgrade at great cost than upscale on a PC monitor.
 
SMH....

I have already mentioned that the core selling point of both the PS3 and XB3 is that they are "HD" consoles. And the lowest resolution considered "HD" is 720p.

Seriously, this is going nowhere.

We don't know the selling points of the next-gen consoles. What if they promote 1080p as a selling point against the Wii U?
 

thuway

Member
Sony or Microsoft would be foolish to demand anything of developers. "Do as you please" should be the mantra; honestly, the only limitation would be the amount of RAM available to them, since a portion will be reserved for the OS.
 
The latest showing of Star Wars 1313, as presented here by PC Gamer, apparently held just about steady at 30fps with not one, nor two, but THREE GTX 680s. No way a next-gen console will have such raw horsepower.

I don't think that article is correct. Maybe they had a Samaritan "three 580s" mental slip.

Sony or Microsoft would be foolish to demand anything of developers. "Do as you please" should be the mantra; honestly, the only limitation would be the amount of RAM available to them, since a portion will be reserved for the OS.

I agree. Let devs decide what's best for their games.
 

Raistlin

Post Count: 9999
Have you seen any game@480p on Xbox 360 or PS3?
It's a balancing act depending on the architectures in question.


For the PS360, except for some incredibly niche graphics styles/gameplay use cases, the benefits of dropping down to 480p do not outweigh the costs. The savings in processing, bandwidth, and memory simply wouldn't allow for effects, etc. sufficient to provide a better overall visual experience. The results are simply going to be better at the higher resolution.
 
People really underestimate the next consoles. A $150 graphics card from a year ago can run any game at 1080p. And that's on the PC, where developers can't even optimize the way they can on consoles.
 
No, you're not wrong. The Wii-U is a perfect example of this. It has a faster GPU than the PS360, but if it's true that it only contains 8 ROPs, I don't expect the average resolution of games to increase at all for that system.

You can observe on PCs that the performance decrease at higher resolutions is linear (as long as you don't run out of memory) and rarely limited by ROP count.
For example: from the GTX 580 to the GTX 680, the ROP count decreased (48 -> 32) while the clocks did not rise by the same proportion. If you look at benchmarks at high resolutions and/or with SSAA, you will see that the performance advantage of the GTX 680 over the GTX 580 stays the same.

(Btw., RSX has 8 ROPs, clocked at 550 MHz. A GTX 460, certainly not a high-end card by today's standards, has 32 ROPs and is clocked higher.)
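For that aside, theoretical pixel fill rate is just ROP count times core clock; a quick sketch assuming the commonly cited stock clocks (550 MHz for RSX, 675 MHz for the GTX 460):

```python
# Theoretical pixel fill rate = ROPs x core clock (one pixel per ROP per clock).
# Clocks below are the commonly cited stock values - treat them as assumptions.

gpus = {
    "RSX (PS3)": {"rops": 8,  "clock_mhz": 550},
    "GTX 460":   {"rops": 32, "clock_mhz": 675},
}

for name, g in gpus.items():
    gpix = g["rops"] * g["clock_mhz"] / 1e3   # Gpixels/s
    print(f"{name}: {gpix:.1f} Gpixels/s theoretical fill rate")

# ~4.4 vs ~21.6 Gpixels/s: roughly a 5x gap before shaders or memory
# bandwidth even enter the picture.
```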
 
Has there been a teardown on the new PS3 yet?
Not that I have seen, and no power readings either. I was just informed that my PS3 4K will arrive this Friday; I'll have a power reading posted within an hour of its arrival.

From ekim http://www.geekwire.com/2012/microso...ance-hardware/

The company notified employees this week that it will be implementing new physical security measures — limiting employee access at four key Xbox and Interactive Entertainment Business buildings to ensure confidentiality of upcoming products.

Under the new policy, only employees and vendors in Microsoft’s Interactive Entertainment Business or assigned to the buildings will have open access. Other employees and vendors who need to enter the buildings for business reasons will need to go through an online registration process or register at the buildings as visitors, escorted by another employee with access to the buildings. The changes don’t impact the Commons area in the middle of the Xbox campus.
My guess, and the article author's guess, is increased security in advance of Xbox 720 beta machines, which will have the final chipset. Security via the ARM A5 TrustZone processor, the Dashboard, and third-party applications will be worked on when they have the final chipset.

Since we have "Leaks" that the core Silicon is identical with the PS4, Sony should be doing the same. A Few ?hundred? TEST & Beta Developer wafers will have been produced in advance (NOW) of the first mass production run (10,000?) to happen starting Dec 2012 if it's to be sold Christmas season 2013. Even if DDR4 is in sample quantity now that should not impact limited Beta Platform production.
 

Teletraan1

Banned
Remind me again why 1080P still won't be a standard and why we won't see more 60FPS games?

To me these are more important than a bunch of effects. I turn the effects down on my PC to keep 1080p and 60FPS. When that doesn't work anymore I upgrade lol.

These rumored video chipsets have to be able to do 1080p and 60fps... If they can't hit that benchmark, they can shove them up their ass and I will go PC-only. Usually the games that struggle to hit these benchmarks on any half-decent PC are unoptimized. That shouldn't be the case with a console.
 