
PS4 Pro's 4K promise isn't misleading, even though majority of games will be upscaled

onQ123

Member
That becomes a frame with half the information of a full frame, because "its missing lines are duplicated or interpolated to recreate the information".

The render is 1920x540
The output (information passed via cable) is 1920x540
The frame has information of 1920x540

Nothing in 1080i is native 1920x1080.


That's funny because a 1080i frame is made up of how many pixels?

  1. 1920 x 540
  2. 1920 x 540 x 2
  3. 1920 x 1080
  4. 1920 x 1080 x 1/2
 

BennyBlanco

aka IMurRIVAL69
I keep hearing people say this...

But 60fps is a power problem. Designers "choose" 30fps because they're limited in their choices, not because it looks better.

Yes, developers could make lower resolution games with worse textures and target 60fps, but then they've sacrificed a different visual quality.

The only "choice" is: do I give consumers muddy visuals or choppy framerates?

I think Insomniac put it well:

Insomniac Games said:
However, during development, there are hard choices to be made between higher quality graphics and framerate. And we want to make the right choices that reflect our commitment to providing you with the best looking games out there. To that end, our community team did some research into the question of framerate. The results perhaps confirmed what I’ve known for a long time, but found it difficult to accept without evidence. They found that:

A higher framerate does not significantly affect sales of a game.
A higher framerate does not significantly affect the reviews of a game.

And in particular they found that there was a clear correlation between graphics scores in reviews (where they are provided) and the final scores. And they found no such correlation between framerate and the graphics scores nor the final scores. As an interesting side-note, our team also found no direct correlation between gameplay scores and final scores, however it does appear that gameplay scores are also influenced by graphics scores. i.e. Better looking games appear to be more “fun” to reviewers, in general.

http://www.insomniacgames.com/how-much-does-framerate-matter/

If you really care about 60 fps (I do), PC/Nintendo is the way to go.
 

Synth

Member
That's funny because a 1080i frame is made up of how many pixels?

  1. 1920 x 540
  2. 1920 x 540 x 2
  3. 1920 x 1080
  4. 1920 x 1080 x 1/2

Ummm.... this would assume that (at least in the case of graphics rendering) all 1080 vertical lines are rendered in each frame, despite only half of those hitting the display for each field. This isn't necessarily the case, and if a game knows it's going out as 1080i, it could very well only have a skewed 1920x540 in each frame. The same goes for 480i vs 480p.
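The rendering pattern described here, producing only the 540 lines that belong to the current field, can be sketched roughly as follows (a toy illustration, not any console's actual pipeline; the `render_field` helper and the NumPy scene array are invented for the example):

```python
import numpy as np

# Toy illustration: a 1080i source only needs to produce the 540
# lines belonging to the current field, alternating each tick.
WIDTH, HEIGHT = 1920, 1080

def render_field(full_scene, field):
    """Return every other row: field 0 = even lines (0, 2, ...),
    field 1 = odd lines (1, 3, ...). Result is 540 rows of 1920."""
    return full_scene[field::2, :]

# A stand-in "scene" so the slicing is easy to check.
scene = np.arange(WIDTH * HEIGHT, dtype=np.uint32).reshape(HEIGHT, WIDTH)
even = render_field(scene, 0)   # one 1920x540 field
odd = render_field(scene, 1)    # the other 1920x540 field
```

Each field is a genuine 1920x540 buffer; only by weaving two of them together (which is the display's job on a progressive panel) do you get 1080 lines back.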

If you really care about 60 fps (I do), PC/Nintendo is the way to go.

Nintendo get waaaaaaay too much 60fps credit imo. The other consoles have more 60fps games than the Wii U does, even if they also have far more games that aren't.
 

onQ123

Member
Ummm.... this would assume that (at least in the case of graphics rendering) all 1080 vertical lines are rendered in each frame, despite only half of those hitting the display for each field. This isn't necessarily the case, and if a game knows it's going out as 1080i, it could very well only have a skewed 1920x540 in each frame. The same goes for 480i vs 480p.

But that's the thing: 1920x540 does not hit the display alone, because LCDs & other new TV tech can only display progressive scan frames, so the two 1920x540 fields have to be converted into progressive scan frames.
 

Synth

Member
But that's the thing: 1920x540 does not hit the display alone, because LCDs & other new TV tech can only display progressive scan frames, so the two 1920x540 fields have to be converted into progressive scan frames.

Which is a process not at all dissimilar to scaling 900p to 1080p. It's still blatantly not a true 1920x1080 output, even if the resulting display has that many unique pixels. You were just talking about frame buffers, and now you're happy to accept an LCD TV's interlace conversion process as 1920x1080?
 

BennyBlanco

aka IMurRIVAL69
Nintendo get waaaaaaay too much 60fps credit imo. The other consoles have more 60fps games than the Wii U does, even if they also have far more games that aren't.

The most popular Wii U games are 60 fps. Mario Kart, Sm4sh, etc. Nintendo is getting credit because they are developing good games at 60 fps themselves.
 

Synth

Member
The most popular Wii U games are 60 fps. Mario Kart, Sm4sh, etc. Nintendo is getting credit because they are developing good games at 60 fps themselves.

That's all fine and well, but so are MS (Killer Instinct, Halo 5, Forza Motorsport, Minecraft, Ori, etc) and Sony (The Last of Us, hopefully Gran Turismo 5... or hell, anything PSVR, Tearaway Unfolded, Gravity Rush, etc). The others have a lower ratio, sure... but that has a lot to do with them releasing more games that fit the molds where Nintendo will drop 60fps too (Xenoblade, Zelda, etc).

Point is, at the end of the day you still get a higher number of 60fps games on the other consoles, regardless of who's making them, because other studios are releasing 60fps games on them that skip the Wii U entirely.
 
The most popular Wii U games are 60 fps. Mario Kart, Sm4sh, etc. Nintendo is getting credit because they are developing good games at 60 fps themselves.

The two games you have mentioned belong to genres that favour 60fps on both PS4 and Xbox One.

The most popular upcoming Wii U Zelda game runs between 20-30fps.
 

ViciousDS

Banned
I'm just hoping they don't trade the extra resolution at the expense of graphical effects. It might end up being that the 1080p version ends up looking better in some cases.


This is what I'm worried will end up happening. Or every game will just upscale from 1080p with extra graphics in the future.
 
So you didn't see them showing Shadow of Mordor with supersampling a few seconds after that? Then For Honor after that?

As someone that's not going to buy a 4K TV anytime soon, I was hoping on Sony to show a lot more for us 1080p users; a few brief clips is simply not enough. That said, I'm hoping to see some coverage from EGX next week.
 

III-V

Member
The two games you have mentioned belong to genres that favour 60fps on both PS4 and Xbox One.

The most popular upcoming Wii U Zelda game runs between 20-30fps.

This is what I'm worried will end up happening. Or every game will just upscale from 1080p with extra graphics in the future.

I would not worry too much. If the 1080p mode is blowing away the built-in checkerboard upscale solution, then just output the 1080p signal to your 4K HDR TV and let your monitor scale it up.

Games look quite nice 1080p upscaled on my monitor, certainly better than standard 1080p.
 

Nanashrew

Banned
That's all fine and well, but so are MS (Killer Instinct, Halo 5, Forza Motorsport, Minecraft, Ori, etc) and Sony (The Last of Us, hopefully Gran Turismo 5... or hell, anything PSVR, Tearaway Unfolded, Gravity Rush, etc). The others have a lower ratio, sure... but that has a lot to do with them releasing more games that fit the molds where Nintendo will drop 60fps too (Xenoblade, Zelda, etc).

Point is, at the end of the day you still get a higher number of 60fps games on the other consoles, regardless of who's making them, because other studios are releasing 60fps games on them that skip the Wii U entirely.

The reason Nintendo may get too much credit for 60fps is because nearly all of Nintendo's output is 60fps with few exceptions. And that's why some felt it was a big deal when Mario Kart 8 was 59fps because it's "unlike" Nintendo to not achieve solid 60fps.
 

onQ123

Member
Which is a process not at all dissimilar to scaling 900p to 1080p. It's still blatantly not a true 1920x1080 output, even if the resulting display has that many unique pixels. You were just talking about frame buffers, and now you're happy to accept an LCD TV's interlace conversion process as 1920x1080?

It is 1920x1080 no matter if I accept it or not.


1080i 60Hz is basically the same pixels as 1080p 30fps, because instead of drawing 1920x1080 directly to the screen in one sequence it draws 1920x1080 to the screen as an odd & even field of 1920x540. The fields are separated by time, but the 1920x1080 pixels still make it to the screen in the same time to make up a full frame.
 

Metfanant

Member
Due to developer priorities, most games will still run at 30fps.

Fixed that

Devs use hardware in different ways; a game like ACU can't run at both, but a game like Battlefront certainly has the eye candy with the framerate to match. It's not a true 4K console, and I consider that a bit misleading. What is the advertising gonna say, "almost true 4K"?

I still don't think you're understanding... if a dev sets out to extract the absolute limit from the hardware from a purely visual standpoint, you simply can't have both...

You don't generally see this on the PC side of things for many reasons...the biggest being the infinite combinations of hardware...if a dev codes to take the ultimate advantage of a particular card's hardware, you would cause problems for other cards...it's part of the reason consoles always "punch above their weight" so to speak...

In a closed box environment you don't have to worry about breaking compatibility with hardware...

The console absolutely is a 4k console, it's more than capable of outputting a 4k signal for media apps, as well as games...natively, upscaled, or using alternative rendering techniques like checkerboard rendering...

Is it going to run the newest CoD in 4k/60? No, not without graphical sacrifices vs 1080/60 mode, or a checkerboard mode...but frankly, for $400 it's going to get closer to native 4k than anything in its price range...


I would not worry too much. If the 1080p mode is blowing away the built-in checkerboard upscale solution, then just output the 1080p signal to your 4K HDR TV and let your monitor scale it up.

Games look quite nice 1080p upscaled on my monitor, certainly better than standard 1080p.

I'm truly hoping that the majority of devs decide to focus on 1080p, or 1440p with downsampling...I have nothing against the checkerboard approach, but as someone still waiting for the right 4k TV to be manufactured...it benefits me the most haha
 
It is 1920x1080 no matter if I accept it or not.


1080i 60Hz is basically the same pixels as 1080p 30fps, because instead of drawing 1920x1080 directly to the screen in one sequence it draws 1920x1080 to the screen as an odd & even field of 1920x540. The fields are separated by time, but the 1920x1080 pixels still make it to the screen in the same time to make up a full frame.
It's not the same because there's a slight shimmering effect that you don't get with a true progressive signal.
 

onQ123

Member
As someone that's not going to buy a 4K TV anytime soon, I was hoping on Sony to show a lot more for us 1080p users; a few brief clips is simply not enough. That said, I'm hoping to see some coverage from EGX next week.


Well, you have 2 months before Pro is released, so you will have time to see 1080p games on PS4 Pro. Sony did a 30 minute reveal; I'm not sure why it's killing people that they didn't spend all day explaining that a more powerful console is going to play 1080p games better than the less powerful console does. They explained that devs can choose to take advantage of the extra power however they like for HD TVs, as long as the frame rate isn't lower than it is on PS4.

I'm not even sure what y'all are looking for at this point
 

Metfanant

Member
It's not the same because there's a slight shimmering effect that you don't get with a true progressive signal.

Yes, 1080i breaks down a little in motion because it has to be de-interlaced...it's the reason ESPN broadcasts in 720p...

But at the end of the day, the image displayed is truly 1920x1080
 

Lady Gaia

Member
Well, 4K looks better on the box than 2.5K.

Not to mention nobody has a 2.5K TV so that would just create confusion. It's a console intended to have substantial benefits specifically for 4K TVs, whether or not the content is completely native. It's 100% appropriate to talk about 4K output and features that are currently exclusive to 4K sets like HDR and the BT.2020 color space.

It's not Sony's fault that most consumers don't have the patience or interest in understanding the subtleties.
 

Synth

Member
The reason Nintendo may get too much credit for 60fps is because nearly all of Nintendo's output is 60fps with few exceptions. And that's why some felt it was a big deal when Mario Kart 8 was 59fps because it's "unlike" Nintendo to not achieve solid 60fps.

Again though, this is mostly a case of the genres that they commonly release being those that have 60fps as a popular framerate in all cases. You have Smash Bros, but that sits in the same bracket as something like Killer Instinct, Tekken, Street Fighter, etc. You have Mario Kart... and whilst I'll give credit here as non-sim racers often opt for 30fps, the first parties are still putting out Forzas, WipEouts (well, they were) and the like. 2D platformers are 60fps as a rule (Ori, Rayman, etc). Multiplayer focused shooters are 60fps as a rule (CoD, Halo, Battlefield, Overwatch and the mp portions of games like Gears and Uncharted).

When Nintendo creates more typical "experience" based games they do just as other people do. Zelda isn't 60fps for non-remakes, the Metroid Primes weren't, Xenoblade isn't, Bayonetta 2 certainly doesn't qualify for 60fps, etc... they just have fewer of these games than others do... but the platform also has fewer 60fps games, and the ones it has can generally be matched with an equivalent elsewhere.

It is 1920x1080 no matter if I accept it or not.

1080i 60Hz is basically the same pixels as 1080p 30fps, because instead of drawing 1920x1080 directly to the screen in one sequence it draws 1920x1080 to the screen as an odd & even field of 1920x540. The fields are separated by time, but the 1920x1080 pixels still make it to the screen in the same time to make up a full frame.

Listen to yourself here...

Sure, 1080i at 60fps is the same pixel count as 1080p at 30fps... but it's NOT the same pixel count as 1080p at 60fps. So if you have a 60fps game like Dead or Alive 4 that outputs 1080i, it's not equivalent to 1920x1080/60fps... it'd be half that, in other words 1920x540/60fps.

You've gone completely away from your "what the game buffer rendered" reasoning now, and have gone straight into "you realise it'll be 1080p as your output, right?" territory. Guess what, 900p/60fps would be greater in pixel count than 1080p/30fps, but surely you can see how silly an arguing point that would be, right?

If we only care what the final image is made up of, then everything you play on a 1080p screen is 1920x1080, regardless of whether it's checkerboarded, native, or upscaled. You've rendered this talking point meaningless.
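For what it's worth, the raw pixel arithmetic being traded back and forth in this exchange is easy to check (a quick sketch; the helper function is invented for illustration):

```python
# Pixels delivered per second for the modes being compared.
# A 1080i stream at 60Hz carries one 1920x540 field per tick.

def pixels_per_second(width, height, rate, interlaced=False):
    per_tick = width * height // 2 if interlaced else width * height
    return per_tick * rate

p_1080i60 = pixels_per_second(1920, 1080, 60, interlaced=True)
p_1080p30 = pixels_per_second(1920, 1080, 30)
p_1080p60 = pixels_per_second(1920, 1080, 60)
p_900p60 = pixels_per_second(1600, 900, 60)

assert p_1080i60 == p_1080p30       # same total pixels per second
assert p_1080i60 == p_1080p60 // 2  # half of true 1080p60
assert p_900p60 > p_1080p30         # 900p60 carries more raw pixels
```

So 1080i60 and 1080p30 do move the same number of pixels per second, but that's exactly half of what a true 1080p60 signal carries.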

You can't ignore the ESRAM...

You pretty much can, unless the 320gb/s number is calculated with some sort of ESRAM of its own. It's the same bandwidth that a 980ti and a 1080 have, and higher than that of a 1070. There's no reason at all to assume a Scorpio wouldn't be able to run at native 4K any game that the XB1 can run at 1080p, from what we know so far.
 

onQ123

Member
It's not the same because there's a slight shimmering effect that you don't get with a true progressive signal.

Yes, which is why they have filters & so on, but it's still a 1920x1080 frame that's being displayed.


If Sony is doing parallel rendering, they are breaking the rendering process up into smaller processes, but the final frame will still be made up of 3840x2160 pixels.


Just like tile-based rendering; that's why I was trying to explain that it is still 4K, just not immediate mode 4K rendering.


https://en.wikipedia.org/wiki/Alternate_frame_rendering


Alternate frame rendering

From Wikipedia, the free encyclopedia

Alternate Frame Rendering (AFR) is a technique of graphics rendering in personal computers which combines the work output of two or more graphics processing units (GPU) for a single monitor, in order to improve image quality, or to accelerate the rendering performance. The technique is that one graphics processing unit computes all the odd video frames, the other renders the even frames. This technique is useful for generating 3D video sequences in real time, improving or filtering textured polygons and performing other computationally intensive tasks, typically associated with computer gaming, CAD and 3D modeling.[1]

One disadvantage of AFR is a defect known as micro stuttering.
Parallel rendering methods[edit]

AFR belongs to a class of parallel rendering methods, which subdivide a four-dimensional image frame sequence (x,y,z and time) into smaller regions, each of which is then assigned to a different physical processor within a multi-processor array. Note that the regional boundaries may be defined in space or in time. Also, the multiple processors can be implemented within a single video card or separate video graphics cards can be combined, subject to the motherboard and I/O slot limitations. When separate video cards are used, they must be specifically designed to allow a "cross-link" between them.
If a computer has two video cards that combine their outputs into a single video monitor, then one of four methods could be used to create the images.

Alternate Frame Rendering (AFR): One graphics processing unit (GPU) computes all the odd video frames, the other renders the even frames. (i.e. time division)
Split Frame Rendering (SFR): One GPU renders the top half of each video frame, the other does the bottom. (i.e. plane division)

Checker board: As the name implies, the image is split into smaller squares, which are assigned to different cards
Scan-Line Interleave: The origin of the SLI trademark, as employed by the 3dfx Voodoo2, which renders a frame's even scan-lines on the first GPU and its odd scan-lines on the second. The SLI trademark passed to Nvidia upon its acquisition of 3dfx in 2000 and now stands for Scalable Link Interface.
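The four schemes in the excerpt just divide the same work along different axes, which can be sketched as simple assignment rules (a toy sketch, not any real driver's API; all function names are invented, and two GPUs are assumed):

```python
# Two-GPU work division for each parallel rendering scheme listed above.

def afr_gpu(frame_index):
    """Alternate Frame Rendering: time division, whole frames."""
    return frame_index % 2

def sfr_gpu(y, height):
    """Split Frame Rendering: plane division, top vs bottom half."""
    return 0 if y < height // 2 else 1

def sli_gpu(scanline):
    """Scan-Line Interleave: even lines on GPU 0, odd lines on GPU 1."""
    return scanline % 2

def checkerboard_gpu(x, y, tile=8):
    """Checker board: alternate square tiles between the two GPUs."""
    return ((x // tile) + (y // tile)) % 2

assert [afr_gpu(f) for f in range(4)] == [0, 1, 0, 1]
assert sli_gpu(0) == 0 and sli_gpu(1) == 1
```

The output is the same 4K frame sequence either way; only the boundaries (in time, rows, tiles, or scanlines) differ.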



I think one of the more efficient rendering techniques that will be used is Scanline rendering


https://en.wikipedia.org/wiki/Scanline_rendering

Scanline rendering

From Wikipedia, the free encyclopedia

Scan-line algorithm example

Scanline rendering is an algorithm for visible surface determination, in 3D computer graphics, that works on a row-by-row basis rather than a polygon-by-polygon or pixel-by-pixel basis. All of the polygons to be rendered are first sorted by the top y coordinate at which they first appear, then each row or scanline of the image is computed using the intersection of a scanline with the polygons on the front of the sorted list, while the sorted list is updated to discard no-longer-visible polygons as the active scan line is advanced down the picture.

The main advantage of this method is that sorting vertices along the normal of the scanning plane reduces the number of comparisons between edges. Another advantage is that it is not necessary to translate the coordinates of all vertices from the main memory into the working memory—only vertices defining edges that intersect the current scan line need to be in active memory, and each vertex is read in only once. The main memory is often very slow compared to the link between the central processing unit and cache memory, and thus avoiding re-accessing vertices in main memory can provide a substantial speedup.
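The sorted-list bookkeeping described above can be illustrated with a heavily simplified sketch (axis-aligned spans stand in for real polygon edges; `scanline_render` and its shape tuples are invented for the example):

```python
# Toy sketch of the scanline idea: shapes are sorted by the y at which
# they first appear; as the scan line moves down, newly reached shapes
# enter the active list and finished ones are discarded. Real renderers
# track polygon edges per scanline; this version uses vertical spans.

def scanline_render(shapes, height):
    # shapes: (y_top, y_bottom, depth, label); smaller depth = nearer
    shapes = sorted(shapes, key=lambda s: s[0])
    active, out, nxt = [], [], 0
    for y in range(height):
        # shapes whose top edge has been reached enter the active list
        while nxt < len(shapes) and shapes[nxt][0] <= y:
            active.append(shapes[nxt])
            nxt += 1
        # shapes the scan line has passed are discarded
        active = [s for s in active if s[1] > y]
        # frontmost active shape wins this row
        visible = min(active, key=lambda s: s[2], default=None)
        out.append(visible[3] if visible else ".")
    return out

rows = scanline_render([(1, 4, 2, "A"), (2, 5, 1, "B")], 6)
# rows == ['.', 'A', 'B', 'B', 'B', '.']
```

Note how each row is resolved once, using only the shapes crossing the current line, which is exactly the memory advantage the excerpt describes.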




Use in realtime rendering[edit]

The early Evans & Sutherland ESIG line of image-generators (IGs) employed the technique in hardware 'on the fly', to generate images one raster-line at a time without a framebuffer, saving the need for then costly memory. Later variants used a hybrid approach.

The Nintendo DS is the latest hardware to render 3D scenes in this manner, with the option of caching the rasterized images into VRAM.

The sprite hardware prevalent in 1980s games machines can be considered a simple 2D form of scanline rendering.
The technique was used in the first Quake engine for software rendering of environments (but moving objects were Z-buffered over the top). Static scenery used BSP-derived sorting for priority. It proved better than Z-buffer/painter's type algorithms at handling scenes of high depth complexity with costly pixel operations (i.e. perspective-correct texture mapping without hardware assist). This use preceded the widespread adoption of Z-buffer-based GPUs now common in PCs.

Sony experimented with software scanline renderers on a second Cell processor during the development of the PlayStation 3, before settling on a conventional CPU/GPU arrangement.

Similar techniques[edit]

A similar principle is employed in tile based deferred rendering (most famously the PowerVR 3D chip); that is, primitives are first sorted into screen space, then rendered in fast on-chip memory, one tile at a time. The Dreamcast provided a mode for rasterizing one row of tiles at a time for direct raster scanout, saving the need for a complete framebuffer, somewhat in the spirit of hardware scanline rendering.
Some software rasterizers use 'span buffering' (or 'coverage buffering'), in which a list of sorted, clipped spans are stored in scanline buckets. Primitives would be successively added to this datastructure, before rasterizing only the visible pixels in a final stage.

Comparison with Z-buffer algorithm[edit]

The main advantage of scanline rendering over Z-buffering is that the number of times visible pixels are processed is kept to the absolute minimum which is always one time if no transparency effects are used—a benefit for the case of high resolution or expensive shading computations.
 

onQ123

Member
Listen to yourself here...

Sure, 1080i at 60fps is the same pixel count as 1080p at 30fps... but it's NOT the same pixel count as 1080p at 60fps. So if you have a 60fps game like Dead or Alive 4 that outputs 1080i, it's not equivalent to 1920x1080/60fps... it'd be half that, in other words 1920x540/60fps.

You've gone completely away from your "what the game buffer rendered" reasoning now, and have gone straight into "you realise it'll be 1080p as your output, right?" territory. Guess what, 900p/60fps would be greater in pixel count than 1080p/30fps, but surely you can see how silly an arguing point that would be, right?

If we only care what the final image is made up of, then everything you play on a 1080p screen is 1920x1080, regardless of whether it's checkerboarded, native, or upscaled. You've rendered this talking point meaningless.



1080i 60Hz is not 1080i 60fps, because each frame is made up of an odd & even field instead of being drawn to the screen all at once.
 

Synth

Member
1080i 60Hz is not 1080i 60fps, because each frame is made up of an odd & even field instead of being drawn to the screen all at once.

Yea, and that's done by your TV deinterlacing it. If you output that same game on a 1080i CRT, only half of that info (the part actually put out by the console at the time) gets drawn each frame. The rest of it is the TV making the rest up, much like it does when told to draw a 900p image onto a 1080p panel.
 

Nanashrew

Banned
Again though, this is mostly a case of the genres that they commonly release being those that have 60fps as a popular framerate in all cases. You have Smash Bros, but that sits in the same bracket as something like Killer Instinct, Tekken, Street Fighter, etc. You have Mario Kart... and whilst I'll give credit here as non-sim racers often opt for 30fps, the first parties are still putting out Forzas, WipEouts (well, they were) and the like. 2D platformers are 60fps as a rule (Ori, Rayman, etc). Multiplayer focused shooters are 60fps as a rule (CoD, Halo, Battlefield, Overwatch and the mp portions of games like Gears and Uncharted).

When Nintendo creates more typical "experience" based games they do just as other people do. Zelda isn't 60fps for non-remakes, the Metroid Primes weren't, Xenoblade isn't, Bayonetta 2 certainly doesn't qualify for 60fps, etc... they just have fewer of these games than others do... but the platform also has fewer 60fps games, and the ones it has can generally be matched with an equivalent elsewhere.

Err.. The Metroid Prime games do run at 60fps.
 
There is no "uprendered" lol

Checkerboard is upscaling the image... the difference is the algorithm used.

Checkerboard rendering is not an upscaling technique.

Because checkerboard rendered 4K is still 4K, it's just being broken down into smaller blocks

https://en.wikipedia.org/wiki/Alternate_frame_rendering

Please stop linking to deprecated technologies that have nothing to do with the PS4 Pro. AFR refers to methods used to split workloads that have no bearing on a modern APU.

Upscaling doesn't have one single method to upscale. There are various ways of interpolating the image in order to scale it. I'd argue checkerboard rendering is more in line with upscaling because they're both interpolation methods to increase the resolution of the final output. It's certainly a lot closer to upscaling than native 4K, IMO.

Checkerboard rendering is not an upscaling technique. At no time is anything ever "scaled" unless your target framebuffer is sub 4K to begin with on the PS4 Pro.
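To make the disputed distinction concrete, the general idea behind checkerboard rendering can be sketched like this (a simplified illustration of the technique as commonly described, not Sony's specific, unpublished algorithm; the function names are invented):

```python
import numpy as np

# Sketch of the general checkerboard idea: each frame renders half the
# pixels in an alternating 2x2-quad pattern and keeps the other half
# from the previous frame, so the full grid is covered over two frames.

def checker_mask(h, w, phase):
    """Boolean mask selecting alternating 2x2 quads; phase flips it."""
    yy, xx = np.indices((h, w))
    return ((xx // 2) + (yy // 2) + phase) % 2 == 0

def checkerboard_frame(new_samples, previous, phase):
    h, w = previous.shape
    out = previous.copy()
    mask = checker_mask(h, w, phase)
    out[mask] = new_samples[mask]   # freshly rendered half
    return out                      # other half reused from last frame

prev = np.zeros((4, 4))
frame = checkerboard_frame(np.ones((4, 4)), prev, phase=0)
assert frame.sum() == 8             # exactly half the pixels updated
```

Every output frame is at the full target resolution, but only half of its pixels were shaded this frame; the rest are reconstructed from history, which is the crux of the "is it upscaling?" argument here.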

Yea, and that's done by your TV deinterlacing it. If you output that same game on a 1080i CRT, only half of that info (the part actually put out by the console at the time) gets drawn each frame. The rest of it is the TV making the rest up, much like it does when told to draw a 900p image onto a 1080p panel.

On a CRT your TV doesn't make anything up. The missing lines are just scanned at an alternating cadence. The phosphors glow long enough for the image to be sustained somewhat.
 

Synth

Member
On a CRT your TV doesn't make anything up. The missing lines are just scanned at an alternating cadence. The phosphors glow long enough for the image to be sustained somewhat.

I'm not saying a CRT makes anything up. I'm saying the CRT will draw what was actually output by the system. LCDs have to attempt to reconcile the two separate fields into one whole, however.

A CRT outputting 900p would be displaying more than the same screen outputting 1080i if the game were to be running at 60fps in both cases.

EDIT: Reading that line back, I can see I was pretty unclear in how I worded it.
 
We have known for 6 months that the PS4 Pro wasn't powerful enough for 4K. Not sure why anyone is even arguing it at this point. It will push more pixels and frames than both PS4 and Xbone. Games will look much better and they will have some tricks to produce cleaner upscaled 4K. There will be a few games that will be native but most will be 4kpr (or whatever you want to call it). And obviously there will be streamed 4K content.

How about we just enjoy a more powerful console upgrade for the first time in our lives and worry about Scorpio at this time next year? I can't believe there are so many people that are negative about what the PS4 Pro is and is capable of. We would have killed for something like this to happen over the past 10 years.
 

dogen

Member
You pretty much can, unless the 320gb/s number is calculated with some sort of ESRAM of its own. It's the same bandwidth that a 980ti and a 1080 have, and higher than that of a 1070. There's no reason at all to assume a Scorpio wouldn't be able to run at native 4K any game that the XB1 can run at 1080p, from what we know so far.

We don't know if Scorpio has ESRAM or not, so it's not accurate yet to say it has 5x the bandwidth.
 

MacTag

Banned
Again though, this is mostly a case of the genres that they commonly release being those that have 60fps as a popular framerate in all cases. You have Smash Bros, but that sits in the same bracket as something like Killer Instinct, Tekken, Street Fighter, etc. You have Mario Kart... and whilst I'll give credit here as non-sim racers often opt for 30fps, the first parties are still putting out Forzas, WipEouts (well, they were) and the like. 2D platformers are 60fps as a rule (Ori, Rayman, etc). Multiplayer focused shooters are 60fps as a rule (CoD, Halo, Battlefield, Overwatch and the mp portions of games like Gears and Uncharted).

When Nintendo creates more typical "experience" based games they do just as other people do. Zelda isn't 60fps for non-remakes, the Metroid Primes weren't, Xenoblade isn't, Bayonetta 2 certainly doesn't qualify for 60fps, etc... they just have fewer of these games than others do... but the platform also has fewer 60fps games, and the ones it has can generally be matched with an equivalent elsewhere.
I think that's fair, Nintendo does tend to specialize in genres where 60fps response benefits gameplay (platformers, racers, party games, fighters, puzzlers, now shooters). And when it comes to genres where there's little benefit, like rpgs or adventure games, they usually drop to 30fps so they can up the scope and splendor.

I think that's also precisely the reasoning for why Nintendo generally gets "60fps credit" though. Both proportionately and in terms of raw numbers they're just putting out more 60fps games on consoles than most other publishers, all their 1st party Wii U titles aimed for 60fps except Zelda, Pikmin, Xenoblade, TMS and Lego City. And if rumors about NX spec are true (CPU heavy) then that's probably going to continue.
 

Synth

Member
We don't know if Scorpio has ESRAM or not, so it's not accurate yet to say it has 5x the bandwidth.

True. But then I'd say that you can still ignore ESRAM, and it's more a case of whether the Scorpio actually has 320gb/s. Because if that number is using some fuzzy ESRAM + DDR3 style calculation then it's simply not a legit spec. I highly doubt that it'll have insufficient bandwidth for 4K either way though.

I think that's fair, Nintendo does tend to specialize in genres where 60fps response benefits gameplay (platformers, racers, party games, fighters, puzzlers, now shooters). And when it comes to genres where there's little benefit, like rpgs or adventure games, they usually drop to 30fps so they can up the scope and splendor.

I think that's also precisely the reasoning for why Nintendo generally gets "60fps credit" though. Both proportionately and in terms of raw numbers they're putting out more 60fps games on consoles than most other publishers, all their 1st party Wii U titles aimed for 60fps except Zelda, Pikmin, Xenoblade, TMS and Lego City. And if rumors about NX spec are true (CPU heavy) then that's probably going to continue.

Yea, I should probably not have said "Nintendo get too much credit", as that kinda muddies what I was actually getting at... more that Nintendo or PC is the way to go. It wouldn't really matter if 15 out of 15 Nintendo games were 60fps; it doesn't make sense to recommend them over the other consoles for that reason primarily, if the other consoles even have 25 out of 100 games being 60fps. If someone is averse to lower framerates then they should simply not buy games that aren't 60fps, but they shouldn't necessarily buy a console that actually has fewer 60fps games, simply because it lacks a larger set that aren't.
 
The argument over native and non-native really doesn't mean anything to me. If the game looks good, it looks good. Native 4K isn't something I automatically believe makes games any more visually impressive or just plain better, especially if one too many compromises had to be made in order to get there. Quantum Break wasn't Native 1080p, but looked fantastic as far as I'm concerned.
 

MacTag

Banned
Yea, I should probably not have said "Nintendo get too much credit", as that kinda muddies what I was actually getting at... more that Nintendo or PC is the way to go. It wouldn't really matter if 15 out of 15 Nintendo games were 60fps; it doesn't make sense to recommend them over the other consoles for that reason primarily, if the other consoles even have 25 out of 100 games being 60fps. If someone is averse to lower framerates then they should simply not buy games that aren't 60fps, but they shouldn't necessarily buy a console that actually has fewer 60fps games, simply because it lacks a larger set that aren't.
Sure, but what if those 15 games you can only get on Nintendo's console and out of those 25 games on the other consoles you can get say 23 of them on PC. The PC+Nintendo combo still basically holds up.
 

Synth

Member
Sure, but what if those 15 games you can only get on Nintendo's console and out of those 25 games on the other consoles you can get say 23 of them on PC. The PC+Nintendo combo still basically holds up.

In the XB1's case this would likely hold true, as stuff like Forza, Halo, Killer Instinct etc become available on PC.

For the PS4, I'm pretty confident the tally would still be comfortably in its favour even with the removal of multiplats. At the end of the day you should buy a Nintendo console because you want Nintendo games, not because you're 60fps sensitive. The PC is the only platform that should be recommended on that basis alone.
 

MacTag

Banned
In the XB1's case this would likely hold true, as stuff like Forza, Halo, Killer Instinct etc become available on PC.

For the PS4, I'm pretty confident the tally would still be comfortably in its favour even with the removal of multiplats.
I'm not. With most Japanese pubs moving hard on to Steam it really just isn't the case anymore. And with Sony's 1st party all their 60fps exclusives are seemingly PS3/Vita remasters, PSVR demoware or GT.
 
The argument over native and non-native really doesn't mean anything to me. If the game looks good, it looks good. Native 4K isn't something I automatically believe makes games any more visually impressive or just plain better, especially if one too many compromises had to be made in order to get there. Quantum Break wasn't Native 1080p, but looked fantastic as far as I'm concerned.

I agree with the main point of your argument that resolution isn't the only thing that matters when it comes to graphics. In my opinion 60 fps in a game is better than the same game running at a higher resolution at 30 fps. I'll also run a game at a lower resolution if running it on native is too taxing for my system and I get a bad framerate or unacceptably low settings. That said, if the hardware can handle it then running a game at the display's native resolution (or higher) IS better. The same game at the same settings and framerate but at a higher resolution will look better.
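To put rough numbers on the "same settings, higher resolution" point above, here is a back-of-the-envelope pixel-count comparison. This is only an illustrative sketch; the resolutions listed are common render targets, not confirmed figures for any specific game discussed in the thread.

```python
# Illustrative comparison of per-frame pixel counts at common render
# resolutions, to show how much extra work "native" output really is.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "native 4K": (3840, 2160),
}

def pixels(width, height):
    """Total pixels rendered per frame at the given resolution."""
    return width * height

baseline = pixels(1920, 1080)
for name, (w, h) in RESOLUTIONS.items():
    count = pixels(w, h)
    print(f"{name:>10}: {count:>10,} pixels ({count / baseline:.2f}x 1080p)")
```

Native 4K is exactly four times the pixels of 1080p per frame, which is why hitting it at the same settings and framerate demands so much more GPU throughput.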

So far Sony has been very clear about the PS4 Pro and even though Cerny did use some silly terms (brute forcing resolution comes to mind) there isn't any indication that they are going to deceive the audience by claiming that games run at 4K.
 

Synth

Member
I'm not. With most Japanese pubs moving hard on to Steam it really just isn't the case anymore. And with Sony's 1st party all their 60fps exclusives are seemingly PS3/Vita remasters, PSVR demoware or GT.

If we were to eliminate games that appeared on Vita or PS3, then it probably would tip towards the Wii U (though those games in nearly all cases weren't 60fps before either, so by the logic we're using, they should effectively be new to this person). Outside of that though, there's still enough random Japanese stuff that doesn't land on PC (or takes some unpredictably long window of time... hello Xrd Revelator) to make a noticeable difference. Whether that be fighters like King of Fighters 14, RPGs like Persona 5 or Yakuza 0, or off-the-wall stuff like Dead or Alive Xtreme 3.

Now would someone want to play Dead or Alive Xtreme 3, simply because it's 60fps? Probably not... but that's why it's not something to recommend a platform for, unless you could apply it to effectively anything you would want to play (PC).
 

geordiemp

Member
The most popular Wii U games are 60 fps. Mario Kart, Sm4sh, etc. Nintendo is getting credit because they are developing good games at 60 fps themselves.

No, the ability for devs to hit 60 FPS has more to do with GENRE and game ambition.

Nintendo do a lot of small-map 2D or 3D platformers and other genres that you would expect to be 60 FPS.

Kart racer = 60 FPS = you don't say
Smash, a 2D fighter = knock me down, surprised a 2D fighter hits it

Any 2D game or short draw distance 3D with cartoon graphics then yeah, should be.

We are talking about large-map AAA full-3D games, and the only two I can think of on Wii U are Zelda: Breath of the Wild and Xenoblade. And guess what? Actually, I can't think of such a game being released at 1080p on Wii U yet.

Are they 1080p60? Lol, no.

On console, large-world RPGs or action games like Ass Creed are generally 30fps on any platform. You get the odd standout, MGS V, Doom, some Frostbite games now, but it's not Nintendo, Sony or MS in general.
 

leeh

Member
Yes, they said new semi-custom SoCs featuring Zen are coming in 2018 for the PC market, which means nothing with regard to consoles. MS could specifically be working with AMD on a custom SoC featuring Zen that could be out before the same tech is available in the PC market. They know they need Zen to provide a real generational jump.
That article is FUD. There's no direct quote and if anything it implies the opposite.
 

c0de

Member
I meant just for the comparison.

Oh. Well, the user was only mentioning the speed of DDR3, of course.
ESRAM is of course very fast, not only in raw bandwidth but in access times and in its speed across different access patterns, which don't vary as much as on the PS4's one big pool, where bandwidth drops by a good amount when the CPU and GPU access it at the same time.
I think MS is aware that this is crucial for performance and will have a much better solution to this than Sony.
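The contention point above can be sketched with a toy model. The 176 GB/s peak is the widely published figure for the PS4's unified GDDR5 pool; the contention penalty here is an assumed illustrative number, not measured data, so treat the result as a shape of the argument rather than a real benchmark.

```python
# Toy model of shared-pool memory bandwidth contention.
# The 20% penalty below is an ASSUMED illustrative figure, not a measurement.

def effective_bandwidth(peak_gbps, contention_penalty):
    """Bandwidth left for the GPU when the CPU competes for the same pool.

    contention_penalty: assumed fraction of peak bandwidth lost to mixed
    CPU/GPU access patterns (arbitration overhead, bank conflicts, etc.).
    """
    return peak_gbps * (1.0 - contention_penalty)

# Hypothetical scenario: a 176 GB/s unified pool losing ~20% under
# heavy simultaneous CPU traffic.
remaining = effective_bandwidth(176.0, 0.20)
print(f"Effective GPU bandwidth: {remaining:.1f} GB/s")
```

Under that assumed 20% penalty the GPU is left with roughly 141 GB/s, which is the kind of gap the poster is arguing a dedicated scratchpad like ESRAM sidesteps.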
 
Sony's pitch worked, look at all the TV threads!

Good-latency, high-brightness HDR 4K screens are still a few years from being mainstream and affordable, but people are already jumping the gun for the 4K PR push.

The more I read about the support, or lack thereof, for games on the PS4Bro, the less I feel like keeping the pre-order going. It's turning into a bigger question: "will this $3500 Canadian TV make my games look better?" Well yes, no shit it will! Hell, the built-in upscaler alone will make it look better lol.

Haven't seen much benefit from the Pro, but we know HDR looks great on new expensive-ass screens...
 