
Next-Gen PS5 & XSX |OT| Console tEch threaD


CrustyBritches

Gold Member
I just noticed something I missed earlier due to the fragmentation of leaks. Chronological order:
1. January 17, 2019 | 1GHz core clock


2. April 10, 2019 | 1.8GHz core clock


3. June 10, 2019 | New obscured Gonzalo (Oberon?)


4. June 25, 2019 | FS overall score reveal

---
They were likely testing the 2GHz Gonzalo/Oberon at this point, since the obscured "Gonzalo" was already appearing by June 10, 2019. *edit* updated info


Benji's "13TF" post was April 30, 2019.*Theory-craft* First known public leak of Navi-based kits come via Reddit OQA certification leak on May 20th, 2019. Benji's source likely did not have Gonzalo/Navi-SoC based kits at this point. Navi 10 retail drops July 7th, 2019.
 
Last edited:

Sosokrates

Report me if I continue to console war
I've been casually revisiting some of these old leaks, especially anything that was deleted/removed quickly after being posted. One in particular I like is Benji-sales and his contrite attitude after posting the PS5 dev kit specs and removing them soon after (April 30, 2019):
[image: screenshot of Benji-sales' deleted dev kit tweet]


This is the root of the 13TF rumor for the PS5 from what I can see.
---
Benji actually has a lot of interesting discussion on his Twitter.


It's pretty clear that he believes the 13TF figure was from an early dev kit, and he now follows the same theory as Apisak and Digital Foundry.
---
I've been going back and getting a timeline of relevant leaks:

Can anybody point me to where the idea of 13TF+ might have come from if not Benji?


That Reddit leak gives a die size of 315mm².
If that's the case, the PS5 can't be more than 36 CUs, likely clocked high, around 1800MHz.
 

Mass Shift

Member
My issue with the Epsilon leak, besides the absolutely insane specs (32GB of GDDR6 is almost 50W of power), is that it contains clear typos. Just look at the first slide: "gameplataforms".

Well, anyone who has ever seen a professional publishing process knows that any technical document first gets approval, then goes back for scrutiny, authentication, proofreading and grammar corrections before getting approved a second time and being submitted as an official corporate document.

Those typos would have never escaped that process. Just real amateur hour stuff going on.
 

CrustyBritches

Gold Member
That Reddit leak gives a die size of 315mm².
If that's the case, the PS5 can't be more than 36 CUs, likely clocked high, around 1800MHz.
PS5 is Navi 10 Lite, 36CU min, 40CU max, with 3 GPU clock profiles: PS5 = 2GHz, PS4 Pro = 911MHz, PS4 = 800MHz.

Probable Fire Strike overall score @ 1.8GHz/2GHz = 20K+. A 20K+ overall can be either an RX 5700 (BIOS-flashed to 5700 XT) + Ryzen 5 2600, or an RX 5700 XT + Ryzen 5 1600. *Both of these GPUs completely mash a Vega 64.
 
Last edited:

R600

Banned
And what about Prospero now? The final devkits won't be produced until next year. There are still more changes to come.
Prospero could be the PlayStation 5 codename, like the PS4's was Orbis; yet the PS4's SoC was codenamed Liverpool (AMD), as well as Thebe and Starsha (Sony).

We already have several people (and images) showing the special PS5 dev kit has been out since June. And what was leaked on May 21st? An alleged PS5 devkit PCB. While everyone said it was far too early and therefore fake, it looks more and more likely by the day, especially since it matches July's Flute leak.

Flute, Gonzalo, Prospero... all character names from Shakespeare.
 
Last edited:

R600

Banned
That Reddit leak gives a die size of 315mm².
If that's the case, the PS5 can't be more than 36 CUs, likely clocked high, around 1800MHz.
It can, actually; I already covered that.

40 CUs on die with a 256-bit bus (~253mm²) + 20% more space per CU for RT (~20mm², which is 2x the Nvidia-equivalent size for good measure) + Zen 2 with 1/4 of its L3 cache (~40mm²; the full Zen 2 chiplet is ~70mm²).

I guess Scarlett would be bigger, but a 320-bit bus already takes ~16mm² more space. If they have lower-clocked RAM chips, their effective BW will be neck and neck with the PS5's, with the PS5 running hotter and the Scarlett SoC being larger. If they go for additional cache (might be the case, given they keep going on about it), that is an additional 15mm², and you are already at ~350mm² for an effectively similarly specced SoC, with the PS5 even having a chance of being more powerful if they clock it higher. The number of CUs could therefore be the same.
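For reference, that napkin math is easy to tabulate; a minimal sketch, where every figure is the speculative estimate from the post above rather than a confirmed measurement:

```python
# Napkin-math die-area budget for a hypothetical 7nm PS5 SoC.
# All values are the speculative mm^2 estimates from the post above.
ps5_parts = {
    "40CU GPU + 256-bit bus": 253.0,    # Navi 10-class layout
    "RT hardware (+20% per CU)": 20.0,  # ~2x the Nvidia-equivalent area, for good measure
    "Zen 2 cores, 1/4 L3 cache": 40.0,  # the full Zen 2 chiplet is ~70 mm^2
}
scarlett_extra = {
    "320-bit bus (vs 256-bit)": 16.0,
    "additional cache": 15.0,
}

ps5_total = sum(ps5_parts.values())
scarlett_total = ps5_total + sum(scarlett_extra.values())
print(f"PS5 SoC estimate:      ~{ps5_total:.0f} mm^2")       # ~313, near the leaked ~316 mm^2
print(f"Scarlett SoC estimate: ~{scarlett_total:.0f} mm^2")  # ~344, in the ~350 mm^2 ballpark
```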
 

CrustyBritches

Gold Member
Yes.

Listen, it's not about lowballing. The RX 5700 is 36CU, the RX 5700 XT is 40CU.
The next AMD cards (5800) are gonna have 52CU & 56CU (I assume), so:
52CU = 12.6TF, which is gonna sell for $450-$500
56CU = 13.6TF for $500-$550
In order to reconcile:
1. Benji's 13TF Tweet
2. Gonzalo 20K+ FS Score
3. Sony's BC patent
4. Historic Console TDP
5. Reddit PCB die size
6. Gonzalo's Navi 10 Lite designation

...IMO it cannot be "Big Navi", since "Big Navi" would have a projected overall Fire Strike score of 23K/24K+ even with a jank CPU. Apisak said "5700XT + 3700X = next gen console simulator". Furthermore, "Big Navi" doesn't fit in the Reddit OQA PCB leak's ~316mm² or Komachi's Chinese forum leak's "~300mm²" size.
---
R600
Richard Leadbetter's theory-craft from April 20, 2019 was that the MS and Sony partnership for Azure to host PS Now and eventually PSN, along with both companies' collaboration with AMD, has led to a situation where they are aware of each other's specs and possibly collaborating on the aforementioned integration.

*Theory-craft* The PS5's narrow+high-clock design was chosen to facilitate PS4 BC and Sony's "Hybrid/Shader RT" implementation, with the high core clock providing a "high tide raises all ships" effect. MS has slightly different priorities with Xbox, yet strangely retains the Shakespearean codename nomenclature with Arden/Argulas. Their RT implementation could be different, they might have retained more "Game Cache" than Sony, and they have a different approach to BC/forward compat. Nonetheless...
 
Last edited:
Well, anyone who has ever seen a professional publishing process knows that any technical document first gets approval, then goes back for scrutiny, authentication, proofreading and grammar corrections before getting approved a second time and being submitted as an official corporate document.

Those typos would have never escaped that process. Just real amateur hour stuff going on.
Someone had speculated that they occasionally insert intentional typos to track down leakers...

Either way, we didn't have 7nm products back in early 2018. Sony (and MS) plan way ahead (since they have access to AMD/TSMC/Samsung roadmaps): they set targets and wait for the silicon tech to mature enough for mass production.

Cerny himself said that PS5 R&D started back in 2015, probably with the rumored (by OsirisBlack) 2nd SKU of PS4 Pro (Zen 1 CPU, $499 MSRP). A proto-PS5 if you will.

The way I see it, it's a continual process of making experimental APUs and building upon that. There's nothing final right now.

It's the same thing with controllers. They make dozens of prototypes, but only one variant will eventually become a commercial product.
 

Imtjnotu

Member
Tbh, from the Apisak/Gonzalo/PCB/Flute leaks, it's all been very consistent.

1st leak (January) specified Navi10 Lite clocked at 1.0GHz.

2nd leak (April) specified a Navi10 Lite 2nd revision and 1.8GHz clocks.

3rd leak (May) specified PCB specs, including a 256-bit bus and 18Gbps RAM chips, with an SoC of around 316mm².

4th leak (July) Flute showed CPU clocks and performance, as well as memory chips matching the 3rd leak, the PCB from May.

I want to bring 2 points to discussion.

1) There are no 18Gbps chips currently available. All the Super cards and the new AMD cards (5500) use 16Gbps. But the May leak specified the Samsung part code for 18Gbps chips, and these will actually be available in mass production by early next year. What this means is Sony probably went with a "leaner" SoC and higher-clocked memory chips to win back the lost BW, while MS seems to have gone with a wider bus but slower memory, thus a bigger SoC but similar effective BW.

2) Navi10 Lite does not mean a cut-down of the PC Navi10 (which is the 5700 XT). In AMD's case, 10 is the codename for the first GPU of a generation. In the last few years that was always the top-spec card, but this time it's the 5700. What might have been the case when Gonzalo leaked back in January is that Navi10 was to be the full 56/64CU Navi, and the cut-down version was therefore called Lite. After that, it was decided that the big card was a no-go and the Radeon VII would be their big GPU, while Navi10 would be the 5700.

In any case, the most important thing about leaks is consistency, and this is why, IMO, Gonzalo/Flute/PCB make so much sense. Not only do they come straight from the horse's mouth (AMD data miners), they also connect to one another in very specific ways (Gonzalo clocks, codename nomenclature similar to other Sony AMD chips, bus width, chips used, etc.).
I've posted this a million times in this thread.

Samsung started mass production of 16Gb RAM chips at 18Gbps last year.
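For what it's worth, the "similar effective BW" argument in point 1 above checks out arithmetically; a minimal sketch, assuming the leaked/speculated configurations (256-bit @ 18Gbps for PS5, 320-bit @ 14Gbps for Scarlett):

```python
def peak_bandwidth_gbs(bus_width_bits: int, pin_speed_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width x per-pin speed / 8 bits per byte."""
    return bus_width_bits * pin_speed_gbps / 8

# Leaked PS5 PCB: 256-bit bus with 18Gbps GDDR6.
print(peak_bandwidth_gbs(256, 18))  # 576.0 GB/s
# Speculated Scarlett: 320-bit bus with slower 14Gbps chips.
print(peak_bandwidth_gbs(320, 14))  # 560.0 GB/s -- neck and neck, as argued above
```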

 

CrustyBritches

Gold Member
I was looking for where Anandtech might have pulled that from, and it's likely from the Samsung press release posted on their site January 18, 2018:
Built on Samsung’s advanced 10-nanometer (nm) class* process technology, the new GDDR6 memory comes in a 16Gb density, which doubles that of the company’s 20-nanometer 8Gb GDDR5 memory. The new solution performs at an 18-gigabits-per-second (Gbps) pin speed with data transfers of 72 gigabytes per second (GBps), which represents a more than two-fold increase over 8Gb GDDR5 with its 8Gbps pin speed.
...
Samsung’s immediate production of GDDR6 will play a critical role in early launches of next-generation graphics cards and systems. With all of its improvements in density, performance and energy efficiency, the 16Gb GDDR6 will be widely used in rapidly growing fields such as 8K Ultra HD video processing, virtual reality (VR), augmented reality (AR) and artificial intelligence.

With extensive graphics memory lineups including the new 18Gbps 16Gb GDDR6 and recently introduced 2.4Gbps 8GB HBM2, Samsung expects to dramatically accelerate growth of premium memory market over the next several years.
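The 72 GB/s per-chip figure in that release is just the pin speed multiplied across the chip's interface width; a quick check, assuming the standard 32-bit GDDR6 chip interface (not something the release itself spells out):

```python
# Per-chip GDDR6 throughput: pin speed x interface width / 8 bits per byte.
# GDDR6 chips expose a 32-bit interface (two 16-bit channels).
pin_speed_gbps = 18  # Samsung's 18Gbps part
interface_bits = 32
print(pin_speed_gbps * interface_bits / 8)  # 72.0 GB/s, matching the press release
```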

Still can't find anything else about -HC18, but reading Eurogamer's Scarlett reveal breakdown, the Scarlett SoC render has a mix of memory chips..."ending in 325BC-HC14 and 325BM-HC14. The 'HC-14' confirms 14gbps speed, but the two part numbers suggest a mixture of both 1GB and 2GB memory modules. Whether this is just Microsoft seeking to throw off people like me looking for clues on the make-up of the system remains to be seen, but this is certainly a strange set-up."

From Samsung's site:
[image: Samsung's GDDR6 product listing]

*edit*
Did Sony gobble up all their current 18Gbps production capacity?
*last edit*
Keep in mind, the OQA PCB leak is still just a rumor, in comparison to the all-but-confirmed Gonzalo/Oberon data-mining leaks.
 
Last edited:
Let's sweeten things up for the non-believers: (if 2GHz is true) 40 CUs @ 2GHz = 10.2TF of Navi. That's if you believe what klee (don't know if that's what he calls himself) said about it being double-digit-teraflops Navi; when asked if it's better than the RX 5700 XT, wink wink, he nodded.
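That 10.2TF figure follows from the standard FP32 throughput formula for GCN/RDNA parts (CUs × 64 shaders × 2 FLOPs per clock × clock speed). A quick sketch, which also reproduces the other numbers floating around this thread if you assume the ~1.9GHz clock the rumored "5800" figures imply:

```python
def fp32_tflops(cus: int, clock_ghz: float) -> float:
    """FP32 TFLOPS for a GCN/RDNA GPU: CUs x 64 shaders x 2 FLOPs/clock x clock (GHz)."""
    return cus * 64 * 2 * clock_ghz / 1000

print(fp32_tflops(40, 2.0))  # 10.24 -> the "10.2TF" above
print(fp32_tflops(36, 1.8))  # 8.29  -> 36CU Gonzalo at its 1.8GHz test clock
print(fp32_tflops(52, 1.9))  # 12.65 -> the rumored 52CU "5800" (~12.6TF)
print(fp32_tflops(56, 1.9))  # 13.62 -> the rumored 56CU part (~13.6TF)
```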
 

llien

Member
Yes.

Listen, it's not about lowballing. The RX 5700 is 36CU, the RX 5700 XT is 40CU.
The next AMD cards (5800) are gonna have 52CU & 56CU (I assume), so:
52CU = 12.6TF, which is gonna sell for $450-$500
56CU = 13.6TF for $500-$550
So you're expecting the PS5 will have 13TF of Navi at a decent price, $449 to $500 a console? I don't think so, but if you do believe that, I respect it, and only time will tell.
The price argument can be addressed.
After all, the GPU is the most important part of a console, and unlike the GPU, the price of the "rest" doesn't change much; just the chip does.

The reason I don't believe the "much faster than 5700" theories is power consumption. The non-XT hovers at 180W (already problematic), and the XT slaps another 30-50W on top. I've heard the "but EUV" argument, but in my book EUV would simply bring power consumption down to reasonable console levels.
 

TLZ

Banned
They doin it again.
[image: leaked V-shaped PS5 dev kit]


They should at least remove all the obvious devkit stuff, like the 1-7 and everything else on there, and just leave the disc slot and a couple of USB 3 ports to give it a clean retail look. Also make it half that height with sharp, square edges, keeping the V. And there, you have a better-looking retail version.
 
Last edited:
They should at least remove all the obvious devkit stuff, like the 1-7 and everything else on there, and just leave the disc slot and a couple of USB 3 ports to give it a clean retail look. Also make it half that height with sharp, square edges, keeping the V. And there, you have a better-looking retail version.

So, something like this?

[image: console mock-up]


I could actually dig that.
 

Mass Shift

Member
That wouldn't matter. That's for public customers. Customers like Sony and MS would know well in advance when products are coming to market and would likely have first dibs on them. What Samsung or any semiconductor company lists on their website is usually far behind what their prime customers get.

I wasn't disputing anything. Samsung obviously made the announcement.

I only said it was interesting because it's been 21 months since they announced mass production for it. Not seeing it at least listed on the public site after all that time is a bit odd.

They have late product announcements from this year already detailed on the public site.
 

dark10x

Digital Foundry pixel pusher
I don't know if DF are MS shills. They have their own biases or preferences like everyone in this thread. Richard has always had a soft spot for MS; anyone who's seen their face-offs since the early days of PS360 knows this. It's hard to refute. The Alex guy is new and he seems to be a PC guy, but whatever, I don't see how that has translated into any poor coverage from him. Just a lot of jerking off over ray tracing, which is great because no one else seems to be doing that. John seems to be the only one without a preference. I don't know, I just find it hard to believe that they are all MS fanboys.

What I do find somewhat funny and hypocritical is their fawning over shitty Switch ports. I mean, these ports are hitting 360p. Fucking 360p. We can't even watch YouTube videos at 360p nowadays; hell, I back out as soon as I see a Pornhub video running at 360p. These guys used to make such a big deal about PS3 games running at sub-720p resolution. It was all MS back then, and they pretended as if they needed glasses because of how blurry non-720p games looked on the PS3.

When the PS4 launched and 900p-to-1080p became the norm between the two consoles, all of a sudden resolution became pointless. It was actually hilarious to see them look for framerate improvements in the X1 versions. I remember the Far Cry 4 face-off in particular because they focused on a 1-2 fps improvement as if it was a gamechanger. I'm like, you do realize the PS4 is pushing 50% more pixels.

At the end of the day, I would just like them to be consistent. You simply cannot look at Zelda and give it a passing grade as a tech site. You need to name and shame the studio. You need to tell your audience to skip the game. You made your name bitching about a 10% difference in pixels during the PS3 era and harp on and on about a 1-2 fps difference this gen. You don't get to pick and choose when to unleash the outrage. That is why they get so many accusations of being MS shills: they simply aren't consistent when giving games a pass.

As for downplaying tflops, it's bizarre, but I think they are being smart about it. They have left themselves a lot of room to wiggle out of the low tflops estimates, simply because they haven't given us an estimate. They are hinting at sub-10 tflops, but again, I don't see any real claim here. No off-the-record sources. I don't know why they wouldn't come out and tell us the tflops numbers if they knew. I truly believe they are just as clueless as we are. No offense intended to DF or any of the sub-10 tflops folks here.
You're welcome to think what you like, that's no problem, but I can at least explain why Switch is judged differently - from my perspective.

I don't think it should be judged on the same playing field as PS4 or Xbox One because it isn't those machines. It's a portable system with a very low power draw using a mobile chipset that is throttled down. It's not really about the Switch, however, it's about understanding these limitations and examining what developers were able to achieve.

Doom on Switch is horrible next to playing it on PS4, no doubt, but I don't think it's interesting to say that as it is 100% impossible to match PS4. It cannot be done. The hardware isn't there. Thus, I would judge it by what I feel the development team has achieved on said hardware. I think it's impressive that it was achieved on the Switch. So I like to look at what much more powerful machines have achieved and examine the areas where cutbacks were made, basically.

So that's what to keep in mind when looking at Switch videos - that's where I'm coming from. So, even if you don't agree, you can understand why they are presented this way.

I find that I tend not to make videos aimed solely at 'reviewing' the experience any longer - my approach today has evolved. I'm more interested in examining techniques developers have employed (or at least the areas where cutbacks were made) relative to the platform. It's the same thing that powers DF Retro - it's just an attempt to look at how things were accomplished given less powerful hardware.

If the Switch were capable of matching a PS4 and a port like Doom was released, I'd say it was terrible and have little praise to offer - but that's not what we have.

Of course, I also judged Ark: Survival Evolved VERY harshly on Switch because it WAS very poorly made.

It's a tricky thing - it's all about considering the capabilities of the hardware versus the results that have been achieved. Hopefully there's enough data presented where you could look at a video and say "OK, I see where the port falls short - I don't think this is good enough for me" even if I ultimately am impressed with the results. See what I mean?

You made your name bitching about a 10% difference in pixels during the PS3 era and harp on and on about a 1-2 fps difference this gen.
Keep in mind that the DF you're talking about isn't the same. I certainly didn't do that.

I didn't really cover many last gen games since I wasn't part of it at the time but, if you revisit them, it's pretty shocking just how awful many big games truly are in terms of performance. It was hit or miss but this gen is a massive improvement.

I think the key thing to keep in mind is that the approach to covering this stuff has changed so much. I didn't even make videos until late 2015, and I only really started finding what I liked to do with them a couple of years later. The approach to coverage now is not at all the same as it used to be.

If you just want the raw numbers, however, I'd say places like VG Tech are excellent for that as they basically offer that - just raw info. I don't personally find that info all that interesting on its own hence why I don't make videos like that but it is definitely out there. I just think development and understanding of techniques is rewarding and interesting to explore.

dark10x is a huge fanboy alright, of retro games, and I'm ok with that :messenger_relieved:
Yeah, this is true ha ha.

I don't really have any passion for the modern systems any longer. It's neat to examine but they don't incite any real emotions.

Older machines, however, with highly specialized hardware that differs greatly from other machines of the era is far more interesting to me. I could talk about that stuff all day.
 
Last edited:
The price argument can be addressed.
After all, the GPU is the most important part of a console, and unlike the GPU, the price of the "rest" doesn't change much; just the chip does.

The reason I don't believe the "much faster than 5700" theories is power consumption. The non-XT hovers at 180W (already problematic), and the XT slaps another 30-50W on top. I've heard the "but EUV" argument, but in my book EUV would simply bring power consumption down to reasonable console levels.
If the bold is true, then why do they need this weird V design? A more conventional & sleek design (a la XB1X) would suffice.

I'm starting to think perhaps SonGoku was right about them making a 250-300W console... it would be a first if they pull it off.
 

TLZ

Banned
Yeah, this is true ha ha.

I don't really have any passion for the modern systems any longer. It's neat to examine but they don't incite any real emotions.

Older machines, however, with highly specialized hardware that differs greatly from other machines of the era is far more interesting to me. I could talk about that stuff all day.
Then create your own retro channel, my duuuude. Here are some tips on how to start one:




:messenger_tears_of_joy:

Crazy guys.
 

"But, with the next-gen, I think you’ll see a big upgrade in CPU, because we really want to make sure that you don’t have any compromises with the framerates. Yes, we can do 4K, but we can also do 120 frames per second. So I think that type of capability will be something that people don’t see today"
Again, he's not talking about the GPU (meaning nothing crazy should be expected); the main focus is the CPU.
Don't get me wrong, GTX 1080 Ti perf in next-gen is great.
 

FranXico

Member

"But, with the next-gen, I think you’ll see a big upgrade in CPU, because we really want to make sure that you don’t have any compromises with the framerates. Yes, we can do 4K, but we can also do 120 frames per second. So I think that type of capability will be something that people don’t see today"
Again, he's not talking about the GPU (meaning nothing crazy should be expected); the main focus is the CPU.
Don't get me wrong, GTX 1080 Ti perf in next-gen is great.
We occasionally hear similar comments from Sony. It's a good thing specs will be similar and the CPU issues of this gen are being addressed.
 
Last edited:

McHuj

Member
Frame rates are up to the developers. Unless Sony and MS enforce frame rates on their platforms, things will be the same next gen.

I'm sure current-gen games could all run at 60-120 FPS with the upgraded GPU, but when developers start pushing physics, animation, AI, and whatever else they wish to make a next-gen game, frame rates will take a back seat unless enforced from above.
 

ethomaz

Banned

"But, with the next-gen, I think you’ll see a big upgrade in CPU, because we really want to make sure that you don’t have any compromises with the framerates. Yes, we can do 4K, but we can also do 120 frames per second. So I think that type of capability will be something that people don’t see today"
Again, he's not talking about the GPU (meaning nothing crazy should be expected); the main focus is the CPU.
Don't get me wrong, GTX 1080 Ti perf in next-gen is great.
Basically he has no idea what he is talking about.
Framerate compromises will always exist no matter where you are... even on top high-end PCs.

In any game, developers need to balance graphics quality vs. performance.
There is no exception.
 
Last edited:

TeamGhobad

Banned
It can, actually; I already covered that.

40 CUs on die with a 256-bit bus (~253mm²) + 20% more space per CU for RT (~20mm², which is 2x the Nvidia-equivalent size for good measure) + Zen 2 with 1/4 of its L3 cache (~40mm²; the full Zen 2 chiplet is ~70mm²).

I guess Scarlett would be bigger, but a 320-bit bus already takes ~16mm² more space. If they have lower-clocked RAM chips, their effective BW will be neck and neck with the PS5's, with the PS5 running hotter and the Scarlett SoC being larger. If they go for additional cache (might be the case, given they keep going on about it), that is an additional 15mm², and you are already at ~350mm² for an effectively similarly specced SoC, with the PS5 even having a chance of being more powerful if they clock it higher. The number of CUs could therefore be the same.

whats "better" being hotter or being larger?

Frame rates are up to the developers. Unless Sony and MS enforce frame rates on their platforms, things will be the same next gen.

I'm sure current-gen games could all run at 60-120 FPS with the upgraded GPU, but when developers start pushing physics, animation, AI, and whatever else they wish to make a next-gen game, frame rates will take a back seat unless enforced from above.

I wouldn't mind 60fps being forced.
 
Last edited:

Hobbygaming

has been asked to post in 'Grounded' mode.
60 frames is cool, but I still want some games to melt my eyeballs next gen, and developers know graphics are more marketable than framerates, so I expect 30 FPS games next gen too
 
Frame rates are up to the developers. Unless Sony and MS enforce frame rates on their platforms, things will be the same next gen.

I'm sure current-gen games could all run at 60-120 FPS with the upgraded GPU, but when developers start pushing physics, animation, AI, and whatever else they wish to make a next-gen game, frame rates will take a back seat unless enforced from above.
^ This.

We've had 1080p 120 fps claims since the PS3 era and we ended up with 720p 30 fps being the norm.


TLOU1 at 1080p 120 fps would look and play like a PS1/2 game. There's no way around that.

What they truly mean by 120 fps is MP games like Fortnite and Apex Legends. They already run at 60 fps on the Jaguar, so...
 
We occasionally hear similar comments from Sony. It's a good thing specs will be similar and the CPU issues of this gen are being addressed.
Companies never admit anything negative about their product until their new one is coming out, go get this one it's better.
Frame rates are up to the developers. Unless Sony and MS enforce frame rates on their platforms, things will be the same next gen.

I'm sure current-gen games could all run at 60-120 FPS with the upgraded GPU, but when developers start pushing physics, animation, AI, and whatever else they wish to make a next-gen game, frame rates will take a back seat unless enforced from above.
Basically he has no idea what he is talking about.
Framerate compromises will always exist no matter where you are... even on top high-end PCs.

In any game, developers need to balance graphics quality vs. performance.
There is no exception.
I think it's gonna be a hard sell to put out 30 fps next gen; I think they would be shamed for that.
whats "better" being hotter or being larger?
Being larger bro c'mon 👩‍🦰.
^ This.

We've had 1080p 120 fps claims since the PS3 era and we ended up with 720p 30 fps being the norm.


TLOU1 at 1080p 120 fps would look and play like a PS1/2 game. There's no way around that.

What they truly mean by 120 fps is MP games like Fortnite and Apex Legends. They already run at 60 fps on the Jaguar, so...
What the fuck are you smoking? Maybe I missed the part where he says 1080p 120fps. Where is it?
"TLOU1 at 1080p 120 fps would look and play like a PS1/2 game."
You are ridiculous.
 

ethomaz

Banned
I think it's gonna be a hard sell to put out 30 fps next gen; I think they would be shamed for that.
Since when is it hard to sell 30fps games?
Over 90% of next-gen games will be 30fps.

The reason is that it is easier to sell 30fps with pretty graphics than 60fps with downgrades.

I believe people are setting their expectations too high with 60fps... it won't ever happen on consoles, and there is little reason for it to.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
You're welcome to think what you like, that's no problem, but I can at least explain why Switch is judged differently - from my perspective.

I don't think it should be judged on the same playing field as PS4 or Xbox One because it isn't those machines. It's a portable system with a very low power draw using a mobile chipset that is throttled down. It's not really about the Switch, however, it's about understanding these limitations and examining what developers were able to achieve.

Doom on Switch is horrible next to playing it on PS4, no doubt, but I don't think it's interesting to say that as it is 100% impossible to match PS4. It cannot be done. The hardware isn't there. Thus, I would judge it by what I feel the development team has achieved on said hardware. I think it's impressive that it was achieved on the Switch. So I like to look at what much more powerful machines have achieved and examine the areas where cutbacks were made, basically.

So that's what to keep in mind when looking at Switch videos - that's where I'm coming from. So, even if you don't agree, you can understand why they are presented this way.

I find that I tend not to make videos aimed solely at 'reviewing' the experience any longer - my approach today has evolved. I'm more interested in examining techniques developers have employed (or at least the areas where cutbacks were made) relative to the platform. It's the same thing that powers DF Retro - it's just an attempt to look at how things were accomplished given less powerful hardware.

If the Switch were capable of matching a PS4 and a port like Doom was released, I'd say it was terrible and have little praise to offer - but that's not what we have.

Of course, I also judged Ark: Survival Evolved VERY harshly on Switch because it WAS very poorly made.

It's a tricky thing - it's all about considering the capabilities of the hardware versus the results that have been achieved. Hopefully there's enough data presented where you could look at a video and say "OK, I see where the port falls short - I don't think this is good enough for me" even if I ultimately am impressed with the results. See what I mean?


Keep in mind that the DF you're talking about isn't the same. I certainly didn't do that.

I didn't really cover many last gen games since I wasn't part of it at the time but, if you revisit them, it's pretty shocking just how awful many big games truly are in terms of performance. It was hit or miss but this gen is a massive improvement.

I think the key thing to keep in mind is that the approach to covering this stuff has changed so much. I didn't even make videos until late 2015, and I only really started finding what I liked to do with them a couple of years later. The approach to coverage now is not at all the same as it used to be.

If you just want the raw numbers, however, I'd say places like VG Tech are excellent for that as they basically offer that - just raw info. I don't personally find that info all that interesting on its own hence why I don't make videos like that but it is definitely out there. I just think development and understanding of techniques is rewarding and interesting to explore.


Yeah, this is true ha ha.

I don't really have any passion for the modern systems any longer. It's neat to examine but they don't incite any real emotions.

Older machines, however, with highly specialized hardware that differs greatly from other machines of the era is far more interesting to me. I could talk about that stuff all day.

First of all, your deep dives into the tech in games beyond framerate and resolution are the best content on DF. Keep it up even if it's not doing anything for you at present. I do think the CPU and SSD upgrades next gen will allow for some awesome destruction, physics and more interactivity, which should bring back some of your passion. But trust me, your deep-dive videos and Alex's ray tracing stuff are excellent and exactly what I want out of tech sites, because no game journalists nowadays give a crap about that stuff. Believe it or not, I still go back and rewatch DF analyses of Horizon, Uncharted and God of War instead of simply putting on the game, not just out of laziness, but because it's good to see you guys talk about the tech in such great detail.

Secondly, the Switch is still a $300 console/handheld. The PS4/X1 were already sub-$300 by the time it launched. I get that it's a handheld, but surely 720p should be the limit since that's the base resolution of the screen. And it's not even about the resolution. The graphics quality is simply unacceptable. I was looking at the Witcher 3 comparison by Tom and I couldn't help but wonder if it could run on last-gen consoles at those settings. It looks absolutely atrocious. Way, way worse than some of the cross-gen games we saw this gen. Titanfall, Watch Dogs, MGSV, and especially Rise of the Tomb Raider on the X360 don't look nearly as bad. So what's the frame of reference here? Devs are allowed to launch a game that looks awful just because it's on a $300 handheld?

Lastly, is there anything you can say about next gen beyond what you and Richard discussed during EGX? Are you expecting sub-10 tflops like the presentation suggests? Are you expecting 12 tflops of GCN performance or 12 tflops of RDNA performance? You guys were at Gamescom a couple of months ago; did you ask devs about the specs? I really couldn't care less about who is stronger as long as we get 10+ RDNA tflops. With all the increases in physics, destruction and NPC counts the CPU will allow, not to mention the fast traversal SSDs will allow, the GPU will need to be a lot more powerful to render all that stuff. Pop-in will be a massive problem next gen with faster traversal.
 
Last edited:
Let's say the PS5 is 8 TFLOPS of Navi RDNA. Sony can say "double the TFLOPS of PS4 Pro!" in marketing terms. But if Xbox Scarlett is similar TFLOPS, how can they market it as 'superior or more' compared to the 6 TFLOPS Xbox One X? Cause just being 2 TFLOPS more than the Xbone X doesn't sound too great :/
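For reference, the marketing asymmetry falls straight out of the official mid-gen figures (PS4 Pro = 4.2TF, One X = 6TF), taking the 8TF assumption above:

```python
# Relative uplift a hypothetical 8TF next-gen console could claim over each mid-gen refresh.
ps4_pro_tf, one_x_tf, nextgen_tf = 4.2, 6.0, 8.0
print(f"vs PS4 Pro: {nextgen_tf / ps4_pro_tf:.2f}x")  # 1.90x -- a "nearly double" headline
print(f"vs One X:   {nextgen_tf / one_x_tf:.2f}x")    # 1.33x -- a much weaker headline
```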
 
Since when is it hard to sell 30fps games?
Over 90% of next-gen games will be 30fps.

The reason is that it is easier to sell 30fps with pretty graphics than 60fps with downgrades.

I believe people are setting their expectations too high with 60fps... it won't ever happen on consoles, and there is little reason for it to.
I know that to push graphics you need to lower frame rates, BUT the main complaint (over the years) is that people are tired of 30 fps (this is what consoles are known for), and even Sony & MS's marketing strategy pushes this PR: "we made this powerful enough to run high framerates, and if it doesn't, it's the devs, not us". It has gotten to the point where it's shameful.
Don't feed the trolls. Ignore list is a wonderful invention.
Not my fault if you act like a Pokémon; sometimes the inner Pokémon in you takes over.
 

McHuj

Member
I think it's gonna be a hard sell to put out 30 fps next gen; I think they would be shamed for that.

I think MS will stick to 60 FPS for their first party games like Halo, Gears, Forza. But I don't think 3rd-party devs will care one bit. I think the group of us that really cares about frame rate is comparatively small next to the rest of the gaming population.
 
I think MS will stick to 60 FPS for their first party games like Halo, Gears, Forza.
You're right about cross-gen games, but what about next-gen ones?

If MS has cross-gen 1st party games at 60 fps and Sony has next-gen 1st party games at 30 fps, the gulf in terms of graphics/AI will be sizeable and that is only going to fuel their PR/marketing department.

Of course there's always a chance game development has become so expensive and lengthy, that "30 fps max graphics" may not be viable anymore. We'll see. If we get 60 fps SP games, it will most likely be due to cost/time concerns, not because platform holders will make it mandatory (MS had tried to make 720p & AA mandatory during the 360 era, but it didn't last).
 

dark10x

Digital Foundry pixel pusher
First of all, your deep dives into the tech in games beyond framerate and resolution are the best content on DF. Keep it up even if it's not doing anything for you at present. I do think the CPU and SSD upgrades next gen will allow for some awesome destruction, physics and more interactivity, which should bring back some of your passion. But trust me, your deep-dive videos and Alex's ray tracing stuff are excellent and exactly what I want out of tech sites, because no game journalists nowadays give a crap about that stuff. Believe it or not, I still go back and rewatch DF analyses of Horizon, Uncharted and God of War instead of simply putting on the game, not just out of laziness, but because it's good to see you guys talk about the tech in such great detail.

Secondly, the Switch is still a $300 console/handheld. The PS4/X1 were already sub-$300 by the time it launched. I get that it's a handheld, but surely 720p should be the limit since that's the base resolution of the screen. And it's not even about the resolution. The graphics quality is simply unacceptable. I was looking at the Witcher 3 comparison by Tom and I couldn't help but wonder if it could run on last-gen consoles at those settings. It looks absolutely atrocious. Way, way worse than some of the cross-gen games we saw this gen. Titanfall, Watch Dogs, MGSV, and especially Rise of the Tomb Raider on the X360 don't look nearly as bad. So what's the frame of reference here? Devs are allowed to launch a game that looks awful just because it's on a $300 handheld?

Lastly, is there anything you can say about next gen beyond what you and Richard discussed during EGX? Are you expecting sub-10 tflops like the presentation suggests? Are you expecting 12 tflops of GCN performance or 12 tflops of RDNA performance? You guys were at Gamescom a couple of months ago; did you ask devs about the specs? I really couldn't care less about who is stronger as long as we get 10+ RDNA tflops. With all the increases in physics, destruction and NPC counts the CPU will allow, not to mention the fast traversal SSDs will allow, the GPU will need to be a lot more powerful to render all that stuff. Pop-in will be a massive problem next gen with faster traversal.
Thanks!

So, yeah, you're not wrong about that - it's a tricky thing. I tend not to take money into account at all when creating videos so the price of the system isn't a factor at all.

I'm not sure that Witcher 3 could be done on last gen as it's not just horsepower where they're outdated - the Switch GPU is more modern and supports features that they do not. That said, I don't think Switch is that much more capable than last-gen consoles either so it is quite comparable.

Which is how I view it. I haven't played it myself but, from what I've seen, I think it's impressive that it exists at all on Switch. I'd feel the same about it as I would if it were released on Xbox 360 or PS3.

Basically, I don't think it could get much better than that. Now that we can access settings and the like, we can play around with the performance and see that, at stock clocks, there isn't much headroom there. It may be a $300 system, but there isn't much more that the developers can do. So, in that sense, they did an impressive job with the conversion even if the end results aren't great looking.

It's like a port of Chase HQ to the ZX Spectrum - it looks HORRIBLE next to the arcade, absolutely terrible - but then you consider the platform it's on and, suddenly, it's impressive.

So the verdict will be completely different depending on the approach you take in judging it. You know what I mean? It could either be a horrible, ugly port or an impressive one and both would be true. That's kinda how I feel about The Witcher 3 on Switch - it's impressive and ugly at the same time, if you will.

You CAN still judge the quality of a port, however - like I said, Ark: Survival Evolved is an example of a very poorly made conversion - as in, you could theoretically expect better from the game given the hardware. It's just a poorly optimized game across the board. I don't think Witcher 3 could look that much better on Switch no matter how much time and money you pour into it so I don't feel the development team failed to deliver their best.

Sadly, there is still a lot of info to be discovered so I can't say for sure. I just have a feeling that the tflops number will be around or less than 10 - but that's just a guess so don't take it to mean anything. It's the systems being efficient in every other area that will help. Tflops were used by marketing this time as the GPUs were rather capable so it made sense to focus on that even though they are weak in many other areas. Again, though, all just educated guessing. Hopefully we can discover more soon enough but it's tough to get that info right now, as you might expect (and it may also be in flux).
 

Hobbygaming

has been asked to post in 'Grounded' mode.
You're right about cross-gen games, but what about next-gen ones?

If MS has cross-gen 1st party games at 60 fps and Sony has next-gen 1st party games at 30 fps, the gulf in terms of graphics/AI will be sizeable and that is only going to fuel their PR/marketing department.

Of course there's always a chance game development has become so expensive and lengthy, that "30 fps max graphics" may not be viable anymore. We'll see. If we get 60 fps SP games, it will most likely be due to cost/time concerns, not because platform holders will make it mandatory (MS had tried to make 720p & AA mandatory during the 360 era, but it didn't last).
I'm curious if Microsoft will stick with forward compatibility because the sequels to Spider-man, GOW and HZD will most likely be exclusive to the PS5 and won't be held back by last gen
 
Will Microsoft make any advancements to DirectX 12??? It was released back in July of 2015. It's 4+ years old. Will there be a DirectX 12.1, 12.2, etc.?
 

SlimySnake

Flashless at the Golden Globes
If the bold is true, then why do they need this weird V design? A more conventional & sleek design (a la XB1X) would suffice.

I'm starting to think perhaps SonGoku was right about them making a 250-300W console... it would be a first if they pull it off.
The price argument can be addressed.
After all, the GPU is the most important part of a console, and unlike the GPU, the price of the "rest" doesn't change much; just the chip does.

The reason I don't believe the "much faster than 5700" theories is power consumption. The non-XT hovers at 180W (already problematic), and the XT slaps another 30-50W on top. I've heard the "but EUV" argument, but in my book EUV would simply bring power consumption down to reasonable console levels.

I really don't think we should base anything on the RX 5700 cards. The 7.9 tflops 5700 is already at 180W. That's basically the TBP of the 580, which went into the X1X's 170W console. Are we seriously expecting a $499 7.9 tflops console? No one here would even entertain that thought. Even the most pessimistic bunch here are looking at 9.2 tflops, which should be around 220W given how the 9.6 tflops 5700 XT is 235W.

I think these cards are way too power-hungry, and there is no way in hell they are going into the next-gen consoles. AMD has specifically stated the RDNA 2.0 cards will have RT and will be on 7nm+. I don't see why consoles would go with year-old tech and the x700 cards after going with the 7870, 480 and 580 variants last gen. They typically go with more CUs and simply downclock. So why change it now?

The simulating-Gonzalo thread shows how watts and performance don't scale linearly as you increase clocks. Going from 1.6GHz to 2.0GHz, we see a performance increase of roughly 20% and a power increase of over 60%. So why would the console manufacturers go with higher clocks and fewer CUs when they didn't do it before? The cost of silicon has increased, but they now have an extra $100 to play with. Surely some of that will go into the SSD and fancier cooling solutions, but the APU only cost $100 on the PS4 despite including a GPU with the same number of CUs as the $350 7870, albeit at an 800MHz clock speed compared to the 1000MHz of the desktop part. Shouldn't they have gone with a smaller GPU back then, when they were on a tighter budget?

[chart: GPU performance and power draw vs. core clock]
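The nonlinearity comes from voltage having to rise along with frequency: dynamic power scales roughly with frequency × voltage². A minimal sketch of that rule of thumb, with illustrative (not measured) voltage points:

```python
def dynamic_power_ratio(f1_ghz: float, v1: float, f2_ghz: float, v2: float) -> float:
    """Classic CMOS rule of thumb: dynamic power scales with f x V^2."""
    return (f2_ghz / f1_ghz) * (v2 / v1) ** 2

# Illustrative DVFS points: reaching 2.0GHz needs a sizable voltage bump over 1.6GHz.
ratio = dynamic_power_ratio(1.6, 1.00, 2.0, 1.15)
print(f"clock: +{(2.0 / 1.6 - 1) * 100:.0f}%, power: +{(ratio - 1) * 100:.0f}%")
# clock: +25%, power: +65% -- in line with the 60%+ increase cited above
```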
 

Ovech-King

Gold Member
Why would you assume console titles will be 60fps at 4K by default? It hasn't happened... ever, actually. Even when consoles were money pits with better specs than the PCs of the time, you had a ton of sub-HD, sub-30fps games.

On the other hand, 9TF Navi is currently behind only the Radeon VII as the fastest AMD chip, and without RT it uses anywhere from 150W to 230W at full power (depending on the card, obviously). It costs $400 retail as well.

To compare it to this gen: when top-of-the-line PC GPUs had 3.8TF, consoles were capped at 1.3 and 1.8TF.

So I don't see a precedent to assume we HAVE to get 11-12TF because the 4K/60fps magic won't work without it. Console designers start from a clean sheet of paper with thermals, manufacturing and price points in mind, not some mythical always-4K/60fps target. Delivering that is the job of marketing and developers.

Simple facts for me: TVs are 4K, not 1440p, so new gen = native 4K, not a half step up like the Pro. And now that the CPU will be 4 times more powerful (or finally above 3GHz, if you wish), there is nothing in the way of 60 fps anymore
 
Last edited:

diffusionx

Gold Member
Basically he has no idea what he is talking about.
Framerate compromises will always exist no matter where you are... even on top high-end PCs.

In any game, developers need to balance graphics quality vs. performance.
There is no exception.

Jeez, there's already enough of this nonsense out there without company execs spouting it.

I have a Ryzen 7 3700X in my PC. Based on what Greenberg is saying here I should have some sort of magic CPU with no limitations, but when I turn up settings my framerate goes down. Maybe I just don't have the secret sauce?
 

R600

Banned
Simple facts for me: TVs are 4K, not 1440p, so new gen = native 4K, not a half step up like the Pro. And now that the CPU will be 4 times more powerful (or finally above 3GHz, if you wish), there is nothing in the way of 60 fps anymore
I mean, in 2006 TVs were Full HD, yet we got more sub-HD games than Full HD ones, so I am not sure they will deliver on your wishes.
 