
PS5, Project Scarlett to hit over 10TFLOPs of power, sources say.

demigod

Member
My source has always told me between 1080 and 1080Ti. In real-world terms, that's not very powerful to me. There are several games that still struggle to hit 60FPS on a 2080Ti. Trying to run a game like Control on a 1080 with some RT cores isn't going to do much for the look; you'd have to drop so many samples to get it to run at 1080p/30FPS. I'm convinced the consoles' weakest part is the GPU. It's just not powerful enough for another 6 years.

You're full of shit. You never mentioned the 1080Ti, or anything in between, before.

 

48086

Member
At this point, I'm kind of over "impressive" visuals. I want 60fps, no stuttering, screen tearing, or large fps drops, etc., and good gameplay. Art direction and gameplay are far more important to me than the number of pixels and muh inclusive story.
 
Last edited:

JordanN

Banned
Console optimizations are a thing. Forza 7 is 4K, fully stable 60fps with HDR, and it runs on a modified 2013 mobile CPU.
What exactly is this?

I remember when there was huge controversy over the Wii U's power. In real-world performance, it had more power than the PS3/360, but it ALWAYS required sacrificing something.
Need for Speed was an example of this. They made it look visually better, but there were still complaints about the CPU holding it back in ways no developer could get around.
Digital Foundry said:
"GPU bottlenecks on PS3/360 aren't as much of an issue on Wii U, though there are occasions when the older consoles manage to pull ahead."

You can only do so much with the actual hardware you're given. I don't believe there's some magic "code to the metal" that's native to consoles. Look at multiplatform games as direct evidence of this.
 
Last edited:

thelastword

Banned
MS has never had a more powerful box than Sony when launch-aligned. Launching after PlayStation is the only way MS ends up more powerful: the OG Xbox came 1 year and 8 months after the PS2, and the Xbox One X came 1 year after the Pro. If the One X had launched at the same time as the Pro, it would have been weaker..... Even then, a year after the Pro, the Pro has more tech features, Vega features, a better design and more ROPs, just not utilized.....

Launch-aligned, Sony the hardware company is aces over MS.... In 2013, Sony was ahead of MS by 40%, and they didn't need an extra year to accomplish that..... So 2020 is looking the same: if MS wants a power advantage later next gen, they will have to launch another upgraded console after Sony again....
 

Polygonal_Sprite

Gold Member
I have a Ryzen 2700X @ 4.3GHz with 32GB RAM and a 2080Ti, and people on GAF honestly expect the next-gen consoles to even come close to it...

I say every single time.

Keep... Your expectations... In... Check.

There will be next gen exclusive launch games that wipe the floor with anything your PC currently plays. We've done this song and dance with the "lol Jaguar / laptop GPU lol" which then produced the likes of The Order, Uncharted 4, Horizon, Spider-Man and God of War...

Keep.... Your limitations... In... Check.

PC gamers get so defensive at this time of the gen haha!
 

thelastword

Banned
There will be next gen exclusive launch games that wipe the floor with anything your PC currently plays. We've done this song and dance with the "lol Jaguar / laptop GPU lol" which then produced the likes of The Order, Uncharted 4, Horizon, Spider-Man and God of War...

Keep.... Your limitations... In... Check.

PC gamers get so defensive at this time of the gen haha!
Money.....
 
So basically 1080Ti performance, which'll be memory-limited in a way an actual 1080Ti is not.

OC'd, a 1080Ti is more like 13.9 TFLOPs.

The PS5 may end up having "1080Ti-like performance," but just based on TFLOPs, there is a pretty big lead in favor of the 1080Ti.

I'm definitely interested in seeing how my 1080Ti handles next-gen games compared to the next-gen consoles before I upgrade.
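For reference, the theoretical FP32 figures being thrown around all come from one simple formula: 2 FLOPs (one fused multiply-add) per shader core per clock. A quick sketch using the GTX 1080 Ti's public specs (3584 CUDA cores, ~1.58 GHz stock boost); the ~1.94 GHz overclock is an assumption chosen to reproduce the 13.9 figure quoted above:

```python
def fp32_tflops(cores: int, clock_ghz: float) -> float:
    """Theoretical FP32 throughput: 2 FLOPs (one FMA) per core per clock."""
    return 2 * cores * clock_ghz / 1000.0

# GTX 1080 Ti: 3584 CUDA cores
stock = fp32_tflops(3584, 1.582)  # ~11.3 TFLOPs at stock boost clock
oced = fp32_tflops(3584, 1.94)    # ~13.9 TFLOPs at an assumed ~1.94 GHz overclock
print(round(stock, 1), round(oced, 1))  # 11.3 13.9
```

Note these are theoretical peaks; real-game throughput depends on memory bandwidth, occupancy, and the rest of the pipeline.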
 
Last edited:

VFXVeteran

Banned
Console optimizations are a thing. Forza 7 is 4K, fully stable 60fps with HDR, and it runs on a modified 2013 mobile CPU.

There is no proof of special console optimizations, especially from 3rd-party devs. Game after game still has shortcomings in the graphics department (see all the Digital Foundry comparisons of various AAA games). There is only so much you can optimize with limited hardware. Dabbling in ray tracing is a whole other level of getting good graphics.
 

JLB

Banned
There is no proof of special console optimizations, especially from 3rd-party devs. Game after game still has shortcomings in the graphics department (see all the Digital Foundry comparisons of various AAA games). There is only so much you can optimize with limited hardware. Dabbling in ray tracing is a whole other level of getting good graphics.

Tell me of another piece of hardware that runs 4K/60fps with HDR using a 2013 mobile CPU.
 

JLB

Banned
What exactly is this?

I remember when there was huge controversy over the Wii U's power. In real-world performance, it had more power than the PS3/360, but it ALWAYS required sacrificing something.
Need for Speed was an example of this. They made it look visually better, but there were still complaints about the CPU holding it back in ways no developer could get around.


You can only do so much with the actual hardware you're given. I don't believe there's some magic "code to the metal" that's native to consoles. Look at multiplatform games as direct evidence of this.

It means that since the hardware specs are fixed, devs are able to optimize games for that exact configuration.
 
Also, again: RDNA (AMD's newest architecture) has a much better performance-to-TFLOP ratio than GCN (the AMD architecture used in the current consoles).

So even a 6 TFLOP RDNA GPU would massively outperform the GPU in the Xbox One X.

If the new consoles have a 10 or even 11 TFLOP RDNA GPU, the performance increase will be higher than what the raw numbers say.

That's not even talking about memory.

I did forget about the architecture differences... weren't there some people saying that 10 TF on Navi is comparable to 13-14 or so TF on Vega/Polaris? There was definitely some talk like that in one of the next-gen threads, IIRC.
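The "Navi TFLOPs aren't GCN TFLOPs" point is just a conversion factor. A minimal sketch; the ~1.3× performance-per-FLOP ratio is an assumption taken from the rough 13-14 TF figures quoted in this thread, not an official AMD number, and real gains vary per game:

```python
def gcn_equivalent_tflops(rdna_tflops: float, perf_per_flop_ratio: float = 1.3) -> float:
    """Convert an RDNA TFLOP figure into a rough GCN-equivalent number.

    perf_per_flop_ratio is an ASSUMED ~1.3x real-world performance per
    theoretical FLOP for RDNA over GCN, based on figures quoted in the thread.
    """
    return rdna_tflops * perf_per_flop_ratio

# A 10 TFLOP RDNA GPU would land around "13 GCN TFLOPs" of real-world output,
# and even a 6 TFLOP RDNA part would clear the Xbox One X's 6 TFLOP GCN GPU.
print(round(gcn_equivalent_tflops(10), 1), round(gcn_equivalent_tflops(6), 1))
```

Under that assumption, comparing raw TFLOP counts across the two architectures understates the next-gen consoles.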
 

VFXVeteran

Banned
A 1080Ti is about 2080-level performance, though. But that's not counting games that will be optimized with VRR and hardware RT. That'll be plenty for 3 more years (until the next mid-gen console).

BTW, did you just change your tune? Are you slowly accepting that the PS5 could have more than 10tf? I think it's the first time you've talked about the PS5 having 1080Ti-level performance. :) You always talked only about the 1080 before, AFAIK.

I don't accept anything. I only trust my sources. This person didn't say 1080Ti. They said between 1080 and 1080Ti. That tells me a little more powerful than a 1080 but less powerful than a 1080Ti. I've also mentioned this several times already, so nothing has changed on my end. And my concerns are still warranted. Even a 1080Ti is subpar for the end-of-cycle games that are out today. You simply cannot run these games at 4K @ ultra settings on PC; they will tank the framerate. Trying to implement anything beyond simple reflections using ray tracing is out of the question.
 

VFXVeteran

Banned
When people say 'consoles are holding back PCs', they are talking solely about the majority of PC games being console ports, ports based on tech numerous levels below a lot of PCs. And that is fact.
It works both ways, of course: the majority of big PC games are console ports, but because of that we can play those games at higher settings, resolutions, and framerates. So there is always a silver lining, and that's what's so good about PC gaming: the freedom and choice.

I talked to my sources about this too and all I'm told is that it depends. 1st party games will design for specific console architecture but develop on the PC. 3rd party games are developed on the PC from the word "go" and are ported down to the consoles.
 

VFXVeteran

Banned
Tell me of another piece of hardware that runs 4K/60fps with HDR using a 2013 mobile CPU.

I have no idea what they had to scale back on the Switch (if that's what you are talking about). However, can you produce a PS4/X1 3rd party game that's running amazing visuals when it's not supposed to?
 

JordanN

Banned
It means that since the hardware specs are fixed, devs are able to optimize games to that configuration.
In the grand context of everything, why would PC games be exempt from "optimization"?

Especially when you consider this is the same platform where PC gamers have had frame rates as high as 300fps since 2001. I really doubt it's in any developer's best interest to just dump a game and pray it works on PC.

Or, as another example, consider that game engines are updated all the time. Yet I have never heard Epic say that their version of UE4 runs better on console than on PC. 99% of the time, it's the complete opposite: every new graphical feature is showcased on PC first, and then consoles get the same or a stripped-down version of it later.

Case in point: look back at the Elemental demo. The PS4 version still had to make sacrifices that a more powerful PC didn't.



Or another infamous example: the original Crysis on PC vs. console. Even though the console version ran on an improved version of the engine with newer features, the memory limits still meant less geometry and lower texture resolutions compared to the original.

 
Last edited:

VFXVeteran

Banned
In the grand context of everything, why would PC games be exempt from "optimization"?

Especially when you consider this is the same platform where PC gamers have had frame rates as high as 300fps since 2001. I really doubt it's in any developer's best interest to just dump a game and pray it works on PC.

Or, as another example, consider that game engines are updated all the time. Yet I have never heard Epic say that their version of UE4 runs better on console than on PC. 99% of the time, it's the complete opposite: every new graphical feature is showcased on PC first, and then consoles get the same or a stripped-down version of it later.

Case in point: look back at the Elemental demo. The PS4 version still had to make sacrifices that a more powerful PC didn't.



Or another infamous example: the original Crysis on PC vs. console. Even though the console version ran on an improved version of the engine with newer features, the memory limits still meant less geometry and lower texture resolutions compared to the original.



This is fact. There is no way around it. Game devs are no longer interested in dumping piles of money on the table for a new iteration of their graphics engine that supports hardware X for gen X. It's a lot of wasted money and R&D time they could use to make a better game with better assets. Even 1st-party devs are embracing this paradigm. You basically have PC hardware in the consoles already (i.e. x86 with AMD or Nvidia chipsets). They all use DX, Vulkan, or OpenGL as the low-level API between the graphics engine and the hardware, and the hardware vendors are responsible for putting out good drivers to support those APIs. That's just how things are done now and going forward.
 
Next-gen consoles are literally just PCs with mid-range specs from a few years ago. There may be a few newer features. Still waiting to see what kind of SSD they will have, but my money is on an NVMe drive on PCIe 4.0. And even that will be 18+ months old by the time they launch.

The real advantage of next-gen consoles, IMO, is a new, higher bar for game development. There's no denying that games get better-looking when new consoles release... but this benefits PC games just as much, if not more.
 

Agent_4Seven

Tears of Nintendo
OC'd, a 1080Ti is more like 13.9 TFLOPs.
The base value of a reference card is 11.3 TFLOPs, so yeah, ~14 when OC'd.

The PS5 may end up having "1080Ti-like performance," but just based on TFLOPs, there is a pretty big lead in favor of the 1080Ti.
As I've said, the PS5 / Scarlett GPUs will be memory-limited at some point, cuz there's absolutely no way they'll have access to 11GB of VRAM. And the more VRAM you've got, the better-looking game you can make and the more you can do with it visually, not to mention support for native resolutions without reliance on temporal resolution scaling.

I'm definitely interested in seeing how my 1080Ti handles next-gen games compared to the next-gen consoles before I upgrade.
Look no further than Quantum Break / Control. The 1080Ti is the one and only card capable of handling these games at native 1080p/60 on maxed settings (RTX cards don't count, cuz you don't need them and they've got less VRAM). So at some point it's gonna be a 1080p/60 GPU for modern games (depending on the game), 2-3 years from now.... I mean, it kinda is now, but only cuz of lazy and incompetent devs who can't optimize their games properly; I'm not talking about Quantum Break or Control here.
 
Last edited:

bitbydeath

Member
Still waiting to see what kind of SSD they will have, but my money is on an NVMe drive on PCIe 4.0. And even that will be 18+ months old by the time they launch.

Probably should have made a thread on it but PS5 looks to be using this.


DSC05371.jpg


Note the PlayStation controller in the image.
 
Last edited:

Justin9mm

Member
I know this sounds cliche, but why do people look to consoles for 60fps and not PC?

Unless your last game console was literally a SEGA Genesis from the 90s, it's been almost 3 decades now in which 60fps as a standard does not exist.

It's honestly up there with people asking "Why isn't Grand Theft Auto on Nintendo?" Just make peace with the idea that if it didn't happen in the last 3 console cycles, it's not going to happen in the next one.

And no, you don't need a $5000 PC to play games at 60fps. I just bought a $500 PC, and even though I don't game much, I can still get 1080p/60fps in any title I want. If you want more, then you obviously have to pay more.
Except for the fact that the current Pro and X are already outputting dynamic 4K/60fps in some games. What are you smoking, and can I have some?
 

Justin9mm

Member
Console optimizations are a thing. Forza 7 is 4K, fully stable 60fps with HDR, and it runs on a modified 2013 mobile CPU.
Don't bother trying to argue with people like this. It's like someone slating a game or movie they've never seen or played.
 

JordanN

Banned
Except for the fact that the current Pro and X are already outputting dynamic 4K/60fps in some games. What are you smoking, and can I have some?
Yeah, "some" games.

On PC, it's never a compromise. Ever since 2001, you've been able to get frame rates as high as 300fps. I don't remember the PS2, Xbox, or GameCube ever doing that, and the generation that came after was even worse.

If you want consistently high frame rates, consoles have never had that advantage since the moment they shifted to 3D graphics.
 
Last edited:

Justin9mm

Member
When people say 'consoles are holding back PCs', they are talking solely about the majority of PC games being console ports, ports based on tech numerous levels below a lot of PCs. And that is fact.
It works both ways, of course: the majority of big PC games are console ports, but because of that we can play those games at higher settings, resolutions, and framerates. So there is always a silver lining, and that's what's so good about PC gaming: the freedom and choice.
It actually means you get a variety of AAA games, period, because the PC community alone cannot support the development costs of these titles. I'm by no means saying console is better, but as far as I see it, PC needs consoles for games; consoles do not need PC!
 
Last edited:

Justin9mm

Member
Yeah, "some" games.

On PC, it's never a compromise. Ever since 2001, you've been able to get frame rates as high as 300fps. I don't remember the PS2, Xbox, or GameCube ever doing that, and the generation that came after was even worse.

If you want consistently high frame rates, consoles have never had that advantage since the moment they shifted to 3D graphics.
2001 is completely different to now as far as tech hardware goes. Consoles are closing the performance gap. Please show me a $500 PC that will run 'some' current games at dynamic 4K HDR 60fps.

Edit: I've been gaming on my 4K TV for nearly 2 years; fuck 1080p. I'd rather have 30fps at 4K than 60+fps at 1080p. I have a mid-tier gaming PC with a 1440p monitor that can run 1440p/60fps, and I hardly use it, because personally the experience on my 75-inch 4K HDR TV feels superior.
 
Last edited:

Tqaulity

Member
OK... before I go on a serious rant, can I just ask why everyone thinks that every game has to run at "max settings" to be viable? Do people realize how inefficient and unoptimized "max settings" are in most games? Have people actually compared the difference between "high" and "very high" or "ultra" in most games? The difference is negligible in most cases and sometimes barely noticeable even in a side-by-side comparison. Yet the performance impact can be huge! You can literally double your performance or more by simply tuning the visual settings to deliver the best bang for the buck, as is the case with console versions. That's why we have 4K/60fps games like Forza 7, Gears 5, and Sea of Thieves, and huge 4K/30fps games like Red Dead Redemption and Far Cry 5, running on relatively lowly hardware like an Xbox One X.

What really works my nerves is when people make statements like "GPU X is not powerful enough to run Game X at some resolution and frame rate (i.e. 1080p/60 or 4K/60)". That statement is fundamentally meaningless, since its validity depends entirely on the settings used. Yet it is almost always made with "ultra", the game's highest settings, implied. That's a silly claim, because the same GPU could run perfectly fine at 1080p/60 or 4K/60 by just dialing down some of the settings. So, for example, saying "a GTX 1080Ti can't run modern games at 4K/60fps" is not true. It may be true at max settings, but it can probably run the vast majority of today's games perfectly fine at 4K/60fps with some settings turned down. Of course, that does not appear to be an option for most people, for some reason.

Bottom line: ultra settings are largely wasteful in today's games and usually grossly unoptimized, meaning small visual differences for large performance deltas. A next-gen console with a 1080Ti-level GPU will absolutely be able to run most games at 4K/60, given the fixed nature of consoles and the ability to dial in an optimal graphics setting.

There are a ton of articles and videos on this very issue, so check those out for a refresher and... STOP COMPARING HARDWARE BASED ON ULTRA SETTINGS!
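The "dial settings down instead of demanding ultra" argument is really about frame-time budgets: a frame either fits within 1000/fps milliseconds or it doesn't. A minimal sketch; the per-preset frame times below are made-up illustrative numbers, not measurements of any real game:

```python
def fits_budget(frame_ms: float, target_fps: int) -> bool:
    """A frame renders on time if it fits within 1000/target_fps milliseconds."""
    return frame_ms <= 1000.0 / target_fps

# Hypothetical GPU frame times for the same scene at two presets
# (illustrative only: 'ultra' costing ~60% more for a marginal visual gain):
ultra_ms, high_ms = 22.0, 14.0

print(fits_budget(ultra_ms, 60))  # False: 22 ms misses the ~16.7 ms budget for 60fps
print(fits_budget(high_ms, 60))   # True: dropping to 'high' gets the same GPU to 60fps
print(fits_budget(ultra_ms, 30))  # True: 'ultra' still fits the ~33.3 ms budget for 30fps
```

This is exactly the knob console versions turn: pick the preset that fits the budget, rather than declaring the GPU "not powerful enough" at max settings.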


 

JordanN

Banned
2001 is completely different to now as far as tech hardware goes. Consoles are closing the performance gap. Please show me a $500 PC that will run 'some' current games at dynamic 4K HDR 60fps.

Edit: I've been gaming on my 4K TV for nearly 2 years; fuck 1080p. I'd rather have 30fps at 4K than 60+fps at 1080p. I have a mid-tier gaming PC with a 1440p monitor that can run 1440p/60fps, and I hardly use it, because personally the experience on my 75-inch 4K HDR TV feels superior.
This video is from 2 years ago, but he built a PC that's actually cheaper than the Xbox One X, and he mentioned that if he lowered the settings more, he could get a full 4K 60fps.

 
Last edited:

UltimaKilo

Gold Member
For everyone bitching about 60FPS: you'll likely have to wait until 2023 for a PS5 Pro (and hopefully a PSVR2 with 144Hz, a 150-degree FOV, and 4K per eye).
 

Agent_4Seven

Tears of Nintendo
What exactly don't you understand from what I've said?

And again:

Heh, yeah, if TESVI looks exactly like that (but it won't) on PS5 / Project Scarlett in 4K/60 (no way in hell), I'll eat my goddamn shorts.

Bethesda is not even close to making something as cool and alive as this when it comes to art and level design, not to mention attention to detail, really hi-res textures and models across the board, and an interactive environment that reacts to wind and other weather effects.

Their worlds have looked 95% static, dead, and muddy, like junkyards (since Fallout 3 and to this day). Even Fallout 4, with its world and terrain complexity, looks like junk; only somewhat good art and good lighting save it from being complete trash.
 
Last edited:
Personally, I can see more games offering Performance and Presentation modes, which would be 1080/60 or 4K/30. That choice would be a no-brainer for me.
 

jonnyp

Member
If you're being serious, there are two choices:
1. Wait for Sony/MS to remaster them on new consoles
2. Wait for PC emulation to run them at higher FPS

At least with the second option, it actually benefits everyone, because emulation helps preserve gaming history while making them run better. Whereas Sony/MS could release a game one day and say "lol suck it" and then take it down like they've done with Gravity Rush or Driveclub.

And that's just 1st-party exclusives. Almost every 3rd-party game nowadays is multiplat, in which case the choice is clear: the PC version of the same game will always have 60fps available.

I buy consoles for the exclusives when they are released. I don't sit around for 5 years and hope they will be remastered for a new console or emulated on PC just because of 60fps. That is crazy.
 
Unless your last game console was literally a SEGA Genesis from the 90s, it's been almost 3 decades now in which 60fps as a standard does not exist.
These are standardized platforms, and it's no more to ask for 60fps on a PS5 than it was on the MD or SNES.

On those old consoles, you also had to make compromises to get 60FPS. I code on these old platforms as a hobby; you can believe me, you don't get 60fps automatically there.

It's just that they choose to sacrifice FPS for image quality and other graphical bells and whistles.
 

xPikYx

Member
There is no point in targeting 60fps on consoles (limited machines) unless it matters for a particular kind of gameplay. Best is to push image quality as far as possible at 30fps, which is the best balance between image quality and gameplay. If you'd rather have 60fps in all conditions, buy a PC, which is what I personally do because I don't like playing at 30fps; obviously this costs a lot more, but it's an acceptable compromise. If you prefer cheap hardware (consoles), you'll get cheap results (checkerboard rendering, medium image quality, 30fps). If you prefer the best image quality and ultra details at 60fps at high resolutions, you'll spend big money for it. It's a choice.
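On the checkerboarding point: the reason it counts as a "cheap hardware" compromise is raw pixel counts. A quick sketch of the arithmetic, assuming only the standard resolutions and the usual rule that checkerboard rendering shades roughly half the pixels per frame and reconstructs the rest:

```python
def pixels(width: int, height: int) -> int:
    """Pixels shaded per frame at a given render resolution."""
    return width * height

native_4k = pixels(3840, 2160)      # 8,294,400 pixels per frame
checkerboard_4k = native_4k // 2    # ~half shaded, the rest reconstructed
native_1080p = pixels(1920, 1080)   # 2,073,600 pixels per frame

print(native_4k // native_1080p)        # 4: native 4K is 4x the shading work of 1080p
print(checkerboard_4k // native_1080p)  # 2: checkerboard 4K is only ~2x
```

So checkerboarding buys back roughly half the 4K shading cost, which is exactly why it shows up on fixed-budget hardware and not on PCs that can brute-force the full pixel count.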
 

Agent_4Seven

Tears of Nintendo
They will have access to at least 16GB of VRAM. All the RAM in the new consoles is VRAM.
We don't know the official specs yet. At least 4GB (if not more) of RAM will be used by the OS, not to mention that all of the memory on consoles is shared. It's basically a PC without separate RAM and VRAM pools for different workloads. On PC, both VRAM and RAM are in use when you're playing games (including by the OS). For example, Resident Evil 2 eats up to 9GB of VRAM at native 4K (or close to it) and 6GB of RAM, but if you're playing the game in 4K on a GPU that is memory-limited, RAM usage can spike up to 10GB at 4K and up to 9GB at 1440p. That's almost 19-20GB (RAM + VRAM) in total for 1440p / 4K (provided you even have that much; otherwise game assets will load from the SSD / HDD instead).

You can clearly see that some games on the base consoles dropped screen resolution to 720p in some cases. That's because they are memory-limited, and memory bandwidth, along with overall performance (floating-point operations per second), is just not good enough anymore for modern games to run at higher resolutions at 30 FPS; even the PS4 Pro struggles to achieve an optimal, locked 30 FPS in some modern games. There's nothing you can do about that on consoles, but on PC you can just upgrade your GPU without even touching the RAM or CPU (if it's a high-end one).
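The shared-pool point can be made concrete with a little arithmetic. The Resident Evil 2 figures are the ones quoted above; the 16GB total pool and the 4GB OS reserve are assumptions (the post itself notes official specs are unknown):

```python
def game_budget_gb(total_gb: float, os_reserve_gb: float) -> float:
    """On a shared-memory console, CPU and GPU data come from one pool,
    minus whatever the OS reserves for itself."""
    return total_gb - os_reserve_gb

# Assumed next-gen pool and OS reserve (NOT confirmed specs):
budget = game_budget_gb(16.0, 4.0)  # 12 GB left for the game, CPU + GPU combined

# RE2-style 4K footprint quoted above: ~9 GB "VRAM" + ~6 GB "RAM" on PC,
# where those are two separate pools rather than one shared budget.
pc_style_footprint = 9.0 + 6.0      # 15 GB total across both PC pools

print(budget, pc_style_footprint)   # 12.0 15.0
```

Under those assumptions, a PC-style 15GB footprint simply doesn't fit a 12GB shared budget, which is why console ports lean on lower-resolution assets and temporal/dynamic resolution tricks.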
 

JordanN

Banned
It's just that they choose to sacrifice FPS for image quality and other graphical bells and whistles.
What "image quality"?
Specifically, when it came to SNES/Genesis comparisons, it was the Genesis that actually output at a higher resolution while still running at a higher frame rate.
The SNES had a lower resolution but output more colors; overall, I would blame the worse CPU in it (hence why the system accepted cartridges with special chips, though the Genesis could do the same thing at little or no extra cost).

But other than that, no, I don't think comparisons with the 8-bit/16-bit systems hold up. Those systems relied on sprites, and there was a hard limit to how many you could have on screen.

With the start of the PS1/N64 generation, the switch to polygons meant developers had infinitely more ways to be creative, but at the complete expense of frame rate (think about lighting and shading: more real-time lights and shadows were costlier, where 8-bit/16-bit games usually had these effects baked in).

I buy consoles for the exclusives when they are released. I don't sit around for 5 years and hope they will be remastered for a new console or emulated on PC just because of 60fps. That is crazy.
You're missing the point. You just asked me "where can I play these exclusive games at 60fps?" Did I not just give you an answer as to how?
 
Last edited: