
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

DeepEnigma

Gold Member
I'm not saying EVERY game will look like PS4 games at 4K, but there will be plenty. If the consoles have dedicated separate hardware for ray tracing, that will help make some games look beyond PS4 at 4K, but if it's software ray tracing? Well, those teraflops will be gobbled up like nobody's business.
Look at the X: 6TF, yet it can't always run every XB1 game at 4K; it does most, but not all. You're looking at 8TF to run PS4 games at 4K (as Cerny said himself), which will leave what, about 4TF to play with (assuming the consoles are around 12TF). Software ray tracing will eat those remaining 4TF without batting an eyelid. I mean, look at the difference the RTX cards have with ray tracing compared to no ray tracing, and they have dedicated RT hardware!

Things do not work linearly like this. If you were developing at this second, then maybe, but there are constant improvements to middleware, rendering pipelines, etc., just as all of the improvements we got this gen over the previous were due to said advancements. The new RDNA architecture, etc.

I think we will be wowed even more, since the baseline will be much, much higher, to go along with the advancements that are constantly made.
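For what it's worth, the "~8 TF to run PS4 games at 4K" figure in the quote roughly tracks naive pixel math. A back-of-the-envelope sketch only, assuming GPU cost scales linearly with pixel count (it doesn't exactly, in practice):

```python
# Naive check of the "~8 TF for PS4 games at 4K" claim, assuming GPU
# cost scales linearly with the number of pixels rendered.
ps4_tflops = 1.84                 # base PS4 GPU, FP32
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160

scale = pixels_4k / pixels_1080p  # 4.0x the pixels
tflops_for_4k = ps4_tflops * scale
print(f"{scale:.0f}x pixels -> ~{tflops_for_4k:.2f} TF")  # 4x pixels -> ~7.36 TF
```

So the ~7.4 TF result lands close to the ~8 TF figure attributed to Cerny, before any ray tracing budget is considered.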
 
[God of War PS4 comparison screenshot: alleged graphics downgrade]



This looks the same, except he's closer to the cave so there's less fog.
 
I'm not saying EVERY game will look like PS4 games at 4K, but there will be plenty. If the consoles have dedicated separate hardware for ray tracing, that will help make some games look beyond PS4 at 4K, but if it's software ray tracing? Well, those teraflops will be gobbled up like nobody's business.
Look at the X: 6TF, yet it can't always run every XB1 game at 4K; it does most, but not all. You're looking at 8TF to run PS4 games at 4K (as Cerny said himself), which will leave what, about 4TF to play with (assuming the consoles are around 12TF). Software ray tracing will eat those remaining 4TF without batting an eyelid. I mean, look at the difference the RTX cards have with ray tracing compared to no ray tracing, and they have dedicated RT hardware!
Xbox One X is 6TF and runs Xbox, Xbox 360, and Xbox One games at 4K. Adding an additional 6TF would make some pretty remarkable games. Especially if 8-9TF is the sweet spot for 4K on consoles.
 
Last edited:

Gamernyc78

Banned
We already know, for the most part since the PS3 days, that Sony first parties basically pass the graphics benchmark on consoles back and forth among each other, and it will be no different in the PS5 days, with games looking insane. Remember when people couldn't believe the Uncharted 2 live demo? Or when Killzone 2/3 came out graphically slaughtering the other FPS games last gen? TLOU, God of War, etc. Even this gen, Infamous Second Son still looks better than a lot of games coming out. Idk about the other consoles, but I know first-party Sony games on PS5, with or without ray tracing, will look a generation ahead. History has shown no different.
 
Last edited:

TeamGhobad

Banned
TeamGhobad

Are you trying to make a drop-the-mic-moment post? Because you are failing miserably at that task with your posts.
That is why I asked: what are you trying to achieve by showing things that didn't support your point?

I'm just trolling lightly; I guess the point is that Sony has pulled off a bunch of bullshots. Damn, I'm missing the MGS egghead meme.
 
Once again, people compare flops without providing context. Apples to oranges, really.

RSX was 176 GFLOPS vs. the PS4's Radeon at 1840 GFLOPS.

A 10x difference, someone would say, BUT you also have to take into account architectural differences (i.e., RSX flops were strictly divided between pixel and vertex shader pipelines, not to mention the lack of compute features). So yeah, the effective GPU difference is more than 10x.

The point is, Navi will also have lots of improvements/baseline features, like neural network acceleration (INT8/FP16) for next-gen AI, variable rate shading, etc.
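As a rough sketch of where those headline numbers come from: paper peak is just ALUs × clock × 2 FLOPs (a fused multiply-add per cycle). The PS4 figures below are public specs; as noted above, RSX's split pixel/vertex pipes make its 176 GFLOPS not directly comparable through the same formula.

```python
def peak_gflops(alus: int, clock_mhz: float, flops_per_clock: int = 2) -> float:
    """Theoretical peak GFLOPS: ALUs x clock (MHz) x FLOPs/clock (2 for FMA)."""
    return alus * clock_mhz * flops_per_clock / 1000.0

# PS4 GPU: 18 CUs x 64 ALUs = 1152 unified shaders at 800 MHz
print(peak_gflops(1152, 800))  # 1843.2 -> the quoted ~1840 GFLOPS
# RSX's 176 GFLOPS can't be plugged into the same formula: its ALUs were
# split into fixed pixel/vertex pipelines with no unified scheduling,
# which is exactly why raw flops comparisons are apples to oranges.
```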
 

TeamGhobad

Banned
Where the fuck did I ever mention PC?

We get it, you need to flex in a console thread over a comment specifically about what may or may not happen graphically based on tech, rendering pipelines, etc.

No dude, I'm just messing with you. The point is PS4 games looked good, but they weren't amazing and definitely never rivaled PC. And a lot of times Sony had bullshots on stage.
 

DeepEnigma

Gold Member
No dude, I'm just messing with you. The point is PS4 games looked good, but they weren't amazing and definitely never rivaled PC. And a lot of times Sony had bullshots on stage.

Again, how does that correlate to my original comment?

Here, let me help you...

I am going to bookmark this post. It is possible, but on paper 1.84TF had no business making games look and run as good as GoW, HZD, etc. And at the time of the announcement in 2013, even when those games were shown off years later in initial trailers, people still called BS on what they were seeing or were told they would eventually see.

:pie_thinking:

Do you want to discuss the confines of the consoles and the console-specific comment my reply was to, or are we going to PC big-dick flex like some weird, insecure persecution complex?

PC had its fair share of bullshots from developers last gen as well, most notably Watch Dogs and TW3. But that has nothing to do with the context of my post.
 
Last edited:

TeamGhobad

Banned
Again, how does that correlate to my original comment?

Here, let me help you...



:pie_thinking:

I guess I'm trying to prove that the PS4 isn't all that, and Sony did pull off a bunch of bullshots with both GoW and Uncharted 4; they all do it. And then I started mocking Sony fanboys for being overly defensive about the PS4's abilities. But it's all jokes, bro. Relax. Lol.
 

Darius87

Member
No dude, I'm just messing with you. The point is PS4 games looked good, but they weren't amazing and definitely never rivaled PC. And a lot of times Sony had bullshots on stage.
I guess I'm trying to prove that the PS4 isn't all that, and Sony did pull off a bunch of bullshots with both GoW and Uncharted 4; they all do it. And then I started mocking Sony fanboys for being overly defensive about the PS4's abilities. But it's all jokes, bro. Relax. Lol.
 
I was being generous to the Radeon VII.

It is only 35% at 1080p... above that, it is 39%.



Yeah, you're right. I had a little oops based on the HUB day-one coverage of both cards (mixed up the base value).

[1440p benchmark charts: RTX 2080 Ti vs RTX 2080]


That would indeed give you 32%.
Newer results are even clearer...
 

llien

Member

Why would it be last minute? Nvidia was first out of the gate with it, but that doesn't mean it wasn't on anyone's radar until they had it; it would have been on API roadmaps and likely somewhere in AMD's. In the 4+ year dev cycle of the PS5, they probably saw an interesting feature near enough on the roadmap to pull it in.

But neither MS nor Sony are in the silicon business. How would they have developed sophisticated RT cores?

That's a tech demo running at 1080p 30fps. Real games like Minecraft need a 1070 just for 720p. Do you really think Sony would build the PS5 to run games at just 720p LOL😂😉👍? No, they will run games at 1440p and above, and they absolutely need HW RT for that.
Demos push boundaries. Tone down the number of reflections/shadows and there you go.
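To put rough numbers on the resolution side of that trade-off: a naive sketch of how the ray budget grows with pixel count (real renderers also vary rays per pixel, bounce depth, and lean on denoising, so treat this as illustration only):

```python
def rays_per_second(width: int, height: int, rays_per_pixel: int, fps: int) -> int:
    """Naive primary-ray budget: every pixel casts the same number of rays."""
    return width * height * rays_per_pixel * fps

base = rays_per_second(1280, 720, 1, 30)     # 720p30, e.g. the GTX 1070 case above
target = rays_per_second(2560, 1440, 1, 30)  # 1440p30
print(target / base)  # 4.0 -> 4x the ray budget for the same per-pixel quality
```

Which is why toning down rays per pixel (fewer reflection/shadow samples) is the usual lever when pushing resolution up.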
 
Last edited:

Fake

Member
720p to 1080p. PS4 to PS5 will be 1080p to 4K, plus the PS4 didn't have ray tracing to deal with over the PS3.
It's not just about resolution. The PS4 running games at 1080p was a step forward, while the PS3 barely ran games at 720p.
Besides, in the PS5/nextbox gen they could implement a new feature besides RT.

edit: OMG at this war. Guys, reserve your guns for 2020, please.
 
Last edited:

llien

Member
PS4 games looked good but they weren't amazing and definitely never rivaled PC.
Now, I haven't played that many of them on PS4, but these:

1) Uncharted 4 (yes, and by the way, it didn't look the way it does in your screenshot)
2) GoW 4 (just started, though)
3) Horizon (oh dayum, the best of the 3 imo)

Now, GoW/Horizon were multiplied by HDR greatness, but I've never seen something even remotely as beautiful on PC.
 

LordOfChaos

Member
But neither MS nor Sony are into silicon business. How come they would have developed sophisticated RT cores?

How did Tesla make the highest performance per watt driving AI chip, how did Apple go from not making any CPUs to designing the most ambitious ARM cores within two generations of starting, etc.


A: Hire a bunch of smart silicon engineers, hah.

Through the 90s and early aughts it sure seemed like silicon design was incredibly hard, with all but a few companies dying off. It sure seems like the barriers to entry have been lowered for whatever reason: supply of engineers maybe, more ready-made toolsets for helping with design, foundry transistor libraries, etc.

As far as Sony goes, GPU/silicon customizations aren't new to them either. And they were working with AMD, remember, not going it alone. Maybe AMD just decided it wasn't worth the silicon yet while they're trying to catch up to Nvidia, while Sony didn't want to skip RT for a whole 'nother generation. AMD can introduce new cards with it whenever, whereas a console generation lasts 5-8 years.
 

Munki

Member
How did Tesla make the highest performance per watt driving AI chip, how did Apple go from not making any CPUs to designing the most ambitious ARM cores within two generations of starting, etc.


A: Hire a bunch of smart silicon engineers, hah.

Through the 90s and early 'aughts it sure seemed like silicon design was incredibly hard, all but a few companies dying off. It sure seems like the barriers to entry have been lowered for whatever reason - supply of engineers maybe, more readymade toolsets for helping with design, foundry transistor libraries, etc.

As far as Sony, GPU/silicon customizations aren't new to them either. And they were working with AMD remember, not going it alone. Maybe AMD just decided it wasn't worth the silicon yet while they're trying to catch up to Nvidia, while Sony didn't want to skip RT for a whole 'nother generation. AMD can introduce new cards with it whenever, where a console generation lasts 5-8 years.

It looks like MS is already trying to hire:

Microsoft Silicon team
 
Last edited:

llien

Member
How did Tesla make the highest performance per watt driving AI chip, how did Apple go from not making any CPUs to designing the most ambitious ARM cores within two generations of starting, etc.
Ah, haha, I have an answer to that: the same way AMD went from Bulldozer to Zen:



That said, the Tesla bit is surprising. I suspect the AI hardware they made is quite straightforward.

As far as Sony, GPU/silicon customizations aren't new to them either. And they were working with AMD remember, not going it alone. Maybe AMD just decided it wasn't worth the silicon yet while they're trying to catch up to Nvidia, while Sony didn't want to skip RT for a whole 'nother generation. AMD can introduce new cards with it whenever, where a console generation lasts 5-8 years.
Well, could be, along the "let us have an edge over Microsoft" lines. Plus, perhaps, Sony studios were among the first to realize that realistic reflections/refractions using rasterisation have become prohibitively expensive to design.

One day we'll see. I'm betting on a non-HW-based solution though, as there is a lot of GPU-related know-how involved in what NVD did with RT cores. AMD could do that if focused; I doubt others could, though. We'll see.
 

ethomaz

Banned
How did Tesla make the highest performance per watt driving AI chip, how did Apple go from not making any CPUs to designing the most ambitious ARM cores within two generations of starting, etc.


A: Hire a bunch of smart silicon engineers, hah.

Through the 90s and early 'aughts it sure seemed like silicon design was incredibly hard, all but a few companies dying off. It sure seems like the barriers to entry have been lowered for whatever reason - supply of engineers maybe, more readymade toolsets for helping with design, foundry transistor libraries, etc.

As far as Sony, GPU/silicon customizations aren't new to them either. And they were working with AMD remember, not going it alone. Maybe AMD just decided it wasn't worth the silicon yet while they're trying to catch up to Nvidia, while Sony didn't want to skip RT for a whole 'nother generation. AMD can introduce new cards with it whenever, where a console generation lasts 5-8 years.
Just to add a point...

Sony/MS didn't add silicon to their chips... they asked (and paid) AMD to design that new feature to add to their silicon... of course Sony/MS participate in how the feature will work and all, but the silicon knowledge is all from AMD.

I thought it was clear what the "semi-custom silicon" business meant.

Sony/MS at best need a few guys knowledgeable about the subject to help AMD deliver what they want.
 
Last edited:

vpance

Member
Now, I haven't played that many of them on PS4, but these:

1) Uncharted 4 (yes, and by the way, it didn't look the way it does in your screenshot)
2) GoW 4 (just started, though)
3) Horizon (oh dayum, the best of the 3 imo)

Now, GoW/Horizon were multiplied by HDR greatness, but I've never seen something even remotely as beautiful on PC.

You forgot Days Gone. It's even better than those games IMO.
 

TeamGhobad

Banned
We should make a list of the most outrageous claims we have heard from "professional leakers":
I'll start:
Moore's Law is Dead: Nexbox will have Zen 3 CPU, 3 threads per core. He heard this from 3 independent sources.
AdoredTV: Navi is too hot, Nexbox is going Vega.
 

ethomaz

Banned
I don't like his (Jez's) tone.
He is just seeking attention, like usual... well, most YouTubers and tweeters like that do.
Seems like a thing of that generation... my nephews love doing Lives on YouTube for everything lol.
That is fine too... it is just a bit weird for my generation (36 years old here).
 
Last edited:

LordOfChaos

Member
We should make a list of the most outrageous claims we have heard from "professional leakers":
I'll start:
Moore's Law is Dead: Nexbox will have Zen 3 CPU, 3 threads per core. He heard this from 3 independent sources.
AdoredTV: Navi is too hot, Nexbox is going Vega.

Someone was saying Nvidia even this close to launch, right?
 