
Next-Gen PS5 & XSX |OT| Console tEch threaD


ph33rknot

Banned
Because it couldn't; Spider-Man would look silly at that speed. But it's irrelevant, because Spider-Man wasn't even in the PS4 Pro version; the difference would be proportionally exactly the same even with him.
And unless you can prove that Spider-Man's model would slow down the system in a relevant way, your talk is just useless.
He's one of the most graphically intensive things in the game
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Do you think that if I quoted Dictator to remind him of this, I would be banned too? :)

Timestamped: PS5 is not going to have hardware-based ray tracing


Dictator just issued a press release:

Utter Trash Neogaf - this place is utter trash.

OMG. How about you quote what I actually said. I said it PROBABLY won't have it. Checkmate bitch.

How about this, NeoGAF moderation! Please do something!! I have been up all night because of this Sony fuckboy correcting me. All I wanted to do was downplay consoles, and now I have to get people like nib95 banned for quoting what I said, even though I was the one who asked him to quote me.

I am being harassed on Twitter by the Technical Art Director at Naughty Dog, who hurt my feelings so bad I can't sleep at night, and I browse GAF and Era all night even though I said I am taking a break. Yes, as you can probably tell, when I say something, be it about ray tracing, SSDs or leaving forums forever, I mean the complete opposite.

GAF just shot themselves in the foot. I don't think you know much about the insufferable PC master race culture (I am an expert), but downplaying consoles and crying to mods is a huge part of it. If you screw someone over in our community, we just turn to our distant cousins and unleash the full wrath of the Xbox Era Discord, who will show up here in droves reporting every post, asking for all of my posts to be threadmarked, engaging Microsoft MVP mods, and offering me gratitude and blowjobs for being brave enough to spread FUD about Sony consoles.

Mods, please ban this fake Italian Rusco Da Vino or kiss my account goodbye.
 
Last edited:

pasterpl

Member


Speaking to IGN's Podcast Unlocked, Spencer was asked how he felt after watching the PS5 tech specs reveal with Mark Cerny, and how Xbox Series X stacks up against its key rival:

"No doubt, I felt really good about how Series X lines up. Now, I think Mark and the team did some really good work on the audio processing that they talked about, their SSD technology is impressive, we like that. We saw the work that they did. But you know, we we took a holistic view on our platform, from CPU, to GPU, to RAM, to throughput velocity architecture, latency, back compat – you know, it took us years to get to this point.

"I will definitely have respect for any platform team that's launching, it just takes a lot of work. But I will say, when we finally saw the public disclosure, I felt even better about the choices that we made on our platform. And I kind of expected that I would."
 
The developer stated why they are not supporting PlayStation;

Where's the support for PlayStation?
PS4/5 development kits are far too expensive for me to obtain (they are in excess of $4000, compared to Switch's $400 and Xbox's $0 dev mode). Unless someone wants to donate a couple of PS4/5 devkits to me, I have no plans to support the platform. I apologize for any inconvenience this may cause. Talk to Sony about lowering the price of their devkits to something reasonable if you are unhappy. You are of course free to make your own PS4/5 fork of Void2D, as long as it remains open source. If I do manage to get my hands on some PS4/5 devkits I may even use your code in the main branch, and of course you will be fully credited.

That is not an answer. I don't see anything wrong with not releasing a game on a specific console; I mean, if you release on one platform, attacking another doesn't bring you anything good. Maybe a couple of console warriors will encourage you to keep doing it, but if I were someone from Sony's PR or similar and saw what you are doing, well, let's say that's not the best way to make a good impression on me.
 
Will all major PS5 exclusives come to PC? Any insider thoughts on this one?
The way Sony is working, yeah, maybe a couple of years after their release on PS5. But the problem with PlayStation first-party games is that the story is a main element, so if you get spoiled because you waited until Sony announced that a given game is coming to PC, well, maybe your impression won't be so good.
 

pasterpl

Member
Yeah, I'm really picturing Phil Spencer saying "well, Xbox is just inferior, cheers" if it were the opposite.

I think this bit is worth highlighting

"We're definitely going to be continuing to keep our eyes wide open as we go towards launch, looking at what the competition is doing, but you know, we have a plan and we feel very solid about our plan. We think it's a winning plan. I believe we have a plan that can win, we've got to go execute. But I feel really good about the plan that we put together."

In case anyone was wondering if MS had a plan :messenger_beaming:
 

ANIMAL1975

Member
Oh shit, now the Xbox Series X has 48GB of RAM?! LOL 🤣
You never had a chance! Couple that with the real-world RT-performant 25 TF GPU and it's a massacre. The gap is much bigger than the 1.5x people believe, not even close! PS5 would need two SSDs just to narrow it down, not enough to compete! Cerny f'd up real hard and all you can do is cry!

That's not how Sampler Feedback works lol.
It is a hardware feature (so PS5 has it too) that MS has now added to their API, allowing devs to capture and record texture sampling info and locations via hardware.

You still need new textures loaded into VRAM as usual.

It is a feature for reusing textures.
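A minimal CPU-side sketch of that idea, for illustration only (this is not the real D3D12 API; names like FeedbackMap and request_sample are hypothetical): the hardware records which texture tiles and mip levels were actually sampled, and the streaming system loads only what is missing.

```python
# Hypothetical illustration of the Sampler Feedback idea, not the D3D12 API.
from collections import defaultdict

class FeedbackMap:
    """Records which (tile_x, tile_y, mip) regions of a texture were sampled."""
    def __init__(self):
        self.touched = defaultdict(set)  # texture_id -> {(tile_x, tile_y, mip)}

    def request_sample(self, texture_id, u, v, mip, tile_size=64, tex_size=4096):
        # Map UV coordinates to a tile index at the requested mip level.
        mip_size = max(tex_size >> mip, 1)
        tile_x = int(u * mip_size) // tile_size
        tile_y = int(v * mip_size) // tile_size
        self.touched[texture_id].add((tile_x, tile_y, mip))

    def tiles_to_stream(self, texture_id, resident):
        # Only tiles that were sampled but are not yet in VRAM need loading.
        return self.touched[texture_id] - resident

feedback = FeedbackMap()
feedback.request_sample("brick_albedo", u=0.10, v=0.80, mip=2)
feedback.request_sample("brick_albedo", u=0.11, v=0.81, mip=2)  # same tile, deduplicated
print(feedback.tiles_to_stream("brick_albedo", resident=set()))  # -> {(1, 12, 2)}
```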

 

Neo Blaster

Member
The developer stated why they are not supporting PlayStation;

Where's the support for PlayStation?
PS4/5 development kits are far too expensive for me to obtain (they are in excess of $4000, compared to Switch's $400 and Xbox's $0 dev mode). Unless someone wants to donate a couple of PS4/5 devkits to me, I have no plans to support the platform. I apologize for any inconvenience this may cause. Talk to Sony about lowering the price of their devkits to something reasonable if you are unhappy. You are of course free to make your own PS4/5 fork of Void2D, as long as it remains open source. If I do manage to get my hands on some PS4/5 devkits I may even use your code in the main branch, and of course you will be fully credited.

So the developer has a grudge over Sony devkit prices and decides to downplay PlayStation because they can't afford one? OK, let's just throw professionalism out of the window and be bitter.
 
Last edited:

Imtjnotu

Member
But not in the one we saw, and Wired mentioned 15-second load times.
Did you see the demo? If not, hush already.

From Wired:

"To demonstrate, Cerny fires up a PS4 Pro playing Spider-Man, a 2018 PS4 exclusive that he worked on alongside Insomniac Games. (He’s not just an systems architect; Cerny created arcade classic Marble Madness when he was all of 19 and was heavily involved with PlayStation and PS2 franchises like Crash Bandicoot, Spyro the Dragon, and Ratchet and Clank.) On the TV, Spidey stands in a small plaza. Cerny presses a button on the controller, initiating a fast-travel interstitial screen. When Spidey reappears in a totally different spot in Manhattan, 15 seconds have elapsed. Then Cerny does the same thing on a next-gen devkit connected to a different TV. (The devkit, an early “low-speed” version, is concealed in a big silver tower, with no visible componentry.) What took 15 seconds now takes less than one: 0.8 seconds, to be exact. "

Nowhere does it say Spider-Man was not on screen. Cut the bullshit, dude.
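For what it's worth, the quoted numbers work out to roughly a 19x load-time improvement:

```python
# Back-of-the-envelope check of the Wired demo numbers.
ps4_pro_load_s = 15.0    # fast travel on PS4 Pro, per the article
ps5_devkit_load_s = 0.8  # same fast travel on the early "low-speed" devkit
print(f"Speedup: {ps4_pro_load_s / ps5_devkit_load_s:.1f}x")  # -> Speedup: 18.8x
```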
 

Dodkrake

Banned
All true... except in those cases, the GPU boost and CPU boost work independently. Both clock up and down as needed and do not interfere with each other, unlike the PS5, which has been confirmed to need to balance power between its CPU and GPU workloads.

And yet, it's still not a clock boost.
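A minimal sketch of the shared-power-budget idea being described (the 200 W figure, the proportional split, and the function name are all hypothetical; Sony has not published this logic):

```python
# Hypothetical sketch of a shared CPU/GPU power budget (SmartShift-style).
# The numbers and the proportional-split policy are made up for illustration.
TOTAL_POWER_W = 200.0

def split_power(cpu_demand: float, gpu_demand: float):
    """Split a fixed power budget between CPU and GPU in proportion to demand.

    cpu_demand / gpu_demand are workload activity levels in [0, 1]; a larger
    share of the budget lets that unit sustain higher clocks.
    """
    total = (cpu_demand + gpu_demand) or 1.0  # avoid division by zero when idle
    cpu_power = TOTAL_POWER_W * cpu_demand / total
    return cpu_power, TOTAL_POWER_W - cpu_power

# A GPU-heavy frame shifts budget (and therefore clocks) toward the GPU,
# which is the "balancing" being discussed above.
print(split_power(cpu_demand=0.3, gpu_demand=0.9))  # -> (50.0, 150.0)
```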
 

Neofire

Member
They are outrageously expensive though.
To be fair, I'm sure the guy's revenue from selling his games far exceeds the $4K expense, and he also gets a return on his investment. A 110+ million install base isn't anything to make excuses about, in my opinion.
 

Evilms

Banned
Dictator just issued a press release:

Utter Trash Neogaf - this place is utter trash.

OMG. How about you quote what I actually said. I said it PROBABLY won't have it. Checkmate bitch.

How about this, NeoGAF moderation! Please do something!! I have been up all night because of this Sony fuckboy correcting me. All I wanted to do was downplay consoles, and now I have to get people like nib95 banned for quoting what I said, even though I was the one who asked him to quote me.

I am being harassed on Twitter by the Technical Art Director at Naughty Dog, who hurt my feelings so bad I can't sleep at night, and I browse GAF and Era all night even though I said I am taking a break. Yes, as you can probably tell, when I say something, be it about ray tracing, SSDs or leaving forums forever, I mean the complete opposite.

GAF just shot themselves in the foot. I don't think you know much about the insufferable PC master race culture (I am an expert), but downplaying consoles and crying to mods is a huge part of it. If you screw someone over in our community, we just turn to our distant cousins and unleash the full wrath of the Xbox Era Discord, who will show up here in droves reporting every post, asking for all of my posts to be threadmarked, engaging Microsoft MVP mods, and offering me gratitude and blowjobs for being brave enough to spread FUD about Sony consoles.

Mods, please ban this fake Italian Rusco Da Vino or kiss my account goodbye.

Dictator's been freaking out for a few days.



His call for help from the mods on trashera is quite comical.
 

SamWeb

Member


RX 5700 OC (~2150 MHz, 2304 Shading Units, 256-bit memory bus, 448 GB/s memory bandwidth, ~9.9 TF FP32 performance)
RX 5700 XT Stock (~1750 MHz, 2560 Shading Units, 256-bit memory bus, 448 GB/s memory bandwidth, ~8.9 TF FP32 performance)
+256 Shading Units vs. higher frequency
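Those FP32 figures follow from the standard formula, TFLOPS = shading units × 2 ops per clock (FMA) × clock speed:

```python
# FP32 throughput: shading units * 2 ops/clock (fused multiply-add) * clock in GHz.
def tflops(shading_units: int, clock_ghz: float) -> float:
    return shading_units * 2 * clock_ghz / 1000.0

print(f"RX 5700 OC:       {tflops(2304, 2.15):.2f} TF")  # -> 9.91 TF
print(f"RX 5700 XT stock: {tflops(2560, 1.75):.2f} TF")  # -> 8.96 TF
```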
 
Last edited:
Yeah, I'm really picturing Phil Spencer saying "well, Xbox is just inferior, cheers" if it were the opposite.


"We're getting incredible support from Microsoft [...] they're very linked into what our plans are, and we're going to make sure we stay agile on our pricing, and that we have a good plan going into launch."

Just imagine $399 :messenger_mr_smith_who_are_you_going_to_call:
 
RX 5700 OC (~2150 MHz, 2304 Shading Units, 256-bit memory bus, 448 GB/s memory bandwidth, ~9.9 TF FP32 performance)
RX 5700 XT Stock (~1750 MHz, 2560 Shading Units, 256-bit memory bus, 448 GB/s memory bandwidth, ~8.9 TF FP32 performance)
+256 Shading Units vs. higher frequency

How does the top card benchmark with the higher clocks?
 

ethomaz

Banned


RX 5700 OC (~2150 MHz, 2304 Shading Units, 256-bit memory bus, 448 GB/s memory bandwidth, ~9.9 TF FP32 performance)
RX 5700 XT Stock (~1750 MHz, 2560 Shading Units, 256-bit memory bus, 448 GB/s memory bandwidth, ~8.9 TF FP32 performance)
+256 Shading Units vs. higher frequency

Useless comparison, because that OC won't be sustained on RDNA and will harm performance to below stock clocks.
RDNA 2 is just better for that and scales better at high clocks due to the 50% increase in perf per watt.

RDNA 2 will come and we will have an idea of how it works at high clocks.
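For context, here is what "50% increase in perf per watt" implies at a fixed power budget (the 180 W budget below is a hypothetical number for illustration):

```python
# What "+50% perf per watt" means at a fixed power budget (hypothetical 180 W).
gpu_budget_w = 180
rdna1_perf = gpu_budget_w * 1.0   # baseline perf-per-watt
rdna2_perf = gpu_budget_w * 1.5   # +50% perf-per-watt, per AMD's claim
print(rdna2_perf / rdna1_perf)    # -> 1.5 (1.5x performance at the same power)
print(1.0 / 1.5)                  # -> ~0.67 (same performance at ~2/3 the power)
```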
 
Last edited:

Ascend

Member
So the developer has a grudge over Sony devkit prices and decides to downplay PlayStation because they can't afford one? OK, let's just throw professionalism out of the window and be bitter.
And where do you draw that conclusion from? It has clearly been stated that the person is willing to develop on it, and even encourages other developers to fork the software for the PS dev kits.

What you just said is simply a convenient view to suit your preconceived biases.
 
D

Deleted member 775630

Unconfirmed Member
And how is that everyone else's problem if they chose to use unoptimized content for their demo? There's no excuse for that; if you want to show what you can do, you'll use the best-case scenario, something Sony wisely chose to do with that Spidey demo.
It's about comparing apples to oranges.
 

ph33rknot

Banned
Did you see the demo? If not, hush already.

From Wired:

"To demonstrate, Cerny fires up a PS4 Pro playing Spider-Man, a 2018 PS4 exclusive that he worked on alongside Insomniac Games. (He’s not just an systems architect; Cerny created arcade classic Marble Madness when he was all of 19 and was heavily involved with PlayStation and PS2 franchises like Crash Bandicoot, Spyro the Dragon, and Ratchet and Clank.) On the TV, Spidey stands in a small plaza. Cerny presses a button on the controller, initiating a fast-travel interstitial screen. When Spidey reappears in a totally different spot in Manhattan, 15 seconds have elapsed. Then Cerny does the same thing on a next-gen devkit connected to a different TV. (The devkit, an early “low-speed” version, is concealed in a big silver tower, with no visible componentry.) What took 15 seconds now takes less than one: 0.8 seconds, to be exact. "

Nowhere does it say Spider-Man was not on screen. Cut the bullshit, dude.
Did you
 

SamWeb

Member
Useless comparison, because that OC won't be sustained on RDNA and will harm performance to below stock clocks.
RDNA 2 is just better for that and scales better at high clocks due to the 50% increase in perf per watt.
LOL :messenger_grinning_squinting: Who said that RDNA 2 is more scalable? Mark Cerny? And PS5 has a dynamic frequency that varies depending on the load, and both consoles will have a common architecture based on RDNA 1. They will inherit all the attendant advantages and disadvantages of this architecture.
 
Last edited:


RX 5700 OC (~2150 MHz, 2304 Shading Units, 256-bit memory bus, 448 GB/s memory bandwidth, ~9.9 TF FP32 performance)
RX 5700 XT Stock (~1750 MHz, 2560 Shading Units, 256-bit memory bus, 448 GB/s memory bandwidth, ~8.9 TF FP32 performance)
+256 Shading Units vs. higher frequency

You never get 10% more performance from a 10% overclock.
The same will happen with PS5.
I expect 15% more performance from a 25% overclock.

But as a console, they will get more out of the higher frequency.
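A toy model of why overclock scaling is sub-linear: the core clock rises but memory bandwidth does not, so only the compute-bound share of a frame speeds up (the 60/40 split below is a made-up assumption):

```python
# Toy model of sub-linear overclock scaling: clock-bound work scales with the
# overclock, bandwidth-bound work does not. The 60/40 split is hypothetical.
def relative_perf(clock_scale: float, compute_bound_fraction: float = 0.6) -> float:
    bandwidth_scale = 1.0  # memory bandwidth is unchanged by a core overclock
    return (compute_bound_fraction * clock_scale
            + (1.0 - compute_bound_fraction) * bandwidth_scale)

print(f"{relative_perf(1.25):.2f}x perf from a 25% overclock")  # -> 1.15x
```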
 
Last edited:

ph33rknot

Banned
In order to have him on the screen, the developers would have had to create new code and animations to depict Spidey web-slinging at lightning speeds, which sounds impractical and unnecessary for the purpose of the demonstration.
So there was no model that takes up the most polys, just a background. I'm just bored, and mofos are so mad.
 

ethomaz

Banned
LOL :messenger_grinning_squinting: Who said that RDNA 2 is more scalable? Mark Cerny? And PS5 has a dynamic frequency that varies depending on the load, and both consoles will have a common architecture based on RDNA 1. They will inherit all the attendant advantages and disadvantages of this architecture.
AMD.
Is that even a question?

Again with this shit about the consoles being RDNA 1? lol

This forum has become weird... people want to have a technical discussion but discard every technical detail from AMD and Cerny :messenger_tears_of_joy:
 
Last edited:

SamWeb

Member
Erm, I think AMD showed 3 bars: from GCN, to RDNA 1, to RDNA 2.


I don’t understand what you mean.

But
https://www.anandtech.com/show/15592/amds-2020-client-gpu-roadmap-rdna-3-on-the-horizon
"And given TSMC’s roadmaps, it’s more or less inevitable that this will be the point where AMD begins using an EUV-based process for their GPUs, as AMD has indicated that this year’s RDNA 2 will not be using TSMC’s EUV-based 7nm+ process."
[AMD GPU roadmap slide]

N7P for RDNA 2 and next-gen consoles
 
Last edited:

SamWeb

Member
AMD.
Is that even a question?


Again this shit about consoles being RDNA? lol

That forum is become weird... people want to have a technical discussion but discard every technical detail from AMD and Cerny :messenger_tears_of_joy:
Can I get a link to the relevant AMD statement? RDNA 2 is based on RDNA 1. What are the problems?
 