
GhostRunner PS5/XSX/XSS DF Performance Analysis

Md Ray

Member
Did you click any of the links that I've provided? There are many for you to choose from, so you can't claim I'm cherry-picking. Also, you are the one who brought up the comparison in the first place. Don't you think you should know at least the bare minimum facts about ECC before making that absurd comparison?

Go ahead and click the link. Several of us have been saying this in the thread already. It's not a good look to debate something you aren't that knowledgeable about. Seriously, click the link and look at any of the benchmarks. Some of the articles even explain why ECC introduces more latency into the equation. Gamers always go for the lowest latency, whether that be input latency on a mouse, monitor, RAM, etc. No gamer is going to spend MORE money for MORE latency. That's ridiculous. No gamer is going to buy a workstation GPU for the sole purpose of GAMING. Is this starting to make sense yet?
Walls and walls of text but still no data from you. You can't just make shit up and tell people to go google it. I know what ECC is... Show me ECC vs non-ECC benchmarks at the same speed, preferably with GDDR6. Why is it so hard to pick one from your link and put it here? I'll wait.
 
Walls and walls of text but still no data from you. You can't just make shit up and tell people to go google it. I know what ECC is... Show me ECC vs non-ECC benchmarks at the same speed, preferably with GDDR6. I'll wait.
I've already linked that.


I've linked it again. All the data is right there. Not sure what's so hard about accepting the facts. At this point you're just being purposely dense, or you refuse to accept that you are wrong.

ECC will be slower than non-ECC at the same frequency. That is a fact.

3200MHz non-ECC is ALWAYS going to be faster than 3200MHz ECC. This will always be the case, and there's nothing you can say or do to go against these facts. If you have something to add, go ahead, but you can't change the facts or change my mind about this. It's not even debatable at this point, so I'm unsure why you keep digging a deeper hole for yourself.
 

Loxus

Member
Most games this year performed better on the XSX though. We have seen games demonstrate the 18% TFLOP difference between the consoles. I've got a full list with sources if you want me to post them?
I hope you post games that show a 30% performance advantage with a 40-50fps delta. If not, it just holds true to my point that some games will perform better on PS5 while others will on XBSX.

The XBSX ain't even so much more powerful that it's impossible to think the PS5 can outperform it.
 
Same sources I've listed previously. Here's another one. Actually, multiple sources in the link say at least 2% slower. Not to mention it's a single-slot GPU, made for workstations, featuring ECC memory, which isn't used in gaming.


It's a fact that ECC will always be slower than non ECC. You should know this already though, as you said you are knowledgeable about ECC in a previous post. 🤔 🤔 🤔
 

Loxus

Member
I didn't hear your technical explanation of why this cross-generational title, running on hardware very similar to the PS5's, is incapable of running similarly OUTSIDE of software optimization, which you have completely eliminated as a possibility. While you're at it, you can explain the technical advantages and faster loading the Xbox has over the PS5 in The Medium, again OVER software optimizations. You say it's because of the PS5's better engineering; I'd like to hear your breakdown.


This sounds totally imaginary. Name one technical source that stated the XSX was 30% more 'powerful' than the PS5. Every outlet I read, from DF to IGN, stated both consoles were very similar in power, and the results speak for themselves. NO ONE claimed that the Xbox would be running games 40-50 FPS faster than the PS5, especially when they have the same CPU. It is nonsense to claim that an unoptimized game would outperform an optimized one when running on similar hardware. That is the reason why this game in particular shows the difference between optimized vs unoptimized. I do hope it gets fixed, but it is doubtful.
That's what you guys were running with when the PS5 specs were announced, all over Twitter and YouTube.
This tweet is just one example of the nonsense that many people were saying.
[attached image: CTJeyxt.jpg]


You talk about technical sources, but ignore things from Mark Cerny, Tim Sweeney and actual game developers. Smh

I did breakdowns with concrete sources many times already.

For me to do all that work again just for you to ignore it would be idiotic of me. It's still kind of fun doing it, but I'm busy playing Red Dead Online Call to Arms right now.
 
That's what you guys were running with when the PS5 specs were announced, all over Twitter and YouTube.
This tweet is just one example of the nonsense that many people were saying.
[attached image: CTJeyxt.jpg]


You talk about technical sources, but ignore things from Mark Cerny, Tim Sweeney and actual game developers. Smh
Well, see, there is your problem. You believe people on Twitter. Next time go to a reputable source over crazy people on Twitter. You probably didn't know this, but there were videos of smoke coming out of the XSX making it appear that it was overheating and catching fire. Nonsense is all over the place.
 

Zathalus

Member
I hope you post games that show a 30% performance advantage with a 40-50fps delta. If not, it just holds true to my point that some games will perform better on PS5 while others will on XBSX.

The XBSX ain't even so much more powerful that it's impossible to think the PS5 can outperform it.
Why would I post a 30% performance advantage with a 40-50fps delta? Nobody with an ounce of credibility was claiming that. The on-paper specs give an 18% advantage to the XSX, and we have numerous examples of games performing better on the XSX by around 15-25% or so. Exactly what the on-paper specs would indicate.
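For reference, the on-paper 18% figure comes straight from the public GPU specs (XSX: 52 CUs at 1825 MHz; PS5: 36 CUs at up to 2233 MHz). A quick sketch of the arithmetic, using the usual CUs × 64 shader lanes × 2 ops per clock (FMA) formula:

```python
# On-paper FP32 throughput: CUs * 64 shader lanes * 2 ops/clock (FMA) * clock.
# Public figures: XSX 52 CUs @ 1.825 GHz; PS5 36 CUs @ up to 2.233 GHz.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

xsx = tflops(52, 1.825)   # ~12.15 TFLOPs
ps5 = tflops(36, 2.233)   # ~10.29 TFLOPs
print(f"{xsx:.2f} vs {ps5:.2f} -> {100 * (xsx / ps5 - 1):.0f}% gap")
# -> 12.15 vs 10.29 -> 18% gap
```

Note this only covers peak shader throughput; it says nothing about memory bandwidth, clock behavior under load, or API overhead, which is exactly why real-game deltas scatter around that number.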

Some idiot on Twitter is not a reliable source.
 

Md Ray

Member
Same sources I've listed previously. Here's another one. Actually, multiple sources on the link say at least 2% slower. Not to mention a single slot GPU, made for workstations, featuring ECC memory, which isn't used in gaming.


It's a fact that ECC will always be slower than non ECC. You should know this already though, as you said you are knowledgeable about ECC in a previous post. 🤔 🤔 🤔
So 2%?

In the end... It's nothing? And they perform about the same? My view on it was correct then:
I'll say this. My stance on GDDR6-14000 ECC vs non ECC is that it makes no diff to perf.
And here you were acting like there was some kind of a massive 15-20% delta between them at the same speeds, and that they cannot be compared at all. I'd actually accept it if the diff were approaching 10% or more.
 
So 2%?

In the end... It's nothing? And they perform about the same? My view on it was correct then:

And here you were acting like there was some kind of a massive delta between them at the same speeds, and that they cannot be compared at all.
2% is significant when we're talking about an average difference of 4%.
 

dcmk7

Banned
Because maybe, just maybe, an indie game made by 30 people, whose console versions were outsourced to a separate studio, getting a free next-gen update a year after the console versions launched, is absolutely no indication of anything if we do not know for a fact how much time was spent on the patch or even if the same studio did it.
Was it outsourced?

Is there a source which confirms this?
 
So 2%?

In the end... It's nothing? And they perform about the same? My view on it was correct then:

And here you were acting like there was some kind of a massive delta between them at the same speeds, and that they cannot be compared at all. Like it's prohibited or something. :messenger_tears_of_joy:
Do you even hear what you are saying? Let's make a poll and ask gamers if they would follow your logic and pay double the price to get a 2% decrease in performance. Let's see how many would be in your corner. I'll wait.

2% is a difference, not to mention the added latency. Imagine being wrong and somehow still trying to convince yourself that you were right all along 🤣🤣🤣.

Anyways, I've proven that ECC will always be slower, that gamers won't spend double the price for a GPU which isn't meant for gaming, and that this was a poor comparison to begin with. But you do you. Get that workstation GPU, pair it with an ECC-compatible motherboard, and let's see how much performance you lose. It'll be more than 2%. Oh yeah, and you'd end up paying double the price, when you could have done what every gamer would do: build a GAMING PC.
 

dcmk7

Banned
They say it on all their twitter posts, when people complain about the console versions.


Cool, cheers, didn't know that.

It's good we got confirmation that only one studio was involved in doing that. It'd be really strange for two studios to do a platform each.

We appreciate your feedback and have already passed it to our partner responsible for console versions of Ghostrunner.
 

Md Ray

Member
Do you even hear what you are saying? Let's make a poll and ask gamers if they would follow your logic and pay double the price to get a 2% decrease in performance. Let's see how many would be in your corner. I'll wait.

2% is a difference, not to mention the added latency. Imagine being wrong and somehow still trying to convince yourself that you were right all along 🤣🤣🤣.

Anyways, I've proven that ECC will always be slower, that gamers won't spend double the price for a GPU which isn't meant for gaming, and that this was a poor comparison to begin with. But you do you. Get that workstation GPU, pair it with an ECC-compatible motherboard, and let's see how much performance you lose. It'll be more than 2%. Oh yeah, and you'd end up paying double the price, when you could have done what every gamer would do: build a GAMING PC.
All that chest-beating over margin-of-error stuff. :messenger_tears_of_joy: I've also seen results where ECC can increase perf by 1-2% over standard, which isn't anything major and falls within run-to-run variance. GDDR6 is what we're talking about though, not DDR4/3. 2% here and there is nothing, so I'll state this again:
My stance on GDDR6-14000 ECC vs non ECC is that it makes no diff to perf.
 

BigLee74

Member
Until the devs come out and say one way or another why the game is performing better on the PS5, it’s pure speculation on both sides (PS5 hardware suited to engine/Xbox hasn’t been optimised properly).

So until that time, or the time of future patches, this one is simply a win for the PS5. No more, no less.
 
All that chest-beating over margin-of-error stuff. :messenger_tears_of_joy: I've also seen results where ECC can increase perf by 1-2% over standard, which isn't anything major and falls within run-to-run variance. GDDR6 is what we're talking about though, not DDR4/3. 2% here and there is nothing, so I'll state this again:
Chest beating? Not from me. I was simply pointing out that


1. It's a very poor comparison.
2. ECC is slower than non-ECC, and costs much more.
3. ECC has NO gains for gaming; as a matter of fact, you'll lose frames and performance with it.
4. You can apply this to GDDR6 as well. ANYTHING with ECC will be slower than non-ECC.



Look, I'll break it down so you can really understand this. Anything with ECC will be slower, whether it be RAM or VRAM. ECC adds an additional step to the equation, so it will always be slower. Take two identical vehicles, strip one of them down completely to reduce weight, and race them against each other. The one with less weight will win, and it would represent non-ECC. Does that make sense to you?
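For what it's worth, the "additional step" is literal: ECC memory stores extra parity bits that are computed on every write and re-checked on every read, which is where the small latency cost comes from. A toy sketch of the idea using a Hamming(7,4) code (illustrative only; real DRAM ECC uses wider SECDED codes over 64-bit words, in dedicated hardware rather than software):

```python
def hamming74_encode(data):
    """4 data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4] (extra work on write)."""
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4          # covers codeword positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers codeword positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers codeword positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_read(cw):
    """Re-check parity on read (the extra step); fix a single flipped bit if found."""
    p1, p2, d1, p3, d2, d3, d4 = cw
    syndrome = ((p1 ^ d1 ^ d2 ^ d4)
                | (p2 ^ d1 ^ d3 ^ d4) << 1
                | (p3 ^ d2 ^ d3 ^ d4) << 2)
    if syndrome:                        # syndrome = 1-based position of the bad bit
        cw = cw.copy()
        cw[syndrome - 1] ^= 1
    return [cw[2], cw[4], cw[5], cw[6]]  # hand the (corrected) data bits back

word = [1, 0, 1, 1]
stored = hamming74_encode(word)
stored[4] ^= 1                          # simulate a single-bit flip in memory
assert hamming74_read(stored) == word   # the read still returns the right data
```

The trade-off in the thread is exactly this: the check-and-correct pass buys reliability, not speed, which is why it matters for workstations and not for gaming.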

I've also seen results where ECC can increase perf by 1-2% over standard,
Source, Md Ray???
 
You can google it, my friend. 😉
Not finding anything. You should do what I did, and provide at least one source or more. Unless you're making shit up after doubling down so hard? I don't mind being proven wrong, I don't have pride or ego like some people here. Only problem is, I'm not wrong in this situation.


So please list a source for that wild claim of yours. Can't imagine any game gaining performance from ECC 😂😆
 

Md Ray

Member
Not finding anything. You should do what I did, and provide at least one source or more. Unless you're making shit up after doubling down so hard? I don't mind being proven wrong, I don't have pride or ego like some people here. Only problem is, I'm not wrong in this situation.


So please list a source for that wild claim of yours. Can't imagine any game gaining performance from ECC 😂😆
Just click on the link you posted and you'll stumble upon several sources showing ±1-2% either way.

As I've been saying... the diff between them in perf at the end of the day amounts to:
[reaction gif: wow-fucking-nothing.gif]
 
Just click on the link you posted and you'll stumble upon several sources showing ±1-2% either way.

As I've been saying... the diff between them in perf at the end of the day amounts to:
[reaction gif: wow-fucking-nothing.gif]
None of my links shows that at all. You've been proven wrong, and are now trying to say ECC has gained performance over non ECC?!?! Lol! I can't make this up 😂.

That's like saying you can charge a device faster wirelessly vs hardwired. Or that wireless headphones will have better quality or bandwidth vs the same pair of wired cans. ECC won't be faster than non-ECC. Not sure why you aren't getting that factual concept?

It's even funnier that you call 2% "nothing", yet you get all happy if your console of choice has a 1% gain over the competitor's console. So is 2% now less than 1%? See the irony in this?
 
Yup it's nothing.

Receipts. I think you've mistaken me for your lil brother Riky. :messenger_tears_of_joy: I've seen your sneaky likes and 🔥 on his troll-y posts multiple times.
2% is nothing... So why are you concerned about 1% and 0.1% like here for instance?

Exactly. In such cases, there's a chance your 1% and 0.1% low min fps can suffer even if the avg. is good enough.

Lmfaaaao! You must be trolling at this point. 2% is nothing, but you are concerned about ZERO POINT ONE PERCENT. It would take 20x of those 0.1% to equal 2%.

You were wrong all along, and you can't stand to be incorrect. But since you are now trying to call out other people who have nothing to do with this, I'll just let you continue to be wrong and biased. Non-ECC goes VRROOM VRROOOM compared to ECC.

✌️
 
Holy shit... It's referring to the benchmark minimum fps metric. How dense can you be?

[attached image: 0SlIKRo.png]



So you honestly don't think having an overall 2% decrease in performance will affect those 1% or 0.1% lows? You do understand how these things work, right? If I were to lose even a mere 2% of performance on my PC, you do realize my 1% and 0.1% lows would be lower. That's how these things work. Which is why myself and others have been trying to help you understand that 2% might seem small, but it makes a difference. People were getting banned for doing the math on the PS5 dropping 2% to 3% with SmartShift, and how that would affect the TF number. So please don't say 2% is nothing when you're concerned about smaller percentages.
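For anyone following along, the 1% and 0.1% "lows" being argued about are typically the average of the slowest 1% (or 0.1%) of frames in a run, and a uniform performance loss drags them down proportionally. A rough sketch, assuming that averaged-worst-frames definition (exact definitions vary by benchmark tool):

```python
def percentile_low(fps_samples, pct):
    """Average of the slowest pct% of frames (pct=1 -> the '1% low')."""
    worst = sorted(fps_samples)
    n = max(1, int(len(worst) * pct / 100))
    return sum(worst[:n]) / n

run = [60.0] * 990 + [30.0] * 10            # mostly 60fps, a handful of stutter frames
print(percentile_low(run, 1))               # 30.0 -- the stutters dominate the metric
print(percentile_low(run, 0.1))             # 30.0

slower = [f * 0.98 for f in run]            # a flat 2% performance loss...
print(round(percentile_low(slower, 1), 1))  # 29.4 -- ...drags the 1% low down with it
```

This is why the lows are argued over at all: the average barely moves in this example (59.7 vs 60), but the worst-case experience is what the percentile metrics surface.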
 

Md Ray

Member
So you honestly don't think having an overall 2% decrease in performance will affect those 1% or 0.1% lows? You do understand how these things work, right? If I were to lose even a mere 2% of performance on my PC, you do realize my 1% and 0.1% lows would be lower. That's how these things work. Which is why myself and others have been trying to help you understand that 2% might seem small, but it makes a difference. People were getting banned for doing the math on the PS5 dropping 2% to 3% with SmartShift, and how that would affect the TF number. So please don't say 2% is nothing when you're concerned about smaller percentages.
Well, look what we have here. Rainbow Six Siege ECC vs non-ECC results at the same mem speed:

Surprisingly, ECC is actually winning out on the overall min fps side by 9%, with an ever so slightly higher overall avg., but I'd say it's about the same as non-ECC. You were saying, drotahorror, DonJuanSchlong?

[attached image: Ynh4bDF.png]
 
Well, look what we have here. Rainbow Six Siege ECC vs non-ECC results at the same mem speed:

Surprisingly, ECC is actually winning out on the overall min fps side by 9%, with an ever so slightly higher overall avg., but I'd say it's about the same as non-ECC. You were saying, drotahorror, DonJuanSchlong?

[attached image: Ynh4bDF.png]
Different RAM sizes, different RAM speeds, and the ECC kit was running lower latency than the non-ECC one. Also, in that thread, here's what a poster said:


Your CPU is choking the GPU; RAM frequency won't matter at this point. The score difference is more like margin of error, as the differences are not consistent and the ECC RAM happens to have higher scores.


So it's essentially a b.s. benchmark. Do you have any VALID benchmarks to disprove my facts?

I'll give you one last chance to prove me, GAF, and physics, wrong. Otherwise, I'm just wasting my time trying to help you understand a basic concept about computers.
 

DenchDeckard

Moderated wildly
I'm happy to admit I don't know much about the exact workings of all this tech shit, but I know enough to know I should never pretend to know shit that I don't really have a clue about. That's when you're one step from creating a Twitter account and turning into a fanboy psycho online, posting cherry-picked screenshots and trolling every possible thread.

Like I've said from the beginning, this one is a win for the PS5, and I would love to know the reasons why this game is showing performance differences that do not line up with the hardware.
 
It's literally the same speed. You were proven wrong. Accept it and move on.
Proven wrong? Lol what?!


Your CPU is choking the GPU; RAM frequency won't matter at this point. The score difference is more like margin of error, as the differences are not consistent and the ECC RAM happens to have higher scores.

You've been wrong all this time and posted a b.s. benchmark, which several people commented on, pointing out what's wrong with it. Again: different RAM timings, the CPU bottlenecking the GPU (RAM speeds won't matter at that point), etc. Not sure why you are failing to understand the basic concept. ECC <<< non-ECC in speeds and latencies. It's quite sad that you have been wrong all this time and can't even find a reliable benchmark that works in your favor. There are hundreds of them out there, and you can't find a single one to help your poor argument.

Just accept your L and move on.




 
[reaction gif: Comedy Central Mm GIF by Workaholics]


(I've said that in jest one time and got rightly mocked)
I'm sure no one would get mocked in this specific situation, proving a very basic fact, like ECC being slower than non-ECC. Water is wet. Wired will always be superior to wireless in regards to quality and bandwidth.


This reminds me of when I was arguing with some guy who thought Wi-Fi 6 is faster than a wired Ethernet cord. Fucking 😂. This ECC thing is extremely similar to that, maybe even worse, as it's clearly stated, literally EVERYWHERE. I'm not sure why his pride/ego is so big that he can't just move on. I even said I'm out a while back, then he doubled down with even more incorrect info 😂.


I'm out though, fr. I'll let the homie continue to be wrong, but at least everyone here witnessed it. I just would hate for someone to buy a workstation GPU over a 3060 Ti/3070 after reading that horrible comparison, or even worse, buy ECC RAM because they think Md Ray was actually correct. That buyer would be extremely pissed to realize they not only left performance on the table, but left their wallet there as well, as they would spend much more to get much less performance/bang for their buck. I don't like the spread of misinformation, as people who are equally uninformed could possibly make a horrible decision.
 

Md Ray

Member
Proven wrong? Lol what?!




You've been wrong all this time and posted a b.s. benchmark, which several people commented on, pointing out what's wrong with it. Again: different RAM timings, the CPU bottlenecking the GPU (RAM speeds won't matter at that point), etc. Not sure why you are failing to understand the basic concept. ECC <<< non-ECC in speeds and latencies. It's quite sad that you have been wrong all this time and can't even find a reliable benchmark that works in your favor. There are hundreds of them out there, and you can't find a single one to help your poor argument.

Just accept your L and move on.





Where have I been wrong? I've been saying that ECC vs non-ECC hardly makes any difference to gaming and is within margin of error at the end of the day, lol.
 
I'm happy to admit I don't know much about the exact workings of all this tech shit, but I know enough to know I should never pretend to know shit that I don't really have a clue about. That's when you're one step from creating a Twitter account and turning into a fanboy psycho online, posting cherry-picked screenshots and trolling every possible thread.

Like I've said from the beginning, this one is a win for the PS5, and I would love to know the reasons why this game is showing performance differences that do not line up with the hardware.
Lack of optimization, shown by the fact that the res wasn't at least lowered to stabilize the fps. The PS5 version probably didn't get any more; porting it over just worked out better due to tools, API or the hardware differences.
 
I never tried to argue different. I find lots here think you have to be for one or the other and it causes a lot of bickering. The only thing I'm really for is a good playable experience for anyone who buys the game.
Edit
That and arguing, I love arguing. It's like debate but you get to say fuck.
I wish more people could think like that. Not having to take sides, and treat it like a religion. Most people buy a console because it's what all of their friends play on. If ps5 and xsx can play the same games, who cares which runs better? Unless you have both consoles, and are genuinely trying to figure out what is the best experience, that should be the only reason to really be invested in comparisons.

Most people just want to put a check mark next to their box of choice, wherever it wins a comparison. Pathetic. Play the games, don't play the console warrior role.
 
Lack of optimization, shown by the fact that the res wasn't at least lowered to stabilize the fps. The PS5 version probably didn't get any more; porting it over just worked out better due to tools, API or the hardware differences.
I'm willing to accept that it was hardware differences that caused issues with this game, rather than optimization, as long as it is consistent. We would then have to say that Xbox's more efficient engineering allowed games like The Medium to load faster on Xbox despite the PS5 having a faster drive. Would we really be willing to accept that premise universally? It just seems to make way more sense to say it was software optimization, or lack thereof, that causes the biggest performance issues, not some sort of hardware deficiency or efficiency, at least when it comes to 3rd-party games that are on last-generation consoles.
 

01011001

Banned
So I see people are still dumb enough to argue about console capabilities based on a game that is a trainwreck on every system? Damn, people are really beyond stupid these days.

someone should make a thread about the performance profile of Sonic 06 on PS3 vs 360, and I bet people would argue about which system is more powerful based on the findings :messenger_tears_of_joy:
 