
[DF] Hitman 3 PS5 vs Xbox Series X|S Comparison

Status
Not open for further replies.

Andodalf

Banned
A 30% drop in resolution plus lowering other settings. The PlayStation is weaker on paper but not that much weaker. What do you think happened?

They didn't want to pick an arbitrary resolution, and 1800p was the best step down from 4K for scaling, even if they could've hit an arbitrarily higher pixel count such as 1928p.
 

assurdum

Banned
The "frame rate drops" happen for perhaps 1% to 4% of the game at most though. I could understand this emphasis on framerate drops if it were 20%, or 50% of the time, because at that point it's affecting a large share of the experience.

But barely 1%? It's okay to acknowledge it happens, but some people are REALLY trying to harp on it like it's a rampant issue in the Xbox version of the game when it really isn't. Being equally petty, Xbox fans could technically harp on the fact that the PS5 version runs at a 44% lower resolution 100% of the time, but does that really matter when most people won't notice the difference much?

And if the argument is that people won't really notice a difference of 44% resolution 100% of the time, how exactly are some other people trying to argue about the framerate drop like people will notice a 12% - 15% framerate drop that happens for barely 1% to 4% of the game? Smells a bit like double standards IMHO.
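For what it's worth, the 30% and 44% figures floating around this thread are the same gap measured from opposite directions. A quick sketch with plain pixel counts, nothing console-specific:

```python
# Pixel counts for native 4K (2160p) versus the PS5's 1800p.
uhd = 3840 * 2160        # 8,294,400 pixels
p1800 = 3200 * 1800      # 5,760,000 pixels

drop = 1 - p1800 / uhd   # how much lower 1800p is vs 4K
gain = uhd / p1800 - 1   # how much more 4K pushes vs 1800p

print(f"1800p is {drop:.1%} fewer pixels than 4K")   # ~30.6%
print(f"4K is {gain:.0%} more pixels than 1800p")    # 44%
```

So "a 30% drop" and "44% more pixels" describe the exact same resolution gap, depending on which console you take as the baseline.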

EDIT: I've also seen people suggest that because on PS5 that area is locked 60 then the game could probably run at a higher resolution...while maintaining locked 60... but you'd think IO would have tested that and if such were true, had actually gone that route. They did not. By the same notion, we could say Series X could run that specific spot at a locked 60 if the resolution were lowered, and there's more proof to actually support that side of the claim than the other.

From what performance profiles IO chose for the two versions I'd figure PS5 could certainly run the game at native 4K but it would've seen more framerate drops than what happens on Series X. If I had to take a guess, the framerate drop rate on PS5 if it were in native 4K would likely be closer to 10% - 15%.



1. There are things specific to Primitive Shaders from Vega that AMD disabled which Sony may've taken and retooled for the Geometry Engine in PS5. So in that context saying the GE in PS5 is recycled from Vega is not exactly misleading.
2. If you're making these claims against him, it's best to pull up the quotes he specifically said regarding these things; otherwise it's hearsay.
3. If he said the SSD doesn't help evolve videogame design, that was probably a knock against people thinking the SSDs would actually aid in processing graphics in the game. They don't. They are just a faster, lower-latency, wider-bandwidth means of moving data into and out of RAM to maximize RAM usage in the systems. This makes up for the fact that RAM capacity only doubled (or in Series X's case, only increased by 33%) from 8th-gen systems (due to RAM price drops slowing down massively).
4. It's a bit hard to go deep into PS5's specs when Sony themselves haven't gone deep into various aspects of the system. That's how channels like RedGamingTech and Moore's Law Is Dead are able to get away with speculating on PS5 specifications that have a high likelihood of not being true: Sony haven't gone into specific aspects of the architecture (and I still don't 100% believe it's due to backlash from Road to PS5).
It has been explained many times what the GE on PS5 is about, and trying to dismiss it with approximate chatter the way Dictator did is ungenerous and misinformative as hell. Don't defend that attitude, because he's always like this when the PS5 is involved in a tech discussion. Find a single discussion where he's genuinely interested in digging into PS5 specs without trying to diminish them in some way. Why doesn't he do the same when the Series X is involved? Where was he when many developers on Era tried to explain how modest the hardware differences are this generation to people who claimed otherwise? You can bet that if it were the other way around, Dictator would be first in line trying to convince everyone the PS5 hardware is modest and that anyone who said otherwise was just a pathetic fanboy. He isn't new to this attitude.
 
Last edited:

Mister Wolf

Member
They didn't want to pick an arbitrary resolution, and 1800p was the best step down from 4K for scaling, even if they could've hit an arbitrarily higher pixel count such as 1928p.

That's a silly reason. There are 4 true 16:9 resolutions divisible by 8 between 1800p and 2160p. Of what benefit would choosing 1800p be for scaling? Non-standard native resolutions were used all the time during the Xbox 360/PS3 era.
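Mister Wolf's count checks out, assuming "divisible by 8" applies to both dimensions. A quick brute force over 16:9 heights strictly between 1800 and 2160:

```python
# Find 16:9 resolutions with both dimensions divisible by 8,
# strictly between 1800p and 2160p.
candidates = []
for h in range(1801, 2160):
    w = h * 16 / 9
    if w.is_integer() and h % 8 == 0 and int(w) % 8 == 0:
        candidates.append((int(w), h))
print(candidates)
# → [(3328, 1872), (3456, 1944), (3584, 2016), (3712, 2088)]
```

Exactly four: 1872p, 1944p, 2016p, and 2088p.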
 

Shmunter

Member
With such a large drop in resolution between the two consoles, plus lowered shadows, I feel like they struggled developing for the PS5.
Probably not so much struggled as spread thin, considering the full PSVR game as well. That would have required a whole extra dev effort, with performance as a focus.
 

assurdum

Banned
I think the developers and their proprietary game engine were more suited to working with the PS5.
Couldn't the same be true for Hitman 3? Especially considering the Series X is coded against a virtualized environment and, as a cross-gen title, in theory it should be easily ported to Series X with less trouble?
 
Last edited:

assurdum

Banned
I think it's because it was just a launch-window game and XSX devkits arrived very, very late to devs
Honest question: do you expect the next Assassin's Creed to run the same as this one?
I mean, they massively updated Dirt 5 on Series X, but the PS5 version still has the edge resolution-wise. Can I say it's quite weird to claim such an issue is just down to early tool kits?
 

SlimySnake

Flashless at the Golden Globes
A 30% drop in resolution plus lowering other settings. The PlayStation is weaker on paper but not that much weaker. What do you think happened?
The 2060, which performs worse than the PS5 in game (not in the memory bandwidth test), can average 58 fps at native 4K and medium settings, which is what the PS5 is using. Keep in mind, this game performs better on AMD cards than on Nvidia cards; Alex's memory bandwidth test is the only place where Nvidia cards outperform AMD cards in this game.

[benchmark chart]


The problem is that it drops below 58 fps at times which is something IO did not want on consoles. So they settled on a resolution that afforded them 0 drops below 60 fps.

Here is another benchmark showing the 5700 at ultra settings averaging over 60 fps, though they are using a Ryzen 9 5950X, which can hit 4.9 GHz, which is why we're seeing slightly better performance.

[benchmark chart]


However, the game isn't CPU bound and has very little VRAM and CPU utilization. Once again, the PS5 outperforms the 5700 both in Alex's memory bandwidth test and in game, where the 5700 dips below 60 fps at PS5 settings.

What's happening here is that every level performs differently. Some stay above 70 fps at all times. Others stay in the 50-60 range. The PS5 and Xbox Series X both needed every single level to be above 60 fps on average with 0 drops. See the 5700xt performance below. Drops below 60 fps consistently. The PS5 is on par with the 5700xt and simply cannot launch in that state so they reduced the resolution to the next resolution on the list (1800p) and called it a day.



The XSX GPU is 20% more powerful than the 5700 XT and is able to stay above 60 fps in all levels at all times (that bizarre foliage bug notwithstanding), which is why you are seeing the XSX run at native 4K 60 fps while the PS5 has to settle for 1800p. I think IO just didn't want to replay the entire game at a custom resolution between native 4K and 1800p only to find out that one level has dips and get trashed by DF, costing them sales.

It will be interesting to see if they launch a native 4K version with an uncapped framerate, because I suspect you will see the PS5 go above 60 fps in most levels and even average above 60 fps, but as you can see in the dips above, that would simply be unacceptable to many.
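As a rough illustration of the math IO would face, assume a purely GPU-bound frame whose cost scales linearly with pixel count (an overstatement, since some costs are resolution-independent): the worst-case framerate at 4K dictates how far the resolution must come down for a locked 60. The 50 and 42 fps inputs below are hypothetical, not measured values:

```python
# Sketch: if a GPU-bound game's worst case is `fps_at_4k` at 3840x2160,
# what 16:9 height keeps that worst case at 60 fps, assuming frame time
# scales linearly with pixel count?
def height_for_target(fps_at_4k: float, target: float = 60.0) -> float:
    scale = (fps_at_4k / target) ** 0.5  # apply scaling equally to both axes
    return 2160 * scale

print(round(height_for_target(50)))  # worst case 50 fps at 4K → ~1972p
print(round(height_for_target(42)))  # worst case 42 fps at 4K → ~1807p
```

Under these toy assumptions, a worst-case level dipping to the low 40s at 4K lands almost exactly on 1800p, which is consistent with picking the next standard step down rather than chasing a custom resolution.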
 
Last edited:

assurdum

Banned
The thing is very simple:
- PS3: Ken Kutaragi gave a very technical presentation where he proudly revealed the Cell architecture, even showing a processor die shot with the chip layout, and Jen-Hsun Huang explained the Nvidia RSX GPU specs. We knew EVERYTHING.

- PS4 had its technical presentation with Mark Cerny, where he explained step by step every little tidbit about the console, from asynchronous compute to the unified pool of memory and all the rest, complemented by even more technical articles released immediately after the presentation covering practically everything related to the console's hardware (for example, this article).

This gen instead we had multiple interviews about the XSX, a DF reveal, die shots, and the most complete explanation of the architecture given directly by MS at Hot Chips 2020, followed by a presentation of the RDNA 2 architecture by AMD discussing the XSX.
On the PS5 side we only had the technical presentation by Cerny, where he explained (as he did for the PS4) the hardware features the PS5 would introduce, with a heavy emphasis on the I/O customizations. That's it.
Now there are PlayStation fans who imagine who-knows-what architecture behind what Cerny left unsaid, but honestly, if there were some hidden advantage, Cerny would have advertised it. Sony would have forced him to. From there you can understand the skepticism of those who have known this industry for decades.
What exactly has MS provided that's clearer than Sony's presentation? Forgive me, but tons of MS's talks are PR marketing; the tech explanations are often unclear and full of incredible performance-boost claims without concrete evidence, just some comparison pics showing an improvement in a game. Let's take how the bandwidth is supposed to work. I'd like to know how the fuck the GPU can reach 560 GB/s if, to hear them tell it, they virtually separated the GPU from the CPU while the RAM chips are the same for both, so how does CPU work not interfere with the effective GPU bandwidth? Who knows. And let's not even talk about machine learning and the promise of a 300% performance boost.
 
Last edited:

Loope

Member
What exactly has MS provided that's clearer than Sony's presentation? Forgive me, but tons of MS's talks are absurdity; the tech explanation of how the bandwidth works is far from clear and really contradictory (I'd like to know how the fuck the GPU can reach 560 GB/s if, to hear them tell it, they virtually separated the GPU from the CPU while the RAM chips are the same for both, so how does CPU work not interfere with the effective GPU bandwidth?).
You're saying that the technical presentation from MS is false and that they're lying? You sure like to throw things up in the air without anything to back them up.
 

assurdum

Banned
You're saying that the technical presentation from MS is false and that they're lying? You sure like to throw things up in the air without anything to back them up.
No, I'm not saying they're false. But they even used footage of an Nvidia GPU to explain how machine learning should work on Series X. Like, what? Their information is full of such ambiguity if you pay close attention.
 
Last edited:

MonarchJT

Banned
I mean, they massively updated Dirt 5 on Series X, but the PS5 version still has the edge resolution-wise. Can I say it's quite weird to claim such an issue is just down to early tool kits?
Yes, I think it is... otherwise what would it be? It's not like 2013, where the render frame buffer exceeded the Xbox One's small ESRAM, forcing developers to lower the game's resolution compared to the PS4. Today's GPUs are practically identical; at most one is a little more evolved than the other (as MS shouts, "full RDNA 2"), and the main difference lies in the nominal teraflops. The CPUs are essentially the same (one has a handful of Hz more, but nothing that can seriously affect a game), and the bandwidth of the console where the game runs worse is higher than that of the one where it runs better.
You know, for example, that right now there's just one game (Hivebusters) that uses VRS Tier 2, which is something very good that frees up resources. Ultimately, then, yes... I think it is only and exclusively a matter of immature tools and devs unfamiliar with the new hardware due to the late devkits.
 
Last edited:

assurdum

Banned
Yes, I think it is... otherwise what would it be? It's not like 2013, where the render frame buffer exceeded the Xbox One's small ESRAM, forcing developers to lower the game's resolution compared to the PS4. Today's GPUs are practically identical; at most one is a little more evolved than the other (as MS shouts, "full RDNA 2"), and the main difference lies in the nominal teraflops. The CPUs are essentially the same (one has a handful of Hz more, but nothing that can seriously affect a game), and the bandwidth of the console where the game runs worse is higher than that of the one where it runs better.
You know, for example, that right now there's just one game (Hivebusters) that uses VRS Tier 2, which is something very good that frees up resources. Ultimately, then, yes... I think it is only and exclusively a matter of immature tools and devs unfamiliar with the new hardware due to the late devkits.
There are tons of possible explanations. One of them: a unified bandwidth setup fits better than a split one in many scenarios.
 
Last edited:
Last year, an Era member broke down all of his anti-Sony posts; most were in regard to the SSD, and it was so obvious that Alex had an agenda. Alex then got him banned.

The funny thing is that Alex, with his videos on CoD and Valhalla, is largely responsible for the PS5 receiving so much great press. He ended up proving that the PS5 performs like a 2080 Super in those two games.

P.S. IIRC, this was after a couple of devs came out and said Alex downplaying the SSD's impact on game design was silly. Alex went on the defensive saying he never said that, nib95 compiled a list of his posts doing exactly that, and Alex went to the mods crying.
I don't know what he said; what we all do know is that most new titles coming to both platforms are also coming to the older consoles, third-party and exclusives alike. So, for a while at least, the SSD isn't changing the way games are made. That's a fact.
 
It has been explained many times what the GE on PS5 is about, and trying to dismiss it with approximate chatter the way Dictator did is ungenerous and misinformative as hell. Don't defend that attitude, because he's always like this when the PS5 is involved in a tech discussion. Find a single discussion where he's genuinely interested in digging into PS5 specs without trying to diminish them in some way. Why doesn't he do the same when the Series X is involved? Where was he when many developers on Era tried to explain how modest the hardware differences are this generation to people who claimed otherwise? You can bet that if it were the other way around, Dictator would be first in line trying to convince everyone the PS5 hardware is modest and that anyone who said otherwise was just a pathetic fanboy. He isn't new to this attitude.

You have a bad read on him; if anything Alex is more a PC guy, and if you really want you can break down almost every DF person to having a certain preference. PC for Alex, PlayStation for John, Xbox for Richard. The thing though is that they tend to look at multiplat and even certain 1P games together so that kind of creates its own checks-and-balances. That's something none of the other analysis folks out there can claim they have by proxy of usually being a one-man show (though that isn't me saying they give in to their own biases as a result).

I don't really know what things you're specifically referring to because I don't really pay attention to any one singular person's posts like that. If it's a topic I'm interested in, I read from various people, and engage in the topic if I feel like it. If not, then I'll probably engage in discussion of a different topic. But I'm not interested in micromanagement of what one specific person says or does online to try saying if they're a fanboy or spreading FUD or whatever, like I'm trying to build up a court case against them. If someone says something I personally feel is inaccurate, then I'll throw in my own two cents and see what happens. If they end up having a pattern of doing that with very specific things, and I've noticed that firsthand through engagement, then I might bring that up or keep it in the back of my head when knowing how to engage with them on a given topic or guessing what way they will respond.

Thing is Alex doesn't post here like that, and I might see him post every once in a while on B3D but he says nothing disparaging about PS5 there from what I have noticed. He probably posts a tad more on Reeeee, but a lot of threads there tend to devolve into stupidity and I stop lurking in them, probably well before I see posts from any specific person. The last thing I recall him saying there is something about Series X tools improving...don't see how that's a particularly controversial statement if it's indeed true.
 

MonarchJT

Banned
What exactly has MS provided that's clearer than Sony's presentation? Forgive me, but tons of MS's talks are PR marketing; the tech explanations are often unclear and full of incredible performance-boost claims without concrete evidence, just some comparison pics showing an improvement in a game. Let's take how the bandwidth is supposed to work. I'd like to know how the fuck the GPU can reach 560 GB/s if, to hear them tell it, they virtually separated the GPU from the CPU while the RAM chips are the same for both, so how does CPU work not interfere with the effective GPU bandwidth? Who knows. And let's not even talk about machine learning and the promise of a 300% performance boost.
Here we go... the memory setup in the XSX works exactly like this:

[embedded video]

For machine learning you should watch this:

[embedded video]

and read about AMD FidelityFX Super Resolution.
 
Where is the 12 TF difference, or the at least 18% power advantage of the Series X? If they don't use a console's advantage in a multiplatform title, it won't show automatically... you know that, right?
So you're saying these multiplatform games aren't showing the true power of these consoles yet?
 
Last year, an Era member broke down all of his anti-Sony posts; most were in regard to the SSD, and it was so obvious that Alex had an agenda. Alex then got him banned.

The funny thing is that Alex, with his videos on CoD and Valhalla, is largely responsible for the PS5 receiving so much great press. He ended up proving that the PS5 performs like a 2080 Super in those two games.

P.S. IIRC, this was after a couple of devs came out and said Alex downplaying the SSD's impact on game design was silly. Alex went on the defensive saying he never said that, nib95 compiled a list of his posts doing exactly that, and Alex went to the mods crying.

So on the one hand he has an agenda against PS5 because he made posts downplaying the role of the SSD in design paradigm shifts, yet he's also part of the reason PS5 has been compared to a 2080 Super due to his own analysis of CoD and Valhalla so...he has an agenda against himself in reality?

This is why these accusations are so hilarious to me; they read a lot more like self-projection from people with more skin in the game than is healthy, IMHO. If I had to guess, he probably wasn't so much dismissing the SSD's impact as taming expectations. And let's be perfectly honest: a good deal of people even now continue to overestimate or misunderstand where the SSD actually fits in the grand scheme of things, and this applies to both Sony's and Microsoft's systems.

I think part of this is due to the fact they don't understand how SSDs, and specifically NAND technology, actually works. Like let's take latency for example: it doesn't necessarily matter which system has lower latency because they are still stuck with NAND which has latency measured in microseconds. For comparison, DRAM has a typical latency of 100 ns (nanoseconds) and SRAM even lower than that (typically, L1$/L0$ (depends on if we're talking CPU or GPU here, per AMD nomenclature) has latency of only 1 ns (nanosecond)...although that isn't always the case).

That's just one example; there are still things related to the peripheral interconnect (yes, operating a drive over PCIe has a bit more latency than if it's directly soldered onto the board, although the actual amount can't be specified without real data), random access rate, read and write times for the NAND devices (a lot of NAND devices/modules have differing read and write rates), etc., where NAND (and therefore SSDs) is simply inferior to volatile memories like DDR, GDDR, HBM, etc., let alone SRAM and SRAM-based caches.
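The latency gap described above spans several orders of magnitude. A small sketch using the typical figures from the post (1 ns SRAM, 100 ns DRAM) plus a generic ~50 µs NAND read latency, which is an assumed ballpark rather than a spec for either console's drive:

```python
# Typical access latencies; the NAND figure is a generic read ballpark,
# not a figure for any specific console SSD.
latency_ns = {
    "SRAM (L1 cache)": 1,
    "DRAM": 100,
    "NAND flash read": 50_000,  # 50 microseconds
}

dram = latency_ns["DRAM"]
for name, ns in sorted(latency_ns.items(), key=lambda kv: kv[1]):
    print(f"{name:>16}: {ns:>7,} ns ({ns / dram:g}x DRAM)")
```

Even a generous NAND figure sits hundreds of times above DRAM, which is why comparing SSD latencies between the two consoles matters far less than the NAND-versus-RAM gap itself.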

So yeah, if people are trying to contextualize what the SSDs will actually provide in relation to the entire system design for these systems, I don't think putting that into perspective is anywhere near the same thing as dismissing or downplaying them.
 

sendit

Member
How am I an Xbox fan!? Lol. If anything, I'm a Steam fan over everything. You are either projecting really hard at this point, or you just can't find a valid rebuttal. Like I said, go through my post history and you'll find all the answers you need.
You're claiming the tools narrative with no factual evidence. XSX is a DirectX machine, the tools have been around for ages. No one is denying they will continue to iterate and improve. This goes the same with Sony's APIs. For a PC guy, you seem incredibly desperate for XSX's 12 TF to finally have a victory over PS5's 10 TF.
 
Last edited:
You're claiming the tools narrative with no factual evidence. XSX is a DirectX machine, the tools have been around for ages. No one is denying they will continue to iterate and improve. This goes the same with Sony's APIs. For a PC guy, you seem incredibly desperate for XSX's 12 TF to finally have a victory over PS5's 10 TF.
There have been posts saying the developers have had their hands on the GDK, which is much different from what you're talking about. The only desperate people are you and all of the other warriors. Even in the PC world, GPUs that are wider, with more CUs/cores, perform better than GPUs that are narrow with less bandwidth, fewer cores, etc. A higher clock speed hasn't really given RDNA 2 much of a performance boost at all, which seems to have the same effect on the PS5. Might as well get used to this happening more and more often as developers adjust to the transition to the GDK/tools.
 

assurdum

Banned
You have a bad read on him; if anything Alex is more a PC guy, and if you really want you can break down almost every DF person to having a certain preference. PC for Alex, PlayStation for John, Xbox for Richard. The thing though is that they tend to look at multiplat and even certain 1P games together so that kind of creates its own checks-and-balances. That's something none of the other analysis folks out there can claim they have by proxy of usually being a one-man show (though that isn't me saying they give in to their own biases as a result).

I don't really know what things you're specifically referring to because I don't really pay attention to any one singular person's posts like that. If it's a topic I'm interested in, I read from various people, and engage in the topic if I feel like it. If not, then I'll probably engage in discussion of a different topic. But I'm not interested in micromanagement of what one specific person says or does online to try saying if they're a fanboy or spreading FUD or whatever, like I'm trying to build up a court case against them. If someone says something I personally feel is inaccurate, then I'll throw in my own two cents and see what happens. If they end up having a pattern of doing that with very specific things, and I've noticed that firsthand through engagement, then I might bring that up or keep it in the back of my head when knowing how to engage with them on a given topic or guessing what way they will respond.

Thing is Alex doesn't post here like that, and I might see him post every once in a while on B3D but he says nothing disparaging about PS5 there from what I have noticed. He probably posts a tad more on Reeeee, but a lot of threads there tend to devolve into stupidity and I stop lurking in them, probably well before I see posts from any specific person. The last thing I recall him saying there is something about Series X tools improving...don't see how that's a particularly controversial statement if it's indeed true.
Do some research on the net. Look at Dictator's post history about the PS5; it's not that tough. I did, and every time he talks about it, it's just to criticize something or someone, nothing more. He's never leaked anything positive about the PS5, or really any news at all. Coincidence? But on Series X, indeed... For me the argument is closed anyway. I'm not trying to change your mind about him.
 
Last edited:

assurdum

Banned
Here we go... the memory setup in the XSX works exactly like this:

[embedded video]

For machine learning you should watch this:

[embedded video]

and read about AMD FidelityFX Super Resolution.

You do understand what's missing from the full bandwidth picture, right? They practically imply the bandwidth is like having 336 + 560 GB/s, but that can't be possible. Every single time the CPU touches its slice of RAM, it occupies bandwidth on the shared chips, taking it away from the GPU.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Because whatever he actually did say I'm pretty much 100% sure it wasn't some malicious anti-console/anti-PS5 agenda like some people are thinking it was.
I have defended Alex and DF more than any other Sony fan here, but he has repeatedly downplayed the SSD, implied Sony and Cerny were lying about hardware ray tracing, and has been called out by industry programmers. TBH, I take it back. I don't care if he has an agenda or not. It doesn't matter. He's made a fool of himself time and time again.
 

JackMcGunns

Member
Oh, normally it's a waste, but somehow the art style changes his technical principles, and doesn't the human eye know it!


Absolutely!! Do you even play games? Games with a lot of detail in the distance or lots of foliage could look vastly different at higher resolution. Now John Linneman is biased and lying :pie_eyeroll:
 

MonarchJT

Banned
You do understand what's missing from the full bandwidth picture, right? They practically imply the bandwidth is like having 336 + 560 GB/s, but that can't be possible. Every single time the CPU touches its slice of RAM, it occupies bandwidth on the shared chips, taking it away from the GPU.
Oh, ahah, now I understand what you are pointing at. But you're in doubt because you probably haven't figured out how it works yet.
It's a bit off-topic and boring, but if you want I will try to explain it to you... there is nothing wrong with it.
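Since the thread keeps circling this point: the 560 GB/s and 336 GB/s figures are peak rates over the same shared 320-bit bus (10 GB interleaved across ten GDDR6 channels, 6 GB across six of them), not two pools that add together. Below is a toy model of the contention assurdum is describing, where CPU traffic simply subtracts from what the GPU can draw; real contention costs more than plain subtraction (page conflicts, bus turnaround), and the 48 GB/s CPU figure is purely illustrative:

```python
BUS_PEAK_GBS = 560.0  # XSX peak bandwidth over the full 320-bit bus

def gpu_available(cpu_demand_gbs: float) -> float:
    """Naive estimate: bandwidth left for the GPU after CPU traffic."""
    if not 0 <= cpu_demand_gbs <= BUS_PEAK_GBS:
        raise ValueError("CPU demand out of range")
    return BUS_PEAK_GBS - cpu_demand_gbs

print(gpu_available(0.0))   # 560.0, GPU alone on the bus
print(gpu_available(48.0))  # 512.0 left for the GPU
```

The takeaway both sides here half-state: the total can never exceed 560 GB/s, and whatever the CPU consumes comes out of the GPU's share.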
 
Last edited:
So are we back to the Series X being 40% more powerful now? Strange, it seemed to swing back to people suggesting parity just a few weeks ago, but I guess this confirms we'll be seeing 40% from now on.

What's the next big release to compare? Yakuza on PS5 in March?
Whoever said that did not watch the video.

The series X drops a couple of frames in the tall grass section (all the way to 50), the PS5 doesn't drop at all, but it also renders less pixels.

We don't know what it would do if they had full parity because we don't know the overhead on the PS5 version.

You guys are back at it.
 

sendit

Member
There have been posts saying the developers have had their hands on the GDK, which is much different from what you're talking about. The only desperate people are you and all of the other warriors. Even in the PC world, GPUs that are wider, with more CUs/cores, perform better than GPUs that are narrow with less bandwidth, fewer cores, etc. A higher clock speed hasn't really given RDNA 2 much of a performance boost at all, which seems to have the same effect on the PS5. Might as well get used to this happening more and more often as developers adjust to the transition to the GDK/tools.

There have also been posts where developers have said the tools are fine. Who do you believe? Comparing a PC part to a console part is pointless. Microsoft paid AMD the same way Sony paid AMD to put together a custom APU within their budget. You're telling me that a software company like Microsoft can't get their DirectX tooling for the XSX to an optimized level? You're also telling me that developers on the PS5 can no longer improve on what they've done?

BTW, I have a 3090 in my PC. :messenger_winking:
 
There have also been posts where developers have said the tools are fine. Who do you believe? Comparing a PC part to a console part is pointless. Microsoft paid AMD the same way Sony paid AMD to put together a custom APU within their budget. You're telling me that a software company like Microsoft can't get their DirectX tooling for the XSX to an optimized level? You're also telling me that developers on the PS5 can no longer improve on what they've done?

BTW, I have a 3090 in my PC. :messenger_winking:
Do you not realize the transition from the PS4 SDK to PS5 is seamless? Microsoft is transitioning to the GDK, which devs aren't exactly familiar with yet. I'm sure you knew this already, though, and are playing obtuse. Anyone who doesn't care about the console wars realizes this already. But go ahead and continue to think Xbox doesn't have a hardware advantage in the graphics department, the same way the PS5 has an I/O advantage with its SSD.

If you had a 3090, I doubt you would care so much about the PS5 having inferior performance.
 
Last edited:

sendit

Member
Do you not realize the transition from the PS4 SDK to PS5 is seamless? Microsoft is transitioning to the GDK, which devs aren't exactly familiar with yet. I'm sure you knew this already, though, and are playing obtuse. Anyone who doesn't care about the console wars realizes this already. But go ahead and continue to think Xbox doesn't have a hardware advantage in the graphics department, the same way the PS5 has an I/O advantage with its SSD.

If you had a 3090, I doubt you would care so much about the PS5 having inferior performance.
You're speaking out of your ass again. With new hardware there is always a transition period with development kits, but they aren't reinventing the wheel with the GDK. This is an iteration. The same goes for both Sony's and Microsoft's machines.

Also, I care about the PS5 because Sony delivers with exclusives that I can't get on PC day 1 (no matter how many terrestrial teraflops I'm pushing in my PC). Whereas I can get every Xbox exclusive on PC (day 1, as a Game Pass Ultimate subscriber) and play with a much better experience.

Strix 3090, let me know when you can afford one so you can stop parading Xbox.


 
Last edited:
You're speaking out of your ass again. With new hardware there is always a transition period with development kits, but they aren't reinventing the wheel with the GDK. This is an iteration. The same goes for both Sony's and Microsoft's machines.

Also, I care about the PS5 because Sony delivers with exclusives that I can't get on PC day 1. Whereas I can get every Xbox exclusive on PC (day 1) and play with a much better experience.

Strix 3090, let me know when you can afford one so you can stop parading Xbox.


Let's go over this again, one more time for you. The transition from xb1 to xsx GDK is more complicated than ps4 to ps5 SDK. You are talking out your ass if you think otherwise.

Let you know when I can afford one!? 😂 Who says I can't or don't have one? One of my subwoofers costs more than the 3090, do I get cool points for having 3? Guess how much the amp costs to power it? Should I take pictures of my bank account too?! Dick measuring isn't doing anything for the discussion. Assumptions aren't your strong point thus far, and projecting isn't helping your argument either.

Back on topic, Xbox has better potential for graphics, which is what devs are finally starting to exploit, as they get familiarized with the new set of tools. Not sure why this is so hard to understand.
 

sendit

Member
Let's go over this again, one more time for you. The transition from xb1 to xsx GDK is more complicated than ps4 to ps5 SDK. You are talking out your ass if you think otherwise.

Let you know when I can afford one!? 😂 Who says I can't or don't have one? One of my subwoofers costs more than the 3090, do I get cool points for having 3? Guess how much the amp costs to power it? Should I take pictures of my bank account too?! Dick measuring isn't doing anything for the discussion. Assumptions aren't your strong point thus far, and projecting isn't helping your argument either.

Back on topic, Xbox has better potential for graphics, which is what devs are finally starting to exploit, as they get familiarized with the new set of tools. Not sure why this is so hard to understand.

Agreed. PlayStation 5 will remain stagnant. It has no room for growth. It will not excel at anything beyond what the XSX can do. PlayStation 5 was late to the party and was overclocked at the last minute in an attempt to close the gap.

 