
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.


To detail my post :

1) "Gamers comparing Xbox Series X vs. PS5 aren't factoring in Microsoft's dark horse: BCPack. We don't have any real details yet, but it's possible that BCPack will be stronger than RDO BCx encoding+Kraken compression."

2) "In comes a new tweet from a former PlayStation developer, who says to have heard from several developers that Microsoft’s console is the superior console and that the power difference between the Xbox Series X and PS5 is quite shocking. These developers have also said, however, that this doesn’t mean the PS5 won’t be getting great games."

So basically the only advantage Sony had (SSD speed) might be for nothing...
 

M-V2

Member
To detail my post :

1) "Gamers comparing Xbox Series X vs. PS5 aren't factoring in Microsoft's dark horse: BCPack. We don't have any real details yet, but it's possible that BCPack will be stronger than RDO BCx encoding+Kraken compression."

2) "In comes a new tweet from a former PlayStation developer, who says to have heard from several developers that Microsoft’s console is the superior console and that the power difference between the Xbox Series X and PS5 is quite shocking. These developers have also said, however, that this doesn’t mean the PS5 won’t be getting great games."

So basically the only advantage Sony had (SSD speed) might be for nothing...
SSD advantage is 130% (might be for nothing)

The difference in power between the PS5 & XSX is 15% (is quite staggering & shocking)

I mean BRUH...
 

devilNprada

Member
Seeing as a lot of people here are concerned with superior graphics... I have a serious question (sorry if off topic).
How much does a quality TV play into getting the best gaming visuals?
 
To detail my post :

1) "Gamers comparing Xbox Series X vs. PS5 aren't factoring in Microsoft's dark horse: BCPack. We don't have any real details yet, but it's possible that BCPack will be stronger than RDO BCx encoding+Kraken compression."

2) "In comes a new tweet from a former PlayStation developer, who says to have heard from several developers that Microsoft’s console is the superior console and that the power difference between the Xbox Series X and PS5 is quite shocking. These developers have also said, however, that this doesn’t mean the PS5 won’t be getting great games."

So basically the only advantage Sony had (SSD speed) might be for nothing...

No I still expect PS5 to have the SSD advantage. All the optimizations in the world can't close up the raw hardware gap. It looks like PS5's memory controller operates on more channels than XSX's (12 vs 8), and it just has more raw performance as a whole. Doesn't mean XSX's SSD is a slouch, but just goes to show how much Sony's put into theirs (out of necessity).

However in practice it very well may not be quite as big a delta as the numbers on paper paint. There's still a lot to both systems we don't know of yet but it's particularly interesting that MS seems to still have a lot of system details to roll out considering how much they've already mentioned.
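For reference, the raw-bandwidth claim above can be put in back-of-envelope numbers. This is a simplified sketch using the publicly quoted specs (5.5 GB/s over 12 channels vs 2.4 GB/s over 8); real per-channel throughput depends on NAND die counts and controller design, so treat it as illustration only:

```python
# Back-of-envelope sketch of the raw SSD numbers discussed above.
# Figures are the publicly quoted specs; dividing evenly by channel
# count is a simplification.

def per_channel(raw_gb_s: float, channels: int) -> float:
    """Average raw bandwidth each flash channel must sustain, in MB/s."""
    return raw_gb_s * 1000 / channels

ps5_raw, ps5_channels = 5.5, 12   # GB/s, 12-channel custom controller
xsx_raw, xsx_channels = 2.4, 8    # GB/s

print(f"PS5: {per_channel(ps5_raw, ps5_channels):.0f} MB/s per channel")
print(f"XSX: {per_channel(xsx_raw, xsx_channels):.0f} MB/s per channel")

# Effective throughput also depends on the decompression block:
# Sony quotes 8-9 GB/s "typical" with Kraken, MS quotes 4.8 GB/s with
# its compression stack -- both roughly a 2x multiplier on raw speed.
print(f"Raw-speed ratio: {ps5_raw / xsx_raw:.2f}x")
```

So even per channel, not just in aggregate, the PS5 drive is specced to run faster (~458 vs 300 MB/s).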

SSD advantage is 130% (might be for nothing)

The difference in power between the ps5 & SX is 15% (is quite staggering & shocking)

I mean BRUH...

In raw specifications, yes. But in practice, we need to see. Both systems are pushing a lot of optimizations, and there's apparently a lot to XSX SSD MS is holding out on discussing. Now, whatever they're doing with it (if they're doing anything with it), some or a lot of that could be in reaction to Sony, we'll see.

But there's a chance the actual performance gap between the two isn't as large as the paper specs imply, same as how in the GPU department there are some advantages to PS5's approach that help close it a bit with XSX in those areas (and other areas where XSX's GPU has the advantage that make the spec gap potentially larger in those tasks than the paper numbers indicate).

As for the overall gap, it's closer to 16% to 18%, the 18% applying in instances where the clock drops by the ~2% Cerny mentioned in the presentation (dropping from 10.275 TF to 10.07 TF).
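A lot of the percentage confusion in this thread comes down to which console you use as the baseline. A quick sketch using the public CU counts and clocks (52 CUs @ 1.825 GHz vs 36 CUs @ up to 2.23 GHz; 64 shaders per CU and 2 ops per clock for RDNA-class GPUs):

```python
# How the teraflop figures in this thread are derived:
# TFLOPS = CUs x 64 shaders/CU x 2 ops/clock (FMA) x clock in GHz / 1000
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000

xsx = tflops(52, 1.825)             # fixed clock
ps5 = tflops(36, 2.23)              # peak variable clock
ps5_low = tflops(36, 2.23 * 0.98)   # with a ~2% downclock

print(f"XSX {xsx:.2f} TF, PS5 {ps5:.2f} TF (downclocked: {ps5_low:.2f} TF)")
# The "15%" and "18%" figures depend on which console you divide by:
print(f"PS5 deficit vs XSX: {(1 - ps5 / xsx) * 100:.1f}%")   # ~15.4%
print(f"XSX lead over PS5:  {(xsx / ps5 - 1) * 100:.1f}%")   # ~18.2%
```

Same two numbers, two honest-sounding percentages, which is why both "15%" and "18%" keep getting thrown around.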
 
Where are people getting 130%? Last I read, it was 50%. Where are people getting 15%? It's anywhere from 18%-25% depending on the variable clock loads.

Oh yes, it's

PS5 SSD is 129% faster than XSX SSD
XSX SSD is 56% slower than PS5 SSD
PS5 SSD speed is 229% of XSX's SSD speed ( that's 22 GB vs 6 GB theoretical number )
XSX SSD speed is 43% of PS5's SSD speed
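The four statements above are all the same ratio (5.5 vs 2.4 GB/s raw) framed four different ways, which is where the 130% vs 50% argument comes from:

```python
# One ratio, four framings: PS5 raw 5.5 GB/s vs XSX raw 2.4 GB/s.
a, b = 5.5, 2.4

faster = (a - b) / b * 100      # "PS5 is X% faster than XSX"
slower = (a - b) / a * 100      # "XSX is X% slower than PS5"
of_a   = a / b * 100            # "PS5 speed is X% of XSX's"
of_b   = b / a * 100            # "XSX speed is X% of PS5's"

print(f"{faster:.1f}% faster, {slower:.1f}% slower, "
      f"{of_a:.1f}% of, {of_b:.1f}% of")
# -> 129.2% faster, 56.4% slower, 229.2% of, 43.6% of
```

"129% faster" and "56% slower" describe exactly the same gap; neither is wrong, they just pick different denominators.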
 

kyliethicc

Member
Not sure if this has been posted yet. Dev talking next gen.

(attached image: qZcaO77.jpg)
 

Neo Blaster

Member
I'm still confused on PS5 storage, if someone can help me out.

Is it a spare NVMe slot, or is the internal drive just replaceable? If it's only replaceable, I can imagine a lot of people just moving games back and forth from external HDD to internal; this applies to Series X too.

I have it in my mind that I'll likely get three Series X external NVMe drives, name them based on my three big game genres, and keep everything else on the internal. I'm gonna assume swapping out the PS5 NVMe, if it's indeed just replaceable and not a spare slot, won't require screws or anything to get to?
I bet the internal SSD is soldered to the main board, hence irreplaceable. Cerny said PS5 will have a bay for expansion, so you can stick in an NVMe drive as long as it fits inside and complies with the minimum requirements.
 

Great Hair

Banned

probably some 2d platformer but still, might be our first next gen game we will see

Xbox Series X Exclusive From Dynamic Voltage Games Will Run At Native 4K, 120 FPS. FIRST GAME ON XBOX SERIES X to run at 8K 240fps confirmed.

This is one of their games... #LOL
 
I bet the internal SSD is soldered to the main board, hence irreplaceable. Cerny said PS5 will have a bay for expansion, so you can stick in an NVMe drive as long as it fits inside and complies with the minimum requirements.

The internal drive's replaceable; they mentioned it (or at least pretty much alluded to it) at the conference. The "schematic" (more like a drawing) of the flash memory controller in the presentation made it look like there's only one port path for the flash memory channel ICs. I think if there were a way to connect two drives to the controller it'd have been shown off in the presentation.

Either they have two memory controllers (can't picture it; probably very costly, since just one sounds pretty sophisticated), or there's some mux/demux letting a single controller read from two installed drives, but it would only be able to read from one or the other, not both concurrently. Plus Cerny stressed a lot about making sure 3rd-party drives were compatible, even down to whether the design can fit the slot, not to mention the overhead the memory controller will need to handle with 3rd-party drives. So it's likely replaceable only.

Third-party external HDDs should be able to connect over some USB port, but we can pretty much assume those will only be for cold storage of PS5 games, or to run PS4 games off. Similar in that regard to XSX. Such an external drive over USB would be too slow and unable to provide a sufficient interconnect to the flash memory controller for it to do its job.
 

Neo Blaster

Member
To detail my post :

1) "Gamers comparing Xbox Series X vs. PS5 aren't factoring in Microsoft's dark horse: BCPack. We don't have any real details yet, but it's possible that BCPack will be stronger than RDO BCx encoding+Kraken compression."

2) "In comes a new tweet from a former PlayStation developer, who says to have heard from several developers that Microsoft’s console is the superior console and that the power difference between the Xbox Series X and PS5 is quite shocking. These developers have also said, however, that this doesn’t mean the PS5 won’t be getting great games."

So basically the only advantage Sony had (SSD speed) might be for nothing...
Have you just arrived in this thread?
 

Zero707

If I carry on trolling, report me.
With conventional cooling XSX could just as well produce more heat; it packs more CUs and RT. So let's wait and see :)
Like I said, even if the console were in a nuclear state the chip would still push 1825 MHz, and XSX was designed around power and performance. Also, do you realize they still have headroom for an overclock? They could reach 1900 MHz with the chip if they wanted.
 

quest

Not Banned from OT
Like I said, even if the console were in a nuclear state the chip would still push 1825 MHz, and XSX was designed around power and performance. Also, do you realize they still have headroom for an overclock? They could reach 1900 MHz with the chip if they wanted.
Hope they don't. All reports from 3rd-party sources are that it's quiet even being on all day. It has a reasonable TDP as is; no need to turn it into a nuclear reactor. Well-designed unit, nothing to change.
 

Zero707

If I carry on trolling, report me.
Sweet as, thank you. So I'll be buying multiple NVMes for PS5 too. Lordy, storage is gonna be expensive for both.
I think the XSX SSD 1TB version will cost $100-$200; as for PS5, get ready to pay more than $250 for a 1TB SSD.
Also keep in mind Seagate is the launch partner for XSX; expect more XSX SSDs from other companies down the line.
 

wintersouls

Member
This comparison is very curious. It's entirely in Spanish; if you're interested you can translate it with Google Translate. Leaving aside the capabilities we already know about the SSD.

Basically they comment that AMD and Sony have worked very hard, hand in hand, to offer optimization and performance technology in their CPU and GPU: SmartShift, which can make a difference.

 

M-V2

Member
No I still expect PS5 to have the SSD advantage. All the optimizations in the world can't close up the raw hardware gap. It looks like PS5's memory controller operates on more channels than XSX's (12 vs 8), and it just has more raw performance as a whole. Doesn't mean XSX's SSD is a slouch, but just goes to show how much Sony's put into theirs (out of necessity).

However in practice it very well may not be quite as big a delta as the numbers on paper paint. There's still a lot to both systems we don't know of yet but it's particularly interesting that MS seems to still have a lot of system details to roll out considering how much they've already mentioned.



In raw specifications, yes. But in practice, we need to see. Both systems are pushing a lot of optimizations, and there's apparently a lot to XSX SSD MS is holding out on discussing. Now, whatever they're doing with it (if they're doing anything with it), some or a lot of that could be in reaction to Sony, we'll see.

But there's a chance the actual performance gap between the two isn't as large as the paper specs imply, same as how in the GPU department there are some advantages to PS5's approach that help close it a bit with XSX in those areas (and other areas where XSX's GPU has the advantage that make the spec gap potentially larger in those tasks than the paper numbers indicate).

As for the overall gap, it's closer to 16% to 18%, the 18% applying in instances where the clock drops by the ~2% Cerny mentioned in the presentation (dropping from 10.275 TF to 10.07 TF).
Say that to the Xbox fanboys too, my friend; many are spreading misinformation and I didn't see you quote them (or maybe I'm hallucinating).
 

B_Boss

Member
One thing I keep thinking about is: did the deep dive reveal the "best features" Jim Ryan was talking about?

I’d wager Jim spoke in terms of marketing, what consumers would (or could) see in PS5 advertisements: very clear and concise descriptions of PS5 features. Mark took a deep dive into the more robust and technical features of the console (and I'm excited to see whether Sony’s marketing will translate any of those benefits for public consumption 🤔).
 
I wouldn't consider that a big gap in performance. When one game is gimped down to 30fps and another is running at 60fps... I'd consider that a major performance gap. A minor bump in resolution is nothing when most people probably couldn't even tell the difference. Also, you're comparing something that depends on the average person and what TV they're using, how far they're sitting, viewing angles, etc. FPS impacts gameplay for every player whether they recognize it or not. That's a staggering performance gap, not a minor resolution bump to please your eyes.

"Staggering" seems like major hyperbole to me for describing the performance gap, considering PS5 has its own advantages. There's too much in play here for anyone to give realistic variables on how well games will perform. Wait until the games are being made before judging either console.
1080p vs 720/900p is quite significant. A lot of blur vs clean image.
5.5 is 129% more than 2.4

2.4 is 56% less than 5.5
Um, the 10.2 TF figure is not absolute. There is a reason for the decision to go with variable clocks. The difference can be anywhere from 12.1 TF vs 9.2 TF up to 12.1 TF vs 10.2 TF. Therefore your numbers are guesstimates at this point.
 
1080p vs 720/900p is quite significant. A lot of blur vs clean image.

Um, the 10.2 TF figure is not absolute. There is a reason for the decision to go with variable clocks. The difference can be anywhere from 12.1 TF vs 9.2 TF up to 12.1 TF vs 10.2 TF. Therefore your numbers are guesstimates at this point.

Yeah, because 720p is blurry compared to 1080p, and 900p less so, but 1800p to 4K isn't.

Surely it's not absolute, but I'm also thinking that PS5 power won't fall below 10 TF. As said a few pages back, both CPU and GPU can work at their top frequency together as long as the total workload does not exceed the power cap. If that happens, the downclock will be minor. It's sufficient to lower clocks by 2% to lower power consumption by 10%. Dropping power consumption by 10% IS NOT a 10% drop in performance.
 

quest

Not Banned from OT
Yeah, because 720p is blurry compared to 1080p, and 900p less so, but 1800p to 4K isn't.

Surely it's not absolute, but I'm also thinking that PS5 power won't fall below 10 TF. As said a few pages back, both CPU and GPU can work at their top frequency together as long as the total workload does not exceed the power cap. If that happens, the downclock will be minor. It's sufficient to lower clocks by 2% to lower power consumption by 10%. Dropping power consumption by 10% IS NOT a 10% drop in performance.
He never said 2%; he said "a couple", big difference. He was very careful not to give any specific numbers or charts on the downclock; it will probably be up to the media to uncover.
 

Sinthor

Gold Member
The ones listed. You do realize that locked clocks are the standard for consoles right? Both PS4 and Xbone run at locked frequencies while in "game mode". Any type of speed-step (thermal based) would result in performance differences between consoles based on what cabinet they are sitting in or ambient room temps.

You do realize that in the real world, GPU clocks vary depending on what's going on? Watch any FPS comparison video and you can see this. Now perhaps you say that's because it's on PC. But then Sony emulating how things are on the PC which has been a superior gaming platform in many ways for many years is somehow a BAD thing? It's been clearly stated that the variable clocks are for performance management, NOT thermal regulation. Let's just see how the machine and games perform, shall we? But you do realize that with the ability to drop the clocks when less is needed, the PS5 is likely to be much more energy friendly AND run at far lower average temps?

Bottom line, guys we need to see the games, but developers continue to say that the difference in performance between the two is negligible. No carping about PS5 having massive performance loss as games crank up in intensity or anything like that. Let's try logical analysis here until we get actual games, shall we?
 
Yeah, because 720p is blurry compared to 1080p, and 900p less so, but 1800p to 4K isn't.

Surely it's not absolute, but I'm also thinking that PS5 power won't fall below 10 TF. As said a few pages back, both CPU and GPU can work at their top frequency together as long as the total workload does not exceed the power cap. If that happens, the downclock will be minor. It's sufficient to lower clocks by 2% to lower power consumption by 10%. Dropping power consumption by 10% IS NOT a 10% drop in performance.
Why do you mention 1800p?
Yeah, because 720p is blurry compared to 1080p, and 900p less so, but 1800p to 4K isn't.

Surely it's not absolute, but I'm also thinking that PS5 power won't fall below 10 TF. As said a few pages back, both CPU and GPU can work at their top frequency together as long as the total workload does not exceed the power cap. If that happens, the downclock will be minor. It's sufficient to lower clocks by 2% to lower power consumption by 10%. Dropping power consumption by 10% IS NOT a 10% drop in performance.
Your "thinking" is a guesstimate. If Cerny truly knows for certain that the most the clocks will drop is .02, then surely they'd look to eliminate that in favor of the more reliable/predictable locked clock. Surely Sony could beef up the specs somewhere to eliminate that tiny .02. No?

In other words, if Cerny truly believed the most the PS5 clocks would drop at maximum load is .02, I have a hard time believing he'd stick with the variable clock solution rather than look for another way to eliminate variable clocks altogether. Having locked clocks allows devs easier development and more predictability in what the hardware is capable of.
 

quest

Not Banned from OT
You do realize that in the real world, GPU clocks vary depending on what's going on? Watch any FPS comparison video and you can see this. Now perhaps you say that's because it's on PC. But then Sony emulating how things are on the PC which has been a superior gaming platform in many ways for many years is somehow a BAD thing? It's been clearly stated that the variable clocks are for performance management, NOT thermal regulation. Let's just see how the machine and games perform, shall we? But you do realize that with the ability to drop the clocks when less is needed, the PS5 is likely to be much more energy friendly AND run at far lower average temps?

Bottom line, guys we need to see the games, but developers continue to say that the difference in performance between the two is negligible. No carping about PS5 having massive performance loss as games crank up in intensity or anything like that. Let's try logical analysis here until we get actual games, shall we?
We can have a more logical analysis when Sony shows us the numbers. Post them for everyone to review transparency. Until then I'll treat this like I did Microsoft in 2013 worst case and spin unit proven otherwise.
 

kyliethicc

Member
The internal drive's replaceable; they mentioned it (or at least pretty much alluded to it) at the conference. The "schematic" (more like a drawing) of the flash memory controller in the presentation made it look like there's only one port path for the flash memory channel ICs. I think if there were a way to connect two drives to the controller it'd have been shown off in the presentation.

Either they have two memory controllers (can't picture it; probably very costly, since just one sounds pretty sophisticated), or there's some mux/demux letting a single controller read from two installed drives, but it would only be able to read from one or the other, not both concurrently. Plus Cerny stressed a lot about making sure 3rd-party drives were compatible, even down to whether the design can fit the slot, not to mention the overhead the memory controller will need to handle with 3rd-party drives. So it's likely replaceable only.

Third-party external HDDs should be able to connect over some USB port, but we can pretty much assume those will only be for cold storage of PS5 games, or to run PS4 games off. Similar in that regard to XSX. Such an external drive over USB would be too slow and unable to provide a sufficient interconnect to the flash memory controller for it to do its job.
No, the internal SSD is not removable. PS5 and XSX will be like the Switch: internal storage plus expansion. The M.2 expansion bay on PS5 is a second storage option, hence expansion, not replacement. XSX has the expansion card slot. The PS5 has a 12-channel interface and therefore 12 individual flash chips soldered onto the board, which will not be removed by the user (I mean, I guess you could, but that's highly risky and just modding the console at that point; you'd probably just end up breaking it).
 
Why do you mention 1800p?
Your "thinking" is a guesstimate. If Cerny truly knows for certain that the most the clocks will drop is .02, then surely they'd look to eliminate that in favor of the more reliable/predictable locked clock. Surely Sony could beef up the specs somewhere to eliminate that tiny .02. No?

In other words, if Cerny truly believed the most the PS5 clocks would drop at maximum load is .02, I have a hard time believing he'd stick with the variable clock solution rather than look for another way to eliminate variable clocks altogether. Having locked clocks allows devs easier development and more predictability in what the hardware is capable of.

That was in reference to your 720p statement and how it was a significant difference.
I mentioned 1800p because of the difference in raw GPU power percentages between PS5 and XSX. Just because of that. And 1800p truly is a minor difference vs. 4K. Maybe both will be 4K, but with some minor graphical stuff cut back on PS5.

Maybe it's a guess, maybe it's not. Cerny literally said that both GPU and CPU will be at peak clocks most of the time. And if there is a necessity to lower a clock in worst-case scenarios, a few percent will be enough. Cerny said that. And also, like I said, based on NXG's calculation the drop is around 50 MHz.

Not necessarily. Look at the Switch and its variable clock:

New Switch mod delivers real-time CPU, GPU and thermal monitoring - and the results are remarkable
 

kyliethicc

Member
Yeah, because 720p is blurry compared to 1080p, and 900p less so, but 1800p to 4K isn't.

Surely it's not absolute, but I'm also thinking that PS5 power won't fall below 10 TF. As said a few pages back, both CPU and GPU can work at their top frequency together as long as the total workload does not exceed the power cap. If that happens, the downclock will be minor. It's sufficient to lower clocks by 2% to lower power consumption by 10%. Dropping power consumption by 10% IS NOT a 10% drop in performance.
Yeah, people forget that there are diminishing returns to increases in resolution. That's why 4K is probably the highest resolution we'll ever need; 8K is just overkill. Even 1440p is good enough; frame rate is far more important. But again, diminishing returns: 30 vs 60 FPS is huge, 60 vs 120 less so, and 120 vs 240 is even less impactful a change, because once you're at 120 it's plenty high already.
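The diminishing-returns point above is easy to check with arithmetic: each doubling of frame rate halves the frame-time saving of the previous step, and each resolution step multiplies pixel counts far past what most viewers can resolve at couch distance.

```python
# Diminishing returns, in numbers.
def frame_ms(fps: int) -> float:
    """Time per frame in milliseconds at a given frame rate."""
    return 1000 / fps

for lo, hi in [(30, 60), (60, 120), (120, 240)]:
    saved = frame_ms(lo) - frame_ms(hi)
    print(f"{lo} -> {hi} fps: frame time drops by {saved:.1f} ms")
# 30->60 saves 16.7 ms; 60->120 only 8.3 ms; 120->240 only 4.2 ms

pixels = {"1080p": 1920 * 1080, "1440p": 2560 * 1440,
          "4K": 3840 * 2160, "8K": 7680 * 4320}
print({k: f"{v / 1e6:.1f} MP" for k, v in pixels.items()})
```

Going 30 to 60 fps buys four times the frame-time improvement of going 120 to 240, while 8K pushes four times the pixels of 4K for a difference few displays and eyes can show.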
 

nikolino840

Member
No, the internal SSD is not removable. PS5 and XSX will be like the Switch: internal storage plus expansion. The M.2 expansion bay on PS5 is a second storage option, hence expansion, not replacement. XSX has the expansion card slot. The PS5 has a 12-channel interface and therefore 12 individual flash chips soldered onto the board, which will not be removed by the user (I mean, I guess you could, but that's highly risky and just modding the console at that point; you'd probably just end up breaking it).
I remember that you can also replace it... but I don't want to rewatch Cerny's video XD
 

ethomaz

Banned
I think he said PS5 will struggle running this next to Series X... He says Series X runs this so much better... :messenger_grinning_smiling::messenger_tears_of_joy:
The game won’t release on PS5; he didn't have a devkit.

I understand his idea... he is a nobody, so he wants to release day one when there aren't that many games and get sales... indies profit a lot in the first year of a console before the heavy hitters come.

As his game is exclusive to Xbox, he is trashing PS5 and praising Xbox with the goal of getting Xbox users' attention and sales.

He posted a water video he made for his game using that powerful Xbox, and I thought wow, that is trash level, it didn't even look like water... he wanted to show how devs can use the Xbox's power to make pretty graphics, while claiming that water can't run on PS5 because it uses compute and not SSD speeds.
 

DaGwaphics

Member
You do realize that in the real world, GPU clocks vary depending on what's going on? Watch any FPS comparison video and you can see this. Now perhaps you say that's because it's on PC. But then Sony emulating how things are on the PC which has been a superior gaming platform in many ways for many years is somehow a BAD thing? It's been clearly stated that the variable clocks are for performance management, NOT thermal regulation. Let's just see how the machine and games perform, shall we? But you do realize that with the ability to drop the clocks when less is needed, the PS5 is likely to be much more energy friendly AND run at far lower average temps?

Bottom line, guys we need to see the games, but developers continue to say that the difference in performance between the two is negligible. No carping about PS5 having massive performance loss as games crank up in intensity or anything like that. Let's try logical analysis here until we get actual games, shall we?

I wasn't saying that Sony's system doesn't work, just that MS's approach isn't unusual; in fact it has been the standard. Thermal-based clocks in PC components have always been one of the weakest aspects of the platform. That's why most setups won't match the published benchmarks for any given card/CPU (because most of the published benchmarks are run on open benches). Personally, I don't care how much power a console uses, so long as it can dissipate the heat created and not sound like a vacuum cleaner. LOL
 

SonGoku

Member
ROPs confirmed or guess?
Sounds familiar.
Except in this case the opposite is being parroted.
the performance delta between the two platforms is not as great as the raw numbers lead the average consumer to believe,
Raw numbers tell us the performance delta for next gen is way smaller than this gen.

It's seeming like the Tempest engine is basically Sony's own sound system like ATMOS,
Atmos is a licensed format, not a hardware solution available to every user
 

Audiophile

Gold Member
TF is the theoretical peak amount of floating point operations that one component of a GPU can do in a second. Think of this as the ceiling, not where you spend most of your time (or in reality, any of it).

Running all processors at full clocks doesn't equate to full utilisation and peak power draw. It's primarily the nature of the calculations being done and the utilisation of the threads that will draw more power.

Hence you can run at full clocks and "10.3TF" (but not really) without drawing peak power. When the GPU does approach peak utilisation and subsequently peak power draw, it can utilise SmartShift so that the CPU can lend some of its power to the GPU to get it even closer to peak utilisation, which in turn allows it to throw a few more pixels/frames/fx at the screen. It's relatively rare that CPU and GPU are being fully saturated at the same time in game software, so I expect this to be the dominant and preferable option rather than the GPU clocking down and causing performance loss.

Sony are placing their power budget in a place where the system is likely to spend the vast majority of its time, rather than where the higher peaks are. This allows them to run everything higher than they otherwise would have given the thermal/financial/power envelope inherent to their design. This also means they don't have to over-engineer the cooling solution for a state that the console is rarely in.


Don't think of this as a base clock and a boost clock. Think of the inverse, with the boost as the base clock; in certain circumstances (very high thread utilisation, power-hungry instructions (AVX on the CPU, for example)) it will come down to a throttle clock.


This is based on power only and not random thermals (a set, uniform thermal ceiling will be factored in to the cooling already); it is also deterministic and based around a model SoC. Developers will be able to determine when and where certain changes may occur and how to handle them. I'll hazard a guess they'll have this implemented in their profiling software and will likely be able to automate much of the process with algorithms; and throttling could likely be handled by a small dynamic resolution scale, more aggressive VRS, reduced LOD or something of that nature.

Also, a ~2% reduction in clocks gives a 10% reduction in power, so don't expect massive downclocks beyond that range.
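The "2% clock, 10% power" relationship makes sense under the textbook dynamic-power relation for CMOS, P ∝ f·V². This is an illustrative model only (it ignores static leakage and assumes the quoted figures), solving for the voltage reduction that relationship would imply:

```python
# Illustrative only: dynamic CMOS power scales roughly as P ~ f * V^2.
# If a 2% clock cut yields a 10% power cut (the figure quoted above),
# what voltage drop does that imply?  Solve (1 - df) * (1 - dv)^2 = 0.90.
import math

df = 0.02                      # 2% frequency reduction
target = 0.90                  # 90% of original power
dv = 1 - math.sqrt(target / (1 - df))

print(f"Implied voltage reduction: {dv * 100:.1f}%")  # ~4.2%
# Because voltage enters squared, a small frequency step lets voltage
# come down with it, so power falls much faster than performance.
```

That asymmetry is the whole pitch: trade a barely visible performance drop for a large cut in the worst-case power (and therefore cooling) budget.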


Why variable? Because it at least allows for more than would otherwise be possible with the given piece of hardware and surrounding components, with the caveat that in rarer scenarios decisions may have to be made in terms of where to direct your power budget. It's a smart, economic compromise that turns an existing idea on its head and allows the system to punch a little above its weight while allowing resources to be put into -- what are in my opinion -- more important areas.

In addition to this, a constant, known power draw means a constant known thermal footprint, which means you can tailor your cooling system and fan speeds for a singular point; maximising efficiency and acoustics.

It takes a little while to get your head around it as the idea is a reversal of what we've seen elsewhere, the point from which it's built outwards is shifted mainly from thermals to power draw; and it's optimised for what the system does most as opposed to what it does rarely.


Edit: The only thing that points to a 9.2TF version of the PS5 GPU is the GitHub leak, which admittedly was right about the CU count but wrong about architecture and feature sets. GitHub also reported a different clock and CU count for the XSX. Clocks tend to be subject to change, as are disabled/active CU counts.
 