
Next-Gen PS5 & XSX |OT| Console tEch threaD


magnumpy

Member
Well that’s the thing, I barely use my Xbox. I haven’t used it since 2017, so waiting a year would be totally easy for me. Sounds like an easy decision to wait then, I guess, unless the backwards compatibility isn’t universal.

Plus the initial games will be mostly "enhanced" last-gen games. True next-gen titles won't be coming out until at least a year after launch, so holiday 2021 at best :(
 

demigod

Member
---
Matt from Resetera flip-flopped and "guessed" that Xbox will be more powerful:
[image: Matt1.jpg]

---
Klee stated people won't be impressed by the specs so much as by the games. He simply stated double-digit TF, which I don't disagree with.
---

Why are you quoting Matt and not what he actually meant? Why aren’t you quoting him after Klee’s claims? Oh right, because you’re being disingenuous.
 

kikonawa

Member
The only thing that I can conclude from all the recent spec reveals (plus speculation in this thread) is that the new consoles are going to be expensive this time around. That squares with Sony's statement that the price of the PS5 will be "appealing to gamers". High-spec machines, but I bet they're between £449 and £500. That would certainly put them among the priciest machines of all time:

PlayStation: $472
Sega Saturn: $630
Nintendo 64: $305
Dreamcast: $287
PlayStation 2: $418
GameCube: $270
Xbox: $406

(Adjusted for inflation. Also, that Saturn price. WTF?!)
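
For anyone curious how those adjusted figures come about: it's just the launch price scaled by the change in the consumer price index between launch year and the year you adjust to. A minimal sketch of the arithmetic; the CPI values are approximate US CPI-U figures, and the exact result depends on which target year the list's author used:

```python
# Inflation adjustment: nominal launch price scaled by the CPI ratio.
# CPI values are approximate annual US CPI-U averages, for illustration.
CPI = {1995: 152.4, 2019: 255.7}

def adjust(nominal: float, launch_year: int, target_year: int = 2019) -> float:
    """Convert a launch price into target-year dollars."""
    return nominal * CPI[target_year] / CPI[launch_year]

# The original PlayStation launched at $299 in 1995
print(f"${adjust(299, 1995):.0f}")  # ~$502 in 2019 dollars
```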

The PS2 launched in Belgium at 20,750 BEF. That's about 520 euro without inflation! In nearby countries it was approximately the same.
 
1080p can't be the resolution forever.

It might well be, as it all depends on the size of your screen and the distance you are from it. The further away you are from the screen, the more detail is lost (obviously), so differences in picture quality become harder to discern.

For example, at a distance of 10 feet you will be hard pressed to notice the difference between 1080p and 4K on a TV under 50 inches (I think it might even be 65 inches!). That's basically most people sitting on a sofa in their living room. I myself sit around 8 feet from my 42" TV, so upgrading to 4K would have no benefit. HDR, on the other hand, would be quite noticeable.

PC gamers tend to play games on monitors, sitting perhaps 2-3 feet away. The difference between 1080p and 4K for them will be obvious. For the rest of us, here's a good guide.

[image: chart_Rtings.com_.jpg]


(Note: when I was looking for this image I found a handful of graphs suggesting that 8K was worth it at 8 feet on a 40" TV, which were obviously satire.)
 

stevecro7

Neo Member
It might well be, as it all depends on the size of your screen and the distance you are from it. […]


That optimal viewing guide is flawed - it comes from Rtings: https://www.rtings.com/tv/reviews/by-size/size-to-distance-relationship

It assumes you (and everyone else) have precisely 20/20 vision. 20/20 vision is the borderline between vision which is normal and vision which needs to be corrected; it has no relation to what an average/median person's vision looks like after correction.

Perfect vision is 20/10, and the distances in the above graph would be doubled if they wanted to see the same amount of information as someone with 20/20 vision, so at 50 inches UHD for them would be worth it out to 16 ft...

Now I am not saying everyone has perfect vision, I certainly don't, but I am saying the chart is guff.

Here is a calculator for you: https://goodcalculators.com/tv-viewing-distance-calculator/
Put your own visual acuity into it.
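
For anyone who wants the math behind those calculators: a viewer who resolves 1 arcminute (the 20/20 criterion) stops being able to separate adjacent pixels beyond a distance that follows from simple trigonometry. A minimal sketch, assuming a 16:9 panel; the acuity parameter is the same knob the linked calculators expose:

```python
import math

def max_useful_distance_ft(diagonal_in, horizontal_px, acuity=1.0, aspect=16/9):
    """Distance (feet) beyond which a viewer can no longer separate
    adjacent pixels. acuity=1.0 is the 20/20 criterion (1 arcminute);
    20/10 vision would be acuity=2.0, doubling every distance."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_in = width_in / horizontal_px
    limit_rad = math.radians(1 / 60) / acuity  # finest resolvable angle
    return pixel_in / math.tan(limit_rad) / 12

# 50" TV: past ~3.3 ft the extra pixels of 4K are invisible at 20/20;
# past ~6.5 ft even 1080p is at the eye's limit.
for px, label in [(3840, "4K"), (1920, "1080p")]:
    print(f"{label}: {max_useful_distance_ft(50, px):.1f} ft")
```

Doubling the acuity argument reproduces the "distances double for 20/10 vision" point made above.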
 

stevecro7

Neo Member



That optimal viewing guide is flawed - it comes from Rtings […]

Here is a better calculator

Proxy for EU GDPR block: https://proxy-nyc.hidemyass-freepro...29tL2Rpc3BsYXktcmVzb2x1dGlvbi1jYWxjdWxhdG9yLw
 

Sosokrates

Report me if I continue to console war
Will it still amount to much of a visual difference though? The higher the resolution the harder it gets to spot differences.

It could do, if one takes like a $100 bigger loss than the other, but the rumours have suggested that's not the case.
 

pawel86ck

Banned



That optimal viewing guide is flawed - it comes from Rtings […]
20/20 is average eyesight quality, and for this reason it's considered a healthy norm. Of course human eyesight can be even better than 20/20, but eyesight quality is in constant fluctuation, and people will see differently depending on many factors (lighting, stress, tiredness, and even your posture). There's no way people who play games all day long will see better than 20/20 for extended periods of time. I will even say the norm is that many gamers wear glasses, and the thing with glasses is that you will never see the same amount of detail as people without glasses, because the glass changes objects' natural size and limits the amount of detail you can see. So with glasses you will see sharp, but the amount of detail is limited.

When it comes to 4K resolution viewing distance, there are tests like that made by HDTV sites and experts, and the conclusion is always the same: most people can see a real difference between 4K and 1080p from up close, but not from their normal viewing distance. That's why many people consider 4K resolution a waste of HW resources. IMO 4K is the perfect resolution, but on a PC monitor. I bet if PS5 and Xbox 4 games use dynamic 4K or 1800p, 99% of people will not even see the difference.
 

stevecro7

Neo Member
20/20 is average eyesight quality, and for this reason it's considered a healthy norm. […]


Not sure where you are getting 20/20 vision being average from. Visual acuity changes with age, averaging around 20/14 when young and deteriorating from there; 20/20 is average for people aged 60.

If your vision is any worse than 20/20 it is defective and needs correction.

I'm not sure if you have ever seen a 4K TV, but the clarity is incredible compared to a 1080p panel; if 4K weren't visible, there wouldn't be a market for it.
 
I love that
It might well be, as it all depends on the size of your screen and the distance you are from it. […]
The shit you are spouting is incorrect. I've got a 32" 4K monitor for my PC as well as a 65" 4K TV, and I can tell the difference from 1080p. But who's to say that next-gen games won't have options like they do now for the Pro systems?
 

pawel86ck

Banned
Not sure where you are getting 20/20 vision being average from. […]
Ophthalmologists consider 20/20 the norm on an eye exam and will prescribe you glasses only if you don't see the 20/20 line. That's also why all distance calculation charts mention 20/20: it's a well-known norm. Yes, some people, especially young kids, can see even better than the 20/20 line (like 20/15 and even 20/10), but it's never constant quality. If you asked kids to stare at the 20/10 line for a long period, they would stop seeing it after a few minutes. The human body is not a machine, my friend, and muscles work differently depending on many factors. Healthy people can, however, see the 20/20 line for extended periods, because their average eyesight quality is still above 20/20, so the fluctuation in quality is not as visible. But you can't expect people to see 20/10 for a whole day; it's just impossible.

Now let's talk about real-world scenarios:


As you can see, some people can't tell the difference even from up close, not to mention from 3 meters when they watch TV from a normal viewing distance.

You say there's a market for 4K displays because people can see differences. That's of course your opinion, but I think TV manufacturers care about money, and their marketing just pushes people into 4K without even thinking about it. These days you can't even buy a good quality 1080p TV, so people are forced to buy a 4K set anyway. Now TV manufacturers will try to push their 8K marketing to sell 8K sets to people who just bought 4K sets 😂😂👌.
 

stevecro7

Neo Member
Ophthalmologists consider 20/20 the norm on an eye exam and will prescribe you glasses only if you don't see the 20/20 line. […]

If you can't see the difference, just don't get a 4K TV.

The PS5/Xbox 2 will work with a 1080p TV, that's not an issue; so do the PS4, Pro and Xbox One X.

But they have to target the 4K market.

If Sony are saying their console will look great at 4K30 and 4K60, and Microsoft comes out and says 1080p is the future, I can tell you which console I would buy.
 
The only thing that I can conclude from all the recent spec reveals (plus speculation in this thread) is that the new consoles are going to be expensive this time around. […]

Even if you took out the pack-in game that's still $580... yeesh...

Hmm... I guess we might see a $499 PS5 after all. I can't imagine MS pricing Scarlett beyond that though; it would probably be suicide unless the hardware was a magnitude more capable (and honestly, how much stronger can they realistically make Scarlett over the PS5 to justify a $100 increase?).
 
That's just hardware decoding for video files. Or are you referring to something different? Because this is 100% not related to video games.
When they say 8K support (it was a bullet point in Sony's presentation), I assume they mean video decoding and maybe simpler, indie games.
 
I wonder how many bits of processing power & texture-mapped polygons per second Scarlett and PS5 can do. Probably 1 trillion polys/sec and 384 bits :p
 

pawel86ck

Banned
If you can't see the difference, just don't get a 4K TV. […]

I already have a 4K TV.

Based on my tests, native 4K makes a difference from up close, but not from a NORMAL viewing distance. I have asked many people to share their opinion of 4K content on my TV, and their impressions were the same. I don't know why, but even 1080p looks ultra sharp on my TV, and the only issue with 1080p content for me is aliasing compared to 4K. An LCD monitor is a totally different story, though: there, upscaled content looks like a blurry mess to me, and if I played only on a PC monitor I would only want to see native 4K content on it (not 1440p, 1800p or even dynamic 4K). For PC monitor owners I think 4K content on next-gen consoles will make a big difference.

I also have a 1080p plasma HDTV in the other room, or should I say had, because two days ago lightning destroyed it. Detail from close distance was worse, but not to the point that I would call the picture blurry, and again, from a normal viewing distance I could tell no real difference in sharpness between the 1080p and 4K screens. However, I could always see much better colors on my plasma TV, better motion, black levels and contrast. Those things were much more important to me than 4K resolution alone, and for that reason I did most of my gaming on the plasma. If the damage is beyond repair I will be forced to buy another 4K HDTV, because current 1080p TVs are far inferior even to my old plasma GT60, and only the best 4K sets offer panels that can match and surpass my old plasma (OLED for sure).
 
I wonder how many bits of processing power & texture-mapped polygons per second Scarlett and PS5 can do. […]
It's irrelevant these days to mention those metrics; that's why they don't bother. It's all about the TF wars, lol.

Zen 2 is a 256-bit processor (if you count the vector unit width); Navi is a GCN-ISA GPU (2048-bit vector processing). 384-bit might be the memory bus width.

Polygons are abundant these days, even on Switch.
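
Since the thread keeps coming back to the TF wars: peak FP32 TFLOPS is just ALU count times clock times two, because each ALU can retire one fused multiply-add (two floating-point ops) per cycle. A quick sketch; the CU counts and clocks below are made-up placeholders, not leaked specs:

```python
def tflops(compute_units, clock_ghz, alus_per_cu=64, ops_per_alu=2):
    """Peak FP32 throughput: every ALU retires one fused multiply-add
    (= 2 floating-point ops) per clock cycle."""
    return compute_units * alus_per_cu * ops_per_alu * clock_ghz / 1000

# Placeholder configs -- NOT leaked specs, just to show the arithmetic.
print(tflops(36, 2.0))  # 9.216   -> a "9.2 TF" class part
print(tflops(56, 1.7))  # 12.1856 -> a "12.2 TF" class part
```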
 

Dane

Member
Even if you took out the pack-in game that's still $580... yeesh... […]

I think we could even see games being priced at 70 bucks.
 

Gamernyc78

Banned
Well that’s the thing, I barely use my Xbox. […]

I didn't buy an X either, because I never use my Xbox, so it didn't make any sense; now I'm playing the wait-and-see game.
 

Dane

Member
Ooh... got a feeling that won't cut down on the lootboxes and microtransactions, either.

It's like pricing is back to the 16-bit era, but weren't discs supposed to make games cheaper? Hmmm :goog_confused:

LEL, I'm not even saying they'd raise the price for nothing, but rather for inflation; normally the price stands for two generations, and then it rises on the next one.
 
GaaS/DLC/MTX are extremely profitable schemes, there's no reason for price hikes.

Besides, $60 or $70 doesn't make a difference, because if you wait 3-4 months most games drop 50%, so...

It's like a meta-game of the traditional razor & blades business model, where the base game is being devalued intentionally to sell you more stuff (subscriptions, season passes, DLC, MTX).
 

FrostyJ93

Member
Got a feeling we will see both consoles before April. Everyone knows PS4 and One hardware sales have not been that great this year, so next-gen reveals can make up for it. And even if they aren't out till October/November, reveals in Q1 will get the consoles on people's minds as they get their tax returns.
 

DrAspirino

Banned
If you can't see the difference, just don't get a 4K TV. […]
It depends. If 1080p BUT 120 Hz and ray tracing is the future, then by all means I'd put my money on that. I'd rather have ray tracing at 1080p (ideally at a higher framerate) than 4K60 and no ray tracing.
 
The SSD in either console probably won't be removable.

Yep. I have a feeling these things are going to be closely tied to the motherboard. If so, I have a feeling that (for the PS5) you'd see something like this:

External HDDs are supported.
PS5 games on external HDDs must be moved to internal storage to be played.
PS4 games and apps (TV stuff like Netflix) can run from anywhere.

I say this because PS5 game devs will be expecting certain IO performance from the internal storage, which could cause issues when reading from an external device. There have been cases in the past where devs expected such things but encountered unexpected problems. I once read about how BioWare, when developing Knights of the Old Republic, expected the Xbox's DVD drive to have consistent IO performance. As it turned out, MS sourced their DVD controllers (or maybe even the drives) from different manufacturers, leading to varying read speeds.

(It might have been a different game, but I certainly read it in Retro Gamer some years back.)
 

FrostyJ93

Member
Yep. I have a feeling these things are going to be closely tied to the motherboard. […]

Exactly. Putting in an SSD, focusing on it and minimizing load times, only to compromise by running off an external hard disk drive kinda defeats the purpose. Hence external HDDs will act as cold storage for PS5 games.
 
It can still be a removable M.2 NVMe PCIe 4.0 SSD with a custom software stack for increased I/O efficiency (think of something like Mantle).

They can always set some minimum requirements for replacing the SSD, just like Sony did for the PS4:
What type of hard drive does PS4 use?

PS4 is equipped with a 5400 RPM SATA II hard drive. Users can choose to install a new hard drive so long as it complies with these standards, is no thicker than 9.5mm, and is larger than 160GB.
Trust me, there are plenty of XB1 users who don't like it when their internal HDD is bricked. The entire console becomes useless, unless you're lucky enough to be covered by warranty.

I'm not sure how many of you guys would like it if the SSD was soldered and a few years later your console was bricked... I don't think Sony/MS would enjoy the negative PR of #BrickGate.

We know for a fact both Sony and MS want their consoles to last a minimum of 10 years. QLC NAND has a write endurance of 100-1000 cycles. For example, Intel 660p only supports 200 write cycles.
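
To put those endurance numbers in perspective, total bytes written scales as capacity times rated cycles. A rough sketch; the drive size and daily write volume here are illustrative assumptions, not console specs:

```python
def drive_lifetime_years(capacity_gb, write_cycles, gb_written_per_day):
    """Years until the NAND's rated program/erase cycles are exhausted."""
    total_writes_gb = capacity_gb * write_cycles
    return total_writes_gb / gb_written_per_day / 365

# 1 TB drive rated for 200 cycles (660p-class QLC), 50 GB installed/day:
print(f"{drive_lifetime_years(1000, 200, 50):.1f} years")  # ~11.0
```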

An external USB HDD can also run BC games directly (PS1/2/3/4). Regarding next-gen games, we could have some PS5 indies (think of Resogun) that don't need an SSD.

I'm pretty sure the PS5/XB2 SDKs will have a boolean flag to set whether the SSD is required or not. A 500MB indie has no reason to require it, since it can run straight from DRAM.

It's the huge open world AAA games (both SP collectathons such as AC Odyssey and MP games with huge maps such as Fortnite/Apex) that will absolutely require it.
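
To illustrate the kind of boolean flag being imagined here (purely hypothetical; no real PS5/XB2 SDK field is being quoted), the loader-side logic could be as simple as:

```python
# Hypothetical sketch of the per-title flag imagined above -- this is
# NOT a real PS5/XB2 SDK field, just an illustration of the idea.
GAME_MANIFEST = {
    "title": "SmallIndieGame",       # hypothetical title
    "install_size_mb": 500,
    "requires_internal_ssd": False,  # tiny game: fits in RAM, HDD is fine
}

def can_launch_from_external_hdd(manifest: dict) -> bool:
    """Loader-side check: only titles that don't demand the internal
    SSD's I/O guarantees may run from external storage."""
    return not manifest["requires_internal_ssd"]

print(can_launch_from_external_hdd(GAME_MANIFEST))  # True
```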
 

Mass Shift

Member
I once read about how BioWare, when developing Knights of the Old Republic, expected the Xbox's DVD drive to have consistent IO performance. […]

No, you're spot on. There were people with older Xboxes having all kinds of hang-ups with KOTOR, while new Xbox owners experienced fewer issues.

I wouldn't have entertained playing it on my Xbox back in those days, but there was no PC version to play. Xbox was the only way to play it at first.
 