
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

HAL-01

Member
The next Fallout will finally reach the best AAA graphics... of the PS4 and Xbox One gen, of course. Just imagine being able to bug your game in some way so that the SSD behaves like the old HDD lol
"We're sorry to confirm a bug in The Elder Scrolls VI that causes every asset streamed into memory to be duplicated into your save file. We're looking into it; meanwhile, consider expanding your storage. Please understand"
 

xacto

Member
What part of the demo was unclear when they said billions of polygons visible in the frame get "crunched down losslessly to around 20 million drawn triangles"? It's a completely new method of rendering scenes, and you're effectively getting what would have taken billions of polygons in a frame. However they are doing the lossless compression is what's key.

It was so clearly explained and demonstrated that I can't wrap my mind around anyone saying it's "crap." That's honestly your takeaway from what you saw? What exactly are you working on in your career that lets you so casually toss such a phrase at clearly revolutionary technology? :pie_thinking:

This whole "I have no idea what UE5 is about but I have my own opinion about it" is just the same as people who had no idea what Cerny showed in his presentation but kept babbling about it anyway.
 
Nvidia will improve its RT solution a lot; they can now render scenes with millions of polygonal lights:


In case you want a more technical explanation, here is the paper (also if you want to have a headache):
https://developer.download.nvidia.com/devblogs/ReSTIR.pdf

Denoising is particularly challenging with low sample-count images; as shown in Fig. 2, improving the quality of samples provided to a denoiser can significantly increase its effectiveness.

We introduce a method to sample one-bounce direct lighting from many lights that is suited to real-time ray tracing with fully dynamic scenes. Our approach builds on resampled importance sampling (RIS), a technique for taking a set of samples that are from one distribution and selecting a weighted subset of them using another distribution that better matches the function being integrated. Unlike prior applications of RIS, we use a small fixed-size data structure—a “reservoir” that only stores accepted samples—and an associated sampling algorithm (used frequently in non-graphics applications) to help achieve stable, real-time performance.

Given the reservoir, our approach does not use any data structures more complicated than fixed-size arrays, yet it stochastically, progressively, and hierarchically improves each pixel’s direct light sampling PDF by reusing statistics from temporal and spatial neighbors.
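The fixed-size "reservoir" the abstract describes can be sketched as a streaming weighted-sampling update. This is a simplified single-sample reservoir (no spatial or temporal reuse, and the class name and toy light weights are illustrative, not from the paper):

```python
import random

class Reservoir:
    """Single-sample weighted reservoir: keeps one accepted candidate
    plus running statistics, using only fixed-size storage."""
    def __init__(self):
        self.sample = None   # currently accepted candidate
        self.w_sum = 0.0     # sum of all candidate weights seen so far
        self.count = 0       # number of candidates streamed through

    def update(self, candidate, weight, rng=random):
        self.count += 1
        self.w_sum += weight
        # Replace the kept sample with probability weight / w_sum.
        # After the stream ends, candidate i is kept with probability
        # w_i / sum(w), i.e. proportional to its weight.
        if self.w_sum > 0 and rng.random() < weight / self.w_sum:
            self.sample = candidate

# Toy use: stream 4 "lights" with unnormalized importance weights.
rng = random.Random(1)
hits = {i: 0 for i in range(4)}
weights = [1.0, 1.0, 1.0, 7.0]  # light 3 is 7x as important
for _ in range(10000):
    r = Reservoir()
    for light, w in enumerate(weights):
        r.update(light, w, rng)
    hits[r.sample] += 1
# Light 3 ends up selected roughly 70% of the time.
```

The point of the fixed-size structure is exactly what the quote says: per-pixel sampling quality improves without any data structure bigger than a few floats per pixel.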
 
Last edited:

xacto

Member



He makes a good addition to r/WatchPeopleDieInside
 

HAL-01

Member
Slightly different topic: with Megascans assets now included in UE, aren't you guys worried that lots of games are going to look very similar?
Big studios have their own large art departments and may only use premade assets sparingly. Indies may have to be smart about it, but the console storefronts should do a good enough quality-control job at keeping out asset flips.
 

IkarugaDE

Member
I was trying to watch the GOT presentation in 4K but YouTube is such a piece of turd right now, macroblocks everywhere.

Code:
https://assets14.ign.com/videos/zencoder/2020/05/14/3840/cff4bb2627dac6092814d6c4c299a37d-24156000-1589489748.mp4

You can also download it via that link. Should be 4k.

Edit: Sorry, was very hard to type in the link only. Remove the spaces.
Edit 2: I'm stupid. Now it's okay.
 
Last edited:

Bo_Hazem

Banned
Slightly different topic: with Megascans assets now included in UE, aren't you guys worried that lots of games are going to look very similar?

You can edit them; you should watch more Quixel videos, you'll enjoy them a lot. But haven't you ever heard of Sony Pictures? Sony has been using its own film assets, and the rumors of the two divisions working together more closely this time around are closer to reality than ever.

Sony Pictures Imageworks
Sony Pictures Imageworks is the Academy Award®-winning visual effects and animation unit of Sony Pictures Motion Picture Group known for live-action visual effects, dynamic creature and character animation and all-CG animation.

The company, headquartered in Vancouver, B.C., recently completed work on the Warner Bros. feature SUICIDE SQUAD, STORKS for Warner Animation Group, Columbia Pictures’ GHOSTBUSTERS, Disney’s ALICE THROUGH THE LOOKING GLASS, THE ANGRY BIRDS MOVIE for Columbia Pictures and Rovio Animation, Sony Pictures Animation’s HOTEL TRANSYLVANIA 2, Marvel Studios’ GUARDIANS OF THE GALAXY and the Warner Bros. feature EDGE OF TOMORROW. Sony Pictures Imageworks is currently in production on SMURFS: THE LOST VILLAGE for Sony Pictures Animation, KINGSMAN: THE GOLDEN CIRCLE for Fox, Columbia Pictures’ SPIDER-MAN: HOMECOMING, MEG for Warner Bros., PACIFIC RIM: MAELSTROM for Legendary and Warner Animation Group’s SMALLFOOT.







They have their proprietary engine (Sprout) that they design their whole movie scenes with:


You can see how insane it can be here:






That's another matter MS should worry about, not Sony. Quixel megascans are just supplementary to Sony's massive library.
 
Last edited:

HAL-01

Member
You can edit them; you should watch more Quixel videos, you'll enjoy them a lot. But haven't you ever heard of Sony Pictures? Sony has been using its own film assets, and the rumors of the two divisions working together more closely this time around are closer to reality than ever.
I remember when they showed that bust of Dr. Octavius at the PS3 announcement... Could you imagine if they opened the June conference with the Mysterio sequence from Far From Home, only to reveal it was running in real time on a PS5?
 

jonnyp

Member
Now the problem with level of detail is the same as before: money and time.


Also, I am sure we are going to see some game where the marketing says something like
"Our game uses more than 60 GB in assets".


Of course, more detailed assets take longer to create. Still, it's impressive that two dozen people created that demo in six months. The Megascans library and not having to tweak LOD levels, bake normal and lighting maps, etc. surely help mitigate some of the added time for creating the more unique, more detailed objects.
 
They are not showing hundreds of billions of polygons. Pure marketing speak by Epic.
"There are over a billion triangles of source geometry in each frame that Nanite crunches down losslessly to around 20 million drawn triangles."

Time-stamped:




EDIT: Just noticed that this had already been covered, but repetition is the mother of learning, right?
 
Last edited:
On another note guys....how about that Ghost of Tsushima reveal today? From the past trailers I honestly expected them to push the game out further and finally announce it was actually a PS5 title, but it's all PS4 and looks AMAZING. Of course, I can see some differences with that UE5 demo from yesterday, but still. This thing looks IMPRESSIVE. As long as it performs in game like we're seeing in these trailers.....just.....WOW. I am so hyped for this game. I was REALLY waiting for TLOU2 but the "leaks" have tempered my excitement a good bit (unfortunately I wasn't able to avoid spoilers, at least the written kind). I'm STILL hoping that the leaks came from an earlier version of the game and story but we'll see.

In any case, I'm still hyped for TLOUS2 and getting more hyped for Ghost of Tsushima by the day! Anyone else?

Me watching the 'State of Play' today!

The thing is, none of Sucker Punch's games have been powered by third-party middleware engines. They built theirs for the first Sly Cooper game and have been rolling with it ever since, making steady and meaningful improvements as new games were developed. This is the state the engine is in now with Ghost of Tsushima, and it's mighty impressive for a late-gen showcase title!!!
 
The point of Nanite is that if you were to spend however many seconds it took to render a frame that literally contained billions of triangles from high-quality source assets, and then fed those same assets into Nanite and let it losslessly crunch them down so the GPU only had to end up rendering a fraction of them, the resultant frames would be identical pixel for pixel.

There’s no point having a thousand triangles hiding behind a single pixel that will eventually become some single RGB value if you can crunch down the geometry before that stage to almost a single triangle that ends up giving you the same RGB value.

Nanite in UE5 means you can sling your highest quality assets in and have them rendered “as if” the GPU was crunching the whole lot in real time as far as the final frame is concerned.

That means for practical purposes there really are billions of triangles in that single frame (at least in a data structure before Nanite goes to work), and the closer you move in to take a look, the more of them you’ll see from a given asset.

That’s incredible. It means practically infinite detail, infinite draw distance, zero pop-in, zero loading. Truly next-gen in a way that slapping higher resolution and even frame rate on what we already have just isn’t.
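The "crunch down to roughly what the pixels can show" intuition above can be sanity-checked with simple arithmetic. The ~20M drawn triangles is Epic's own figure; the 1440p resolution and the one-triangle-per-pixel budget below are illustrative assumptions, not how Nanite actually decides:

```python
# A renderer gains nothing from drawing far more triangles than there
# are pixels, so a useful drawn-triangle budget is roughly
# pixels * (a small constant), regardless of source polycount.
def drawn_triangle_budget(width, height, tris_per_pixel=1.0):
    return int(width * height * tris_per_pixel)

budget = drawn_triangle_budget(2560, 1440)          # ~3.7M at 1 tri/pixel
tris_per_pixel_in_demo = 20_000_000 / (2560 * 1440) # ~5.4 tris per pixel
print(budget, round(tris_per_pixel_in_demo, 1))
```

So "a billion source triangles crunched to ~20 million drawn" is simply the screen dictating the budget: a handful of triangles per pixel is all a frame can ever show.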

It’s the geometry side of adding detail just simply DONE.

Was this really what Cerny meant when he said “when triangles are small...”? Was he talking literally with this technology in mind?

A lot of effort is being spent cynically dismissing what Cerny and Sweeney are saying as just marketing, or even somehow political in nature.
They are both proven and successful game engine engineers.
If they can be dismissed as just being clever at marketing and misleading the public with the UE5 demo (even at the expense of alienating all other platforms/customers in the case of Epic), then gaming journalists and technodabblers and babblers here can be dismissed as knowing far less about the subject than either of these two men.
 

THE:MILKMAN

Member
"There are over a billion triangles of source geometry in each frame that Nanite crunches down losslessly to around 20 million drawn triangles."

Time-stamped:



This is the single quote that describes most of this in a nutshell.

My follow up question is how do they get >98% compression losslessly? The details around this were still NDA'd according to that other guy on Twitter but hope we get more info today (Epic said more this week?).

Also, what is the maths to work out how many gigs a second 20 million triangles + 8K textures require? The maths just goes way over my head....
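As a very rough upper bound on the question above (the per-triangle size is my guess for illustration, not Epic's number, and in practice most geometry persists in RAM between frames, so real streaming demand is far lower):

```python
# Pathological worst case: stream every drawn triangle fresh each frame.
tris_per_frame = 20_000_000   # the demo's drawn-triangle figure
bytes_per_tri = 12            # assumed: heavily quantized/compressed
fps = 30

geometry_bps = tris_per_frame * bytes_per_tri * fps
print(geometry_bps / 1e9, "GB/s")  # 7.2 GB/s in this worst case
```

Even this deliberately extreme scenario lands in the same ballpark as the raw throughput of these SSDs, which is why caching and reuse matter so much.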
 
Last edited:
No reasoning. Sometimes things are just different :messenger_tears_of_joy:




The implication by some gamers and journalists that Sony and Cerny have gone to the effort and expense of making PS5's IO so much faster than anything before (in throughput and latency across the entire storage-to-GPU-cache pipeline) for no tangible benefit over a cheap conventional setup, and without it enabling anything new or significant, is utter lunacy. It's staggeringly arrogant ignorance when a game engine architect and engineers from a platform-agnostic, highly successful technology company are literally telling you it is vital to what they have demonstrated, and have actually spent years developing the tech to demonstrate it.
Those people are the ones who should be casually dismissed. They are not talking from any position of authority. They can only see things through the lens of how things have always been done. They're journalists and not technology innovators for a reason.
 

FeiRR

Banned
The point of Nanite is that if you were to spend however many seconds it took to render a frame that literally contained billions of triangles from high-quality source assets, and then fed those same assets into Nanite and let it losslessly crunch them down so the GPU only had to end up rendering a fraction of them, the resultant frames would be identical pixel for pixel.

There’s no point having a thousand triangles hiding behind a single pixel that will eventually become some single RGB value if you can crunch down the geometry before that stage to almost a single triangle that ends up giving you the same RGB value.

Nanite in UE5 means you can sling your highest quality assets in and have them rendered “as if” the GPU was crunching the whole lot in real time as far as the final frame is concerned.

That means for practical purposes there really are billions of triangles in that single frame (at least in a data structure before Nanite goes to work), and the closer you move in to take a look, the more of them you’ll see from a given asset.

That’s incredible. It means practically infinite detail, infinite draw distance, zero pop-in, zero loading. Truly next-gen in a way that slapping higher resolution and even frame rate on what we already have just isn’t.

It’s the geometry side of adding detail just simply DONE.


Was this really what Cerny meant when he said “when triangles are small...”? Was he talking literally with this technology in mind?

A lot of effort is being spent cynically dismissing what Cerny and Sweeney are saying as just marketing, or even somehow political in nature.
They are both proven and successful game engine engineers.
If they can be dismissed as just being clever at marketing and misleading the public with the UE5 demo (even at the expense of alienating all other platforms/customers in the case of Epic), then gaming journalists and technodabblers and babblers here can be dismissed as knowing far less about the subject than either of these two men.
Okay, let's assume it works the way you described. I just have one doubt: where is all that data coming from? I mean, if you start zooming in, it has to get more detailed triangle data from that higher resolution source, same for textures. Where is the source then? One billion triangles would be several terabytes of data. It's not going to sit on the SSD.
 
Last edited:
My follow up question is how do they get >98% compression losslessly?

Lossless is easy when you consider everything eventually just ends up deciding what finite value to set an RGB pixel to.

As an extreme example, rendering a castle from a mile away so that it only takes up one pixel of your screen ultimately gets crunched down to a single 24-bit RGB integer value

You could spend ages going through each triangle and summing them all up until that single pixel is worked out to be 238,139,43 (for example), or you could use Nanite (and its secret sauce) to infer what colour all of that would ultimately end up as, and draw just a single triangle coloured 238,139,43 at the same location, without even a fraction of the rendering costs.

The castle has to be losslessly whittled down into something that has the same final effect on a monitor.
238,139,43 is the same as 238,139,43 so in that sense Nanite managed to reduce geometry without any loss from the point of view of the pixel.

This is all simplified to explain how it can be lossless, and the general idea behind Nanite.

How they’re doing that is another matter entirely. As is whether it’s truly lossless—final pixel value for final pixel value—or pretty much as good as lossless.
 
Last edited:
Okay, let's assume it works the way you described. I just have one doubt: where is all that data coming from? I mean, if you start zooming in, it has to get more detailed triangle data from that higher resolution source, same for textures. Where is the source then? One billion triangles would be several terabytes of data. It's not going to sit on the SSD.

The statues in the UE5 demo wouldn't need to be duplicated in storage or in working memory, for one. Vertex data also compresses extremely well (most likely the kind of data Cerny meant when he mentioned 22GB/s). Each vertex can be stored relative to the last rather than as an absolute value. There will be long strings of repetition. All stuff that compression loves.
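The delta-encoding idea can be demonstrated in a few lines (a toy illustration with made-up vertex data, not Epic's or Sony's actual format): positions stored relative to the previous vertex become small, repetitive values that a general-purpose compressor loves.

```python
import struct
import zlib

# Toy vertex data: quantized integer positions along a smooth surface,
# so neighbouring vertices are close together.
verts = [(i * 10, i * 10 + 3, 1000 - i) for i in range(10000)]

def pack(triples):
    """Pack (x, y, z) integer triples as little-endian 32-bit ints."""
    return b"".join(struct.pack("<3i", *t) for t in triples)

raw = pack(verts)

# Delta-encode: store each vertex relative to the previous one.
# Here the deltas are the same small triple over and over.
deltas = [verts[0]] + [
    tuple(a - b for a, b in zip(verts[i], verts[i - 1]))
    for i in range(1, len(verts))
]

raw_z = zlib.compress(raw, 9)
delta_z = zlib.compress(pack(deltas), 9)
print(len(raw), len(raw_z), len(delta_z))  # delta stream compresses far better
```

Real mesh codecs add index reordering and quantization on top, but the core win is exactly this relative encoding.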
 

Corndog

Banned
"There are over a billion triangles of source geometry in each frame that Nanite crunches down losslessly to around 20 million drawn triangles."

Time-stamped:




EDIT: Just noticed that this had already been covered, but repetition is the mother of learning, right?

Ya. It’s still wrong no matter how many times it gets brought up. And notice how I’m disputing hundreds of billions.
 
To all those naysayers, FUD-spreading people and self-proclaimed experts: "People said this stuff was impossible to do, but then a guy appeared who didn't know it was impossible and just did it."
I watched Nvidia's Ampere reveal yesterday. These lunatics have shown some really crazy stuff with their new GPU architecture. Be prepared to be mind-blown in the next few years once you witness what's possible.
I'm quite glad that AMD is finally catching up to Intel and Nvidia. They are forcing both companies to improve their performance gains for new iterations of CPUs and GPUs.
Nvidia's AI hardware acceleration is really astounding; I'm expecting (contextual) AI to be everywhere by the end of this decade.

Competition is the best thing that could happen to us consumers. :messenger_smiling_hearts:
 
Did somebody see this? EA can't help it.


Logically, which dev is going to give away its work when it can get income? MS wants to introduce day-one Game Pass games and Smart Delivery now, and will have to pay for it or limit itself to indies or its own AA games. That's not sustainable with AAA development costs.
 
Last edited:
To all those naysayers, FUD-spreading people and self-proclaimed experts: "People said this stuff was impossible to do, but then a guy appeared who didn't know it was impossible and just did it."
I watched Nvidia's Ampere reveal yesterday. These lunatics have shown some really crazy stuff with their new GPU architecture. Be prepared to be mind-blown in the next few years once you witness what's possible.
I'm quite glad that AMD is finally catching up to Intel and Nvidia. They are forcing both companies to improve their performance gains for new iterations of CPUs and GPUs.
Nvidia's AI hardware acceleration is really astounding; I'm expecting (contextual) AI to be everywhere by the end of this decade.

Competition is the best thing that could happen to us consumers. :messenger_smiling_hearts:
Jen-Hsun, is that you? Are you OK?
Wake me up when you can ray trace 3.6 million polys per frame.
Suddenly RT is obsolete! SVOGI for the win!
 
Last edited:
Okay, let's assume it works the way you described. I just have one doubt: where is all that data coming from? I mean, if you start zooming in, it has to get more detailed triangle data from that higher resolution source, same for textures. Where is the source then? One billion triangles would be several terabytes of data. It's not going to sit on the SSD.

Obviously, depending on the game you are trying to make, you will have set some max polycounts for each handcrafted model.
Now, one solution to save space could be procedurally generated geometry and models. Dunno how well that would work, but I guess it will keep getting better.
So you'd only need some base/reference model and could recreate/morph it into different shapes on the fly.
I guess AI cores could be used for such things in the future.
However, I'm no expert. I didn't finish my university degree in computer science or physics. Too much weed and other stuff going on in my life back then.

So call me a dreamer or an idiot if that's not going to be possible :messenger_grinning_sweat:

Jen-Hsun, is that you? Are you OK?
Wake me up when you can ray trace 3.6 million polys per frame.
Suddenly RT is obsolete! SVOGI for the win!
Nah, I'm expecting AMD to show some awesome stuff too. I just haven't seen any official RDNA 2 specs yet. :)
 
Last edited:

Thirty7ven

Banned
Ray tracing is cool, but considering the GI in the UE5 demo looks so fucking brilliant and offers much better performance, it becomes a bit of a moot point. Getting more accurate results is cool and all, but it's all smoke and mirrors in the end; what you want is perceived approximation. There's no value added by having a waterfall be physically calculated in a video game if you can get similar perceived results at much better performance.

XSX was doing Minecraft at 1080p hovering around 40(?) fps, and that's as basic as geometry gets. I'm supposed to be more impressed by that than by what a system like Lumen achieves? It's hard to wrap my head around that.
 
Last edited:

user1337

Member
Question about smart delivery. If there is no time limit:

- This kind of guarantees that next-gen games will cost exactly the same as current-gen games, right?

Otherwise what's stopping me from buying the "cheaper version" and unlocking the more expensive version for free??

- would that then also mean that price reductions for games will be done at the same rate and time for Xbox games across both gens?

Otherwise again, I can just find the cheaper copy that has a bigger discount and get the more expensive copy free.
 
Last edited:

Vae_Victis

Banned
One billion triangles would be several terabytes of data.
I'm not a 3D artist, but that to me sounds off by several orders of magnitude, especially when you consider models are also compressed in the SSD.

You should at minimum need 3 floating-point numbers per vertex. I found, for example, that in Maya models each of those floats is 4 bytes, so in total 1 vertex costs 12 bytes.

Then you need to map faces (triangles) to the vertices, which should cost in the region of 6 bytes each as far as I could find.

Since the overwhelming majority of triangles reuse the same vertices, we might be talking about perhaps 1.5 billion vertices for a 1-billion-triangle model, or even closer to the number of triangles (is there a formula to generalize it?).

So you should have (1B x 6) + (1.5B x 12) = 6B + 18B = about 24B bytes, or 24GB

Again, this is all uncompressed. Tools like Draco and Open3DGC can easily reduce model sizes by a factor of 30-50, so we would really be talking about something like 800MB.
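The estimate above can be written out as a quick script, reusing the post's own assumptions. (One caveat: 6 bytes per face implies 16-bit-class indices, which couldn't actually address 1.5 billion vertices, so treat this as a rough lower bound.)

```python
# Reproducing the post's uncompressed size estimate for a 1B-triangle model.
triangles = 1_000_000_000
vertices = int(1.5 * triangles)  # post's guess, accounting for shared vertices

bytes_per_vertex = 12  # 3 floats (x, y, z), 4 bytes each
bytes_per_face = 6     # per the post

total = triangles * bytes_per_face + vertices * bytes_per_vertex
print(total / 1e9)        # 24.0 GB uncompressed
print(total / 1e9 / 30)   # ~0.8 GB if a 30x mesh compressor holds up
```

Either way, the answer to the quoted doubt is the same: one billion triangles is tens of gigabytes uncompressed, not terabytes, and well under a gigabyte after mesh compression.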
 
Last edited:

J_Gamer.exe

Member
Yes, any engine can be adapted to a platform and it will be here. But...

"The strengths of one platform can be made to work on others" doesn't make any sense. It's not a [relative] strength if it can be done on the other platform.

The XSX's strengths are different to the PS5's strengths. If the PS5 can move ~2x the data from storage to memory, you can't use the XSX's ability to push more pixels to also move 2x the data from storage to memory; they're different things, they're different strengths.

The PS4/X1 AC game eg. doesn't hold weight as that is a matter of different APIs and hardware blocks; that's incompatibility, not incapability. With a base PS4 and X1, the PS4's strength is a stronger GPU, you can't make that the X1's strength. Just as the X1X has a stronger CPU/GPU over the PS4Pro, you can't make that the PS4Pro's strength, because it just isn't. You can make compromises in some areas to achieve parity in others. Also, trading off resolution, performance and graphics in an X1 or PS4 multiplatform game is often functionally inconsequential. SSD speed however provides a predominantly more functional benefit.

Say for eg. you're the flash in a super hero game and you want to traverse from one end of the map to the other at 200mph and with a given asset complexity/density you need 9GB/s to do it that fast. On PS5 this can be done, on XSX you only have 4.8GB/s so your choice is to drop traversal close to 100mph (a functional, gameplay compromise) or drop asset complexity/density to fit the 4.8GB/s budget (a visual, presentational compromise); or a middleground between the two. A compromise has to be made in that scenario.
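The flash scenario above is just a linear scaling, which makes it easy to check (the numbers come from the post itself; the linear model is a simplification since real engines also cache and prefetch):

```python
# If asset density is fixed, traversal speed scales with streaming bandwidth.
def max_speed_mph(bandwidth_gbps, required_gbps=9.0, target_mph=200.0):
    return target_mph * bandwidth_gbps / required_gbps

print(max_speed_mph(9.0))  # 200.0 in the PS5 scenario
print(max_speed_mph(4.8))  # ~106.7 on XSX, i.e. "close to 100mph"
```

The same function also shows the other way out of the compromise: keep 200mph and cut asset density to 4.8/9 ≈ 53% of the PS5 budget.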

That ~2x throughput is the strength of the PS5, it's a strength the XSX doesn't have; therefore you can't make that strength work on the latter. You could argue that they could optimise code to be more efficient and find workarounds, but there's nothing distinctive enough about the nature of either to make me think such optimisations wouldn't be applicable to both. Even looking at XSX (Compressed) vs PS5 (RAW); the PS5 is still 15% faster and that's before we get to latency, parallelism and coherency informing intelligent cache scrubbing.

Unless you're trading off a singular resource or set of resources (cpu, gpu) for a given set of capabilities (res, perf, gfx), saying the strength of one platform in one way can be made to be the strength of another platform in another way is like saying a weightlifter and a marathon runner are interchangeable in their respective fields.

Surely by that logic, if the SSD capabilities of the PS5 can be made to be the strength of the XSX then they can make the extra 18% of GPU compute the XSX has a strength on the PS5..?!

And if an AC game looks similar on X1 and PS4, a big part of that is that the only difference is graphical (usually resolution scaling), which isn't a functional, gameplay-impacting component as mentioned above; and that it's a multiplatform game where all the fundamentals are usually developed to a lowest common denominator, so by its very nature the strengths of at least one platform aren't being fully exploited. Also, it doesn't take into account the paradigm-shifting nature of the high-speed SSDs in the new consoles.

Now if UE5 for eg. is being used to develop an exclusive for PS5 and an exclusive for XSX; the PS5 game will be able to do things the XSX can't do in terms of traversal/transitions and asset density/complexity because it has a functional advantage with the SSD being able to move more data, the XSX will have to make a compromise in one, the other or to some degree both. There's nothing I've seen to suggest the XSX can do anything the PS5 can't. The PS5 will just do it with 20% less native pixels in a worst case scenario, something mitigated even further by reconstruction.

That flying example of the Xbox being half as fast assumes that the Xbox has the same IO hardware, or the same efficiency through its own methods.

It doesn't have the same hardware, and it's very unlikely it has eliminated the bottlenecks needed to get that 4.8 GB/s through to the usable end.

So in reality, this "half as fast" is, IMO, a lot less. The flying speed or the compromises could be a lot more than half, IMO.

Xbox has a roughly 40x faster SSD compared to last gen, and in that load demo it was approximately 4x as fast as the One X.

Now, last-gen games won't fully tap into the Xbox Velocity Architecture, I'm sure, but let's say they get it to 10x or even a bit more, faster than last gen. Very respectable in its own right.

But Sony has claimed their 100x faster SSD will translate to 100x faster at the other end.

So their SSD speed apparently isn't being lost.

This could be a huge factor in the differences.

If Xbox doesn't carry that less-than-half-as-fast SSD speed through to the usable end, it could be a lot less than half as fast at flying at identical quality.
 

Shmunter

Member
To be fair, it's just an admission to not knowing everything. I think that's ok. It certainly is much less condescending and confrontational than other reactions I saw.
How so? Didn’t he just say the PS5 ssd is not an advancement, just different?

Btw NX gamer just published his video on the demo. I bet you he did not ignore the part the ssd plays in streaming all that complex geometry so quickly. Will watch soon.

Edit: and here it is....lunacy that DF didn't cover the SSD aspect and how it works with the engine. I mean, Sweeney and co went on and on about how pivotal it is to all this. Lol, it's conspiratorial, but it's as if there is a concerted effort to take focus off the SSD, I mean seriously.

 
Last edited:
The implication by some gamers and journalists that Sony and Cerny have gone to the effort and expense of making PS5's IO so much faster than anything before (in throughput and latency across the entire storage-to-GPU-cache pipeline) for no tangible benefit over a cheap conventional setup, and without it enabling anything new or significant, is utter lunacy. It's staggeringly arrogant ignorance when a game engine architect and engineers from a platform-agnostic, highly successful technology company are literally telling you it is vital to what they have demonstrated, and have actually spent years developing the tech to demonstrate it.
Those people are the ones who should be casually dismissed. They are not talking from any position of authority. They can only see things through the lens of how things have always been done. They're journalists and not technology innovators for a reason.
There are other types of lenses in play here. This guy has been constantly downplaying the PS5's SSD advantages for months (and even the geometry engine Cerny mentioned), while oddly, constantly praising things such as VRS (which was the most overrated tech of next gen based on their own benchmarks: 10% better perf with easily visible artefacts. Since their own tests, nobody is talking about VRS anymore).

Now people realize the geometry engine of the PS5 might be even more important than ray tracing, based on actual demos. And Cerny talked at length about that in his speech: their narrow-and-fast design, the importance of the geometry engine (maybe custom on PS5, we'll see), their super-fast SSD, all of that in order to display tons of small triangles in big worlds.

And what was most impressive in the first real glimpse of next gen running on PS5? The incredible amount of triangles displayed on screen + nice global illumination + the fast-travel part at the end (still real-time). Not VRS, not ray tracing (Minecraft geometry at 1080p30fps lol), which was kinda disappointing running on an XSX.
 
Last edited:
Okay, this might be a bit of a reach (maybe a lot of a reach), but hear me out, folks. I was remembering this 4chan leak that was posted on Reddit back on January 9, when we were all expecting a PS Meeting in February. After reading it again, I think some things in it have been kind of accurate or even recently rumored (and some others haven't, like Sony going to E3 and the PS Meeting in February). I don't know if it was just a lucky guess or something else, but it's intriguing to say the least. There are still things that haven't been confirmed or denied, but some are starting to buzz lately (like RE 8).



Kinda Accurate/Recently Rumored Stuff:

  • Buzz words for the console's features include "little to no load times", "blazing fast downloads", "immersive controls", "modular installs for games, download whatever", "disc drive included", and "download the games, or stream the games as an option" (we're looking at you Stadia).
ps5-6-1584555776.png



(About the "stream the games as an option")
  • Backwards compatibility with all PS4 games is also a big feature. Through a new transfer feature, users will easily transfer their PS4 games to the PS5 if those games are downloaded. Save data/backups for PS4 games will also be transferable
PS5-backward-compatible-with-PS4.jpg

  • PS5 will launch worldwide in October 2020. Priced at $499 USD / £449 UK / €449 EU / ¥54,999 JP

(Yes, I know Sony already denied that date but just wanted to point out that the date was on the leak)
  • Specs to be almost on par with Xbox Series X (which will be $100 more), and more powerful than Xbox Lockhart (a console that's $100 less with 4TFlops of compute power compared to the PS5's 10TFlops)
SIpBCDp.jpg


The leak is probably a lot of BS, but just wanted to point out those parts.
 
Last edited:

B_Boss

Member
To be fair, it's just an admission to not knowing everything. I think that's ok. It certainly is much less condescending and confrontational than other reactions I saw.

Are you saying Alex’s response was his essentially saying he doesn’t know everything? To be fair to that point, I know for a fact he (as with the rest of the human race) does not “know all truths or facts“ lol.

It seems to me that he is at least implying that he knows or believes that there is no advancement with Sony’s SSD engineering.
 
Last edited:

J_Gamer.exe

Member
How so? Didn’t he just say the PS5 ssd is not an advancement, just different?

Btw NX gamer just published his video on the demo. I bet you he did not ignore the part the ssd plays in streaming all that complex geometry so quickly. Will watch soon.

Edit: and here it is....lunacy that DF didn't cover the SSD aspect and how it works with the engine. I mean, Sweeney and co went on and on about how pivotal it is to all this. Lol, it's conspiratorial, but it's as if there is a concerted effort to take focus off the SSD, I mean seriously.



I watched the DF video and was waiting for the SSD and io analysis which epic themselves said was integral to it running, it never came.

Absolutely staggering omission.
 

Vae_Victis

Banned
Okay, this might be a bit of a reach (maybe a lot of a reach), but hear me out, folks. I was remembering this 4chan leak that was posted on Reddit back on January 9, when we were all expecting a PS Meeting in February. After reading it again, I think some things in it have been kind of accurate or even recently rumored (and some others haven't, like Sony going to E3 and the PS Meeting in February). I don't know if it was just a lucky guess or something else, but it's intriguing to say the least. There are still things that haven't been confirmed or denied, but some are starting to buzz lately (like RE 8).



Kinda Accurate/Recently Rumored Stuff:

  • Buzz words for the console's features include "little to no load times", "blazing fast downloads", "immersive controls", "modular installs for games, download whatever", "disc drive included", and "download the games, or stream the games as an option" (we're looking at you Stadia).
ps5-6-1584555776.png



(About the "stream the games as an option")
  • Backwards compatibility with all PS4 games is also a big feature. Through a new transfer feature, users will easily transfer their PS4 games to the PS5 if those games are downloaded. Save data/backups for PS4 games will also be transferable
PS5-backward-compatible-with-PS4.jpg

  • PS5 will launch worldwide in October 2020. Priced at $499 USD / £449 UK / €449 EU / ¥54,999 JP

(Yes, I know Sony already denied that date but just wanted to point out that the date was on the leak)
  • Specs to be almost on par with Xbox Series X (which will be $100 more), and more powerful than Xbox Lockhart (a console that's $100 less with 4TFlops of compute power compared to the PS5's 10TFlops)
SIpBCDp.jpg


The leak is probably a lot of BS, but just wanted to point out those parts.

How would anyone back in January even know the RRP of not only the PS5, but also Xbox Series X and Lockhart, when reports today say not even Sony and Microsoft have finalized them?

RtXcgsf.jpg
 
Last edited: