
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

kungfuian

Member
To think this Unreal 5 demo is only the start of next gen. Can't wait to see other engines and games using these new techniques once they've matured a bit, especially on the PC end, when hardware gets built around them even more.

Like look at how ray tracing and machine learning are maturing so quickly. Sure, this is using a Quadro RTX 8000, but the future of real-time graphics is just mind-blowing!

 
From the DF video: 1080p@29.97fps

148942.jpg

I know what that's doing is impressive, but it's doing it in Minecraft; there isn't any real game going on behind it. If it's not a stable 30fps with that game, how would this perform with something like Halo or Gears? Especially with MS's first-party focus on 60fps?
 

Bo_Hazem

Banned
I know what that's doing is impressive, but it's doing it in Minecraft; there isn't any real game going on behind it. If it's not a stable 30fps with that game, how would this perform with something like Halo or Gears? Especially with MS's first-party focus on 60fps?

If you put all the tech and sorcery available in the world into Minecraft, I still won't be impressed. This game makes me wanna vomit when I look at it; I can't wrap my head around how someone would attempt to play it with the likes of Dreams or LittleBigPlanet around, which are more refined and creative.

But it's pretty famous, which makes me question my existence on Earth. Better reach out to Elon Musk to sign me up for his Martian mission.
 
Last edited:

Bo_Hazem

Banned
To think this Unreal 5 demo is only the start of next gen. Can't wait to see other engines and games using these new techniques once they've matured a bit, especially on the PC end, when hardware gets built around them even more.

Like look at how ray tracing and machine learning are maturing so quickly. Sure, this is using a Quadro RTX 8000, but the future of real-time graphics is just mind-blowing!



It's wonderful, indeed, but here's what I noticed (note: I'm watching it on my 55" 4K TV from about 2.5m away):

-Resolution is dropping quite often, both partially (VRS) and overall (dynamic).
-The main marble ball is insanely full of noise and particles.
-There is sudden stutter and buffering at some parts, with frame rates dropping below 30fps.
-Anti-aliasing is not doing a great job at the end, with ladder effects visible.

Overall, I still think it's very good tech, but I think we should start to see graphics cards with massive RAM (VRAM), or even going the extra mile and having dedicated memory inside them to hold the whole game, even if temporarily, with an ultra-fast SSD outside to offload to when done.

EDIT: (enlarge images to see them up close)

Shitty ball quality at some locations (whether it's ray tracing, VRS, etc.)

148963.jpg


Severe ladder effect caused by lowered (partial, VRS) resolution

148964.jpg


Insanely sharp, flawless quality at the beginning

148965.jpg
 
Last edited:

Shmunter

Member
I swear a lot of you are just completely ignoring how scalable Epic claimed the engine is: how you could make a game targeting "next gen high end consoles" that works fine on a cell phone.

That's the huge news for developers to me. It kind of squashes the notion that you have to ignore older hardware to really take advantage of the massive I/O of the PS5 SSD... and completely contradicts what some of you are saying about developers ignoring other systems in favor of PS5.
The engine is scalable to fit the hardware. Results achieved on the top end may not be realistic on the lower end. For example, it's common sense not to expect games designed for a 2080 Ti to run anywhere close on an iPhone 6, even if it's the same engine and tools; it's a wide spectrum.

The closer the hardware, the more of the same features can be deployed and the closer the result. And vice versa in the opposite direction.

If the UE5 demo shown is indeed munching 5.5 gig as fundamental to that demo, then it's all downhill from there on competing hardware. Acknowledging the difference may not be super massive, just not optimal.
 
Last edited:

D.Final

Banned
To think this Unreal 5 demo is only the start of next gen. Can't wait to see other engines, and games using these new techniques once they have matured a bit. Especially on the PC end- when hardware gets build around them even more.

Like look at how raytracing and machine learning is maturing so quickly. Sure this is using a Quadro RTX 8000, but the future of real time graphics is just mind blowing!



Cool video
 

HAL-01

Member
It's wonderful, indeed, but here's what I noticed (note: I'm watching it on my 55" 4K TV from about 2.5m away):

-Resolution is dropping quite often partially (VRS) and overall (dynamic).
This reminds me: for the Unreal demo we've been told the resolution is dynamic, hovering around 1440p. Do we have any numbers for the top and minimum resolution? Is it also doing any sort of special 4K upscaling? I'd wager that by the time the engine releases and all the kinks are worked out, we could maybe achieve a higher average res on the same demo.
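For anyone unfamiliar, dynamic resolution scaling is usually driven by frame time: the renderer lowers resolution when the last frame blew its budget. A minimal, hypothetical sketch follows; the constants and function names are made up for illustration and are not from UE5 or any real engine.

```python
# Hypothetical frame-time-driven dynamic resolution controller.
# All names and constants are illustrative, not from any real engine.

TARGET_MS = 33.3           # 30 fps frame budget
MIN_H, MAX_H = 1080, 2160  # resolution window the scaler may use

def next_height(current_h: int, last_frame_ms: float) -> int:
    """Pick the next render height from how far the last frame missed budget."""
    # GPU cost grows roughly with pixel count, i.e. with height squared,
    # so correct by the square root of the time ratio.
    scale = (TARGET_MS / last_frame_ms) ** 0.5
    # Damp the correction so the image doesn't visibly pump every frame.
    scale = 1.0 + 0.5 * (scale - 1.0)
    return max(MIN_H, min(MAX_H, int(current_h * scale)))

h = next_height(1440, 40.0)  # frame over budget -> height drops below 1440
```

With a controller like this, the "top" and "minimum" resolutions are just the clamp values, which is why pixel counts from screenshots only ever sample what the scaler happened to pick in that shot.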
 

Bo_Hazem

Banned
This reminds me: for the Unreal demo we've been told the resolution is dynamic, hovering around 1440p. Do we have any numbers for the top and minimum resolution? Is it also doing any sort of special 4K upscaling? I'd wager that by the time the engine releases and all the kinks are worked out, we could maybe achieve a higher average res on the same demo.

The way I see it, that ladder effect means it's most likely dropping down to 1080p, if not lower, in places (VRS). I don't like those artifacts, and the main playable marble ball was pretty crappy quality later on.

First of all, this is also using fewer pixels than the original 16:9 aspect ratio; it's a cinematic 2.39:1 ratio, so even 1080p is much less than that. I can see Jensen pushing so hard as he's obviously not happy with UE5 showing how weak/outdated the PC architecture is. The AMD Infinity Architecture that PaintTinJr has brought to the table might do something to fix some of the bottlenecks in current PCs.

ElCap_13.jpg


IF%20v1.jpg


IF%20v2.jpg


As far as I can tell, it's more workstation/server-focused so far, but you can see how it should be superior to PCIe 4.0.

Our friend PaintTinJr should assist with better knowledge on the matter.
 
Last edited:

kensama

Member
The way I see it, that ladder effect means it's most likely dropping down to 1080p, if not lower, in places (VRS). I don't like those artifacts, and the main playable marble ball was pretty crappy quality later on.

First of all, this is also using fewer pixels than the original 16:9 aspect ratio; it's a cinematic 2.39:1 ratio, so even 1080p is much less than that. I can see Jensen pushing so hard as he's obviously not happy with UE5 showing how weak/outdated the PC architecture is. The AMD Infinity Architecture that PaintTinJr has brought to the table might do something to fix some of the bottlenecks in current PCs.

ElCap_13.jpg






As far as I can tell, it's more workstation/server-focused so far, but you can see how it should be superior to PCIe 4.0.

Our friend PaintTinJr should assist with better knowledge on the matter.



Some terms remind me of Cerny and the PS5, like the coherency engine, low-latency GPU/CPU, and unified memory across CPU/GPU. Is that what Cerny meant when he said we'd see some PlayStation features in discrete AMD GPUs in the near future?
 

Shmunter

Member
@SlimySnake

Did you see Dictator's post in the NXgamer thread? Absolutely hilarious, here it is:





Even though in the first few minutes of the UE5 reveal the developer specifically states that all of the polygon detail is made possible by the SSD streaming polygons in, he's still on here trying to downplay it as though "we don't know how it scales". He even tries to downplay the SSD by saying that there's not as much normal map data and so forth. He even goes on to say specifically that UE5 should scale with GPU power, to try and paint the narrative that the SSD isn't hugely responsible for this, yet somehow a 15% difference in GPU power will be an enormous scaling of power.

It's amazing the lengths he's going to downplay Sony's SSD, though completely predictable.
It’s not good to be blinded by bias in a position of influence. Highly unprofessional.

I’m not sure why DF is not remedying this. It reflects poorly on their body of work as a whole to continue like this.

But here’s the prediction: there will come a point where the SSD's influence on game engines and design can simply no longer be refuted. I know we’re pretty much there already, but nevertheless, Dictator can simply fix it by officially acknowledging his past skepticism and acknowledging he is now a believer. Done deal; they can move on from this.

Edit: there is also the potential of undisclosed cash fuckery from MS. If that ever turns out to be true, pack your bags, it’s over.
 
Last edited:

Bo_Hazem

Banned
Some terms remind me of Cerny and the PS5, like the coherency engine, low-latency GPU/CPU, and unified memory across CPU/GPU. Is that what Cerny meant when he said we'd see some PlayStation features in discrete AMD GPUs in the near future?

I can smell the same, probably in a slightly different way? Not sure. I'm more than excited about their stacked chiplets; I can see PS5 Pro using something smaller and similar, combining the equivalent of 2x APUs like PS4 Pro:

X3D-14_678x452.jpg


Not sure about heat management, but those cooling pipes penetrating through the APU are already interesting given the leaked info about the PS5 cooling solution. We shall wait and see; I don't understand it well enough yet.
 
Last edited:

Shmunter

Member
This was running at 1080p@30FPS with ray-traced global illumination, shadows and reflections enabled all at the same time right? REALLY interested to see how PS5 does hardware-accelerated triangle-based ray-tracing. Guess we'll find out June 4th. (Hopefully)
Wild prediction: Gran Turismo where the gameplay is ‘photo mode’ quality and can pass for real life footage
 

Radical_3d

Member
The great awakening.......begins
This is the first guy to leave his position in this trench war. PC will be there in a few months, but until then, as fast as their RAID SSDs can be, they can't compensate for the overhead in the rest of the operations. The consoles have dedicated hardware to decompress the data they get from the drive, and any HW-based solution runs circles around what a PC CPU can do. And on top of that, the PS5 has a HW solution for everything else. I think the next generation of GPUs will have those features that are now exclusive to consoles. Moore's Law mentioned that in a video.
 

FeiRR

Banned
My theory is still that Sony promised Epic that Sony's upcoming games for PC will be exclusive to EGS, or that Sony paid Epic a decent amount of money for this partnership.

I very much doubt Epic would spend thousands of hours producing a tech demo of this sort, just because the CEO is a "nerd" who gets "excited" about this stuff.
Unreal is used by numerous dev studios, although not Sony's first party. Still, marketing it as an easy tool to develop with is in Epic's best interest. Just taking into consideration their own Fortnite, that means tens of millions of people using their tech on Sony's consoles. That partnership will print them a lot of money.

I think Microsoft might have a similar demo at their event, but it won't be the same thing. Maybe a Gears 6 teaser? They depend on UE even more, including their first party. If they don't show their own take on that technology, they'll lose a lot of ground.

Microsoft ran their marketing through Dictator's and Richard's mouth.

Sony marketed their machine through EPIC's skills.
I just realized how bad it is. Microsoft has a lot of money... no, I'm not going the war-chest way. They in fact use a lot of money to market their products. Remember that eerie Cirque du Soleil show for the X1? Remember Reeves at last E3? That's what I mean. And now what: a bunch of tech-illiterate clowns who run a semi-popular YT channel? Streams from bedrooms with $5 cameras while the biggest feature was changing backgrounds in Skype? They can do much better, at least in the marketing department.

Interesting bit. I'm not too familiar with the process of acquiring engines, but is there any possibility that exclusive first-party developers already have, or will have, UE5 prior to its official launch in 2021? Surely there would be more of an incentive for them to get it early, especially if Sony / Xbox are working with Epic?
Of course; that's why they give it free to small dev studios. User input is the best input for testing new features. UE5 will gain market share like crazy. It saves small and mid-level studios piles of cash, which will be a godsend in the current situation.

It seems those "world tours" Cerny made to ask developers what they wanted paid off.
Now we can confirm Epic was one of them. Which other studios you guys think were visited by Cerny? (excluding the Sony first party ones which are kinda obvious)
I'm sure he went to Japan to see Squeenix, Capcom and Namco. He speaks Japanese, which probably opens a lot of doors.

Ok, from outside of this contest or whatever it is... From what I've seen of the demo and the comments about "scaling down", I think that both sides are correct in the crucial details. First, this demo COULD run on an Xbox Series X. Hell, it should be able to run on a base PS4.
This is an interesting part. They said it'll run on lesser hardware, but what about asset culling? Is Nanite software-based or hardware-based? In the latter case it will surely run, but you won't be able to use high-def assets just like that. Just one statue from the demo would choke current hardware to death. I'm not even starting on the GI implementation or Chaos physics. You'd see one frame per second or less, and PS4 Pro would launch for Mars.

I see that even tech-savvy people scratch their heads as to how Nanite works. No draw calls sounds like magic. Billions of dollars have been spent on optimizing the rendering path and asset workflows: DX12, Vulkan, all that "code to the metal" ruckus at the end of the last gen. And suddenly, along comes Epic and poof! Magic! I'm not proficient enough in 3D graphics to even guess how they did it, but my guess is: hardware-based. We need to know more, and I'm sure they'll tell us about it soon. If they have a patent for that, other engine devs might have a problem. But I bet that Sony's first parties are currently implementing similar solutions in their own engines.

So tinfoil hat theory time on my part, is the weird storage amount on the PS5 due to a possible raid like setup and would that also lead into the 12 channels?

Thanks for indulging my secret sauce comment. 🤗
RAID means you sacrifice half your capacity for performance or data safety (using redundancy). They certainly don't need redundancy, so it'd be the former. Then the PS5 SSD would need to be 1650 GB, which I highly doubt... or around 400 GB, which I doubt even more. What they did is a bit similar to RAID because it uses multiple channels, but in a smarter way. Of course that solution has drawbacks: you're tied to a specific storage capacity, it's expensive, and possible expansions will cost an arm and a leg. But it's a good compromise for a console.
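For what it's worth, the mirroring arithmetic above checks out against the PS5's advertised 825 GB (illustrative numbers only):

```python
# Sanity check of the RAID capacity figures quoted above.
PS5_SSD_GB = 825  # advertised usable capacity

# RAID 1 (mirroring) halves usable space, so exposing 825 GB
# would take twice the raw flash.
raw_needed_if_mirrored = PS5_SSD_GB * 2   # 1650 GB

# Conversely, mirroring 825 GB of raw flash exposes only ~412 GB.
usable_if_mirrored = PS5_SSD_GB // 2      # 412 GB
```

Strictly, only the redundancy case halves capacity; striping across channels for performance does not, which is why neither figure matches the actual 825 GB.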

It will go live when Sony announces the price... they can do that up to September for a November release imo.
I think I know why Sony's event was pushed back even though they've confirmed the hardware/software side hasn't been much affected by the pandemic. If you looked at the exchange rates of major currencies like the dollar, euro and yen in March and April, you might have got a headache. The situation was very unstable and unpredictable. Now that the first psychological shock has passed (which has nothing to do with the actual situation, but that's a different topic), the global economy, though shaken, will resume its course, and so will financial forecasts and the ability to calculate all costs. The price they announce is there to stay for a few years, which means billions of dollars of profit or loss. So they wait.

It's wonderful, indeed, but here's what I noticed (note: I'm watching it on my 55" 4K TV from about 2.5m away)
You're watching a YT video with terrible compression. You shouldn't make any assumptions based on that.
 

ArcaneNLSC

Member
Japanese Analyst Holds PS5 First Year Prediction at 6 Million Units But Is Concerned About Color

Despite the COVID-19 emergency, Yasuda-san mentions that the demand for games has remained solid, so he holds firm on his previous prediction of 6 million units shipped during the current fiscal year, which will end on March 31, 2021. On top of that, he predicts 15 million units shipped during the following year (between April 1, 2021, and March 31, 2022).

The rest of Yasuda-san’s analysis includes a certainly rather unique take.

He admits that some have mentioned that PlayStation may be lagging behind in terms of marketing compared to the Xbox Series X, with raw specs inferior to the competition and games that haven’t been announced.

Yet, Ace Research Institute believes that sales won’t be decided primarily by games or performance (the firm believes that style and design are a more relevant factor) so it’s not meaningful to compare those at this point.

On the other hand, Yasuda-san argues that whether the main unit will be white might be a bigger issue (likely inspired by the design of the DualSense controller). He mentions that with the Wii, white wasn't well accepted among core gamers in higher age ranges, so if the console is white, that may affect how well the PS5 is accepted.



 

geordiemp

Member
If you're displaying one triangle per pixel, then Nanite's compute-based software rasterisation makes more sense and is more efficient. However, when you approach one triangle per ~4 pixels, the implication seems to be that it will fall back to hardware-based rasterisation and be more efficient in that respect.
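The crossover described above can be sketched as a simple path choice. This is purely illustrative; the 4-pixel threshold is the post's figure, not a confirmed engine constant, and the function name is made up.

```python
# Illustrative selection between a compute (software) rasteriser and the
# fixed-function hardware rasteriser, based on projected triangle size.
# Hardware rasterisers shade in 2x2 pixel quads, so pixel-scale triangles
# waste most of that work; a compute path avoids the quad overhead.

HW_EFFICIENT_AREA_PX = 4.0  # post's figure; hypothetical threshold

def pick_raster_path(projected_area_px: float) -> str:
    if projected_area_px < HW_EFFICIENT_AREA_PX:
        return "software"   # tiny triangles: compute rasteriser wins
    return "hardware"       # bigger triangles: fixed-function is faster

print(pick_raster_path(0.8))   # software
print(pick_raster_path(16.0))  # hardware
```

In a real renderer this decision would be made per cluster of triangles on the GPU, not per triangle on the CPU; the sketch only shows the shape of the heuristic.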

When it comes to RT, hardware is absolutely more effective across the board. But even with hardware acceleration you're only accelerating one of the required calculations (traversal of bounding volume hierarchies); the conventional GPU hardware still has to cast the rays, render the result and denoise, which could potentially steal a good chunk of ALU away from other functions. As mentioned before, I expect that in your average AA+ title, screen-space solutions will remain dominant and RT will be plugging the occlusion/off-screen leaks.

Nanite will be culling a crap-tonne (technical term!) of assets/geometry to remain efficient; and for RT to work you'll need those for reference. So they'll likely have to find a way to store and easily access those assets/geometry, perhaps in a much-reduced fashion.

E.g., if you're trying to RT-reflect a rock that has been culled for being off-screen, you can't reflect it... There are still problems to be solved here.

I agree, I don't think you can use ray tracing and voxel Lumen together; it's one or the other.

Also, what if it's moving grass and trees? Does the technique still work efficiently beyond static rocks and cliffs?

They did not use the technique for the character (Lara).

My thoughts are that all we know so far is you can have film-like quality on static objects and can't use RT?
 
Last edited:
If the UE5 demo shown is indeed munching 5.5 gig as fundamental to that demo, then it's all downhill from there on competing hardware. Acknowledging the difference may not be super massive, just not optimal.
The demo is around 400 seconds long. They seem to have said 100s of billions of polygons were displayed; if true, that means at least 200 billion polygons were displayed. That is about 500M polygons per second, plus the multiple 8K textures, though there is some redundancy, so the streaming requirement might be less.

Someone estimated the 33-million-polygon statue was ~600 MB; if true, 500M polygons per second is 15x that, or about 9GB/s, which seems possible with Kraken compression. Though again, the demo likely streamed less than this as there was likely redundancy of assets, but it is possible there are sections of the demo where this was what was streamed.
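The arithmetic in the two paragraphs above can be reproduced directly. All inputs are the post's rough figures (demo length, polygon count, statue size), not measured data:

```python
# Back-of-the-envelope streaming estimate from the figures quoted above.
demo_seconds = 400
total_polys = 200e9                  # "100s of billions" -> at least 200 billion
polys_per_second = total_polys / demo_seconds   # 500 million/s

statue_polys = 33e6                  # the Nanite statue
statue_bytes = 600e6                 # ~600 MB estimate for that statue
bytes_per_poly = statue_bytes / statue_polys

stream_rate_gbs = polys_per_second * bytes_per_poly / 1e9
print(round(stream_rate_gbs, 1))     # ~9.1 GB/s before compression/redundancy
```

Asset reuse and compression would both pull the real figure down, which is the post's caveat.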

Is Nanite software-based or hardware-based? In the latter option, it will surely run but you won't be able to use high-def assets just like that.
I think they said Nanite is mostly compute on the GPU. Also, Lumen was not using the accelerated ray-tracing hardware. If Nanite and Lumen can be accelerated by primitive-shading h/w and ray-tracing h/w, the demo might run even better if optimized more.
 

BGs

Industry Professional
Maybe it's not very related, but I had to share it. Yesterday my 5-year-old son saw the explanation of UE5. I showed it to him because of the character, who was funny to him. I've also been teaching him basic 3D since he was 4, and he is interested. The thing is, I put the video on for him and the first thing he said to me was (in his words as a small child, of course): "Daddy, if only one triangle per square pixel is shown, then you only see half of the drawing?"

It is adorable. Sorry. Father's love.
 
Last edited:

HAL-01

Member
The demo is around 400 seconds long. They seem to have said 100s of billions of polygons were displayed; if true, that means at least 200 billion polygons were displayed. That is about 500M polygons per second, plus the multiple 8K textures, though there is some redundancy, so the streaming requirement might be less.
I don’t think there’s a way to accurately measure how much data the demo was streaming per second just from the info we’ve been given. We know the engine relies on aggressive geometry streaming, but we don’t know how much, or how long unseen geometry remains in memory. That, or I missed something in their explanation.
 

FeiRR

Banned
The demo is around 400 seconds long. They seem to have said 100s of billions of polygons were displayed; if true, that means at least 200 billion polygons were displayed. That is about 500M polygons per second, plus the multiple 8K textures, though there is some redundancy, so the streaming requirement might be less.

Someone estimated the 33-million-polygon statue was ~600 MB; if true, 500M polygons per second is 15x that, or about 9GB/s, which seems possible with Kraken compression. Though again, the demo likely streamed less than this as there was likely redundancy of assets, but it is possible there are sections of the demo where this was what was streamed.
I think I found at least two loading sections in the demo ;). The first one is around 5:10, when Lara enters the temple. We can see the doorway, but it's dark inside. When Lara enters the doorway, statues already loom in the dark. The other moment is when Lara leaves the statue room and heads towards the bright doorway. Then there is a short corridor, and the incredible landscape fades into view. This happens between 7:40 and 7:42 in the original video (which already has over 10M views on Epic's account and half of that on IGN, by the way). Of course loading happens all the time in the demo, I'm sure, but I think those are two crucial moments when the engine has to dump huge amounts of data at once and load another heap.

I think they said Nanite is mostly compute on the GPU. Also, Lumen was not using the accelerated ray-tracing hardware. If Nanite and Lumen can be accelerated by primitive-shading h/w and ray-tracing h/w, the demo might run even better if optimized more.
When I was looking for answers, I stumbled upon some drama involving primitive shaders in AMD's architecture from about two years ago, with a possible lawsuit because they had promised them but didn't deliver. I didn't have time to read about it. Does anyone know more?
 
I don’t think there’s a way to accurately measure how much data the demo was streaming per second just from the info we’ve been given. We know the engine relies on aggressive geometry streaming, but we don’t know how much, or how long unseen geometry remains in memory. That, or I missed something in their explanation.
There likely isn't; I'm just estimating. They said 100s of billions, not 100 billion; that means at least 200 billion polygons were in the demo, though it could be 300 billion or more. If it was all unique geometry, that means you have to load it in the ~400 seconds the demo lasts, which means at least 500M polygons per second. But there is geometry redundancy, so it's likely less, though it might still reach such figures in some portions.
 
I think I found at least two loading sections in the demo ;). The first one is around 5:10, when Lara enters the temple. We can see the doorway, but it's dark inside. When Lara enters the doorway, statues already loom in the dark. The other moment is when Lara leaves the statue room and heads towards the bright doorway. Then there is a short corridor, and the incredible landscape fades into view. This happens between 7:40 and 7:42 in the original video (which already has over 10M views on Epic's account and half of that on IGN, by the way). Of course loading happens all the time in the demo, I'm sure, but I think those are two crucial moments when the engine has to dump huge amounts of data at once and load another heap.
The most obvious one would be the initial crevice she squeezes through, and they said that was unnecessary for loading and merely an artistic choice. If true, that probably means these, too, are artistic choices and not needed for loading.
 

bitbydeath

Member
Just a little something:

Textures are generally square to make them easier to tile and break up for parallel computing, or for different quality levels within the same file. It’s a 1:1 aspect ratio image file. It’s a piece of data.

A 4K texture has nothing at all to do with a 4K game or TV.

One is literally a 4096x4096 image that gets pasted over a 3D model. The other is the final output resolution of a game, or TV, at 3840x2160 at 16:9.

The same applies to an 8K texture versus a new 8K TV. They are totally different things, and you don’t choose 4K textures for a 4K game and a “1080p” texture for a 1080p game, etc.

There’s no such thing as a 1080p texture or a 480p texture. There is such a thing as a 512x512 texture, a 1024x1024 texture, etc.
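For scale, the raw (uncompressed RGBA8, no mipmaps) sizes of the texture dimensions mentioned above work out as follows; real game textures are block-compressed and considerably smaller:

```python
# Raw memory footprint of square textures at 4 bytes per pixel (RGBA8).
def raw_texture_mb(side_px: int, bytes_per_pixel: int = 4) -> float:
    return side_px * side_px * bytes_per_pixel / (1024 * 1024)

print(raw_texture_mb(512))   # 1.0 MB
print(raw_texture_mb(1024))  # 4.0 MB
print(raw_texture_mb(4096))  # 64.0 MB   (a "4K texture")
print(raw_texture_mb(8192))  # 256.0 MB  (an "8K texture")
```

Which is exactly why texture resolution is a memory and streaming budget question, not a display resolution question.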

When you confuse the two or imply they’re connected to make some kind of a point, you just show that you don’t really understand what you’re talking about.

Also, nobody (that I can recall in this thread) ever said Nanite and Lumen wouldn’t run on XSX or PC. The point being made was that this particular “Lumen in the land of Nanite” scripted demo was made entirely for PS5 by leaning heavily on its IO capabilities, and wouldn’t be possible without changes on other hardware with slower IO, like XSX and PCs.
This is what Sweeney said when it was first revealed, this is what he recently was compelled to clarify.

The demo shown and the engine are two different things.
UE5 with its Nanite and Lumen technologies will do amazing things on PS5, XSX and high-end PCs with NVMe SSDs.
The specific UE5 demo shown is pushing Nanite hard and was only possible due to Sony’s IO in this instance.

It’s not hard. It’s what Sweeney said originally. It’s what an Epic spokesperson said when contacted by Kotaku. It’s what Sweeney again clarified on Twitter.

Separate the game from the engine.

UE5 works on pretty much all devices.
Lumen and Nanite can be used on all devices with the required resources (CPU/GPU/IO).
Lumen in the Land of Nanite can be run on anything with enough CPU/GPU/IO to keep up with the amount of assets in it and how fast the character moves through the world. Something Sweeney has repeatedly said is only possible with Sony’s IO.
It would make no sense at all to build a tech demo for PS5 and then only use less than 2.4GB/s raw storage speed. That’s not a technical demo. If LitLoN did use way less than 2.4GB/s raw storage speed you can bet they’d already have shown it running on XSX and PC as part of how amazing their engine is, which is what they’re really there to sell.
 

Md Ray

Member
They also forgot to remove the FPS indicator at the top left, it seems. It is absolutely, 100% PC footage.
But console footage can also have an FPS indicator; the leaked TLOU2 footage comes to mind. But yeah, as others have suggested, it looks to be PC footage. I think it still looks to be running at a lower res, perhaps 1080p. PS5 would have been higher res.
 

Handy Fake

Member
Maybe it's not very related, but I had to share it. Yesterday my 5-year-old son saw the explanation of UE5. I showed it to him because of the character, who was funny to him. I've also been teaching him basic 3D since he was 4, and he is interested. The thing is, I put the video on for him and the first thing he said to me was (in his words as a small child, of course): "Daddy, if only one triangle per square pixel is shown, then you only see half of the drawing?"

It is adorable. Sorry. Father's love.
Quite frankly he seems more technically minded than half the posters on this thread.
 

Danlord

Member
But console footage can also have an FPS indicator; the leaked TLOU2 footage comes to mind. But yeah, as others have suggested, it looks to be PC footage. I think it still looks to be running at a lower res, perhaps 1080p. PS5 would have been higher res.
Oh absolutely. I usually tend to see more debug information on consoles if they're showing anything more than the standard user interface, rather than just the FPS in the corner. It's been a while since I used the FPS indicator in the Steam Overlay, but is that not the same design (font, size, etc.) as the Steam Overlay?
 

Darius87

Member
[Screenshot: Tim Sweeney tweets]


Does that last tweet from Sweeney imply the UE5 demo uses Kraken for more than 5.5GB/s of streaming bandwidth? Or would it use it anyway, even if it's less than 5.5GB/s? More than 5.5GB/s would make sense considering it uses 8K textures; if so, other next-gen platforms will most likely use 4K textures, and I expect that will be the main difference between them.
 