
Next-Gen PS5 & XSX |OT| Console tEch threaD


Bo_Hazem

Banned
Yes when he talks about scalability he doesn't specifically say "this demo."

But do you really think Epic chose a demo, for an engine they boast is highly scalable, that literally would only ever work on PS5?

His comment about "at any scale" is just nonsense hype at face value. Why wouldn't you be able to do that demo at a lower scale on different hardware exactly?

The "at any scale" wasn't speaking about scalability, speaking about the same exact demo at any scale of high-end PC or console.

You can always use compressed 480p-1080p assets instead of Hollywood-level, uncompressed 8K assets, depending on the hardware's capability budget. Here: (timestamped)

 
If anyone is interested in the source of this: a mod from an Arabic forum chatted with the developer personally.

link : https://www.true-gaming.net/boards/index.php?threads/أخبار-مهمة-قادمة-غدًا-من-عالم-البلايستيشن-بدأ-البث.47394/page-11#post-1839356

The dev also said next-gen games will look much better than the UE5 demo, even at 60 fps.

Not much of a technical opinion, even going so far as to admit he can’t imagine what you’d ever need PS5 IO speeds to enable.
Do you know what game he's talking about? Or what studio he's working for? My instinct is it's a small studio licensing someone else's engine.
 

Corndog

Banned
This is flat-earther-level denial. Just because you don't understand how they're doing it doesn't mean it's "bull crap".
Either take the time to learn how the core technology works or don't comment on it.
Please drop the condescending crap. You are free to disagree with me but leave the personal attacks at home.

I don’t claim that since I don’t understand how they are doing it that it is crap. I call it crap despite not knowing exactly what they are doing.
You have limited resources on a PC or console and it would be ridiculous to waste them on geometry the end user will never see.
You are not going to see games rendered with billions of polys per frame. This is a fact. Neither console has anywhere near the power to handle that. The generation after this won’t be able to handle that either. Maybe the generation after that will.
Please listen, this is something everyone has already explained. There are 20 million triangles processed per frame, not 3.7 million. The "1 tri per pixel" is just a comparison of relative size. If you only processed tris within direct line of sight, you wouldn't be able to do any lighting or physics calculations, as the half of every asset facing away from the camera would be missing. They're also doing the same for textures, yes.

As I said before, please stop calling bull**** on tech you clearly have only a superficial understanding of.
Correct. And nowhere do you see billions of polys, right? As for the 3.7 figure, I am just using visible polys, as a 1440p display has about 3.7 million pixels, and the presentation said rendering 1 poly per pixel.
In regards to culling, they do indeed cull back-facing, occluded, and frustum-culled geometry. They always cull unseen geometry if possible. Because of this, lighting engines have some artifacts that result. Please watch the DF video to see examples of this in UE5.
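For what it's worth, here is a quick back-of-envelope sketch of the numbers being argued over (my own illustrative arithmetic; the only figure from Epic is the ~20 million drawn triangles per frame):

```python
# Pixel counts for common render resolutions vs. the demo's stated
# ~20 million drawn triangles per frame (illustrative arithmetic only).
resolutions = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

drawn_triangles = 20_000_000  # Epic's quoted per-frame figure for the demo

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f}M pixels, "
          f"~{drawn_triangles / pixels:.1f} drawn triangles per pixel")

# 1440p: 3.7M pixels, ~5.4 drawn triangles per pixel
# 4K:    8.3M pixels, ~2.4 drawn triangles per pixel
```

So 1440p really is about 3.7 million pixels, but "roughly one triangle per pixel" describes how small the drawn triangles end up, not a hard cap equal to the pixel count; the ~20 million drawn triangles and the billions of source triangles in the scene are different numbers again.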
 

Audiophile

Member
Do some of you really think Epic is going to only code UE5 to take advantage of PS5 features and not XSX?

Like really...

Unreal engine... One of the most used engines in the industry...

You are playing yourself if you think this.

No one is saying it's only being coded for PS5 and not being coded for XSX. That's an extreme proposition most are fabricating just so they can argue against it and feel justified in their insecurities.

Of course it is being built for both as well as many other platforms.

However, this particular demo appears to have been used to demonstrate PS5's main strength, which is the SSD & I/O stack that can push roughly 2x the data (and considerably more in optimal, likely edge-case scenarios). If you can get that much more data in and out of memory that fast, you have the potential to display that much more data on-screen in a given time. Or you can transition or traverse between areas with distinct assets that much quicker.

If either the visual density we see in general or the traversal we see in the latter sequence effectively requires ~5.5GB/s of throughput, then with a little optimisation and/or minor cutbacks the XSX could probably run it; and, playing to its strengths, it will likely push 10-20% more native pixels while doing it... However, if this is utilising that full 8-9GB/s on PS5, then the XSX would necessitate some concessions such as visual density/complexity, traversal/transition speeds, loading etc.
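To put rough numbers on that, a back-of-envelope sketch using the commonly quoted raw figures (real sustained throughput depends on compression, access patterns, and the rest of the I/O stack, so treat these as illustrative only):

```python
# Per-frame streaming budget from the commonly quoted raw SSD figures.
# Illustrative only: real rates depend on compression and the I/O stack.
raw_gb_per_s = {
    "PS5": 5.5,
    "XSX": 2.4,
}

for fps in (30, 60):
    for console, rate in raw_gb_per_s.items():
        mb_per_frame = rate * 1024 / fps
        print(f"{console} @ {fps} fps: ~{mb_per_frame:.0f} MB of fresh data per frame")

# PS5 @ 30 fps: ~188 MB per frame    XSX @ 30 fps: ~82 MB per frame
# PS5 @ 60 fps: ~94 MB per frame     XSX @ 60 fps: ~41 MB per frame
```

That per-frame delta is the practical meaning of "push roughly 2x the data": how much of memory can be refreshed between one frame and the next during fast traversal or a hard scene transition.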
 

Corndog

Banned
I'm also curious about the rebuttals claiming a faster memory pipeline has no benefit. Link to Alex's reasoning?
I see it as a benefit, but not something that is going to miraculously result in a higher pixel count. I can see it giving the obvious faster loading times, less pop-in, and more geometry and texture variety. But all of this, except loading time, can be achieved just by increasing the amount of RAM.
It is a balance between how much RAM you can afford and the SSD speed.
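A crude way to picture that balance, with entirely made-up illustrative numbers (the function and figures below are hypothetical, not from any console spec):

```python
# Crude model of the RAM-vs-SSD trade-off: whatever you cannot stream in on
# demand has to already be resident in RAM as a "look-ahead" buffer.
# All numbers are made up for illustration.

def lookahead_ram_gb(unique_gb_per_second_of_travel: float,
                     ssd_gb_per_second: float,
                     lookahead_seconds: float = 5.0) -> float:
    """RAM needed to pre-load the assets the drive can't deliver in time."""
    shortfall = max(0.0, unique_gb_per_second_of_travel - ssd_gb_per_second)
    return shortfall * lookahead_seconds

print(lookahead_ram_gb(3.0, 0.1))  # HDD-class streaming -> 14.5 GB held in RAM
print(lookahead_ram_gb(3.0, 5.5))  # fast SSD -> 0.0 GB extra, stream on demand
```

The point of the balance in one line: extra RAM buys a bigger pre-load buffer, while a faster SSD shrinks the buffer you need in the first place.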
 

Corndog

Banned
A direct quote from a Verge article on the day of the UE5 demo.

Sweeney says “Epic has been working closely with Sony for years now on ensuring UE5 can best utilize the upcoming PlayStation's unique architecture”

Lol 🤦🏽‍♂️🤷🏽‍♂️
And I’m sure they have been working with Microsoft and Nintendo as well as cell phone manufacturers.
 

HAL-01

Member
Please drop the condescending crap. You are free to disagree with me but leave the personal attacks at home.

I don’t claim that since I don’t understand how they are doing it that it is crap. I call it crap despite not knowing exactly what they are doing.
You have limited resources on a PC or console and it would be ridiculous to waste them on geometry the end user will never see.
You are not going to see games rendered with billions of polys per frame. This is a fact. Neither console has anywhere near the power to handle that. The generation after this won’t be able to handle that either. Maybe the generation after that will.
Correct. And nowhere do you see billions of polys, right? As for the 3.7 figure, I am just using visible polys, as a 1440p display has about 3.7 million pixels, and the presentation said rendering 1 poly per pixel.
In regards to culling, they do indeed cull back-facing, occluded, and frustum-culled geometry. They always cull unseen geometry if possible. Because of this, lighting engines have some artifacts that result. Please watch the DF video to see examples of this in UE5.
The artifacts are not due to geometry culling; they're due to the screen-space nature of the lighting solution.
No one is claiming billions of polys per frame; as I already said, they're 20 million per frame.
They're claiming billions per scene, which are obviously not all processed at once.
I'm done with the discussion; it's clear you fundamentally misunderstand the tech and are doubling down on it.
 

Shmunter

Member
I see it as a benefit, but not something that is going to miraculously result in a higher pixel count. I can see it giving the obvious faster loading times, less pop-in, and more geometry and texture variety. But all of this, except loading time, can be achieved just by increasing the amount of RAM.
It is a balance between how much RAM you can afford and the SSD speed.
It will not increase pixel count; whoever says that is full of it. And in all honesty, I have yet to see anyone say such a crazy thing. Everything else you say is legit.

Regarding increasing RAM, it becomes a fool's errand at some point; you cannot keep going in that direction. Load times would keep growing without tackling the entire pipeline for improvement, and quality-of-life innovations like suspending and resuming gameplay also become impossible.

A balanced future is the right path.
 

Andodalf

Banned
The artifacts are not due to geometry culling; they're due to the screen-space nature of the lighting solution.
No one is claiming billions of polys per frame; as I already said, they're 20 million per frame.
They're claiming billions per scene, which are obviously not all processed at once.
I'm done with the discussion; it's clear you fundamentally misunderstand the tech and are doubling down on it.

People absolutely were. In the convo you just had.
There might've been more that I have on ignore, too.

One aspect that maybe didn't register with all the SSD talk yesterday is that if they are filling in new data every frame, and drawing a billion front-facing polygons per frame that will change with the smallest delta in camera angle, then the caches on the GPU are going to change on a frame-by-frame basis too. Even though there will be large redundancy in cached (L3) assets from frame to frame, without cache scrubbers how would another system handle that rate of change?
There was some leak claiming that Horizon 2 will show billions of polygons on screen, and many of us called it bullshit. It turned out that the hundreds of billions of polygons are true. 🤣🤣🤣🤣🤣🤣

Corndog's post that you first replied to, the one that started this, said Horizon Zero Dawn has billions of polys on screen. Goalpost shifting in the same convo never works.
 

Bo_Hazem

Banned
Yes, but then again you are normal.

I have seen donbesca do it to james sawyer ford and other posters. It's just his thing, I guess. I thought about putting him on ignore, but then I wouldn't know if my posts triggered him or not.

He did it to many of mine too, but I managed to get "real" likes on some rare posts. I think he was too busy catching up and gave me true likes by mistake. :messenger_tears_of_joy:
 

HAL-01

Member
People absolutely were. In the convo you just had.
There might've been more that I have on ignore, too.




Corndog's post that you first replied to, the one that started this, said Horizon Zero Dawn has billions of polys on screen. Goalpost shifting in the same convo never works.
The discussion is not about that post, however. A few other people and I attempted to explain the basic core tech to him, which he repeatedly dismissed; he came up with his own theories and continued to claim Epic was simply lying about what it can do.
 

Nikana

Go Go Neo Rangers!
Because it was never designed to take advantage of the architecture.
What do you think adapting an engine is?

Do you think engine makers just go, "Well, it works, that's good enough"?

The engine is crafted for each individual platform. When it's not you have Skyrim on PS3.
 

DaGwaphics

Member
Stop lying. It's transparent what you're doing. Show me a source that says that Sony "strictly prohibited performance comparisons".

It is hard to compare the footage when neither Xbox nor PC can run it. Epic did not have the time to downgrade it to lesser machines just to satisfy you :)

This is why we have to see other demos, then we can compare properly. Sony has given us something to compare against, now it's MS's turn.
 

Dr Bass

Member
Please drop the condescending crap. You are free to disagree with me but leave the personal attacks at home.

I don’t claim that since I don’t understand how they are doing it that it is crap. I call it crap despite not knowing exactly what they are doing.
You have limited resources on a PC or console and it would be ridiculous to waste them on geometry the end user will never see.
You are not going to see games rendered with billions of polys per frame. This is a fact. Neither console has anywhere near the power to handle that. The generation after this won’t be able to handle that either. Maybe the generation after that will.
Correct. And nowhere do you see billions of polys, right? As for the 3.7 figure, I am just using visible polys, as a 1440p display has about 3.7 million pixels, and the presentation said rendering 1 poly per pixel.
In regards to culling, they do indeed cull back-facing, occluded, and frustum-culled geometry. They always cull unseen geometry if possible. Because of this, lighting engines have some artifacts that result. Please watch the DF video to see examples of this in UE5.

What part of the demo was unclear when they said billions of polygons visible in the frame are "crunched down losslessly to around 20 million drawn triangles"? It's a completely new method of rendering scenes, and you're effectively getting what would have taken billions of polygons in a frame. It's how they are doing the lossless compression that is key.

It was so clearly explained and demonstrated that I can't wrap my mind around anyone saying it's "crap." That's honestly your takeaway from what you saw? Exactly what are you working on in your career that lets you so casually toss such a phrase at clearly revolutionary technology? :pie_thinking:
 
My reaction these last 24 hours, seeing:

-That people are actually denying the tech demo runs on PS5
-That they think the demo should look better on XSX or even a PC with a normal SSD. Yeah, because Unreal Engine loves to debut on less capable hardware
-That Sweeney lied when he told us it is the absolute best hardware coming by the end of the year (don't go crazy, he is not only talking about the GPU)
-That it was a coincidence that Sweeney focused on the SSD of the PS5 and Cerny focused the console design on removing bottlenecks


We have been waiting months for a demo that could show us next-gen graphics, and not just a current PC game set to ultra at 4K, but since this demo doesn't run on your favorite box, well, it must be a lie... Most people here have never seen or worked with an engine, but hey, they know exactly why the demo was a lie or why it would run better on more traditional hardware.

Note: just being able to accomplish REYES in real time is like watching someone use a pulley for the first time to get a better result with less effort; it seems like magic, but it is not.
 

Nikana

Go Go Neo Rangers!
No one is saying it's only being coded for PS5 and not being coded for XSX. That's an extreme proposition most are fabricating just so they can argue against it and feel justified in their insecurities.

Of course it is being built for both as well as many other platforms.

However, this particular demo appears to have been used to demonstrate PS5's main strength, which is the SSD & I/O stack that can push roughly 2x the data (and considerably more in optimal, likely edge-case scenarios). If you can get that much more data in and out of memory that fast, you have the potential to display that much more data on-screen in a given time. Or you can transition or traverse between areas with distinct assets that much quicker.

If either the visual density we see in general or the traversal we see in the latter sequence effectively requires ~5.5GB/s of throughput, then with a little optimisation and/or minor cutbacks the XSX could probably run it; and, playing to its strengths, it will likely push 10-20% more native pixels while doing it... However, if this is utilising that full 8-9GB/s on PS5, then the XSX would necessitate some concessions such as visual density/complexity, traversal/transition speeds, loading etc.

The concessions would be handled and made possible by crafting the engine around what its strengths are.

Any engine can be adapted to any platform. The strengths of one platform can be made to work on others.

To say that if the engine is utilized fully by the PS5 SSD then the XSX would have to compensate somewhere is exactly how the engineers don't think. If that were the case, developers wouldn't want to use that engine, which means less money in their pocket.

In its current form, sure, XSX can't do it because the demo is literally coded for PS5. But that's like saying the PS4 version of Assassin's Creed is impossible on Xbox One. It's impossible because that version of the game is coded for the PS4. But the Xbox One version sure as hell looks a lot like the PS4 version, doesn't it? That's because the engine has been crafted for and adapted to all platforms and their strengths and weaknesses.
 

B_Boss

Member
In the words of Alucard: “I’ve come to put an end to this.” lol.
About that narrow area in the UE demo? It was not loading. Unless we’ve reason to believe Jeff Grubb is lying, here’s what he said about it:


May it never rear its ugly and time-consuming FUD head 🍻.
 

Bo_Hazem

Banned
4K 120Hz, yes, but VRR doesn't need 2.1 as such. All of the Samsung QLED TVs (50-inch and above) have it as standard on HDMI 2.0.

Not sure how important VRR is. I've never had screen tearing in my whole life, and lately I've been on my Sony TV with HDMI 2.0. I only see screen tearing on YouTube when they talk about it and show it. Maybe it's a problem with some TVs or monitors. I've played many games at unlocked framerates on PS4 Pro and never had an issue, and it's obvious the performance is floating between 30-60 fps and unstable, yet it looks clean.
 

Corndog

Banned
People absolutely were. In the convo you just had.
There might've been more that I have on ignore, too.




Corndog's post that you first replied to, the one that started this, said Horizon Zero Dawn has billions of polys on screen. Goalpost shifting in the same convo never works.
Exactly. I don't have any problem with 20 million polys total. It's this fake billions of polys on screen; it's just PR from Epic. They know people are going to hear that and misinterpret it, especially the way they presented it. Like I said, it is total crap. No one is going to use billions of polys to generate an image when they can get the exact same quality for exponentially less.

And then I have people attacking me personally. I’m not forcing anyone to agree with me. It’s fine if you don’t.
 
The important thing is that we now have a demonstration that shows off the strengths of the PS5’s design with its highly optimized I/O, and how that can contribute to visual fidelity.

Microsoft and Sony are approaching the next generation with different strategies. MS will use its services to cater to its existing install base, hence the emphasis on legacy support and software scalability. Rumor has it that they will also go after the low end with an inexpensive 1080p machine. Sony will, as they have done for every transition, break with the past with the new generation‘s capabilities, further integrating the interactive medium with the high production world of cinema.

Now, I am less inclined to believe that MS’s strategy will claw back significant market share in the console space, but it is hard to predict at this moment because there is still so much to see over the next year.
 

Corndog

Banned
The important thing is that we now have a demonstration that shows off the strengths of the PS5’s design with its highly optimized I/O, and how that can contribute to visual fidelity.

Microsoft and Sony are approaching the next generation with different strategies. MS will use its services to cater to its existing install base, hence the emphasis on legacy support and software scalability. Rumor has it that they will also go after the low end with an inexpensive 1080p machine. Sony will, as they have done for every transition, break with the past with the new generation‘s capabilities, further integrating the interactive medium with the high production world of cinema.

Now, I am less inclined to believe that MS’s strategy will claw back significant market share in the console space, but it is hard to predict at this moment because there is still so much to see over the next year.
Yeah, and this is something Microsoft must do as well if they are to compete with Sony. They definitely failed with their gameplay trailers.
 

user1337

Member
Not sure how important VRR is. I've never had screen tearing in my whole life, and lately I've been on my Sony TV with HDMI 2.0. I only see screen tearing on YouTube when they talk about it and show it. Maybe it's a problem with some TVs or monitors. I've played many games at unlocked framerates on PS4 Pro and never had an issue, and it's obvious the performance is floating between 30-60 fps and unstable, yet it looks clean.

There are 3 very good videos worth watching on this topic of 4K 120Hz and VRR/VRS. The issue with TVs isn't just about high frame rates, but also low frame rates like 2K 30fps.



And



And

 

Three Jackdaws

Unconfirmed Member




I remember someone on the thread talking about how we don’t know the latency performance of the PS5’s SSD. I don’t know exactly what that means but the Epic Games engineer seemed to be vague but positive about it.
 

Bo_Hazem

Banned
There are 2 very good videos worth watching on this topic of 4K 120Hz and VRR/VRS. The issue with TVs isn't just about high frame rates, but also low frame rates like 2K 30fps.



And



Seen them before, I like FOMO videos. I was just saying that screen tearing has never, ever happened to me. I'm getting the Sony XH90 (X900H) with full HDMI 2.1 support (4K@120Hz, VRR, ALLM), but I'm only stating that I've never seen screen tearing in practice. Maybe it's just PS4/Pro?
 




I remember someone on the thread talking about how we don’t know the latency performance of the PS5’s SSD. I don’t know exactly what that means but the Epic Games engineer seemed to be vague but positive about it.

But Alex told me that was impossible, that the SSD doesn't help with graphics, only loading.

Jokes aside, I want to see how this engine scales with different configurations, as well as how PC games will look in a few years when the PC-console gap is again considerable, not only in raw GPU power.
 

HAL-01

Member
Exactly. I don't have any problem with 20 million polys total. It's this fake billions of polys on screen; it's just PR from Epic. They know people are going to hear that and misinterpret it, especially the way they presented it. Like I said, it is total crap. No one is going to use billions of polys to generate an image when they can get the exact same quality for exponentially less.

And then I have people attacking me personally. I’m not forcing anyone to agree with me. It’s fine if you don’t.
Good thing Epic never said that; they were extremely clear in their messaging from the get-go.
 

user1337

Member
Seen them before, I like FOMO videos. I was just saying that screen tearing has never, ever happened to me. I'm getting the Sony XH90 (X900H) with full HDMI 2.1 support (4K@120Hz, VRR, ALLM), but I'm only stating that I've never seen screen tearing in practice. Maybe it's just PS4/Pro?
No idea mate, haha. My point was that there seems to be too much focus on trying to push gamers to HDMI 2.1 when it may not really be needed for the first versions of next gen (maybe if there is a PS5 Pro it will need the extra bandwidth of 2.1).
 

bitbydeath

Member
What do you think adapting an engine is?

Do you think engine makers just go, "Well, it works, that's good enough"?

The engine is crafted for each individual platform. When it's not you have Skyrim on PS3.

They don't usually go to the extent they are going to in adapting to PS5. Yes, they do try to make the most of each platform power-wise, but not feature-wise.

This is new!
 

ZeroFool

Member
Hitchhiker's Guide to Next-Gen - "Don't Panic"

Keep calm and enjoy keeping your hands and feet outside the train at all times.

I will readily admit that I enjoyed flying around in Anthem on PC and liked the visuals. It is going to be fun to see what the artists and masters of the huge-budget games can show us. If it is indeed now faster to develop with better visual fidelity while increasing the length and quality of the stories, I will be ecstatic. The last two gens I have been disappointed by great visuals paired with a crappy or short story.

That is my rambling update for tonight, see you tomorrow. I have a ton of work in the morning and I really should get another YouTube video ready. I need to break 1,000 subscribers, darn it! 🤣
 

Sinthor

Gold Member
Exactly. I don't have any problem with 20 million polys total. It's this fake billions of polys on screen; it's just PR from Epic. They know people are going to hear that and misinterpret it, especially the way they presented it. Like I said, it is total crap. No one is going to use billions of polys to generate an image when they can get the exact same quality for exponentially less.

And then I have people attacking me personally. I’m not forcing anyone to agree with me. It’s fine if you don’t.

Man, you should lose the victim mentality. People like to complain way too much about other people being mean to them on this forum!

Personal attacks? I don't know what you're seeing....




:messenger_winking_tongue:
 

Nikana

Go Go Neo Rangers!
They don't usually go to the extent they are going to in adapting to PS5. Yes, they do try to make the most of each platform power-wise, but not feature-wise.

This is new!

This isn't new.

If you really want to latch onto this pipedream that UE5 is the first engine to adapt features based on a console, then id Tech 4 beat it by about 15 years. John Carmack said that when he was writing the engine he adapted parts of it specifically to run Doom 3 on the Xbox.
 

ZehDon

Gold Member
So long as folks keep their emotions in check and let the scientific facts rule the day, we’re all good and golden.
 
If this is the approach then, like I said, billions of polys per scene is bull crap. You still have a reasonably small poly budget which, if it is approximately 1 poly per pixel, is 3.7 million polys per frame. Now that I can believe.
You don't need to, as even 4K is about 8 million pixels. 8 billion polygons on screen is senseless: 1,000 polygons per pixel? That doesn't make sense. But 8 billion polygons per scene makes sense. This is an approach similar to Pixar's REYES with micropolygons.

When the 33M-polygon statue gets close to the screen, sufficient detail is shown on screen. Get close enough, and the millions of polygons on screen are all from the statue's close-up. There is only so much geometry you can resolve for a given resolution.

Edit: Hollywood CGI / Pixar REYES:
"Recall that every primitive is diced into a grid of micropolygons with a density of approximately one micropolygon per pixel." -steckles
Notice the one micropolygon per pixel, quite similar to the Nanite approach. That is why such CGI does not actually put the billions of polygons on screen, as it'd be meaningless with just 8 million pixels.

You are not going to see games rendered with billions of polys per frame. This is a fact. Neither console has anywhere near the power to handle that. The generation after this won’t be able to handle that either. Maybe the generation after that will.
The scene needs the billions of polygons or you will get polygonal edges when you get close to elements of the scene. Some film CGI models have more polygons than even 4K has pixels; what do you think happens to the excess polygons that would clearly be subpixel-size? Would the film CGI model be the same without the tens of millions of polygons? Clearly not.
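To make the micropolygon point concrete, here is a toy sketch of the "don't draw more triangles than the screen can resolve" idea; this is not Epic's Nanite algorithm, just the general principle, with an assumed field of view and object size:

```python
import math

def useful_triangle_budget(distance_m: float,
                           object_size_m: float,
                           screen_w_px: int = 2560,
                           screen_h_px: int = 1440,
                           vertical_fov_deg: float = 60.0) -> int:
    """Roughly how many triangles an object can usefully contribute at a
    given distance, assuming ~1 triangle per covered pixel (the REYES /
    micropolygon idea). Beyond this, extra source triangles are sub-pixel.
    """
    angular_height = 2.0 * math.atan(object_size_m / (2.0 * distance_m))
    pixels_tall = screen_h_px * angular_height / math.radians(vertical_fov_deg)
    covered_pixels = min(max(1.0, pixels_tall) ** 2, screen_w_px * screen_h_px)
    return int(covered_pixels)

# The same highly detailed source asset (e.g. a ~33M-triangle statue)
# viewed at different distances:
for d in (100.0, 10.0, 1.0):
    print(f"{d:>6.1f} m -> ~{useful_triangle_budget(d, 5.0):,} useful triangles")

#  100.0 m -> ~4,700 useful triangles
#   10.0 m -> ~454,000 useful triangles
#    1.0 m -> 3,686,400 useful triangles (capped by the 1440p pixel count)
```

That is the sense in which a scene can "contain" billions of source triangles while only a few million ever need to reach the screen in any one frame: the source density is what gets selected from as the camera moves, not what gets rasterized all at once.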
 

Sinthor

Gold Member
Not sure how important VRR is. I've never had screen tearing in my whole life, and lately I've been on my Sony TV with HDMI 2.0. I only see screen tearing on YouTube when they talk about it and show it. Maybe it's a problem with some TVs or monitors. I've played many games at unlocked framerates on PS4 Pro and never had an issue, and it's obvious the performance is floating between 30-60 fps and unstable, yet it looks clean.

I don't recall seeing tearing EVER, except when I played Haze on the PS3. There was this one level that I saw some of that. I've never seen this on the PS4 and since I upgraded to a Pro haven't seen it there either. I am pretty hyped for any kind of "boost mode" backwards compatibility though based on my experience from "regular" PS4 to the Pro model with Elite: Dangerous. With the PS4, there could be SOME SLIGHT slowdown in packed asteroid fields while combat was also happening. I don't see that at all on the Pro. Good times! I can't wait!
 

Bo_Hazem

Banned
I don't recall seeing tearing EVER, except when I played Haze on the PS3. There was this one level that I saw some of that. I've never seen this on the PS4 and since I upgraded to a Pro haven't seen it there either. I am pretty hyped for any kind of "boost mode" backwards compatibility though based on my experience from "regular" PS4 to the Pro model with Elite: Dangerous. With the PS4, there could be SOME SLIGHT slowdown in packed asteroid fields while combat was also happening. I don't see that at all on the Pro. Good times! I can't wait!

I want to pick up a PS5 as fast as possible and act as if PS4 and previous gens never existed :messenger_tears_of_joy: After that demo, I'm having a hard time playing the upcoming current-gen games :messenger_grinning_sweat:
 

Sinthor

Gold Member
On another note guys....how about that Ghost of Tsushima reveal today? From the past trailers I honestly expected them to push the game out further and finally announce it was actually a PS5 title, but it's all PS4 and looks AMAZING. Of course, I can see some differences with that UE5 demo from yesterday, but still. This thing looks IMPRESSIVE. As long as it performs in game like we're seeing in these trailers.....just.....WOW. I am so hyped for this game. I was REALLY waiting for TLOU2 but the "leaks" have tempered my excitement a good bit (unfortunately I wasn't able to avoid spoilers, at least the written kind). I'm STILL hoping that the leaks came from an earlier version of the game and story but we'll see.

In any case, I'm still hyped for TLOUS2 and getting more hyped for Ghost of Tsushima by the day! Anyone else?

Me watching the 'State of Play' today!
 

HAL-01

Member
You don't need to, as even 4K is about 8 million pixels. 8 billion polygons on screen is senseless: 1,000 polygons per pixel? That doesn't make sense. But 8 billion polygons per scene makes sense. This is an approach similar to Pixar's REYES with micropolygons.

When the 33M-polygon statue gets close to the screen, sufficient detail is shown on screen. Get close enough, and the millions of polygons on screen are all from the statue's close-up. There is only so much geometry you can resolve for a given resolution.


The scene needs the billions of polygons or you will get polygonal edges when you get close to elements of the scene. Some film CGI models have more polygons than even 4K has pixels; what do you think happens to the excess polygons that would clearly be subpixel-size? Would the film CGI model be the same without the tens of millions of polygons? Clearly not.
Don't bother with them; we've been at this since a few pages ago. I hope he's just a troll.
 

Corndog

Banned
The discussion is not about that post, however. A few other people and I attempted to explain the basic core tech to him, which he repeatedly dismissed; he came up with his own theories and continued to claim Epic was simply lying about what it can do.
Wrong. I said they weren’t using billions of polys. Don’t change what I said. Quote me if you want.

Edit: here is my actual statement.
“ They are not showing hundreds of billions of polygons. Pure marketing speak by epic.”
 