thelastword
Banned
> NXGamer said under heavy load it's usually 1260p... Dips and tears too...

Something, something on XboneX Gears 5 1080p at minimum, something, something 1800p max during gameplay. Yeah, that's native 4K I presume.
They're all professionals, including DF; no receipts necessary. It's 4K native, you hear; PRO is that fake 4K. Their opinion is gold, forget the people who prove things correctly... Those guys who prove things correctly are a poor man's DF, a poor imitation of Gavon & Meirl etc... A lot of anonymous sources today and no receipts.
So is this a case of physical cores vs. logical cores? Would that apply in this scenario?
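For anyone unsure of the distinction: the OS schedules threads on logical cores, and with SMT ("hyper-threading") a CPU typically exposes two logical cores per physical core. A minimal sketch — the 8-core part below is a hypothetical example, not a confirmed next-gen spec:

```python
import os

# Logical CPUs are what the OS actually schedules on.
logical = os.cpu_count()
print("logical cores on this machine:", logical)

# Hypothetical 8-core part with 2-way SMT (illustration only):
physical = 8
smt_ways = 2
print(physical * smt_ways)  # 16 logical cores visible to software
```

So "8 cores / 16 threads" is one chip: 8 physical cores, each presenting 2 logical ones.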
> That seems to make sense, I believe they did the same thing with the Xbox One X to some degree with DirectX if I'm not mistaken. Either way, with the tidbits of info, it appears that next gen will be an interesting generation.

I'm going to go out on a limb and say MS' RT scenario is very likely to be built specifically around the parameters of their own DXR API. Engineering it in that way would complement the API's methods for submitting workloads to the GPU and whatever "dedicated hardware" is provided for RT tasks.
I would imagine that throughout DXR's development they've had a lot of time to think about it, and just how to apply its acceleration structure at both the top and bottom levels. And it will likely have its own custom "hooks", the way Nvidia utilizes Volta.
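For readers unfamiliar with DXR's two levels: bottom-level acceleration structures (BLAS) hold the actual geometry, while the top-level structure (TLAS) holds transformed instances that reference them, so moving an object only touches the cheap top level. A toy Python sketch of the idea — this is a conceptual illustration, not the actual D3D12 API:

```python
from dataclasses import dataclass, field

@dataclass
class BottomLevelAS:      # one per unique mesh; holds the triangles
    triangles: list

@dataclass
class Instance:           # TLAS entry: shared geometry + per-object transform
    blas: BottomLevelAS
    transform: tuple      # e.g. a translation (x, y, z)

@dataclass
class TopLevelAS:
    instances: list = field(default_factory=list)

# One mesh, instanced twice: both instances share the same BLAS object.
cube = BottomLevelAS(triangles=["tri0", "tri1"])
tlas = TopLevelAS()
tlas.instances.append(Instance(cube, (0, 0, 0)))
tlas.instances.append(Instance(cube, (5, 0, 0)))

print(tlas.instances[0].blas is tlas.instances[1].blas)  # True: geometry stored once
```

The split is what makes instancing and animation affordable: rebuilding the TLAS each frame is cheap compared to rebuilding per-mesh BVHs.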
That seems to make sense, I believe they did the same thing with the Xbox One X to some degree with DirectX if I'm not mistaken. Either way, with the tidbits of info, it appears that next gen will be an interesting generation.
> I'm out of the loop now, but what do some here believe MS did with the X?

I think they are talking about this, maybe?
> Sir Mark Cerny did it first.

Actually Lisa Su did it.
> Actually Lisa Su did it.

Who? Not familiar with bitches in the tech industry.
> Who? Not familiar with bitches in the tech industry.

CEO of AMD
> CEO of AMD

I love AMD.
The ‘Bitch’ of the GPU industry
I'm out of the loop now, but what do some here believe MS did with the X?
> Same way they came up with dedicated PhysX cards which turned out to be shit, same way they came up with SLI which is shit, and these dedicated ray-tracing cores are another shitty idea; soon enough, as cards get faster, they'll scrap it.

You realise the GPU manufacturers, and not Microsoft or Sony, make these kinds of decisions. Their R&D departments decided RT cores are the most efficient way to do it. The same thing happened when shaders came around, and every modern GPU has shading units.
> There's no game out there that has shown any drastic difference from traditional rendering to ray tracing. Yes, there is a difference, but the difference isn't enough to waste computing resources on. I mean, who cares about a fucking reflection and a shadow? It's pointless; it's a gimmick until they can properly utilise it.

RT is not a gimmick, just the opposite. Standard lighting methods are the gimmicks, because developers try to replicate with various techniques how RT would look. RT is expensive, but no one should call it a simple gimmick. Minecraft shows how drastic a difference RT GI can make, and Metro Exodus also looks great, although they have limited RT GI to the extreme (the sun was the only source of RT GI, and on top of that light bounces were limited to 2, which is not enough to light evenly certain high-contrast scenes).
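The bounce limit matters because each extra diffuse bounce contributes another albedo-weighted share of light, so total indirect light follows a geometric series. A toy sketch of why a 2-bounce cap can leave bright interiors too dark (the albedo value is an arbitrary assumption for illustration):

```python
# Each bounce carries albedo**k of the original energy; the sum converges to
# 1 / (1 - albedo) times the direct term as the bounce count grows.

def gathered_light(direct, albedo, bounces):
    """Total light after summing `bounces` diffuse inter-reflections."""
    return direct * sum(albedo ** k for k in range(bounces + 1))

direct = 1.0
albedo = 0.7          # bright-ish surfaces reflecting 70% of incoming light

two  = gathered_light(direct, albedo, 2)   # 1 + 0.7 + 0.49 = 2.19
many = gathered_light(direct, albedo, 32)  # approaches 1 / (1 - 0.7) = 3.33...

print(round(two, 2), round(many, 2))  # 2.19 vs 3.33: two bounces miss ~1/3 of the light
```

With dark surfaces the series converges after a bounce or two, which is why the cap mostly hurts bright, high-contrast scenes.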
> Instead of the usual software application, MS engineered DirectX as a hardware integration into the Xbox One X GPU's command processor. This hardware implementation significantly improved game-to-GPU communication and reduced CPU overhead.
> For example, instruction-set calls to the CPU can number in the thousands; by baking DirectX into the hardware, those instruction sets could be reduced to less than a dozen, cutting overall CPU workloads in half.
> Which was a good thing when you consider what an utterly shitty processor the Jaguar is. This of course was contingent upon the game being a DirectX build.

That I believe is actually false.
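Whatever the hardware-vs-software debate, the performance claim being quoted is essentially about call batching: collapsing thousands of per-object submissions into a handful. A rough, hypothetical sketch of that idea (object counts and batch size are made up for illustration):

```python
# Illustrative only -- not the actual Xbox driver. The claimed win is turning
# one API call per object into one call per batch of state-sharing objects.

def draw_naive(objects):
    """One API call per object: call count grows with scene size."""
    return len(objects)

def draw_batched(objects, batch_size=512):
    """Group objects into batches; submit one call per batch."""
    return -(-len(objects) // batch_size)     # ceiling division

scene = range(5000)
print(draw_naive(scene), draw_batched(scene))  # 5000 vs 10 calls
```

Fewer submissions means less per-call validation and translation work on the CPU, which is where the "cutting CPU workloads" framing comes from.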
> I think they are talking about this, maybe?

This part says it all...
I dunno
That I believe is actually false.
MS used the AMD programmable command processor of GCN to implement DX instructions to run in the hardware.
That is still software... not a hardware implementation.
This part says it all...
"they’ll all natively implemented in the logic of the command processor"
They are using the GCN programmable command processor to implement the logic they need... it is a software, not a hardware, implementation... there is no silicon specific to running DX in the Xbox One X APU.
BTW, that DX-implemented logic exists in the Xbox One too, and the PS4/Pro API has instructions in the programmable command processor as well (of course in that case not DX lol).
Edit - Just so people have a better idea... you can create functions/instructions and update them in the command processor via microcode in GCN... your logic is saved in the command processor to be called at any time you want, but that doesn't mean there is a dedicated hardware unit created to execute these instructions.
That logic continues to be software, just at a lower level than implementing it in the game or API code, for example.
It is more like you are using Assembly to code your logic instead of C/C++... the overhead is lower.
That is a GCN feature... not something exclusive to MS' Xbox One X lol
The difference is big in Metro Exodus, as you can see here, and that's with just a very limited RT implementation (the sun is the only source of RT GI, so only outdoor scenes are affected, and there are only 2 light bounces, too few to light a high-contrast scene evenly).
...
https://www.youtube.com/watch?v=9skpec8YsGM
"With great power there must also come -- great responsibility"
with RT will come great gameplay flexibility and opportunities.
Like a lot fewer elements on screen?
"With great power there must also come -- great responsibility"
remember bilinear texture filtering, perspective-correct texture mapping, per-pixel lighting, etc.
they all looked like a big waste of resources at first... not anymore.
Mwahahaha.
Try to render a simple kitchen scene with a lot of pots with RTX before you salivate about infinite reflections.
> Well at least you can't deny RTX looks better in Metro Exodus.

Looks like bullshit. They disable any AO or GI to make it look better.
And the overall graphics in Metro are nothing to write home about; it's sometimes ugly and usually of uneven quality. Some scenes are nice, but others are straight off last gen.
There is nothing in RTX but gimmicks.
I'd agree if those were considered a big waste of resources; the comparison seems like an anecdotal oversimplification or a false equivalence.
Infinite reflections are the same thing.
PSone - no texture filtering, same with perspective correction, Phong shading, AA, AF, z-buffer, etc.
The fight against RT is futile.
> There's no game out there that has shown any drastic difference from traditional rendering to ray tracing. Yes, there is a difference, but the difference isn't enough to waste computing resources on. I mean, who cares about a fucking reflection and a shadow? It's pointless; it's a gimmick until they can properly utilise it.
Minecraft would like a word with you.
That's not even an argument. Are you making a case for rasterization, or what?
RT is the future, it will stay there.
been there, done that..
> RT is the future, it will stay there.

A somewhat distant one.
> Minecraft would like a word with you.

Minecraft was originally not a well-rendered game; they could have updated its looks without even using ray tracing and it'd look nice. There's nothing special.
The difference is big in Metro Exodus, as you can see here, and that's with just a very limited RT implementation (the sun is the only source of RT GI, so only outdoor scenes are affected, and there are only 2 light bounces, too few to light a high-contrast scene evenly).
At 1080p, performance is not that much worse compared to RTX off: a 2070S can run the RTX effects at around 63 fps (71 fps without RTX), while the game looks clearly much better in all outdoor scenes.
Of course current games use limited RT (a limited number of rays and effects), but limited or not, it's still RT, so you should not call it a gimmick; that's exactly why it's so demanding. You can call RT an unnecessary feature, but gimmick is the last word you should use.
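A quick frame-time check on those numbers — fps comparisons hide the per-frame cost, while frame times show it directly:

```python
# 71 fps without RTX vs 63 fps with RTX on a 2070S (the figures quoted above).

fps_off, fps_on = 71.0, 63.0
ms_off = 1000.0 / fps_off   # ~14.1 ms per frame
ms_on  = 1000.0 / fps_on    # ~15.9 ms per frame

rt_cost = ms_on - ms_off
print(round(rt_cost, 1))    # ~1.8 ms of extra GPU time per frame
```

So at these frame rates the RT effects cost under 2 ms a frame; the same effects would eat a much larger share of a tighter frame budget.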
NXGamer said under heavy load it's usually 1260p... Dips and tears too...
I'm not sure... I mean, NVIDIA had these ray-tracing cores in their RTX graphics cards at first... then all of a sudden, now GTX cards can do it too? It makes it sound like ray tracing doesn't require specific technology to run, but maybe having cores dedicated to it makes for a better implementation?
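That's roughly right: the core of ray tracing is ordinary arithmetic that any GPU's shader units (or a CPU) can execute, and dedicated RT cores simply run the ray/box and ray/triangle tests in fixed-function hardware instead of shader code — which is why GTX cards can run DXR too, just slower. As a plain-Python illustration, here is the classic Möller–Trumbore ray/triangle intersection test:

```python
# Tiny vector helpers (3-tuples of floats).
def sub(a, b):   return tuple(x - y for x, y in zip(a, b))
def cross(a, b): return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])
def dot(a, b):   return sum(x * y for x, y in zip(a, b))

def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore: does the ray hit triangle (v0, v1, v2)?"""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:             # ray parallel to triangle plane
        return False
    inv = 1.0 / det
    s = sub(origin, v0)
    u = dot(s, p) * inv
    if u < 0 or u > 1:
        return False
    q = cross(s, e1)
    v = dot(direction, q) * inv
    if v < 0 or u + v > 1:
        return False
    return dot(e2, q) * inv > eps  # hit must lie in front of the ray origin

tri = ((0, 0, 5), (1, 0, 5), (0, 1, 5))
print(ray_hits_triangle((0.2, 0.2, 0), (0, 0, 1), *tri))   # True: aimed at the triangle
print(ray_hits_triangle((0.2, 0.2, 0), (0, 0, -1), *tri))  # False: aimed away from it
```

A scene runs billions of such tests per second, which is where fixed-function acceleration (plus BVH traversal hardware) earns its keep.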
Hell, I dunno.
> A somewhat distant one.

2023 is a distant future? bless you child
Nvidia itself stated it won't be until 2023 that AAA games will have hardware RT requirements.
> 2023 is a distant future? bless you child

PS4 was released 6 years ago.
> PS4 was released 6 years ago.

Yeah sorry, was just wishing that time still went as slow as when I was a kid.
We are talking about a time 4 years ahead, when the PS6 will be looming.