
Dark1x of Digital Foundry says that he noticed an odd trend of a few games starting to perform better on PS4 Pro than on Xbox One X

I'm a 1080p user with both consoles, and I can barely notice any SSAA benefits.

Gimme an option to disable it, kthxbye.
 
Every time I see a thread of his hit multiple pages, I think

fe5.gif
Good Lord that .gif is... amazing! Lol
 
I can barely notice any SSAA benefits.

It really depends on the game and the general art direction. I've noticed that games with NPR art styles tend to be far more readable and clear during gameplay when rendered at a higher resolution than your display's native resolution. With how muddy textures look in quite a lot of AAA games going for a cinematic approach, the only benefit I've noticed from downsampling those games is less geometry aliasing.

Here are two comparisons, and even when running fullscreen on a 1080p display, I can tell the difference.
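To illustrate what the downsampling step itself does, here's a minimal toy sketch in C (hypothetical values, a single 2x2 box filter, not any engine's actual resolve pass): render at twice the width and height, then average each 2x2 block down to one display pixel, which is where the reduction in geometry aliasing comes from.

#include <stdio.h>

#define OUT_W 4   /* display-resolution width  */
#define OUT_H 4   /* display-resolution height */

int main(void) {
    /* Pretend luminance values "rendered" at 2x the display resolution. */
    unsigned char src[2 * OUT_H][2 * OUT_W];
    for (int y = 0; y < 2 * OUT_H; y++)
        for (int x = 0; x < 2 * OUT_W; x++)
            src[y][x] = (unsigned char)((x ^ y) * 16);   /* arbitrary test pattern */

    /* Box-filter each 2x2 block of samples into one output pixel. */
    for (int y = 0; y < OUT_H; y++) {
        for (int x = 0; x < OUT_W; x++) {
            int sum = src[2 * y][2 * x]     + src[2 * y][2 * x + 1]
                    + src[2 * y + 1][2 * x] + src[2 * y + 1][2 * x + 1];
            printf("%4d", sum / 4);
        }
        printf("\n");
    }
    return 0;
}

Each output pixel ends up as an average of four rendered samples instead of one, so hard edges land on intermediate values rather than flipping between extremes; that's the extra smoothness people do or don't notice at 1080p.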



 
The FP16 debate is silly.

No one said that it's going to be used for pixel shaders (the bulk of processing in modern game pipelines), so bringing up banding issues from 2004 GPUs is irrelevant and disingenuous to say the least. Get with the times.

It's going to be useful for AI applications like neural networks and stuff like that. Next-gen AI will be great on PS5/XB2!

Example:



Hell, you could even argue that 16-bit precision is overkill:

https://arxiv.org/abs/1812.08011
https://venturebeat.com/2018/12/02/...o-four-times-faster-while-retaining-accuracy/
https://petewarden.com/2015/05/23/why-are-eight-bits-enough-for-deep-neural-networks/

Do you think it's a coincidence that the latest compute-focused GPUs (Vega/Turing) support acceleration of INT4, INT8 aside from FP16/FP32?

Yes, Sony tends to add niche features (forward-thinking philosophy, same as AMD) that almost no one is going to use to their fullest in current-gen machines (like Cell SPUs, Blu-Ray capacity and now FP16).

Do you think Cell SPUs were useless? Look where we are now with GPGPUs (every console has one, including Nintendo Switch). Do you think Blu-Ray was useless? Look at Wii U (25GB SL BD-ROM) and XBOX ONE...

The same will happen with FP16 and INT8 next-gen (be prepared for absurd INT8/FP16 metrics if FP32 specs are deemed too low for PR/marketing purposes). Have some patience folks and don't start silly system/console wars! :)
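To make the low-precision point concrete, here's a minimal toy sketch (plain C, made-up numbers, nothing from any actual SDK or engine) of the symmetric INT8 quantization the linked papers discuss: weights get stored as 8-bit integers plus a single scale factor, and a dot product computed that way lands very close to the FP32 answer.

#include <stdio.h>
#include <stdint.h>
#include <math.h>

int main(void) {
    float w[4] = { 0.87f, -0.32f,  0.05f, -0.91f };  /* toy "weights"     */
    float x[4] = { 1.20f,  0.40f, -0.70f,  0.30f };  /* toy "activations" */

    /* Map the largest weight magnitude to 127 and round the rest. */
    float maxabs = 0.0f;
    for (int i = 0; i < 4; i++)
        if (fabsf(w[i]) > maxabs) maxabs = fabsf(w[i]);
    float scale = maxabs / 127.0f;

    int8_t q[4];
    for (int i = 0; i < 4; i++)
        q[i] = (int8_t)lrintf(w[i] / scale);

    /* Same dot product in FP32 and with the dequantized INT8 weights. */
    float full = 0.0f, quant = 0.0f;
    for (int i = 0; i < 4; i++) {
        full  += w[i] * x[i];
        quant += (q[i] * scale) * x[i];
    }
    printf("fp32 = %f, int8 = %f, abs error = %f\n", full, quant, fabsf(full - quant));
    return 0;
}

The error is tiny relative to the result, which is why inference gets away with 8 bits (and why Vega/Turing expose fast INT8/INT4 paths), while HDR pixel-shader math is a separate question entirely.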

The Jaguar CPU may not have been the best available back then, but its performance is not bad for the application and the rest of the specs of the Xbox One or PS4. I don't want to offend anybody, but most people don't really understand how a game works, and because of that they form clueless opinions about specs and how they affect game performance. Just because "most people" think something doesn't make it right.

Let's say you replace the Jaguar with a CPU two times faster: do you really believe the GPU is going to draw two times more frames per second? Sure, it can improve performance here and there, but it's not going to make the GPU draw in less time than it does, and shader performance is not going to improve. Today it's very common to see games with dynamic resolution and resolutions under 1080p compared to the beginning of the generation; that is a GPU problem, not a CPU one. And let's not forget that the CPU has more free time now than at the beginning of the generation, because physics is now calculated on the GPU and sound is handled by its own processors.
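To put that point in rough numbers, a minimal sketch (made-up frame times, purely illustrative): treat a frame as gated by whichever processor takes longer, and a 2x faster CPU barely moves the result when the GPU is the limit.

#include <stdio.h>

/* Crude model: a frame can't finish faster than the slower of the two    */
/* workloads. Real engines overlap CPU and GPU work, but the bound holds. */
static double fps(double cpu_ms, double gpu_ms) {
    double frame_ms = cpu_ms > gpu_ms ? cpu_ms : gpu_ms;
    return 1000.0 / frame_ms;
}

int main(void) {
    double cpu_ms = 20.0, gpu_ms = 33.0;   /* hypothetical per-frame costs */
    printf("baseline:      %.1f fps\n", fps(cpu_ms, gpu_ms));        /* ~30 */
    printf("2x faster CPU: %.1f fps\n", fps(cpu_ms / 2.0, gpu_ms));  /* still ~30 */
    printf("2x faster GPU: %.1f fps\n", fps(cpu_ms, gpu_ms / 2.0));  /* ~50, now CPU-bound */
    return 0;
}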
Props for the informative post.

Regarding the audio part, do you mean that they use the TrueAudio DSP? Not sure if it supports 3D audio though...

Or is there a possibility that they're using GPGPU/Compute shaders for processing 3D audio? RE2 Remake and Hellblade actually support 3D audio everywhere (including OG consoles), but they don't mention how they do it.

https://www-sop.inria.fr/reves/projects/GPUAudio/

Btw, this is supposed to be a Polaris exclusive feature according to AMD: https://gpuopen.com/gaming-product/true-audio-next/

OG consoles are GCN 1.1, while X/Pro are Polaris-based.
 
The FP16 debate is silly.

No one said that it's going to be used for pixel shaders (the bulk of processing in modern game pipelines), so bringing up banding issues from 2004 GPUs is irrelevant and disingenuous to say the least. Get with the times.

It's going to be useful for AI applications like neural networks and stuff like that. Next-gen AI will be great on PS5/XB2!

Example:



Hell, you could even argue that 16-bit precision is overkill:

https://arxiv.org/abs/1812.08011
https://venturebeat.com/2018/12/02/...o-four-times-faster-while-retaining-accuracy/
https://petewarden.com/2015/05/23/why-are-eight-bits-enough-for-deep-neural-networks/

Do you think it's a coincidence that the latest compute-focused GPUs (Vega/Turing) support acceleration of INT4, INT8 aside from FP16/FP32?

Yes, Sony tends to add niche features (forward-thinking philosophy, same as AMD) that almost no one is going to use to their fullest in current-gen machines (like Cell SPUs, Blu-Ray capacity and now FP16).

Do you think Cell SPUs were useless? Look where we are now with GPGPUs (every console has one, including Nintendo Switch). Do you think Blu-Ray was useless? Look at Wii U (25GB SL BD-ROM) and XBOX ONE...

The same will happen with FP16 and INT8 next-gen (be prepared for absurd INT8/FP16 metrics if FP32 specs are deemed too low for PR/marketing purposes). Have some patience folks and don't start silly system/console wars! :)


Props for the informative post.

Regarding the audio part, do you mean that they use the TrueAudio DSP? Not sure if it supports 3D audio though...

Or is there a possibility that they're using GPGPU/Compute shaders for processing 3D audio? RE2 Remake and Hellblade actually support 3D audio everywhere (including OG consoles), but they don't mention how they do it.

https://www-sop.inria.fr/reves/projects/GPUAudio/

Btw, this is supposed to be a Polaris exclusive feature according to AMD: https://gpuopen.com/gaming-product/true-audio-next/

OG consoles are GCN 1.1, while X/Pro are Polaris-based.


I don't know much about 3D audio. They have a DSP in the SoC, but it still needs some 3D direction data; there are comments where devs point to using the GPU to trace vectors for 3D sound instead of the CPU, so by now I expect it to be implemented that way, just like physics is now calculated on the GPU.

FP16 was used by DICE to speed up part of the rendering in their games:

https://www.slideshare.net/DICEStudio/4k-checkerboard-in-battlefield-1-and-mass-effect-andromeda
 

Thiagosc777

Member


Jesus Christ, they are really milking this demo to the very last view!

Let me guess, if onQ123 posted it , then it is probably something positive about the PS4 Pro. Even if it is running on poorly optimized unfinished code that runs poorly even on PCs.
 

onQ123

Member
Jesus Christ, they are really milking this demo to the very last view!

Let me guess, if onQ123 posted it , then it is probably something positive about the PS4 Pro. Even if it is running on poorly optimized unfinished code that runs poorly even on PCs.

It's not just the Pro; it's running better on PS4 overall.
 

dark10x

Digital Foundry pixel pusher
Jesus Christ, they are really milking this demo to the very last view!

Let me guess, if onQ123 posted it , then it is probably something positive about the PS4 Pro. Even if it is running on poorly optimized unfinished code that runs poorly even on PCs.
Yeah...I don't like milking stuff at all but, alas, sometimes you have to do what the people want (and lots of people were begging for info on the base consoles). :\
 

AnotherOne

Member
Yeah...I don't like milking stuff at all but, alas, sometimes you have to do what the people want (and lots of people were begging for info on the base consoles). :\


A bit off topic but will you guys be revisiting freesync analysis on x again? Perhaps a bit more in depth and more games to test out would be much appreciated.
 

dark10x

Digital Foundry pixel pusher
A bit off topic but will you guys be revisiting freesync analysis on x again? Perhaps a bit more in depth and more games to test out would be much appreciated.
I'd love to, but we don't have the hardware to do it. Monitors are one of those things that are tough to buy since they take up a lot of space, and since we mostly work out of our homes, it's a tough one. If I needed a new monitor I'd find one that fits the bill. My current monitor supports FreeSync, but it doesn't work over HDMI, and its range extends well above 60Hz but only slightly below it, so it wouldn't help with Xbox One X anyway. Hopefully at some point one of us will get a better FreeSync display.
 

PaNaMa

Banned
It sure would be nice to get the developer to weigh in on Xbox 1X performance specifically, and how they plan to fix it. Whether they lock at 1800p, or implement dynamic resolution scaling, or just change post processing or shadows or AA or something to even out the frame rate - they have to do something. It's not about console wars, it's about the user experience they are presently delivering.
 

DonF

Member
It sure would be nice to get the developer to weigh in on Xbox 1X performance specifically, and how they plan to fix it. Whether they lock at 1800p, or implement dynamic resolution scaling, or just change post processing or shadows or AA or something to even out the frame rate - they have to do something. It's not about console wars, it's about the user experience they are presently delivering.
I know that it's super unlikely, but I would love for a dev to speak about it. Like, why not drop the resolution for better performance? Hell, I bet the XB1X could run the game at 60fps at 1080p. Maybe some parity clause from Sony, or a resolution mandate for the 1X from Microsoft? Maybe some custom solution is in place on the PS4 Pro?
 

thelastword

Banned
What's all this hoopla about, can't we see the real elephant in the room........? Originally, Leadbetter said it was 1800p on both; I didn't get to watch that video until much later than when this thread was posted, but I would surmise that most of the arguments centered around that......

The problem here is that, here again, DF has guessed wrong or given inaccurate information, and they may still be doing it now......It's the reason why I said this on page 5 in this very thread....

"I will say however, that I absolutely agree, when you say XBONEX should win if a game is not optimized, which Anthem looks to be, by and large, so maybe DF got the rez count wrong.........NxGamer says the XBONEX version is running at 2160p CB instead of 1800CB, but I guess I'll wait on the master rez counters (Vgtech) to confirm..."

And this was after watching the video in the OP and watching NX's take that day, that was before DF's writeup went up, even updating their information from the video.......Perhaps I alerted them to it and they did more thorough work, but it's strange that yet again NX was right and people write him off and swear on DF, he never got a mention, nothing.....Now I personally swear on Vgtech, I trust their Rez counting implicitly, and history would show why, but I think these outlets deserve some serious respect here.....

To put things into perspective, DF said RE7 was native 4k on XBONEX, and 1800p native on PRO, when that just cannot be......This is opposed to Vgtech's findings (They said it was CB4k-1800CB, XBONEX VS PRO)....Yet now, all of a sudden RE2 is 1620CB on both machines, so what happened to native.....It's the same story for DMC5....Vgtech said DMC5 is CB4k on XBONEX whilst DF says it's native 4k....How could DMC5 be native 4k 60fps on XBONEX? certainly not with these effects, this is not Forza 7 which runs 4k 60fps on a RX 480, so it simply makes no sense...

Now here again, Leadbetter is saying Anthem is 4k Native on XBONEX, when previously he couldn't even tell the two versions apart.....How could that be when a 2080ti with a beast of a CPU is running this around the 40's at 4k...Don't these guys know the divide in FPS from a GTX 1080 to an RTX 2080ti at times, with a beastly Intel i7 CPU no less, but they think an XBONEX decked with a downclocked RX 480 and an 8-core Jaguar is only in deficit of 20fps vs a 2080ti rig at 4K? You really can't make this up......

Now whilst NX, who does this part time is able to get the differences between PRO and XBONEX right away in his analysis, we have DF just guessing away and blaming Post Processing like it's their first rodeo.........ATD, Attention to detail? This should not be guesswork, like forumners do, and it's one of the reasons why I'm a bit miffed......And of course now, there will be all sorts of excuses, but I've never heard the less successful tech outlets make any excuses whatsoever, they just deliver........That is why when Vgtech tested BLOPS 3 at 1080p 60fps on PRO, everybody thought that was a great find and played that mode till the patch came, whilst DF was simply in a "PRO performs lower than base PS4 mode" to raise ire.......So here again, looks like a rush to get a video up, but not going in depth.......I just can't understand why tech guys are so opposed to going into a menu or going through the options and testing everything thoroughly before they publish.......and it's not even like these are hard tasks, just go and uncheck a few options and test.....It's why Leadbetter's attitude towards 1080p 60fps modes on PRO through the OS puzzles me, sometimes he pretends that it's such a chore, as if he has to go tinkering in .Ini and .cfg files or do mods to get things working.....

You have no control over what devs offer, that's why options are great, yet, it's equally great for tech guys to go through menu and graphics options first and don't simply blame PP or make such excuses like it's their first analysis......We have been there before, Redout, even Sea of Thieves on base XBONES, and so many others I've highlighted, VGtech had no issues getting these right on their first try...... I'd take thorough analyses and accurate stats over guesswork anyday, especially when some analyses seems to be more about "their expectations from a console when it performs worse or blaming the devs" instead of ensuring the numbers they're giving are accurate to begin with......Less politics, more accurate analyses, is what people require....
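For anyone keeping score on the numbers being argued about here, a quick sketch (plain C, just arithmetic; the "half" factor is the usual rough rule for checkerboarding) of how many pixels each mode actually shades per frame:

#include <stdio.h>

int main(void) {
    /* Checkerboard rendering shades roughly half the target grid each frame. */
    struct { const char *name; int w, h; double factor; } modes[] = {
        { "native 2160p", 3840, 2160, 1.0 },
        { "2160p CB",     3840, 2160, 0.5 },
        { "native 1800p", 3200, 1800, 1.0 },
        { "1800p CB",     3200, 1800, 0.5 },
    };
    for (int i = 0; i < 4; i++) {
        double mpix = modes[i].w * (double)modes[i].h * modes[i].factor / 1e6;
        printf("%-14s ~%5.2f Mpixels shaded per frame\n", modes[i].name, mpix);
    }
    return 0;
}

Native 2160p is roughly double the shading work of 2160p CB and nearly three times 1800p CB, which is why the native-versus-checkerboard call is not a small detail to get wrong.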
 

onQ123

Member
I'm still going with the SDK update because even at a higher resolution Base PS4 is running a lot better than base Xbox. It's either that or these games are using techniques that PS4 is best suited for vs Xbox One.
'

Sony is probably more aggressive trying to squeeze the juice out of PS4 so that games like Days Gone & Last Of Us 2 will run smoothly when they are released.
 

onQ123

Member
Probably hasnt seen the dead or alive stats that came out today. 1080p on pro and up to 4k on 1x

Seen it, & it says something about 60fps on PS4 Pro while not being able to hit 60fps at all times when running in 4K on Xbox One X. Sounds to me like the rule about PS4 Pro games not being able to run worse or look worse than the base console is in play.
 

AnotherOne

Member
I'm still going with the SDK update because even at a higher resolution Base PS4 is running a lot better than base Xbox. It's either that or these games are using techniques that PS4 is best suited for vs Xbox One.
'

Sony is probably more aggressive trying to squeeze the juice out of PS4 so that games like Days Gone & Last Of Us 2 will run smoothly when they are released.


I disagree... it's definitely not the SDK update, it's the work of FP16, I'm sure of it.
 

Bogroll

Likes moldy games
I'm still going with the SDK update because even at a higher resolution Base PS4 is running a lot better than base Xbox. It's either that or these games are using techniques that PS4 is best suited for vs Xbox One.
'

Sony is probably more aggressive trying to squeeze the juice out of PS4 so that games like Days Gone & Last Of Us 2 will run smoothly when they are released.
How about they just can't be bothered with the ESRAM anymore, too much hassle.
 

onQ123

Member
I disagree... it's definitely not the SDK update, it's the work of FP16, I'm sure of it.

While FP16 can also speed up a console without double-rate FP16, like the base PS4, I don't think that's the case here, because it would have also sped up Xbox One & Xbox One X.
 

DeepEnigma

Gold Member
Yeah...I don't like milking stuff at all but, alas, sometimes you have to do what the people want (and lots of people were begging for info on the base consoles). :\

Well, that's to be expected, is it not? That's the larger user base of the two consoles.

As for the gaming populace, calm down on the milking requests and let dark focus on some sweet DF Retro, thanks!
 

EviLore

Expansive Ellipses
Staff Member
ATD, Attention to detail? This should not be guesswork, like forumners do

They are forumers though. GAF has been the HQ of this sort of pixel peeping debate for 20 years. By all means, call out faulty data or conclusions, but you seem to be setting unreasonable standards of perfection in some instances. They have limited resources and often aren't benefiting from the crowdsourced forum discussions with these video analyses.
 

Gavin Stevens

Formerly 'o'dium'
Come on EL, this guy (TLW) stirs the pot so much he may as well use a spoon branded “Sony”. Never once have I seen him post anything worthwhile without bigging up Sony and putting everything else down, each time without listening to facts, reasoning or common sense.

He’s a pain, and is talked about not just here, but other forums, Twitter etc. He’s well known.

There’s fanboys, then there’s... whatever THIS is. It’s not healthy, it’s actually embarrassing, and he has derailed too many threads and baited too many users now...
 

Jigsaah

Gold Member
My opinion: it's the fault of the developer. Remember when Titanfall 2 dropped its Xbox One X patch and it ran worse than the PS4 Pro version? They ended up dropping a subsequent patch and boom... fixed.

This is what should, and I expect will, happen to Anthem.
 
Come on EL, this guy (TLW) stirs the pot so much he may as well use a spoon branded “Sony”. Never once have I seen him post anything worthwhile without bigging up Sony and putting everything else down, each time without listening to facts, reasoning or common sense.

He’s a pain, and is talked about not just here, but other forums, Twitter etc. He’s well known.

There’s fanboys, then there’s... whatever THIS is. It’s not healthy, it’s actually embarrassing, and he has derailed too many threads and baited too many users now...

I think this is a little hyperbolic. You’ve never once seen him post something backed up by facts and posted informatively? Literally not once?
 

Gavin Stevens

Formerly 'o'dium'
I think this is a little hyperbolic. You’ve never once seen him post something backed up by facts and posted informatively? Literally not once?

Facts? No. He posts useless information that is a fact, but has nothing to do with the topic at hand. You can sit down and tell him the Xbox has more flops than the Pro and he will spend the next 4 hours talking about how that’s wrong, quoting whatever way he can, to make his point. Even though the original question will never be answered.

I could say right now the damn plastic is of higher quality on an Xbox, and he would find me 17 scientists who each tell him that no, the plastic actually contains devils blood, and so it’s not actually higher quality.

Not a single time have I ever seen him just act like, well, a person.
 

Jigsaah

Gold Member
I think this is a little hyperbolic. You’ve never once seen him post something backed up by facts and posted informatively? Literally not once?

It is hyperbolic, but there's still some truth to what he's saying. In recent days, there have been conversations regarding shills and anti-xbox/pro-sony bias. I believe The Last Word epitomizes this.
 
Facts? No. He posts useless information that is a fact, but has nothing to do with the topic at hand. You can sit down and tell him the Xbox has more flops than the Pro and he will spend the next 4 hours talking about how that’s wrong, quoting whatever way he can, to make his point. Even though the original question will never be answered.

I could say right now the damn plastic is of higher quality on an Xbox, and he would find me 17 scientists who each tell him that no, the plastic actually contains devils blood, and so it’s not actually higher quality.

Not a single time have I ever seen him just act like, well, a person.

Another question, do you really follow him on twitter? How did that start? What are these other forums he has laid waste to? I’m honestly curious to research it myself, never knew he had such a following and such an audience.
 
It is hyperbolic, but there's still some truth to what he's saying. In recent days, there have been conversations regarding shills and anti-xbox/pro-sony bias. I believe The Last Word epitomizes this.

I never said there wasn’t truth to it. Not what I said at all. You don’t need to tell me this.
 

Mass Nerder

Banned
Nope. Back on-topic, please.
Another question, do you really follow him on twitter? How did that start? What are these other forums he has laid waste to? I’m honestly curious to research it myself, never knew he had such a following and such an audience.


Also, I am convinced that he is related to one of the mods. People get banned here for MUCH less than TLW's horse-crap postings.
 

Gavin Stevens

Formerly 'o'dium'
Reply banned from thread. Off-site gossip is irrelevant; engage with the user's positions or move along. Stop the dogpiling.
I don’t follow him on Twitter? I would rather follow Hitlers Holiday Snaps.

But I do read many people commenting about how much of a joke he is that he’s allowed to post here, while a LOT of people have been banned for waaaaaaaaay less.
 

thelastword

Banned
I'm still going with the SDK update because even at a higher resolution Base PS4 is running a lot better than base Xbox. It's either that or these games are using techniques that PS4 is best suited for vs Xbox One.
'

Sony is probably more aggressive trying to squeeze the juice out of PS4 so that games like Days Gone & Last Of Us 2 will run smoothly when they are released.
I'm sure there were updates done to the SDK, and I'm sure some resources have gone back to devs, but I'm not sure that alone constitutes better performance, yet, the consoles are customized differently, moreso on the PRO, so every bit helps, but I'm still thinking the XBONEX should be on top........

I still think people overestimate XBONEX's power.....I remember PS4 vanilla used to run games at a higher framerate, with more effects and a better framerate consistently, unless it was a crapshoot like RE-R2 and parity titles or gimped titles like Witcher 3...Yet we can't say anything like that about Anthem because it has the marketing with MS.....So yes, there's a possibility it gets fixed and runs better on XBOX, but so far, the stats/results are what they are..........

Just finished watching the latest Anthem video from DF, and Leadbetter said "PRO runs higher, but not necessarily smoother". When XBONEX hits 30fps it feels better because it is capped, but the game barely runs at 30fps on XBONEX, ha ha.....
 

Shmunter

Member
Yeah...I don't like milking stuff at all but, alas, sometimes you have to do what the people want (and lots of people were begging for info on the base consoles). :\

I watch most of your videos, never too much of a good thing.


I'd love to, but we don't have the hardware to do it. Monitors are one of those things that are tough to buy since they take up a lot of space, and since we mostly work out of our homes, it's a tough one. If I needed a new monitor I'd find one that fits the bill. My current monitor supports FreeSync, but it doesn't work over HDMI, and its range extends well above 60Hz but only slightly below it, so it wouldn't help with Xbox One X anyway. Hopefully at some point one of us will get a better FreeSync display.

I could be wrong, but aren't 2018 Samsung TVs FreeSync-capable?
 

onQ123

Member
I'm sure there were updates done to the SDK, and I'm sure some resources have gone back to devs, but I'm not sure that alone constitutes better performance, yet, the consoles are customized differently, moreso on the PRO, so every bit helps, but I'm still thinking the XBONEX should be on top........

I still think people overestimate XBONEX's power.....I remember PS4 vanilla used to run games at a higher framerate, with more effects and a better framerate consistently, unless it was a crapshoot like RE-R2 and parity titles or gimped titles like Witcher 3...Yet we can't say anything like that about Anthem because it has the marketing with MS.....So yes, there's a possibility it gets fixed and runs better on XBOX, but so far, the stats/results are what they are..........

Just finished watching the latest Anthem video from DF, and Leadbetter said "PRO runs higher, but not necessarily smoother". When XBONEX hits 30fps it feels better because it is capped, but the game barely runs at 30fps on XBONEX, ha ha.....

LOL, but yeah, I guess he is just trying to drive home the fact that he wants a 30fps cap on PS4.

I know there is a difference in the way PS4 is made vs Xbox One, but it's up to the SDK to let the devs take advantage of the difference.
 
What kind of PS4 SDK changes do you guys expect in 2019?

AFAIK, it's still 6.5 Jaguar cores, 4.5GB RAM and god knows how much GPU % is free for games (XB1 during the Kinect era was like 90%).

Have they released any more resources than that?
 

onQ123

Member
What kind of PS4 SDK changes do you guys expect in 2019?

AFAIK, it's still 6.5 Jaguar cores, 4.5GB RAM and god knows how much GPU % is free for games (XB1 during the Kinect era was like 90%).

Have they released any more resources than that?


Maybe better utilization of the CPU's vector units.
 

onQ123

Member
Elaborate plz?

Jaguar CPU supports AVX assembly since... day 1.

And this isn't even a PS4 exclusive feature to begin with.

I know it's not exclusive; I'm saying Sony might have dug deeper into the CPU to get more out of it, like ND/ICE dug deeper into the SPEs near the end of the PS3's life.
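For context on what "digging deeper into the vector units" tends to mean in practice, here's a minimal, hypothetical sketch using AVX intrinsics (generic x86 code, not from any console SDK; Jaguar supports AVX, though it issues 256-bit ops as two 128-bit halves internally):

#include <stdio.h>
#include <immintrin.h>   /* AVX intrinsics; compile with -mavx */

int main(void) {
    float a[8]   = { 1, 2, 3, 4, 5, 6, 7, 8 };
    float b[8]   = { 10, 20, 30, 40, 50, 60, 70, 80 };
    float out[8];

    /* One vector instruction adds eight floats at once instead of looping. */
    __m256 va = _mm256_loadu_ps(a);
    __m256 vb = _mm256_loadu_ps(b);
    __m256 vc = _mm256_add_ps(va, vb);
    _mm256_storeu_ps(out, vc);

    for (int i = 0; i < 8; i++)
        printf("%.0f ", out[i]);
    printf("\n");
    return 0;
}

The real work is keeping data laid out so hot loops stay full of instructions like these, which is the kind of squeezing the ICE-team comparison implies.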
 

c0de

Member
Elaborate plz?

Jaguar CPU supports AVX assembly since... day 1.

And this isn't even a PS4 exclusive feature to begin with.
The amateurs at Microsoft of course don't know about Jaguar CPU features, they still use the 32 bit mnemonics, only Sony is able to use all features of their hardware :D
 

ethomaz

Banned
What kind of PS4 SDK changes do you guys expect in 2019?

AFAIK, it's still 6.5 Jaguar cores, 4.5GB RAM and god knows how much GPU % is free for games (XB1 during the Kinect era was like 90%).

Have they released any more resources than that?
I don't think any GPU resources are used for anything except the game while gaming.

It's 100% available to the game while your screen is showing the game... of course, if you shift to display the system menu or another app it will use a bit, but that only happens when the game is in the background.
 

c0de

Member
Yet somehow they can't tell the difference, so did Sony make checkerboard rendering better on PS4 Pro?
The only thing that I can see is the technical proficiency of DICE and their Frostbite engine. There is no evidence hinting at anything else.
 

Redneckerz

Those long posts don't cover that red neck boy
What's all this hoopla about, can't we see the real elephant in the room........? Originally, Leadbetter said it was 1800p on both; I didn't get to watch that video until much later than when this thread was posted, but I would surmise that most of the arguments centered around that......

The problem here is that, here again, DF has guessed wrong or given inaccurate information, and they may still be doing it now......It's the reason why I said this on page 5 in this very thread....
In what way has DF given inaccurate information which suggests it's a consistent thing they are doing? DF provided a guided level of analysis based on available information at the time. This means that they may end up being incorrect, as they were in the Redout situation. And that was only because Redout employs a unique resolution every 30 metres or so on the track, similar to a snapshot system really.

What I absolutely despise about these kinds of posts is that they are implying there is more to it. VGTech is no better than DF or NX Gamer; in my opinion they supplement each other. What you do is highlight a clear favorite for one (VGTech) and a clear disdain for another (Digital Foundry). Hell, you even state as much later in your post: ''Perhaps I alerted them to it and they did more thorough work, but it's strange that yet again NX was right and people write him off and swear on DF, he never got a mention, nothing.....Now I personally swear on Vgtech, I trust their Rez counting implicitly, and history would show why, but I think these outlets deserve some serious respect here.....''

Tell me: Does that put you in a position others can rely upon? I don't believe it does.

Your belittling of DF is more antagonistic in nature than actually helpful and, frankly, unneeded. Enjoy VGTech all you want, but these attempts to raise bad blood, when DF themselves have stated repeatedly that they see NX and VG as colleagues, make posts like these sound incredibly bitter.

Like clockwork lol & people still trying to derail the thread
I am so glad that one of your own predictions came true and you felt the need to quote yourself for it.

Also, I am convinced that he is related to one of the mods. People get banned here for MUCH less than TLW's horse-crap postings.
Too bad your conclusions have little in the way of actual evidence. I am not fond of TLW either, especially in the way he arrives at conclusions, but if I was going to accuse him of being a mod in disguise, I'd better have something to show for it.

I'm sure there were updates done to the SDK, and I'm sure some resources have gone back to devs, but I'm not sure that alone constitutes better performance, yet, the consoles are customized differently, moreso on the PRO, so every bit helps, but I'm still thinking the XBONEX should be on top........

I still think people overestimate XBONEX's power.....I remember PS4 vanilla used to run games at a higher framerate, with more effects and a better framerate consistently, unless it was a crapshoot like RE-R2 and parity titles or gimped titles like Witcher 3...Yet we can't say anything like that about Anthem because it has the marketing with MS.....So yes, there's a possibility it gets fixed and runs better on XBOX, but so far, the stats/results are what they are..........

Just finished watching the latest Anthem video from DF, and Leadbetter said "PRO runs higher, but not necessarily smoother". When XBONEX hits 30fps it feels better because it is capped, but the game barely runs at 30fps on XBONEX, ha ha.....
Can we just add that the Pro's unlocked framerate in 1080p mode is not really that amazing either, that neither console platform delivers really stable performance, and that the base XBO has atrocious performance outdoors? I am literally talking FC3-style last-gen quarrels here, with a near-consistent 20 fps.

I refuse to believe that the different memory config is to blame for that, with 8 GB DDR3 and 32 MB ESRAM.
 