
Xbox Series X Won't Be Competitive With NVIDIA's DLSS Unless There's a MIRACLE ~ Digital Foundry

Shmunter

Member
@Hendrick's OK, let me be honest. I thought you were Cerny, and I was role-playing as a Sony fanboy. Sorry for the confusion.
Everyone is a Sony fanboy, even if they don’t admit it. Why? Because every gamer appreciates quality.

Case in point.

Before:
PC fans - “PS games are walking simulators, they suck.”
After signs of Sony porting games to PC:
PC fans - “Oh yeah, yeah, gimme, gimme.”

Before:
PC fans - “Sony's SSD is pointless, it's overkill, it's all about dem flops.”
After PC tech for next-gen I/O is announced:
PC fans - “Oh yeah, yeah, gimme, gimme the SSD I/O.”

The above is purely factual and unfolding in real time.

Must clarify: I don't mean to generalise about PC fans - many are nothing like the above.
 
Last edited:
Everyone is a Sony fanboy, even if they don’t admit it. Why? Because every gamer appreciates quality. [...]
You can check my post history. I've said HZD felt like a knock-off of an Ubisoft game. But I'll welcome all of Sony's library - more options for everyone who wants to dig their hands in. I just don't like how so many gatekeepers are ruining the forums because they're triggered about this happening.
 
Last edited:

Reallink

Member
This forum thinks that checkerboarding is as good as native 4K though, so isn't DLSS irrelevant?

While this sounds ridiculous in theory, it's mostly true in practice. With consoles, probably >90% of people sit too far from too small a screen to come close to distinguishing native 4K, never mind the difference vs. 1800p checkerboarding. Then there's the reality that >50% of the population needs vision correction.
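A rough back-of-envelope supports this. The sketch below (a hedged estimate, assuming 20/20 vision resolves about one arcminute and a 55" 16:9 panel) computes the distance beyond which adjacent pixels blur together:

```python
import math

def max_resolving_distance_m(diagonal_in, h_pixels, arcmin_per_pixel=1.0):
    """Distance beyond which one pixel subtends less than the eye's
    acuity limit (~1 arcminute for 20/20 vision)."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # width of a 16:9 panel
    pitch_in = width_in / h_pixels                   # size of one pixel
    theta = math.radians(arcmin_per_pixel / 60)      # acuity limit in radians
    return pitch_in / theta * 0.0254                 # small-angle approx, metres

for h in (3840, 3200):  # native 4K vs ~1800p horizontal resolution
    print(h, "->", round(max_resolving_distance_m(55, h), 2), "m")
```

That works out to roughly 1.1 m for native 4K and 1.3 m for 1800p on a 55" set; from a typical ~3 m couch, both are well past the limit, which is the point being made.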
 
Last edited:

Redlight

Member
They're making the mistake of assuming that people who buy consoles will just buy a PC if it gets all the console games. That's not true, never has been, and never will be. The GPU alone that you need for this costs almost as much as the Series X will.

This puzzles me too. It's not an 'either/or' situation. There's some crossover, sure, but for the most part people are either console gamers or PC gamers.

I could afford a gaming PC over a console but I've been down that road before and frankly, it's more hard work than I'm prepared to put in. The ease and certainty of console gaming, from a comfy couch, is super appealing to me.

And before anyone asks, no, I'm not going to put a PC next to my TV.
 
This puzzles me too. It's not an 'either/or' situation. [...] And before anyone asks, no, I'm not going to put a PC next to my TV.
You can have it in the next room over, while comfortably gaming from the couch.
 

vkbest

Member
Not sure what they mean by it not having the hardware. It has machine learning hardware. How does DF explain MS adding HDR to older games? The same hardware that processes the entire screen can be used for upscaling.

The machine learning performance of the Xbox Series X, using the full GPU to accelerate ML, is around half that of the tensor cores in a 2060. So either they (and Sony) have specific hardware for this task, or they will need to implement a software solution; there's no way they're spending that much GPU power on scaling.
 
Last edited:

llien

Member
The machine learning performance of the Xbox Series X, using the full GPU to accelerate ML, is around half that of the tensor cores in a 2060.
Oh, boy, do you want a cookie for spreading FUD?

Dear god...

It is about something called "inference".
You do NOT need tensor cores for inference.

(They do help when training a NN, but that happens in datacenters, not on gamer GPUs.)

Why, on planet Earth, are you making this shit up?
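To make the distinction concrete, here is a minimal sketch (random stand-in weights, not any real upscaler): executing an already-trained network is just matrix multiplies plus a nonlinearity, which ordinary shader ALUs or a phone NPU can handle.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in weights; in reality these come from offline training in a datacenter.
W1, b1 = rng.standard_normal((64, 32)), np.zeros(32)
W2, b2 = rng.standard_normal((32, 3)), np.zeros(3)

def infer(x):
    h = np.maximum(x @ W1 + b1, 0.0)  # hidden layer: one matmul + ReLU
    return h @ W2 + b2                # output layer: one more matmul

print(infer(rng.standard_normal(64)))  # inference = plain multiply-adds
```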
 
Last edited:
They say we don't know if they're working on it. But with Xbox, they say they're confident they are, and since we got a teardown of the Xbox, they know there's no dedicated hardware to do it.


My take was that they were saying that since the RTX cards do it so much better than the XSX, you don't need an Xbox to play their games, and in likely every case the PC will do it better. The PS5 is "in a class of its own" because you still need one to play most of its games. GPU sophistication notwithstanding, you can't play a PS5 game without one, so the point is moot for Sony's machine.

Would a PC do it better? Of course. Can you play PS5 exclusives on your PC? Not yet.
 

On Demand

Banned
I mean that is almost exactly what @VFXVeteran said..... TLOU1.

I added in BB for something nice if they are being nice and I'm being optimistic there.

However I fully expect Sony to think twice about everything after this Nvidia presentation.

There's nothing really for Sony to think about, since they were never going to fully support PC anyway. PlayStation brings in a large amount of revenue and profit for the company. They need to keep selling hardware and making exclusives. They've said as much.

No matter what anybody or any vEterAnzss says otherwise.
 
Last edited:

vkbest

Member
Oh, boy, do you want a cookie for spreading FUD? [...] You do NOT need tensor cores for inference. Why, on planet Earth, are you making this shit up?

That is training the model, but you also need to execute that model, which requires power too. You don't know the per-frame compute cost of executing that model using shaders.
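The worry can at least be bounded with arithmetic. A hedged back-of-envelope (the true per-frame cost of a DLSS-class network is not public, so the 50 GFLOPs figure below is purely an assumption for illustration):

```python
# Assumed, not measured: cost of one 4K upscaling pass run on shaders.
flops_per_frame = 50e9
fps = 60
gpu_flops = 12.15e12  # Xbox Series X FP32 peak

sustained = flops_per_frame * fps   # FLOP/s needed just for inference
share = sustained / gpu_flops       # fraction of the whole GPU consumed
print(f"{sustained / 1e12:.1f} TFLOPS -> {share:.0%} of the GPU")
# ~3.0 TFLOPS -> ~25% of the GPU under these assumptions.
```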
 
Oh look, the DF guys, who "for some reason" do not see DLSS artifacts and got that lovely exclusive Ampere preview, are hyping AI upscaling even further.
How surprising.
What's with your anti-Nvidia rhetoric? I've noticed it in several of your posts now.

Nvidia>AMD. DLSS 1/2>AMD sharpening. Nvidia performance>AMD performance. These pills shouldn't be hard for anyone to swallow at this point.
 
Last edited:

llien

Member
That is training the model, but you also need to execute that model, which requires power too.
It requires LAUGHABLE power that even smartphones have (depending on network complexity, of course), but, more importantly, it does not need tensor cores.

AND Microsoft mentioned in their slides that they have hardware to do inference (and that it needs a laughably small number of transistors), without specifying any metric. So nobody can compare that to what any other product has.

DLSS 1/2>AMD sharpening.
Oh, 1 as well? Quite refreshing.

anti Nvidia rhetoric?
Anti-BS rhetoric; it's just that NV tends to use smoke and mirrors a lot.
Sorry if it hurts your feelings.
 
Last edited:

fybyfyby

Member
Digital Foundry's latest video recapped the NVIDIA 30XX conference.

Regarding the $499 3070, Digital Foundry claims the card will offer superior GPU performance compared to the Xbox Series X, and that is before you even factor in DLSS.

They say Microsoft is working on image upscaling tech similar to DLSS, but that Xbox Series X doesn't have dedicated hardware, and it won't be competitive with what NVIDIA has short of a miracle.


They also point out that we don't know if Sony is working on similar tech yet.

Really disappointing if AMD doesn't have something that comes close to DLSS. I get that they're years behind, but it's such a huge leap that, if this is true, it won't be long before we're getting another refresh.

Full Video


Excerpts




What bothers me more than the XSX or PS5 is AMD's position. I really wanted AMD to be more competitive at the high end next gen, for the sake of GPU prices and market balance.

But it's true that Nvidia's cards (especially the 3070) offer great value for money now. At least officially; I'm looking forward to independent tests. And when AMD releases Big Navi, I will consider the Vega successor.

Intel will probably release its own GPU next year, but I probably won't wait for it.
 
Anti-BS rhetoric; it's just that NV tends to use smoke and mirrors a lot.
Sorry if it hurts your feelings.
I'll take smoke and mirrors all day long with superior ray tracing. You'll get the smoke effects with high-quality particles, while being reflected off those mirrors. All in real time. Even the first RTX demos killed AMD's ray tracing demo. If AMD can't even hold a candle to Nvidia in rasterization, you'll be shocked to see how big the difference is in ray tracing performance alone.
 

CrysisFreak

Banned
Well, why would you expect AMD to pull a legit competitor out of their ass?
These things take time and money, and Nvidia just started much earlier. AMD needs to catch up, and they're not going to do that in a year.
I also don't think this is going to be a deal-breaker in the console space at all.
 

UnNamed

Banned
Oh look, the DF guys, who "for some reason" do not see DLSS artifacts and got that lovely exclusive Ampere preview, are hyping AI upscaling even further.
How surprising.
Simply not true; see for example the Death Stranding video, with visible artifacts on small, fast-moving objects.
 

M1chl

Currently Gif and Meme Champion
So like, what is this?

[attached image]
 

vkbest

Member
It requires LAUGHABLE power that even smartphones have (depending on network complexity, of course), but, more importantly, it does not need tensor cores.

I haven't seen similar scaling (or image reconstruction) done in real time with ML on smartphones yet; have you?

There is software for this on PC/Mac, and it needs several seconds for a single image.

Anyway, iPhones have had tensor cores for years.

I'm sure they can create a simpler model for this, but there's no way it will match the quality of Nvidia's more complex model running on tensor cores.
 
Last edited:
What's with your anti Nvidia rhetoric? I've noticed it in several of your posts now.

Nvidia>AMD. DLSS 1/2>AMD sharpening. Nvidia performance>AMD performance. These pills shouldn't be hard to swallow by anyone at this point.

We really need to wait and see what AMD has got with its RDNA2 cards in about a month's time. They will have an answer to DLSS and reveal their own RT performance/method. All this losing our minds over Nvidia's marketing and nebulous figures is premature.
 

TonyK

Member
DLSS looks awesome. The problem with these features on PC is that they are not standard; only a few games implement them. Any feature present in consoles, by contrast, will be used across all games.

Same for the new I/O technology presented by Nvidia. It looks amazing, but we have to admit it won't be used in all games; it simply doesn't matter to a lot of developers because there won't be a large enough install base from a hardware standpoint. On consoles, again, all games will be built with the new SSD/IO in mind.
 

MrFunSocks

Banned
That's ridiculous. Of course they care. MS gets a cut of everything sold on xbox platforms.
They know that diehard PC gamers aren't going to buy their console, though; that's why they're releasing PC games now. They're making money from an audience they previously didn't reach, and they're not losing money, because people aren't flocking from console to PC.
 

llien

Member
I'll take smoke and mirrors all day long...
Apparently

I haven't seen similar scaling (or image reconstruction) done in real time with ML on smartphones yet; have you?
NN inference is used in many ways; upscaling is just one of them.
But if you insist on image manipulation, Google's Pixel is obviously using it to "enhance" images, and so does the iPhone.
Face detection is mostly, if not exclusively, ML-based.

Anyway, iPhones have had tensor cores for years.
Tensor cores are not needed to do inference.
Again, Microsoft has EXPLICITLY called out that they have a dedicated piece of silicon for that (and how laughably little it needs to work).
There are no metrics on how large a network it could crunch, so there can be no direct comparison to ANYTHING.
It wasn't meant for any single kind of ML activity either (Microsoft called out, god forbid, NPC AI :D).

They will have an answer to DLSS and reveal their own RT performance/method.
I don't know why people care about either. RT... how many games are using that? How many Tesla users are enabling that?
Performance, power consumption and price are the only things that matter to me personally.

Anyhow, I don't think AMD will focus on upscaling tech at all (they have successfully used checkerboard rendering for a while, though, and FidelityFX was also great and cross-GPU); RT will likely be touched on, since it's such a popular buzzword at this point (the way VR was).
 
Last edited:

geordiemp

Member
Seeing the Digital Foundry article paid for by Nvidia going full retard.

Everybody has already seen perfect upscaling; it requires no ML cores and is as good as anybody really wants.

Can anyone pick faults in the temporal application below?

So the right temporal technique is just as good as machine learning, and it's available to all consoles and PCs next year.
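For what it's worth, the core of any temporal technique fits in a few lines. A hedged toy version (no motion-vector reprojection or history clamping, which real TAA-style upscalers need to avoid ghosting):

```python
import numpy as np

rng = np.random.default_rng(2)
truth = rng.random((4, 4))        # the image the accumulator converges to
history = np.zeros_like(truth)    # persistent history buffer

for _ in range(32):
    jittered = truth + rng.normal(0, 0.2, truth.shape)  # one noisy frame
    history = 0.9 * history + 0.1 * jittered            # exponential blend

print(np.abs(history - truth).max())  # error shrinks as frames accumulate
```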
 

Shmunter

Member
Seeing the Digital Foundry article paid for by Nvidia going full retard. [...] So the right temporal technique is just as good as machine learning, and it's available to all consoles and PCs next year.

The Marble demo shown by Nvidia is 1440p/30 as well. IQ is the definitive metric now, no longer pixel counts.
 
This puzzles me too. It's not an 'either/or' situation. There's some crossover, sure, but for the most part people are either console gamers or PC gamers. [...]

When every Xbox game releases day and date on PC, with the potential for much higher graphics settings, the Series X is pitched against gaming PCs. You choose whether the perks of a weaker, streamlined box without a bloated Windows OS are worth it over a more expensive PC.

This is why MS's marketing about raw power has been misplaced again, as they're pairing this message with a 'Game Pass box' strategy. At this point it would be better to go all in and promote the Series S: a cheap streaming box that moves it out of the range of the PS5 and gaming PCs.
 

UnNamed

Banned
There are good alternative solutions to DLSS.
For example, checkerboard rendering is not as good as DLSS, but at the same time it is an upscale with "information", unlike plain upscaling; that's why it looks so good.

Using other information-based upscaling solutions would be good enough.
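As a toy illustration of what "an upscale with information" means, here is a minimal checkerboard-style sketch (static scene, no motion vectors, which real CBR needs): each frame renders half the pixels, and the holes are filled from the previous frame instead of being invented.

```python
import numpy as np

rng = np.random.default_rng(1)
scene = rng.random((8, 8))           # stand-in for a fully rendered frame

yy, xx = np.mgrid[0:8, 0:8]
even = (yy + xx) % 2 == 0            # checkerboard mask
frame_now = np.where(even, scene, 0.0)    # half the pixels, rendered now
frame_prev = np.where(~even, scene, 0.0)  # other half, from the last frame

reconstructed = frame_now + frame_prev    # merge the two half-frames
print(np.allclose(reconstructed, scene))  # True: full detail recovered
```

With motion, the previous-frame samples must be reprojected via motion vectors, which is where the artifacts come from.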
 
Last edited:

geordiemp

Member
The Marble demo shown by Nvidia is 1440p/30 as well. IQ is the definitive metric now, no longer pixel counts.

IQ is the definitive metric; my point is that there is not just one upscaling method to rule them all.

What's amusing is that the Marble demo is 1440p30 upscaled and Land of Lumen is 1440p upscaled. So what do you call them, 4K with DLSS and 4K with temporal?
 

ZywyPL

Banned
DLSS is Nvidia's trump card, and unless AMD and Intel figure out similar tech they won't be much competition for NV. But strangely (and luckily), Ampere prices are so damn attractive this time around, despite there being no competition at all, that it's not even funny.
 

UnNamed

Banned
And Switch 2 will likely have Tensor cores and use DLSS 2.0 (or even the next gen of that). Scenes when a ~3 TF docked console delivers virtually the same visual output as the XSX and PS5.
3 TF on mobile would be very hard. 1024 cores at 1500 MHz (see the quick arithmetic below)? Not impossible, but very hard, especially for Nintendo, which doesn't use high-end hardware. According to the Nvidia roadmap, Nano Next (2021) is for cheaper devices; maybe Orin, but in 2022.

Then you need a mobile CPU as powerful as the Ryzen in the PS5. You can't run next-gen games on an old quad core, even if your game runs at 540p, which is the minimum resolution for DLSS.
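The arithmetic behind that "1024 cores at 1500 MHz" figure (assuming the usual 2 FLOPs per core per cycle for a fused multiply-add):

```python
cores, clock_hz = 1024, 1.5e9
tflops = cores * clock_hz * 2 / 1e12  # an FMA counts as two FLOPs
print(f"{tflops:.2f} TF")             # ~3.07 TF, hence the "~3 TF docked"
```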
 

RoadHazard

Gold Member
I watched the video, and DF thinks the Series X will have some ML capability, quite powerful but not as much secret sauce as DLSS.

They also talked about the PS5's lack of confirmed ML upscaling. They give the impression the PS5 hardware is the laggard, but if you want Sony's games, you have to take the bad with the good. Feels like Sony will be in a Nintendo-like position in terms of power vs. games 🤷‍♀️

Yeah, a 17% difference in GPU performance is definitely the same thing as being an entire generation or more behind.
 