
[Digital Foundry] Xbox Series X Complete Specs + Ray Tracing/Gears 5/Back-Compat/Quick Resume Demo Showcase!

01011001

Banned
Probably, though doesn't MS run all games in a Hyper-V container? Maybe the system can spin up multiple containers. I just thought it was weird for DF to put that information out there. But there were other oddities (why tell us how many CUs are disabled?), so likely nothing.

The biggest issue is RAM though. Each virtual Xbox One would need around 8 GB of RAM.
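A quick sanity check on that concern, as a rough sketch only; the 8 GB figure is the estimate above, and four concurrent instances per box is an assumption, not anything Microsoft has announced:

```python
# Rough sanity check on the RAM concern above. The 8 GB-per-instance
# figure is this post's estimate; four concurrent instances per box is
# an assumption, not an announced xCloud configuration.
total_ram_gb = 16          # Xbox Series X total GDDR6
per_instance_gb = 8        # assumed footprint of one virtual Xbox One
instances = 4              # assumed concurrent streams per box

required_gb = instances * per_instance_gb
print(f"Required: {required_gb} GB vs. available: {total_ram_gb} GB "
      f"-> fits: {required_gb <= total_ram_gb}")
```

On those assumptions, four full Xbox One instances would already overshoot the 16 GB on the box.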
 

CatLady

Selfishly plays on Xbox Purr-ies X
From the article about the new controller on the Xbox website saying it had a modern cable for charging, I got the impression the SX controller was going to have a stupid sealed battery like the POS DS4. Assuming this guy knows wtf he's talking about, it would appear that we will still have AAs. Hallelujah!!!

 

RealGassy

Banned
This looks way better than expected. CPU 8c/16t is great, 16 GB is great, 1 TB SSD. High/stable frequencies.

Hard to pinpoint how powerful the GPU would be. What's the peak power consumption for the GPU?

I love how MS seems to be committed to reducing input latency. And just doing a lot of things right.
 
Last edited:

FireFly

Member
Did you think before posting? Has Microsoft come out and said its console is better than a certain GPU? No. Nvidia said their laptop GPU is better than next-gen consoles. I would much rather listen to the world leader in graphics than some internet fanboy who obviously doesn't have a fucking clue. Nvidia wouldn't make a claim like that unless they knew it for a fact, and they definitely do.
What claim did Nvidia make exactly?

As I understand it, all they did was post a slide with ">NEXT GEN CONSOLE".

What does that ">" mean?
 

Panajev2001a

GAF's Pleasant Genius
You should; reading your comments, you're coming off like a flaming jealous Sony fanboy. Stop, let people be excited and hyped...

Well, thanks for letting me know how I should behave ;). You are free to read into my comments what you want; I am excited about exciting things, reflect on what causes me to ponder, etc... It is not my direct or indirect aim to make people feel bad about their console allegiance, and I am not doing anything to that effect, I think.
So, if you feel less hyped because of what I said, you are free to ignore it and/or have your group PM chat where I cannot (nor really want to) be there to ruin the fun. I do hope you keep discussing, stay hyped, and have fun here, and yes, even tell me off if you believe I am out of line.

Still, you also need to realise that this is not any company’s free marketing platform, and these are not fan safe spaces where anything that is not exclusively positive automatically hurts people's feelings and has to be banned from public discourse.
 

Lort

Banned
What a disappointing demo. Minecraft, seriously?
When your RTX game brings the fastest Nvidia cards to their knees and your console can match it... you want to show that.

This is all about the technical specs, and you'll see the games soon enough...
 
What claim did Nvidia make exactly?

As I understand it, all they did was post a slide with ">NEXT GEN CONSOLE".

What does that ">" mean?
That's the greater-than sign. A mobile variant of the RTX 2080 > next gen. I'd imagine the mobile RTX 2080 would be on par with a desktop 2070.
 

FireFly

Member
That's the greater-than sign. A mobile variant of the RTX 2080 > next gen. I'd imagine the mobile RTX 2080 would be on par with a desktop 2070.
So in what respect is it greater? Features, power consumption, performance? If in performance, performance in what kinds of applications?
 

Cobenzl

Member
Minecraft is the biggest MS IP, so it makes sense. And Minecraft with ray tracing is simply beautiful.
Personally, THE game I want to see running on Series X is Forza Motorsport 8.

Series X looks great, I would have loved to see some proper next-gen games running on it. All in good time I guess....
 

Mista

Banned
Well, thanks for letting me know how I should behave ;). You are free to read into my comments what you want; I am excited about exciting things, reflect on what causes me to ponder, etc... It is not my direct or indirect aim to make people feel bad about their console allegiance, and I am not doing anything to that effect, I think.
So, if you feel less hyped because of what I said, you are free to ignore it and/or have your group PM chat where I cannot (nor really want to) be there to ruin the fun. I do hope you keep discussing, stay hyped, and have fun here, and yes, even tell me off if you believe I am out of line.

Still, you also need to realise that this is not any company’s free marketing platform, and these are not fan safe spaces where anything that is not exclusively positive automatically hurts people's feelings and has to be banned from public discourse.
Are you going to say the same thing tomorrow after the PS5 gets revealed?
 
Last edited:

CrustyBritches

Gold Member
The reasoning behind showing Minecraft was simple. It's a game that demonstrates the effect of ray tracing in an obvious and straightforward manner. In most games you'd struggle to even tell the difference outside of reflections. This is where the "The Future is Chrome" type memes spawned from.
 

Mass Shift

Member
Series X looks great, I would have loved to see some proper next-gen games running on it. All in good time I guess....

I have no doubt we're going to get plenty of proper next-gen showcases very soon; just don't expect to play any of them until late '21.
 

Panajev2001a

GAF's Pleasant Genius
Are you going to say the same thing tomorrow after the PS5 gets revealed?

If people have something not positive to say about it, I personally think it is fine as long as they try to be constructive, not just gloating and trolling the other console's fans, and they are honest (the last is very difficult to judge, of course)... even just a bit. Otherwise we may disagree and may discuss it.

If some people keep posting one-line trolls and fake-concern drive-by crap/lesbian jokes at TLoU2 for the n-th time and just throw shit to try to make other people mad for the lulz? I would react the same way to someone trolling/astroturfing MS threads with the same nasty attitude and tactics.
 
Last edited:

Mista

Banned
If people have something not positive to say about it, I personally think it is fine as long as they try to be constructive, not just gloating and trolling the other console's fans, and they are honest (the last is very difficult to judge, of course)... even just a bit. Otherwise we may disagree and may discuss it.

If some people keep posting one-line trolls and fake-concern drive-by crap/lesbian jokes at TLoU2 for the n-th time and just throw shit to try to make other people mad for the lulz? I would react the same way to someone trolling/astroturfing MS threads with the same nasty attitude and tactics.
Then what's your issue now, since there are people doing what you just said on here? Trolling in Xbox threads is fine, but if it happened in PS threads I'm going to put my warrior pants on? I don't understand...
 

Moses85

Member
From the article about the new controller on the Xbox website saying it had a modern cable for charging, I got the impression the SX controller was going to have a stupid sealed battery like the POS DS4. Assuming this guy knows wtf he's talking about, it would appear that we will still have AAs. Hallelujah!!!



No, no, no!
 

Longcat

Member
From the article about the new controller on the Xbox website saying it had a modern cable for charging, I got the impression the SX controller was going to have a stupid sealed battery like the POS DS4. Assuming this guy knows wtf he's talking about, it would appear that we will still have AAs. Hallelujah!!!




Yeah, unless they change it before release, AA batteries are confirmed. Check 19:02 in this video.
And he says it has a "nice clicky action" dpad, so I guess that answers my question from the previous page. Booo!
 

Panajev2001a

GAF's Pleasant Genius
Then what's your issue now, since there are people doing what you just said on here? Trolling in Xbox threads is fine, but if it happened in PS threads I'm going to put my warrior pants on? I don't understand...

Am I saying nobody is trolling? No; besides, how could I rule it out? Do you expect me to read every post? No.

I do think there is a certain overreaction in some threads as soon as someone breaks the party conga line, this being one of them.
 

pawel86ck

Banned
Gears 5 at 100 fps and 4K ultra? I wonder how they did it; on PC, not even an RTX 2080S can run Gears 5 that smoothly at ultra settings (just around 50 fps, let alone with even higher settings). RDNA2 is either far more efficient than I thought, or the game is using some dynamic resolution or VRS (after all, VRS can offer a 70% boost to performance according to 3DMark).

And it looks like the SSD is even more impressive than the GPU:

The Velocity Architecture utilises the SSD to multiply (2-3x, so around 32-48 GB RAM equivalent) the effective physical memory developers can use. Texture quality on next-gen consoles will skyrocket compared to current gen, and I wonder how much VRAM will be needed to run next-gen ports on PC (Titan RTX 24 GB owners should be fine, however :p).
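For scale, a minimal sketch of that multiplier claim; the 2-3x figure comes from the linked video, not from any official Microsoft spec:

```python
# Minimal sketch of the "effective memory" claim above. The 2-3x
# multiplier is the linked video's claim, not an official figure.
physical_ram_gb = 16                  # Series X total GDDR6
low_mult, high_mult = 2, 3

print(f"Claimed effective memory: {physical_ram_gb * low_mult}-"
      f"{physical_ram_gb * high_mult} GB")
```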
 

FireFly

Member
Performance, ray tracing, etc. Looking at the desktop space, this has held true for current GPUs for several years now.
Well, we know that the 5700 XT is already on par with the 2070 in performance, and that the XSX GPU has 30% more CUs than the 5700 XT, with a 4% lower "boost" clock and a 4% higher game clock. We also know from DF that in Gears 5, performance is already at ballpark 2080 levels.

And we know that the 90W version of the RTX 2080 Max-Q is clocked 35% slower than the desktop RTX 2080, which, if performance scales accordingly, would put it at around the RTX 2060 level.

So I doubt Nvidia was talking about rasterisation performance. They might still have faster raytracing, but them saying their technology is better doesn't prove this, since Nvidia always claim their technology is better than AMD's.
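For anyone who wants to check those ratios, here is a back-of-the-envelope sketch using the publicly known figures (40 CUs, 1905 MHz boost and 1755 MHz game clock for the 5700 XT; 52 CUs at 1825 MHz for the Series X GPU), with the standard CUs x 64 lanes x 2 ops-per-clock FLOPS formula:

```python
# Back-of-the-envelope comparison from the post above, using publicly
# known figures. FLOPS = CUs * 64 shader lanes * 2 ops per clock (FMA).
def tflops(cus, clock_mhz):
    return cus * 64 * 2 * clock_mhz / 1e6

rx5700xt_boost = tflops(40, 1905)   # ~9.75 TF
rx5700xt_game  = tflops(40, 1755)   # ~8.99 TF
xsx            = tflops(52, 1825)   # ~12.15 TF, the quoted 12 TF figure

print(f"XSX vs. 5700 XT (boost clock): {xsx / rx5700xt_boost:.2f}x")
print(f"XSX vs. 5700 XT (game clock):  {xsx / rx5700xt_game:.2f}x")
```

On paper that works out to roughly a 25-35% compute advantage over the 5700 XT, which is consistent with the "ballpark 2080" reading rather than anything beyond it.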
 

Kenpachii

Member
Gears 5 at 100 fps and 4K ultra? I wonder how they did it; on PC, not even an RTX 2080S can run Gears 5 that smoothly at ultra settings (just around 50 fps, let alone with even higher settings). RDNA2 is either far more efficient than I thought, or the game is using some dynamic resolution or VRS (after all, VRS can offer a 70% boost to performance according to 3DMark).

And it looks like the SSD is even more impressive than the GPU:

The Velocity Architecture utilises the SSD to multiply (2-3x, so around 32-48 GB RAM equivalent) the effective physical memory developers can use. Texture quality on next-gen consoles will skyrocket compared to current gen, and I wonder how much VRAM will be needed to run next-gen ports on PC (Titan RTX 24 GB owners should be fine, however :p).


PC has NVMe drives also, mate. The SSD ain't replacing RAM, mate.

But keep dreaming.

Also, a 2080 Ti sits at 60 fps on ultra at 4K. I think they are full of shit. There is no way that thing was running above PC settings at double 2080 Ti performance.

It's most likely a cut-down version.
 
Last edited:
Well, we know that the 5700 XT is already on par with the 2070 in performance, and that the XSX GPU has 30% more CUs than the 5700 XT, with a 4% lower "boost" clock and a 4% higher game clock. We also know from DF that in Gears 5, performance is already at ballpark 2080 levels.

And we know that the 90W version of the RTX 2080 Max-Q is clocked 35% slower than the desktop RTX 2080, which, if performance scales accordingly, would put it at around the RTX 2060 level.

So I doubt Nvidia was talking about rasterisation performance. They might still have faster raytracing, but them saying their technology is better doesn't prove this, since Nvidia always claim their technology is better than AMD's.
I wouldn't put Gears 5 performance in the ballpark of the 2080 at all. The game showed frame drops when the camera was moving towards the ray-traced lighting. The game wasn't running at 4K when they mentioned 100 fps. You'll even see really bad frame drops in Minecraft, and it looks nothing like Minecraft RTX.

If people want to believe Microsoft literally just started working on Minecraft or Gears 5... then what would they have planned to show us if E3, GDC, and all of the conventions were not cancelled? That really sounds fishy.

Also, AMD is debuting its next-gen graphics... in the form of a console... before they even reveal the PC GPUs?! I guess you can't expect much from what we have seen thus far, from Microsoft or from AMD (4xx, 5xx, Vega 56/64, and now 5xxx/XT)... Maybe we'll be proven wrong, but I won't get my hopes up...
 

pawel86ck

Banned
PC has NVMe drives also, mate. The SSD ain't replacing RAM, mate.

But keep dreaming.

Also, a 2080 Ti sits at 60 fps on ultra at 4K. I think they are full of shit. There is no way that thing was running above PC settings at double 2080 Ti performance.

It's most likely a cut-down version.
Yes, but PC SSDs don't use hardware decompression and the whole technology that went into the Velocity Architecture. The video from my previous post explains in detail why developers will be able to use 2-3x as much RAM (to make a long story short, because thanks to the Velocity Architecture they don't need to load nearly as much into RAM).
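As a rough illustration of what the hardware decompression buys, a sketch using the publicly quoted Series X throughput figures (about 2.4 GB/s raw and about 4.8 GB/s effective with the decompression block); the 10 GB working set is just an example, not any real game's footprint:

```python
# Rough sketch of the benefit of hardware decompression, using the
# publicly quoted Series X throughput figures. The 10 GB working set
# is an assumed example, not a measured game footprint.
working_set_gb = 10.0
raw_gb_per_s = 2.4          # quoted raw SSD throughput
effective_gb_per_s = 4.8    # quoted effective throughput with decompression

print(f"Refill at raw speed:        {working_set_gb / raw_gb_per_s:.1f} s")
print(f"Refill with decompression:  {working_set_gb / effective_gb_per_s:.1f} s")
```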
 
Gears 5 at 100 fps and 4K ultra? I wonder how they did it; on PC, not even an RTX 2080S can run Gears 5 that smoothly at ultra settings (just around 50 fps, let alone with even higher settings). RDNA2 is either far more efficient than I thought, or the game is using some dynamic resolution or VRS (after all, VRS can offer a 70% boost to performance according to 3DMark).

And it looks like the SSD is even more impressive than the GPU:

The Velocity Architecture utilises the SSD to multiply (2-3x, so around 32-48 GB RAM equivalent) the effective physical memory developers can use. Texture quality on next-gen consoles will skyrocket compared to current gen, and I wonder how much VRAM will be needed to run next-gen ports on PC (Titan RTX 24 GB owners should be fine, however :p).

PC has NVMe drives also, mate. The SSD ain't replacing RAM, mate.

But keep dreaming.

Also, a 2080 Ti sits at 60 fps on ultra at 4K. I think they are full of shit. There is no way that thing was running above PC settings at double 2080 Ti performance.

It's most likely a cut-down version.

Definitely not 100fps @ native 4k. The PR hype train is so strong, and it's gonna be funny to see it crash and burn when comparisons get made.
 

VFXVeteran

Banned


Thanks for this. It's funny how DF is very careful not to specify what resolution they expect games on consoles to run at. That is a big elephant in the room. I'm not sure people will want to go back to 1080p@30FPS for a full RTX-enabled game. There just isn't enough bandwidth to run at 4k res.
 

pawel86ck

Banned
Definitely not 100fps @ native 4k. The PR hype train is so strong, and it's gonna be funny to see it crash and burn when comparisons get made.
It sure looks really strange. I'm not sure even a 2080 Ti can run Gears 5 at 100 fps (ultra settings). RDNA2 is either far superior to what people thought, or they used some clever tricks and gimmicks just to show off.
 

M1chl

Currently Gif and Meme Champion
I got the impression the SX controller was going to have a stupid sealed battery like the pos DS4 from the article about the new controller on the Xbox website saying it had a modern cable for charging. Assuming this guy knows wtf he's talking about it would appear that we will still have AAs, Hallelujah!!!


Nah, Scorpion has Scorpion in the console; that does not have to mean anything.
 
Thanks for this. It's funny how DF is very careful not to specify what resolution they expect games on consoles to run at. That is a big elephant in the room. I'm not sure people will want to go back to 1080p@30FPS for a full RTX-enabled game. There just isn't enough bandwidth to run at 4k res.
You realize you will see every game in 4K as your output right?

 

StreetsofBeige

Gold Member
Thanks for this. It's funny how DF is very careful not to specify what resolution they expect games on consoles to run at. That is a big elephant in the room. I'm not sure people will want to go back to 1080p@30FPS for a full RTX-enabled game. There just isn't enough bandwidth to run at 4k res.
Please include an option to turn off RTX!

Not every game needs every pixel flashlighting into your retina.
 

Panajev2001a

GAF's Pleasant Genius
Then what's your issue now, since there are people doing what you just said on here? Trolling in Xbox threads is fine, but if it happened in PS threads I'm going to put my warrior pants on? I don't understand...

There are people doing that and they get called out; others are just discussing, and some people are throwing a hissy fit because they thought this thread was on a private official Xbox forum server or something... The latter I have an issue with, and I would with the same behaviour in reverse, but if you are intent on picking a fight over this, I am not sure what I can do about it 🤷‍♂️. Unless I am mistaking your attitude here; sorry about that if so.

Drive-by shitty trolling and then victim-card <Company>GAF shouting is not fine in either camp, but if TBiddy or you want to go through my posts with a fine-toothed comb... fine. Fill your boots :). You may discover I am not... 🤯... perfect ;).
 

FireFly

Member
I wouldn't put Gears 5 performance in the ballpark of the 2080 at all. The game showed frame drops when the camera was moving towards the ray-traced lighting. The game wasn't running at 4K when they mentioned 100 fps. You'll even see really bad frame drops in Minecraft, and it looks nothing like Minecraft RTX.
DF's Gears 5 performance claims were based on the results from the game's internal benchmark tool, so they should be objective. Hopefully DF will be able to release the results, as they say they will be doing another video on this soon. Minecraft was running at 30-60 FPS, and the jitters were caused by the encoding method.

My guess is Microsoft is still saving their firepower for E3-time and these demos were thrown together to take advantage of us all being at home! Is it a coincidence that Sony is doing the same thing tomorrow?
 
Last edited:
The biggest issue is RAM though. Each virtual Xbox One would need around 8 GB of RAM.

Maybe not. Parts of the dash could be paged to SSD or elsewhere. And you aren't going to watch Netflix PiP with the processing and overlay running on this thing.

Game runs on this, your social media shit runs on your phone (you social twat!!).

4 x 5 GB per game means 20 GB.

A cloud instance could manage 20 GB with frequency-reduced 16 Gbit RAM at 12 Gbps.

For all the important game-running shit, that's fine.
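Spelling that arithmetic out as a sketch; the 5 GB-per-game figure is this post's own estimate, and the 320-bit bus (the published Series X width) is reused here only as an assumption for a hypothetical cloud blade:

```python
# Sketch of the server-blade arithmetic above. The 5 GB-per-game figure
# is this post's estimate; the 320-bit bus width is the published
# Series X figure, reused as an assumption for a hypothetical blade.
instances, per_game_gb = 4, 5
needed_gb = instances * per_game_gb                 # 20 GB

bus_width_bits = 320
chips = bus_width_bits // 32                        # 10 x 32-bit GDDR6 chips
capacity_gb = chips * 16 // 8                       # 16 Gbit chips -> 20 GB
bandwidth_gb_per_s = bus_width_bits // 8 * 12       # 480 GB/s at 12 Gbps

print(needed_gb, capacity_gb, bandwidth_gb_per_s)   # 20 20 480
```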

PC has NVMe drives also, mate. The SSD ain't replacing RAM, mate.

But keep dreaming.

Imagine thinking that a PC NVMe drive is the same as what's being talked about here.

Imagine thinking that peak PC performance rates are sustainable in perpetuity regardless of drive, or that just any PC could run the compression, paging and blending operations in real time regardless of performance impact.

Imagine being that guy.

Also: I expect the PS5 to be as forward-thinking, or perhaps even more so. Peak means little; sustained means everything.
 
It sure looks really strange. I'm not sure even a 2080 Ti can run Gears 5 at 100 fps (ultra settings). RDNA2 is either far superior to what people thought, or they used some clever tricks and gimmicks just to show off.

This alone disproves every crazy claim in this thread about being comparable to an RTX 2080. I mean, do people think the Series X is double the performance of a 2080 Ti or RTX Titan?! :messenger_grinning_squinting: :messenger_grinning_squinting: :messenger_grinning_squinting:.

[Three attached benchmark screenshots]
 

CrustyBritches

Gold Member
1. DF stated, "we were shown benchmark results that, on this two-week-old, unoptimised port (Gears 5), already deliver very, very similar performance to an RTX 2080."

2. 16 GB of RAM with all the new memory-saving techniques and SSD tech will be fine. Even the 2080 Super has "only" 8 GB of VRAM, and it should be more performant than the XSX. PC to console memory requirements are apples to oranges (i.e. the X1X has 12 GB RAM total, 9 GB for games; try playing games on a PC with an RX 580 8GB and 4 GB of system memory and see what happens).

3. Path tracing is extremely hard on performance. Quake II RTX path-tracing demo on a 2060 Super:
[Screenshots: Quake II RTX remaster, OpenGL vs. path traced]


From around 1,000 fps to 43 fps at 1080p.
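Put in frame-time terms, using the same figures as above:

```python
# Same Quake II RTX figures as above, expressed as frame times.
fps_raster, fps_path = 1000, 43

ms_raster = 1000 / fps_raster    # ~1.0 ms per frame
ms_path = 1000 / fps_path        # ~23.3 ms per frame

print(f"Rasterised:  {ms_raster:.1f} ms/frame")
print(f"Path traced: {ms_path:.1f} ms/frame "
      f"(~{fps_raster / fps_path:.0f}x the per-frame cost)")
```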
 
Last edited:
I was looking at the photos on Xbox Wire. The console is amazing and all, but I just realized that it doesn't have an optical out port on the back... How am I going to connect my sound system to this now?
 

VFXVeteran

Banned
2. 16 GB of RAM with all the new memory-saving techniques and SSD tech will be fine. Even the 2080 Super has "only" 8 GB of VRAM, and it should be more performant than the XSX. PC to console memory requirements are apples to oranges (i.e. the X1X has 12 GB RAM total, 9 GB for games; try playing games on a PC with an RX 580 8GB and 4 GB of system memory and see what happens).

Nothing is ever enough. We should all know that by now. 16 GB of memory isn't enough for the visuals that everyone wants. The next-gen consoles won't have all available memory just for graphics. The PC has a separate memory pool for CPU stuff (i.e. running the game, loading levels, etc.). If we shoot for ray tracing, it will be memory-bandwidth limited. 1080p will surely be the standard for the consoles, while the high-end GPUs will be able to render at true 4K with much larger memory footprints.
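To put a rough number on the bandwidth point, a sketch using the published Series X figure of 560 GB/s for the fast 10 GB pool; the 60 fps target is only an assumed example:

```python
# Rough per-frame bandwidth budget. 560 GB/s is the published Series X
# figure for the fast 10 GB pool; the 60 fps target is an assumed
# example, not a claim about any particular game.
bandwidth_gb_per_s = 560
target_fps = 60

budget_gb_per_frame = bandwidth_gb_per_s / target_fps   # ~9.3 GB of traffic
print(f"~{budget_gb_per_frame:.1f} GB of memory traffic available "
      f"per {1000 / target_fps:.1f} ms frame")
```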
 

pawel86ck

Banned
Nothing is ever enough. We should all know that by now. 16 GB of memory isn't enough for the visuals that everyone wants. The next-gen consoles won't have all available memory just for graphics. The PC has a separate memory pool for CPU stuff (i.e. running the game, loading levels, etc.). If we shoot for ray tracing, it will be memory-bandwidth limited. 1080p will surely be the standard for the consoles, while the high-end GPUs will be able to render at true 4K with much larger memory footprints.
And you forgot to add that no console will match a 1080 Ti :messenger_tears_of_joy: (y).
 