
Why is Microsoft using an Nvidia GTX 7xx in the Xbox One demo units?

HoosTrax

Member
More than that, doesn't the "GEFORCE GTX" light up green on the Titan-style coolers?
Yes. And there's also the fact that on those cards, there's almost no gap between the "GEFORCE GTX" text and the top (i.e. backside) of the card, just a sliver of silver color. There's a massive amount of black between the text and the top/back of the cooler in the pic.

Don't know why some people are ignoring all evidence that it's not a Titan or a 700 series.
 

ekim

Member
Early devkit documentation states that MS was aiming for the practical performance of a GTX 680. So what you can brute-force on a 680 should work on an XBONE.
 

TheD

The Detective
you all think everything is some conspiracy.

they call up hp say hey I need this hardware with these minimum requirements what you got?? Ok sure send me 50 of those. Thanks bye. Aslong as they are staying true to the visuals and performance of the console there is NOTHING TO SEE HERE.

seriously wtf is this. People are condeming MS like they are hidding something, when this company flat out came out and said all the shit you all seem to hate so much. They didn't hide it till 2 days before release, they gave you the info upfront for you to digest and make an informed decision

So instead of responding to what I stated in a normal manner, you make a crazy "leave MS alone" post full of horrible spelling and grammar mistakes....
 

Crisium

Member
A console that's still in development running on a dev kit specced to console settings in order to keep parity at a huge event like E3? I for one am shocked.

Using a card that is more than twice as fast is parity?

Games that a GTX 680 can keep at 60fps, an HD 7790 cannot.

I guess the good news is that they should be able to just halve the frame rate to 30fps and keep the same visual quality. Good thing most games advertised 60fps... anyone?
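For rough context, here is a back-of-the-envelope peak-throughput comparison (shader count × 2 ops/clock × clock speed). The GTX 680 and HD 7790 figures are their public specs; the Xbox One figure uses the leaked/rumored 768 shaders at 800 MHz the thread is working from, so treat it as an assumption:

```python
# Peak single-precision throughput in TFLOPS: shaders * 2 ops/clock * clock (GHz) / 1000.
# The Xbox One entry is the rumored spec circulating at the time, not a confirmed figure.
def peak_tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000

cards = [
    ("GTX 680 (1536 cores @ 1.006 GHz)", peak_tflops(1536, 1.006)),   # ~3.09
    ("HD 7790 (896 SPs @ 1.0 GHz)", peak_tflops(896, 1.0)),           # ~1.79
    ("Xbox One, rumored (768 SPs @ 0.8 GHz)", peak_tflops(768, 0.8)), # ~1.23
]
for name, tflops in cards:
    print(f"{name}: {tflops:.2f} TFLOPS")
```

On raw numbers a 680 is roughly 2.5x the rumored Xbox One GPU, which is the gap being argued over here; how much of that a closed platform claws back is the real dispute.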
 

Kerub

Banned
Because it's the closest thing in terms of PC hardware that would be close to Xbox One cloud performance. Not close to performance equity, of course, but just to give a general idea. PS4 could be compared to a PS3 in higher resolution, hence the budget price for budget gamers.
 
Early devkit documentation states that MS was aiming for the practical performance of a GTX 680*. So what you can brute-force on a 680 should work on an XBONE.

*Not including frame rate, level-of-detail switching, resolution, anisotropic filtering, and anti-aliasing.
 

Zeth

Member
you all think everything is some conspiracy.

they call up hp say hey I need this hardware with these minimum requirements what you got?? Ok sure send me 50 of those. Thanks bye. Aslong as they are staying true to the visuals and performance of the console there is NOTHING TO SEE HERE.

seriously wtf is this. People are condeming MS like they are hidding something, when this company flat out came out and said all the shit you all seem to hate so much. They didn't hide it till 2 days before release, they gave you the info upfront for you to digest and make an informed decision

I agree wholeheartedly with this post. This kind of thing is present at every E3, more so in console reveal years. This is just fuel for the fire regarding the Xbone dog-piling.
 

Sean*O

Member
Everyone at MS has turned into an idiot, or they feel so untouchable they just do not care. This whole mess could have been avoided by simply building the demo kiosks without the damn 'Bling Bling' window display case.
 

ekim

Member
Using a card that is more than twice as fast is parity?

Games that a GTX 680 can keep at 60fps, an HD 7790 cannot.

I guess the good news is that they should be able to just halve the frame rate to 30fps and keep the same visual quality. Good thing most games advertised 60fps... anyone?

You know that the final box won't have the same overhead as on a PC? You can easily save performance in a closed environment.
 

Caayn

Member
you all think everything is some conspiracy.

they call up hp say hey I need this hardware with these minimum requirements what you got?? Ok sure send me 50 of those. Thanks bye. Aslong as they are staying true to the visuals and performance of the console there is NOTHING TO SEE HERE.

seriously wtf is this. People are condeming MS like they are hidding something, when this company flat out came out and said all the shit you all seem to hate so much. They didn't hide it till 2 days before release, they gave you the info upfront for you to digest and make an informed decision
As much as I agree with you, I doubt that people will stop. Ever since the Xbox One reveal, people have been screaming at every chance they got whenever Microsoft did or announced something.

lol my 680 will run circles around the Xbone.
Yet you forget that games aren't optimized to run on a GTX 680, whereas they are optimized to run on the Xbox One's GPU, which greatly improves the Xbox One's performance.
 

Paradicia

Member
Using a card that is more than twice as fast is parity?

Games that a GTX 680 can keep at 60fps, an HD 7790 cannot.

I guess the good news is that they should be able to just halve the frame rate to 30fps and keep the same visual quality. Good thing most games advertised 60fps... anyone?

At console settings? Sure. The likes of Forza will still run at 60fps when launch comes.

The card has some headroom, which is pretty common in any dev kit.
 

TheD

The Detective
You know that the final box won't have the same overhead as on a PC? You can easily save performance in a closed environment.

It will not be even close to that level, especially if the API is even thicker than the 360's.
 

Crisium

Member
Anything is possible. But when even CrossFire 7790s are no match for a 680, we'd better have some sexy secret sauce.

And some might even argue that a 7790 is too generous based on current leaked specs, but I think it's the closest card we can assign to it.
 

ekim

Member
lol my 680 will run circles around the Xbone.

So a GPU specced similarly to the PS3's could run Uncharted 2 with all the overhead of PC APIs and drivers? I don't think so. People should stop comparing the raw FLOPS of a PC GPU and a console GPU.
 

TheD

The Detective
So a GPU specced similarly to the PS3's could run Uncharted 2 with all the overhead of PC APIs and drivers? I don't think so. People should stop comparing the raw FLOPS of a PC GPU and a console GPU.

PS3 first-party games use a lot of the Cell's power.
 

Vestal

Gold Member
So instead of responding to what I stated in a normal manner, you make a crazy "leave MS alone" post full of horrible spelling and grammar mistakes....
gtfo with the spelling bs, I'm on a freaking phone atm. If that's the best you've got as a response, then you might as well not respond at all.

You guys are pulling every little thing and trying to make a big deal out of nothing.

When you are emulating calls to hardware, you need more powerful hardware to do it. If the calls are coded particularly for an AMD GPU, then that may create a larger overhead to compute on an Nvidia GPU.
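For what it's worth, the "translation" idea being described would look roughly like the sketch below: a thin shim that accepts calls shaped like one platform's API and re-issues them through the host's, paying a per-call conversion cost. This is purely illustrative pseudocode with hypothetical names, not anything from an actual devkit:

```python
# Illustrative shim: forward "console-style" draw calls to a host PC renderer.
# Every call pays a translation cost, which is the overhead being debated here.
# All class and method names are hypothetical.

class HostPCRenderer:
    """Stands in for the PC-side graphics API (e.g. Direct3D on Windows 7)."""
    def draw_indexed(self, index_count, state):
        pass  # the real work would happen in the driver


class ConsoleAPIShim:
    """Accepts calls shaped like the planned console API, re-issues them on the host."""
    def __init__(self, host):
        self.host = host

    def submit_draw(self, cmd):
        # Translation step: remap state the console API would expose directly
        # into whatever the host API understands. On real hardware this
        # conversion would not exist at all; here it is pure per-call overhead.
        state = self._translate_state(cmd["gpu_state"])
        self.host.draw_indexed(cmd["index_count"], state)

    def _translate_state(self, gpu_state):
        return dict(gpu_state)  # placeholder for real format conversion


shim = ConsoleAPIShim(HostPCRenderer())
shim.submit_draw({"index_count": 3000, "gpu_state": {"blend": "opaque"}})
```

Whether the E3 kiosks were doing anything like this, or just running PC builds at console-ish settings, is exactly what the thread is arguing about.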
 
Windows and API overhead brings a GTX 680 down to the effective performance of an HD 7770 with a custom OS? Man, that's a lot of wasted potential. MS needs to work on those ridiculously inefficient APIs.
 
gtfo with the spelling bs, I'm on a freaking phone atm. If that's the best you've got as a response, then you might as well not respond at all.

You guys are pulling every little thing and trying to make a big deal out of nothing.

When you are emulating calls to hardware, you need more powerful hardware to do it. If the calls are coded particularly for an AMD GPU, then that may create a larger overhead to compute on an Nvidia GPU.

Oh indeed, apparently you need 4 TFLOPS to do it.
 
gtfo with the spelling bs, I'm on a freaking phone atm. If that's the best you've got as a response, then you might as well not respond at all.

You guys are pulling every little thing and trying to make a big deal out of nothing.

When you are emulating calls to hardware, you need more powerful hardware to do it. If the calls are coded particularly for an AMD GPU, then that may create a larger overhead to compute on an Nvidia GPU.

Then why use Nvidia?
 

sk3tch

Member
It could easily just be that they tuned the software (games) down a bit in the settings to approximate Xbox One capabilities. Not necessarily a reason to get too bent out of shape here. Especially considering the earlier rumors about Microsoft having eSRAM yield issues...
 

TheD

The Detective
gtfo with the spelling bs, I'm on a freaking phone atm. If that's the best you've got as a response, then you might as well not respond at all.

You guys are pulling every little thing and trying to make a big deal out of nothing.

When you are emulating calls to hardware, you need more powerful hardware to do it. If the calls are coded particularly for an AMD GPU, then that may create a larger overhead to compute on an Nvidia GPU.

You did not respond to my post in your post (yet you quoted me), instead going on a rant!
I am damn well going to call you out for it!

Or they could have done what they are likely to be doing... running a native PC version!
They have no reason at all to get systems with Nvidia GPUs!
It would take longer to write a translation layer than it would to just find an AMD GPU!
 

Paradicia

Member
You did not respond to my post in your post; I am damn well going to call you out for it!

Or they could have done what they are likely to be doing... running a native PC version!

Why would a dev preparing a launch title under pressure go out of their way to build a native PC version? Makes no sense.
 

TheD

The Detective
Why would a dev preparing a launch title under pressure go out of their way to build a native PC version? Makes no sense.

Likely because they also have a PC version of the game; it is common practice to do so (very much so with a console that is not 100% done yet).
 

Woo-Fu

Banned
I guess Nvidia really is TWIMTBP (The Way It's Meant to Be Played). Doh, beaten by 2 posts.

I don't see any problem with using better equipment in development machines; it saves a lot of time.
 

Vestal

Gold Member
You did not respond to my post in your post (instead going on a rant!); I am damn well going to call you out for it!

Or they could have done what they are likely to be doing... running a native PC version!
They have no reason at all to get systems with Nvidia GPUs!
It would take longer to write a translation layer than it would to just find an AMD GPU!
Holy hell, are you serious?!?! Do you think they have had final hardware since the beginning? Of course not, they developed an emulator which, you guessed it, emulates the performance of your final hardware.

You are not going to write an OS to run natively on a PC if you have a perfectly good OS in W7 to run it.

Guess why this is not running on W8: because these emulators were probably developed for W7 with specific calls to emulate behaviour.
 
From Twitter:

Jonathan_Blow: P.S. It is not true as the article says that "all E3 demos run on hi-end PCs". The Witness was running on PS4 dev hardware, and it looked to me like all the other PS4 games were running on dev kits as well.

artenvelope (sucker punch dev): @Jonathan_Blow Yup, we were definitely on a dev kit.

To people talking about emulation: no, early PC devkits don't "emulate" the final specs, they simply have custom drivers/firmware that mimic the APIs planned to be used when proper dev kits are available.
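Loosely sketched, the distinction this post is drawing: true emulation interprets the target machine's work instruction by instruction, costing many host operations per emulated one, while an API-mimicking driver/library just implements the same function signatures natively and runs at full speed. A toy illustration with hypothetical names:

```python
# Toy contrast between "emulation" and "API mimicry". All names hypothetical.

# 1) Emulation: interpret target-machine instructions in software. Each
#    emulated instruction costs many host instructions, which is why emulating
#    a modern GPU on similar-class hardware is impractical.
def run_emulated(program, regs):
    for op, a, b, dst in program:  # fetch/decode/execute loop
        if op == "add":
            regs[dst] = regs[a] + regs[b]
        elif op == "mul":
            regs[dst] = regs[a] * regs[b]
    return regs

# 2) API mimicry: expose the interface the final platform will have, but
#    implement it directly on the host. No per-instruction penalty.
def console_style_copy(dst, src):
    dst[:] = src  # a native host operation behind a console-shaped signature

print(run_emulated([("add", 0, 1, 2), ("mul", 2, 2, 3)], {0: 2, 1: 3, 2: 0, 3: 0}))
# -> {0: 2, 1: 3, 2: 5, 3: 25}

buf = [0] * 3
console_style_copy(buf, [7, 8, 9])  # runs at native speed
```

Which is why "a PC with custom drivers" and "an emulator" are different claims, even though the thread uses the words interchangeably.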
 

eorl

Banned
At least it means we could see a PC port in the future? Or that this is what we can expect from PC titles, which is really cool.
 

TheD

The Detective
Holy hell, are you serious?!?! Do you think they have had final hardware since the beginning? Of course not, they developed an emulator which, you guessed it, emulates the performance of your final hardware.

You are not going to write an OS to run natively on a PC if you have a perfectly good OS in W7 to run it.

Guess why this is not running on W8: because these emulators were probably developed for W7 with specific calls to emulate behaviour.

Maybe you should stop speaking about something you clearly know nothing about?!

Making an emulator takes a fuck ton of work and not having the final hardware would make it even harder!

It is easier to write a PC version!

Also, anything programmed for Windows 7 should work on Windows 8; otherwise, hundreds of thousands of PC programs would be broken on Win8!
 

SilentFlyer

Member
From Twitter:

Jonathan_Blow: P.S. It is not true as the article says that "all E3 demos run on hi-end PCs". The Witness was running on PS4 dev hardware, and it looked to me like all the other PS4 games were running on dev kits as well.

artenvelope (sucker punch dev): @Jonathan_Blow Yup, we were definitely on a dev kit.

I remember The Order: 1886's developer stated in a GT interview that they brought the devkit and it was running in-game, in-engine, in real time.
 

x3sphere

Member
I remember Project Spark had an actual Durango kit version string displayed in the bottom-right corner of the screen, so I don't think all XB1 demos were running off PCs.
 

Vestal

Gold Member
Maybe you should stop speaking about something you clearly know nothing about?!

Making an emulator takes a fuck ton of work and not having the final hardware would make it even harder!

It is easier to write a PC version!

Also, anything programmed for Windows 7 should work on Windows 8; otherwise, hundreds of thousands of PC programs would be broken on Win8!
Dude, wtf are you smoking??

What do you think the devs were working with when they got the first few dev kits?!?! A big PC with an emulator. Just like to develop for a Windows Phone, they gave you an emulator.

Stop talking about stuff you have no clue about, and I do mean absolutely no fucking clue about.
 

beril

Member
Why wouldn't they use Nvidia if they're demoing on PC anyway?
Does it really make any difference when coding in DirectX? Other than AMD's drivers being crap and their shader compilers being a lot more fussy, that is.
 

L.O.R.D

Member
Maybe those PCs are just using an Xbox One emulator to run the games; there are a lot of booths that use the actual Xbox One.


If that's true, hackers could easily hack the system and make an emulator, especially since the Xbox One uses an x86 processor.
 

TheD

The Detective
Dude, wtf are you smoking??

What do you think the devs were working with when they got the first few dev kits?!?! A big PC with an emulator. Just like to develop for a Windows Phone, they gave you an emulator.

Stop talking about stuff you have no clue about, and I do mean absolutely no fucking clue about.

Kind of funny that the system was not a fucking dev kit then!
It was just a normal PC!

The reason you need an emulator to dev for Windows Phone is because they run on fucking ARM CPUs!
 