
Why is Microsoft using a Nvidia GTX 7xx in the Xbox One demo units?

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
Notice the silver on the 780. That is nowhere to be found in the PC pic. It is a 680.

I'm not going by the hard-to-make-out pic, I'm going by the comments of the guy that took the pic, he said it was a 780.
 

coldfoot

Banned
Maybe I misunderstood your point. I thought you were making the point that a much, much better GPU was negated by access to lower level APIs.
That's not what I meant at all, although GPU overhead is real, it's not Titan -> 7790 level LOL.
I meant using PCs would be fine since the GPU commands in Xbone code would be plain DX11.1 and would work on Windows whether it's AMD or Nvidia.
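To illustrate, a minimal sketch (plain desktop D3D11, nothing console-specific; the boilerplate is standard Windows SDK code): device creation never names a vendor, so the same build runs on whatever card is installed.

```cpp
// Plain D3D11 initialization: no vendor check anywhere. Any GPU whose
// driver exposes feature level 11_1 will accept this, AMD or Nvidia.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    const D3D_FEATURE_LEVEL requested = D3D_FEATURE_LEVEL_11_1;
    D3D_FEATURE_LEVEL granted;
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                    // default adapter: whatever is in the box
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr, 0,
        &requested, 1,
        D3D11_SDK_VERSION,
        &device, &granted, &context);

    if (SUCCEEDED(hr)) {
        std::printf("feature level 0x%x granted, vendor never asked\n",
                    static_cast<unsigned>(granted));
        context->Release();
        device->Release();
    }
    return 0;
}
```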
 
Notice the silver on the 780. That is nowhere to be found in the PC pic. It is a 680.

It amazes me how much people pay attention to the details of the card's shell. Unless it is labeled GTX Titan or 680, I would only know by going to my computer properties lol.
 

sleepykyo

Member
Do people honestly think this is some type of conspiracy? I'd believe such a theory if Microsoft's games were the only graphically intense games at the show, but 3rd party games weren't exactly looking too shabby. Unless you think 3rd party developers are sharing the burden of the coverup.

I wouldn't put it past them. Ubisoft did the same thing before, where either AC3 or Crysis 2 was running on a PC and then crashed. Or the time Peter Hines claimed the PS3 version of Skyrim was at parity. Or the time Square Enix shopped 360 buttons onto the PS3 version of FF13. The video game industry seems to have no qualms about deception at all.
 

Durante

Member
I meant using PCs would be fine since the GPU commands in Xbone code would be plain DX11.1 and would work on Windows whether it's AMD or Nvidia.
Though that's true, performance portability, particularly of custom shader code, is not nearly that straightforward. If I was making a game for a device with GCN architecture, I'd prefer developing on a 7xxx AMD GPU.
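To make that concrete, a toy sketch (the widths are public architecture facts: GCN runs 64-wide wavefronts, Nvidia 32-wide warps; the rest is made up for illustration):

```cpp
// Why shader tuning doesn't port: a compute threadgroup sized to exactly
// one GCN wavefront (64 threads) runs as two 32-thread warps on Nvidia,
// so occupancy and latency hiding change even though the same DX11.1
// Dispatch() call "works" on both vendors.
#include <cstdio>

constexpr unsigned kGcnWavefront = 64;  // AMD GCN SIMD width
constexpr unsigned kNvWarp       = 32;  // Nvidia warp width

// Number of threadgroups needed to cover `items` work items.
unsigned groupsFor(unsigned items, unsigned groupSize) {
    return (items + groupSize - 1) / groupSize;  // ceiling division
}

int main() {
    const unsigned pixels = 1920 * 1080;
    std::printf("64-wide groups (GCN-friendly): %u groups\n",
                groupsFor(pixels, kGcnWavefront));
    std::printf("32-wide groups (warp-friendly): %u groups\n",
                groupsFor(pixels, kNvWarp));
    return 0;
}
```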
 
I know I keep saying this in every thread but I think people should prepare for a definite difference between what they saw at E3 and what they get at launch.
 

Crisium

Member
I'm not going by the hard-to-make-out pic, I'm going by the comments of the guy that took the pic, he said it was a 780.

Ok. But he's wrong. It's a bit blurry, sure, but the gray is quite obvious and leaves no room for misinterpretation:

http://i.ytimg.com/vi/LA8OJUdV_D0/0.jpg

Zoom into the OP's pic again. There is no way you would not see that silver. I'm not trying to be hostile, I just want to know what card it really is and it's not a 780.
 

dr_rus

Member
...Which actually brings us to the question of how long do you think it'll take for PS4/XBO PC emulators to appear this time around? I mean, the code is already native in the next generation, all you have to do is to emulate API calls which shouldn't be too hard - at least not even close to being as hard as PS2 or PS3 or Xenon PC emulation.
Interesting times ahead.
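Very roughly, something like this (all the console-side names below are invented for illustration; nobody outside MS knows the real ones):

```cpp
// High-level emulation sketch: since XB1/PS4 game code is already native
// x86, an "emulator" mostly has to supply PC implementations of the
// console APIs. The struct and function names here are hypothetical.
#include <cstdio>

// Pretend import table a console binary expects to be given:
struct ConsoleGfxApi {
    void (*draw_indexed)(unsigned index_count);
    void (*present)();
};

// PC-side shims that would forward to real D3D11 in a real project:
void pc_draw_indexed(unsigned index_count) {
    std::printf("-> ID3D11DeviceContext::DrawIndexed(%u, 0, 0)\n", index_count);
}
void pc_present() {
    std::printf("-> IDXGISwapChain::Present(1, 0)\n");
}

int main() {
    // The emulator fills the import table with shims; the game's own
    // x86 code then runs natively, no CPU emulation needed.
    ConsoleGfxApi api{pc_draw_indexed, pc_present};
    api.draw_indexed(36);
    api.present();
    return 0;
}
```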
 

Ploid 3.0

Member
Ugh... can you imagine if this actually happens.

Or that the difference between the PS4 and the X1 is bigger than expected.

If it's anywhere near true, it can't be that bad. Hideo Kojima mentioned it might be possible for him to make MGS5 look better than the trailer showed for the next gen consoles, although it is a cross gen game. I believe that any possible plus in using the better hardware was mainly to have a smooth framerate and max 1080p resolution, making the games that use that setup very presentable. You remember the complaints early games like the Halos with poor framerate got? Think about how bad it would be to introduce your new hardware with games that have crappy framerates.
 

Vestal

Gold Member
I thought there were smarter people around here. If they are emulating the Xbox One, of course you need more powerful hardware; you are EMULATING.

It's like GAF has taken crazy pills this week
 

RE_Player

Member
Rumored to be 1/7 but I'd rather have them allocate 1.5-2 just to be safe and then give space back later in the gen.

Yup I feel the same way. The RAM split for PS3 is different now than in years past after tons of patches to the OS. While these are the numbers now I can see them improving in years to come.
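Roughly how I'd picture that working (everything below is hypothetical, made-up names and numbers; the point is that titles query the budget rather than hardcoding it):

```cpp
// Toy sketch of the "give space back later in the gen" idea: the OS
// reservation shrinks with firmware updates, as happened on PS3, and
// games read the current budget at boot.
#include <cstdio>
#include <cstdint>

// Hypothetical firmware call: how much RAM the OS keeps for itself.
uint64_t os_reservation_bytes(int firmware_major) {
    // Launch firmware reserves a fat safety margin; later updates
    // hand some of it back to games.
    return (firmware_major < 3 ? 2ull : 1ull) << 30;  // 2 GB early, 1 GB later
}

int main() {
    const uint64_t total = 8ull << 30;  // 8 GB of unified RAM
    for (int fw : {1, 3}) {
        std::printf("fw %d: games get %llu MB\n", fw,
                    (unsigned long long)((total - os_reservation_bytes(fw)) >> 20));
    }
    return 0;
}
```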
 

StevieP

Banned
I know I keep saying this in every thread but I think people should prepare for a definite difference between what they saw at E3 and what they get at launch.

What's worse is when they said "in engine" and ... well yeah. People are not going to be pleased at launch. lol
 

FuturusX

Member
I wouldn't put it past them. Ubisoft did the same thing before, where either AC3 or Crysis 2 was running on a PC and then crashed. Or the time Peter Hines claimed the PS3 version of Skyrim was at parity. Or the time Square Enix shopped 360 buttons onto the PS3 version of FF13. The video game industry seems to have no qualms about deception at all.

FF13 button gate was amazing :)
 

TheD

The Detective
That's not what I meant at all, although GPU overhead is real, it's not Titan -> 7790 level LOL.
I meant using PCs would be fine since the GPU commands in Xbone code would be plain DX11.1 and would work on Windows whether it's AMD or Nvidia.

If the xbone is just using straight DX11.1 commands that can be run by a PC you can kiss coding to the metal goodbye.

I thought there were smarter people around here. If they are emulating the Xbox One, of course you need more powerful hardware; you are EMULATING.

It's like GAF has taken crazy pills this week

You would not be emulating it, it would be fucking stupid to emulate x86 on x86 unless you really had to!
 

HoosTrax

Member
So Nvidia now has a case for all that trash talk earlier?
Nvidia's got your salt right here:

[image]
 
First thing that came to my mind was games spec'd for lower end cards running on higher end cards for stability.

Take a game like HL2. You can run that looking virtually identical on a Titan and a 9800 GT, with the exception of resolution and out-of-game configurations. My guess is these dev units are built with the intention of having performance-problem-free gameplay for reviewers. The actual games will still look very similar on final hardware.
 

wildfire

Banned
The Ryse dev is on record (at B3D) that the demo and floor demos are all XB1 hardware; good for his honesty. We should expect honesty, not accept lies and then help them spin by parroting "dev unit" and "it happens all the time". What a bunch of BS.

This should be in the OP. I was giving MS latitude because what they are doing isn't abnormal. But to claim one thing while doing another should be called out.
 

Ploid 3.0

Member
...Which actually brings us to the question of how long do you think it'll take for PS4/XBO PC emulators to appear this time around? I mean, the code is already native in the next generation, all you have to do is to emulate API calls which shouldn't be too hard - at least not even close to being as hard as PS2 or PS3 or Xenon PC emulation.
Interesting times ahead.

It's going to be hard to emulate Xbox One games without the cloud. I bet the full games aren't on the disc, like Diablo 3. Maybe you get the rest, or a key item, from the cloud, which is stored in temporary memory on the Xbox and deleted if the 24-hour check doesn't go through.
 
I thought GAF was a firm believer that Carmack's "coding to the metal" gave you 2x the performance of a PC. So there should be no problem using a 2x more powerful gfx card, right?

In any case, I don't see why this is a big deal. As multiple people have pointed out, E3 2005 had non-dev-kit hardware displaying games too, so there's already precedent. The only thing you could say is that they are using this to run games at a higher resolution etc., overstating the specs, but that's a pretty hard accusation to prove and fundamentally the onus is on you to prove it.
 

Vestal

Gold Member
If the xbone is just using straight DX11.1 commands that can be run by a PC you can kiss coding to the metal goodbye.



You would not be emulating it, it would be fucking stupid to emulate x86 on x86 unless you really had to!
You still need to emulate the underlying Xbone OS
 

iamblades

Member
If the xbone is just using straight DX11.1 commands that can be run by a PC you can kiss coding to the metal goodbye.

Coding to the metal, in the traditional sense of assembly optimization, has mostly been gone for years, and it is certainly gone now that the Xbone OS is running on top of a hypervisor alongside a whole other OS. It is too much work for too little result, and it requires the kind of in-depth hardware knowledge that very few of today's programmers have.

The main advantage of a unified hardware platform for games has always been that you know down to the last byte exactly how much memory you have available to you. This piece won't change.
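A toy sketch of that guarantee (the budget constant is made up; the point is that on a console the pool size is a fixed, known fact for every retail unit):

```cpp
// What a fixed hardware budget buys you: carve the entire game pool up
// front and treat any overrun as a content bug, not a runtime surprise.
#include <cstdint>
#include <cstdio>
#include <cstdlib>

class Arena {
    uint8_t* base_;
    size_t   size_;
    size_t   used_ = 0;
public:
    explicit Arena(size_t size)
        : base_(static_cast<uint8_t*>(std::malloc(size))), size_(size) {}
    ~Arena() { std::free(base_); }

    // Bump allocation: no fragmentation, no hidden heap growth.
    void* alloc(size_t bytes) {
        if (used_ + bytes > size_) return nullptr;  // budget blown: fix the content
        void* p = base_ + used_;
        used_ += bytes;
        return p;
    }
};

int main() {
    Arena game_memory(64ull << 20);                    // stand-in for the real console budget
    void* level_geometry = game_memory.alloc(48ull << 20);
    std::printf("level geometry at %p -- same guarantee on every retail unit\n",
                level_geometry);
    return 0;
}
```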
 

Timedog

good credit (by proxy)
The xbone is turning out to be a trainwreck of just mindblowing proportions. It's pretty amazing to watch.
 

TheD

The Detective
You still need to emulate the underlying Xbone OS

That would not need a powerful GPU!

Coding to the metal, in the traditional sense of assembly optimization, has mostly been gone for years, and it is certainly gone now that the Xbone OS is running on top of a hypervisor alongside a whole other OS. It is too much work for too little result, and it requires the kind of in-depth hardware knowledge that very few of today's programmers have.

The main advantage of a unified hardware platform for games has always been that you know down to the last byte exactly how much memory you have available to you. This piece won't change.

I meant coding to the metal on the GPU (avoiding DX calls to a driver); you can still code in assembly if you are running on top of a hypervisor.
It is a hypervisor, not something like a Java VM.
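To put that in code terms (ordinary SSE intrinsics, nothing console-specific): the instructions below compile to real x86 and execute natively whether or not a hypervisor sits underneath, whereas a Java-style VM would be interpreting or JIT-ing portable bytecode.

```cpp
// A hypervisor virtualizes privileged operations, not your instruction
// stream: unprivileged code like this SIMD math hits the real CPU as-is.
#include <immintrin.h>
#include <cstdio>

int main() {
    // Hand-written SIMD -- the kind of low-level work that still runs
    // natively under a hypervisor.
    __m128 v  = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);
    __m128 sq = _mm_mul_ps(v, v);

    float out[4];
    _mm_storeu_ps(out, sq);
    std::printf("%.0f %.0f %.0f %.0f\n", out[0], out[1], out[2], out[3]);
    return 0;
}
```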
 

StevieP

Banned
Thicker than DX11.1 on PC?

I don't think so. Thicker than most consoles prior, however, I would wager (the whole box works through a hypervisor). Throw away the "2x more powerful than equivalent hardware" cards now, though, in all cases.

Does this apply to the PS4 games as well?

Yes. When you hear the phrase "in engine", be wary. It happened many, many times.
 

Salacious Crumb

Junior Member
No it's not. It's a GTX 680, plain and simple. I know what a GTX Titan or 700 series looks like in a case, and that's not it. There is no silver casing on the one shown in the picture.

More than that, doesn't the "GEFORCE GTX" logo light up green on the Titan-style coolers?
 

Vestal

Gold Member
That would not need a powerful GPU!
You all think everything is some conspiracy.

They call up HP and say, hey, I need hardware with these minimum requirements, what have you got? OK, sure, send me 50 of those. Thanks, bye. As long as they are staying true to the visuals and performance of the console, there is NOTHING TO SEE HERE.

Seriously, wtf is this. People are condemning MS like they are hiding something, when this company flat out came out and said all the shit you all seem to hate so much. They didn't hide it till 2 days before release; they gave you the info upfront for you to digest and make an informed decision
 

Paradicia

Member
A console that's still in development running on a dev kit specced to console settings in order to keep parity at a huge event like E3? I for one am shocked.
 