
Why is Microsoft using a Nvidia GTX 7xx in the Xbox One demo units?

Rolf NB

Member
Just like how the Xbox 360's were running on 2 Power Mac G5's

Except these were actually slower hardware. They were running Radeon X800 Pros: 16 pipe, 475MHz, 160 million transistor GPUs. Xenos is a 16 pipe, 500MHz, 232 million transistor GPU. So the real thing ended up faster, with more features to boot. And they were even from the same vendor!

What they're using now to "demonstrate" Xbox One games is at least twice as fast as actual Xbox One hardware on the GPU side.
 

v1oz

Member
Judging by the cooler, that's either a GTX 780 or a Titan (probably the Titan, as only it has enough VRAM), both of which are more than 3x faster than the Xbox One GPU. Why would MS use PCs with such a massive performance jump over the real hardware?

To make their hardware look competitive at E3. I'm sure the games wouldn't have run smoothly if they had used unfinished dev kit hardware to show them.


But Microsoft is in trouble: even their TV features and UI were faked at the earlier Xbox One reveal.
 

Erasus

Member
Ever since his meta-meta RAM posts, Horse Armour has become my favourite poster.

Anyway, the interesting point here isn't that it runs on a PC (everything does!), it's that it runs on a PC with a Nvidia GPU.

(And, well, that the PC is running Windows 7 and not 8 is also somewhat amusing)

This. They are ofc running at console settings, 1080p and 60fps for now, because they have had devkits for a while, know how that works, and can get the same settings out on XOne.

I'm sure it's not running anything higher, as people would notice.
The weird thing is that it's not an AMD card, as the games have still been made with devkits in mind = AMD code.
 

XiaNaphryz

LATIN, MATRIPEDICABUS, DO YOU SPEAK IT
NVIDIA AM SAL ... Wait, what?

I would expect them to at least use AMD, haha

The weird part is that it's not an AMD GPU.

Despite the hardware on the devkits and retail units, developer workstation PCs will likely still have Nvidia Quadro cards. Nvidia pretty much owns the professional/enterprise/corporate development market.
 
Do people honestly think this is some type of conspiracy? I'd believe such a theory if Microsoft's games were the only graphically intense games at the show, but 3rd party games weren't exactly looking too shabby. Unless you think 3rd party developers are sharing the burden of the coverup.

No conspiracy, just another case of MS rushing shit and not being ready.

Dev kits sure, but this is definitely not a dev kit :lol
 
Except these were actually slower hardware with fewer features. They were running Radeon X800 Pros: 16 pipe, 475MHz, 160 million transistor GPUs. Xenos is a 16 pipe, 500MHz, 232 million transistor GPU. At least they were from the same vendor!

What they're using now to "demonstrate" Xbox One games is at least twice as fast as actual Xbox One hardware on the GPU side.

That's funny, because I keep hearing that the final games ran worse (link?) even though they ran on worse hardware...
 
The lowest-end 700 series card right now is the GTX 770, which is a little faster than a GTX 680.

So it's still got a lot more power than what's in the Xbone.

This is assuming they don't have access to unreleased tech from nvidia, though since they are in bed with AMD for their consoles I doubt they would.

oh of course. my brain didn't register just how new that series is. doh.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
I would bet 80% of PS4 titles and demos are running on PC builds too.
It's how the industry works at E3, especially with the next-gen consoles coming out in 4~5 months.

Unless you have evidence, don't spread FUD. Dev units with near final hardware have been out in the wild for six months on both sides. There is literally no reason to run a PC version unless you are just trying to pull the wool over people's eyes. These are PC versions of games, not dev units running console code.

It doesn't matter who does it and which E3 it is, it is deceiving unless the demo is clearly labeled.

The Ryse dev is on record (at B3D) that the stage demo and floor demos are all XB1 hardware; good on him for his honesty. We should expect honesty, not lies, and we shouldn't help them spin by parroting "dev unit" and "it happens all the time". What a bunch of BS.

And the 2005 E3 with PPC/X800 dev kits is not at all like a full-blown PC with Nvidia. One is a dev kit with like architecture and target specs; the other is a way overpowered PC with a different architecture, OS, etc.
 
Because GPU doesn't matter when you're blocked from accessing low level functions and have to use DirectX.

Pleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejokingpleasebejoking
 

Smash88

Banned
Also keep in mind the only cards that are labelled like that are the:

Titan, 780, and 770. But the latter two only launched within the past few weeks. It is unlikely they threw this together that recently, so these dev kits are almost certainly running the GTX Titan.

The $1000 GTX Titan

The leaked Xbone specs make it most comparable to the $140 Radeon 7790.

Make your own conclusion as to what a Titan in there really means.

I have a Titan, so it looks like I'm set if that's the case. But looking at it closer, it doesn't look like a Titan. Definitely looks like a GTX 680.

nvidia-gtx-680.jpg


versus.

GTX-TITAN.jpg


That's definitely a GTX 680, and not the GTX 700 series or Titan.
 

Lord Error

Insane For Sony
They're all like that even at the PS4 booths. Adam Boyes said there are few actual PS4s, and just getting one for GB after hours took some convincing.
First of all, there's no evidence of this. Second, even if it were true, it would only be fine if the hardware in the PCs closely represented the retail console hardware, which in this case it grossly misrepresents.
 

Fezan

Member
I think they are running a virtualized Xbone SDK on these PCs. That's why they are using high-end gfx cards: because they are emulating the games.
 

Crisium

Member
I have a Titan, so it looks like I'm set if that's the case. But looking at it closer, it doesn't look like a Titan. Definitely looks like a GTX 680.

Aye. The Titan has some silver color. The GTX 680 though?

http://cdn2.ubergizmo.com/wp-content/uploads/2012/03/nvidia-gtx-680.jpg

I agree with you. That's the best match. The GTX 670, 660, etc. also say "GeForce GTX" on them, but their power connectors are closer to the middle. The power connectors in that pic are at the end, so it can only be the 680.

I've been looking at pictures of GTX cards for a while now... based on the color and power connector locations, it is almost certainly a 680 like you said.

Which is still more than double an Xbone.
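Rough back-of-envelope math backs that "more than double" up. A sketch, assuming the GTX 680's public spec-sheet numbers and the leaked/rumored Xbox One figures (768 shaders at 800MHz, which were not final at the time), so treat the ratio as an estimate:

```python
# Theoretical single-precision throughput: shaders x 2 FLOPs/clock (FMA) x clock.
def tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz * 1e6 / 1e12

gtx_680 = tflops(1536, 1006)  # GTX 680: 1536 CUDA cores @ 1006 MHz base clock
xbone   = tflops(768, 800)    # Xbox One (leaked specs): 768 shaders @ 800 MHz

print(f"GTX 680:  {gtx_680:.2f} TFLOPS")   # ~3.09
print(f"Xbox One: {xbone:.2f} TFLOPS")     # ~1.23
print(f"Ratio:    {gtx_680 / xbone:.1f}x") # ~2.5x
```

Paper FLOPS ignore memory bandwidth and API overhead, but even this crude ratio puts a 680 at roughly 2.5x the console's GPU.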
 

Erasus

Member
From what I understand, devs do framerate optimization last. Even if they could get enough devkits there, the games would not run or look like final games. They are running the games on console settings but on PC hardware to get the framerate up.

Hell, I remember a story from Naughty Dog about how they had to fix a streaming problem on Uncharted 2 the week before it went gold.
 

Fezan

Member
From what I understand, devs do framerate optimization last. Even if they could get enough devkits there, the games would not run or look like final games. They are running the games on console settings but on PC hardware to get the framerate up.

Hell, I remember a story from Naughty Dog about how they had to fix a streaming problem on Uncharted 2 the week before it went gold.
It was U3.
 

commedieu

Banned
From what I understand, devs do framerate optimization last. Even if they could get enough devkits there, the games would not run or look like final games. They are running the games on console settings but on PC hardware to get the framerate up.

Hell, I remember a story from Naughty Dog about how they had to fix a streaming problem on Uncharted 2 the week before it went gold.

From what I understand, you didn't read the thread at all.
 
This won't end well for Micro once they rush and throw this shit in the oven with DRM and all that. Oh man, oh man, MS is gonna cook the shittiest cake ever.
 

iamblades

Member
It is also getting to the point where you can't even argue that this is acceptable because of 'coding to the metal' on consoles, when the Xbone has 3 different OSes running at the same time and a significant amount of system resources (3GB of RAM and who knows how many clock cycles) dedicated to the application side at all times to facilitate the TV and multimedia stuff.

Console OSes are quickly becoming as bloated as PC OSes.

If game devs are willing to show prerendered footage and outright lie that it is a gameplay demo, we should hardly be surprised that they would show the PC version of a game and pass it off as the console version.
 

tarheel91

Member
Despite the hardware on the devkits and retail units, developer workstation PCs will likely still have Nvidia Quadro cards. Nvidia pretty much owns the professional/enterprise/corporate development market.

Workstation graphics cards are not what you'd use to test-run a game on. Workstation cards are for 3D modeling, not actually running games. I think I also saw a comparison in which a 7990 outperformed pretty much everything in typical workstation applications.
 
I am not surprised: damage control against the more powerful PS4 (also more mature hardware, according to #TRUTHFACT) at the most important pre-launch conference about games.

People are going to be disappointed when they see the games running in the XB1 compared to what was shown in the conference.
 

Caayn

Member
Have you considered that they are dealing with unoptimized games and might just be brute-forcing them with processing power?
This people, this.

Come on people, use a bit of logic for once. Do you honestly think that Microsoft is stupid enough to fake graphics that their final product isn't capable of achieving? Besides, if the games were directly coded on a GTX 7XX or Titan, the games would look better.
 

Crisium

Member
Looks like a EVGA 670 FTW 2GB

The reference 670s have the power connectors toward the middle; the EVGA 670 FTW uses a 680 board design. I mean, that could be the card, but they'd probably go reference, which makes the 680 more likely. And that EVGA card is as fast as a stock 680 anyway... so we're back in 680 territory no matter what. More than double the Xbone.

The Xbone will ketchup with secret sauce, don't worry guys.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
I think they are running a virtualized Xbone SDK on these PCs. That's why they are using high-end gfx cards: because they are emulating the games.

xzibit_meme_7219-s510x334-157605.jpg


I doubt that. Why bother with alpha dev kits last year if they could just virtualize the whole thing with any PC? They shipped 7970s to give devs GCN cores to work with.

So it is OK to fake a demo with a PC, just say it is unoptimized code, and in four months it will be fine on the real hardware? That's not fair to the people who showed real software on real hardware, warts and all.
 

iamblades

Member
Workstation graphics cards are not what you'd use to test-run a game on. Workstation cards are for 3D modeling, not actually running games. I think I also saw a comparison in which a 7990 outperformed pretty much everything in typical workstation applications.

The AMD cards do outperform the Nvidia ones in OpenCL. The problem is that some apps still rely on CUDA, so that's why they use Nvidia. Nvidia are very good at locking people into proprietary APIs like CUDA and PhysX.
 

mkenyon

Banned
This people, this.

Come on people, use a bit of logic for once. Do you honestly think that Microsoft is stupid enough to fake graphics that their final product isn't capable of achieving? Besides, if the games were directly coded on a GTX 7XX or Titan, the games would look better.
MAGIC PIXIE DUST NOT YET APPLIED.

MUST USE PROPER GAMING MACHINE.
 

charsace

Member
People are going nuts for nothing right now. Devs can run Windows Phone games on emulators. MS could have built an X1 emulator for devs.
 