
VG247: Xbox 720: Blu-ray, 4-6 core CPU, '2 GPUs', Kinect as standard, net required

Erethian

Member
I don't believe the 'always-on' bit for a second.

I don't think a company like Microsoft would deny themselves sales by implementing something like this.

With how often it gets mentioned in rumours, though, some sort of "phone home" system for anti-piracy (or even anti-used games) seems more and more likely.

Plus, when developers are asked about the next generation, so much of what they say is about having the experience integrated into online so they can have more direct communication with the consumer, as well as tying DLC/microtransactions and other additional revenue streams into games, even more so than they are now.
 
Any rumour that suggests a multi-GPU setup as a legitimate solution for next generation can safely be discarded. We have rumour threads with legitimate info in them now; there's no need to waste time on wind-ups like this. Maybe someone has a legit source, but whoever is interpreting the message needs to be able to interpret what is being implied. A console releasing with dual modern GPUs is the wrong interpretation; adding masses of redundancy to a design is not how you build a console.
 

gofreak

GAF's Bob Woodward
2 GPUs? Uh. I can see it being like what Intel did starting with their Arrandale CPUs, or Nvidia's hybrid graphics for laptops: an integrated GPU in the CPU for lower power consumption during normal tasks (not gaming), and a main GPU for gaming. Maybe the weaker, low-power one can assist the main GPU with some tasks too. Otherwise I don't see the point of having two high-end GPUs; the power consumption and heat generated would be impractical for a small console box.

Maybe a thought in light of rumours of a second 'lite' platform. Perhaps two application profiles, one using only a much smaller companion GPU to spare energy consumption. Seems an expensive way to go about things, though.
 

Hazaro

relies on auto-aim
2 GPUs? I'm skeptical of the need, especially with a recent GPU and low idle consumption.
Especially because at low power... performance per watt scales so well.
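(To put numbers on that: dynamic power in a chip scales roughly with voltage squared times frequency, so backing off the clock and voltage a little saves a lot of power. A toy sketch, with made-up normalised figures:)

Code:
# Toy illustration of the rule of thumb: dynamic power ~ C * V^2 * f.
# Voltages and clocks below are made-up, normalised numbers.
def dynamic_power(voltage, freq, capacitance=1.0):
    return capacitance * voltage ** 2 * freq

full = dynamic_power(voltage=1.10, freq=1.00)  # nominal clock and voltage
half = dynamic_power(voltage=0.90, freq=0.50)  # half clock, undervolted

print(f"half-clock power: {half / full:.0%} of full")  # ~33% power for ~50% perf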
 
I've always predicted that Kinect will be standard next gen, as shitty as that may sound. I hope not all games are required to use it.

I doubt any non-exclusive third-party game would require the use of Kinect, unless Sony decides to bundle their version of it with the PS4. The chances of all Xbox exclusives requiring it are pretty high, though; I'm sure they'd love to herald that "feature". Remember all that Sixaxis shit at the beginning of the PS3's life?
 

manzo

Member
2 GPUs - I wonder what kind of memory bus is needed for them to work together as separate rasterizers.

Edit: also, how in God's name are they going to cool that solution AND keep the package under 200 W? Starts to feel like a leftover April Fools' joke.
 

Nightbringer

Don´t hit me for my bad english plase
Perhaps I am wrong, but... is it possible that the IGN rumor with the HD 6670 as the GPU and the 6X jump could be true, and that instead of a single HD 6670 on the die we could have a dual setup of them, making for the 6X in performance instead of the 3X of a single HD 6670? Well, it seems that AMD's Kaveri will be equipped with a 7750 inside its die; we are talking about a 1.5B-transistor GPU, and that is roughly the same number of transistors as two HD 6670s.
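(Quick sanity check on that transistor arithmetic, using the commonly cited die figures: roughly 716M transistors for the HD 6670's Turks chip and roughly 1.5B for the 7750's Cape Verde. Both are approximations:)

Code:
# Sanity-checking the post above with commonly cited transistor counts.
TURKS = 716e6        # HD 6670 die, approximate
CAPE_VERDE = 1.5e9   # HD 7750 die, approximate

print(f"two HD 6670s: {2 * TURKS / 1e9:.2f}B transistors")   # 1.43B
print(f"one HD 7750:  {CAPE_VERDE / 1e9:.2f}B transistors")  # 1.50B
# Close on transistor count alone, though that says nothing about
# how the actual performance would compare.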
 

Log4Girlz

Member
Any rumour that suggests a multi-GPU setup as a legitimate solution for next generation can safely be discarded. We have rumour threads with legitimate info in them now; there's no need to waste time on wind-ups like this.

Normally I would agree with you wholeheartedly... but ShockingAlberto's post had me pondering. We'll see how it goes; the dual-GPU setup could be a stand-in for what will eventually be put in, or it could be MS going "batshit insane". Because that is what two GPUs is... fucking insane, nonsensical and bordering on just stupid :p
 

DarkChild

Banned
Perhaps I am wrong, but... is it possible that the IGN rumor with the HD 6670 as the GPU and the 6X jump could be true, and that instead of a single HD 6670 on the die we could have a dual setup of them, making for the 6X in performance instead of the 3X of a single HD 6670? Well, it seems that AMD's Kaveri will be equipped with a 7750 inside its die; we are talking about a 1.5B-transistor GPU, and that is roughly the same number of transistors as two HD 6670s.
MS will have a custom chip. They will throw out all the things they don't need from a regular GPU and pack in as many stream processors as they physically can. You won't be seeing a 6670 GPU in the nextbox.
 

OMG Aero

Member
If the console required an internet connection to work, that would be fucking crazy. Don't the majority of users never connect to the internet? I remember reading that something like 40-50% of 3DS owners connected them to the internet, and that was apparently very good for a video game device.
 

itsgreen

Member
2 GPUs? I'm skeptical of the need, especially with a recent GPU and low idle consumption.
Especially because at low power... performance per watt scales so well.

Could be a performance GPU and a lower-tier, low-power GPU.

So they can totally disable the performance GPU for dashboard work and video stuff.
 
If this rumour is true, I think it's important to note that it effectively makes the system anti-used games as well. All publishers have to do is implement an online pass system with each release, where you have to cough up to play.

Very underhanded on Microsoft's part, as they can deny that the system won't play used games, but effectively it won't, at least not without some transaction occurring.
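(For the sake of argument, here's roughly what such a phone-home check could look like. Everything below, the names and the flow alike, is invented for illustration; none of it comes from the rumour:)

Code:
# Hypothetical sketch of a "phone home" online-pass check.
# All names and the whole flow are invented for illustration only.
def can_play(disc_id, console_id, server):
    binding = server.lookup(disc_id)
    if binding is None:
        server.register(disc_id, console_id)  # first use: bind disc to console
        return True
    if binding == console_id:
        return True                           # original console plays freely
    # Disc already bound elsewhere: the "used game" case, where a
    # publisher could demand a paid pass before unlocking the game.
    return server.has_paid_pass(disc_id, console_id)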
 

Log4Girlz

Member
If the console required an internet connection to work, that would be fucking crazy. Don't the majority of users never connect to the internet? I remember reading that something like 40-50% of 3DS owners connected them to the internet, and that was apparently very good for a video game device.

Don't worry, there will be a 3G-enabled version so that anyone can sign in to their account :p
 

luffeN

Member
If it "only" has four cores and two of them are already reserved, then a dual-core CPU for games sounds kind of weaker than what we have now in the 360, doesn't it?

Or could a dual core outperform the tri-core setup?

If it has six cores, then that's another story.
 

rdrr gnr

Member
If 2013 is indeed the intended release date, what does this mean for this year's E3 -- if anything at all?

Nothing seems terribly surprising if you think about it. These anti-piracy/anti-used games/general DRM rumors seem to be too pervasive to ignore. Blu-ray is unexpected by some accounts but no surprise overall; neither is built-in Kinect.
Probably a dual GPU in the dev kit to emulate a chip not yet produced. They did that with the 360 too: started with an ATI 9800 -> ATI X800 -> 2x 6800 Ultra.
Didn't even think of this. Nice catch.
 
Normally I would agree with you wholeheartedly... but ShockingAlberto's post had me pondering. We'll see how it goes; the dual-GPU setup could be a stand-in for what will eventually be put in, or it could be MS going "batshit insane". Because that is what two GPUs is... fucking insane, nonsensical and bordering on just stupid :p

Modern GPUs already have masses of parallelism across the pipeline and are built to be inherently scalable. You gain nothing by doubling up on redundant hardware.
 

Dabanton

Member
Normally I would agree with you wholeheartedly... but ShockingAlberto's post had me pondering. We'll see how it goes; the dual-GPU setup could be a stand-in for what will eventually be put in, or it could be MS going "batshit insane". Because that is what two GPUs is... fucking insane, nonsensical and bordering on just stupid :p

I keep seeing ShockingAlberto's post referenced. Where is it?

Search is borked at the moment.
 

gofreak

GAF's Bob Woodward
Maybe something like two custom Kaveris bolted together is possible? With the segregation, versus one larger chip, chosen for the ability to switch off one whole chip for a certain application profile. Would that be worthwhile versus one larger SoC, or a discrete CPU and discrete GPU?

I dunno. Just throwing things out there that might make sense of the rumour if we take it at face value (although maybe we shouldn't!). Two Kaveri-likes would fit the bill of two 7xxx grade GPUs.


Modern GPUs already have masses of parallelism across the pipeline and are built to be inherently scalable. You gain nothing by doubling up on redundant hardware.

Well that is true...there would be a certain amount of wasted overhead. But maybe worth it for power consumption gains in low-power-profile apps? Dunno.
 

ElFly

Member
Two GPUs is kind of silly.

Unless we are talking about asymmetric CrossFire based on AMD's APUs, and they managed to leave the small one in charge of post-processing or something like that.
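(As a sketch, that asymmetric split might look something like this per frame; every name here is invented for illustration:)

Code:
# Toy sketch of the asymmetric split described above: big GPU renders,
# small APU-class GPU handles post-processing. All names are invented.
def render_frame(scene, big_gpu, small_gpu):
    g_buffer = big_gpu.rasterize(scene)   # geometry and shading: heavy work
    image = big_gpu.light(g_buffer)
    # Hand the cheaper, bandwidth-bound passes to the small GPU so the
    # big one can start on the next frame sooner.
    return small_gpu.post_process(image)  # tone mapping, bloom, AA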
 

SmokyDave

Member
Although I'm sure they'd love to, I can't see MS or Sony releasing a console that needs an internet connection. Nor can I see them disabling used games. It would be commercial suicide.
 

Log4Girlz

Member
Modern GPUs already have masses of parallelism across the pipeline and are built to be inherently scalable. You gain nothing by doubling up on redundant hardware.

But isn't there a limit to how large an individual chip you want to put into a system? Isn't that why there are multi-core GPUs in smartphones? You're talking to a layman here.
 

Nirolak

Mrgrgr
We're going to get several more rumors about this system fairly soon to my knowledge.

Comparing and contrasting should help.
 
Maybe one GPU is used for rendering stuff and the other is used for non-rendering stuff, like decompressing textures for better streaming, and physics like PhysX.

Or who knows what Microsoft is planning; maybe they made DX12 so we can use two different GPUs better.
 

luffeN

Member
Of course. Just because you have fewer cores doesn't mean that each core couldn't be significantly more powerful.
So the 360 has a 3.2 GHz tri-core; how high do the dual cores need to go? Like 4 GHz or what? xD
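(Naive clock arithmetic, which deliberately ignores that a modern out-of-order core does far more work per clock than Xenon's in-order cores:)

Code:
# Naive "GHz-cores" comparison -- ignores per-clock efficiency entirely.
xenon_throughput = 3 * 3.2   # 360: three cores at 3.2 GHz = 9.6
print(xenon_throughput / 2)  # 4.8 GHz per core for naive parity
# In reality a modern core does several times the work per clock, so
# two ~3 GHz modern cores could still come out well ahead.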

We're going to get several more rumors about this system fairly soon to my knowledge.

Comparing and contrasting should help.
There should also be one story from the site aegis works at, right?
 
Could be a performance GPU and a lower-tier, low-power GPU.

So they can totally disable the performance GPU for dashboard work and video stuff.

AMD's current designs already have highly efficient power and clock gating built in, as well as dedicated video decoding hardware. They already sip trivial amounts of power when not playing games; you don't need an extra GPU to accomplish the same result.
 

Pistolero

Member
IF things are to be believed (no April fool) and IF the two graphics solutions suggested are a temporary substitute for an upcoming part (final silicon), then the latter must be quite the powerful GPU...
 
Although I'm sure they'd love to, I can't see MS or Sony releasing a console that needs an internet connection. Nor can I see them disabling used games. It would be commercial suicide.

I can see them locking parts of used games as standard, with per-console unlock codes.

But I don't know if always-on internet connections are widespread enough. That may be commercial suicide.
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
Less than 80% of households in the United States even have access to broadband.

That's cutting out at least 20% of the households that could potentially have this system. Good luck with that.
 
Maybe one GPU is used for rendering stuff and the other is used for non-rendering stuff, like decompressing textures for better streaming, and physics like PhysX.

Or who knows what Microsoft is planning; maybe they made DX12 so we can use two different GPUs better.

Modern GPUs can already do all of that stuff at once. You're just reducing your overall FLOPs budget by introducing extra redundant hardware this way; it makes no sense.
 

Yagharek

Member
Requiring an always-on internet connection will go down well when Xbox Live accounts are still being hacked so often. Now, instead of losing your spacebucks, you'll lose access to disc games as well, at least for as long as your account is disabled.
 

Hazaro

relies on auto-aim
AMD's current designs already have highly efficient power and clock gating built in, as well as dedicated video decoding hardware. They already sip trivial amounts of power when not playing games; you don't need an extra GPU to accomplish the same result.
This. Idle power has just been going down and down and down.
We are talking about 2-12 W idle loads on cards that cost $450, can draw 225 W, and spit out MW2 at max settings @ 1080p @ 130 FPS.

http://www.anandtech.com/show/5476/amd-radeon-7950-review/16
 

luffeN

Member
Although I'm sure they'd love to, I can't see MS or Sony releasing a console that needs an internet connection. Nor can I see them disabling used games. It would be commercial suicide.
Something like that already happens in Windows 8. You can choose to sign in with your Windows Live / MSN account instead of the normal offline user account. Some apps also require you to be online to use them.

They won't completely disable them; you'll just pay a fee to play the rest, imo. Depends on how much that's going to cost.

The 7990 has 6 GB, 3 per GPU. It's also going to cost around $850 later this month.

http://www.techpowerup.com/mobile/1...ured-Specs-Confirmed-in-GPU-Z-Screenshot.html
I think he meant main memory.
This. Idle power has just been going down and down and down.

http://www.anandtech.com/show/5476/amd-radeon-7950-review/16
So it's either the dev kit thing we talked about, or something like what the Crytek guy wanted, with extra APUs for sweat etc.?
 