
Rumored Chinese Forum Xbox 720 specs: 8-core CPU, 8GB, HD 8800 GPU, W8, 640GB HDD

If they did, I'd be amazed if the laughing had subsided before now.

[Image: rsxbandwidth.jpg]
 

gofreak

GAF's Bob Woodward
If they did, I'd be amazed if the laughing had subsided before now. Unless it's measured in a completely different manner to how it would normally be done.

It was. That was back when GPU flops weren't just programmable shader flops, but counted every single last operation performed by logic anywhere on the GPU, fixed or otherwise. nVidia tended to be looser in how they counted their numbers too IIRC.
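For anyone curious how those padded figures come about, here is a minimal sketch of the two ways of counting. All numbers below are invented for illustration; they are not actual RSX or Xenos specs.

[CODE]
# A minimal sketch of "programmable shader flops" vs "count-everything marketing flops".
# All figures here are hypothetical, for illustration only (not RSX or Xenos specs).

def shader_gflops(alus, flops_per_alu_per_clock, clock_ghz):
    # Programmable-shader throughput: ALUs x ops per clock x clock speed (GHz) -> GFLOPS.
    return alus * flops_per_alu_per_clock * clock_ghz

# The "honest" number: only the programmable shader ALUs.
programmable = shader_gflops(alus=128, flops_per_alu_per_clock=2, clock_ghz=0.55)

# The "marketing" number: also count every fixed-function unit that does any arithmetic
# (texture filtering, blending, vertex setup...), each at its peak rate.
fixed_function_extras = 300  # hypothetical GFLOPS of fixed-function work
marketing = programmable + fixed_function_extras

print(f"programmable shader flops: {programmable:.0f} GFLOPS")
print(f"count-everything flops:    {marketing:.0f} GFLOPS")
[/CODE]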
 
- Super-advanced tech by IBM/AMD/MS. Design started back in 2007, as you said in 2010.
- It has an APU + GPU on every SOC (system on a chip).
- It has 2 of these SOCs.
- Each CPU in the APU has 8 cores with 4 threads each. So in total we have 2 SOCs x 8 cores x 4 threads = 16 cores x 4 threads = 64 threads.
- Each GPU in the SOC is an 8xxx GPU (Mars or Venus) with advanced tech that won't be available on PC for some time.
- The Xbox has some other hardware features, like a ray-tracing chip (the VTE in this picture).
- The goals of the system design are: universal across different types of tasks, easy to parallelise, easy to fully load, zero bottlenecks, and future-proof like the Xbox 360 design, with low energy consumption.
- It should be about 4+ teraflops. Of course it will be better than every high-end PC available in 2013-2014 and maybe 2015. With optimisation it should stay on par with PCs for 8 years.
[Image: 360orgasm.gif]
 

Raist

Banned
If they did, I'd be amazed if the laughing had subsided before now. Unless it's measured in a completely different manner to how it would normally be done.

I bet they added figures from the Cell, given how it was supposedly planned to do a lot of gfx stuff.
 

Raist

Banned
Read this on another forum, this has to be too good to be true?

This is the latest rumor. The guy thinks his source is too good to be true:

- Super-advanced tech by IBM/AMD/MS. Design started back in 2007, as you said in 2010.
- It has an APU + GPU on every SOC (system on a chip).
- It has 2 of these SOCs.
- Each CPU in the APU has 8 cores with 4 threads each. So in total we have 2 SOCs x 8 cores x 4 threads = 16 cores x 4 threads = 64 threads.
- Each GPU in the SOC is an 8xxx GPU (Mars or Venus) with advanced tech that won't be available on PC for some time.
- Each SOC has 1.5 GB of GDDR5 memory, with an additional 5 GB of shared DDR3/DDR4.
- Each SOC has EDRAM. Previously you said about 110 MB.
- The Xbox has some other hardware features, like a ray-tracing chip (the VTE in this picture).
- The goals of the system design are: universal across different types of tasks, easy to parallelise, easy to fully load, zero bottlenecks, and future-proof like the Xbox 360 design, with low energy consumption.
- Some parts will be upgradable later.
- It should be about 4+ teraflops. Of course it will be better than every high-end PC available in 2013-2014 and maybe 2015. With optimisation it should stay on par with PCs for 8 years.
- There will be Kinect 2.0.
- There will be IllumiRoom tech as an additional feature after launch.
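Taken at face value, the quoted numbers at least add up; here's a quick sanity check of the arithmetic (the per-GPU figure at the end is just the rumour's 4 TF split across the two SOCs, not a known spec):

[CODE]
# Quick sanity check of the rumoured figures, taken at face value.

socs = 2
cores_per_cpu = 8
threads_per_core = 4
hardware_threads = socs * cores_per_cpu * threads_per_core
print(hardware_threads)        # 2 x 8 x 4 = 64 threads, as claimed

gddr5_per_soc_gb = 1.5
shared_ddr_gb = 5
total_ram_gb = socs * gddr5_per_soc_gb + shared_ddr_gb
print(total_ram_gb)            # 3 + 5 = 8 GB, matching the "8GB" in the thread title

claimed_tflops = 4
print(claimed_tflops / socs)   # ~2 TFLOPS per SOC's GPU would be needed (rumour, not spec)
[/CODE]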

This is real. Also, games will be free.

And if you pre-order online, your console will be delivered by a woman/man who will be willing to let you do anything with their body.
 

Pein

Banned
This is real. Also, games will be free.

And if you pre-order online, your console will be delivered by a woman/man who will be willing to let you do anything with their body.

What kind of woman? I don't wanna get gypped and end up with a haggard old lady.
 
It was. That was back when GPU flops weren't just programmable shader flops, but counted every single last operation performed by logic anywhere on the GPU, fixed or otherwise. nVidia tended to be looser in how they counted their numbers too IIRC.

Funky maths, gotcha.
 

Maxrunner

Member
How does an SSD improve games beyond loading times?

My slow 1 TB HDD makes some games stutter, because of access times I think... obviously it's a bit different on consoles, but the point is that PC will always be on top. Having said that, I'm perfectly aware that consoles dictate some of the games that end up on PC. I just hope Sony doesn't charge €599 again; €399 max for me.
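To put rough numbers on the access-time point: the issue is random seeks during streaming, not raw throughput. A back-of-the-envelope sketch, using typical ballpark latencies rather than measured values:

[CODE]
# Back-of-the-envelope: why random access time causes streaming hitches.
# Latency figures are ballpark assumptions (7200rpm HDD vs a typical SSD), not measurements.

frame_budget_ms = 16.7   # one frame at 60 fps
hdd_seek_ms = 10.0       # typical random-access latency of a mechanical drive
ssd_access_ms = 0.1      # typical random-access latency of an SSD

def random_reads_per_frame(access_ms):
    # How many scattered asset reads fit in one frame, ignoring transfer time.
    return frame_budget_ms / access_ms

print(f"HDD: ~{random_reads_per_frame(hdd_seek_ms):.0f} scattered reads per frame")
print(f"SSD: ~{random_reads_per_frame(ssd_access_ms):.0f} scattered reads per frame")
# One or two scattered reads can eat an HDD's entire frame budget, which is why
# open-world streaming stutters on slow drives even when sequential throughput looks fine.
[/CODE]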
 

thuway

Member
This is coming from Beyond 3D right now. BG was a GAFFER and a reliable source early on:

I don't recall lherre verifying me, though at the same time I don't see why he would or would need to. Like I've mentioned before, I just pass along what I think may be OK to pass along and is worth discussing. I will add that last year one consistent thing I heard was that overall performance between PS4 and Xbox 3 was debatable as to which one was better. There will probably be things PS4 does better than Xbox 3 and vice versa. I think things are still pointing in this direction and it will boil down to consumer preference.

I would also like to know where the large die info came from. Just for curiosity purposes.
http://beyond3d.com/showpost.php?p=1694412&postcount=18613

There you go folks. It's a wash.
 

Corky

Nine out of ten orphans can't tell the difference.
There you go folks. It's a wash.

Like this gen then. Didn't stop fanboy warzZZz erupting over minuscule things like contrast discrepancies between platforms, let alone actual performance differences.
 

thuway

Member
...ahh, but did the PS4 have 4GB of stupidly fast RAM last year?

It doesn't matter. There is too much psycho babble that "DURANGZO WILL HAVE 3 TF++++" or "ORBIS has a 1.84 TF GPU!!!".

The shit needs to stop. Devs are saying they are equal in most respects. GAF needs to let that sink in. Choose the one with the exclusives you want :).
 

jaypah

Member
It doesn't matter. There is too much psycho babble that "DURANGZO WILL HAVE 3 TF++++" or "ORBIS has a 1.84 TF GPU!!!".

The shit needs to stop. Devs are saying they are equal in most respects. GAF needs to let that sink in. Choose the one with the exclusives you want :).

I'm just gonna choose both because yay games! I hope it is close so we get some competition on the first party front. MS has been very quiet with their studios so hopefully that means they're about to step it up. They lost a lot of my play time to Sony in the second half of this Gen because Sony had more exclusives that I wanted to play. XBLA was great but they need to get back to diversifying their retail output.
 

i-Lo

Member

They are astounding, especially when it's been rendered in real time. It's amazing how these models, with a poly count nearly identical to what's assigned per character today on the high end (around 40K), look better here than they do in-game. And it clicks: it's all about the lighting.

Tell me, do you have more real-time renders like these as examples? Also, thanks.
 

thefil

Member
All these spec leaks made me sad because I just updated my PC (i5 3570K + 660Ti) and I thought I would be set for next-gen. Why did they have to make Durango/Orbis so powerful? :(
 

RoboPlato

I'd be in the dick
It doesn't matter. There is too much psycho babble that "DURANGZO WILL HAVE 3 TF++++" or "ORBIS has a 1.84 TF GPU!!!".

The shit needs to stop. Devs are saying they are equal in most respects. GAF needs to let that sink in. Choose the one with the exclusives you want :).

Agreed in full and I personally am very happy about them being pretty much equal. They're just getting to the final performance in different ways, which will make things interesting but one won't be significantly better than the other.
 
They are astounding, especially when it's been rendered in real time. It's amazing how these models, with a poly count nearly identical to what's assigned per character today on the high end (around 40K), look better here than they do in-game. And it clicks: it's all about the lighting.

Tell me, do you have more real-time renders like these as examples? Also, thanks.

If you're talking about fighting games sure (VF5 - 40k polygons, DOA - 35k average), but current gen games are nowhere near 40k as an average for the character models. Not even close.
 

Eideka

Banned
As for PC ports: it's going to need lots of brute-forcing on a much more powerful (in TF) GPU to compensate for the advantages of a console's custom silicon and its on-the-metal(tm) optimizations.

I don't think so. Back in the day I enjoyed the superior version of Oblivion on my 7800GT...

You are strangely trying to downplay the hardware available on PC, funny. It will still be more expensive to play those games on PC, it's a market of enthusiasts after all, but no, it won't take a monster PC to run next-gen games as well as or better than Orbis/Durango.
Hell, I even think a single GTX680 should do the trick.
 

m23

Member
Dear lord no.

The worst Halo story-wise and ambition-wise.

If it is coming, make it a premier XBLA release for the 720; I could see it being a nice game to launch that service.

I would have to disagree. Halo 2's campaign and story are definitely not the worst in the series. For one, Halo 2 allowed us to learn much more about the Covenant and what motivated them. Getting to play as the Arbiter was a very surprising move, and in my opinion it was a great decision. It really put you in the shoes of the enemy. Along with that, the music is still the best in the series.

Although Halo 2 may have not been the best game in the series, it is definitely not the worst. Halo Reach in my opinion had the weakest story and campaign, even behind ODST. The characters didn't matter to me and the story made no sense as it completely ignored the books.

That being said, I would not be too excited for a full fledged remake of Halo 2. It hasn't even been 10 years yet. Also, the main reason people are even looking forward to a remake is because of the multiplayer. I highly doubt we'll be getting the mp as it was on the original Xbox. At best we'll probably get map remakes from Halo 2 similar to Halo CE Anniversary.
 
It was. That was back when GPU flops weren't just programmable shader flops, but counted every single last operation performed by logic anywhere on the GPU, fixed or otherwise. nVidia tended to be looser in how they counted their numbers too IIRC.

The 2 TF figure for PS3 didn't emerge until after MS had claimed a "world first" 1 TF of total system performance for the 360. It was basically a war of bogus marketing flops.
 

ekim

Member
I don't think so. Back in the day I enjoyed the superior version of Oblivion on my 7800GT...

You are strangely trying to downplay the hardware available on PC, funny.

I'm not downplaying anything. :-I

Edit: as you said in your edit, a GTX 680 should do the trick, which means you basically agree with me :p
 

Eideka

Banned
I'm not downplaying anything. :-I

Edit: as you said in your edit, a GTX 680 should do the trick, which means you basically agree with me :p

Hey, where did you see me taking price into consideration? PC gaming will always be more expensive; it's not a market for the mainstream, it's for dedicated gamers and tech enthusiasts, and those people hardly care about the price.

That said I will certainly purchase both consoles, I'm just waiting for Epic to reveal its IP (please be Samaritan, please be Samaritan, please be Samaritan).
 

Karak

Member
Are you sure?

Very good news if true. I think it will still need a little bit more than that to go toe-to-toe with the next-gen consoles as far as running games is concerned.

PC will always have "some" game that pushes every card. But what you have is fine for now.
 

i-Lo

Member
If you're talking about fighting games sure (VF5 - 40k polygons, DOA - 35k average), but current gen games are nowhere near 40k as an average for the character models. Not even close.

If you're talking about generic characters then you're correct. And perhaps calling it "most" high-end games is also a stretch. One would still be surprised to know that Kratos from GoW3 only took around 20K polys, but on the other hand, Nathan and Chloe from U2 took around 37K and 45K respectively.

I am expecting next gen to be around the same, perhaps bumped up by another few K. That said, the really important thing will be tessellation, using it to round out the silhouettes of characters to give the illusion of a greater polycount.
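As a rough illustration of why tessellation makes a modest base mesh read as a much higher polycount (uniform 1-to-4 subdivision assumed; the base figure is the GoW3 number mentioned above, the rest is illustrative):

[CODE]
# Rough illustration: uniform subdivision multiplies triangle count by about 4 per level,
# so a modest base mesh plus tessellation can read as a far higher polycount.
# Assumes simple uniform 1-to-4 subdivision; real engines tessellate adaptively.

def tessellated_tris(base_tris, levels):
    # Triangle count after `levels` of uniform 1-to-4 subdivision.
    return base_tris * (4 ** levels)

base = 20_000  # roughly the Kratos-class figure mentioned above
for levels in range(3):
    print(levels, tessellated_tris(base, levels))
# 0 20000
# 1 80000
# 2 320000
[/CODE]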
 

Eideka

Banned
PC will always have "some" game that pushes every card. But what you have is fine for now.

I trust you, you have been fairly reliable so far. My future graphics card (GTX 770) should be up to the task then.

Still, I wonder what hardware PC folks will need in order to max out SW1313 or Watch Dogs, especially the latter. The E3 demo is jaw-dropping, and if the game really is open world then... I would not be surprised if an SLI beast is required.
 

ekim

Member
I trust you, you have been fairly reliable so far.

Still, I wonder what hardware PC folks will need in order to run SW1313 or Watch Dogs, especially the latter. The E3 demo is jaw-dropping, and if the game really is open world then... I would not be surprised if an SLI beast is required.

IIRC the demo ran on a single 680.
 
Still, I wonder what hardware PC folks will need in order to max out SW1313 or Watch Dogs, especially the latter. The E3 demo is jaw-dropping, and if the game really is open world then... I would not be surprised if an SLI beast is required.

Didn't these demos run on a single GTX 680?
 

Eideka

Banned
IIRC the demo ran on a single 680.

I find this hard to believe, even if the GTX 680 is no slouch. Besides, I've searched everywhere and nowhere is it stated that the demo ran on this card.

Didn't these demos run on a single GTX 680?
3 GTX680s for SW1313...
http://www.pcgamer.com/previews/star-wars-1313-preview/
And at expense to your wallet. The build we saw kept a steady 30fps… running on a rig with three Nvidia GTX 680s inside. There’s years of optimisation to come, but expect this to be the first of a new wave of games to finally challenge your PC.
Nothing confirmed for Watch Dogs, but frankly I can see it running like a charm on Durango/Orbis; the game is releasing on PS3/360, let's keep that in mind.
 

Karak

Member
I trust you, you have been fairly reliable so far. My future graphics card (GTX 770) should be up to the task then.

Still, I wonder what hardware PC folks will need in order to max out SW1313 or Watch Dogs, especially the latter. The E3 demo is jaw-dropping, and if the game really is open world then... I would not be surprised if an SLI beast is required.

There is no way they will REQUIRE SLI. They may run better with more razzle-dazzle, but they won't require it. Plus it's going to have a good couple of years of optimization. It will of course depend. They could be SHIT for optimization hahahaha. But I don't see that being an issue.
 