If the 720 GPUs follow the same development path as Xenos, they'll be closer to HD9000 series than HD8000 series.
Well, you are right about the clocks, but there are other ways of making a faster CPU. A simple way to look at it: a five-year-old 2 GHz C2D with only two logical cores bested the 3.2 GHz, six-logical-core CPU in the 360. And parallelism is the way forward for general CPU development. Desktop processors have been stuck at 3-3.5 GHz for close to 10 years. The only way to get dramatically more performance is through parallel processing.
The fact is games do lend themselves to concurrent processing, because there are so many well-defined subsystems (AI, audio, physics, lighting, etc.). But that doesn't even matter. To get the best performance going forward, processing will be divided into discrete 'jobs' which are scheduled across all available cores.
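A minimal sketch of what I mean by 'jobs', assuming nothing about any real engine (the class and all its names are made up for illustration): subsystems push self-contained work items, and one worker per hardware thread drains a shared queue.

```cpp
// Hypothetical job-scheduler sketch: one worker thread per logical core
// pulls jobs (an AI tick, an audio mix block, a physics island...) off a
// shared queue until shutdown. Not from any actual console SDK.
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

class JobScheduler {
public:
    JobScheduler() {
        unsigned n = std::thread::hardware_concurrency(); // logical cores
        for (unsigned i = 0; i < n; ++i)
            workers_.emplace_back([this] { Run(); });
    }
    ~JobScheduler() {
        { std::lock_guard<std::mutex> lock(m_); done_ = true; }
        cv_.notify_all();
        for (auto& w : workers_) w.join();
    }
    void Submit(std::function<void()> job) {
        { std::lock_guard<std::mutex> lock(m_); jobs_.push(std::move(job)); }
        cv_.notify_one();
    }
private:
    void Run() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lock(m_);
                cv_.wait(lock, [this] { return done_ || !jobs_.empty(); });
                if (done_ && jobs_.empty()) return;
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job(); // runs on whichever core picked it up
        }
    }
    std::vector<std::thread> workers_;
    std::queue<std::function<void()>> jobs_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
};
```

The point is that the same code scales from 3 to 16 hardware threads without the game caring how fast any single core is clocked.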
I'm surprised this is such a controversial topic on GAF. We're currently gaming on a tri-core CPU and a 1+7 core heterogeneous CPU, both clocked at 3.2 GHz. If the next consoles are to be 6-8x more powerful for an obviously visible improvement, what do you propose? 8 GHz clocks? That's not where CPU design is headed.
GPUs have become much more power-hungry compared to 2005, though.
You didn't have 250W GPUs back then.
Processors don't just get faster by increasing the clock frequency. A Sandy Bridge i7-2600 is about twice as fast per core as an Intel CPU core from 2005.
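Rough, illustrative numbers (my ~2x figure is an assumption, not a benchmark): a 2005-era core at 3.2 GHz retiring one 'unit' of work per cycle does ~3.2 units, while an i7-2600 core at 3.4 GHz retiring ~2 units per cycle does ~6.8 units, i.e. roughly 2x per core at essentially the same clock. Per-core performance = IPC x clock, and it's the IPC side that has kept improving.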
Not controversial, it just sounds like a terrible idea to me. With PC gaming, sure, you could get a Sandy Bridge-E (six-core), an eight-core Xeon, or a dual-socket 16-core system. It will only slow the game down, because those processors are slower per core than an i7-2600, and I don't think many games even use more than four cores.
On the other hand, a faster GPU almost always significantly improves performance in PC games.
Are both the PS4 and Xbox 3 going to use DirectX, i.e. a similar API to what PC developers use? Then a console design that focuses on a faster GPU is going to lead to performance gains for all developers. I'd much rather see that than some overly complicated architecture that only developers like Naughty Dog can use to output something really high end.
I don't see the need for more than one core dedicated to the OS. Metro on Windows 8 runs fine on one core.
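On Windows you can already see what 'one core for the OS' might look like from the game's side; a hedged sketch (the function and the mask policy are mine, illustrative only, not from any console SDK):

```cpp
// Hypothetical sketch: pin game worker threads onto cores 1..N-1 so
// core 0 is left for the OS and background tasks. Win32 only.
#include <windows.h>

void PinWorkerAwayFromCore0(HANDLE thread, int workerIndex, int totalCores) {
    // Pick one of cores 1..totalCores-1 for this worker.
    int core = 1 + (workerIndex % (totalCores - 1));
    DWORD_PTR mask = DWORD_PTR(1) << core;
    SetThreadAffinityMask(thread, mask); // returns 0 on failure
}

// e.g. from worker i on an 8-core machine:
//   PinWorkerAwayFromCore0(GetCurrentThread(), i, 8);
```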
Why do people keep using the PS3 as a reason that more cores = bad? It's a very bad example. There's a reason the CELL is complicated to work with: its cores aren't all the same and don't behave like what you'd find in the X360's CPU or in Intel and AMD chips. That's why third-party developers don't want to waste time trying to harness it, and that's why Sony is going to abandon it.
Having the ability to run 16 threads (probably more accurate than "16 cores") is going to be needed in five or more years. Microsoft is trying to strike a balance between a powerful enough CPU and GPU(s).
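The threads-versus-cores distinction already matters on PC, by the way. A quick way to see it, assuming nothing beyond the standard library:

```cpp
#include <iostream>
#include <thread>

int main() {
    // hardware_concurrency() reports logical hardware threads, so an
    // SMT/Hyper-Threading CPU typically shows 2x its physical core
    // count (e.g. 8 on a quad-core i7). It may return 0 if unknown.
    std::cout << std::thread::hardware_concurrency()
              << " hardware threads\n";
}
```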
Maybe because of the cost/performance ratio? Why do a weak dual-GPU setup when you can just throw one powerful GPU in there? Why go to such elaborate efforts to future-proof when GPUs make all of the difference today?
The PS3 example wasn't about the CPU being complicated, but about the idea of offloading GPU calculations to a CPU in an industry that has no experience doing that.
If the CELL performed like the other CPUs (the X360's, Intel's, AMD's, etc.) that developers have been working with for years or decades, wouldn't that make it easier for them to offload GPU calculations to the CPU?
I'm just not seeing the improvement with more cores in existing game engines on PC:
Batman AA CPU
Batman AC GPU
Crysis Warhead CPU
Crysis Warhead GPU
Fallout 3 CPU
Skyrim GPU
Looking at those charts, it seems the games are designed for only two cores. The differences in performance seem to come down to individual CPU design: faster clock for clock, more cache, etc.
I'm hoping for a 250-280W console: ~50W for fast RAM, ~50W for the mobo and accessories [HDD/Blu-ray], and 150-180W for the CPU/GPU.
It won't happen though, because that 250-280W has to be cooled. To do that they'll need more powerful cooling solutions, which will be more expensive and make more noise... and the housing will need to be bigger.
I would predict that if there is any change in terms of power usage it's going down from the start of this gen.
Was the 360 GPU considered all out batshit insane in 2005?
Yes, it was the most powerful thing around.
Almost half of the users in this optional survey are still only using a dual-core system. And 17% don't even have a DX10-capable system. Those numbers are pretty significant, and that's just an optional Steam survey. I doubt many would want to cut those people off from playing their games.
While I'd argue the PC gaming industry is moving to more complex CPU threading (remember, these consoles are more than a year off and are hoped to be around for nearly a decade)... I think you're making some assumptions about the intent.
Okay, I'm starting to believe the Blu-ray thing now. Everything else is just "yeah right" so far.
This was always going to be the case, as far as I'm concerned. MS has spent the better part of the past few years repositioning the X360 as a multimedia platform and it's reasonable to assume this will continue into the next generation, which means supporting the HD format of choice.
I wonder if the next Xbox will feature an internet browser. There needs to be a better way to access YouTube, Netflix, Facebook, etc. than the shitty outdated apps.
The chip, which is still referred to as ‘Oban’, is being run through multiple fabs in very high quantities, too high by more than an order of magnitude to simply be for dev kits. Yields on the chip are said to be something between painfully low, Nvidia Fermi painfully low, and worse than that. Given the sheer number of wafers Microsoft contracted for, this seems to be both an anticipated problem, and one they have plans to work through. That said, SemiAccurate’s sources are still reporting that there is much work to be done, yields are not even up to “horrid” yet.
....
Microsoft insiders tell us that the planned launch date is September 2013, and that is not changing without heads rolling internally.
.....
It looks like the long shot came through, moles are now openly talking about AMD x86 CPU cores and more surprisingly, a newer than expected GPU. How new? HD7000 series, or at least a variant of the GCN cores, heavily tweaked by Microsoft for their specific needs.
This means both the XBox Next and the PS4 are going to effectively be HSA/FSA devices.
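If the HSA angle is right, the practical win is the CPU and GPU sharing one pool of memory instead of copying data over a bus. You can approximate the idea today with OpenCL's zero-copy buffers; a sketch under that assumption (the helper function is made up, error handling is omitted, and whether it is truly zero-copy depends on the hardware):

```cpp
// Hypothetical zero-copy sketch with OpenCL: wrap an existing host
// allocation so the GPU can read it in place on suitable hardware,
// rather than DMA-copying it across the bus.
#include <CL/cl.h>
#include <vector>

cl_mem WrapHostBuffer(cl_context ctx, std::vector<float>& host) {
    cl_int err = CL_SUCCESS;
    cl_mem buf = clCreateBuffer(
        ctx,
        CL_MEM_READ_WRITE | CL_MEM_USE_HOST_PTR, // reuse the host allocation
        host.size() * sizeof(float),
        host.data(),
        &err);
    return (err == CL_SUCCESS) ? buf : nullptr;
}
```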
Say goodbye to the 16-thread IBM processor in the Xbox 720. Charlie at S|A just released an article with news that MS has moved fully to AMD hardware [both the CPU and a heavily modified 6000-7000 series GPU]. They have a ton of manufacturing problems, but Sept 2013 is still the planned date.
http://semiaccurate.com/2012/09/04/microsoft-xbox-next-delay-rumors-abound/
New thread worthy?
I heard that rumor a while back. I believe it.
Don't really buy this.
What? Only the HD 7000 series???
That CANNOT be true.
Look familiar? From an IBM to an AMD CPU.
If this is true, both consoles will be reeeealy similar. :-/ I want more innovation and risk.
Give us 3D stacked memory, and lots of it!
Can't tell if serious. The 7000 series are all tessellation- and DX11-capable. They get pretty shitty in the lower ranges, though.