It's only 15W less than a 4-core Ivy Bridge i5 clocked at 2.3GHz.
And it would perform the same in multithreaded tasks while costing far, far less to produce. You can fit four Jaguar cores in the same amount of silicon as one Ivy Bridge core.
These benchmarks are just a waste of time and total clickbait. It's impossible to extrapolate accurate performance from comparing parts like this. You have to wait for the final product to see how it performs when it's all put together.
Anandtech is better than this.
I also wanted to throw out an "lol" at the "hardware" people who threw around suggestions like "Oh, ARM can compete with that" or "Project Denver blah blah".
See: AMD delivering weak performance to gamers everywhere now.
Anandtech said:
In its cost and power band, Jaguar is presently without competition. Intel's current 32nm Saltwell Atom core is outdated, and nothing from ARM is quick enough. It's no wonder that both Microsoft and Sony elected to use Jaguar as the base for their next-generation console SoCs; there simply isn't a better option today. As Intel transitions to its 22nm Silvermont architecture, however, Jaguar will finally get some competition. For the next few months though, AMD will enjoy a position it hasn't had in years: a CPU performance advantage.
"Only" is relative with respect to the thermal and power design.
Ridiculous. Even with that increase, 8 cores @ 2GHz would only be ~30W. That's peanuts compared to any desktop CPU or to the last generation.
Not sure what the Ivy Bridge comparison is, but the individual Ivy Bridge cores are way larger, and when you include the caches the die-size difference becomes huge. Add to that, Intel has no viable alternative to a ~2TFLOP APU in the current launch window.
It's only 15W less than a 4-core Ivy Bridge i5 clocked at 2.3GHz. At 35W you can have a 2-core/2-thread Ivy Bridge clocked at 2.9GHz.
Haswell will have a 35W model that has 4 cores and 8 threads and is clocked at 2GHz.
We know this... it's slated for 2015. Hence it's not something that Nvidia could've offered to either Microsoft or Sony.
Not much is known about Project Denver, so...
It's a very misleading thread. The benchmarks are from a Kabini with 4 cores and a 1.5GHz clock speed. The PS4/Xbox One will have 8 cores and (possibly) a 2GHz clock speed.
So you cannot double the test scores yourself? Wow...
And no, the consoles won't have 2GHz clocks.
What you are saying is that you don't know what Jaguar is. It's not a module design (well, actually I believe they conceptually consider 4 Jaguar cores a "module").
As always when looking at post-Bulldozer AMD designs, it's important to realize that AMD's "8 cores" consist of 8 INT and 4 FP units. Intel's 4-core designs have 4 INT and 4 FP units. Guess what games like to hammer on? That's right, the FP units. So really what you're saying is,
It's not like they are hiding this fact; it's only misleading if you don't know how to read benchmark results. And if that is the case, everything is misleading.
Looking at Kabini, we have a good idea of the dynamic range for Jaguar on TSMC's 28nm process: 1GHz - 2GHz. Right around 1.6GHz seems to be the sweet spot, as going to 2GHz requires a 66% increase in TDP.
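The quoted 66% TDP jump for a 25% clock bump is what you'd expect once voltage has to rise along with frequency. A minimal sketch using the classic dynamic-power relation P ∝ C·V²·f; only the 66% figure comes from the article, and the implied voltage increase is derived here, not an AMD spec:

```python
# Back-of-the-envelope check of the quoted TDP scaling, using the classic
# dynamic-power model P ~ C * V^2 * f. Only the 66% figure comes from the
# article; the implied voltage increase is derived, not an AMD spec.

def relative_power(f_ratio, v_ratio):
    """Dynamic power scales linearly with frequency, quadratically with voltage."""
    return f_ratio * v_ratio ** 2

f_ratio = 2.0 / 1.6  # 1.6 GHz sweet spot -> 2.0 GHz is a 1.25x clock bump

# Frequency alone would only cost 25% more power:
print(round(relative_power(f_ratio, 1.0), 2))  # 1.25

# Hitting the quoted ~66% TDP increase implies roughly a 15% voltage bump:
v_ratio = (1.66 / f_ratio) ** 0.5
print(round(v_ratio, 2))                           # ~1.15
print(round(relative_power(f_ratio, v_ratio), 2))  # 1.66
```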
No offense, but some of you guys really don't have any clue about this stuff.
Performance isn't completely linear like that.
Also, their release date...
Nothing misleading about the thread at all; they are the same cores as have been rumored.
The article points out that 2GHz is unlikely, and you can get a very good idea of how fast 8 cores would be from knowing how fast 4 cores are.
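The "performance isn't completely linear" point above is basically Amdahl's law: only the parallel part of a benchmark scales with core count. A minimal sketch, with an 80% parallel fraction chosen purely for illustration:

```python
# Why doubling a 4-core score overestimates the 8-core result: only the
# parallel fraction of the workload scales with cores (Amdahl's law).
# The 80% parallel fraction below is a hypothetical illustration.

def speedup(cores, parallel_fraction):
    """Amdahl's law: the serial part is fixed, the parallel part divides across cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

p = 0.8
s4 = speedup(4, p)  # 2.5x over a single core
s8 = speedup(8, p)  # ~3.33x over a single core

# Going from 4 to 8 cores buys ~33% here, not the naive 100%:
print(round(s8 / s4, 2))  # 1.33
```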
So the PS4/Xbone 8-core CPU = the low-end Ultrabook CPU we had in 2012.
http://community.futuremark.com/hardware/cpu
i5-3317U: 2460
i3-3220: 4100 (~100 USD price)
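For what it's worth, the extrapolation behind this comparison can be written out explicitly. The Kabini score below is a hypothetical placeholder (the actual result isn't quoted in this thread), and perfect core scaling is an optimistic upper bound:

```python
# Naive upper-bound extrapolation from a 4-core/1.5 GHz Kabini result to the
# rumored 8-core console configuration. KABINI_SCORE is a hypothetical
# placeholder, not a measured number; real scaling will land below this.

KABINI_SCORE = 1300        # hypothetical 4-core @ 1.5 GHz physics score
CORE_SCALING = 8 / 4       # assume perfect scaling from 4 to 8 cores
CLOCK_SCALING = 1.6 / 1.5  # assume a 1.6 GHz console clock rather than 2 GHz

estimate = KABINI_SCORE * CORE_SCALING * CLOCK_SCALING
print(round(estimate))

# Compare against the ultrabook/desktop scores listed above:
for name, score in {"i5-3317U": 2460, "i3-3220": 4100}.items():
    print(name, round(estimate / score, 2))
```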
Exactly, and the A4 was also a much lower-end, first-gen APU.
The Xbox One/PS4 have custom-built, top-of-the-line APUs. The article is totally misleading.
AMD themselves said that the APUs for the PS4/Xbox One are the most powerful they have ever built.
Hehehe... such crap CPUs. A pity, really... a missed chance for both of them to be really next-gen.
I wonder what typical technical differences we can expect between multiplatform games next gen.
While this gen the PS3 is arguably the more powerful console, its special architecture has prevented a noticeable lead of PS3 versions over their Xbox counterparts. Many times the Xbox versions even showed a small lead.
But the next-gen architectures seem to be pretty similar, with the PS4 being significantly more powerful than the Xbox One. Consequently I expect PS4 versions of games to show a technical lead more often. Do you agree?
I also wonder where current high-end PCs will end up. Their main deficit is their lack of a comparable amount of GDDR5 RAM compared to the PS4. How will that show?
People forget the eSRAM; didn't they say this will give them 200 GB/s?
200 GB/s > 176 GB/s
I think the differences are going to be minimal honestly. The easiest thing for a developer to tweak is the resolution. That's something that is really hard to tell though when you're playing a game at home.
For example, I think Halo 4 looks gorgeous, but it's still running natively at 720p and it's upscaled to fit my 1080p TV. I think if the PS4 does have this noticeable advantage, it could be something like PS4 version is at 1080p and Xbox One runs at 900p. Unless you're digital foundry though, you aren't likely to notice.
Another thing we can't forget to mention is that some games are inherently CPU-bound, not GPU-bound. In those cases, from what I can tell, the PS4 won't have the edge; it will be essentially tied with the Xbox One since they have the same CPU. It doesn't matter how much faster your GPU is when your limiting factor is the CPU. Maybe the custom move engines and SHAPE audio hardware will even allow CPU-bound games to run faster on the Xbox One. The proof will be in the games, of course.
The difference between the PS4 and Xbone does seem to be significantly greater than between the 360 and PS3, though, so differences in multiplatform games should definitely be more obvious than this gen.