True, and at the same time sad, that PC owners have to wait for a new console generation to get better games on their systems.
But that's pretty much the current situation anyway. I've had my CPU since early 2009 (!), and it will probably outperform the CPUs in these consoles.

Moving physics etc. more to the GPU and having a relatively low-spec CPU could be great for PC gamers. With everything on the GPU, you could keep your current PC for ages and just update the GPU, giving you much less reason to upgrade the entire thing.
So, if true, is this now as good as the Durango one?
"Are you taking into account Intel's mobile CPUs? I'm pretty sure they were just cheaping out."

Yeah, but with the low TDP, Jaguar cores are definitely the right choice in a console environment. Intel isn't even close to matching the performance per watt at this level.
Yeah... my first thoughts. People seem to be eating this site's rumor up, though.
Are you taking into account Intel's mobile CPUs? I'm pretty sure they were just cheaping out.
"Nice, it means we could see some minor improvements. I may believe this rumor now..."

No, it just said it was an 8-core Jaguar; no clock speed or even FLOPs data was given, while the GPU did at least list 1.84 TFLOPs.
Intel doesn't have an APU like AMD.
People did note that at the reveal Sony did not say the GHz for the CPU.
So, maybe Sony did increase it to 2 GHz but wasn't ready to commit at the reveal a week ago.
Not even close. An i7 smokes Jaguar. However, CPUs have become less and less important for games in recent years. Most physics calculations will be handled on the PS4's GPU, and audio rendering will be done on its own chip. The CPU will mostly be doing AI calculations.
The ultimate troll from yesteryear. He trolled anything to do with non-MS companies, was on the side of HD DVD, and kept spouting that the 360 would come with HD DVD to kill Blu-ray, etc.
Weirdly, he always turns up when something is going badly for MS.
How much will upping the clock rate add to the cost?
"None, unless it requires better cooling. They were previously underclocking the CPU on purpose."

Not entirely true; there may be fewer chips on a wafer that can run at 2 GHz while still meeting the requisite power envelope. But if the yields are good, then yes, it's zero cost.
Is it enough to bump Killzone SF from 30fps to 60fps?
Deadmeat!
Oh man what sorcery did MS resort to, to conjure this crazy bastard.
Someone fill me in on this Deadmeat fellow...? I don't know the reference.
"Intel doesn't have an APU like AMD."

Sure it does. Intel had them earlier than AMD did; that was the main reason AMD bought ATI. Intel just doesn't call them "APU" or any other fancy new acronym.
1080p 60fps confirmed
jk
This is the kind of thing that could improve frame rates, much more so than RAM. Of course it's a GPU dependent thing too though.
That kind of depends on where the bottleneck is. Fillrate (GPU), memory amount/bandwidth, AI/physics (CPU), etc.
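A toy sketch of that point (all timings here are made-up illustration values, not real console numbers): the frame rate is set by the slowest stage, so a CPU clock bump only shows up on screen when the CPU is the bottleneck.

```python
# Toy frame-time model: the frame rate is limited by the slowest stage.
# All timings are invented for illustration, not real PS4 measurements.

def fps(cpu_ms, gpu_ms):
    """Frames per second when the frame waits on the slower of CPU and GPU."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound frame: a 25% CPU speedup changes nothing.
print(fps(10.0, 20.0))         # -> 50.0
print(fps(10.0 / 1.25, 20.0))  # -> 50.0 (still GPU-limited)

# CPU-bound frame: the same 25% CPU speedup shows up almost fully.
print(fps(25.0, 15.0))         # -> 40.0
print(fps(25.0 / 1.25, 15.0))  # -> 50.0
```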
°°ToMmY°°;48523120 said:
You mean their ultrabook? They're APUs, though the integrated graphics core is nothing to write home about.

No, I mean pretty much every Intel CPU for LGA1156 and LGA1155. The graphics in them aren't fast, but the very same can be said about AMD's APUs if you compare them to discrete video cards.
A somewhat deranged individual. Those claiming he's a Microsoft fanboy are wrong; he hates Sony with a passion unseen before and unlikely ever to be seen again, and he will latch onto any company that he perceives as a direct competitor of Sony in the console market.
Essentially, someone who hates Sony with a passion and trots out delusional statements every so often. He did it regularly with the Dreamcast and then latched onto Microsoft and the Xbox.
"Let me guess, those 400MHz are meaningless, right? RIGHT?"

It's a 25% frame rate improvement in CPU-bound cases. That's mostly games with a large number of actors or a lot of simulation, such as strategy titles or MMOs.
"Clock speed isn't an 'upgrade' in terms of cost."

Yeah, I posted this in one of the numerous PS4 CPU threads a couple of days ago.
I have never personally heard of ps4daily, which is why I shied away from a thread.
More power = better, but man, I worry what all these "upgrades" do to the price...
Let me guess, those 400MHz are meaningless, right? RIGHT?
"UI stuff is also handled by the CPU, isn't it?"

And geometry setup, plus almost all of the gameplay logic. That isn't really a small deal, even considering tessellation.
"A 2GHz octa-core Jaguar should get around 4 with all four CPUs stressed if that first result is correct."

That's a lot less shit than I thought. A stock i5 is about 50-60 percent faster than the PS4 CPU if it's at 2 GHz; I was expecting something like 300 percent faster.
Going from 1.6 to 2 on 8 cores certainly isn't meaningless.
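Spelling out the arithmetic behind that, using the 1.6 GHz and 2.0 GHz figures from this thread (the "perfectly parallel work" assumption is mine; real scaling would be somewhat worse):

```python
# Back-of-the-envelope math for the rumored bump from 1.6 GHz to 2.0 GHz.
old_clock, new_clock, cores = 1.6, 2.0, 8

# 25% more cycles per second on every one of the 8 cores.
speedup = new_clock / old_clock
print(speedup)  # -> 1.25

# For perfectly parallel work, that extra throughput equals this many
# additional 1.6 GHz cores -- the "two free cores" figure.
extra_cores = cores * (speedup - 1)
print(extra_cores)  # -> 2.0
```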
How would an 8 core Jaguar at 2GHz compare to a 4 core i7 at 4GHz?
"It's like two free cores."

It's better than that, as you don't get a linear power increase with 10 cores over 8.
"Just like Doombringer, Deadmeat has become an almost welcome sight anytime I see either of them. The insanity they bring to a discussion always means high-class entertainment."

Oh god, Deadmeat is similar to Doombringer? I need to see more of this guy's posts.
"I thought the AMD guy at CES hinted at the APU being able to dynamically ramp and throttle clock speed based on load, the same way their desktop CPUs do."

No need for a hint; Trinity already does that.