
Visual Downgrade In Next-Gen Tech Demos Going From PC To Consoles?

Didn't people learn anything from the "power of the Cell" or "blast processing" days? Do we still need to hype everything to oblivion?

Nope. People are expecting way too much of this gen and are going to hit the ground very hard.

Funny that AMD is going to deliver Kaveri at the end of the year, which is similar to the PS4, just to catch up with Intel's Iris.
 
CryTek didn't drop real-time GI as far as I know. You want it? Buy next-gen CryTek games.

That really doesn't speak to my point at all.

My point was attempting to disprove your idea that low-end grunt with new programming potential allows for paradigmatic shifts in visuals and programming.

The UE4 thing was an example against this idea.

Are you going to respond to this? It is directly countering your main point of argumentation.

Also.. the UE4.0 GI, as much as I think CryEngine is amazing, was way better. It was GI from every light source with indirect glossy reflections from every light source. Cryengine only does it from the Sun in its current iteration.
 
Except for the fact that rendering has been done on the strong, dumb GPU for many years. So we may get some nice algorithms that previously ran only on the CPU to use the "muscle" a bit, but that won't make graphics prettier. Actually, it might be the other way around; if the GPU starts doing tons of GPU compute stuff, it won't have as much time left for rendering.

So, the 7000 series as a "rendering machine" is a known quantity. And the HSA APU architecture won't help it render stuff faster.

PS3 showed the potential of having a dumb GPU augmented with 'compute' (compute being the SPEs in Cell in that example).
 
Also.. the UE4.0 GI, as much as I think CryEngine is amazing, was way better. It was GI from every light source with indirect glossy reflections from every light source. Cryengine only does it from the Sun in its current iteration.

Yep, SVOGI was the best GI implementation to date, but dat cost :) Actually I was super excited by the first UE4 technology demo; real-time lighting is where we should be in next-gen, no matter how linear and scripted a game is.

--
In my eyes real-time GI is just too much for the next gen. I don't think that we'll see it even in most PC games for a very long time. Maybe some CryTek game on ultra setting (a little muscle flexing for CryEngine) but that's it. It's just a waste of resources for studios to implement since most PC gamers don't have the grunt to use it.

Real-time GI from the Sun/Moon is very cheap in CryEngine. They've even managed to use it in some situations on current-gen consoles, and on PC it's turned on at every setting.
MGS 5 has some kind of GI. Panta Rhei has dynamic GI too. Battlefield 4 had middleware for real-time GI from Geomerics.
GI is the future of lighting engines and will be used heavily going forward. Unfortunately, in some games it will be baked ;\

Also, you overestimate GPGPU utilization on PS4. Yes, you can utilize the PS4 GPU much better than any PC GCN card, but it's not the difference between 50% and 100%; it's the difference between roughly 80% and 92-93%. Also, because the consoles have crappy CPUs, they will use compute for stuff that on PC would run on the CPU.

I can bet that as the generation progresses, games will trade precision of effects for compute techniques. Why? Because you will be able to differentiate games more by physics and simulation than by higher quality motion blur or DOF. And I personally don't like it, because all the problems we've had lately this gen will happen again: awful IQ, awful LoD, low quality shadows, poor framerates. Image quality will plummet significantly for the sake of some nice simulation effect.
 
Yep, SVOGI was the best GI implementation to date, but dat cost :) Actually I was super excited by the first UE4 technology demo; real-time lighting is where we should be in next-gen, no matter how linear and scripted a game is.

I read that they even have problems with their alternative solution for GI. SVOGI with the Nvidia 800 series? Hope so.
 
Yep, SVOGI was the best GI implementation to date, but dat cost :) Actually I was super excited by the first UE4 technology demo; real-time lighting is where we should be in next-gen, no matter how linear and scripted a game is.

Pretty much every game I've read some tech slides of is doing real time, physically based lighting which is awesome. UE4 is doing it too, they just had to cut SVOGI. Capcom actually mentioned that they're using voxels for the lighting from the dragon's fire in Deep Down and have ideas on how to get it working at a larger scale. I think we'll get some form of real time GI at some point this gen, just probably not SVOGI since it killed performance even on a 680.
 
I read that they even have problems with their alternative solution for GI. SVOGI with the Nvidia 800 series? Hope so.

Let's hope they make it a flag in the editor: baked on consoles, SVOGI at max/high settings.

----
Pretty much every game I've read some tech slides of is doing real time, physically based lighting which is awesome. UE4 is doing it too, they just had to cut SVOGI. Capcom actually mentioned that they're using voxels for the lighting from the dragon's fire in Deep Down and have ideas on how to get it working at a larger scale. I think we'll get some form of real time GI at some point this gen, just probably not SVOGI since it killed performance even on a 680.

For example KZ:SF has baked GI. UE 4 has baked GI. Luminous has baked GI as well.
 
For example KZ:SF has baked GI. UE 4 has baked GI. Luminous has baked GI as well.

I think BF4 is using baked GI too. I was just saying it may come in some form in higher end visual games towards the end of the gen. Getting all the lighting real time first is a much bigger priority and makes a big difference in itself.
 
Yep, SVOGI was the best GI implementation to date, but dat cost :) Actually i was super excited by first UE 4 technology demo, real time lighting is where we should be in next-gen no matter how linear and scripted game is.

SVOGI has other problems than just performance. There are serious issues with colors bleeding through geometry.

A good example: say I am in a room that shares a wall with another room containing, for example, a strong red light. In that case, the red GI in that room would bleed through the wall onto the floor of my room. Currently there is no way of preventing this except increasing the voxel resolution or increasing the thickness of the walls. The first causes a huge performance hit and also works against SVOGI's LOD scheme (the further from the camera, the lower the resolution of the SVOGI calculations); the second is contextual and not a complete solution.

This is a bit off topic, but I just wanted to point out that those awesome SVOGI demos you've seen aren't the whole story.
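
For anyone wondering why the bleeding happens: cone tracing reads pre-filtered (mip-mapped) voxel opacity, so a wall thinner than the voxel size at the sampled mip level only partially fills its voxels and its occlusion gets averaged down. Here's a tiny standalone sketch of that relationship; the voxel size, wall thickness and cone angle are made-up numbers, not taken from any actual engine:

```cpp
#include <cmath>
#include <cstdio>

// Minimal sketch (not any engine's code): why a voxel cone trace leaks through thin walls.
// At distance d, a cone of half-angle theta has a footprint of roughly 2*d*tan(theta);
// the trace samples the mip level whose voxel size matches that footprint. Once that
// voxel size exceeds the wall thickness, the wall fills only a fraction of the voxel,
// so its stored opacity is averaged down and light "bleeds" through.

float voxelSizeAtMip(float baseVoxelSize, int mip) {
    return baseVoxelSize * std::pow(2.0f, static_cast<float>(mip));
}

int mipForConeFootprint(float footprint, float baseVoxelSize, int maxMip) {
    int mip = 0;
    while (mip < maxMip && voxelSizeAtMip(baseVoxelSize, mip) < footprint) ++mip;
    return mip;
}

int main() {
    const float baseVoxel = 0.25f;   // finest voxel size in metres (assumed)
    const float wall      = 0.20f;   // wall thickness in metres (assumed)
    const float halfAngle = 0.3f;    // cone half-angle in radians (assumed)

    for (float d = 1.0f; d <= 16.0f; d *= 2.0f) {
        float footprint = 2.0f * d * std::tan(halfAngle);
        int   mip       = mipForConeFootprint(footprint, baseVoxel, 8);
        float voxel     = voxelSizeAtMip(baseVoxel, mip);
        // Fraction of the voxel the wall actually fills: stored opacity is roughly this,
        // so anything well below 1.0 lets radiance from the far side leak through.
        float occlusion = std::fmin(1.0f, wall / voxel);
        std::printf("distance %5.1fm  mip %d  voxel %5.2fm  effective opacity %.2f\n",
                    d, mip, voxel, occlusion);
    }
    return 0;
}
```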
 
alexandros said:
I guess I'll be the bad guy again and point out that the comparison is both pointless and flawed. The PS4 is not competing with early gen games, it's competing with late gen PS3 and 360 titles and of course the great-looking PC versions. The pointless part: They need to sell the console to the current audience, not the one from 2007. Gamers of 2013 have seen Crysis 3, Uncharted 3, Halo 4, pick any graphically intensive game of the past couple of years. Next-gen games need to provide a significant leap from these titles if they are to drive console purchases.

I don't think you quite understand the point of the comparison then. It's not really PS4 versus anything. It's a comparison between launch PS3 software and launch PS4 software, to show the state of software at launch. That being said, the PS4 is already receiving a strong response from customers, and even the XBO is picking up. I'm guessing the jump is fine in a lot of customers' eyes.


The flawed part: the PS3 was notoriously difficult to program for. It had a weird architecture, not that great tools, a multicore CPU with strange and unfamiliar technology, split memory, take your pick. It makes sense that developers couldn't get decent results until way into the generation. This time the PS4 is supposedly designed with the goal to make game design as simple and effortless as possible. It's based on an x86 CPU and an AMD GPU, technologies that most decent game makers are intimately familiar with. It's disingenuous to suggest that the situation is even remotely similar. It gives the impression of preemptive damage control before footage from the real console versions (not from "PCs with equivalent spec") start appearing.

This is a fallacy. Console development is hugely different from PC development. Just because the hardware is a lot more similar to PC parts this time doesn't mean that previously console-only developers are going to just eat it up. Sony has designed yet another graphics API that will take some time to get used to, and there have been fundamental modifications to the GCN cores and their scheduling systems. The fact that Sony wants the system to be as accessible and easy to develop for as possible doesn't change the fact that there can and will be a steep learning curve as console-only developers adjust to GPGPU.
 
That's not the point. The whole point was that studios won't be able to utilize the full potential of next gen systems in the first 12-24 months.

That's true. But it's even more true because of asset creation and algorithms. Better GPGPU use will push hardware to 93-95% utilization, but what will really push next-gen games is engine technology and asset creation :)

I'm not really talking about crappy IQ as in 1080p + SMAA in a deferred renderer, but something lower :) 1080p + SMAA T2x is quite decent. I really hope that in future titles frame interpolation, real-time AA techniques and resolution changes will be the norm.
Like, by default the game renders at 1.3x of 1080p with SMAA 4x;
when the frame time starts to increase, it drops to 1080p + SMAA 4x,
then to 1080p + SMAA T2x,
then in very demanding scenes with a lot of particles and motion it would decrease to 0.7x of 1080p + SMAA T2x, but because you have high quality DOF and motion blur effects it's not as visible.
And as the performance demand decreases, IQ goes back up.
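
A heuristic like that is easy enough to sketch. Here's a toy frame-time controller that steps through the tiers described above; the thresholds, smoothing factor and names are all mine and purely illustrative, not from any real engine:

```cpp
#include <cstdio>
#include <vector>

// Toy dynamic-resolution controller: steps down through quality tiers when the
// smoothed frame time creeps over budget, and back up when headroom returns.
struct Tier {
    const char* name;
    float renderScale;   // relative to 1080p
};

class DynamicResController {
public:
    DynamicResController()
        : tiers_{{"1.3x 1080p + SMAA 4x",  1.3f},
                 {"1080p + SMAA 4x",       1.0f},
                 {"1080p + SMAA T2x",      1.0f},
                 {"0.7x 1080p + SMAA T2x", 0.7f}} {}

    const Tier& update(float frameMs) {
        // Exponentially smoothed frame time so a single spike doesn't cause flicker.
        smoothedMs_ = 0.9f * smoothedMs_ + 0.1f * frameMs;
        if (smoothedMs_ > budgetMs_ * 1.05f && current_ + 1 < (int)tiers_.size())
            ++current_;                       // over budget: drop a tier
        else if (smoothedMs_ < budgetMs_ * 0.85f && current_ > 0)
            --current_;                       // plenty of headroom: climb back up
        return tiers_[current_];
    }

private:
    std::vector<Tier> tiers_;
    int   current_    = 0;
    float smoothedMs_ = 16.6f;
    const float budgetMs_ = 16.6f;            // 60 fps target
};

int main() {
    DynamicResController ctrl;
    // Fake frame times: calm scene, then a heavy particle-filled stretch, then calm again.
    std::vector<float> frameTimes(20, 14.0f);
    frameTimes.insert(frameTimes.end(), 30, 21.0f);
    frameTimes.insert(frameTimes.end(), 30, 13.0f);

    for (size_t i = 0; i < frameTimes.size(); ++i) {
        const Tier& t = ctrl.update(frameTimes[i]);
        if (i % 10 == 0)
            std::printf("frame %3zu: %5.1f ms -> %s\n", i, frameTimes[i], t.name);
    }
    return 0;
}
```

The hysteresis (drop above ~105% of budget, climb back only below ~85%) is there so the image doesn't ping-pong between tiers every other frame.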

--
I think BF4 is using baked GI too. I was just saying it may come in some form in higher end visual games towards the end of the gen. Getting all the lighting real time first is a much bigger priority and makes a big difference in itself.

They had technology for real-time GI in BF3 already, they just didn't need to use it. But they are pushing destructibility more in BF4, and I think in the SP you have lighting conditions changing in real time, so I think they actually are using it.
I actually think Frostbite 3 uses Geomerics' dynamic GI by default.

==
SVOGI has other problems than just performance. There are serious issues with colors bleeding through geometry.

A good example: say I am in a room that shares a wall with another room containing, for example, a strong red light. In that case, the red GI in that room would bleed through the wall onto the floor of my room. Currently there is no way of preventing this except increasing the voxel resolution or increasing the thickness of the walls. The first causes a huge performance hit and also works against SVOGI's LOD scheme (the further from the camera, the lower the resolution of the SVOGI calculations); the second is contextual and not a complete solution.

This is a bit off topic, but I just wanted to point out that those awesome SVOGI demos you've seen aren't the whole story.

Didn't know about that. Still, I think it's a solvable problem in the long run, and real-time high quality GI from every light source is worth it :)

BTW, the technique Nvidia's CloudLight is using looks really good and also supports many lights casting ambient light.
 
I think BF4 is using baked GI too. I was just saying it may come in some form in higher end visual games towards the end of the gen. Getting all the lighting real time first is a much bigger priority and makes a big difference in itself.

BF3 uses Enlighten. At least on PC it is kinda real time in that it updates every few seconds http://blog.wolfire.com/2011/03/GDC-session-summary-Battlefield-3-Radiosity. I imagine that BF4 will be basically the same on PC/PS4/XO.
 
I guess I'll be the bad guy again and point out that the comparison is both pointless and flawed.

Yeah, my thoughts as well. I actually thought I had loaded up the wrong video for a moment because it didn't quite make sense. Well, I got the point, I just think it was misplaced.

Btw, I vividly remember everyone on the forums praising the look of COD3 in the context of its 60fps. It did look good at the time, and it was only after the tech started to look aged compared to newer tech (the 4th in the series) that the criticism started to show up, so I didn't quite get that point either.
 
Bad-mouthing?
ok..........


Let's say hypothetically that the heterogeneous computing offered by the next-gen systems does get a good start and shows some progress in real examples. Devs start sharing resources between CPU and GPU... etc.

LET US NOT FORGET that those resources are being shared between a very low powered CPU and a midrange GPU.

Can we really expect miracles from a new coding paradigm on such lower end hardware?

A 100 GFLOP CPU should provide a lot of power to the developers. And in terms of general purpose performance, Jaguar runs circles around Cell and Xenos.

If the CPU were stronger and the GPUs were stronger I could see this being taken advantage of in a game-changing way... because new data structures could be traversed and taken advantage of, with the performance to do it.
But since they are not hugely powerful components, I can hardly imagine something exotic being built into and on top of modern-day engines with such a low performance profile.

I think it will add a bit of pizzazz here or there to an effect (maybe)... but a paradigm shift in programming architecture usually requires a crap ton of power. A class "A" example of this is Intel's Larrabee. It had serious programmability advantages but lacked the raw horsepower to realize them in real-time games.

I would not call a 100 GFLOP CPU a low performance CPU. The Jaguar architecture is a lot stronger than Piledriver, and Jaguar is actually the x86 CPU with the best performance/watt ratio on the market. This means there isn't any other x86 CPU able to provide the same performance as Jaguar within the TDP budget of the PS4.
 
I would not call a 100 GFLOP CPU a low performance CPU. The Jaguar architecture is a lot stronger than Piledriver, and Jaguar is actually the x86 CPU with the best performance/watt ratio on the market. This means there isn't any other x86 CPU able to provide the same performance as Jaguar within the TDP budget of the PS4.

It's still a low performance CPU. It has an amazing performance-per-watt ratio, but it's still slow compared to even 5-year-old quad-core processors.

How can you describe them as anything other than slow, when a current generation CPU with half as many cores is 4x more powerful, and the next generation of Intel CPUs will introduce AVX 3.2, which will increase peak performance by another 2x?
 
A 100 GFLOP CPU should provide a lot of power to the developers. And in terms of general purpose performance, Jaguar runs circles around Cell and Xenos.

I would not call a 100 GFLOP CPU a low performance CPU. The Jaguar architecture is a lot stronger than Piledriver, and Jaguar is actually the x86 CPU with the best performance/watt ratio on the market. This means there would not be any other CPU able to provide the same performance as Jaguar within the TDP budget of the PS4.

That is 100 GFLOPS of theoretical peak performance. CPUs with far higher rated GFLOPS do not even reach their theoretical throughput: as shown here. Rather, you should talk about real world performance... and the real world performance of the thing is even worse.

Even though Jaguar has the highest performance per watt... it is only that way in the 1.0-2.0 GHz range. As soon as the clock goes up or down... the performance per watt of the architecture goes down. Saying that it has the best performance per watt is really misleading. Rather, you should say that "at its power level it has the best performance per watt." Smartphone CPUs offer better "performance per watt" than Jaguar in their own range... but no one would mention that, as it is a silly statement which says nothing about its performance relative to other processors.

I am incredibly confused though how you can consider a netbook cpu (no matter how many cores it has) to be "powerful" and not "low end." You have to convince yourself of that... and not rely on a relative or objective reality.

It's still a low performance CPU. It has an amazing performance-per-watt ratio, but it's still slow compared to even 5-year-old quad-core processors.

Like I said above to the other poster... it only has good performance per watt within its "performance class." Scale up or down and that efficiency disappears.
 
With an HSA APU you can have the best of both CPU and GPU at the same time: CPUs are extremely smart, but very weak. GPUs are very dumb, but extremely strong. The whole point of HSA is to use the brain of the CPU and the muscle of the GPU on a single task, which eventually makes the HSA APU extremely smart and extremely strong at the same time.
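
To make the "brain plus muscle" idea concrete, here's a rough CPU-only sketch of the pattern: the branchy, irregular decision-making stays on the CPU, while the wide, uniform number crunching is written as a flat data-parallel loop, which is exactly the part an HSA APU would hand to the GPU over the shared, coherent memory instead of copying buffers back and forth. The task split and names are just illustrative, not any real engine's code:

```cpp
#include <cstdio>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; bool awake; };

// "Brain": irregular, branchy work the CPU is good at - deciding which particles
// actually need an expensive simulation step this frame.
std::vector<int> pickActive(const std::vector<Particle>& ps) {
    std::vector<int> active;
    for (int i = 0; i < (int)ps.size(); ++i)
        if (ps[i].awake && ps[i].y > 0.0f) active.push_back(i);
    return active;
}

// "Muscle": uniform, data-parallel math. On an HSA APU this loop is the part you
// would hand to the GPU; because CPU and GPU share coherent memory, the index list
// built above could be consumed directly, with no copy.
void integrate(std::vector<Particle>& ps, const std::vector<int>& idx, float dt) {
    for (int i : idx) {
        Particle& p = ps[i];
        p.vy -= 9.81f * dt;
        p.x += p.vx * dt; p.y += p.vy * dt; p.z += p.vz * dt;
        if (p.y < 0.0f) { p.y = 0.0f; p.vy *= -0.5f; }   // crude ground bounce
    }
}

int main() {
    std::vector<Particle> ps(1000, {0, 10, 0, 1, 0, 0, true});
    for (int frame = 0; frame < 3; ++frame) {
        auto active = pickActive(ps);                 // CPU: the brain
        integrate(ps, active, 1.0f / 60.0f);          // GPU candidate: the muscle
        std::printf("frame %d: %zu active, p0.y = %.2f\n", frame, active.size(), ps[0].y);
    }
    return 0;
}
```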

+1

There are potentially vast benefits to many kinds of simulation algorithms and also AI.
 
I think BF4 is using baked GI too. I was just saying it may come in some form in higher end visual games towards the end of the gen. Getting all the lighting real time first is a much bigger priority and makes a big difference in itself.

BF4 (like BF3) uses a solution from Geomerics. It's a pre-baked GI solution; however, it doesn't precompute all the way through. Some heavy calculations are done offline, but they produce data that lets the game calculate, with much less effort, how the scene will finally be lit, allowing some dynamism.
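
The general shape of that kind of middleware looks something like the sketch below; to be clear, this is a generic precomputed-radiosity illustration, not Geomerics' actual API or data format. Offline you bake how strongly each surface patch "sees" every other patch, and at runtime a cheap multiply turns the current direct lighting into an indirect bounce, so lights can change even though the geometry relationships were baked:

```cpp
#include <cstdio>
#include <vector>

// Generic precomputed-radiosity sketch: the expensive visibility/form-factor work
// between surface patches is done offline; at runtime one matrix-vector multiply
// turns the *current* direct lighting into an indirect bounce, so lighting
// conditions (sun position, newly exposed emitters) can change every frame.

using Matrix = std::vector<std::vector<float>>;   // transfer[i][j]: how much patch j lights patch i

std::vector<float> bounce(const Matrix& transfer, const std::vector<float>& direct) {
    std::vector<float> indirect(direct.size(), 0.0f);
    for (size_t i = 0; i < direct.size(); ++i)
        for (size_t j = 0; j < direct.size(); ++j)
            indirect[i] += transfer[i][j] * direct[j];    // cheap compared to the offline bake
    return indirect;
}

int main() {
    // Toy scene: 3 patches; the transfer matrix would come from the offline precompute.
    Matrix transfer = {{0.0f, 0.3f, 0.1f},
                       {0.3f, 0.0f, 0.2f},
                       {0.1f, 0.2f, 0.0f}};

    std::vector<float> directNoon    = {1.0f, 0.2f, 0.0f};   // sun hitting patch 0 hard
    std::vector<float> directEvening = {0.1f, 0.1f, 0.8f};   // lighting conditions changed

    for (const auto& direct : {directNoon, directEvening}) {
        auto ind = bounce(transfer, direct);
        std::printf("indirect: %.2f %.2f %.2f\n", ind[0], ind[1], ind[2]);
    }
    return 0;
}
```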
 
that is 100gflops of theoretical perfect performance. CPUs with incredibly higher rated Gflops do not even reach their theoretical potential. As shown here.

Even though Jaguar is highest performance/per watt... it is only that way in the 1.0-2.0 Ghz range. As soon as the clock goes up... the performance per watt goes down on the architecture.

And how do you know this? Please provide a link. From what I know, there isn't any test showing the performance of a Jaguar processor clocked higher than 2Ghz.

Saying that it has the best performance per watt is really mis-representative. Rather... you say at its power level it has the best performance per watt. Smart phones CPUs offer better "performance per watt" in their metric range than jaguar... but no one would mention that is a silly statement which speaks nothing to its relative performance.

No, I specifically talked about x86 CPUs. So, what smartphones have x86 CPUs with a better performance/watt ratio than Jaguar?

I am incredibly confused though how you can consider a netbook cpu (no matter how many cores it has) to be "powerful" and not "low end." You have to convince yourself of that... and not rely on a relative or objective reality.

Like I said above to the other poster... it only has good performance per watt within its "performance class." Scale up or down and that efficiency disappears.

So would you say that a 16-core Jaguar providing 200 GFLOPS would still be a low end CPU, just because the Jaguar architecture is so power efficient that it can be used in tablets and netbooks?
 
And how do you know this? Please provide a link. From what I know, there isn't any test showing the performance of a Jaguar processor clocked higher than 2Ghz.

This is literally how most CPUs work. They are designed around very specific power and performance goals/ratios. Hence why you would not see a Jaguar chip clocked down to like 0.25 GHz in a cellphone... or up to 6.0 GHz in a desktop PC. It would scale poorly.

No, I specifically talked about x86 CPUs. So, what smartphones have x86 CPUs with a better performance/watt ratio than Jaguar?
My bad.

So would you say that a 16-core Jaguar providing 200 GFLOPS would still be a low end CPU, just because the Jaguar architecture is so power efficient that it can be used in tablets and netbooks?

Yes. I would rather always have fewer cores with more performance. It scales better and is more easily generalizable. Not everyone would agree with me though... this is a matter of preference.
 
The Jaguars in the Temash mobile APUs are dual cores with 1GHz clock frequency. Next gen consoles use Jaguar octa cores with 1.6GHz clock frequency. That's a performance difference of more than 500% in favor of the consoles! That's a difference like Xbox360 CPU -> Xbox One CPU.

Eight Jaguars at 1.6GHz are even faster than four Piledrivers at 3.2GHz, but Jaguar has the additional advantages of small die space, low power consumption and being a synthesizable design. In any case it's a much better solution for next gen systems than four high clocked/high IPC cores, since you can dedicate more resources to your GPU (or eSRAM in MS's case).

It's true that eight Jaguars can't compete with a modern Intel, but at least Sony aims to compensate for it by using the GPU for compute. AMD designed the Jaguars with GPU compute in mind. There will even be heterogeneous server processors based on Jaguar+GCN that will use the GCN GPU part for nothing but GPGPU. If you aim for low power consumption then Jaguar is a great CPU design.

When a CPU can't compete with an Intel mid-range quadcore from 4 years ago it is distinctly low-end.
 
Or even Phenom II X4.

It doesn't matter that it's a good design for consoles, it's still a low end CPU.

Is raw power the only way to define a CPU these days?


If Intel today revealed a nuclear fusion powered CPU that they could only clock at like 100Hz and it obtained a theoretical max of 500 FLOPS, would it be "low end" or what?
 
When a CPU can't compete with an Intel mid-range quadcore from 4 years ago it is distinctly low-end.

But we don't play benchmarks on consoles, we play games. Thus, we should judge how the games perform on PS4. If I can play Battlefield 4 at 1080p@60fps then I'd say the CPU is fine.
 
Is raw power the only way to define a CPU these days?

In a discussion about whether a CPU is low end or not, it is.

If Intel today revealed a nuclear fusion powered CPU that they could only clock at like 100Hz and it obtained a theoretical max of 500 FLOPS, would it be "low end" or what?
That would be high end, because it would compete with their current offerings in terms of flops.

---
But we don't play benchmarks on consoles, we play games. Thus, we should judge how the games perform on PS4. If I can play Battlefield 4 at 1080p@60fps then I'd say the CPU is fine.

And how does this change the fact that it's low end? Is it really so hard to admit? No one is saying that M$ and Sony made a mistake in choosing a smaller die for the CPU and a bigger one for the GPU, but the fact is that the CPUs in the next-gen consoles are weak.
 
But we don't play benchmarks on consoles, we play games. Thus, we should judge how the games perform on PS4. If I can play Battlefield 4 at 1080p@60fps then I'd say the CPU is fine.

The question is:
"Is the PS4 CPU a high-end or a low-end CPU?"


The answer to that question is:
"Yes. It's not close to the high-end CPUs on the market, and is closer to the low-end CPUs on the market, when comparing how well it performs its task."

The answer is not:
"No, because the games look nice."


And the question was NOT:
"Is the CPU 'fine'?". On that answer you have a whole bunch of factors to consider; most importantly what is considered fine.
 
The question is:
"Is the PS4 CPU a high-end or a low-end CPU?"


The answer to that question is:
"Yes. It's not close to the high-end CPUs on the market, and is closer to the low-end CPUs on the market, when comparing how well it performs its task."

The answer is not:
"No, because the games look nice."


And the question was NOT:
"Is the CPU 'fine'?". On that answer you have a whole bunch of factors to consider; most importantly what is considered fine.

The last question shows why the first question is a dumb question. CPU power is less important for games than GPU power, and consoles are closed boxes with low power consumption, which makes the Jaguar CPU cores a very good choice. Additionally it's pretty funny when PC guys are complaining about this, because using this exact CPU in both consoles means PC games will offer nice multi core support.
 
I'm pretty sure Sony's engineers know what they're doing and they've chosen Jaguar on purpose, not by coincidence and not out of incompetence either. I'm also sure that they're a lot smarter than us, especially in foreseeing what might be useful in the future for games.

If we were tasked with assembling a console for $400, most people would end up with

a.: exact same config
b.: damn similar config
c.: Xbox One

Ps.: I think (as Carmack said too) Sony made "wise" engineering choices, and of course they could have made a more powerful console, but they didn't, and it's on purpose.

Ps4 will have amazing looking, performing games.
 
Which is absolutely dumb and pointless. These systems use heterogeneous processors. The CPUs and the integrated GPUs are meant to work in concert: The PS4 uses the most powerful APU of all time. Period. Saying "it's low-end" is pure ignorance. We're talking about hUMA RAM and cache coherency here. A four year old Intel quad core doesn't have this stuff.
And how are those affecting the performance of the CPU again?

---
Gemüsepizza;75515999 said:
The last question shows why the first question is a dumb question. CPU power is less important for games than GPU power, and consoles are closed boxes with low power consumption, which makes the Jaguar CPU cores a very good choice.

Yeah, in many cases it is, but what about situations where the GPU has to use compute for tasks that the CPU should do but is too weak for?
Is it still a good CPU in general? Or at least future proof?
 
On this list someone posted earlier, none of the CPUs go above 104GF, even things like i7@5GHz.
http://www.techpowerup.com/forums/showthread.php?t=94721

I know this is actual measured flops vs. theoretical flops, but I guess the multi-core, low-power core approach isn't bad in the end, when you're dealing with software written so that it actually has to use all those cores, and not concerned with legacy stuff that doesn't scale up.
 
On this list someone posted earlier, none of the CPUs go above 104GF, even things like i7@5GHz.
http://www.techpowerup.com/forums/showthread.php?t=94721

I know this is actual measured flops vs. theoretical flops, but I guess the multi-core, low-power core approach isn't bad in the end, when you're dealing with software written so that it actually has to use all those cores, and not concerned with legacy stuff that doesn't scale up.

That is from a burn program which is meant to max a processor in thermal load.

It is notoriously heavy.
 
Which is absolutely dumb and pointless. These systems use heterogeneous processors. The CPUs and the integrated GPUs are meant to work in concert: The PS4 uses the most powerful APU of all time. Period.

Honest question, you seem to know your stuff so I'd like to know this: Isn't "The PS4 uses the most powerful APU of all time" a bit like saying (just an example here, I don't know anything about cars) that the Prius is the most powerful electric car ever made? I mean sure, it might be, but it still isn't powerful enough to match something like a Mercedes, let alone a Ferrari. Do you think that the PS4 will be able to at least match modern midrange PCs in performance? Again, it is an honest question.
 
On this list someone posted earlier, none of the CPUs go above 104GF, even things like i7@5GHz.
http://www.techpowerup.com/forums/showthread.php?t=94721

I know this is actual measured flops vs. theoretical flops, but I guess the multi-core, low-power core approach isn't bad in the end, when you're dealing with software written so that it actually has to use all those cores, and not concerned with legacy stuff that doesn't scale up.

Those are double precision numbers, 8 Core Jaguar at 1.6GHz is looking at 38.4 GFLOPS DP theoretical.
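
For reference, the peak numbers being thrown around fall straight out of cores × clock × FLOPs per cycle per core. Treat this as back-of-the-envelope: 8 SP FLOPs/cycle is the commonly cited figure for Jaguar's two 128-bit FP pipes, and the DP rate below is simply whatever the 38.4 GFLOPS figure above implies, not a spec-sheet value:

```cpp
#include <cstdio>

// Back-of-the-envelope peak FLOPS: cores * clock * FLOPs-per-cycle-per-core.
// 8 SP FLOPs/cycle is the usual figure quoted for Jaguar's two 128-bit FP pipes;
// 3 DP FLOPs/cycle is just the rate implied by the 38.4 GFLOPS number above.
int main() {
    const double cores   = 8.0;
    const double clockHz = 1.6e9;

    const double spPerCycle = 8.0;
    const double dpPerCycle = 3.0;

    std::printf("SP peak: %.1f GFLOPS\n", cores * clockHz * spPerCycle / 1e9); // ~102.4
    std::printf("DP peak: %.1f GFLOPS\n", cores * clockHz * dpPerCycle / 1e9); // ~38.4
    return 0;
}
```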
 
My concerns about the CPU stem from the fact that they've said nothing about it aside from "8-core Jaguar." They've detailed buses, GPU ACEs, and even the decompression unit, but not the CPU. I think it'll be sufficient albeit unimpressive, but I'd like a comment from someone saying it's fine. Outside of one brief comment from Evolution about liking it and the PS2 team saying they were having trouble optimizing for multithreading, we've heard basically nothing.

BF3 uses Enlighten. At least on PC it is kinda real time in that it updates every few seconds http://blog.wolfire.com/2011/03/GDC-session-summary-Battlefield-3-Radiosity. I imagine that BF4 will be basically the same on PC/PS4/XO.

BF4 (like BF3) uses a solution from Geomerics. It's a pre-baked GI solution; however, it doesn't precompute all the way through. Some heavy calculations are done offline, but they produce data that lets the game calculate, with much less effort, how the scene will finally be lit, allowing some dynamism.
Oh cool. I knew it baked fast but didn't realize it was like that. Thanks for pointing that out. I expect at least something like that will become common next gen.
 
You can be excited about something and critical about something at the same time, they're not mutually exclusive. I understand enthusiasm and it's great that you look forward to all the new games, but when you're constantly saying "it looks fine, it looks great, it's fantastic, it's unbelievable" it doesn't leave much room for discussion. Visual fidelity isn't even the issue here, the real issue is the possibility of publishers and developers misleading their consumer base with footage that is running on high-end PCs but is supposed to be from the console versions. A passive stance ensures that this will keep happening.

But there is nothing for me personally to be critical about. I like what I'm seeing. Every time I've finally got my hands on games that have previously 'misled' people in the past, they still always looked great to me, never been disappointed. I'm not a cynic, and honestly visual fidelity is not that important to me, it's just icing on the cake. If a game looks a little worse than when first shown, it's not that big a deal to me. I love gaming, that's it really.
 
On this list someone posted earlier, none of the CPUs go above 104GF, even things like i7@5GHz.
http://www.techpowerup.com/forums/showthread.php?t=94721

I know this is actual measured flops vs. theoretical flops, but I guess the multi-core, low-power core approach isn't bad in the end, when you're dealing with software written so that it actually has to use all those cores, and not concerned with legacy stuff that doesn't scale up.

I wonder what placement in that list the PS4/Xbone proc would land.
 
Gemüsepizza;75515999 said:
The last question shows why the first question is a dumb question. CPU power is less important for games than GPU power, and consoles are closed boxes with low power consumption, which makes the Jaguar CPU cores a very good choice. Additionally it's pretty funny when PC guys are complaining about this, because using this exact CPU in both consoles means PC games will offer nice multi core support.

Yes, that's why we always get prettier graphics and not better games, e.g. achieved with better AI. Graphics everywhere.
 
But there is nothing for me personally to be critical about. I like what I'm seeing. Every time I've finally got my hands on games that have previously 'misled' people in the past, they still always looked great to me, never been disappointed. I'm not a cynic, and honestly visual fidelity is not that important to me, it's just icing on the cake. If a game looks a little worse than when first shown, it's not that big a deal to me. I love gaming, that's it really.

Ok, I understand, fair enough. I am of the opinion that we should all be a bit critical in order to help our favorite hobby improve.
 
That's an awful example in my eyes. The HSA comes with performance benefits, but a Prius is just a slow car that runs with electricity instead of fuel.



I'm not sure what your definition of "midrange" is. According to Tom'sHardware it's HD7770 or HD7790. PS4 will easily outperform those.

Not gonna lie... I never thought of a $110-$150 GPU as being mid range. That seems like performance/low end to me.

A 7950 or GTX 670 seems like decent mid range to me... i.e. about 300 bucks. You would not have to upgrade that for a good couple of years. But then again... that is just an observation.
 
Those are double precision numbers, 8 Core Jaguar at 1.6GHz is looking at 38.4 GFLOPS DP theoretical.
Hmm, I see. It doesn't say what it's measuring so I had no way of knowing. So it's at the level of an i7 920@3.2GHz, which is not bad either, but again, theoretical vs. measured of course.

I wonder what placement in that list the PS4/Xbone proc would land.
Probably far worse than that. Especially Cell, which is not really testable under these benchmarks (only the PPU is).
 
That would be a 7870 then. But you already knew that.

Of course I knew it, I showed a $600 PC build with it just a few pages ago. Do note though that a $200 graphics card around the time of the PS4's launch will certainly be more powerful than the 7870. You'll get much more bang for your buck.

I expect PS4 to easily outperform it on day one.

Ok then, we'll revisit this when the time comes.
 