hgarrett54
Banned
Why the f*** has this turned into a PC gaming superiority thread?
Why the f*** has this turned into a PC gaming superiority thread?
The only decent comparison we have between the tech demos and console builds is Watch_Dogs, which is closer than I thought, and the game isn't even out yet.
Because the tech demos got downgraded, and they ran on PC first.
The only decent comparison we have between the tech demos and console builds is Watch_Dogs, which is closer than I thought, and the game isn't even out yet.
Although since it's from Ubi, all demos are suspect before release.
I would be wary of most of what you saw at E3 if I were in your shoes, not just Ubi's conference.
You know what? There's this thing called "the middle ground" between what you just described and Shinobi's "everything is gonna be great, sunshine and rainbows everywhere" approach. Both extremes are fanboy territory, the truth lies somewhere in the middle. We'll get good-looking games but don't expect any sort of huge leap.
Why the f*** has this turned into a PC gaming superiority thread?
With Sony looking worse than at the reveal, I'm a bit down on next-gen visuals.
Why the f*** has this turned into a PC gaming superiority thread?
If you truly want to see what next-gen consoles are capable of, just look at inFamous: SS.
If you truly want to see what next-gen consoles are capable of, just look at inFamous: SS.
how so?
Mainly Killzone. I've watched 1080p direct feeds of gameplay from both shows in slow-mo quite a few times for comparisons (did the same for Watch_Dogs). Almost all the post-processing was gone in the E3 demo, along with added pop-in and a lower frame rate. That said, I'm not doubting GG will deliver by launch; I'm sure it'll look and run at least as well as it did in February. I just got myself overexcited by reading all of their goals for the game and expected to see them in the E3 build. I think they just wanted to show off the larger environments (which skyrocketed my hype for the gameplay) and had to revert to an old build to get it playable by the press. I was expecting the game to stay fairly linear, so I thought we would see visual improvements sooner.
This PC superiority bullshit has to stop. Derails EVERY fucking thread.
Why? You don't like the fact that PC does have some advantages over consoles?
You would have to be very dumb/rich/impatient to pay full price for 15 games over five years. Only a modicum of effort and patience is required to get console games at half price or less. I just got Tomb Raider for $20 only three months after launch.
Also we have yet to see MGSV and FFXV running on consoles.
This PC superiority bullshit has to stop. Derails EVERY fucking thread.
Pretty much this. You'll be amazed that there are still people in this thread who believe we need GTX Titans just to run games smoothly.
I still had doubts they could deliver. E3 was alpha for Killzone, and that means almost all assets and code are done. GG is probably already in internal beta with it, which means testing, fixing, cutting, etc. Adding something at this stage is a no.
They've shown a couple of models that are much higher quality than what we've seen and claimed they just weren't in the builds shown. I wish they had released real screens of them instead of just showing them in their lighting-test space, though. GG also adds a lot of visual polish late in development, particularly in lighting and post-processing, and they frontloaded a lot of the core design work so they could focus on buffing visuals later on. I don't think it will be hugely more impressive than what we've seen, but the new models, better motion blur, better DoF, the better AA method, a steady framerate, and no pop-in (all goals from their tech presentations) should make it look pretty damn good by launch. Most of them will be subtle improvements, particularly the post-process stuff, but it should give the game a more cohesive look.
Those models looked impressive. Hopefully we can see them in-game at Gamescom.
What am I a fanboy of exactly? Sorry that I'm excited for games.
This PC superiority bullshit has to stop. Derails EVERY fucking thread.
No. You know what derailed this "fucking" thread? People who decided that pricing and ease-of-use (simplicity) was relevant to the topic of graphical fidelity. I don't care which "side" they are on. That's what derailed this thread--not the immense truck-load of name-calling. That's just expected behavior whenever there is something nice to say about PCs when pitted up against consoles.
You would have to be very dumb/rich/impatient to pay full price for 15 games over five years. Only a modicum of effort and patience is required to get console games at half price or less. I just got Tomb Raider for $20 only three months after launch.
Nice little video comparing early PS3 games vs. early PS4 games.
From Sony.
http://www.youtube.com/watch?v=jNhcKGh_P9c
Pretty impressive difference, and a great show for launch window tech. Can't wait to see mid and late cycle stuff.
Pretty impressive difference, and a great show for launch window tech. Can't wait to see mid and late cycle stuff.
Yeah, I can't believe Resistance looked that bad.
These are HSA APUs. It's true that both the x86 and GCN ISAs are known to programmers (although GCN's full potential is locked away on PCs by the handbrake called DirectX), but at the SoC level we're talking about the most important paradigm shift since the introduction of multicore. The fusion of CPU and GPU is completely uncharted territory for every programmer in the business.
These are HSA APUs. It's true that both the x86 and GCN ISAs are known to programmers (although GCN's full potential is locked away on PCs by the handbrake called DirectX), but at the SoC level we're talking about the most important paradigm shift since the introduction of multicore. The fusion of CPU and GPU is completely uncharted territory for every programmer in the business.
I already mentioned it a couple of pages ago. I explained it to you several times already, if I recall correctly, and nevertheless you still keep spreading your bullshit. What the hell is wrong with you?
You're joking, right?!
Difference: KZ2 launched more than two years after the PS3's release. KZSF, on the other hand, is a launch title made on approximate hardware with beta tools.
The engineers at AMD, ARM, Intel, Microsoft, Nvidia, Sony and the whole HSA foundation disagree.
They're not, but they're already building hetero core processors (Intel) or are developing them right now (Nvidia's Project Denver).
With an HSA APU you can have the best of both CPU and GPU at the same time: CPUs are extremely smart but very weak, while GPUs are very dumb but extremely strong. The whole point of HSA is to use the brain of the CPU and the muscle of the GPU for a single task, which eventually makes the HSA APU extremely smart and extremely strong at the same time.
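To make that "brain plus muscle" split concrete, here's a toy sketch (plain Python, not real HSA code; all names and numbers are made up for illustration). The "smart" CPU part does the branchy, irregular decision-making; the "strong" GPU part does the same simple math across every element. On an HSA APU both sides would work on the same memory, with no copies between them.

```python
# Toy illustration of the CPU/GPU division of labor (hypothetical example,
# not a real HSA API; object fields and numbers are invented).

def cpu_decide(scene_objects):
    """Smart but weak: irregular, branchy logic the CPU is good at."""
    return [o for o in scene_objects if o["visible"] and o["lod"] > 0]

def gpu_crunch(objects):
    """Dumb but strong: the same simple math applied to every element,
    the kind of wide parallel work a GPU chews through."""
    return [o["verts"] * o["lod"] for o in objects]

scene = [
    {"visible": True,  "lod": 2, "verts": 100},
    {"visible": False, "lod": 3, "verts": 500},
    {"visible": True,  "lod": 0, "verts": 250},
    {"visible": True,  "lod": 1, "verts": 400},
]

# CPU picks the work, GPU grinds through it -- same data, zero copies.
work = cpu_decide(scene)
result = gpu_crunch(work)
print(result)  # [200, 400]
```

On a discrete PC GPU the hand-off between `cpu_decide` and `gpu_crunch` would cost a copy across PCIe; the HSA pitch is that on a shared-memory APU that hand-off is essentially free.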
But the whole idea behind GPU compute on PS4 is that you'll do it in parts where the GPU is not being utilized as much, for example while rendering opaque shadow maps.
Except for the fact that rendering has been done on the strong, dumb GPU for many years. So even if we get some nice algorithms that previously ran only on the CPU to use the "muscle" a bit, that won't make graphics prettier. Actually, it might be the other way around: if the GPU starts doing tons of compute work, it won't have as much time for rendering.
The engineers at AMD, ARM, Intel, Microsoft, Nvidia, Sony and the whole HSA foundation disagree.
But the whole idea behind GPU compute on PS4 is that you'll do it in parts where the GPU is not being utilized as much, for example while rendering opaque shadow maps.
We have to see what happens, but the stuff at least Mark Cerny has been public about has been quite specific compared to what is otherwise communicated to the public.
Not necessarily. Sony uses a heavily modified AMD GPU that differs from the desktop HD 7000 cards. Cerny's plan is to use the parts of the GPU that are underutilized during a single frame for GPGPU without a penalty to render performance.
To achieve that goal, AMD equipped the PS4 GPU with 8 Asynchronous Compute Engines and 64 parallel Compute Queues (the HD7970 has 2 ACEs and 2 CQs). Programmers can fill the Compute Queues with compute kernels, which wait for their dependencies to trigger their execution. The PS4 can have 64 kernels waiting at the same time. When a kernel is triggered, an ACE creates the wavefronts for the CUs and the compute job gets done. Since the PS4's GPU has eight ACEs, it can work on eight compute tasks at the same time; an HD7970 can only work on two. GPUs that don't have these ACEs at all have to use the Compute Pipeline in the Command Processor to do GPGPU, which creates a huge hit for rendering, since the Command Processor can only do either rendering or compute; a GPU with ACEs can do both at the same time.
So the plan is not to dedicate a fixed amount of GPU resources to GPGPU (like the 14+4 CU bullshit, for example), but to fill under-utilized parts of the GPU with compute kernels.
It'll have the greatest effect on compute, that's right, but it'll also have a smaller effect on rendering.
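The queue-and-dispatch mechanism described above can be sketched as a toy scheduler (plain Python, not real hardware code; only the numbers come from the posts above: 8 ACEs and 64 queue slots on PS4 vs. 2 ACEs on an HD7970, and event names like `shadow_pass_done` are invented). Kernels sit in compute queues until the event they depend on fires; the ACEs then dispatch them in concurrent waves, without going through the Command Processor's render path.

```python
# Hypothetical simulation of "kernels wait on dependencies, ACEs dispatch
# them in concurrent waves" -- an illustration, not a hardware API.

class AsyncComputeSim:
    def __init__(self, num_aces, queue_slots):
        self.num_aces = num_aces          # concurrent compute tasks per wave
        self.queue_slots = queue_slots    # kernels that may wait at once
        self.waiting = []                 # (kernel_name, dependency) pairs

    def submit(self, name, depends_on):
        """Park a compute kernel until its dependency triggers."""
        if len(self.waiting) >= self.queue_slots:
            raise RuntimeError("all compute queue slots are full")
        self.waiting.append((name, depends_on))

    def signal(self, event):
        """A dependency fired: dispatch every now-runnable kernel in
        waves of at most num_aces concurrent jobs."""
        ready = [name for (name, dep) in self.waiting if dep == event]
        self.waiting = [(n, d) for (n, d) in self.waiting if d != event]
        return [ready[i:i + self.num_aces]
                for i in range(0, len(ready), self.num_aces)]

# Ten physics kernels wait until the shadow pass signals completion:
ps4 = AsyncComputeSim(num_aces=8, queue_slots=64)
for i in range(10):
    ps4.submit("physics_job_%d" % i, depends_on="shadow_pass_done")
waves = ps4.signal("shadow_pass_done")
print([len(w) for w in waves])  # [8, 2] -- 8 jobs at once, then the rest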
I think you should stop badmouthing these systems.
I think you should stop badmouthing these systems.
It'll accelerate stuff, but I think you should look at it more as an enabler than an accelerator. Programmers will break fresh ground with these architectures, and they will make these systems punch well above their weight. PCs without this architectural finesse will just use a sledgehammer to achieve the same goal. But that will cost you much more than $399.
I'm sorry, but there isn't a single Gamestop used game library on this continent that competes with digital deals on PC. This thread has also gone over the various paywalls associated with console gaming.