Visual Downgrade In Next-Gen Tech Demos Going From PC To Consoles?

Because the tech demos got downgraded, and they ran on PC first.
The only decent comparison we have between the tech demos and console builds is Watch_Dogs, which is closer than I thought, and the game isn't even out yet.

Although since it's from Ubi, all demos are suspect before release.
 
The only decent comparison we have between the tech demos and console builds is Watch_Dogs, which is closer than I thought, and the game isn't even out yet.

Although since it's from Ubi, all demos are suspect before release.

I would be wary of most of what you saw at E3 if I were in your shoes. Not just Ubi's conference.
 
I would be wary of most of what you saw at E3 if I were in your shoes. Not just Ubi's conference.

Oh, I know. Ubi's just the only one to show stuff that isn't even representative of the PC versions.

Between lots of Microsoft's stuff running on PCs and Sony's looking worse than at the reveal, I'm a bit down on next-gen visuals. Hoping Gamescom delivers good footage running on real hardware.
 
You know what? There's this thing called "the middle ground" between what you just described and Shinobi's "everything is gonna be great, sunshine and rainbows everywhere" approach. Both extremes are fanboy territory; the truth lies somewhere in the middle. We'll get good-looking games, but don't expect any sort of huge leap.

What am I a fanboy of exactly? Sorry that I'm excited for games.
 
I don't know why it turns into a dick-swinging contest... it's clear as day.

PCs outperform consoles. What is there to argue? So what of 30fps exclusives? If they were on PC, I'd guarantee you that at console settings, enthusiast builds would run those games at close to 120fps.

For console gamers to even come in here and try to defend their tech is silly to me. Hell, AMD's Hawaii comes out in October and it shits all over the tech they gave Sony and MS.

You know what's so funny? The mid-enthusiast cards will probably cost as much as or less than the new consoles. People who've had above-console performance with their 7- or 8-year-old cards will be getting the real deal next gen: late this year with AMD, next year with NV, and then later with the Oculus Rift.
 
If you truly want to see what the next-gen consoles are capable of, just look at InFamous:SS.

Using the logic of this thread, we can't, because almost all the footage and screenshots are from YouTube videos and prehistoric alpha-state builds, so we can't pass judgment till release day. Unless Sony releases some bullshots that clearly represent the game; those are the only valid ones, everything else is ignored.
 

Mainly Killzone. I've watched 1080p direct feeds of gameplay from both shows in slow-mo quite a few times for comparisons (did the same for Watch_Dogs). Almost all the post-processing was gone in the E3 demo, along with added pop-in and a lower frame rate. That said, I'm not doubting GG will deliver by launch. I'm sure it'll look and run at least as good as it did in February. I just got myself overexcited by reading all of their goals for the game and expected to see them in the E3 build. I think they just wanted to show off the larger environments (which skyrocketed my hype for the gameplay) and had to revert to an old build to get it playable by the press. I was expecting the game to stay fairly linear, so I thought we would see visual improvements sooner.
 
Mainly Killzone. I've watched 1080p direct feeds of gameplay from both shows in slow-mo quite a few times for comparisons (did the same for Watch_Dogs). Almost all the post-processing was gone in the E3 demo, along with added pop-in and a lower frame rate. That said, I'm not doubting GG will deliver by launch. I'm sure it'll look and run at least as good as it did in February. I just got myself overexcited by reading all of their goals for the game and expected to see them in the E3 build. I think they just wanted to show off the larger environments (which skyrocketed my hype for the gameplay) and had to revert to an old build to get it playable by the press. I was expecting the game to stay fairly linear, so I thought we would see visual improvements sooner.

I still have doubts they can deliver. E3 was alpha for Killzone, and that means almost all assets and code are done. GG is probably already in internal beta with it, which means testing, fixing, cutting, etc. Adding anything at this stage is a no-go.
 
-- Thread about how graphics on consoles got downgraded since they were originally shown running on PCs.

-- People stating the obvious about more powerful PCs.

-- Let's all jump on the "PC elitists" name-calling bandwagon and compare prices and ease-of-use instead of what the topic is supposed to be about.
 
You would have to be very dumb/rich/impatient to pay full price for 15 games over five years. Only a modicum of effort and patience is required to get console games at half price or less. I just got Tomb Raider for $20 only three months after launch.

Lol are you serious? Do you think it's that crazy for a person to buy THREE day one games a YEAR?
 
This PC superiority bullshit has to stop. Derails EVERY fucking thread.

It derailed a thread about... how PCs offer a superior visual experience when compared to their console counterparts.

I mean, what exactly were you expecting to find in this thread?
 
This PC superiority bullshit has to stop. Derails EVERY fucking thread.

No. You know what derailed this "fucking" thread? People who decided that pricing and ease-of-use (simplicity) was relevant to the topic of graphical fidelity. I don't care which "side" they are on. That's what derailed this thread--not the immense truck-load of name-calling. That's just expected behavior whenever there is something nice to say about PCs when pitted up against consoles.
 
No. You know what derailed this "fucking" thread? People who decided that pricing and ease-of-use (simplicity) was relevant to the topic of graphical fidelity. I don't care which "side" they are on. That's what derailed this thread--not the immense truck-load of name-calling. That's just expected behavior whenever there is something nice to say about PCs when pitted up against consoles.
Pretty much this. You'd be amazed that there are still people in this thread who believe we need GTX Titans just to run games smoothly.
 
I still have doubts they can deliver. E3 was alpha for Killzone, and that means almost all assets and code are done. GG is probably already in internal beta with it, which means testing, fixing, cutting, etc. Adding anything at this stage is a no-go.

They've shown a couple of models that are much higher quality than what we've seen and claimed they just weren't in the builds shown. I wish they had released real screens of them instead of only showing them in their lighting test space, though. GG also adds a lot of visual stuff late in development, particularly in lighting and post-processing, and they frontloaded a lot of the core design work so they could focus on buffing visuals later on. I don't think it will be hugely more impressive than what we've seen, but the new models, better motion blur, better DoF, the better AA method, a steady framerate, and no pop-in (all goals from tech presentations) should make it look pretty damn good by launch. Most of them will be subtle improvements, particularly the post-process stuff, but it should give the game a more cohesive look.
 
They've shown a couple of models that are much higher quality than what we've seen and claimed they just weren't in the builds shown. I wish they had released real screens of them instead of only showing them in their lighting test space, though. GG also adds a lot of visual stuff late in development, particularly in lighting and post-processing, and they frontloaded a lot of the core design work so they could focus on buffing visuals later on. I don't think it will be hugely more impressive than what we've seen, but the new models, better motion blur, better DoF, the better AA method, a steady framerate, and no pop-in (all goals from tech presentations) should make it look pretty damn good by launch. Most of them will be subtle improvements, particularly the post-process stuff, but it should give the game a more cohesive look.

Those models looked impressive. Hopefully we can see them in-game at Gamescom.
 
Those models looked impressive. Hopefully we can see them in-game at Gamescom.

Echo looks crazy good, especially the eyes.
Characters said:
[image: kzsf_ne_2013-07-09_sinclair-echo-actors-revealed_04.jpg]

Anyone have a link to the others in high quality? I can only find Sinclair, but I know there were some Helghast and vehicle models released as well.

The gun models they've shown on the blog also look better than the ones we've seen in game.

Weapons said:
[weapon images]
 
What am I a fanboy of exactly? Sorry that I'm excited for games.

You can be excited about something and critical of it at the same time; they're not mutually exclusive. I understand enthusiasm, and it's great that you look forward to all the new games, but when you're constantly saying "it looks fine, it looks great, it's fantastic, it's unbelievable" it doesn't leave much room for discussion. Visual fidelity isn't even the issue here; the real issue is the possibility of publishers and developers misleading their consumer base with footage that is running on high-end PCs but is supposed to be from the console versions. A passive stance ensures that this will keep happening.

This PC superiority bullshit has to stop. Derails EVERY fucking thread.

If you back up a few pages, you'll notice that the discussion was derailed because of this:

No. You know what derailed this "fucking" thread? People who decided that pricing and ease-of-use (simplicity) was relevant to the topic of graphical fidelity. I don't care which "side" they are on. That's what derailed this thread--not the immense truck-load of name-calling. That's just expected behavior whenever there is something nice to say about PCs when pitted up against consoles.

And that's all there is to it. Scripta manent, as they say. We had the usual goalpost moving and PC people had to step in and put the offenders in their place. Up to that point the discussion was moving along just fine.
 
You would have to be very dumb/rich/impatient to pay full price for 15 games over five years. Only a modicum of effort and patience is required to get console games at half price or less. I just got Tomb Raider for $20 only three months after launch.

You could have gotten it on PC for $12.50, or you could have gotten it day one for $31.50 with no effort at all. But $20 after three months is OK, I guess.
 
Depending on how Gamescom goes, I'm absolutely thinking of going all-in for a gaming PC.

Heck, I might even make the jump sooner. This month decides everything for me.

Looking forward to Battlefield 4 recommended system requirements and how it compares with the PS4 and X1.
 
Pretty impressive difference, and a great show for launch window tech. Can't wait to see mid and late cycle stuff.

I guess I'll be the bad guy again and point out that the comparison is both pointless and flawed. The PS4 is not competing with early gen games, it's competing with late gen PS3 and 360 titles and of course the great-looking PC versions. The pointless part: They need to sell the console to the current audience, not the one from 2007. Gamers of 2013 have seen Crysis 3, Uncharted 3, Halo 4, pick any graphically intensive game of the past couple of years. Next-gen games need to provide a significant leap from these titles if they are to drive console purchases.

The flawed part: the PS3 was notoriously difficult to program for. It had a weird architecture, lackluster tools, a multicore CPU built on strange and unfamiliar technology, split memory, take your pick. It makes sense that developers couldn't get decent results until way into the generation. This time the PS4 is supposedly designed with the goal of making game development as simple and effortless as possible. It's based on an x86 CPU and an AMD GPU, technologies that most decent game makers are intimately familiar with. It's disingenuous to suggest that the situation is even remotely similar. It gives the impression of preemptive damage control before footage from the real console versions (not from "PCs with equivalent spec") starts appearing.
 
Some of the comparisons are dumb. Unless you think that the new Killzone now looks like a PS3 game because some of the pre-alpha KZ2 shots make it look like a PS2 game :)
 
These are HSA APUs. It's true that both x86 and the GCN ISA are known to programmers (although GCN's full potential is locked away on PCs by the handbrake that is DirectX), but at the SoC level we're talking about the most important paradigm shift since the introduction of multicores. The fusion of CPU and GPU is completely uncharted territory for every programmer in the business.

I don't believe it will have the impact you think it will.
 
These are HSA APUs. It's true that both x86 and the GCN ISA are known to programmers (although GCN's full potential is locked away on PCs by the handbrake that is DirectX), but at the SoC level we're talking about the most important paradigm shift since the introduction of multicores. The fusion of CPU and GPU is completely uncharted territory for every programmer in the business.

I already mentioned it a couple of pages ago. I explained it to you several times already, if I recall correctly, and nevertheless you still keep on spreading your bullshit. What the hell is wrong with you?



You're joking, right?!



Difference: KZ2 launched more than two years after the PS3's release. KZSF, on the other hand, is a launch title made with approximation hardware and beta tools.

HSA and the APU design should make things easier more than anything. Console-only devs will need to do some catching up on GPU compute, for sure, but it's not like that's brand new the way multicore CPUs and the Cell were. And Sony is apparently shipping great tools, documentation, and libraries this time around, instead of the massive physical programming manual (only in Japanese) and bare-bones tools that the PS3 shipped with.
 
The engineers at AMD, ARM, Intel, Microsoft, Nvidia, Sony and the whole HSA Foundation disagree.

Someone reading this might think that you're implying that Intel and Nvidia are members of the HSA Foundation.

And also, APUs with HSA represent an important step forward, and gamers are going to see tangible benefits in terms of, e.g., texture-streaming quality. But at the same time, it doesn't change the modest single-core performance of a Jaguar CPU or improve the processing power of an AMD 7000-series GPU... and as you acknowledge, these technologies are very much known quantities to developers.
 
They're not, but they're already building heterogeneous-core processors (Intel) or are developing them right now (Nvidia's Project Denver).



With an HSA APU you can have the best of both CPU and GPU at the same time: CPUs are extremely smart, but very weak. GPUs are very dumb, but extremely strong. The whole point of HSA is to use the brain of the CPU and the muscle of the GPU for a single task, which eventually makes the HSA APU extremely smart and extremely strong at the same time.
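
To make the quoted idea concrete, here is a minimal sketch in plain C++17. Everything in it is illustrative: the parallel loop merely stands in for a GPU compute dispatch, and on a real HSA APU both stages would walk the same memory with no copies.

Code:
// Hypothetical illustration of the "CPU brain, GPU muscle" split.
// std::execution::par below is only a stand-in for a GPU dispatch.
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <execution>
#include <vector>

struct Particle { float x, y, z; bool active; };

void simulate(std::vector<Particle>& particles)
{
    // "Brain": branchy, data-dependent filtering suits the CPU.
    // Build a compact work list of the particles worth updating.
    std::vector<std::size_t> work;
    for (std::size_t i = 0; i < particles.size(); ++i)
        if (particles[i].active)
            work.push_back(i);

    // "Muscle": wide, uniform math suits the GPU. This parallel loop
    // stands in for a compute kernel running over the work list.
    std::for_each(std::execution::par, work.begin(), work.end(),
                  [&particles](std::size_t i) {
                      Particle& p = particles[i];
                      p.z = std::sqrt(p.x * p.x + p.y * p.y);
                  });
}

int main()
{
    std::vector<Particle> ps(1000, {3.f, 4.f, 0.f, true});
    simulate(ps);  // every active particle now has z == 5
}

The split is the whole point: irregular, branchy filtering where the CPU is strong, wide uniform math where the GPU is strong, over one shared data structure.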

Except rendering has been done on the strong, dumb GPU for many years. So even though we may get some nice algorithms that previously used only the CPU to tap that "muscle" a bit, that won't make graphics prettier. Actually, it might be the other way around: if the GPU starts doing tons of compute work, it won't have as much time for rendering.

So, the 7000 series as a "rendering machine" is a known quantity. And the HSA APU architecture won't help it render stuff faster.
 
Except rendering has been done on the strong, dumb GPU for many years. So even though we may get some nice algorithms that previously used only the CPU to tap that "muscle" a bit, that won't make graphics prettier. Actually, it might be the other way around: if the GPU starts doing tons of compute work, it won't have as much time for rendering.
But the whole idea behind GPU compute on PS4 is that you'll do it in the parts of the frame where the GPU is less utilized, for example while rendering opaque shadow maps.

We'll have to see what happens, but what Mark Cerny, at least, has been public about has been quite specific compared to what is usually communicated to the public.
 
But the whole idea behind GPU compute on PS4 is that you'll do it in the parts of the frame where the GPU is less utilized, for example while rendering opaque shadow maps.

We'll have to see what happens, but what Mark Cerny, at least, has been public about has been quite specific compared to what is usually communicated to the public.

Any kind of system like that will always hinder (albeit not very much) the GPU from doing pure rendering. The actual vertex shading and pixel shading will not suddenly get faster once developers have worked with the PS4 for a few years. We might get some games with impressively smart systems for making them smarter or even prettier, but there are certain things that simply require tons of GPU horsepower:
* High polycount (reducing the need for LODing, and thus pop-in)
* High texture resolution (also reducing pop-in)
* 4K resolution
* High-quality IQ

It's too bad we will get a generation of models having 20 LOD levels (because there's so much RAM) causing lots of pop-in, and smart methods for streaming in textures at runtime when needed instead of HQ textures ready from the get-go.
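
To put the LOD trade-off above in concrete terms, here is a toy sketch of distance-based LOD selection; the level count and every threshold are invented for illustration.

Code:
// Toy distance-based LOD pick; thresholds and level count are made up.
#include <array>
#include <cstddef>

// Returns which of five mesh detail levels to draw. More levels mean
// less wasted polygon budget, but each visible switch is a potential
// "pop" the player can notice.
std::size_t select_lod(float distance_to_camera)
{
    // Hypothetical switch distances, one per LOD boundary.
    constexpr std::array<float, 4> cutoffs = {10.f, 30.f, 80.f, 200.f};
    std::size_t lod = 0;
    for (float cutoff : cutoffs) {
        if (distance_to_camera < cutoff) break;
        ++lod;
    }
    return lod; // 0 = full detail ... 4 = coarsest
}

Each extra cutoff stretches a fixed polygon budget further, but every boundary is one more place where a model can visibly pop.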
 
If I'm not mistaken, Intel and Nvidia already tried this. Intel aborted Larrabee, and Nvidia included some features in their Fermi line but abandoned them in favor of CUDA. AMD is the only one betting on this and has already paid the price: they got steamrolled by Intel and Nvidia over the last few years.
 
Not necessarily. Sony uses a heavily modified AMD GPU that differs from the desktop HD7000 cards. Cerny's plan is to use the parts of the GPU that are underutilized during a single frame for GPU compute, without a penalty to render performance.

To achieve that goal, AMD equipped the PS4 GPU with 8 Asynchronous Compute Engines and 64 parallel Compute Queues (the HD7970 has 2 ACEs and 2 CQs). Programmers can fill the Compute Queues with compute kernels, which will wait for dependencies to trigger their execution. The PS4 can have 64 kernels waiting at the same time. If a kernel is triggered, the ACE will create the wavefronts for the CUs and the compute job will get done. Since the PS4's GPU has eight ACEs, it can work on eight compute tasks at the same time; an HD7970 can only work on two. GPUs that don't have these ACEs at all have to use the Compute Pipeline in the Command Processor to do GPGPU. That takes a huge bite out of rendering, since the Command Processor can only do either rendering or compute; a GPU with ACEs can do both at the same time.

So the plan is not to dedicate a fixed amount of GPU resource to GPGPU (like the 14+4 CU bullshit for example), but to fill under-utilized parts of the GPU with compute kernels.
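
As a rough mock-up of what that means for a frame: every type and function below is invented (the real PS4 API is not public), so this is only the shape of the idea: independent queues feeding the same CUs, with fences expressing dependencies instead of serializing everything through the Command Processor.

Code:
// Purely illustrative pseudo-API, not GNM or any real SDK.
#include <cstdio>

struct Fence { int id; };

struct Queue {
    const char* name;
    void submit(const char* job) { std::printf("[%s] %s\n", name, job); }
    void submit_after(Fence f, const char* job) {
        std::printf("[%s] (after fence %d) %s\n", name, f.id, job);
    }
    Fence signal(int id) {
        std::printf("[%s] signal fence %d\n", name, id);
        return {id};
    }
};

int main()
{
    Queue gfx{"gfx"};
    Queue ace[8] = {{"ace0"}, {"ace1"}, {"ace2"}, {"ace3"},
                    {"ace4"}, {"ace5"}, {"ace6"}, {"ace7"}};

    // Independent kernels go in early; the hardware drains them into
    // whichever CUs the shadow pass leaves idle (it's raster-bound).
    ace[0].submit("cloth simulation");
    ace[1].submit("particle update");

    gfx.submit("opaque shadow maps");
    gfx.submit("g-buffer / main opaque pass");
    Fence gbuffer_done = gfx.signal(1);

    // A dependent kernel waits on the fence, not on the whole queue.
    ace[2].submit_after(gbuffer_done, "screen-space AO");

    gfx.submit("lighting, transparents, post");
}

The win in this model is that the cloth and particle kernels soak up CUs the shadow pass leaves idle, instead of queuing behind the graphics work.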



It'll have the greatest effect on compute, that's right, but it'll have a less significant effect on rendering as well.



I think you should stop badmouthing these systems.

Bad-mouthing?
ok..........


Let's say hypothetically that the heterogeneous computing offered by the next-gen systems does get a good start and shows some progress in real examples. Devs start sharing resources between CPU and GPU... etc.

LET US NOT FORGET that those resources are being shared between a very low-powered CPU and a midrange GPU.

Can we really expect miracles from a new coding paradigm on such lower end hardware?

If the CPU were stronger and the GPU were stronger, I could see this being taken advantage of in a game-changing way... because new data structures could be traversed and exploited while having the performance to do it.
But since they are not such hugely powerful components... I can hardly imagine something exotic being built into and on top of modern-day engines with such a low performance profile.

I think it will add a bit of pizzazz here or there to an effect (maybe)... but a paradigm shift in programming architecture usually requires a crap-ton of power. A class-A example of this is Intel's Larrabee. It had serious programmability advantages but lacked the raw horsepower to realize them in real-time games.
 
It'll accelerate stuff, but I think you should look at it more like an enabler than an accelerator. Programmers will break fresh ground with these architectures and make these systems punch well above their weight. PCs without this architectural finesse will just use the sledgehammer to achieve the same goal. But that will cost you much more than $399.

I think the precedent for this idea (lower-end hardware with newer programming producing stark results) is extremely uncommon, or at least hard to find historically.

Larrabee, NURBS, previous usage of voxels in real-time environments, and probably tons of things I can't think of off the top of my head show you need serious grunt to implement new programming paradigms.

Heck, UE4 dropping voxel cone tracing is a demonstration of this. That was a new programming paradigm in a lot of ways... and they just did not have the performance for it in real time on consoles (and in some ways they did not have it completely even on GTX 680-class hardware, though there are some caveats there, considering they dropped the tech before they finished optimizing it).
 
That HSA talk is very nice and all but it doesn't really mean anything.

In the end what will run multiplats better, the PS4 or a high-end PC?

This is what people here care about, not how great HSA is or how efficient the PS4 design is.
 
I'm sorry, but there isn't a single Gamestop used game library on this continent that competes with digital deals on PC. This thread has also gone over the various paywalls associated with console gaming.

Seems a bit unfair to compare day-one $60 console games with day-X discounted PC games. If you're shopping smart for PC, shouldn't you do that for console too? It'd drop the cost way down. I expect there would still be a small advantage for PC software, but combined with the higher hardware cost, the overall cost of ownership would be closer.

(This gen at least; next gen, PS+ and XBL fuck the equation up, but a lot of people shop smart with those too.)
 