
The Witcher 3 probably not 1080P on consoles

Built my PC earlier this year with a 770 4GB, fully expecting I'll need to SLI to run this title at 1080p Ultra settings. I'll get my second card come Black Friday or Boxing Day. I'll be more than ready when this sucker drops next year.
 
I don't really mind since I'm sure whatever happens it will look amazing, that gameplay video has left me in no doubt.
 
Certainly as technology has improved, animation budgets have improved also.
Sure, but what you're missing is that for canned animations we passed the point where technological limitations were a particularly relevant/limiting factor a while ago.
Right now you won't see animations that are blatantly "last" or "current" gen; you'll see animators who are more or less talented at their work.

Just to take an example every Sony fan seems to hold close to his heart... Do you think every developer around will be able to top what Naughty Dog did on PS3 with TLoU on "current gen"?
No, they won't, because most studios will have neither the talent nor the budget to work at that level.
 
Ah, I see. So more ACEs (if you can call it that) means better and more rendering of such effects?
Yep... you can run more things while rendering the graphics, but it's hard and still new to use.

You can, for example, do all the wind physics simulation via the ACEs, with the GPU doing both simultaneously and freeing up the CPU.

More ACEs = more compute tasks you can run on the GPU simultaneously.
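As a rough illustration of what that looks like from the programmer's side, here's a minimal sketch using CUDA streams as a stand-in for an ACE compute queue (an analogy, not the actual console API; the kernel and variable names are made up): the wind job is pushed onto its own queue, the launch call returns immediately, and the CPU is free to keep building the frame while the GPU runs the simulation alongside its other work.

```cuda
#include <cuda_runtime.h>

// Toy "wind physics" kernel: nudges each particle's velocity by a gust value.
__global__ void windSim(float *velocity, float gust, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        velocity[i] += gust;          // stand-in for the real simulation step
}

int main()
{
    const int n = 1 << 20;
    float *velocity;
    cudaMalloc(&velocity, n * sizeof(float));
    cudaMemset(velocity, 0, n * sizeof(float));

    // A separate stream plays the role of an ACE compute queue:
    // work pushed here can overlap with work on other queues.
    cudaStream_t computeQueue;
    cudaStreamCreate(&computeQueue);

    // Fire and forget: the launch returns immediately, so the CPU is free
    // to go do other frame work while the GPU chews on the physics.
    windSim<<<(n + 255) / 256, 256, 0, computeQueue>>>(velocity, 0.1f, n);

    // ... CPU does other frame work here ...

    cudaStreamSynchronize(computeQueue);  // only wait when the results are needed
    cudaStreamDestroy(computeQueue);
    cudaFree(velocity);
    return 0;
}
```

The important line is the kernel launch: it's fire-and-forget, which is what "freeing up the CPU" means here.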
 
Well, Cerny did a comparison between the SPUs and the GPGPU in the PS4, if I'm not wrong.

That still does not make an ACE a similar thing to an SPU. ;-)

On the GPU, (programmed) computations are performed on the CUs, and the ACEs are there to schedule and dispatch workloads to the CUs. On Cell, the SPUs might be very roughly comparable to the CUs, since they also perform programmed computations, but they do it much more autonomously. There was no fire-and-forget hardware logic (like the ACEs) to schedule workloads to them; it had to be programmed manually, following one of many possible patterns.
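To make the "fire-and-forget" part concrete, here's a minimal sketch (CUDA rather than the console toolchain, with an invented kernel name): the host only enqueues the dispatch, and the GPU's own scheduling hardware decides which compute units run which blocks, with no per-core management in the program. On Cell, the rough equivalent would be picking an SPU, DMA-ing the job into its local store, kicking it off, and polling for completion yourself.

```cuda
#include <cuda_runtime.h>

__global__ void doWork(float *data)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    data[i] *= 2.0f;
}

int main()
{
    const int blocks = 4096, threads = 256;
    float *data;
    cudaMalloc(&data, blocks * threads * sizeof(float));
    cudaMemset(data, 0, blocks * threads * sizeof(float));

    // The host's only job is to enqueue the dispatch. The GPU's scheduling
    // hardware then hands the 4096 blocks out to whatever CUs/SMs are free,
    // in whatever order it likes -- nothing in this program picks a core.
    // On Cell, by contrast, the programmer had to choose an SPU, move the
    // job's data into its local store, start it, and track completion manually.
    doWork<<<blocks, threads>>>(data);

    cudaDeviceSynchronize();
    cudaFree(data);
    return 0;
}
```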
 
...Really? That doesn't apply at all here, since the game isn't using Euphoria, nor any other system based on procedurally generated animations, as far as I know.
All the animations here are produced individually and then probably interpolated.

Of course it applies. You just proved it. Someone claimed the anims in The Witcher 3 don't look technically impressive; you said animations are artistic only and not reliant on tech; I pointed out how tech and hardware can significantly improve animations by making them completely dynamic instead of canned and fixed.

Yes, The Witcher 3 does not use dynamic animations. That's probably why some people feel they look "last gen". When you improve visual fidelity, canned animations stand out more easily. AI is facing similar complaints at the start of this new generation, and I'm excited to see how developers improve both AI and animation complexity over the next half decade.
 
Well, Cerny did a comparison between the SPUs and the GPGPU in the PS4, if I'm not wrong.
The example is in the context of how the two handle parallel work. The SPU approach is all about batching your work on each SPU, so that when a job is done you don't get batch stalls in the gaps; keep the gaps small and staggered. The logic flow on Cell was poor, so to make it work well you had to code with prediction built into your methods.

With GPGPU, the large queue of ACEs can reorganise the GPU workload within each cycle, meaning if something changes mid-job it can queue up another task to fill the gap.

Layman's example, but that's generally how it works.
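To sketch that gap-filling idea in code (again with CUDA streams as a rough stand-in for multiple ACE queues, and made-up job names): two independent jobs sit in two queues, and the GPU can pull blocks from either one, so a stall or an early finish in one queue doesn't leave it idle.

```cuda
#include <cuda_runtime.h>

// Two independent jobs that could come from different systems (hypothetical names).
__global__ void particleJob(float *p, int n) { int i = blockIdx.x * blockDim.x + threadIdx.x; if (i < n) p[i] += 1.0f; }
__global__ void clothJob(float *c, int n)    { int i = blockIdx.x * blockDim.x + threadIdx.x; if (i < n) c[i] *= 0.5f; }

int main()
{
    const int n = 1 << 20;
    float *particles, *cloth;
    cudaMalloc(&particles, n * sizeof(float));
    cudaMalloc(&cloth,     n * sizeof(float));
    cudaMemset(particles, 0, n * sizeof(float));
    cudaMemset(cloth,     0, n * sizeof(float));

    // Two queues: the work in them is independent, so if one queue's job
    // leaves the GPU partly idle, blocks from the other queue can fill the gap.
    cudaStream_t q0, q1;
    cudaStreamCreate(&q0);
    cudaStreamCreate(&q1);

    particleJob<<<(n + 255) / 256, 256, 0, q0>>>(particles, n);
    clothJob   <<<(n + 255) / 256, 256, 0, q1>>>(cloth, n);

    cudaDeviceSynchronize();   // wait for both queues before using the results
    cudaStreamDestroy(q0);
    cudaStreamDestroy(q1);
    cudaFree(particles);
    cudaFree(cloth);
    return 0;
}
```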
 
If they get it to run at 900p on Xbox One, 1080p should be no problem on PS4.
I really hope their deal with Microsoft doesn't include a parity clause...
That's why I'm iffy on the PC version. They might keep one or two effects, but if all assets are downgraded for parity, that would be incredibly disappointing.
 
It is fake, guys.

I will be surprised (and disappointed) if they don't run the game at 1080p on PS4.

Sure, it can run at 1080p, but in what state? Comparable to high settings on PC at 30fps? No draw-in, no screen tearing? I wouldn't be surprised one way or another, tbh.

As for "fake", what do you mean? *edit* I didn't see your earlier answer.
 
If devs as talented as CDPR are already essentially maxing out both consoles, then how much remaining headroom does either console have, even with software/driver optimizations?
 
Yep... you can run more things while rendering the graphics, but it's hard and still new to use.

You can, for example, do all the wind physics simulation via the ACEs, with the GPU doing both simultaneously and freeing up the CPU.

More ACEs = more compute tasks you can run on the GPU simultaneously.

Ah, I see. That is a neat thing.

Thanks for the explanation.
 
LOL, CDPR are terrible developers. Even on solid gaming PC rigs their games tend to run horribly.

damage control failure

Witcher 2 runs fine on my HD 6870 (an ancient budget GPU from 2010 that's only a bit more powerful than an Xbox One) as long as I don't use ubersampling (which makes sense, as ubersampling is supersampling, which is several times more demanding in any game, and not a single console game has ever used it or will use it this generation).

I don't think anyone expects it to run at high framerates at the settings in those trailers/gifs on an HD 7850 on PC either; not sure why you'd expect anything different on consoles.
 