
The Witcher 3 probably not 1080P on consoles

And just like any game ever released, there will be low/medium/high and ultra settings to choose from.

60fps will absolutely be achievable on PC, and not on any other platform.
But as with everything, it will come at a price.

I should have probably mentioned that they said with maxed-out settings AND 60fps, on hardware that is obtainable upon release. Maybe it's possible... but I don't see how you can argue it will definitely happen. You can definitely say that at some point in the future it will happen, because hardware will continue to improve... but at release? Ehhhh, this game looks insanely demanding.
 
I'm still curious about whether or not growth in monitor resolution is actually tenable. Monitors maxed out long ago on the number of colors they needed to support, so is there a supposed maximum for resolution as well?
The maximum resolution for VR is around 16k, to match the eye's resolution. Monitors would likely already be fine at 8k. (Note though that this doesn't mean that rendering without AA at these resolutions will be sufficient; there might still be detectable flicker/edge crawling. The eye is amazingly good at spotting those.)
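As a rough sanity check on those numbers, here's a back-of-the-envelope sketch. The acuity and field-of-view constants are ballpark assumptions (roughly 1 arcminute acuity, i.e. ~60 pixels per degree, and a ~200° x 135° field of view), not figures from the post:

```python
# Back-of-the-envelope estimate of the resolution needed to match human
# visual acuity. All constants below are rough assumptions.

ACUITY_PPD = 60    # ~1 arcminute acuity => ~60 pixels per degree
HFOV_DEG = 200     # approximate binocular horizontal field of view
VFOV_DEG = 135     # approximate vertical field of view

h_pixels = ACUITY_PPD * HFOV_DEG   # 12000
v_pixels = ACUITY_PPD * VFOV_DEG   # 8100
print(f"Eye-matching display: ~{h_pixels} x {v_pixels}")
# ~12000 x 8100 -- the same ballpark as the "around 16k" figure above, and
# why 8k is plausibly enough for a monitor covering only part of the FOV.
```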
 
I should have probably mentioned that they said with maxed-out settings AND 60fps, on hardware that is obtainable upon release. Maybe it's possible... but I don't see how you can argue it will definitely happen.

A GTX 880 should run this game at max settings, 1080p and 60fps when it releases later in the year. A handful of cards already on the market can probably do so as well.

Are you a PC gamer?
 
It will too, on multi GPU setups at 1080p and 1440p.
And high settings should be a lot less taxing.

I look forward to seeing it then; I'm skeptical, however. That fur tech and giant draw distances, foliage everywhere... it seems crazy to me. Prove me wrong though, I'd be really impressed to see some rock-solid 60fps gameplay of TW3 maxed out.
 
I understand what you're saying... all I'm asking is what the general consensus among PC GAF considers the bare minimum resolution when trying to max out a game in 2015. I get that it's technically possible at any resolution, really... but there's gotta be some cut-off point where it's like... "dude, you're running the game at (insert resolution here), why are you even posting in the PC thread claiming you've maxed it out?" The argument started when someone claimed that 60fps would definitely be doable on PC for this game... and I'm asking how far down in resolution we're talking, and to that point, is it even relevant, because nobody will play at that resolution regardless. Or maybe they will; how the heck would I know.

As I said, it doesn't seem like a particularly helpful thing to try and do. There is a reason why it's the convention to note the resolution alongside quality settings.

If you want it as some sort of benchmark, then it's your benchmark. Maybe choose 1080p? It's Full HD, it's the most popular resolution among Steam users. If someone says "I can run The Witcher 3 at 720p@60fps" then that person doesn't meet your benchmark and you can ignore their results.
 
The maximum resolution for VR is around 16k, to match the eye's resolution. Monitors would likely already be fine at 8k. (Note though that this doesn't mean that rendering without AA at these resolutions will be sufficient; there might still be detectable flicker/edge crawling. The eye is amazingly good at spotting those.)
Thanks for this. Nice to be able to think about it in those terms.

Do you see visual parity with reality as being the end-goal (because this seems really untenable), or just being good enough to consistently trick the eye?
 
A GTX 880 should run this game at max settings, 1080p and 60fps when it releases later in the year. A handful of cards already on the market can probably do so as well.

Are you a PC gamer?

What benchmarks are you basing your statement on? Has CD Projekt released any information on hardware requirements to reach 1080/60 with this game or something, lol?

Yes I'm a PC gamer....
 
Traditionally, the most common monitor resolution. Which is probably 1080p at this point.

But as I explained in my previous post, the obsession with "maxing out" is actively harmful.

I agree. It wasn't until recent years that people started to have the expectation that they could turn up all the dials on day one without doing an upgrade.

I want The Witcher 3 to become the new, "Can it run Crysis?"
 
I look forward to seeing it then; I'm skeptical, however. That fur tech and giant draw distances, foliage everywhere... it seems crazy to me. Prove me wrong though, I'd be really impressed to see some rock-solid 60fps gameplay of TW3 maxed out.
You will see, come release. But not on a mid-range PC for sure; 60fps is legitimately taxing.

If I was to be proven wrong (which is possible, I suppose), I'm not sure how that's a bad thing. As everyone is aware, PC gaming is inherently scalable.
 
I understand what you're saying... all I'm asking is what the general consensus among PC GAF considers the bare minimum resolution when trying to max out a game in 2015. I get that it's technically possible at any resolution, really... but there's gotta be some cut-off point where it's like... "dude, you're running the game at (insert resolution here), why are you even posting in the PC thread claiming you've maxed it out?" The argument started when someone claimed that 60fps would definitely be doable on PC for this game... and I'm asking how far down in resolution we're talking, and to that point, is it even relevant, because nobody will play at that resolution regardless. Or maybe they will; how the heck would I know.


http://store.steampowered.com/hwsurvey#main_stats_header

1080p is the most commonly used desktop gaming resolution, as well as the optimal resolution for most HDTVs out there.

Most games can be maxed at that resolution with mid-level graphics cards; it's the optimal value band for PC gaming. 2560x1440 is becoming more popular, but it's much more expensive to maintain 60fps at that resolution. 4k is a bit of a joke at this point if you want to be able to sustain a solid 60.
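To put some numbers on that cost difference, here's a rough sketch; rendering cost only scales approximately with pixel count, so treat the ratios as ballpark:

```python
# Relative pixel counts for common resolutions. GPU cost scales roughly,
# though not exactly, with the number of pixels shaded per frame.

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4k": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:>9,} pixels ({w * h / base:.2f}x 1080p)")
# 1080p: 1.00x, 1440p: 1.78x, 4k: 4.00x the pixels of 1080p
```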
 
I'm still curious about whether or not growth in monitor resolution is actually tenable. Monitors maxed out long ago on the number of colors they needed to support, so is there a supposed maximum for resolution as well?

That won't be an issue for a long while. We're just beginning to move on from 1080p; that resolution is old news and looks pretty terrible on larger monitors, and there's still a whole lot more growing to do yet. I'm sure a great thing about high-end PCs being so much more powerful than consoles this time around will be the extra headroom that lets people move on to resolutions like 1440p and eventually 4K while still being able to run AAA games nicely.
 
Thanks for this. Nice to be able to think about it in those terms.

Do you see visual parity with reality as being the end-goal (because this seems really untenable), or just being good enough to consistently trick the eye?
I'm really not too much into photorealistic rendering for games -- I often prefer stylized graphics. I see the end goal as being able to display exactly what the artist intended, without even the slightest hint of technical problems (be they aliasing, undersampling, flickering, unintended blur, pop-in, temporal instability or whatever else). Of course, this also means that if the artist intended a realistic scene, then it should be indistinguishable from reality.
 
You will see, come release. But not on a mid-range PC for sure; 60fps is legitimately taxing.

If I was to be proven wrong (which is possible, I suppose), I'm not sure how that's a bad thing. As everyone is aware, PC gaming is inherently scalable.

I didn't say it was a bad thing, just that some people in here are saying it's obviously going to happen at release and I'm just not convinced. This game looks incredible, and has brand new technology on PC that we haven't even seen before in the fur and hair tech they're using... how can anyone know (other than the developers themselves) what future benchmarks will look like? We can guess, I suppose, and I'm assuming that's what we're doing here, but I'm not really seeing how that is definitive in any way.
 
Of course, this also means that if the artist intended a realistic scene, then it should be indistinguishable from reality.
I don't disagree with this in theory, but in practice I think it leads to somewhat unhealthy expectations about technology. Though I guess if quantum computing becomes tenable I may have to eat my words regardless.
 
I didn't say it was a bad thing, just that some people in here are saying it's obviously going to happen at release and I'm just not convinced. This game looks incredible, and has brand new technology on PC that we haven't even seen before in the fur and hair tech they're using... how can anyone know (other than the developers themselves) what future benchmarks will look like? We can guess, I suppose, and I'm assuming that's what we're doing here, but I'm not really seeing how that is definitive in any way.

I consider maxing out to be 1080p, all effects, no AA. I can force all sorts of crazy shit in Nvidia Inspector that will make my GTX 690 cry even in an old game. Some games have totally unoptimised effects that can wreak havoc on even top-tier cards at higher resolutions. DayZ SA brings an SLI 780 Ti setup to its knees because of how unoptimised its rendering engine is. Lots of factors go into it on the software front as well.
 
I didn't say it was a bad thing, just that some people in here are saying it's obviously going to happen at release and I'm just not convinced. This game looks incredible, and has brand new technology on PC that we haven't even seen before in the fur and hair tech they're using.
That is factually incorrect.
Tomb Raider shipped with TressFX back in 2013 and COD Ghosts shipped with HairWorks the same year.

how can anyone know (other than the developers themselves) what future benchmarks will look like? We can guess, I suppose, and I'm assuming that's what we're doing here, but I'm not really seeing how that is definitive in any way.
I don't see how you can be so adamant that no PC in existence will max out TW3 at 60fps/1080p.
Multi-GPU setups exist for a reason. If money isn't a problem you can pack four 780 Tis for approximately 20 TFLOPS.

You think this won't cut it for 60fps?
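For anyone wondering where the ~20 TFLOPS figure comes from, here's the arithmetic; the clock value is an approximation of the card's boost clock, so treat the result as a ballpark:

```python
# Single-precision throughput = 2 FLOPs per core per clock (FMA)
# x core count x clock. Clock is an approximation of the 780 Ti boost clock.

cores = 2880        # CUDA cores on a GTX 780 Ti
clock_ghz = 0.875   # approximate boost clock
num_gpus = 4

tflops_per_gpu = 2 * cores * clock_ghz / 1000   # ~5.0 TFLOPS
print(f"~{tflops_per_gpu:.1f} TFLOPS each, ~{num_gpus * tflops_per_gpu:.0f} TFLOPS total")
# SLI never scales perfectly, so real-world throughput will be lower.
```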
 
I didn't say it was a bad thing, just that some people in here are saying it's obviously going to happen at release and I'm just not convinced. This game looks incredible, and has brand new technology on PC that we haven't even seen before in the fur and hair tech they're using... how can anyone know (other than the developers themselves) what future benchmarks will look like? We can guess, I suppose, and I'm assuming that's what we're doing here, but I'm not really seeing how that is definitive in any way.

We can't, but that's a moot discussion. We have to assume they target a specific range of hardware available to us, and don't implement features that won't even work with today's hardware but might work in five years. They have to at least be somewhere in the range of usability. Considering that some enthusiasts go all out on building very powerful gaming PCs, it's not totally unrealistic to assume that they might be able to run it at 60fps.
 
We can't, but that's a moot discussion. We have to assume they target a specific range of hardware available to us, and don't implement features that won't even work with today's hardware but might work in five years. They have to at least be somewhere in the range of usability. Considering that some enthusiasts go all out on building very powerful gaming PCs, it's not totally unrealistic to assume that they might be able to run it at 60fps.

Parallel computing scaling has improved dramatically since the single-card era. We can literally throw more money at the problem now and get good scaling.

[image: scaling.png]

[image: 1920-VH.png]
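As a toy model of what "good scaling" means in the charts above (the 80% per-GPU efficiency figure is purely illustrative, not a measured number):

```python
# Toy model of multi-GPU scaling: each added GPU contributes only a
# fraction of its theoretical throughput. Efficiency figure is illustrative.

def effective_speedup(num_gpus, efficiency=0.8):
    return 1 + (num_gpus - 1) * efficiency

for gpus in (1, 2, 3, 4):
    print(f"{gpus} GPU(s): {effective_speedup(gpus):.2f}x")
# 1.00x, 1.80x, 2.60x, 3.40x -- good, but clearly sub-linear.
```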
 
You are not alone! I agree with you as well ;)

Just had to let you know. There is a reason we use the word "max" in "max settings". Max is max. If 4K is supported, then it's max. At least according to both of us :)
There is no use 'being different' here.

Max settings isn't a useful metric unless the specific resolution and framerate achieved are given alongside it. Like scoobs said, he might be able to max out a game at 800x600, but if he doesn't specify the resolution being used and what performance he's getting as a result, then it won't mean much of anything to somebody else.

It is an absolutely useless method of description without some context, which naming the resolution gives.

Y'all's opinion on the matter is completely nonsensical. Feel free to continue using it, but expect nothing but confusion and argument if you ever try to apply this metric in the future when talking about benchmarks for games. I would recommend going with the generally understood, common-sense method that everybody else in the world uses, for the sake of clear communication and avoiding unnecessary arguments.
 
That completely depends on whether they include some supersampling AA setting again.
I hope they do, but not as part of the overall "ultra" preset. Including ubersampling as part of the ultra spec in Witcher 2 was a big mistake, because of all the uninformed people.
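For reference, the reason ubersampling is so brutal: it's supersampling, so the internal render resolution multiplies the shaded pixel count quadratically. Rough arithmetic below; the exact cost varies by engine:

```python
# Supersampling cost: rendering at N x N the target resolution multiplies
# the shading/fill work by roughly N^2 before downsampling to the output.

target_w, target_h = 1920, 1080
for n in (1, 2, 3):
    w, h = target_w * n, target_h * n
    print(f"{n}x{n} SSAA: internal {w}x{h} = {(w * h) // (target_w * target_h)}x the pixels")
# 2x2 SSAA already costs ~4x native 1080p, which is why it crushed cards
# that otherwise handled Witcher 2's ultra settings fine.
```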
 
You're missing my point... what is the minimum resolution you guys would consider valid for maxing out the game? If I show up in the TW3 PC thread saying I've maxed the game out at 800x600, who the hell would call that valid or at all relevant? You'd probably laugh me out of the thread... I'm positive of this.

I'm just attempting to discern what you guys would consider an acceptable resolution for a maxed-out Witcher 3 @ 60fps (which some have claimed to be obtainable on current PC hardware)... simple curiosity is all this is on my part. 720? 1080? 1440? Or does it just not matter at all? Would you be impressed if I maxed it at 720p (likely a lower resolution than the consoles)?

Whatever the native resolution of your screen is. There are never fixed baselines for this kind of thing in PC gaming.
 
..... console versions of games are always better optimized than PC versions - simply because they know exactly which GPU, CPU, etc. they are optimizing for, rather than trying to make the game run well with pretty much any possible setup.

If that were the case thus far, I would have thought the optimized console GPU, with nearly double the shaders, would crush a 750 Ti (weaker than the 7850), rather than trading blows with it or even losing to it.

As for current CPU-demanding games (TW3 looks to be CPU-demanding too), I don't know of one that a 780 Ti couldn't play at 4x the console resolution. Something like BF4 multiplayer even runs fine at 1800p medium settings on my mate's 760, so a 780 Ti should have no problems at high settings with 4x the pixels of the console version. I think TW3 could be similar, but it will surely have extra bells and whistles that are demanding, or even some settings that consoles either lack or have only in watered-down form, like tessellation in Sniper Elite 3.
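The "4x the pixels" claim is just resolution arithmetic, for what it's worth (assuming a 1080p console target):

```python
# Doubling both dimensions of 1080p quadruples the pixel count.
console = 1920 * 1080                  # assumed console target resolution
uhd = 3840 * 2160                      # 4x the pixels: 3840x2160
p1800 = 3200 * 1800                    # the 1800p mentioned above
print(f"3840x2160 = {uhd / console:.1f}x 1080p")    # 4.0x
print(f"3200x1800 = {p1800 / console:.1f}x 1080p")  # ~2.8x
```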
 
I didn't say it was a bad thing, just that some people in here are saying it's obviously going to happen at release and I'm just not convinced. This game looks incredible, and has brand new technology on PC that we haven't even seen before in the fur and hair tech they're using... how can anyone know (other than the developers themselves) what future benchmarks will look like? We can guess, I suppose, and I'm assuming that's what we're doing here, but I'm not really seeing how that is definitive in any way.

It'll be demanding, but not that much so. A 780 Ti or an 880, with a decent CPU, will probably max it at 1080p/60fps.
 
Good thing I just built a new PC. I already have this preordered on Steam. I want to really give my PC a run for its money. I'll probably skip the ubersampling though, considering that my 4790K and 780 had framerate issues with it in Witcher 2. Not sure what you need to run ubersampling, probably something like Smokey's rig. Whatever you need, it's a bit too extreme for me.
 
To be fair, it's been looking pretty uncomfortable in the gameplay videos so far, so I think I'd prefer a drop in resolution; I'm not sure I'd want to play what I've seen so far.

EDIT
Out of interest, have they shown any PS4 gameplay yet, or is it all Xbox One stuff?
 
I'd be more shocked, honestly, if either version, PC or console, is smooth at launch without bugs or optimisation issues. It's not like the past two Witchers were rough at launch... oh wait.

Still can't wait
 
Wow! Oblivion was almost as demanding as Crysis back in the day?! That must've been some un-optimized shit :P
Bethesda is very bad at optimizing, yes. Skyrim could have run 100% more efficiently if they had cleaned up their code, according to Boris Vorontsov.
 
It's still not convincing evidence for calling a game ugly. 1080p or not, this and Bloodborne are definite must-buys. Compare this game's visuals to last-gen open-world games; it's leagues upon leagues ahead of last gen.

I don't even know how you guys played Skyrim on consoles. My first thought upon seeing screenshots was that it looked like an N64 game. Hyperbolic, but it does look horrible.

Bethesda is very bad at optimizing, yes. Skyrim could have run 100% more efficiently if they had cleaned up their code, according to Boris Vorontsov.

Gamebryo. Skyrim had many of the common issues with that engine.

Good thing I just built a new PC. I already have this preordered on Steam. I want to really give my PC a run for its money. I'll probably skip the ubersampling though, considering that my 4790K and 780 had framerate issues with it in Witcher 2. Not sure what you need to run ubersampling, probably something like Smokey's rig. Whatever you need, it's a bit too extreme for me.

You don't want to dive down the rabbit hole where you attempt to build a rig capable of maxing out The Witcher 2/3.

Insanity.
 
Doesn't Infamous Second Son have no loading? And that's open world. No Man's Sky will be an open universe and it's 1080p/60fps. So yes, it is possible. It's more about the problems of streaming assets and eliminating bugs while doing it. I think devs face more of a challenge trying to stream an open world without breaking the game than in taxing the hardware for performance.


Also, the Uncharted games never had loading screens; the loading was hidden during cutscenes. But now UC4 will use real-time cutscenes, so... I dunno. One way or another, they will have to stream assets anyway.

Are you seriously comparing Infamous with The Witcher 3??

And some of you guys are nuts if you'd rather have loading screens than a reduction in resolution. If you want 1080p so much, you're best off building a gaming PC, honestly.
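On the streaming point in the quoted post: the core idea is just keeping a window of world chunks resident around the player and evicting the rest. A minimal sketch follows; the chunk size, radius, and all names are hypothetical, and a real engine does the loading asynchronously:

```python
# Minimal sketch of open-world asset streaming: keep a window of world
# "chunks" loaded around the player and evict the rest. Grid layout,
# chunk size, and function names are all hypothetical.

CHUNK_SIZE = 256        # world units per chunk
STREAM_RADIUS = 2       # chunks kept loaded in each direction

loaded = {}             # (cx, cy) -> chunk data

def load_chunk(coord):
    return f"assets for chunk {coord}"   # stand-in for disk/decompression work

def update_streaming(player_x, player_y):
    cx, cy = int(player_x // CHUNK_SIZE), int(player_y // CHUNK_SIZE)
    wanted = {(cx + dx, cy + dy)
              for dx in range(-STREAM_RADIUS, STREAM_RADIUS + 1)
              for dy in range(-STREAM_RADIUS, STREAM_RADIUS + 1)}
    for coord in wanted - loaded.keys():
        loaded[coord] = load_chunk(coord)   # stream in
    for coord in loaded.keys() - wanted:
        del loaded[coord]                   # evict
    # The hard part in practice is doing the loads on background threads
    # fast enough that the player never outruns the streamer.

update_streaming(1000, 500)
print(len(loaded), "chunks resident")  # 25
```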
 
That is factually incorrect.
Tomb Raider shipped with TressFX back in 2013 and COD Ghosts shipped with HairWorks the same year.


I don't see how you can be so adamant that no PC in existence will max out TW3 at 60fps/1080p.
Multi-GPU setups exist for a reason. If money isn't a problem you can pack four 780 Tis for approximately 20 TFLOPS.

You think this won't cut it for 60fps?
I have no idea if that would cut it, and neither do you.
 
I have no idea if that would cut it, and neither do you.
Yet you are trying to cast doubt on the fact that it could.
For reasons unknown.

Aside from that, I believe 30fps with console-equivalent settings should be very, very affordable.
 
Yet you are trying to cast doubt on the fact that it could.
For reasons unknown.

Aside from that, I believe 30fps with console-equivalent settings should be very, very affordable.
I never said it couldn't; I said I'm not convinced that any PC could. Lots of unknowns, and I'll be proven right or wrong when the game comes out.

The "fact that it could" isn't a fact at all, given its an unknown... How is that a fact? That's like saying its a fact that "anything is possible." That's not a fact.
 
I never said it couldn't; I said I'm not convinced that any PC could. Lots of unknowns, and I'll be proven right or wrong when the game comes out.

The "fact that it could" isn't a fact at all, given its an unknown... How is that a fact?

People have PCs with four Titan Blacks in them. That is like 20 Nvidia TFLOPS. I highly doubt they are designing their 16.6ms barrier to be reached only by 20-plus TFLOPS configs...

That will run this game at 1080p/60. Nvidia will make sure of it. There is no unknown to make you insecure in your presumption... it is obvious.
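(The "16.6ms barrier" is just the per-frame time budget at 60fps: 1000ms divided by the target framerate.)

```python
# Per-frame time budget at a given target framerate: 1000 ms / fps.
for fps in (30, 60, 120):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 -> 33.3 ms, 60 -> 16.7 ms, 120 -> 8.3 ms
```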
 