
The Witcher 3: Wild Hunt trailer - 1080p HQ version

boskee

Member
Oh my, tell me about my hidden agenda then. It's all part of my plan to ruin CDPR's reputation on GAF. And I would, if it wasn't for those meddling kids! /s

Why do the screenshots from here, named "The Witcher 3 new gorgeous screens", look like shit while this video looks better? It's an honest question.

People here are quick to call out bullshots and bulltrailers(?), but this game gets the blind eye treatment for some reason. Don't just dismiss my point of view because it does not fit your world.

It was explained to you before, but I will do it again: the previous screenshots used the old renderer - the one used in The Witcher 2. This is how game development works - you start with something that doesn't look that fantastic and then improve it until you are limited by the hardware.

It's like asking why this screenshot from StarCraft looked so shit:

[image: StarCraft alpha screenshot]


When the final version looked like that:

[image: StarCraft final version screenshot]
 

Serrenn

Banned
Because it is aliased in the video and it looks totally in-game. I mean, no one is sure how it would look on consoles, but the look of the trailer is not hard to achieve, since all you need is the existing assets and a PC.
I hope those aren't console screenshots then, because that would ruin my day.

There are screens up there, everyone sees the sharpening.
This is what got me worried in the first place. It looks like someone was playing with a poorly configured ENB on those.

EDIT:

It was explained to you before, but I will do it again: the previous screenshots used the old renderer - the one used in The Witcher 2. This is how game development works - you start with something that doesn't look that fantastic and then improve it until you are limited by the hardware.
Alright then, I rest my case. If they've changed the renderer, it could be the reason why it looks so much different.
 
I don't think they are console screenshots, and certainly not from the final build. But if those were in motion, they wouldn't look that different from the videos. In terms of IQ, the screenshot of the city rooftops from this trailer posted on the last page really doesn't look that hot, except for the draw distance.
 

Durante

Member
[image: VLC snapshot from the trailer]
This scene still looks great in high res, but it needs less sharpening (this shouldn't be an issue to turn off) and something like HBAO+ for more depth.
 

Enkidu

Member
Yeah, true. I guess I'm thinking more of the "Enable PhysX" setting in some games. I can't think of one that offers PhysX as additional effects which can be enabled on AMD GPUs (excluding offloading to the CPU). If the PhysX stuff is built into the engine, like Havok, then yeah, it works. But in most (if not all?) cases when a game offers PhysX-enhanced options, they don't work on AMD hardware.

I guess it will depend on how CDPR implement them into The Witcher 3.
If the game locks you out, I think it's just a software block implemented by the developer so you don't activate it without an Nvidia GPU. Not all games do that though; I remember when I had an AMD card I sometimes activated PhysX (often by mistake). I believe it just automatically runs on the CPU in those cases, so for some older games it's possible to activate PhysX effects today with an AMD card if you've got a strong enough CPU. If you don't have a beastly CPU, you might see effects like this though:
http://youtu.be/vkQgDc1j9D4?t=2m15s
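To illustrate the mechanism (this is just a rough sketch, not anything from CDPR or the actual PhysX SDK): a game could use the CUDA runtime to check whether an Nvidia GPU is present and otherwise route the effects to a CPU path. The simulate_effects_* functions here are made-up placeholders for whatever the engine would actually run.

```
// Sketch only: detect a CUDA-capable (Nvidia) GPU and pick an effects path.
#include <cuda_runtime.h>
#include <cstdio>

// Placeholder effect paths; a real engine would dispatch into PhysX here.
void simulate_effects_gpu() { std::puts("Running extra effects on the GPU"); }
void simulate_effects_cpu() { std::puts("Running extra effects on the CPU (slower)"); }

int main() {
    int device_count = 0;
    cudaError_t err = cudaGetDeviceCount(&device_count);

    // No Nvidia GPU found: some games grey the option out (the "software block"),
    // others just silently run the same effects on the CPU.
    if (err != cudaSuccess || device_count == 0) {
        simulate_effects_cpu();
    } else {
        simulate_effects_gpu();
    }
    return 0;
}
```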
 

Lunar15

Member
Wait wait, even the scenes from the beginning are in game? Even that part where the guy sees the ship coming from across the sea? I could have sworn that was pre-rendered.
 

PFD

Member
Wait wait, even the scenes from the beginning are in game? Even that part where the guy sees the ship coming from across the sea? I could have sworn that was pre-rendered.

I thought so too at first, but looking at the HQ footage you can see some imperfections in the foliage and the distant mountains. The ship has a nice blur filter masking the imperfections.
 

Durante

Member
Wait wait, even the scenes from the beginning are in game? Even that part where the guy sees the ship coming from across the sea? I could have sworn that was pre-rendered.
It's aliased, so either they rendered it badly offline on purpose or it's realtime.
 

Zakalwe

Banned
650 TI
5500K Core i5
12 GB RAM

Thoughts on what my friend will be able to run it on? We have a bet going.

If we go by TW2 and consider TW3 to require more power, then I'd say Low-Medium like other members have suggested.

Other factors matter too, of course, such as how well optimised it is, what kind of framerate you consider acceptable, etc...

I currently have an i5 2500K @ 4.5GHz, 8GB of RAM, a 670, etc... and I expect to upgrade for it.
 
I am LTTP on the Witcher series; I kept talking myself out of buying them all the time. Well, one day on GOG earlier this year they were both on sale and I picked them up.

I'm really into these games. It's safe to say this is one of my most anticipated releases.
 
That's only how you experience PC gaming. You don't have to upgrade if you don't want to. You only need hardware that has enough performance to play it. There is a reason why there are low options and ultra options. It's not only mid/high/ultra that exists, you know.

Look at The Witcher 2 system specs:

Minimum: an 8800 GS or a 3850 GPU, an E4500 2.2GHz dual-core CPU, and 1GB of RAM. The recommended spec only featured a GTX 260, a budget DX10 card, while a second-gen DX11 top model had to be SLI'd to get ultra settings at 1080p with ubersampling at 60fps.

Did the dude have to upgrade his 8800 card to 2x 580s just to play The Witcher 2? Nope.

Example of somebody using an old GPU (from 2006): http://www.youtube.com/watch?v=xRW9RAAgkbE

Now, with a new generation, people have to upgrade their PC at some point. But so do console gamers; their PS3s are not going to run the PS4 multiplatform ports down the road. Meanwhile, if you upgraded to a PC solution in the last few years (like I did three years ago), you will be perfectly capable of running future games without any additional costs. It goes both ways.

I have a GTX 580, an i7 860 and 8GB of RAM; my PC will last this entire generation. Whether I'll want to play games at console quality a few years from now is the question, but I sure as hell won't be forced to upgrade this entire generation just to play newer games. The Witcher 3 and The Witcher 4 (if 4 launches in the PS4 generation) won't give me any issues. I wouldn't be shocked if this setup runs PS5 multiplatform ports perfectly fine on low settings down the road.

If by "fixing a lot of games" you mean that you need to tinker around a lot with graphical settings to get good, stable performance, which costs time and is annoying: well, nobody forces you. Just put the settings on lowest (that's why they're there) and have a blast running the game at console quality. It's also called "coding to the metal" or "optimizing" in console land.

The PhysX part, if implemented, will most likely be extremely limited. I wouldn't be surprised if its use is already extremely limited in PC land and restricted to SLI setups.

No, I really meant games suffering from stuttering (not bad framerates, but choppy rendering), crashes, driver issues, and graphical bugs caused by who knows what. And bad performance which isn't caused by graphical features: you know, games in which it doesn't matter what settings you use, you still get bad performance. I know my tech and what it's capable of, but when it comes to PC gaming we will always have to endure these issues thanks to all the different setups in the world, no matter how good a computer you have.

By the way, the lowest settings in modern PC games are not equal to their respective console counterparts. The lowest settings in almost every game are made for crappy laptops or just really old desktop computers. Last-gen console ports always had a mix of medium-high settings, but at a lower resolution, targeting the 30fps mark of course (aside from maybe COD).
 

Durante

Member
By the way, the lowest settings in modern PC games are not equal to their respective console counterparts. The lowest settings in almost every game are made for crappy laptops or just really old desktop computers. Last-gen console ports always had a mix of medium-high settings, but at a lower resolution, targeting the 30fps mark of course (aside from maybe COD).
This is not true, particularly over the past few years. Late-gen showcase titles such as BF3 or Crysis 2/3 were very low to low on last-gen consoles, maybe some things scratching medium, certainly not high.
 

Some Nobody

Junior Member
Oh my, tell me about my hidden agenda then. It's all part of my plan to ruin CDPR's reputation on GAF. And I would, if it wasn't for those meddling kids! /s

Why do the screenshots from here, named "The Witcher 3 new gorgeous screens", look like shit while this video looks better? It's an honest question.

People here are quick to call out bullshots and bulltrailers(?), but this game gets the blind eye treatment for some reason. Don't just dismiss my point of view because it does not fit your world.

Less to do with my world view, more to do with this:

The screenshots you linked are old. Game graphics tend to improve throughout development, you know.

Blame the site you got them from for calling them "new".
 
This is not true, particularly over the past few years. Late-gen showcase titles such as BF3 or Crysis 2/3 were very low to low on last-gen consoles, maybe some things scratching medium, certainly not high.

Well, there you have it. I said most games. Three games which pushed graphical quality to the max might have used the lowest settings from the PC counterpart, but most games last gen targeted last-gen consoles as well, and therefore the assets and graphical features were chosen based on what those consoles could do.

EDIT: Ok, I said "almost every game." Still applies though.
 
You guys think my nVidia 670 will be able to handle this?

;__;
Sure hope so. A mix of high/very high should yield over 50fps consistently. Of course this is just a rough guess. Worst case scenario, turn some shadows and anti-aliasing down/off.

edit: damn, forgot about the fur. Make that 50-60 in non-fur sections and 20 in fur sections :(
 
Opening shots demonstrate implementation of Nvidia's fur tech, and the results are amazing.

[image: fur tech GIF from the trailer]

I'm not happy with this. They automatically exclude all PS4/Xbox and AMD PC gamers; the result is that only on Nvidia hardware can you see the game in its full glory, and I absolutely don't support that disgusting marketing strategy from Nvidia. But I blame CD Projekt too: why did they get on that Nvidia train? At least they could have made a deal with AMD's Mantle too, to optimize it for AMD cards.
 
I'm not happy with this. They automatically exclude all PS4/Xbox and AMD PC gamers; the result is that only on Nvidia hardware can you see the game in its full glory, and I absolutely don't support that disgusting marketing strategy from Nvidia. But I blame CD Projekt too: why did they get on that Nvidia train? At least they could have made a deal with AMD's Mantle too, to optimize it for AMD cards.

AMD's TressFX now has fur tech, which is better than anything Nvidia has.
 

heyf00L

Member
I'm not happy with this. They automatically exclude all PS4/Xbox and AMD PC gamers; the result is that only on Nvidia hardware can you see the game in its full glory, and I absolutely don't support that disgusting marketing strategy from Nvidia. But I blame CD Projekt too: why did they get on that Nvidia train? At least they could have made a deal with AMD's Mantle too, to optimize it for AMD cards.

Wah, wah, some subset of gamers gets some extra graphics that don't affect the game in any way. WAH! *cry cry*
 

injurai

Banned
This dude is way too excited.

I think the swamp is the same one from Witcher 1.

I'm super excited to see a next-gen swamp. Not enough developers can pull them off well. I said it before, but L4D2 and TW1 are like the only good swamps of all of last gen, and I love swamps in my games.
 

lacasabonita

Neo Member
I'm not happy with this. They automatically exclude all PS4/Xbox and AMD PC gamers; the result is that only on Nvidia hardware can you see the game in its full glory, and I absolutely don't support that disgusting marketing strategy from Nvidia. But I blame CD Projekt too: why did they get on that Nvidia train? At least they could have made a deal with AMD's Mantle too, to optimize it for AMD cards.

http://www.vg247.com/2013/03/08/ps4-nvidia-pledges-physx-support/

It might potentially show up in the PS4 version too.
 
I'm not happy with this. They automatically exclude all PS4/Xbox and AMD PC gamers; the result is that only on Nvidia hardware can you see the game in its full glory, and I absolutely don't support that disgusting marketing strategy from Nvidia. But I blame CD Projekt too: why did they get on that Nvidia train? At least they could have made a deal with AMD's Mantle too, to optimize it for AMD cards.
Nvidia graphics cards are specifically optimized for parallel computing, which is what CUDA is for. This gives them a selling point for Quadro and other workstation cards. Now that the gaming cards and workstation cards share the same chip, just with some parts shut off, the gaming graphics cards retain that parallel computing power. So why not take advantage of it and release a physics system based on it? That's exactly what they did. It doesn't matter to me; I'm completely loving my GTX 670 with PhysX, G-Sync, ShadowPlay, 3D Vision, etc. PhysX was actually one of the big reasons why I went Nvidia, and I suspect for a bunch of other people too, which shows that their strategy works.
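Purely as an illustration of what "optimized for parallel computing" means here (nothing specific to PhysX or CDPR): a toy CUDA kernel where every particle gets its own GPU thread and the whole set is advanced in a single launch.

```
// Toy example of the data-parallel work CUDA cards are built for:
// one GPU thread per particle, all advanced in a single kernel launch.
#include <cuda_runtime.h>

__global__ void integrate_particles(float* pos, const float* vel, float dt, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        pos[i] += vel[i] * dt;  // simple Euler step for this particle
    }
}

int main() {
    const int n = 1 << 20;                       // ~1 million particles
    float *pos = nullptr, *vel = nullptr;
    cudaMallocManaged(&pos, n * sizeof(float));  // unified memory keeps the sketch short
    cudaMallocManaged(&vel, n * sizeof(float));
    for (int i = 0; i < n; ++i) { pos[i] = 0.0f; vel[i] = 1.0f; }

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    integrate_particles<<<blocks, threads>>>(pos, vel, 0.016f, n);
    cudaDeviceSynchronize();

    cudaFree(pos);
    cudaFree(vel);
    return 0;
}
```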
 

lacasabonita

Neo Member
There's PhysX support for the PS3; this means nothing.

The Witcher 3 isn't coming out for the PS3, obviously (it's ~8 years old). My argument is that Nvidia's "next gen fur tech" isn't exclusive to Nvidia hardware. They would be very stupid to limit their tech to only high-end PCs (from a money-making standpoint; Warframe is an example), as they have already demonstrated. I bet CD Projekt Red is/will be able to implement Nvidia's PhysX on PS4 and maybe Xbox One, given the caliber of people they have hired in the past. Vector math and massively parallel computing isn't quite as specialized as it once was.
 