
Witcher 3 downgrade arguments in here and nowhere else

Status
Not open for further replies.
You're making it sound as if they don't deserve any money after making a 100+ hour game, regardless of how it looks.

This is really the biggest issue I have with the entire topic across every forum: people hinting that piracy is a worthwhile protest (thankfully, GAF is mostly above this).

Yes CDPR was caught with their pants down.

Yes they should be made to eat crow as penance for lying.

Yes they should try to address the disparity in any plausible way they can.

However, they are still delivering us a 200+ hour open world game with a 92 Metacritic score that blends BioWare-style storytelling and characters with Bethesda-style open-world exploration, something that has been a pipe dream since Morrowind and KotOR. For less than $60 with multiple discount options. With free DLC and huge expansions incoming.

In five years, this game's pre-release issues will be long forgotten, and its enduring legacy will be as GOAT in the minds of many of its diehard players.
 

fastmower

Member
vlcsnap-2015-05-17-03jksin.png


vlcsnap-2015-05-17-034nsu1.png


vlcsnap-2015-05-17-039pstm.png


vlcsnap-2015-05-17-03vzsd6.png


One spot I noticed in a 2013 gameplay trailer: you can see the low-quality grass assets there too (it's the same when he gets close to it).

Grass is fine here:

vlcsnap-2015-05-17-030bsvo.png


But then immediately in front and on the right:
vlcsnap-2015-05-17-03z0svt.png


Huge difference vs:

baqhpu3.jpg


ounfotD.jpg


3xq0UZv.jpg


It's the lighting too.

If it had that grass (which Dragon Age Inquisition had, if not better) and that lighting (which wasn't too bad in DA:I either), then that would be something.
Those original screens look soooo much better than the new shots. Damn, I feel a little sad now.
 

VARIA

Member
I'm just going to wait and get the enhanced + version of the Witcher 3 on the PlayStation 5.

But seriously, this is very reminiscent of the Dark Souls 2 downgrade. It sucks.
 

Yoda

Member
I'm just going to wait and get the enhanced + version of the Witcher 3 on the PlayStation 5.

But seriously, this is very reminiscent of the Dark Souls 2 downgrade. It sucks.

Last gen it took a few years for the hardware to start overtly holding back visual fidelity. The first time I flat-out noticed a game not looking like its E3 footage, without being keenly aware of such practices, was MGS4. This time around, the consoles were weaker than a "just ok" gaming PC at launch.

Random tidbit: I tossed GTA V on my MBPr and was able to get close to the same graphical fidelity/frame rate as a PS4. It's the model with the discrete GPU, but it's still a low-energy GPU, and a mobile one at that. The AAA scene will be stuck with this sort of thing for the remainder of the gen.
 

Xyber

Member
Are people still expecting mystery visual overhauling ultra settings or something?

Of course there will be a hidden super ultra setting that fixes everything. They just want it to be a surprise!

The game already looks good to me; it's lacking in some areas, and it's a shame it doesn't look as good as the reveal trailer. But it will still be a great game and I will enjoy it. *shrug*

Some games are revealed way too soon, showing what is probably the entire game at that point. They set the bar too high with what they want to achieve and will have to make compromises to actually finish the game.

As much as people love to see gameplay when a game is announced, I would rather wait until they could show the actual game and not just a vertical slice.
 

Virdix

Member
The world doesn't look much different graphically than Assassin's Creed Black Flag in most shots.


Yeah, there are people in this thread saying judging isn't allowed until the game has officially released, in case the patch fixes the game. Gonna patch it back to 2013 in some people's minds.

I can't speak for anyone else; I don't think it's going to be a magic fix. I'm just saying we could get some actual comparisons going, with exact or very close time of day and everything. Hell, there are GIFs in here of people comparing PS4 vs. PC with no context.
 

Jhn

Member
I don't really care that much about the arguable texture differences or the lack of tessellation in places, but what really grinds my gears is the complete removal of the volumetric atmospherics that were prevalent in the older footage.

For me, that's the standout thing that's missing. It looks completely different without it.

It's what made that very first panning shot over the woods, that I see some references to in this thread, look as good as it did.
 

Valnen

Member
You're making it sound as if they don't deserve any money after making a 100+ hour game, regardless of how it looks.

Depends on how many of those hours are fun. I'm not convinced the combat will be all that amazing, from videos we've been shown.
 
56 pages in, I'm still not seeing how it makes sense to direct all of this rage towards CDPR.

They showed off footage of a version of the game that was developed back before they knew that the PS4 and XBone would be drastically underpowered. Before they knew that the latest line of Nvidia cards would be a weak, incremental improvement over what was available at the time.

This isn't CD Projekt failing us, it's hardware companies failing them, and everyone else as a result. They show us their vision, a vision that will surely be attainable on high-powered next gen tech, only for the makers of that tech to crap out weak products and leave CDPR high and dry. Now they look like the idiots because, like From Software before them, they expected things like DX11 to be standard on all consoles. Things like fur physics and detailed tessellation to be no problem for low-mid end 2015 video cards.

If you told them in 2013 that, two whole years later, a video card with 3.5gb of VRAM was the standard and that it would be outperforming next gen consoles, they'd probably have thought you were crazy.
 

Valnen

Member
this thread is legitimately sickening

and not because of the downgrade

Then do explain.

56 pages in, I'm still not seeing how it makes sense to direct all of this rage towards CDPR.

They showed off footage of a version of the game that was developed back before they knew that the PS4 and XBone would be drastically underpowered. Before they knew that the latest line of Nvidia cards would be a weak, incremental improvement over what was available at the time.

This isn't CD Projekt failing us, it's hardware companies failing them, and everyone else as a result. They show us their vision, a vision that will surely be attainable on high-powered next gen tech, only for the makers of that tech to crap out weak products and leave CDPR high and dry. Now they look like the idiots because, like From Software before them, they expected things like DX11 to be standard on all consoles. Things like fur physics and detailed tessellation to be no problem for low-mid end 2015 video cards.

If you told them in 2013 that, two whole years later, a video card with 3.5gb of VRAM was the standard and that it would be outperforming next gen consoles, they'd probably have thought you were crazy.
So it's Nvidia and AMD's fault they made the game multiplatform? If it was PC exclusive, they could just say the game demands high end hardware and not downgrade anything.
 
Yeah the "in game footage" text was the only real damning piece here.

Enjoying the hell out of Witcher 2 and will enjoy the W3 regardless. Just a shame about the parity.
 

sgs2008

Member
56 pages in, I'm still not seeing how it makes sense to direct all of this rage towards CDPR.

They showed off footage of a version of the game that was developed back before they knew that the PS4 and XBone would be drastically underpowered. Before they knew that the latest line of Nvidia cards would be a weak, incremental improvement over what was available at the time.

This isn't CD Projekt failing us, it's hardware companies failing them, and everyone else as a result. They show us their vision, a vision that will surely be attainable on high-powered next gen tech, only for the makers of that tech to crap out weak products and leave CDPR high and dry. Now they look like the idiots because, like From Software before them, they expected things like DX11 to be standard on all consoles. Things like fur physics and detailed tessellation to be no problem for low-mid end 2015 video cards.

If you told them in 2013 that, two whole years later, a video card with 3.5gb of VRAM was the standard and that it would be outperforming next gen consoles, they'd probably have thought you were crazy.

They could have still increased the fidelity for PCs. I mean, if mid-range cards (280Xs and 770s) are able to hit ultra at 1080p, there's obviously leeway there for increased fidelity. I would have liked it if they had pushed it so that you would need at least a 970 to hit ultra at 1080p.
 
56 pages in, I'm still not seeing how it makes sense to direct all of this rage towards CDPR.

They showed off footage of a version of the game that was developed back before they knew that the PS4 and XBone would be drastically underpowered. Before they knew that the latest line of Nvidia cards would be a weak, incremental improvement over what was available at the time.

This isn't CD Projekt failing us, it's hardware companies failing them, and everyone else as a result. They show us their vision, a vision that will surely be attainable on high-powered next gen tech, only for the makers of that tech to crap out weak products and leave CDPR high and dry. Now they look like the idiots because, like From Software before them, they expected things like DX11 to be standard on all consoles. Things like fur physics and detailed tessellation to be no problem for low-mid end 2015 video cards.

If you told them in 2013 that, two whole years later, a video card with 3.5gb of VRAM was the standard and that it would be outperforming next gen consoles, they'd probably have thought you were crazy.

Even Ubi has done their PC versions of multiplats better than this so far this gen. This has nothing to do with hardware manufacturers and everything to do with CDPR deciding to go for parity, likely for cost/time reasons.
 
56 pages in, I'm still not seeing how it makes sense to direct all of this rage towards CDPR.

They showed off footage of a version of the game that was developed back before they knew that the PS4 and XBone would be drastically underpowered. Before they knew that the latest line of Nvidia cards would be a weak, incremental improvement over what was available at the time.

This isn't CD Projekt failing us, it's hardware companies failing them, and everyone else as a result. They show us their vision, a vision that will surely be attainable on high-powered next gen tech, only for the makers of that tech to crap out weak products and leave CDPR high and dry. Now they look like the idiots because, like From Software before them, they expected things like DX11 to be standard on all consoles. Things like fur physics and detailed tessellation to be no problem for low-mid end 2015 video cards.

If you told them in 2013 that, two whole years later, a video card with 3.5gb of VRAM was the standard and that it would be outperforming next gen consoles, they'd probably have thought you were crazy.

Oh really now.
 
I edited it in late, but the comparison shots were the ultra 1440p PC shots from Flickr, in my post anyway.

If you can get to 5K, it starts to look a bit better in spots, it seems:



Although that might just be because it's washed out.

Lol, it better, but it should also look better at 1440p, and so on and so forth.

So we've got no way of knowing whether the .ini settings are locked down until release?
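For what it's worth, once the files are accessible, checking whether an .ini tweak actually sticks is easy to script by diffing two snapshots of the config. A minimal sketch, assuming a standard .ini layout; the section and key names below are invented for illustration, not the game's real settings:

```python
# Hypothetical sketch: diff two snapshots of a rendering .ini to see
# which keys changed. Section/key names here are invented examples,
# not the game's real settings.
import configparser

def parse_settings(text: str) -> dict:
    """Flatten .ini text into a {"Section.key": value} dict.

    Note: configparser lowercases option names by default.
    """
    cfg = configparser.ConfigParser()
    cfg.read_string(text)
    return {f"{s}.{k}": v for s in cfg.sections() for k, v in cfg[s].items()}

def changed_keys(before: dict, after: dict) -> set:
    """Keys present in either snapshot whose values differ."""
    return {k for k in before.keys() | after.keys() if before.get(k) != after.get(k)}

# Example with invented keys:
stock = parse_settings("[Rendering]\nGrassDensity = 400\nFoliageDistance = 1200\n")
tweaked = parse_settings("[Rendering]\nGrassDensity = 2400\nFoliageDistance = 1200\n")
print(changed_keys(stock, tweaked))  # {'Rendering.grassdensity'}
```

Of course, this only tells you which keys you edited; whether the engine honors them (or clamps them server-side of the renderer) is exactly the part we can't know until release.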
 

Vitor711

Member
So it's Nvidia and AMD's fault they made the game multiplatform? If it was PC exclusive, they could just say the game demands high end hardware and not downgrade anything.

Yeah, and they'd get one third of the sales at most. They crafted a 100+ hour game without shitty fetch quests à la DA:I. That takes a lot of manpower. They need to get paid.

Releasing on just one platform, when your game is that huge, is not feasible.
 
So it's Nvidia and AMD's fault they made the game multiplatform? If it was PC exclusive, they could just say the game demands high end hardware and not downgrade anything.

It's Nvidia and AMD's fault that the PC and consoles are drastically underpowered. Again, a 3.5GB card is the standard in 2015, and far weaker AMD cards are in the consoles. Both companies can and should be putting out 8GB VRAM cards as standard by now.

We'll all be having this exact same conversation in a month in the "Batman: Arkham Knight Downgrade Thread." At some point, the hardware manufacturers are the common link in all of this garbage.
 

Totobeni

An blind dancing ho
Blaming Nvidia and AMD because CDPR made a downgrade and most likely is going for parity?

We all know what these GPUs can do (just go see the real-time demos running on them); they are not underpowered cards. This is on CDPR, not the GPUs.

The notion that they were going to pack every inch of a massive open world game with the density seen in the early footage is completely laughable looking back.



Yep. Totally gonna have a whole forest that looks like that. Honest truth guys.

Hopefully the lesson learned here for CDPR is to never show a bullshit vertical slice ever again.


Yep, like this:

http://abload.de/img/vlcsnap-2015-05-17-04lurlo.png

http://abload.de/img/vlcsnap-2015-05-17-04msojb.png

And right at the start, after the "may be inappropriate for children" warning:
http://abload.de/img/vlcsnap-2015-05-17-041bpk9.png

Are we getting the undowngraded ("no downgrade, no downgrade!"), uncompromised PC version that we were told?

If the magic uber-ultra no-downgrade super awesome secret setting was real, then why didn't CDPR show it already? It's just another excuse from CDPR to buy time until the release of the game; then they'll ignore everyone.
 

ss_lemonade

Member
It's Nvidia and AMD's fault that the PC and consoles are drastically underpowered. Again, a 3.5gb card is the standard in 2015 and far weaker AMD cards are in the consoles. Both companies can and should be putting out 8gb Vram cards as standard by now.

We'll all be having this exact same conversation in a month in the "Batman: Arkham Knight Downgrade Thread." At some point, the hardware manufacturers are the common link in all of this garbage.
What exactly is wrong with this 3.5GB video card? Is <4GB VRAM really what's holding back developers? I don't own a 970; I only have a lowly 3GB 780, but even that seems to have a lot of horsepower to it.
 
What exactly is wrong with this 3.5GB video card? Is <4GB VRAM really what's holding back developers? I don't own a 970; I only have a lowly 3GB 780, but even that seems to have a lot of horsepower to it.

It's overblown, imo; for most games this gen it'll probably be fine. I mean, hell, Oculus said a 970 is the standard for a good experience in this gen of VR.
 

Valnen

Member
It's Nvidia and AMD's fault that the PC and consoles are drastically underpowered. Again, a 3.5gb card is the standard in 2015 and far weaker AMD cards are in the consoles. Both companies can and should be putting out 8gb Vram cards as standard by now.

We'll all be having this exact same conversation in a month in the "Batman: Arkham Knight Downgrade Thread." At some point, the hardware manufacturers are the common link in all of this garbage.

We've already seen gameplay footage of Batman. Unlike the Witcher 2013 footage, which had pristine image quality everywhere, the Batman footage has a fair amount of aliasing.

There probably won't be a downgrade from this. https://www.youtube.com/watch?v=1kEvsjQk_x0
 

Corpekata

Banned
The reason it's an issue is that a lot of games this year, and some from previous years, have come close to 4GB. And when a 970 goes past 3.5GB, it stutters and drops frames.

It's not something most people will have to worry about right this moment (most games might get near there, but not too far past), but it is very likely that more and more games are going to use the full 4GB of VRAM (thus making 970 users stutter a lot). 1080p users will probably be able to stretch out the card's lifetime a bit more.
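To make the 3.5GB cliff concrete, here's a toy back-of-the-envelope model of why effective memory bandwidth drops once an allocation spills into the card's slow 0.5GB segment. The bandwidth figures are rough assumptions for illustration, not measured or vendor numbers:

```python
# Toy model (assumed figures, not vendor specs): the 970 exposes 4 GB,
# but only 3.5 GB sits on the full-speed bus; the last 0.5 GB is served
# by a much slower path, so average bandwidth falls once you spill over.

FAST_POOL_GB = 3.5
SLOW_POOL_GB = 0.5
FAST_BW = 196.0  # GB/s, assumed
SLOW_BW = 28.0   # GB/s, assumed

def effective_bandwidth(alloc_gb: float) -> float:
    """Average GB/s to stream `alloc_gb` of resources through once."""
    fast = min(alloc_gb, FAST_POOL_GB)
    slow = max(0.0, alloc_gb - FAST_POOL_GB)
    seconds = fast / FAST_BW + slow / SLOW_BW
    return alloc_gb / seconds

for gb in (3.0, 3.5, 3.6, 4.0):
    print(f"{gb:.1f} GB allocated -> {effective_bandwidth(gb):6.1f} GB/s effective")
```

Under these assumptions, even a 0.1GB spill past the fast pool drags the average down noticeably, which lines up with the "fine until it crosses 3.5, then stutter" behavior people describe. Real behavior also depends on what the driver chooses to park in the slow segment.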
 

Cerity

Member
56 pages in, I'm still not seeing how it makes sense to direct all of this rage towards CDPR.

They showed off footage of a version of the game that was developed back before they knew that the PS4 and XBone would be drastically underpowered. Before they knew that the latest line of Nvidia cards would be a weak, incremental improvement over what was available at the time.

This isn't CD Projekt failing us, it's hardware companies failing them, and everyone else as a result. They show us their vision, a vision that will surely be attainable on high-powered next gen tech, only for the makers of that tech to crap out weak products and leave CDPR high and dry. Now they look like the idiots because, like From Software before them, they expected things like DX11 to be standard on all consoles. Things like fur physics and detailed tessellation to be no problem for low-mid end 2015 video cards.

If you told them in 2013 that, two whole years later, a video card with 3.5gb of VRAM was the standard and that it would be outperforming next gen consoles, they'd probably have thought you were crazy.

From what I gather, the issue is that they've been maintaining that there hasn't been a downgrade, and aside from telling us to wait till release, they've been keeping mum.

There wouldn't have been this whole fiasco if they'd actually come out and said "we had to change our scope, the consoles are too underpowered," etc., but they didn't, because of the risk to sales and such.

Actually, there probably would have been, given it's CDPR and the gaming community seems to love to jump on things like this, but it wouldn't have been anywhere near as drawn out.
 
I think it depends on the game. Unity stopped stuttering for me on a single 970 after a particular patch and Nvidia driver, and that used more than 3.5GB. Ditto for GTA V; I can rock 4K at a locked 30 with no stuttering.
 

JAYSIMPLE

Banned
The fact that ultra only hits 2.5GB of VRAM is prime evidence that this got downgraded. I'm surprised they haven't been able to wrangle together an even more downgraded version, without grass etc., for the 360 and PS3.
 

Vamphuntr

Member
The reason it's an issue is a lot of games this year and some of the previous have hit close to the 4gb. And when a 970 goes past 3.5 it stutters and frame drops.

It's not something most people will have to worry about (most games might hit around there, but not too far) right this moment but it is very likely that more and more games are going to be using the full 4 gb of ram (and this making 970 users stutter a lot). 1080p users will probably be able to stretch out the card's lifetime a bit more.

The 970 meltdown was a bit overblown. I do feel Nvidia tricked me with their false advertising, but the card is doing well so far on games over 3.5GB. DAI and Unity can both be pushed over 3.5GB without any stuttering for me.
 

ss_lemonade

Member
The reason it's an issue is a lot of games this year and some of the previous have hit close to the 4gb. And when a 970 goes past 3.5 it stutters and frame drops.

It's not something most people will have to worry about (most games might hit around there, but not too far) right this moment but it is very likely that more and more games are going to be using the full 4 gb of ram (and this making 970 users stutter a lot). 1080p users will probably be able to stretch out the card's lifetime a bit more.
This I could never understand. Are you saying a game is more likely to stutter on a 970 than on a GPU with less than 4/3.5GB? Is it because engines would treat it as a 4GB card? I have yet to run into a game that makes my 780 stutter. Lower framerates, yes, but never stuttering that would be caused by something like disk thrashing, and that's with a 3GB GPU only.
 

tuxfool

Banned
This I could never understand. Are you saying a game is more likely to stutter on a 970 than on a GPU with less than 4/3.5GB? Is it because engines would treat it as a 4GB card? I have yet to run into a game that makes my 780 stutter. Lower framerates, yes, but never stuttering that would be caused by something like disk thrashing, and that's with a 3GB GPU only.

In theory the driver should handle everything fine, but there are corner cases that must be dealt with, possibly on a per-game basis.
 

sleepykyo

Member
56 pages in, I'm still not seeing how it makes sense to direct all of this rage towards CDPR.

They showed off footage of a version of the game that was developed back before they knew that the PS4 and XBone would be drastically underpowered. Before they knew that the latest line of Nvidia cards would be a weak, incremental improvement over what was available at the time.

This isn't CD Projekt failing us, it's hardware companies failing them, and everyone else as a result. They show us their vision, a vision that will surely be attainable on high-powered next gen tech, only for the makers of that tech to crap out weak products and leave CDPR high and dry. Now they look like the idiots because, like From Software before them, they expected things like DX11 to be standard on all consoles. Things like fur physics and detailed tessellation to be no problem for low-mid end 2015 video cards.

If you told them in 2013 that, two whole years later, a video card with 3.5gb of VRAM was the standard and that it would be outperforming next gen consoles, they'd probably have thought you were crazy.

Wha? Everyone already knew the consoles' specs before they launched. Right off the bat, EA/DICE, Crytek, and Ubisoft were lobbying pretty hard for what they wanted out of the consoles, and they got it.

CDPR released a target they knew they weren't going to hit. The only difference now is that instead of the usual misleading console footage, CDPR went with misleading PC footage.
 

BigTnaples

Todd Howard's Secret GAF Account
That is literal false advertising.

It was never shown to the public. Notice how it's a shaky-cam video? So no, not false advertising in any way, literal or otherwise.

Damn, forgot about this video. Did any of these features make it into the final release? Fur, maybe?



Hairworks is in
Water Tessellation is in
Volume Based Translucency is in
DX11 Tessellated landscapes are in
Screen Space Reflections are in
Semi Volumetric Clouds are in
Lightshafts are in
Screen Space Decals are in


Forward-lit particles are questionable (I don't see why they would be taken out)
Fluid Simulation is probably out
Aerial Perspective fog I'm unsure of (fog itself is definitely in)


So yeah, some of it made it in.
 
Well that video is depressing.

The fact that ultra only hits 2.5GB of VRAM is prime evidence that this got downgraded. I'm surprised they haven't been able to wrangle together an even more downgraded version, without grass etc., for the 360 and PS3.

That, and it's one of the leanest install sizes for a AAA game this gen. Translation: those assets ain't all that.
 