Can you even then trust CDPR/Sony/MS, when they release that console footage, that it's actually console footage and not from the PC version with a controller?
It's there only for show, obviously. There's a PC hidden beneath that sofa.
People have grown very skeptical about this stuff and don't really trust footage to be what it's said to be. Can't really blame people though; many publishers/devs like to push deceiving footage.
Note the PS4 that is on, in the bottom right-hand corner of the screenshot....
EDIT: did people miss the PS4 that's on in the picture....
Still a great looking game.
Still a great looking game.
Technically, sure, but I find this particular screenshot really ugly. Geralt looks like he has been pasted on and the whole scene is just a vomit of color.
And the color/lighting/insanity of that foliage!
It really does seem like witcher 2 was pretty uneven from an artistic standpoint
Come at me, bros!
Still a great looking game.
It's important to understand what's going on in this scene. There's a crazy cursed zone on the battlefield that's distorting the colors and warping the sun's rays. At this part, Geralt is traveling with a sorceress who creates a shielded zone, so the people walking inside of it don't really have the colors distorted.
Unless that was satire and I completely missed the point of your spoiler'd part.
Still a great looking game.
Yes, if you look at the materials I think it's pretty obvious. It would be hard to get that sort of interaction with light without PBR. But here's a quote:
Funnily enough, I actually haven't found it very obvious in this game. Usually I can spot its use easily. *Shrugs*
Funnily enough, I actually haven't found it very obvious in this game. Usually I can spot its use easily. *Shrugs*
It's really not. When I heard this game was actually using PBR, I was surprised. You can see it on some armors and clothing though.
The question then posed: how are you duders "spotting" it?
Beyond just saying "it looks good" that is.
Come at me, bros!
It's okay. I bought Witcher 2 on day one, and I was never impressed by its graphics. The game uses some weird filters and color schemes that make it ugly as sin at times.
damn, Witcher 2 still looks better than anything on ps4 IMO.
Like, just compared to open world? Or RPGs?
The lighting isn't better or worse in either version. They've made changes to the scene that result in different placement and distance of the light sources (the flames). We see specular highlights in places in the new version that aren't in the old version, and vice versa.
It also looks like they redesigned the armor somewhat, but it's hard to tell for sure because a lot of it is in shadow in both versions.
In the newer version the burning house in the background is further away, whereas in the older version there is a smaller fire that is closer behind him. The entire scene is much darker in the older version, while the newer version has fire all around and is therefore generally better lit. In the newer version it seems the strongest light source is coming from the front and right of him, which naturally throws the left side of each leg into shadow. His victim is being affected by the light in the same way (not as easily seen in that shot, but in some of the previous shots I posted). All of the lighting looks consistent.
damn, Witcher 2 still looks better than anything on ps4 IMO.
The way the light is interacting with the foliage looks the same as the "Volume Based Translucency" from the Nvidia Tech Demo.
https://www.youtube.com/watch?v=3SpPqXdzl7g
And it is stunning.
Yep, that's the PS4 version. It's looking good.
Unless CDPr decided to implement PS4 controller icons in the PC version.
I hope they did :b I want to plug my PS4 controller into my PC and get PS4 controller icons in Witcher 3.
You have literally no idea what you're talking about. GSync exists to fix SCREEN TEARING while having unlocked fps. That's the point of it. The output device has nothing to do with stutter. Stutter happens in the GPU before the picture reaches its destination. Jesus christ.
Just because VSync helps with stutter doesn't mean it has anything to do with the reasons for it. VSync helps stuttering because stuttering comes from sudden increases in the frame intervals, and VSync places an artificial cap which standardizes the interval. GSync actually breaks this and introduces the stuttering back.
I think you're still talking about screen tearing with a wrong term.
http://www.anandtech.com/show/6857/amd-stuttering-issues-driver-roadmap-fraps/3
http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Part-2-Finding-and-Defining-Stutter
(btw, how can one capture stutter or microstutter in action into a video, if it had anything to do with the output device?)
http://www.anandtech.com/show/7582/nvidia-gsync-review
When you have a frame that arrives in the middle of a refresh, the display ends up drawing parts of multiple frames on the screen at the same time. Drawing parts of multiple frames at the same time can result in visual artifacts, or tears, separating the individual frames. You'll notice tearing as horizontal lines/artifacts that seem to scroll across the screen. It can be incredibly distracting.
You can avoid tearing by keeping the GPU and display in sync. Enabling vsync does just this. The GPU will only ship frames off to the display in sync with the panel's refresh rate. Tearing goes away, but you get a new artifact: stuttering.
Because the content of each frame of a game can vary wildly, the GPU's frame rate can be similarly variable. Once again we find ourselves in a situation where the GPU wants to present a frame out of sync with the display. With vsync enabled, the GPU will wait to deliver the frame until the next refresh period, resulting in a repeated frame in the interim. This repeated frame manifests itself as stuttering. As long as you have a frame rate that isn't perfectly aligned with your refresh rate, you've got the potential for visible stuttering.
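For anyone who wants to see the repeated-frame effect concretely, here's a toy Python sketch of vsync on a 60Hz display. The render times and the simple "newest completed frame is shown at each refresh" model are my own invented illustration, not anything from the article:

```python
# Toy model of vsync on a 60Hz display: a frame that misses a refresh
# deadline causes the previous frame to be shown again (stutter).
# All render times below are made up for illustration.

REFRESH_MS = 1000 / 60  # ~16.7 ms per refresh at 60Hz

def displayed_frames(render_times_ms, num_refreshes):
    """Return the frame index shown at each refresh under vsync."""
    done, t = [], 0.0
    for rt in render_times_ms:           # cumulative completion times
        t += rt
        done.append(t)
    shown, last = [], 0                  # frame 0 is assumed ready at t=0
    for i in range(1, num_refreshes + 1):
        deadline = i * REFRESH_MS
        ready = [k for k, d in enumerate(done, start=1) if d <= deadline]
        newest = ready[-1] if ready else last
        shown.append(max(newest, last))  # repeat old frame if nothing new
        last = shown[-1]
    return shown

# Steady 16 ms frames keep pace with the display: no repeats.
print(displayed_frames([16] * 8, 8))                  # [1, 2, 3, 4, 5, 6, 7, 8]
# One 34 ms spike: frame 2 is shown twice in a row (a visible hitch).
print(displayed_frames([16, 16, 34, 16, 16, 16], 6))  # [1, 2, 2, 3, 4, 5]
```

The point is in the second run: nothing finished in time for the third refresh, so the display just shows the old frame again, which is exactly the repeated frame the quote describes.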
if it is really from a ps4
Looks alright considering it's off-screen footage.
You can facepalm me all you want, but, I'm sorry, you're confused about all this.
G-sync gets rid of three things: stutter, screen-tear, and input lag.
Traditionally, if you use v-sync and your framerate drops below your display's refresh rate you get stuttering. This happens because a new frame is no longer available each time the display refreshes, so some frames are duplicated and sent to the display again. Sometimes frames are even repeated more than once. This uneven frame cadence results in what is commonly known as stutter, stuttering or judder.
Now, you can turn off vsync, but the result is screen tear and still some wobble and judder to the animation. Even ignoring the ugly tears across the image, the motion still isn't as smooth as you get with vsync when your frame rate is at or above your refresh rate.
The other downside of vsync, aside from the stutter you get when framerate drops below refresh rate, is input lag. This happens because frames are stored in a frame buffer and not immediately sent out to the display. The more frames buffered the more lag. This is why triple buffering has more input lag than double buffering.
So, as you can see, the only way to avoid both stutter and screen tear in a traditional setup is to use vsync and maintain a framerate at or above your display's refresh rate (or a factor thereof).
This is also why console games begin to stutter whenever they drop below 30fps (or 60fps). Most TVs refresh at 60Hz, so framerates in console games are usually targeted at 30fps, a factor of 60 (it divides evenly into it). This way each frame will be displayed exactly twice and an even frame cadence will be maintained. But if the framerate drops below 30fps you start to get duplicated frames and the animation becomes jerky and irregular, i.e. stutter.
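That 30-into-60 cadence arithmetic can be sketched in a few lines of Python (the refresh rate, frame counts and the perfectly-steady-frame-time assumption are all just illustrative):

```python
import math

# Rough sketch: how many refreshes of a 60Hz display each frame stays
# on screen for, assuming a perfectly steady frame rate.

def refreshes_per_frame(fps, refresh_hz=60, frames=6):
    """Count the display refreshes that occur while each frame is the
    newest one available."""
    frame_time = 1.0 / fps
    refresh_time = 1.0 / refresh_hz
    counts = []
    for k in range(frames):
        start = k * frame_time        # frame k becomes available
        end = (k + 1) * frame_time    # frame k+1 replaces it
        # Refresh ticks (multiples of refresh_time) inside [start, end).
        first = math.ceil(start / refresh_time - 1e-9)
        last = math.ceil(end / refresh_time - 1e-9) - 1
        counts.append(last - first + 1)
    return counts

print(refreshes_per_frame(30))  # [2, 2, 2, 2, 2, 2] -> even cadence
print(refreshes_per_frame(25))  # [3, 2, 3, 2, 2, 3] -> uneven, i.e. stutter
```

At 30fps every frame sits on screen for exactly two refreshes; at 25fps the hold time keeps flipping between two and three refreshes, which is the jerky cadence you notice.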
There's simply no getting around it: if you're using vsync and your framerate drops below your refresh rate you'll get stutter. It's immediately noticeable. If my framerate dropped even a few frames below my previous monitor's 60Hz refresh rate I would immediately notice the stutter. Camera panning and movement in the game become jerky and irregular.
G-Sync/free sync gets rid of all that, of course. No more stutter, screen tear or added lag.
Here is a quote from anandtech if you don't want to take my word for it.
http://www.anandtech.com/show/7582/nvidia-gsync-review
Yep.. the last time I played it I was really surprised how good some of the ground textures look, even in comparison to newer games. Sorry for the jpg.
if it is really from a ps4
Looks alright considering it's off-screen footage.
I call this a major upgrade. Yennefer is sexy af now.
I liked her old hair more.
Looked like a hairspray storm on a Maxim shoot. New hair looks more elegant to me.
I personally prefer the old version (brunette). Her face looked cuter
I liked her old hair more.
Considering the large number of youtubers at the event, I'm pretty surprised by the lack of videos that came out. Maybe CDPR didn't want them to release too much. Some of the formats were totally thrown together (i.e. angry joe/the annoying canadian girl). AnderZEL only has the one 10-minute video and a developer interview. Can't say the same for other languages, but the Gopher videos are far and away the best videos released.
Oh well, I just can't feed my need for more videos. Can I have the game now? God damn it.
I personally prefer the old version (brunette). Her face looked cuter
Then it's just a mixup of terminology. Generally, the word "stutter" means a temporary, complete halt of the video. Even the literal word "stutter" refers to the speech issue where your speech is cut and halted in fast succession. Almost everyone, when talking about stutter as an issue in games, understands it that way, and what you're talking about should be referred to as "jitter" or "refresh rate based microstutter", or something similar, as the effect you're describing is closest to microstutter in terms of the actual issue.
Personally, I did experience something similar when playing on a 60Hz monitor with the fps dropping, but I never called that stutter. However, after upgrading to 144Hz it kind of disappeared, as the screen refreshes so fluidly that my eyes aren't strained by having frames doubled. It actually made sub-60fps a more bearable experience, strangely.
Hope you understand what I mean
http://hardforum.com/showthread.php?t=1317582
What is microstuttering?
When running two graphics processing units (GPUs) in tandem, via Crossfire or SLI, in Alternate Frame Rendering (AFR) mode, the two GPUs will produce frames asynchronously (for lack of a better term). Microstuttering can be expressed one way as your computer experiencing, in extremely rapid succession, a high FPS, followed by a low FPS, followed by a high, then low, and so on.
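The alternating high/low interval that quote describes is easy to picture with a couple of invented numbers (nothing here is measured data):

```python
# Toy illustration of AFR microstutter: two GPUs alternate frames but
# their outputs are unevenly spaced, so the per-frame interval (and the
# FPS it implies) oscillates. All intervals are invented.

def instantaneous_fps(frame_intervals_ms):
    """FPS implied by each individual frame-to-frame interval."""
    return [round(1000 / dt, 1) for dt in frame_intervals_ms]

# Well-paced single GPU: 25 ms per frame, a steady 40 FPS.
print(instantaneous_fps([25, 25, 25, 25]))   # [40.0, 40.0, 40.0, 40.0]
# Badly paced AFR pair: frames land 10 ms, then 40 ms apart. The
# average is still 40 FPS, but it flips between 100 and 25 FPS.
print(instantaneous_fps([10, 40, 10, 40]))   # [100.0, 25.0, 100.0, 25.0]
```

That's why an FPS counter (which averages) can show a healthy number while AFR still feels jerky.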
It really does seem like witcher 2 was pretty uneven from an artistic standpoint
It's actually one of the most consistent games I've ever played artistically, and also in terms of texture quality. Particularly in the realm of RPGs.
I still think Flotsam's surrounding forest area is one of the better looking areas of any game still out now. The tech isn't crazy, it's just great art and superb atmosphere.
I am really feeling the hype building for 3 with each day. I just want to get into that world and start wandering around and exploring so badly. So close.
Looks great.
Big differences will be Hair Works, Water Works, PhysX, Volume Based Translucency, and IQ (4K, AA, AF, etc.)
Basically most of the stuff from this video.
https://www.youtube.com/watch?v=3SpPqXdzl7g
Maybe some LoD differences as well. But I expect the PS4 version to look pretty damn comparable with the PC version running at 1080p 2xAA, with no Nvidia-exclusive stuff.
if it is really from a ps4
Looks alright considering it's off-screen footage.
Looks quite good despite the potato quality. Hopefully we get direct-feed, high-quality footage soon. But at least we finally got some supposedly-PS4 footage.
Lol
And you assumed all of that from 3 seconds of an offscreen video.
Just like you assumed this: