If you think that's what they are talking about you are completely missing the point.

I don't find aiming for mediocrity from the start something to be celebrated.
well, looks like they talk about his example and I don't know what they mean

Brian Karis said...
I found an interesting comparison that demonstrates your point perfectly.
WALL-E Bluray
http://hq55.com/disney/walle/walle-disneyscreencaps.com-521.jpg
WALL-E HD production shot
http://0.tqn.com/d/kidstvmovies/1/0/I/H/walle008.jpg
Use a box filter or resize with your browser and it is easy to tell that Pixar chose to make the image less sharp at Blu-ray resolution than a game normally would.
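The box-filter resize suggested above can be sketched in a few lines. This is a hypothetical pure-Python version (a real comparison would use an image library), operating on a grayscale image stored as a list of rows:

```python
# Toy sketch: a 2x2 box-filter downsample, the kind of resize suggested
# above for comparing the two shots. Grayscale image as a list of rows.

def box_downsample_2x(img):
    """Average each 2x2 block of pixels into one output pixel."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h - 1, 2):
        row = []
        for x in range(0, w - 1, 2):
            block = (img[y][x] + img[y][x + 1] +
                     img[y + 1][x] + img[y + 1][x + 1])
            row.append(block / 4.0)
        out.append(row)
    return out

# A sharp single-pixel checkerboard averages to flat grey: single-pixel
# detail cannot survive the downsample, which is why the softer Pixar
# frame loses less under resizing than a sharper game-style frame would.
sharp = [[0, 255] * 2, [255, 0] * 2] * 2
print(box_downsample_2x(sharp))  # every output pixel is 127.5
```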
How about 720p, 30fps, FXAA, and 8x AF?
Well, okay, they *might* give you 16x AF.
If you think that's what they are talking about you are completely missing the point.
I'm perfectly happy with 720p.
Wow, aiming even lower than the initial promises of this generation.
If you think that's what they are talking about you are completely missing the point.
Current film release prints look softer than BluRay releases watched on a nice Monitor/HDTV. Heck, you can even see the difference between color subsampled BluRay and full RGB RED clips.
I find it funny that game devs want to mimic current film framerate and resolution when, on the other hand, Sony and RED are pushing for real 4K and/or 48fps+ releases (Avatar 2, The Hobbit, Spiderman etc.)
only if my TV has an appropriate refresh rate. on my current set, no. i know that at some point in the future i will buy a 3DTV that can take 1080p/48 and 1080p/60 in framepacked 3d. i haven't been following CES, but they don't look like they'll be showing up anytime soon.

Hmm. If some movies are going 48fps, does that mean that bluray and/or HDMI will need a spec update to support 1080p/48?
And hypothetically, would you be ok with games running at 48fps rather than 60?
;_;
I don't have a problem with 720p, I do have one with 30fps though.
There will be a few games that are 1080p/30fps, but the majority of them, I feel, will be 720p/30fps. It's been 6 years since the Xbox 360 came out, so considering the advancements in tech since then, 60fps should be achievable for most games.
It would drive me nuts playing all my games at 30fps.
720p 60fps, SMAA and 8xAF. that's my final offer.
What about 720p, "the feel of 60fps" (at 30fps), FXAA High, and 8xAF?
That was fine for this gen, but it seems pointless targeting the same 720p/30fps next gen as well, especially when the hardware is more powerful and capable of more. 30fps will allow them to do a lot of things, but I hope they at least target 60fps for shooters and racing games. I guess I can live with that.

no, because they just spend that on prettier pixels/more sparks. There is a limit to the fillrate which means tradeoffs are a necessary evil.
PSX framerates? What exactly does that mean? The performance of software across the system's library varies quite heavily.

If developers think that PSX framerates are "good enough" then they're missing the point.
I thought the idea was to make games that look "immersive" and realistic. How can you be immersed in a game when the framerate automatically prevents it from even remotely resembling what you see in real life?
a clean and sharp image
We do what is essentially MSAA. Then we do a lens distortion that makes the image incredibly soft (amongst other blooms/blurs/etc). Softness/noise/grain is part of film and something we often embrace.
That's the thing though, the thesis comes from films being blurry and soft relative to game images.
oh.
Unfortunately this is quite true. Every monitor I've owned has produced absolutely awful results when native 1280x720 is fed to it. With the right display, however, 1280x720 can still look excellent.

720p games look extremely bad on a 1080p computer monitor, full stop. They have terrible upscalers. Anything better would introduce unacceptable input lag. Doesn't matter how much AA you rub on it. This is fine for the home market though.
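A rough illustration of why a monitor's cheap scaler makes native 720p look uneven on a 1080p panel (a toy sketch, not how any actual monitor scaler is implemented): a 1.5x nearest-neighbour upscale, the simplest lowest-latency option, duplicates every second source pixel, so equally sized source details come out at different sizes on screen.

```python
# Toy sketch of a nearest-neighbour upscale at a non-integer 1.5x ratio
# (720 -> 1080 vertically). Each output pixel maps back to its nearest
# source pixel, so some pixels get doubled and some don't.

def nearest_upscale(row, factor):
    """Upscale one row of pixels by nearest-neighbour sampling."""
    out_w = int(len(row) * factor)
    return [row[int(x / factor)] for x in range(out_w)]

src = [10, 20, 30, 40]            # four distinct source pixels
print(nearest_upscale(src, 1.5))  # [10, 10, 20, 30, 30, 40]
# pixels 10 and 30 come out two columns wide, 20 and 40 only one wide
```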
If it were to look like film, it sure as hell wouldn't be a compromise.

edit: I'm not sure I'm even understanding what their aim is. "Targeting the look of a film" and "getting gamers to accept that" just sounds like compromising image quality to me.
Right, but if the image quality and motion blur quality were more "filmic" it would look both better in motion and in stills.

Interactivity of games plays a part here as well. Vanquish looks great in motion because it's frantic, hectic, your brain can't comprehend all the little details because there's so much happening and it's all very fluid. But stand still and take a detailed look around you and you see a lot of the weaknesses exposed - game makers can't control a player's viewpoint 100% of the time like moviemakers can.
edit: I'm not sure I'm even understanding what their aim is. "Targeting the look of a film" and "getting gamers to accept that" just sounds like compromising image quality to me.
There's even a suggestion that if you were to render at 1080p, you could use filtering to make it look like it's actually a lower resolution in an effort to achieve that "filmic" look.

Lottes noted that there is little or no single pixel-width detail in 1080p Blu-ray movies, as we can see in spades in ultra-precision PC presentation, suggesting that the same level of detail can be resolved in gaming without recourse to a 1080p framebuffer - or else utilising 1080p with a lot of filtering that gives the illusion of a lower resolution.
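A minimal sketch of the filtering idea in that quote (my own toy example, not Lottes' actual filter): running a small low-pass kernel over a full-resolution row of pixels kills single-pixel-width detail while leaving broader edges mostly intact, which is the sense in which a filtered 1080p image can look "lower resolution".

```python
# Toy sketch: a [1, 2, 1]/4 low-pass kernel applied along one row of
# pixels. Single-pixel alternating detail is flattened toward a constant
# level; a broad edge survives almost untouched.

def lowpass_row(row):
    """Apply a [1, 2, 1]/4 kernel; edges clamp to the border pixel."""
    out = []
    for i in range(len(row)):
        left = row[max(i - 1, 0)]
        right = row[min(i + 1, len(row) - 1)]
        out.append((left + 2 * row[i] + right) / 4.0)
    return out

# Alternating single-pixel detail collapses toward flat grey...
print(lowpass_row([0, 255, 0, 255, 0, 255]))
# ...while a wide edge keeps most of its contrast.
print(lowpass_row([0, 0, 0, 255, 255, 255]))
```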
what the fuck is up with that? 4K 48/60 FPS... please SAVE US.

There is a bit more to it than that.
Note this quote for example:
There's even a suggestion that if you were to render at 1080p, you could use filtering to make it look like it's actually a lower resolution in an effort to achieve that "filmic" look.
I don't really know what they mean by "look like a film." Everything in film is downsampled. No one shoots or renders at 720p.

Unfortunately this is quite true. Every monitor I've owned has produced absolutely awful results when native 1280x720 is fed to it. With the right display, however, 1280x720 can still look excellent.
If it were to look like film, it sure as hell wouldn't be a compromise.
You call it an incredible smooth, detailed image. I call it blurry as fuck.
Why does "film look" mean "look like a lower resolution"?

There is a bit more to it than that.
Note this quote for example:
There's even a suggestion that if you were to render at 1080p, you could use filtering to make it look like it's actually a lower resolution in an effort to achieve that "filmic" look.
There's even a suggestion that if you were to render at 1080p, you could use filtering to make it look like it's actually a lower resolution in an effort to achieve that "filmic" look.
http://i.imgur.com/SzzIa.gif
The very idea runs counter to everything I've learned/experienced about visual fidelity in the history of gaming.
I guess I'll have to see the end result of what they're proposing for myself?
They actually did give one handy example in this post: http://www.neogaf.com/forum/showpost.php?p=34185350&postcount=102
The top image is what they propose, the second image is what gaming currently goes for.
Specifically the blurriness versus the lack of it, not the extra effects, shadows and objects.
This point has been brought up a couple of times but it is really important. You can only get so far with screenshot comparisons. You need to see the game running in motion to make any kind of judgement on the quality of its visuals.
No shit. Obviously the games wouldn't be "filmed", but the idea is to produce an image that looks similar to a film viewed at 1280x720.

I don't really know what they mean by "look like a film." Everything in film is downsampled. No one shoots or renders at 720p.
Also people can't be using current gen techniques as a basis to how things will evolve and develop next gen. The FXAA we use now on the ps3 and 360 will be different than the FXAA used on the 720 and PS4.
I saw that but just assumed that I completely missed something there because shitting up a perfectly fine picture with a vaseline filter couldn't be their point.

They actually did give one handy example in this post: http://www.neogaf.com/forum/showpost.php?p=34185350&postcount=102
The top image is what they propose, the second image is what gaming currently goes for.
Specifically the blurriness versus the lack of it, not the extra effects, shadows and objects.